```
For normalization, the maximum is not the global maximum: the feature value at the 95th-percentile position of the sorted values is used as the maximum, while extreme values are preserved.
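
This percentile-based maximum can be sketched as follows (a hypothetical helper with made-up values, not the actual preprocessing script):

```python
# Take the value at the 95% position of the sorted features as the
# normalization maximum, instead of the global maximum.
def percentile_max(values, pct=0.95):
    s = sorted(values)
    return s[int(len(s) * pct)]

vals = list(range(100))  # made-up feature values
m = percentile_max(vals)
print(m)  # 95
```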
+
+### Defining the CTR network inputs
+
+As mentioned earlier, the Criteo dataset contains both continuous and discrete (sparse) data, so the CTR-DNN model has three kinds of inputs. `dense_input` takes the continuous data; its dimension is set by the hyperparameter `dense_feature_dim`, and its values are normalized floats. `sparse_input_ids` records the discrete data; Criteo has 26 slots, so we create 26 sparse inputs named `S1~S26` and set `lod_level=1`, marking them as variable-length integer data. Finally, each sample's `label` indicates whether it was clicked: an integer, 0 for a negative example and 1 for a positive example.
+
+In Paddle, inputs are declared with `paddle.fluid.layers.data()`, which creates a placeholder of the specified type; the data IO then feeds data according to this definition.
+
+Definition of the sparse inputs:
+```python
+def sparse_inputs():
+ ids = envs.get_global_env("hyper_parameters.sparse_inputs_slots", None, self._namespace)
+
+ sparse_input_ids = [
+ fluid.layers.data(name="S" + str(i),
+ shape=[1],
+ lod_level=1,
+ dtype="int64") for i in range(1, ids)
+ ]
+ return sparse_input_ids
+```
+
+Definition of the dense input:
+```python
+def dense_input():
+ dim = envs.get_global_env("hyper_parameters.dense_input_dim", None, self._namespace)
+
+ dense_input_var = fluid.layers.data(name="D",
+ shape=[dim],
+ dtype="float32")
+ return dense_input_var
+```
+
+Definition of the label:
+```python
+def label_input():
+ label = fluid.layers.data(name="click", shape=[1], dtype="int64")
+ return label
+```
+
+Putting them together, declare them correctly:
+```python
+self.sparse_inputs = sparse_inputs()
+self.dense_input = dense_input()
+self.label_input = label_input()
+
+self._data_var.append(self.dense_input)
+
+for input in self.sparse_inputs:
+ self._data_var.append(input)
+
+self._data_var.append(self.label_input)
+
+```
+
+
+### Writing the Criteo Reader
+
+```python
+# Import PaddleRec's Reader base class
+from paddlerec.core.reader import Reader
+# Import PaddleRec's helper for reading yaml config files
+from paddlerec.core.utils import envs
+
+# Define TrainReader, which must inherit from paddlerec.core.reader.Reader
+class TrainReader(Reader):
+
+    # Data preprocessing logic, overridden from the base class
+    # If no preprocessing is needed, use pass to skip this function
+ def init(self):
+ self.cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
+ self.cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+ self.cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+ self.hash_dim_ = envs.get_global_env("hyper_parameters.sparse_feature_number", None, "train.model")
+ self.continuous_range_ = range(1, 14)
+ self.categorical_range_ = range(14, 40)
+
+    # Data reading method, overridden from the base class
+    # Implements an iterable reader function that processes data line by line
+ def generate_sample(self, line):
+ """
+ Read the data line by line and process it as a dictionary
+ """
+
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+ features = line.rstrip('\n').split('\t')
+
+ dense_feature = []
+ sparse_feature = []
+ for idx in self.continuous_range_:
+ if features[idx] == "":
+ dense_feature.append(0.0)
+ else:
+ dense_feature.append(
+ (float(features[idx]) - self.cont_min_[idx - 1]) /
+ self.cont_diff_[idx - 1])
+
+ for idx in self.categorical_range_:
+ sparse_feature.append(
+ [hash(str(idx) + features[idx]) % self.hash_dim_])
+ label = [int(features[0])]
+ feature_name = ["D"]
+ for idx in self.categorical_range_:
+ feature_name.append("S" + str(idx - 13))
+ feature_name.append("label")
+ yield zip(feature_name, [dense_feature] + sparse_feature + [label])
+
+ return reader
+```
+
+
+### Debugging the Reader
+
+On Linux, `Dataset` mode is started by default; on Windows/Mac, `Dataloader` mode is the default.
+
+Turn on debug mode by adding or changing `reader_debug_mode: True` in `config.yaml`. In debug mode, only the reader part of the network is run: 10 samples are read and printed, so you can check whether the format matches expectations and catch hidden bugs.
+```yaml
+reader:
+ batch_size: 2
+ class: "{workspace}/../criteo_reader.py"
+ train_data_path: "{workspace}/data/train"
+ reader_debug_mode: True
+```
+
+After the change, run the modified yaml file with paddlerec.run and observe the output.
+```bash
+python -m paddlerec.run -m ./models/rank/dnn/config.yaml -e single
+```
+
+### Debugging with Dataset
+
+The data format output by dataset is as follows:
+` dense_input:size ; dense_input:value ; sparse_input:size ; sparse_input:value ; ... ; sparse_input:size ; sparse_input:value ; label:size ; label:value `
+
+The basic pattern: for each variable, its size is printed first, followed by its values.
+
+The expected output of debugging `criteo_reader` directly (a snippet):
+```bash
+...
+13 0.0 0.00497512437811 0.05 0.08 0.207421875 0.028 0.35 0.08 0.082 0.0 0.4 0.0 0.08 1 737395 1 210498 1 903564 1 286224 1 286835 1 906818 1 90
+6116 1 67180 1 27346 1 51086 1 142177 1 95024 1 157883 1 873363 1 600281 1 812592 1 228085 1 35900 1 880474 1 984402 1 100885 1 26235 1 410878 1 798162 1 499868 1 306163 1 0
+...
+```
+As shown, the 13-dimensional dense features come first, then the individual sparse features, and finally the 1-dimensional label with value 0. The output matches expectations.
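
As a rough illustration of this size-then-values layout, a minimal sketch (a hypothetical helper, not PaddleRec's actual serializer):

```python
# Serialize one sample in the "size first, then values" text layout
# described above. Hypothetical helper, for illustration only.
def serialize_sample(feature_pairs):
    """feature_pairs: list of (name, values) tuples; values is a list."""
    parts = []
    for _, values in feature_pairs:
        parts.append(str(len(values)))        # size of the variable
        parts.extend(str(v) for v in values)  # then its concrete values
    return " ".join(parts)

sample = [("D", [0.05, 0.08]), ("S1", [737395]), ("label", [0])]
print(serialize_sample(sample))  # 2 0.05 0.08 1 737395 1 0
```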
+
+>Notes on using Dataset
+> - How Dataset works: data is printed to a buffer and then read by C++ code. Do not add prints unrelated to data reading in the dataset code, or the C++ side will receive corrupted data.
+> - Dataset currently only works in standard Linux environments such as `Ubuntu` and `CentOS`; using it on `Windows` or `Mac` causes unexpected errors.
+
+### Debugging with DataLoader
+
+The dataloader output format is `list: [ list[var_1], list[var_2], ... , list[var_n] ]`. Each sample is placed in a **list[list]**, where list[0] is the first variable.
+
+The expected output of debugging `criteo_reader` directly (a snippet):
+```bash
+...
+[[0.0, 0.004975124378109453, 0.05, 0.08, 0.207421875, 0.028, 0.35, 0.08, 0.082, 0.0, 0.4, 0.0, 0.08], [560746], [902436], [262029], [182633], [368411], [735166], [321120], [39572], [185732], [140298], [926671], [81559], [461249], [728372], [915018], [907965], [818961], [850958], [311492], [980340], [254960], [175041], [524857], [764893], [526288], [220126], [0]]
+...
+```
+As shown, the first list holds the 13-dimensional dense features, followed by the sparse features, each in its own list, and finally the 1-dimensional label list with value 0. The output matches expectations.
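
A minimal sketch of how one yielded (name, values) sample maps onto this nested-list form (a hypothetical helper, not PaddleRec's code):

```python
# Each variable's values become one inner list, kept in declaration order.
def to_dataloader_format(feature_pairs):
    return [list(values) for _, values in feature_pairs]

sample = [("D", [0.0, 0.05]), ("S1", [560746]), ("label", [0])]
print(to_dataloader_format(sample))  # [[0.0, 0.05], [560746], [0]]
```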
diff --git a/doc/design.md b/doc/design.md
new file mode 100644
index 0000000000000000000000000000000000000000..a442bd16a25301178538f482cd537a4ca23bc395
--- /dev/null
+++ b/doc/design.md
@@ -0,0 +1,282 @@
+# PaddleRec design
+
+
+## Overview of the PaddleRec design
+PaddleRec abstracts the training and inference workflow of recommendation models into five major modules:
+
+* [Engine: workflow execution engine](#engine)
+* [Trainer: concrete workflow definition](#trainer)
+* [Model: network definition](#model)
+* [Reader: data reading definition](#reader)
+* [Metric: metric printing](#metric)
+
+The module hierarchy, and the call relationships during one-click training, are shown below:
+
+
+
+
+
+The file structure of core is shown below; the following sections introduce each module.
+```
+.core
+├── engine/      execution engine implementations
+├── metrics/     global metric implementations
+├── modules/     custom op implementations
+├── trainers/    workflow implementations
+├── utils/       helper utilities
+├── factory.py   workflow registration
+├── layer.py     custom op base class
+├── metric.py    Metric base class
+├── model.py     Model base class
+├── reader.py    Reader base class
+└── trainer.py   Trainer base class
+```
+
+
+## Engine
+
+Engine is the execution engine for the whole training run. It is independent of the network logic and the data, and depends only on the current run mode, environment, and device.
+
+Run modes:
+- single-machine
+- distributed
+- locally simulated distributed
+
+Run environments:
+- Linux
+- Windows
+- Mac
+
+Run devices:
+- CPU
+- GPU
+- AI accelerators
+
+When the user runs `python -m paddlerec.run`, the appropriate execution engine is first chosen from the configuration in the `yaml` file. The following code is in [run.py](../run.py):
+```python
+engine_registry()
+which_engine = get_engine(args)
+engine = which_engine(args)
+engine.run()
+```
+
+Taking the `single engine` as an example, an overview of engine behavior:
+```python
+def single_engine(args):
+ trainer = get_trainer_prefix(args) + "SingleTrainer"
+ single_envs = {}
+ single_envs["train.trainer.trainer"] = trainer
+ single_envs["train.trainer.threads"] = "2"
+ single_envs["train.trainer.engine"] = "single"
+ single_envs["train.trainer.device"] = args.device
+ single_envs["train.trainer.platform"] = envs.get_platform()
+ print("use {} engine to run model: {}".format(trainer, args.model))
+
+ set_runtime_envs(single_envs, args.model)
+ trainer = TrainerFactory.create(args.model)
+ return trainer
+```
+After single_engine is called, it does two main things:
+
+1. Sets **environment variables for the current process** according to the `yaml` config file; all subsequent steps depend on these environment variables.
+2. Selects and initializes the `Trainer` that runs the workflow, based on the model and environment.
+
+Elaborating on the first step:
+- The locally simulated distributed engine adds extra environment variables on top of the single-machine ones, e.g. assigning each process a distinct communication port and ID, and finally launches multiple `Trainer`s to simulate distributed training locally.
+- The distributed engine, on top of the single-machine variables, packages, uploads, and submits the distributed job according to the script or config file given by the `-b --backend` argument. The script format depends on the target cluster (MPI/K8S/PaddleCloud, etc.); users can customize the distributed launch logic.
+
+For a custom Engine implementation, see [local_cluster.py](../core/engine/local_cluster.py)
+
+## Trainer
+
+`Trainer` is the concrete implementation of the training and inference workflow. It runs the stages defined in the model and is closely tied to model, reader, and metric. PaddleRec defines the training stages with finite-state-machine logic; different Trainer subclasses implement the special needs of each stage. The state machine's flow is registered in `def processor_register()`.
+
+Taking SingleTrainer as an example, an overview of Trainer behavior:
+
+```python
+class SingleTrainer(TranspileTrainer):
+ def processor_register(self):
+ self.regist_context_processor('uninit', self.instance)
+ self.regist_context_processor('init_pass', self.init)
+ self.regist_context_processor('startup_pass', self.startup)
+ if envs.get_platform() == "LINUX" and envs.get_global_env("dataset_class", None, "train.reader") != "DataLoader":
+ self.regist_context_processor('train_pass', self.dataset_train)
+ else:
+ self.regist_context_processor('train_pass', self.dataloader_train)
+
+ self.regist_context_processor('infer_pass', self.infer)
+ self.regist_context_processor('terminal_pass', self.terminal)
+```
+
+SingleTrainer first registers the steps needed to complete the task. Steps are added, in registration order, to a dict named `status_processor` in the `Trainer` base class; the execution order is controlled by changing `context['status']` inside each step, which selects the step to run next.
+
+SingleTrainer defines the following 6 steps:
+1. uninit: runs first by default; selects the model object via environment variables
+2. init_pass: calls the model's interface to build the network and initialize the fetch and metric variables
+3. startup_pass: initializes all network parameters via run(fluid.default_startup_program)
+4. train_pass: trains with `dataset` or `dataloader` depending on the environment
+5. infer_pass: after training, evaluates the saved models on the test set
+6. terminal_pass: prints global variables, prediction results, and other custom information
+
+For a custom Trainer implementation, see [single_trainer.py](../core/trainers/single_trainer.py)
+
+## Model
+
+Model defines the paradigm that each model implementation follows: to be called correctly by the Trainer, a model only needs to inherit the base class, implement its functions, and assign a few member variables.
+
+First, look at some key definitions in the Model base class to get an initial picture of the implementation flow.
+
+```python
+class Model(object):
+ __metaclass__ = abc.ABCMeta
+
+ def __init__(self, config):
+ self._cost = None
+ self._metrics = {}
+ self._data_var = []
+ self._infer_data_var = []
+ self._infer_results = {}
+ self._data_loader = None
+ self._infer_data_loader = None
+ self._fetch_interval = 20
+ self._namespace = "train.model"
+ self._platform = envs.get_platform()
+
+ def get_inputs(self):
+ return self._data_var
+
+ @abc.abstractmethod
+ def train_net(self):
+ pass
+
+ @abc.abstractmethod
+ def infer_net(self):
+ pass
+
+ def get_avg_cost(self):
+ return self._cost
+
+```
+
+Every model must implement `def train_net` and `def infer_net` and assign the `self._data_var` and `self._cost` members, which specify the model's entry points and the overall network logic. For more complex needs, the following interfaces can be overridden individually to implement the required functionality:
+
+```python
+def get_infer_inputs(self):
+ return self._infer_data_var
+
+def get_infer_results(self):
+ return self._infer_results
+
+def get_metrics(self):
+ return self._metrics
+
+def get_fetch_period(self):
+ return self._fetch_interval
+```
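
As a framework-free sketch of this contract (toy stand-ins for paddle variables, not a real network):

```python
# Mimic of the Model contract described above: a subclass implements
# train_net/infer_net and fills self._data_var and self._cost.
class MiniModelBase(object):
    def __init__(self, config=None):
        self._cost = None
        self._data_var = []

    def get_inputs(self):
        return self._data_var

    def get_avg_cost(self):
        return self._cost

class MyModel(MiniModelBase):
    def train_net(self):
        # placeholder names standing in for paddle variables
        self._data_var = ["dense_input", "label"]
        self._cost = 0.0  # stands in for the loss variable

    def infer_net(self):
        pass

m = MyModel()
m.train_net()
print(m.get_inputs())  # ['dense_input', 'label']
```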
+
+For a concrete model implementation, see the dnn example [model.py](../../models/rank/dnn/../../../paddlerec/core/model.py)
+
+
+## Reader
+
+PaddleRec selects the data IO method according to the runtime environment: on Linux, `Dataset` is preferred; on Windows and Mac, `Dataloader`.
+
+
+For an introduction to Dataset, see [DatasetFactory](https://www.paddlepaddle.org.cn/documentation/docs/zh/api_cn/dataset_cn/DatasetFactory_cn.html)
+
+For an introduction to Dataloader, see [asynchronous data reading](https://www.paddlepaddle.org.cn/documentation/docs/zh/advanced_guide/data_preparing/use_py_reader.html)
+
+
+Since both of these efficient IO methods still have a steep learning curve, PaddleRec wraps them at a higher level: the user only implements the per-line processing logic, and PaddleRec's Reader base class does the rest.
+
+First, browse the definition of the Reader base class to get an initial impression:
+
+```python
+class Reader(dg.MultiSlotDataGenerator):
+ __metaclass__ = abc.ABCMeta
+
+ def __init__(self, config):
+ dg.MultiSlotDataGenerator.__init__(self)
+
+ if os.path.isfile(config):
+ with open(config, 'r') as rb:
+ _config = yaml.load(rb.read(), Loader=yaml.FullLoader)
+ else:
+ raise ValueError("reader config only support yaml")
+
+ envs.set_global_envs(_config)
+ envs.update_workspace()
+
+ @abc.abstractmethod
+ def init(self):
+ pass
+
+ @abc.abstractmethod
+ def generate_sample(self, line):
+ pass
+
+```
+
+The user needs to focus on and implement `def init(self)` and `def generate_sample(self, line)`: the former initializes the variables needed for preprocessing, and the latter implements the splitting and processing logic for each input line (a string).
+
+Once these two functions are defined and your Reader is complete, PaddleRec uses
+- [dataset_instance.py](../core/utils/dataset_instance.py)
+- [dataloader_instance.py](../core/utils/dataloader_instance.py)
+
+to build the reader.
+
+For an example of Reader data processing logic, see [criteo_reader.py](../../models/rank/../../paddlerec/models/rank/criteo_reader.py)
+
+
+
+## Metric
+
+Training inevitably involves printing metrics. In single-machine runs this is simple, but in distributed training, single-machine and global metrics often differ greatly, e.g. `auc` and the positive/negative ordering metric `pn`. PaddleRec targets large-scale distributed training, so the metric-printing logic is abstracted into its own module to solve global metric printing in distributed settings.
+
+The Metric base class defines the basic interface, as follows:
+```python
+class Metric(object):
+ __metaclass__ = abc.ABCMeta
+
+ def __init__(self, config):
+ """ init """
+ pass
+
+ @abc.abstractmethod
+ def clear(self, scope, params):
+ """
+ clear current value
+ Args:
+ scope: value container
+            params: extra variables for clearing
+ """
+ pass
+
+ @abc.abstractmethod
+ def calculate(self, scope, params):
+ """
+ calculate result
+ Args:
+ scope: value container
+            params: extra variables for calculation
+ """
+ pass
+
+ @abc.abstractmethod
+ def get_result(self):
+ """
+ Return:
+ result(dict) : calculate result
+ """
+ pass
+
+ @abc.abstractmethod
+ def get_result_to_string(self):
+ """
+ Return:
+ result(string) : calculate result with string format, for output
+ """
+ pass
+```
+
+To compute and output a global metric, inherit and implement the four member functions above. For a concrete example, see [auc_metric.py](../core/metrics/auc_metrics.py)
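
As a toy illustration of the four-method interface (not an actual PaddleRec metric; scope/params are ignored for simplicity):

```python
# Minimal counting metric implementing the Metric interface shape.
class CountMetric(object):
    def __init__(self, config=None):
        self._count = 0

    def clear(self, scope=None, params=None):
        self._count = 0

    def calculate(self, scope=None, params=None):
        self._count += 1

    def get_result(self):
        return {"count": self._count}

    def get_result_to_string(self):
        return "count: {}".format(self._count)

m = CountMetric()
m.calculate(None, None)
print(m.get_result_to_string())  # count: 1
```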
diff --git a/doc/development.md b/doc/development.md
new file mode 100644
index 0000000000000000000000000000000000000000..0c0162ba16349c1b7bd15cce97ec211d2bebb6b3
--- /dev/null
+++ b/doc/development.md
@@ -0,0 +1,163 @@
+# Secondary development
+
+## How to add a custom model
+
+To develop a custom model, inherit the model template base class and implement the three core methods `_init_hyper_parameters`, `input_data`, and `net`,
+
+following the conventions below.
+
+### Inheriting the base class
+
+Inherit ModelBase from `paddlerec.core.model` and name your class `Model`
+
+```python
+from paddlerec.core.model import ModelBase
+
+
+class Model(ModelBase):
+
+    # The constructor need not be overridden explicitly
+    # If you do override it, be sure to call the base class's __init__
+ def __init__(self, config):
+ ModelBase.__init__(self, config)
+        # ModelBase's __init__ calls _init_hyper_parameters()
+
+```
+
+### Initializing hyperparameters
+
+Inherit and implement the `_init_hyper_parameters` method (required). In it you can read hyperparameters from the `yaml` file or perform custom operations, as in the example below.
+
+All envs calls should be made inside `_init_hyper_parameters()`; class members are also best declared and initialized there.
+
+```python
+ def _init_hyper_parameters(self):
+ self.feature_size = envs.get_global_env(
+ "hyper_parameters.feature_size")
+ self.expert_num = envs.get_global_env("hyper_parameters.expert_num")
+ self.gate_num = envs.get_global_env("hyper_parameters.gate_num")
+ self.expert_size = envs.get_global_env("hyper_parameters.expert_size")
+ self.tower_size = envs.get_global_env("hyper_parameters.tower_size")
+```
+
+
+### Defining the data inputs
+Inherit and implement the `input_data` method (optional)
+
+
+#### Using the base class's data reading directly
+
+The default `input_data` in `ModelBase` is implemented as a slot_reader. Configure the `reader.sparse_slot` and `reader.dense_slot` options in `config.yaml` to read data in `slot:feasign` format.
+
+> What are Slot and Feasign?
+>
+> A Slot is a broad feature category in recommendation engineering, e.g. user ID, gender, or age; a Feasign is a concrete value, e.g. 12345, male, or 20 years old.
+>
+> In practice, many feature slots are not single-valued, or are hard to quantify and are discrete and sparse. For example, a user may have three hobbies: gaming/football/gadgets, and each concrete hobby has multiple feature dimensions, so the hobby Slot holds multiple Feasign values.
+>
+> When reading data, PaddleRec supports sparse and variable-length features for each Slot ID, flexibly supporting recommendation model training in all kinds of scenarios.
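
As a rough illustration of the idea (the line format here is hypothetical; see the `sparse_slots` example in the yaml documentation for the real configuration):

```python
# Group the feasigns of one sample by slot; a slot may carry zero, one,
# or many feasigns (variable length).
def parse_slots(line):
    slots = {}
    for token in line.split():
        slot, feasign = token.split(":")
        slots.setdefault(slot, []).append(feasign)
    return slots

print(parse_slots("click:0 6001:123 6001:456 6002:789"))
# {'click': ['0'], '6001': ['123', '456'], '6002': ['789']}
```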
+
+For a usage example, see the `rank.dnn` model.
+
+#### Custom data inputs
+
+
+If you prefer not to use the `slot:feasign` mode, inherit and implement the `input_data` interface, defined as `def input_data(self, is_infer=False, **kwargs)`
+
+Example usage:
+
+```python
+def input_data(self, is_infer=False, **kwargs):
+    user_slot_names = fluid.data(
+ name='user_slot_names',
+ shape=[None, 1],
+ dtype='int64',
+ lod_level=1)
+ item_slot_names = fluid.data(
+ name='item_slot_names',
+ shape=[None, self.item_len],
+ dtype='int64',
+ lod_level=1)
+ lens = fluid.data(name='lens', shape=[None], dtype='int64')
+ labels = fluid.data(
+ name='labels',
+ shape=[None, self.item_len],
+ dtype='int64',
+ lod_level=1)
+
+ train_inputs = [user_slot_names] + [item_slot_names] + [lens] + [labels]
+ infer_inputs = [user_slot_names] + [item_slot_names] + [lens]
+
+ if is_infer:
+ return infer_inputs
+ else:
+ return train_inputs
+```
+
+For more on data reading, see [Custom datasets and Readers](custom_dataset_reader.md)
+
+
+### Defining the network
+
+Inherit and implement the `net` method (required)
+
+- Interface: `def net(self, inputs, is_infer=False)`
+- Build the network with paddle inside this function: implement the forward logic and define the network's Loss and Metrics, using `is_infer` to distinguish the infer network.
+- We strongly recommend that `train` and `infer` share as much code as possible.
+- Name helper functions called from `net` with a leading underscore; they encapsulate structural modules of the network, e.g. `_sparse_embedding_layer(self)`.
+- `inputs` is the output of `def input_data()`. With the `slot_reader` mode, inputs is a placeholder with no actual meaning; obtain the dense and sparse inputs as follows:
+
+ ```python
+ self.sparse_inputs = self._sparse_data_var[1:]
+ self.dense_input = self._dense_data_var[0]
+ self.label_input = self._sparse_data_var[0]
+ ```
+
+See the official models for examples of how to construct `net`.
+
+## How to run a custom model
+
+Note the file paths of `model.py`, `config.yaml`, and the data reader `reader.py`. We recommend placing them in one folder, e.g. `/home/custom_model`, then change the options in `config.yaml`:
+
+1. Change workspace to the folder containing the model files
+```yaml
+workspace: "/home/custom_model"
+```
+
+2. Change the data path and the reader path
+```yaml
+dataset:
+- name: custom_model_train
+  data_path: "{workspace}/data/train" # or "/home/custom_model/data/train"
+  data_converter: "{workspace}/reader.py" # or "/home/custom_model/reader.py"
+```
+
+3. Change the runner's path configuration
+```yaml
+mode: train_runner
+
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 10
+ save_checkpoint_interval: 2
+ save_inference_interval: 5
+ save_checkpoint_path: "{workspace}/increment" # or "/home/custom_model/increment"
+ save_inference_path: "{workspace}/inference" # or "/home/custom_model/inference"
+ print_interval: 10
+
+phase:
+- name: train
+  model: "{workspace}/model.py" # or "/home/custom_model/model.py"
+ dataset_name: custom_model_train
+ thread_num: 1
+```
+
+4. Run the custom model with paddlerec.run
+
+```shell
+python -m paddlerec.run -m /home/custom_model/config.yaml
+```
+
+That's it. Enjoy an efficient workflow for developing recommendation algorithms. If you hit any problems, please open an [issue](https://github.com/PaddlePaddle/PaddleRec/issues) and we will follow up promptly.
diff --git a/doc/distributed_train.md b/doc/distributed_train.md
new file mode 100644
index 0000000000000000000000000000000000000000..339c5a83ffd26f9416a67a02390a11ba4c87c29d
--- /dev/null
+++ b/doc/distributed_train.md
@@ -0,0 +1,9 @@
+# PaddleRec distributed training
+
+## Running PaddleRec distributed
+> Placeholder
+### Locally simulated distributed
+> Placeholder
+
+### Distributed training on a K8S cluster
+> Placeholder
diff --git a/doc/faq.md b/doc/faq.md
new file mode 100644
index 0000000000000000000000000000000000000000..60790140877b6b11add29552e02c0a435da75f87
--- /dev/null
+++ b/doc/faq.md
@@ -0,0 +1,2 @@
+# FAQ
+> Placeholder
diff --git a/doc/imgs/cnn-ckim2014.png b/doc/imgs/cnn-ckim2014.png
new file mode 100644
index 0000000000000000000000000000000000000000..691fd457b7c628a899632b4bbe91c9fe57655c71
Binary files /dev/null and b/doc/imgs/cnn-ckim2014.png differ
diff --git a/doc/imgs/dcn.png b/doc/imgs/dcn.png
new file mode 100644
index 0000000000000000000000000000000000000000..82a77e1743ac4425fbcd5f636b8360ede91258dd
Binary files /dev/null and b/doc/imgs/dcn.png differ
diff --git a/doc/imgs/deepfm.png b/doc/imgs/deepfm.png
new file mode 100644
index 0000000000000000000000000000000000000000..3288a71fdda524c972144b7eeaeeb2fcdd93728f
Binary files /dev/null and b/doc/imgs/deepfm.png differ
diff --git a/doc/imgs/design.png b/doc/imgs/design.png
new file mode 100644
index 0000000000000000000000000000000000000000..740112697e5d6ce82521446ac05336cad33d6324
Binary files /dev/null and b/doc/imgs/design.png differ
diff --git a/doc/imgs/din.png b/doc/imgs/din.png
new file mode 100644
index 0000000000000000000000000000000000000000..3a3e550a4802c6159d06f83759251d36fa7984ca
Binary files /dev/null and b/doc/imgs/din.png differ
diff --git a/doc/imgs/dssm.png b/doc/imgs/dssm.png
new file mode 100644
index 0000000000000000000000000000000000000000..c1c462b825225ce74f9669dc1b301ced8823287d
Binary files /dev/null and b/doc/imgs/dssm.png differ
diff --git a/doc/imgs/esmm.png b/doc/imgs/esmm.png
new file mode 100644
index 0000000000000000000000000000000000000000..73db53d999503832e893016c41917d962a9e3860
Binary files /dev/null and b/doc/imgs/esmm.png differ
diff --git a/doc/imgs/fleet-ps.png b/doc/imgs/fleet-ps.png
new file mode 100644
index 0000000000000000000000000000000000000000..c82141ea9ff43ed670927ed853fd4c766496a949
Binary files /dev/null and b/doc/imgs/fleet-ps.png differ
diff --git a/doc/imgs/gnn.png b/doc/imgs/gnn.png
new file mode 100644
index 0000000000000000000000000000000000000000..8a5e111e92853a92cd04df5de1c347f85afde56f
Binary files /dev/null and b/doc/imgs/gnn.png differ
diff --git a/doc/imgs/gru4rec.png b/doc/imgs/gru4rec.png
new file mode 100644
index 0000000000000000000000000000000000000000..ab7f6074b15ee85ce6f24e19ad74f3c16a72559d
Binary files /dev/null and b/doc/imgs/gru4rec.png differ
diff --git a/doc/imgs/listwise.png b/doc/imgs/listwise.png
new file mode 100644
index 0000000000000000000000000000000000000000..88e79fe4052273f349e707d14e2e0647c63caa03
Binary files /dev/null and b/doc/imgs/listwise.png differ
diff --git a/doc/imgs/logo.png b/doc/imgs/logo.png
old mode 100755
new mode 100644
index 4b3d1e888a5009fd86b6acc74e09e58067488c6e..913a5cb7a47ffc093b4d9b7ee620d18bc97ef0d9
Binary files a/doc/imgs/logo.png and b/doc/imgs/logo.png differ
diff --git a/doc/imgs/mmoe.png b/doc/imgs/mmoe.png
new file mode 100644
index 0000000000000000000000000000000000000000..5a64982d84c0897f051d2f5329a71e958d6cc16d
Binary files /dev/null and b/doc/imgs/mmoe.png differ
diff --git a/doc/imgs/multiview-simnet.png b/doc/imgs/multiview-simnet.png
new file mode 100644
index 0000000000000000000000000000000000000000..01c6974d6fb60b317b741669a41f2eecd949ca57
Binary files /dev/null and b/doc/imgs/multiview-simnet.png differ
diff --git a/doc/imgs/ncf.png b/doc/imgs/ncf.png
new file mode 100644
index 0000000000000000000000000000000000000000..2691ed9f851a3e1e4d7c22ac3bd6a49fe7f01b54
Binary files /dev/null and b/doc/imgs/ncf.png differ
diff --git a/doc/imgs/overview.png b/doc/imgs/overview.png
new file mode 100644
index 0000000000000000000000000000000000000000..83341cb3b96a257117f07e452993911277823f80
Binary files /dev/null and b/doc/imgs/overview.png differ
diff --git a/doc/imgs/ps-overview.png b/doc/imgs/ps-overview.png
new file mode 100644
index 0000000000000000000000000000000000000000..8e9509cb3e2b63bec46b40dcf39d8876ac41500e
Binary files /dev/null and b/doc/imgs/ps-overview.png differ
diff --git a/doc/imgs/rec-overview.png b/doc/imgs/rec-overview.png
new file mode 100644
index 0000000000000000000000000000000000000000..952bc447b7a2b602e9a2348d403092afaee1af08
Binary files /dev/null and b/doc/imgs/rec-overview.png differ
diff --git a/doc/imgs/share-bottom.png b/doc/imgs/share-bottom.png
new file mode 100644
index 0000000000000000000000000000000000000000..f33872ffaf8d4ff815ba30e4918c091eb89fd812
Binary files /dev/null and b/doc/imgs/share-bottom.png differ
diff --git a/doc/imgs/ssr.png b/doc/imgs/ssr.png
new file mode 100644
index 0000000000000000000000000000000000000000..03326c1681c56cfaa53ea6e0593b3fdee0c016e5
Binary files /dev/null and b/doc/imgs/ssr.png differ
diff --git a/doc/imgs/structure.png b/doc/imgs/structure.png
new file mode 100644
index 0000000000000000000000000000000000000000..69a6eb4efa3b0b06d299e7b098a5f82a574d909d
Binary files /dev/null and b/doc/imgs/structure.png differ
diff --git a/doc/imgs/tagspace.png b/doc/imgs/tagspace.png
new file mode 100644
index 0000000000000000000000000000000000000000..7f64fd6d4029ee1ad56b7778817735cee321f1c7
Binary files /dev/null and b/doc/imgs/tagspace.png differ
diff --git a/doc/imgs/wide&deep.png b/doc/imgs/wide&deep.png
new file mode 100644
index 0000000000000000000000000000000000000000..d46cef37d771acbedb766f98dabead50ff038b3e
Binary files /dev/null and b/doc/imgs/wide&deep.png differ
diff --git a/doc/imgs/word2vec.png b/doc/imgs/word2vec.png
new file mode 100644
index 0000000000000000000000000000000000000000..947ff6174685fd0ce24632d3c75487aa3d011df3
Binary files /dev/null and b/doc/imgs/word2vec.png differ
diff --git a/doc/imgs/xdeepfm.png b/doc/imgs/xdeepfm.png
new file mode 100644
index 0000000000000000000000000000000000000000..2c2577afbd1c4eb47d583f8aec317d1736aea5f1
Binary files /dev/null and b/doc/imgs/xdeepfm.png differ
diff --git a/doc/imgs/youtube_dnn.png b/doc/imgs/youtube_dnn.png
new file mode 100644
index 0000000000000000000000000000000000000000..e7480d80786ca6034ec61856effe5975ad5f72c1
Binary files /dev/null and b/doc/imgs/youtube_dnn.png differ
diff --git a/doc/local_train.md b/doc/local_train.md
new file mode 100644
index 0000000000000000000000000000000000000000..e65255ebf7e14933f52f9977b2ecec48dabbb76e
--- /dev/null
+++ b/doc/local_train.md
@@ -0,0 +1,2 @@
+# PaddleRec single-machine training
+> Placeholder
diff --git a/doc/model_list.md b/doc/model_list.md
new file mode 100644
index 0000000000000000000000000000000000000000..b46687a60475fbd309f01050194510b21b060f17
--- /dev/null
+++ b/doc/model_list.md
@@ -0,0 +1,14 @@
+# Supported models
+| Category | Model | Single-node CPU | Single-node GPU | Distributed CPU | Large-scale sparse | Distributed GPU | Custom dataset |
+| :------: | :--------------------: | :---------: | :---------: | :-----------: | :--------: | :-----------: | :----------: |
+| Content understanding | [Text-Classification]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Content understanding | [TagSpace]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Recall | [Word2Vec]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Recall | [TDM]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Ranking | [CTR-Dnn]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Ranking | [DeepFm]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Ranking | [ListWise]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Multi-task | [MMOE]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Multi-task | [ESMM]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Matching | [DSSM]() | ✓ | x | ✓ | x | ✓ | ✓ |
+| Matching | [Multiview-Simnet]() | ✓ | x | ✓ | x | ✓ | ✓ |
diff --git a/doc/optimization_model.md b/doc/optimization_model.md
new file mode 100644
index 0000000000000000000000000000000000000000..e63f45b62b50db55f1c6c0d48c7ca23b016b74d3
--- /dev/null
+++ b/doc/optimization_model.md
@@ -0,0 +1,2 @@
+# PaddleRec model tuning
+> Placeholder
diff --git a/doc/predict.md b/doc/predict.md
new file mode 100644
index 0000000000000000000000000000000000000000..07160e1f0e7563276c33e514d006dd3747492f90
--- /dev/null
+++ b/doc/predict.md
@@ -0,0 +1 @@
+# PaddleRec offline inference
diff --git a/doc/ps_background.md b/doc/ps_background.md
new file mode 100644
index 0000000000000000000000000000000000000000..e5f2e320940763986351fefd21a5e1f1363b6104
--- /dev/null
+++ b/doc/ps_background.md
@@ -0,0 +1,7 @@
+## [Overview of distributed training](https://www.paddlepaddle.org.cn/tutorials/projectdetail/459124)
+
+
+## [Multi-node multi-GPU training](https://www.paddlepaddle.org.cn/tutorials/projectdetail/459127)
+
+
+## [Parameter-server training](https://www.paddlepaddle.org.cn/tutorials/projectdetail/464839)
diff --git a/doc/rec_background.md b/doc/rec_background.md
new file mode 100644
index 0000000000000000000000000000000000000000..b6cf8b1c360165033f5640a775f3c8721754f4a3
--- /dev/null
+++ b/doc/rec_background.md
@@ -0,0 +1,64 @@
+# Recommender system background
+This article is excerpted from [Personalized Recommendation](https://github.com/PaddlePaddle/book/blob/develop/05.recommender_system/README.cn.md).
+
+The code for this article is in [book/recommender_system](https://github.com/PaddlePaddle/book/tree/develop/05.recommender_system); for first-time use, please see the [Book usage guide](https://github.com/PaddlePaddle/book/blob/develop/README.cn.md#运行这本书).
+
+For more tutorials and background, see [Deep learning in practice: personalized recommendation](https://www.paddlepaddle.org.cn/tutorials/projectdetail/443958)
+
+## Background
+
+As network technology develops and e-commerce keeps growing, the number and variety of goods grow rapidly, and users need to spend a great deal of time finding what they want to buy. This is the information overload problem, and personalized recommender systems were born to address it.
+
+A personalized recommender system is a subset of information filtering systems and can be applied in many domains, such as movies, music, e-commerce, and feed recommendation. It analyzes and mines user behavior to discover each user's needs and interests, and recommends information or goods the user may be interested in. Unlike a search engine, a recommender system does not require users to describe their needs precisely; instead it models users' historical behavior and proactively provides information that satisfies their interests and needs.
+
+The GroupLens system [[1](#references)], introduced by the University of Minnesota in 1994, is generally considered the milestone at which personalized recommendation became a relatively independent research field. It first proposed completing the recommendation task via collaborative filtering, which then led the field's development for more than a decade.
+
+Traditional approaches to personalized recommendation mainly include:
+
+- Collaborative Filtering Recommendation: one of the most widely applied techniques; it requires collecting and analyzing users' historical behavior, activities, and preferences. It is usually divided into two subclasses: user-based [[1](#references)] and item-based [[2](#references)] recommendation. A key advantage is that it does not rely on machines to analyze item content features, so it can accurately recommend complex items such as movies without understanding the items themselves. Its drawbacks are the cold-start problem for new users with no behavior, and the sparsity problem caused by insufficient user-item interaction data. Notably, contextual information such as social networks [[3](#references)] or geographic location can be incorporated into collaborative filtering.
+- Content-based Filtering Recommendation [[4](#references)]: this method uses item content descriptions to extract meaningful features, and recommends by computing the similarity between a user's interests and item descriptions. It is simple and direct, measuring item similarity by item attributes rather than other users' ratings, and recommending items similar to those the user likes; the drawback is the same cold-start problem for new users without any behavior.
+- Hybrid Recommendation [[5](#references)]: combines different inputs and techniques to offset the weaknesses of the individual approaches.
+
+In recent years deep learning has achieved great success in many fields, and both academia and industry are applying it to recommender systems. Deep learning excels at automatic feature extraction, can learn multi-level abstract feature representations, and can learn from heterogeneous or cross-domain content, which helps mitigate the cold-start problem to some extent [[6](#references)].
+
+### YouTube's deep neural network recommender system
+
+YouTube is the world's largest site for uploading, sharing, and discovering videos. Its recommender system serves personalized content to more than a billion users from an ever-growing video library. The system consists of two neural networks: a candidate generation network and a ranking network. The candidate generation network produces a few hundred candidates from a library of millions of videos, and the ranking network scores and sorts the candidates, outputting the few dozen top-ranked results. The system structure is shown in Figure 1:
+
+
+Figure 1. Structure of the YouTube recommender system
+
+
+#### Candidate Generation Network
+
+The candidate generation network models recommendation as a multiclass classification problem with an extremely large number of classes: for a YouTube user, using watch history (video IDs), search tokens, demographic information (e.g. geographic location, login device), binary features (e.g. gender, logged-in or not), and continuous features (e.g. age), it classifies over all videos in the library, obtaining a classification result for each class (i.e. a recommendation probability for each video), and finally outputs the few hundred videos with the highest probabilities.
+
+First, historical information such as watch history and search tokens is mapped to vectors and averaged to obtain fixed-length representations; demographic features are also fed in to improve recommendations for new users, and binary and continuous features are normalized to [0, 1]. Next, all feature representations are concatenated into one vector and fed into a nonlinear multilayer perceptron (MLP, see the [digit recognition](https://github.com/PaddlePaddle/book/blob/develop/02.recognize_digits/README.cn.md) tutorial). Finally, during training the MLP output is fed to a softmax for classification, while at inference time the similarity between the user's combined features (the MLP output) and all videos is computed, and the top K are taken as the candidate generation network's output.
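
The final retrieval step, scoring every video against the user vector and keeping the top K, can be sketched as follows (illustrative only, with made-up dimensions and random data):

```python
import random

random.seed(0)
# user vector u (standing in for the MLP output) and a small "video
# library" of vectors; dimensions and data are made up for illustration.
dim, num_videos, k = 8, 100, 5
u = [random.gauss(0, 1) for _ in range(dim)]
videos = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(num_videos)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# score every video by inner product with the user vector, keep the top K
scores = [(dot(u, v), idx) for idx, v in enumerate(videos)]
top_k = sorted(scores, reverse=True)[:k]
print([idx for _, idx in top_k])
```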
+
+#### Ranking Network
+The ranking network is structurally similar to the candidate generation network, but its goal is to score and rank the candidates more finely. As in feature engineering for traditional ad ranking, it constructs many features relevant to video ranking (e.g. video ID, time of last watch). These features are processed much as in the candidate generation network, except that the top of the ranking network is a weighted logistic regression that scores every candidate video; the videos are sorted from high to low, and the higher-scoring ones are returned to the user.
+
+### Fusion recommendation model
+This section uses Convolutional Neural Networks to learn representations of movie titles. Below we introduce the text convolutional neural network, followed by the fusion recommendation model.
+
+#### Text Convolutional Neural Network (CNN)
+
+Convolutional neural networks are often used on data with a grid-like topology. For example, an image can be viewed as a 2-D grid of pixels, and natural language as a 1-D word sequence. CNNs can extract a variety of local features and combine and abstract them into higher-level representations. Experiments show that CNNs model image and text problems efficiently.
+
+A convolutional neural network consists mainly of convolution and pooling operations, which can be applied and combined in flexible and varied ways.
+
+
+
+## References
+
+1. P. Resnick, N. Iacovou, etc. “[GroupLens: An Open Architecture for Collaborative Filtering of Netnews](http://ccs.mit.edu/papers/CCSWP165.html)”, Proceedings of ACM Conference on Computer Supported Cooperative Work, CSCW 1994. pp.175-186.
+2. Sarwar, Badrul, et al. "[Item-based collaborative filtering recommendation algorithms.](http://files.grouplens.org/papers/www10_sarwar.pdf)" *Proceedings of the 10th international conference on World Wide Web*. ACM, 2001.
+3. Kautz, Henry, Bart Selman, and Mehul Shah. "[Referral Web: combining social networks and collaborative filtering.](http://www.cs.cornell.edu/selman/papers/pdf/97.cacm.refweb.pdf)" Communications of the ACM 40.3 (1997): 63-65. APA
+4. [Peter Brusilovsky](https://en.wikipedia.org/wiki/Peter_Brusilovsky) (2007). *The Adaptive Web*. p. 325.
+5. Robin Burke , [Hybrid Web Recommender Systems](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.435.7538&rep=rep1&type=pdf), pp. 377-408, The Adaptive Web, Peter Brusilovsky, Alfred Kobsa, Wolfgang Nejdl (Ed.), Lecture Notes in Computer Science, Springer-Verlag, Berlin, Germany, Lecture Notes in Computer Science, Vol. 4321, May 2007, 978-3-540-72078-2.
+6. Yuan, Jianbo, et al. ["Solving Cold-Start Problem in Large-scale Recommendation Engines: A Deep Learning Approach."](https://arxiv.org/pdf/1611.05480v1.pdf) *arXiv preprint arXiv:1611.05480* (2016).
+
+
+
+This tutorial was created by PaddlePaddle and is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License.
diff --git a/doc/yaml.md b/doc/yaml.md
new file mode 100644
index 0000000000000000000000000000000000000000..3dc9c4a5c2624acd8f2b1b9828f89183a256840a
--- /dev/null
+++ b/doc/yaml.md
@@ -0,0 +1,77 @@
+```yaml
+# Global settings
+# Debug switch; in debug mode, op timings and IO ratios are printed
+debug: false
+
+# Workspace directory
+# If a folder path is given, the hyperparameter config, network, data, and other required files are looked up there
+workspace: "/home/demo_model/"
+# If workspace: paddlerec.models.rank.dnn
+# the official default config and network are used
+
+
+# Multiple datasets (data reading configs) can be specified
+# Different stages of a run can use different datasets
+dataset:
+  # DataLoader example
+ - name: dataset_1
+ type: DataLoader
+ batch_size: 5
+ data_path: "{workspace}/data/train"
+    # path to the custom reader.py
+ data_converter: "{workspace}/rsc15_reader.py"
+
+ # QueueDataset 示例
+  # QueueDataset example
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/train"
+    # with sparse_slots and dense_slots configured, no data_converter is needed; the default reader is used
+ sparse_slots: "click ins_weight 6001 6002 6003 6005 6006 6007 6008 6009"
+ dense_slots: "readlist:9"
+
+
+# Custom hyperparameters: mainly the model hyperparameters and the optimizer
+hyper_parameters:
+  # optimizer
+  optimizer:
+    class: Adam # configure the optimizer directly; sgd/Adam/AdaGrad are currently supported
+    learning_rate: 0.001
+    strategy: "{workspace}/conf/config_fleet.py" # config specific to the large-scale sparse pslib mode
+  # model hyperparameters
+ vocab_size: 1000
+ hid_size: 100
+
+
+# The global parameter mode selects which runner to run
+mode: runner_1
+
+# A runner defines the execution environment: single-machine/distributed, CPU/GPU, epochs, and model load/save paths
+runner:
+  - name: runner_1 # a runner for single-machine training
+    class: single_train # run-mode choice; alternatives: single_infer/local_cluster_train/cluster_train
+ epochs: 10
+ device: cpu
+ init_model_path: ""
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+    # model save path settings below
+ save_checkpoint_path: "xxxx"
+ save_inference_path: "xxxx"
+
+  - name: runner_2 # a runner for single-machine inference
+ class: single_infer
+ epochs: 1
+ device: cpu
+ init_model_path: "afs:/xxx/xxx"
+
+
+# Training may involve multiple phases, each possibly with a different network and data reading
+# Every runner runs all phases in full
+# phase specifies the model and reader loaded at run time
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+  dataset_name: dataset_1
+ thread_num: 1
+```
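
Path values such as `"{workspace}/data/train"` above are resolved against the configured workspace. Conceptually (a hypothetical helper, not PaddleRec's actual code):

```python
# Substitute the {workspace} placeholder in a config value.
def expand_workspace(value, workspace):
    return value.replace("{workspace}", workspace.rstrip("/"))

print(expand_workspace("{workspace}/data/train", "/home/demo_model/"))
# /home/demo_model/data/train
```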
diff --git a/fleet_rec/check.py b/fleet_rec/check.py
deleted file mode 100755
index d4177fd8516e4f210d1897601fb88f4b11773875..0000000000000000000000000000000000000000
--- a/fleet_rec/check.py
+++ /dev/null
@@ -1,8 +0,0 @@
-import argparse
-
-if __name__ == "__main__":
- parser = argparse.ArgumentParser(description='fleet-rec check')
- parser.add_argument("--model", type=str)
- parser.add_argument("--engine", type=str)
-
- print("coming soon")
diff --git a/fleet_rec/core/__init__.py b/fleet_rec/core/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/fleet_rec/core/engine/__init__.py b/fleet_rec/core/engine/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/fleet_rec/core/engine/engine.py b/fleet_rec/core/engine/engine.py
deleted file mode 100755
index 66c43bca3f2121531e8477814d87adc36e6c6096..0000000000000000000000000000000000000000
--- a/fleet_rec/core/engine/engine.py
+++ /dev/null
@@ -1,14 +0,0 @@
-import abc
-
-
-class Engine:
- __metaclass__ = abc.ABCMeta
-
- def __init__(self, envs, trainer):
- self.envs = envs
- self.trainer = trainer
-
- @abc.abstractmethod
- def run(self):
- pass
-
diff --git a/fleet_rec/core/layer.py b/fleet_rec/core/layer.py
deleted file mode 100755
index ab06841706400ceae1ab4444ba77bba66ff9924a..0000000000000000000000000000000000000000
--- a/fleet_rec/core/layer.py
+++ /dev/null
@@ -1,47 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import abc
-
-
-class Layer(object):
- """R
- """
- __metaclass__ = abc.ABCMeta
-
- def __init__(self, config):
- """R
- """
- pass
-
- def generate(self, mode, param):
- """R
- """
- if mode == 'fluid':
- return self.generate_fluid(param)
- elif mode == 'tensorflow':
- return self.generate_tensorflow(param)
- print('unsupport this mode: ' + mode)
- return None, None
-
- @abc.abstractmethod
- def generate_fluid(self, param):
- """R
- """
- pass
-
- def generate_tensorflow(self, param):
- """ Not implement currently
- """
- pass
diff --git a/fleet_rec/core/model.py b/fleet_rec/core/model.py
deleted file mode 100755
index 141c15777ad15b4837cca1f6f4a43e5696a7b05f..0000000000000000000000000000000000000000
--- a/fleet_rec/core/model.py
+++ /dev/null
@@ -1,93 +0,0 @@
-import abc
-
-import paddle.fluid as fluid
-
-from fleetrec.core.utils import envs
-
-
-class Model(object):
- """R
- """
- __metaclass__ = abc.ABCMeta
-
- def __init__(self, config):
- """R
- """
- self._cost = None
- self._metrics = {}
- self._data_var = []
- self._infer_data_var = []
- self._infer_results = {}
- self._data_loader = None
- self._infer_data_loader = None
- self._fetch_interval = 20
- self._namespace = "train.model"
- self._platform = envs.get_platform()
-
- def get_inputs(self):
- return self._data_var
-
- def get_infer_inputs(self):
- return self._infer_data_var
-
- def get_infer_results(self):
- return self._infer_results
-
- def get_cost_op(self):
- """R
- """
- return self._cost
-
- def get_metrics(self):
- """R
- """
- return self._metrics
-
- def custom_preprocess(self):
- """
- do something after exe.run(stratup_program) and before run()
- """
- pass
-
- def get_fetch_period(self):
- return self._fetch_interval
-
- def _build_optimizer(self, name, lr):
- name = name.upper()
- optimizers = ["SGD", "ADAM", "ADAGRAD"]
- if name not in optimizers:
- raise ValueError(
- "configured optimizer can only supported SGD/Adam/Adagrad")
-
- if name == "SGD":
- reg = envs.get_global_env(
- "hyper_parameters.reg", 0.0001, self._namespace)
- optimizer_i = fluid.optimizer.SGD(
- lr, regularization=fluid.regularizer.L2DecayRegularizer(reg))
- elif name == "ADAM":
- optimizer_i = fluid.optimizer.Adam(lr, lazy_mode=True)
- elif name == "ADAGRAD":
- optimizer_i = fluid.optimizer.Adagrad(lr)
- else:
- raise ValueError(
- "configured optimizer can only supported SGD/Adam/Adagrad")
-
- return optimizer_i
-
- def optimizer(self):
- learning_rate = envs.get_global_env(
- "hyper_parameters.learning_rate", None, self._namespace)
- optimizer = envs.get_global_env(
- "hyper_parameters.optimizer", None, self._namespace)
- print(">>>>>>>>>>>.learnig rate: %s" % learning_rate)
- return self._build_optimizer(optimizer, learning_rate)
-
- @abc.abstractmethod
- def train_net(self):
- """R
- """
- pass
-
- @abc.abstractmethod
- def infer_net(self):
- pass
diff --git a/fleet_rec/core/modules/__init__.py b/fleet_rec/core/modules/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/fleet_rec/core/modules/coding/__init__.py b/fleet_rec/core/modules/coding/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/fleet_rec/core/modules/coding/layers.py b/fleet_rec/core/modules/coding/layers.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/fleet_rec/core/modules/modul/__init__.py b/fleet_rec/core/modules/modul/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/fleet_rec/core/trainers/__init__.py b/fleet_rec/core/trainers/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/fleet_rec/core/trainers/ctr_coding_trainer.py b/fleet_rec/core/trainers/ctr_coding_trainer.py
deleted file mode 100755
index fdfdd1ecb2f40747cc8cb61e3f8ce0e3c1c19d4c..0000000000000000000000000000000000000000
--- a/fleet_rec/core/trainers/ctr_coding_trainer.py
+++ /dev/null
@@ -1,136 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-import os
-import numpy as np
-
-import paddle.fluid as fluid
-from paddle.fluid.incubate.fleet.parameter_server.pslib import fleet
-from paddle.fluid.incubate.fleet.base.role_maker import MPISymetricRoleMaker
-
-from fleetrec.core.utils import envs
-from fleetrec.core.trainer import Trainer
-
-
-class CtrPaddleTrainer(Trainer):
- """R
- """
-
- def __init__(self, config):
- """R
- """
- Trainer.__init__(self, config)
-
- self.global_config = config
- self._metrics = {}
- self.processor_register()
-
- def processor_register(self):
- role = MPISymetricRoleMaker()
- fleet.init(role)
-
- if fleet.is_server():
- self.regist_context_processor('uninit', self.instance)
- self.regist_context_processor('init_pass', self.init)
- self.regist_context_processor('server_pass', self.server)
- else:
- self.regist_context_processor('uninit', self.instance)
- self.regist_context_processor('init_pass', self.init)
- self.regist_context_processor('train_pass', self.train)
- self.regist_context_processor('terminal_pass', self.terminal)
-
- def _get_dataset(self):
- namespace = "train.reader"
-
- inputs = self.model.get_inputs()
- threads = envs.get_global_env("train.threads", None)
- batch_size = envs.get_global_env("batch_size", None, namespace)
- reader_class = envs.get_global_env("class", None, namespace)
- abs_dir = os.path.dirname(os.path.abspath(__file__))
- reader = os.path.join(abs_dir, '../utils', 'dataset_instance.py')
- pipe_cmd = "python {} {} {} {}".format(reader, reader_class, "TRAIN", self._config_yaml)
- train_data_path = envs.get_global_env("train_data_path", None, namespace)
-
- dataset = fluid.DatasetFactory().create_dataset()
- dataset.set_use_var(inputs)
- dataset.set_pipe_command(pipe_cmd)
- dataset.set_batch_size(batch_size)
- dataset.set_thread(threads)
- file_list = [
- os.path.join(train_data_path, x)
- for x in os.listdir(train_data_path)
- ]
-
- dataset.set_filelist(file_list)
- return dataset
-
- def instance(self, context):
- models = envs.get_global_env("train.model.models")
- model_class = envs.lazy_instance_by_fliename(models, "Model")
- self.model = model_class(None)
- context['status'] = 'init_pass'
-
- def init(self, context):
- """R
- """
- self.model.train_net()
- optimizer = self.model.optimizer()
-
- optimizer = fleet.distributed_optimizer(optimizer, strategy={"use_cvm": False})
- optimizer.minimize(self.model.get_cost_op())
-
- if fleet.is_server():
- context['status'] = 'server_pass'
- else:
- self.fetch_vars = []
- self.fetch_alias = []
- self.fetch_period = self.model.get_fetch_period()
-
- metrics = self.model.get_metrics()
- if metrics:
- self.fetch_vars = metrics.values()
- self.fetch_alias = metrics.keys()
- context['status'] = 'train_pass'
-
- def server(self, context):
- fleet.run_server()
- fleet.stop_worker()
- context['is_exit'] = True
-
- def train(self, context):
- self._exe.run(fluid.default_startup_program())
- fleet.init_worker()
-
- dataset = self._get_dataset()
-
- shuf = np.array([fleet.worker_index()])
- gs = shuf * 0
- fleet._role_maker._node_type_comm.Allreduce(shuf, gs)
-
- print("trainer id: {}, trainers: {}, gs: {}".format(fleet.worker_index(), fleet.worker_num(), gs))
-
- epochs = envs.get_global_env("train.epochs")
-
- for i in range(epochs):
- self._exe.train_from_dataset(program=fluid.default_main_program(),
- dataset=dataset,
- fetch_list=self.fetch_vars,
- fetch_info=self.fetch_alias,
- print_period=self.fetch_period)
-
- context['status'] = 'terminal_pass'
- fleet.stop_worker()
-
- def terminal(self, context):
- print("terminal ended.")
- context['is_exit'] = True
diff --git a/fleet_rec/core/trainers/ctr_modul_trainer.py b/fleet_rec/core/trainers/ctr_modul_trainer.py
deleted file mode 100755
index e72715e74634791c2c90890bc6b8b8f0a0e56c71..0000000000000000000000000000000000000000
--- a/fleet_rec/core/trainers/ctr_modul_trainer.py
+++ /dev/null
@@ -1,460 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-
-import sys
-import time
-import json
-import datetime
-import numpy as np
-
-import paddle.fluid as fluid
-from paddle.fluid.incubate.fleet.parameter_server.pslib import fleet
-from paddle.fluid.incubate.fleet.base.role_maker import GeneralRoleMaker
-
-
-from fleetrec.core.utils import fs as fs
-from fleetrec.core.utils import util as util
-from fleetrec.core.metrics.auc_metrics import AUCMetric
-from fleetrec.core.modules.modul import build as model_basic
-from fleetrec.core.utils import dataset
-from fleetrec.core.trainer import Trainer
-
-
-def wroker_numric_opt(value, env, opt):
- """
- numric count opt for workers
- Args:
- value: value for count
- env: mpi/gloo
- opt: count operator, SUM/MAX/MIN/AVG
- Return:
- count result
- """
- local_value = np.array([value])
- global_value = np.copy(local_value) * 0
- fleet._role_maker.all_reduce_worker(local_value, global_value, opt)
- return global_value[0]
-
-
-def worker_numric_sum(value, env="mpi"):
- """R
- """
- return wroker_numric_opt(value, env, "sum")
-
-
-def worker_numric_avg(value, env="mpi"):
- """R
- """
- return worker_numric_sum(value, env) / fleet.worker_num()
-
-
-def worker_numric_min(value, env="mpi"):
- """R
- """
- return wroker_numric_opt(value, env, "min")
-
-
-def worker_numric_max(value, env="mpi"):
- """R
- """
- return wroker_numric_opt(value, env, "max")
-
-
-class CtrPaddleTrainer(Trainer):
- """R
- """
-
- def __init__(self, config):
- """R
- """
- Trainer.__init__(self, config)
- config['output_path'] = util.get_absolute_path(
- config['output_path'], config['io']['afs'])
-
- self.global_config = config
- self._metrics = {}
-
- self._path_generator = util.PathGenerator({
- 'templates': [
- {'name': 'xbox_base_done', 'template': config['output_path'] + '/xbox_base_done.txt'},
- {'name': 'xbox_delta_done', 'template': config['output_path'] + '/xbox_patch_done.txt'},
- {'name': 'xbox_base', 'template': config['output_path'] + '/xbox/{day}/base/'},
- {'name': 'xbox_delta', 'template': config['output_path'] + '/xbox/{day}/delta-{pass_id}/'},
- {'name': 'batch_model', 'template': config['output_path'] + '/batch_model/{day}/{pass_id}/'}
- ]
- })
- if 'path_generator' in config:
- self._path_generator.add_path_template(config['path_generator'])
-
- self.regist_context_processor('uninit', self.init)
- self.regist_context_processor('startup', self.startup)
- self.regist_context_processor('begin_day', self.begin_day)
- self.regist_context_processor('train_pass', self.train_pass)
- self.regist_context_processor('end_day', self.end_day)
-
- def init(self, context):
- """R
- """
- role_maker = None
- if self.global_config.get('process_mode', 'mpi') == 'brilliant_cpu':
- afs_config = self.global_config['io']['afs']
- role_maker = GeneralRoleMaker(
- hdfs_name=afs_config['fs_name'], hdfs_ugi=afs_config['fs_ugi'],
- path=self.global_config['output_path'] + "/gloo",
- init_timeout_seconds=1200, run_timeout_seconds=1200)
- fleet.init(role_maker)
- data_var_list = []
- data_var_name_dict = {}
- runnnable_scope = []
- runnnable_cost_op = []
- context['status'] = 'startup'
-
- for executor in self.global_config['executor']:
- scope = fluid.Scope()
- self._exector_context[executor['name']] = {}
- self._exector_context[executor['name']]['scope'] = scope
- self._exector_context[executor['name']]['model'] = model_basic.create(executor)
- model = self._exector_context[executor['name']]['model']
- self._metrics.update(model.get_metrics())
- runnnable_scope.append(scope)
- runnnable_cost_op.append(model.get_cost_op())
- for var in model._data_var:
- if var.name in data_var_name_dict:
- continue
- data_var_list.append(var)
- data_var_name_dict[var.name] = var
-
- optimizer = model_basic.YamlModel.build_optimizer({
- 'metrics': self._metrics,
- 'optimizer_conf': self.global_config['optimizer']
- })
- optimizer.minimize(runnnable_cost_op, runnnable_scope)
- for executor in self.global_config['executor']:
- scope = self._exector_context[executor['name']]['scope']
- model = self._exector_context[executor['name']]['model']
- program = model._build_param['model']['train_program']
- if not executor['is_update_sparse']:
- program._fleet_opt["program_configs"][str(id(model.get_cost_op().block.program))]["push_sparse"] = []
- if 'train_thread_num' not in executor:
- executor['train_thread_num'] = self.global_config['train_thread_num']
- with fluid.scope_guard(scope):
- self._exe.run(model._build_param['model']['startup_program'])
- model.dump_model_program('./')
-
- # server init done
- if fleet.is_server():
- return 0
-
- self._dataset = {}
- for dataset_item in self.global_config['dataset']['data_list']:
- dataset_item['data_vars'] = data_var_list
- dataset_item.update(self.global_config['io']['afs'])
- dataset_item["batch_size"] = self.global_config['batch_size']
- self._dataset[dataset_item['name']] = dataset.FluidTimeSplitDataset(dataset_item)
- # if config.need_reqi_changeslot and config.reqi_dnn_plugin_day >= last_day and config.reqi_dnn_plugin_pass >= last_pass:
- # util.reqi_changeslot(config.hdfs_dnn_plugin_path, join_save_params, common_save_params, update_save_params, scope2, scope3)
- fleet.init_worker()
- pass
-
- def print_log(self, log_str, params):
- """R
- """
- params['index'] = fleet.worker_index()
- if params['master']:
- if fleet.worker_index() == 0:
- print(log_str)
- sys.stdout.flush()
- else:
- print(log_str)
- if 'stdout' in params:
- params['stdout'] += str(datetime.datetime.now()) + log_str
-
- def print_global_metrics(self, scope, model, monitor_data, stdout_str):
- """R
- """
- metrics = model.get_metrics()
- metric_calculator = AUCMetric(None)
- for metric in metrics:
- metric_param = {'label': metric, 'metric_dict': metrics[metric]}
- metric_calculator.calculate(scope, metric_param)
- metric_result = metric_calculator.get_result_to_string()
- self.print_log(metric_result, {'master': True, 'stdout': stdout_str})
- monitor_data += metric_result
- metric_calculator.clear(scope, metric_param)
-
- def save_model(self, day, pass_index, base_key):
- """R
- """
- cost_printer = util.CostPrinter(util.print_cost,
- {'master': True, 'log_format': 'save model cost %s sec'})
- model_path = self._path_generator.generate_path('batch_model', {'day': day, 'pass_id': pass_index})
- save_mode = 0 # just save all
- if pass_index < 1: # batch_model
- save_mode = 3 # unseen_day++, save all
- util.rank0_print("going to save_model %s" % model_path)
- fleet.save_persistables(None, model_path, mode=save_mode)
- if fleet._role_maker.is_first_worker():
- self._train_pass.save_train_progress(day, pass_index, base_key, model_path, is_checkpoint=True)
- cost_printer.done()
- return model_path
-
- def save_xbox_model(self, day, pass_index, xbox_base_key, monitor_data):
- """R
- """
- stdout_str = ""
- xbox_patch_id = str(int(time.time()))
- util.rank0_print("begin save delta model")
-
- model_path = ""
- xbox_model_donefile = ""
- cost_printer = util.CostPrinter(util.print_cost, {'master': True, \
- 'log_format': 'save xbox model cost %s sec',
- 'stdout': stdout_str})
- if pass_index < 1:
- save_mode = 2
- xbox_patch_id = xbox_base_key
- model_path = self._path_generator.generate_path('xbox_base', {'day': day})
- xbox_model_donefile = self._path_generator.generate_path('xbox_base_done', {'day': day})
- else:
- save_mode = 1
- model_path = self._path_generator.generate_path('xbox_delta', {'day': day, 'pass_id': pass_index})
- xbox_model_donefile = self._path_generator.generate_path('xbox_delta_done', {'day': day})
- total_save_num = fleet.save_persistables(None, model_path, mode=save_mode)
- cost_printer.done()
-
- cost_printer = util.CostPrinter(util.print_cost, {'master': True,
- 'log_format': 'save cache model cost %s sec',
- 'stdout': stdout_str})
- model_file_handler = fs.FileHandler(self.global_config['io']['afs'])
- if self.global_config['save_cache_model']:
- cache_save_num = fleet.save_cache_model(None, model_path, mode=save_mode)
- model_file_handler.write(
- "file_prefix:part\npart_num:16\nkey_num:%d\n" % cache_save_num,
- model_path + '/000_cache/sparse_cache.meta', 'w')
- cost_printer.done()
- util.rank0_print("save xbox cache model done, key_num=%s" % cache_save_num)
-
- save_env_param = {
- 'executor': self._exe,
- 'save_combine': True
- }
- cost_printer = util.CostPrinter(util.print_cost, {'master': True,
- 'log_format': 'save dense model cost %s sec',
- 'stdout': stdout_str})
- if fleet._role_maker.is_first_worker():
- for executor in self.global_config['executor']:
- if 'layer_for_inference' not in executor:
- continue
- executor_name = executor['name']
- model = self._exector_context[executor_name]['model']
- save_env_param['inference_list'] = executor['layer_for_inference']
- save_env_param['scope'] = self._exector_context[executor_name]['scope']
- model.dump_inference_param(save_env_param)
- for dnn_layer in executor['layer_for_inference']:
- model_file_handler.cp(dnn_layer['save_file_name'],
- model_path + '/dnn_plugin/' + dnn_layer['save_file_name'])
- fleet._role_maker._barrier_worker()
- cost_printer.done()
-
- xbox_done_info = {
- "id": xbox_patch_id,
- "key": xbox_base_key,
- "ins_path": "",
- "ins_tag": "feasign",
- "partition_type": "2",
- "record_count": "111111",
- "monitor_data": monitor_data,
- "mpi_size": str(fleet.worker_num()),
- "input": model_path.rstrip("/") + "/000",
- "job_id": util.get_env_value("JOB_ID"),
- "job_name": util.get_env_value("JOB_NAME")
- }
- if fleet._role_maker.is_first_worker():
- model_file_handler.write(json.dumps(xbox_done_info) + "\n", xbox_model_donefile, 'a')
- if pass_index > 0:
- self._train_pass.save_train_progress(day, pass_index, xbox_base_key, model_path, is_checkpoint=False)
- fleet._role_maker._barrier_worker()
- return stdout_str
-
- def run_executor(self, executor_config, dataset, stdout_str):
- """R
- """
- day = self._train_pass.date()
- pass_id = self._train_pass._pass_id
- xbox_base_key = self._train_pass._base_key
- executor_name = executor_config['name']
- scope = self._exector_context[executor_name]['scope']
- model = self._exector_context[executor_name]['model']
- with fluid.scope_guard(scope):
- util.rank0_print("Begin " + executor_name + " pass")
- begin = time.time()
- program = model._build_param['model']['train_program']
- self._exe.train_from_dataset(program, dataset, scope,
- thread=executor_config['train_thread_num'], debug=self.global_config['debug'])
- end = time.time()
- local_cost = (end - begin) / 60.0
- avg_cost = worker_numric_avg(local_cost)
- min_cost = worker_numric_min(local_cost)
- max_cost = worker_numric_max(local_cost)
- util.rank0_print("avg train time %s mins, min %s mins, max %s mins" % (avg_cost, min_cost, max_cost))
- self._exector_context[executor_name]['cost'] = max_cost
-
- monitor_data = ""
- self.print_global_metrics(scope, model, monitor_data, stdout_str)
- util.rank0_print("End " + executor_name + " pass")
- if self._train_pass.need_dump_inference(pass_id) and executor_config['dump_inference_model']:
- stdout_str += self.save_xbox_model(day, pass_id, xbox_base_key, monitor_data)
- fleet._role_maker._barrier_worker()
-
- def startup(self, context):
- """R
- """
- if fleet.is_server():
- fleet.run_server()
- context['status'] = 'wait'
- return
- stdout_str = ""
- self._train_pass = util.TimeTrainPass(self.global_config)
- if not self.global_config['cold_start']:
- cost_printer = util.CostPrinter(util.print_cost,
- {'master': True, 'log_format': 'load model cost %s sec',
- 'stdout': stdout_str})
- self.print_log("going to load model %s" % self._train_pass._checkpoint_model_path, {'master': True})
- # if config.need_reqi_changeslot and config.reqi_dnn_plugin_day >= self._train_pass.date()
- # and config.reqi_dnn_plugin_pass >= self._pass_id:
- # fleet.load_one_table(0, self._train_pass._checkpoint_model_path)
- # else:
- fleet.init_server(self._train_pass._checkpoint_model_path, mode=0)
- cost_printer.done()
- if self.global_config['save_first_base']:
- self.print_log("save_first_base=True", {'master': True})
- self.print_log("going to save xbox base model", {'master': True, 'stdout': stdout_str})
- self._train_pass._base_key = int(time.time())
- stdout_str += self.save_xbox_model(self._train_pass.date(), 0, self._train_pass._base_key, "")
- context['status'] = 'begin_day'
-
- def begin_day(self, context):
- """R
- """
- stdout_str = ""
- if not self._train_pass.next():
- context['is_exit'] = True
- day = self._train_pass.date()
- pass_id = self._train_pass._pass_id
- self.print_log("======== BEGIN DAY:%s ========" % day, {'master': True, 'stdout': stdout_str})
- if pass_id == self._train_pass.max_pass_num_day():
- context['status'] = 'end_day'
- else:
- context['status'] = 'train_pass'
-
- def end_day(self, context):
- """R
- """
- day = self._train_pass.date()
- pass_id = self._train_pass._pass_id
- xbox_base_key = int(time.time())
- context['status'] = 'begin_day'
-
- util.rank0_print("shrink table")
- cost_printer = util.CostPrinter(util.print_cost,
- {'master': True, 'log_format': 'shrink table done, cost %s sec'})
- fleet.shrink_sparse_table()
- for executor in self._exector_context:
- self._exector_context[executor]['model'].shrink({
- 'scope': self._exector_context[executor]['scope'],
- 'decay': self.global_config['optimizer']['dense_decay_rate']
- })
- cost_printer.done()
-
- next_date = self._train_pass.date(delta_day=1)
- util.rank0_print("going to save xbox base model")
- self.save_xbox_model(next_date, 0, xbox_base_key, "")
- util.rank0_print("going to save batch model")
- self.save_model(next_date, 0, xbox_base_key)
- self._train_pass._base_key = xbox_base_key
- fleet._role_maker._barrier_worker()
-
- def train_pass(self, context):
- """R
- """
- stdout_str = ""
- day = self._train_pass.date()
- pass_id = self._train_pass._pass_id
- base_key = self._train_pass._base_key
- pass_time = self._train_pass._current_train_time.strftime("%Y%m%d%H%M")
- self.print_log(" ==== begin delta:%s ========" % pass_id, {'master': True, 'stdout': stdout_str})
- train_begin_time = time.time()
-
- cost_printer = util.CostPrinter(util.print_cost, \
- {'master': True, 'log_format': 'load into memory done, cost %s sec',
- 'stdout': stdout_str})
- current_dataset = {}
- for name in self._dataset:
- current_dataset[name] = self._dataset[name].load_dataset({
- 'node_num': fleet.worker_num(), 'node_idx': fleet.worker_index(),
- 'begin_time': pass_time, 'time_window_min': self._train_pass._interval_per_pass
- })
- fleet._role_maker._barrier_worker()
- cost_printer.done()
-
- util.rank0_print("going to global shuffle")
- cost_printer = util.CostPrinter(util.print_cost, {
- 'master': True, 'stdout': stdout_str,
- 'log_format': 'global shuffle done, cost %s sec'})
- for name in current_dataset:
- current_dataset[name].global_shuffle(fleet, self.global_config['dataset']['shuffle_thread'])
- cost_printer.done()
- # str(dataset.get_shuffle_data_size(fleet))
- fleet._role_maker._barrier_worker()
-
- if self.global_config['prefetch_data']:
- next_pass_time = (self._train_pass._current_train_time +
- datetime.timedelta(minutes=self._train_pass._interval_per_pass)).strftime("%Y%m%d%H%M")
- for name in self._dataset:
- self._dataset[name].preload_dataset({
- 'node_num': fleet.worker_num(), 'node_idx': fleet.worker_index(),
- 'begin_time': next_pass_time, 'time_window_min': self._train_pass._interval_per_pass
- })
-
- fleet._role_maker._barrier_worker()
- pure_train_begin = time.time()
- for executor in self.global_config['executor']:
- self.run_executor(executor, current_dataset[executor['dataset_name']], stdout_str)
- cost_printer = util.CostPrinter(util.print_cost, \
- {'master': True, 'log_format': 'release_memory cost %s sec'})
- for name in current_dataset:
- current_dataset[name].release_memory()
- pure_train_cost = time.time() - pure_train_begin
-
- if self._train_pass.is_checkpoint_pass(pass_id):
- self.save_model(day, pass_id, base_key)
-
- train_end_time = time.time()
- train_cost = train_end_time - train_begin_time
- other_cost = train_cost - pure_train_cost
- log_str = "finished train day %s pass %s time cost:%s sec job time cost:" % (day, pass_id, train_cost)
- for executor in self._exector_context:
- log_str += '[' + executor + ':' + str(self._exector_context[executor]['cost']) + ']'
- log_str += '[other_cost:' + str(other_cost) + ']'
- util.rank0_print(log_str)
- stdout_str += util.now_time_str() + log_str
- sys.stdout.write(stdout_str)
- fleet._role_maker._barrier_worker()
- stdout_str = ""
- if pass_id == self._train_pass.max_pass_num_day():
- context['status'] = 'end_day'
- return
- elif not self._train_pass.next():
- context['is_exit'] = True
diff --git a/fleet_rec/core/trainers/single_trainer.py b/fleet_rec/core/trainers/single_trainer.py
deleted file mode 100755
index db3398932e4cefbaf5fb5c211db87bc029019b6f..0000000000000000000000000000000000000000
--- a/fleet_rec/core/trainers/single_trainer.py
+++ /dev/null
@@ -1,121 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""
-Training use fluid with one node only.
-"""
-
-from __future__ import print_function
-import logging
-import paddle.fluid as fluid
-
-from fleetrec.core.trainers.transpiler_trainer import TranspileTrainer
-from fleetrec.core.utils import envs
-import numpy as np
-
-logging.basicConfig(format="%(asctime)s - %(levelname)s - %(message)s")
-logger = logging.getLogger("fluid")
-logger.setLevel(logging.INFO)
-
-
-class SingleTrainer(TranspileTrainer):
- def processor_register(self):
- self.regist_context_processor('uninit', self.instance)
- self.regist_context_processor('init_pass', self.init)
- self.regist_context_processor('startup_pass', self.startup)
- if envs.get_platform() == "LINUX" and envs.get_global_env("dataset_class", None, "train.reader") != "DataLoader":
- self.regist_context_processor('train_pass', self.dataset_train)
- else:
- self.regist_context_processor('train_pass', self.dataloader_train)
-
- self.regist_context_processor('infer_pass', self.infer)
- self.regist_context_processor('terminal_pass', self.terminal)
-
- def init(self, context):
- self.model.train_net()
- optimizer = self.model.optimizer()
- optimizer.minimize((self.model.get_cost_op()))
-
- self.fetch_vars = []
- self.fetch_alias = []
- self.fetch_period = self.model.get_fetch_period()
-
- metrics = self.model.get_metrics()
- if metrics:
- self.fetch_vars = metrics.values()
- self.fetch_alias = metrics.keys()
- context['status'] = 'startup_pass'
-
- def startup(self, context):
- self._exe.run(fluid.default_startup_program())
- context['status'] = 'train_pass'
-
- def dataloader_train(self, context):
- reader = self._get_dataloader("TRAIN")
- epochs = envs.get_global_env("train.epochs")
-
- program = fluid.compiler.CompiledProgram(
- fluid.default_main_program()).with_data_parallel(
- loss_name=self.model.get_cost_op().name)
-
- metrics_varnames = []
- metrics_format = []
-
- metrics_format.append("{}: {{}}".format("epoch"))
- metrics_format.append("{}: {{}}".format("batch"))
-
- for name, var in self.model.get_metrics().items():
- metrics_varnames.append(var.name)
- metrics_format.append("{}: {{}}".format(name))
-
- metrics_format = ", ".join(metrics_format)
-
- for epoch in range(epochs):
- reader.start()
- batch_id = 0
- try:
- while True:
- metrics_rets = self._exe.run(
- program=program,
- fetch_list=metrics_varnames)
-
- metrics = [epoch, batch_id]
- metrics.extend(metrics_rets)
-
- if batch_id % self.fetch_period == 0 and batch_id != 0:
- print(metrics_format.format(*metrics))
- batch_id += 1
- except fluid.core.EOFException:
- reader.reset()
- self.save(epoch, "train", is_fleet=False)
-
- context['status'] = 'infer_pass'
-
- def dataset_train(self, context):
- dataset = self._get_dataset("TRAIN")
- epochs = envs.get_global_env("train.epochs")
-
- for i in range(epochs):
- self._exe.train_from_dataset(program=fluid.default_main_program(),
- dataset=dataset,
- fetch_list=self.fetch_vars,
- fetch_info=self.fetch_alias,
- print_period=self.fetch_period)
- self.save(i, "train", is_fleet=False)
- context['status'] = 'infer_pass'
-
- def terminal(self, context):
- for model in self.increment_models:
- print("epoch :{}, dir: {}".format(model[0], model[1]))
- context['is_exit'] = True
diff --git a/fleet_rec/core/utils/dataloader_instance.py b/fleet_rec/core/utils/dataloader_instance.py
deleted file mode 100755
index a16fde803d5fc1467ee5d266215da4938ff8c579..0000000000000000000000000000000000000000
--- a/fleet_rec/core/utils/dataloader_instance.py
+++ /dev/null
@@ -1,65 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-from __future__ import print_function
-
-import os
-import sys
-
-from fleetrec.core.utils.envs import lazy_instance_by_fliename
-from fleetrec.core.utils.envs import get_global_env
-from fleetrec.core.utils.envs import get_runtime_environ
-
-
-def dataloader(readerclass, train, yaml_file):
- if train == "TRAIN":
- reader_name = "TrainReader"
- namespace = "train.reader"
- data_path = get_global_env("train_data_path", None, namespace)
- else:
- reader_name = "EvaluateReader"
- namespace = "evaluate.reader"
- data_path = get_global_env("test_data_path", None, namespace)
-
- if data_path.startswith("fleetrec::"):
- package_base = get_runtime_environ("PACKAGE_BASE")
- assert package_base is not None
- data_path = os.path.join(package_base, data_path.split("::")[1])
-
- files = [str(data_path) + "/%s" % x for x in os.listdir(data_path)]
-
- reader_class = lazy_instance_by_fliename(readerclass, reader_name)
- reader = reader_class(yaml_file)
- reader.init()
-
- def gen_reader():
- for file in files:
- with open(file, 'r') as f:
- for line in f:
- line = line.rstrip('\n')
- iter = reader.generate_sample(line)
- for parsed_line in iter():
- if parsed_line is None:
- continue
- else:
- values = []
- for pased in parsed_line:
- values.append(pased[1])
- yield values
-
- def gen_batch_reader():
- return reader.generate_batch_from_trainfiles(files)
-
- if hasattr(reader, 'generate_batch_from_trainfiles'):
- return gen_batch_reader()
- return gen_reader
diff --git a/fleet_rec/core/utils/dataset_instance.py b/fleet_rec/core/utils/dataset_instance.py
deleted file mode 100755
index 89e6e45d2c53bb033e0e5fc6e436b149d76cc7c2..0000000000000000000000000000000000000000
--- a/fleet_rec/core/utils/dataset_instance.py
+++ /dev/null
@@ -1,33 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-from __future__ import print_function
-import sys
-
-from fleetrec.core.utils.envs import lazy_instance_by_fliename
-
-if len(sys.argv) != 4:
- raise ValueError("reader only accept 3 argument: 1. reader_class 2.train/evaluate 3.yaml_abs_path")
-
-reader_package = sys.argv[1]
-
-if sys.argv[2] == "TRAIN":
- reader_name = "TrainReader"
-else:
- reader_name = "EvaluateReader"
-
-yaml_abs_path = sys.argv[3]
-reader_class = lazy_instance_by_fliename(reader_package, reader_name)
-reader = reader_class(yaml_abs_path)
-reader.init()
-reader.run_from_stdin()
diff --git a/fleet_rec/run.py b/fleet_rec/run.py
deleted file mode 100755
index d40cb99420cde906c39cbe34eaa64bdf8e74dbd2..0000000000000000000000000000000000000000
--- a/fleet_rec/run.py
+++ /dev/null
@@ -1,220 +0,0 @@
-import argparse
-import os
-import subprocess
-import yaml
-
-from fleetrec.core.factory import TrainerFactory
-from fleetrec.core.utils import envs
-from fleetrec.core.utils import util
-
-engines = {}
-device = ["CPU", "GPU"]
-clusters = ["SINGLE", "LOCAL_CLUSTER", "CLUSTER"]
-custom_model = ['tdm']
-model_name = ""
-
-
-def engine_registry():
- cpu = {"TRANSPILER": {}, "PSLIB": {}}
- cpu["TRANSPILER"]["SINGLE"] = single_engine
- cpu["TRANSPILER"]["LOCAL_CLUSTER"] = local_cluster_engine
- cpu["TRANSPILER"]["CLUSTER"] = cluster_engine
- cpu["PSLIB"]["SINGLE"] = local_mpi_engine
- cpu["PSLIB"]["LOCAL_CLUSTER"] = local_mpi_engine
- cpu["PSLIB"]["CLUSTER"] = cluster_mpi_engine
-
- gpu = {"TRANSPILER": {}, "PSLIB": {}}
- gpu["TRANSPILER"]["SINGLE"] = single_engine
-
- engines["CPU"] = cpu
- engines["GPU"] = gpu
-
-
-def get_engine(args):
- device = args.device
- d_engine = engines[device]
- transpiler = get_transpiler()
-
- engine = args.engine
- run_engine = d_engine[transpiler].get(engine, None)
-
- if run_engine is None:
- raise ValueError(
- "engine {} can not be supported on device: {}".format(engine, device))
- return run_engine
-
-
-def get_transpiler():
- FNULL = open(os.devnull, 'w')
- cmd = ["python", "-c",
- "import paddle.fluid as fluid; fleet_ptr = fluid.core.Fleet(); [fleet_ptr.copy_table_by_feasign(10, 10, [2020, 1010])];"]
- proc = subprocess.Popen(cmd, stdout=FNULL, stderr=FNULL, cwd=os.getcwd())
- ret = proc.wait()
- if ret == -11:
- return "PSLIB"
- else:
- return "TRANSPILER"
-
-
-def set_runtime_envs(cluster_envs, engine_yaml):
- def get_engine_extras():
- with open(engine_yaml, 'r') as rb:
- _envs = yaml.load(rb.read(), Loader=yaml.FullLoader)
-
- flattens = envs.flatten_environs(_envs)
-
- engine_extras = {}
- for k, v in flattens.items():
- if k.startswith("train.trainer."):
- engine_extras[k] = v
- return engine_extras
-
- if cluster_envs is None:
- cluster_envs = {}
-
- engine_extras = get_engine_extras()
- if "train.trainer.threads" in engine_extras and "CPU_NUM" in cluster_envs:
- cluster_envs["CPU_NUM"] = engine_extras["train.trainer.threads"]
- envs.set_runtime_environs(cluster_envs)
- envs.set_runtime_environs(engine_extras)
-
- need_print = {}
- for k, v in os.environ.items():
- if k.startswith("train.trainer."):
- need_print[k] = v
-
- print(envs.pretty_print_envs(need_print, ("Runtime Envs", "Value")))
-
-
-def get_trainer_prefix(args):
- if model_name in custom_model:
- return model_name.upper()
- return ""
-
-
-def single_engine(args):
- trainer = get_trainer_prefix(args) + "SingleTrainer"
- single_envs = {}
- single_envs["train.trainer.trainer"] = trainer
- single_envs["train.trainer.threads"] = "2"
- single_envs["train.trainer.engine"] = "single"
- single_envs["train.trainer.device"] = args.device
- single_envs["train.trainer.platform"] = envs.get_platform()
- print("use {} engine to run model: {}".format(trainer, args.model))
-
- set_runtime_envs(single_envs, args.model)
- trainer = TrainerFactory.create(args.model)
- return trainer
-
-
-def cluster_engine(args):
- trainer = get_trainer_prefix(args) + "ClusterTrainer"
- cluster_envs = {}
- cluster_envs["train.trainer.trainer"] = trainer
- cluster_envs["train.trainer.engine"] = "cluster"
- cluster_envs["train.trainer.device"] = args.device
- cluster_envs["train.trainer.platform"] = envs.get_platform()
- print("launch {} engine with cluster to run model: {}".format(trainer, args.model))
-
- set_runtime_envs(cluster_envs, args.model)
- trainer = TrainerFactory.create(args.model)
- return trainer
-
-
-def cluster_mpi_engine(args):
- print("launch cluster engine with cluster to run model: {}".format(args.model))
-
- cluster_envs = {}
- cluster_envs["train.trainer.trainer"] = "CtrCodingTrainer"
- cluster_envs["train.trainer.device"] = args.device
- cluster_envs["train.trainer.platform"] = envs.get_platform()
-
- set_runtime_envs(cluster_envs, args.model)
-
- trainer = TrainerFactory.create(args.model)
- return trainer
-
-
-def local_cluster_engine(args):
- from fleetrec.core.engine.local_cluster_engine import LocalClusterEngine
-
- trainer = get_trainer_prefix(args) + "ClusterTrainer"
- cluster_envs = {}
- cluster_envs["server_num"] = 1
- cluster_envs["worker_num"] = 1
- cluster_envs["start_port"] = envs.find_free_port()
- cluster_envs["log_dir"] = "logs"
- cluster_envs["train.trainer.trainer"] = trainer
- cluster_envs["train.trainer.strategy"] = "async"
- cluster_envs["train.trainer.threads"] = "2"
- cluster_envs["train.trainer.engine"] = "local_cluster"
-
- cluster_envs["train.trainer.device"] = args.device
- cluster_envs["train.trainer.platform"] = envs.get_platform()
-
- cluster_envs["CPU_NUM"] = "2"
- print("launch {} engine with cluster to run model: {}".format(trainer, args.model))
-
- set_runtime_envs(cluster_envs, args.model)
- launch = LocalClusterEngine(cluster_envs, args.model)
- return launch
-
-
-def local_mpi_engine(args):
- print("launch cluster engine with cluster to run model: {}".format(args.model))
- from fleetrec.core.engine.local_mpi_engine import LocalMPIEngine
-
- print("use 1X1 MPI ClusterTraining at localhost to run model: {}".format(args.model))
-
- mpi = util.run_which("mpirun")
- if not mpi:
- raise RuntimeError("can not find mpirun, please check environment")
- cluster_envs = {}
- cluster_envs["mpirun"] = mpi
- cluster_envs["train.trainer.trainer"] = "CtrCodingTrainer"
- cluster_envs["log_dir"] = "logs"
- cluster_envs["train.trainer.engine"] = "local_cluster"
-
- cluster_envs["train.trainer.device"] = args.device
- cluster_envs["train.trainer.platform"] = envs.get_platform()
-
- set_runtime_envs(cluster_envs, args.model)
- launch = LocalMPIEngine(cluster_envs, args.model)
- return launch
-
-
-def get_abs_model(model):
- if model.startswith("fleetrec."):
- fleet_base = envs.get_runtime_environ("PACKAGE_BASE")
- workspace_dir = model.split("fleetrec.")[1].replace(".", "/")
- path = os.path.join(fleet_base, workspace_dir, "config.yaml")
- else:
- if not os.path.isfile(model):
- raise IOError("model config: {} invalid".format(model))
- path = model
- return path
-
-
-if __name__ == "__main__":
- parser = argparse.ArgumentParser(description='fleet-rec run')
- parser.add_argument("-m", "--model", type=str)
- parser.add_argument("-e", "--engine", type=str,
- choices=["single", "local_cluster", "cluster",
- "tdm_single", "tdm_local_cluster", "tdm_cluster"])
- parser.add_argument("-d", "--device", type=str,
- choices=["cpu", "gpu"], default="cpu")
-
- abs_dir = os.path.dirname(os.path.abspath(__file__))
- envs.set_runtime_environs({"PACKAGE_BASE": abs_dir})
-
- args = parser.parse_args()
- args.engine = args.engine.upper()
- args.device = args.device.upper()
- model_name = args.model.split('.')[-1]
- args.model = get_abs_model(args.model)
- engine_registry()
-
- which_engine = get_engine(args)
-
- engine = which_engine(args)
- engine.run()
diff --git a/fleet_rec/tests/__init__.py b/fleet_rec/tests/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/models/contentunderstanding/__init__.py b/models/contentunderstanding/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/contentunderstanding/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/contentunderstanding/classification/__init__.py b/models/contentunderstanding/classification/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/contentunderstanding/classification/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/contentunderstanding/classification/config.yaml b/models/contentunderstanding/classification/config.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..9e0bdd1e851ada704eb2377efe0a82154fd2b371
--- /dev/null
+++ b/models/contentunderstanding/classification/config.yaml
@@ -0,0 +1,48 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+workspace: "paddlerec.models.contentunderstanding.classification"
+
+dataset:
+- name: data1
+ batch_size: 5
+ type: DataLoader
+ data_path: "{workspace}/data/train_data"
+ data_converter: "{workspace}/reader.py"
+
+hyper_parameters:
+ optimizer:
+ class: Adagrad
+ learning_rate: 0.001
+ is_sparse: False
+
+mode: runner1
+
+runner:
+- name: runner1
+ class: single_train
+ epochs: 10
+ device: cpu
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ save_inference_feed_varnames: []
+ save_inference_fetch_varnames: []
+
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+ dataset_name: data1
+ thread_num: 1
diff --git a/models/rank/text_classification/train_data/part-0 b/models/contentunderstanding/classification/data/train_data/part-0.txt
similarity index 100%
rename from models/rank/text_classification/train_data/part-0
rename to models/contentunderstanding/classification/data/train_data/part-0.txt
diff --git a/models/contentunderstanding/classification/model.py b/models/contentunderstanding/classification/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..ce9caf5bfa6c5bc229a52d09c5c8f3b6093b80c6
--- /dev/null
+++ b/models/contentunderstanding/classification/model.py
@@ -0,0 +1,74 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import paddle.fluid as fluid
+from paddlerec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+
+
+class Model(ModelBase):
+ def __init__(self, config):
+ ModelBase.__init__(self, config)
+ self.dict_dim = 100
+ self.max_len = 10
+ self.cnn_dim = 32
+ self.cnn_filter_size = 128
+ self.emb_dim = 8
+ self.hid_dim = 128
+ self.class_dim = 2
+ self.is_sparse = envs.get_global_env("hyper_parameters.is_sparse",
+ False)
+
+ def input_data(self, is_infer=False, **kwargs):
+ data = fluid.data(
+ name="input", shape=[None, self.max_len], dtype='int64')
+ label = fluid.data(name="label", shape=[None, 1], dtype='int64')
+ seq_len = fluid.data(name="seq_len", shape=[None], dtype='int64')
+ return [data, label, seq_len]
+
+ def net(self, input, is_infer=False):
+ """ network definition """
+ data = input[0]
+ label = input[1]
+ seq_len = input[2]
+
+ # embedding layer
+ emb = fluid.embedding(
+ input=data,
+ size=[self.dict_dim, self.emb_dim],
+ is_sparse=self.is_sparse)
+ emb = fluid.layers.sequence_unpad(emb, length=seq_len)
+ # convolution layer
+ conv = fluid.nets.sequence_conv_pool(
+ input=emb,
+ num_filters=self.cnn_dim,
+ filter_size=self.cnn_filter_size,
+ act="tanh",
+ pool_type="max")
+
+ # full connect layer
+ fc_1 = fluid.layers.fc(input=[conv], size=self.hid_dim)
+ # softmax layer
+ prediction = fluid.layers.fc(input=[fc_1],
+ size=self.class_dim,
+ act="softmax")
+ cost = fluid.layers.cross_entropy(input=prediction, label=label)
+ avg_cost = fluid.layers.mean(x=cost)
+ acc = fluid.layers.accuracy(input=prediction, label=label)
+
+ self._cost = avg_cost
+ if is_infer:
+ self._infer_results["acc"] = acc
+ else:
+ self._metrics["acc"] = acc
diff --git a/models/contentunderstanding/classification/reader.py b/models/contentunderstanding/classification/reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..18a41ee844c35d0a6fa37a835203121868158c4e
--- /dev/null
+++ b/models/contentunderstanding/classification/reader.py
@@ -0,0 +1,42 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import sys
+
+from paddlerec.core.reader import Reader
+
+
+class TrainReader(Reader):
+ def init(self):
+ pass
+
+ def _process_line(self, l):
+ l = l.strip().split()
+ data = l[0:10]
+ seq_len = l[10:11]
+ label = l[11:]
+ return data, label, seq_len
+
+ def generate_sample(self, line):
+ def data_iter():
+ data, label, seq_len = self._process_line(line)
+ if data is None:
+ yield None
+ return
+ data = [int(i) for i in data]
+ label = [int(i) for i in label]
+ seq_len = [int(i) for i in seq_len]
+ yield [('data', data), ('label', label), ('seq_len', seq_len)]
+
+ return data_iter
diff --git a/models/contentunderstanding/readme.md b/models/contentunderstanding/readme.md
new file mode 100644
index 0000000000000000000000000000000000000000..217d7124d7cdb481ca7aacb418e36148508e42b8
--- /dev/null
+++ b/models/contentunderstanding/readme.md
@@ -0,0 +1,122 @@
+# Content Understanding Model Library
+
+## Introduction
+We provide PaddleRec implementations of the model algorithms commonly used in content understanding tasks, along with single-machine training & prediction quality metrics and distributed training & prediction performance metrics. The implemented content understanding models include [TagSpace](tagspace), [text classification](classification), and more.
+
+The model library is continuously being extended; stay tuned.
+
+## Table of Contents
+* [Overview](#overview)
+    * [Model List](#model-list)
+* [Tutorials](#tutorials)
+    * [Data Processing](#data-processing)
+    * [Training](#training)
+    * [Prediction](#prediction)
+* [Benchmarks](#benchmarks)
+    * [Model Results](#model-results)
+* [Distributed Training](#distributed-training)
+    * [Model Performance](#model-performance)
+
+## Overview
+### Model List
+
+| Model | Description | Paper |
+| :------------------: | :--------------------: | :---------: |
+| TagSpace | Tag recommendation | [TagSpace: Semantic Embeddings from Hashtags (2014)](https://research.fb.com/publications/tagspace-semantic-embeddings-from-hashtags/) |
+| Classification | Text classification | [Convolutional Neural Networks for Sentence Classification (2014)](https://www.aclweb.org/anthology/D14-1181.pdf) |
+
+Below is a brief introduction to each model (note: the figures are taken from the linked papers).
+
+[TagSpace model](https://research.fb.com/publications/tagspace-semantic-embeddings-from-hashtags)
+
+
+
+
+[Text classification CNN model](https://www.aclweb.org/anthology/D14-1181.pdf)
+
+
+
+
+## Tutorial (Quick Start)
+```
+python -m paddlerec.run -m paddlerec.models.contentunderstanding.tagspace
+python -m paddlerec.run -m paddlerec.models.contentunderstanding.classification
+```
+
+## Tutorial (Reproducing Paper Results)
+
+### Note
+
+To help users quickly run every model end to end, we provide sample data under each model directory. To reproduce the results in this readme, use the scripts below to download the corresponding datasets and run the data preprocessing.
+
+### Data Processing
+
+**(1) TagSpace**
+
+[Dataset](https://github.com/mhjabreel/CharCNN/tree/master/data/), [backup dataset](https://paddle-tagspace.bj.bcebos.com/data.tar)
+
+The data format is as follows:
+```
+"3","Wall St. Bears Claw Back Into the Black (Reuters)","Reuters - Short-sellers, Wall Street's dwindling\band of ultra-cynics, are seeing green again."
+```
+
+After decompressing the data, convert the text data into Paddle format. First, move the data into the training and test data directories:
+
+```
+mkdir raw_big_train_data
+mkdir raw_big_test_data
+mv train.csv raw_big_train_data
+mv test.csv raw_big_test_data
+```
+
+Run the script text2paddle.py to generate the Paddle input format:
+
+```
+python text2paddle.py raw_big_train_data/ raw_big_test_data/ train_big_data test_big_data big_vocab_text.txt big_vocab_tag.txt
+```
+
+### Training
+```
+cd models/contentunderstanding/tagspace
+python -m paddlerec.run -m ./config.yaml # after customizing hyperparameters, point to your own config file
+```
+
+### Prediction
+```
+# in the model's config.yaml, set workspace to the absolute path of the current directory
+# in the model's config.yaml, set mode to infer_runner
+# example: mode: train_runner -> mode: infer_runner
+# in infer_runner, set class: single_infer
+# change the phase section to the infer configuration, following the comments in the config
+
+# after editing config.yaml, run:
+python -m paddlerec.run -m ./config.yaml
+```
+
+**(2) Classification**
+
+### Training
+```
+cd models/contentunderstanding/classification
+python -m paddlerec.run -m ./config.yaml # after customizing hyperparameters, point to your own config file
+```
+
+### Prediction
+```
+# in the model's config.yaml, set workspace to the absolute path of the current directory
+# in the model's config.yaml, set mode to infer_runner
+# example: mode: train_runner -> mode: infer_runner
+# in infer_runner, set class: single_infer
+# change the phase section to the infer configuration, following the comments in the config
+
+# after editing config.yaml, run:
+python -m paddlerec.run -m ./config.yaml
+```
+
+## Benchmarks
+### Model Results (in testing)
+
+| Dataset | Model | loss | auc | acc | mae |
+| :------------------: | :--------------------: | :---------: |:---------: | :---------: |:---------: |
+| ag news dataset | TagSpace | -- | -- | -- | -- |
+| -- | Classification | -- | -- | -- | -- |
diff --git a/models/contentunderstanding/tagspace/__init__.py b/models/contentunderstanding/tagspace/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/contentunderstanding/tagspace/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/contentunderstanding/tagspace/config.yaml b/models/contentunderstanding/tagspace/config.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..8ca28f2977dd4bfd382e250e5c6513b156360404
--- /dev/null
+++ b/models/contentunderstanding/tagspace/config.yaml
@@ -0,0 +1,55 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+workspace: "paddlerec.models.contentunderstanding.tagspace"
+
+dataset:
+- name: sample_1
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/train_data"
+ data_converter: "{workspace}/reader.py"
+
+hyper_parameters:
+ optimizer:
+ class: Adagrad
+ learning_rate: 0.001
+ vocab_text_size: 11447
+ vocab_tag_size: 4
+ emb_dim: 10
+ hid_dim: 1000
+ win_size: 5
+ margin: 0.1
+ neg_size: 3
+ num_devices: 1
+
+mode: runner1
+
+runner:
+- name: runner1
+ class: single_train
+ epochs: 10
+ device: cpu
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ save_inference_feed_varnames: []
+ save_inference_fetch_varnames: []
+
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+ dataset_name: sample_1
+ thread_num: 1
diff --git a/models/rank/tagspace/test_data/small_test.csv b/models/contentunderstanding/tagspace/data/test_data/small_test.csv
similarity index 100%
rename from models/rank/tagspace/test_data/small_test.csv
rename to models/contentunderstanding/tagspace/data/test_data/small_test.csv
diff --git a/models/rank/tagspace/train_data/small_train.csv b/models/contentunderstanding/tagspace/data/train_data/small_train.csv
similarity index 100%
rename from models/rank/tagspace/train_data/small_train.csv
rename to models/contentunderstanding/tagspace/data/train_data/small_train.csv
diff --git a/models/contentunderstanding/tagspace/model.py b/models/contentunderstanding/tagspace/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..34e5ebace1c3f4a44def5ad8006c2eb74a40c3c0
--- /dev/null
+++ b/models/contentunderstanding/tagspace/model.py
@@ -0,0 +1,111 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import paddle.fluid as fluid
+import paddle.fluid.layers.nn as nn
+import paddle.fluid.layers.tensor as tensor
+import paddle.fluid.layers.control_flow as cf
+
+from paddlerec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+
+
+class Model(ModelBase):
+ def __init__(self, config):
+ ModelBase.__init__(self, config)
+ self.cost = None
+ self.metrics = {}
+ self.vocab_text_size = envs.get_global_env(
+ "hyper_parameters.vocab_text_size")
+ self.vocab_tag_size = envs.get_global_env(
+ "hyper_parameters.vocab_tag_size")
+ self.emb_dim = envs.get_global_env("hyper_parameters.emb_dim")
+ self.hid_dim = envs.get_global_env("hyper_parameters.hid_dim")
+ self.win_size = envs.get_global_env("hyper_parameters.win_size")
+ self.margin = envs.get_global_env("hyper_parameters.margin")
+ self.neg_size = envs.get_global_env("hyper_parameters.neg_size")
+
+ def input_data(self, is_infer=False, **kwargs):
+ text = fluid.data(
+ name="text", shape=[None, 1], lod_level=1, dtype='int64')
+ pos_tag = fluid.data(
+ name="pos_tag", shape=[None, 1], lod_level=1, dtype='int64')
+ neg_tag = fluid.data(
+ name="neg_tag", shape=[None, 1], lod_level=1, dtype='int64')
+ return [text, pos_tag, neg_tag]
+
+ def net(self, input, is_infer=False):
+ """ network"""
+ text = input[0]
+ pos_tag = input[1]
+ neg_tag = input[2]
+
+ text_emb = fluid.embedding(
+ input=text,
+ size=[self.vocab_text_size, self.emb_dim],
+ param_attr="text_emb")
+ text_emb = fluid.layers.squeeze(input=text_emb, axes=[1])
+ pos_tag_emb = fluid.embedding(
+ input=pos_tag,
+ size=[self.vocab_tag_size, self.emb_dim],
+ param_attr="tag_emb")
+ pos_tag_emb = fluid.layers.squeeze(input=pos_tag_emb, axes=[1])
+ neg_tag_emb = fluid.embedding(
+ input=neg_tag,
+ size=[self.vocab_tag_size, self.emb_dim],
+ param_attr="tag_emb")
+ neg_tag_emb = fluid.layers.squeeze(input=neg_tag_emb, axes=[1])
+
+ conv_1d = fluid.nets.sequence_conv_pool(
+ input=text_emb,
+ num_filters=self.hid_dim,
+ filter_size=self.win_size,
+ act="tanh",
+ pool_type="max",
+ param_attr="cnn")
+ text_hid = fluid.layers.fc(input=conv_1d,
+ size=self.emb_dim,
+ param_attr="text_hid")
+ cos_pos = nn.cos_sim(pos_tag_emb, text_hid)
+ mul_text_hid = fluid.layers.sequence_expand_as(
+ x=text_hid, y=neg_tag_emb)
+ mul_cos_neg = nn.cos_sim(neg_tag_emb, mul_text_hid)
+ cos_neg_all = fluid.layers.sequence_reshape(
+ input=mul_cos_neg, new_dim=self.neg_size)
+        # choose the max negative cosine similarity
+ cos_neg = nn.reduce_max(cos_neg_all, dim=1, keep_dim=True)
+        # calculate the hinge loss
+ loss_part1 = nn.elementwise_sub(
+ tensor.fill_constant_batch_size_like(
+ input=cos_pos,
+ shape=[-1, 1],
+ value=self.margin,
+ dtype='float32'),
+ cos_pos)
+ loss_part2 = nn.elementwise_add(loss_part1, cos_neg)
+ loss_part3 = nn.elementwise_max(
+ tensor.fill_constant_batch_size_like(
+ input=loss_part2, shape=[-1, 1], value=0.0, dtype='float32'),
+ loss_part2)
+ avg_cost = nn.mean(loss_part3)
+ less = tensor.cast(cf.less_than(cos_neg, cos_pos), dtype='float32')
+ correct = nn.reduce_sum(less)
+ self._cost = avg_cost
+
+ if is_infer:
+ self._infer_results["correct"] = correct
+ self._infer_results["cos_pos"] = cos_pos
+ else:
+ self._metrics["correct"] = correct
+ self._metrics["cos_pos"] = cos_pos
diff --git a/models/rank/tagspace/reader.py b/models/contentunderstanding/tagspace/reader.py
similarity index 59%
rename from models/rank/tagspace/reader.py
rename to models/contentunderstanding/tagspace/reader.py
index 7803d862ed4f55890e810073f5a561e81be4f5e8..3bf704f17adbafc28302ec0b64180ec3fddf6d01 100644
--- a/models/rank/tagspace/reader.py
+++ b/models/contentunderstanding/tagspace/reader.py
@@ -1,23 +1,29 @@
-import re
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import sys
-import collections
-import os
-import six
-import time
+
import numpy as np
-import paddle.fluid as fluid
-import paddle
-import csv
-import io
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+from paddlerec.core.reader import Reader
+
class TrainReader(Reader):
def init(self):
pass
- def _process_line(self, l):
+ def _process_line(self, l):
tag_size = 4
neg_size = 3
l = l.strip().split(",")
@@ -40,10 +46,7 @@ class TrainReader(Reader):
neg_index = rand_i
neg_tag.append(neg_index)
sum_n += 1
- # if n > 0 and len(text) > n:
- # #yield None
- # return None, None, None
- return text, pos_tag, neg_tag
+ return text, pos_tag, neg_tag
def generate_sample(self, line):
def data_iter():
@@ -52,4 +55,5 @@ class TrainReader(Reader):
yield None
return
yield [('text', text), ('pos_tag', pos_tag), ('neg_tag', neg_tag)]
+
return data_iter
diff --git a/models/match/__init__.py b/models/match/__init__.py
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..abf198b97e6e818e1fbe59006f98492640bcee54 100755
--- a/models/match/__init__.py
+++ b/models/match/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/match/dssm/config.yaml b/models/match/dssm/config.yaml
index 95d93d7843e77c407e6dc335301027b99e7c34b4..22881bdf906178a87f4b89091589523533eac934 100755
--- a/models/match/dssm/config.yaml
+++ b/models/match/dssm/config.yaml
@@ -23,7 +23,7 @@ train:
strategy: "async"
epochs: 4
- workspace: "fleetrec.models.match.dssm"
+ workspace: "paddlerec.models.match.dssm"
reader:
batch_size: 4
diff --git a/models/match/dssm/model.py b/models/match/dssm/model.py
index 414270524bc13987a2ad7f724cc6a78d3b19cbcf..05d6f762cb266b4cbe40c9a972aafe1885af5b86 100755
--- a/models/match/dssm/model.py
+++ b/models/match/dssm/model.py
@@ -12,11 +12,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import math
import paddle.fluid as fluid
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
@@ -24,12 +23,26 @@ class Model(ModelBase):
ModelBase.__init__(self, config)
def input(self):
- TRIGRAM_D = envs.get_global_env("hyper_parameters.TRIGRAM_D", None, self._namespace)
- Neg = envs.get_global_env("hyper_parameters.NEG", None, self._namespace)
-
- self.query = fluid.data(name="query", shape=[-1, TRIGRAM_D], dtype='float32', lod_level=0)
- self.doc_pos = fluid.data(name="doc_pos", shape=[-1, TRIGRAM_D], dtype='float32', lod_level=0)
- self.doc_negs = [fluid.data(name="doc_neg_" + str(i), shape=[-1, TRIGRAM_D], dtype="float32", lod_level=0) for i in range(Neg)]
+ TRIGRAM_D = envs.get_global_env("hyper_parameters.TRIGRAM_D", None,
+ self._namespace)
+
+ Neg = envs.get_global_env("hyper_parameters.NEG", None,
+ self._namespace)
+
+ self.query = fluid.data(
+ name="query", shape=[-1, TRIGRAM_D], dtype='float32', lod_level=0)
+ self.doc_pos = fluid.data(
+ name="doc_pos",
+ shape=[-1, TRIGRAM_D],
+ dtype='float32',
+ lod_level=0)
+ self.doc_negs = [
+ fluid.data(
+ name="doc_neg_" + str(i),
+ shape=[-1, TRIGRAM_D],
+ dtype="float32",
+ lod_level=0) for i in range(Neg)
+ ]
self._data_var.append(self.query)
self._data_var.append(self.doc_pos)
for input in self.doc_negs:
@@ -37,41 +50,55 @@ class Model(ModelBase):
if self._platform != "LINUX":
self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=64, use_double_buffer=False, iterable=False)
-
+ feed_list=self._data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
def net(self, is_infer=False):
- hidden_layers = envs.get_global_env("hyper_parameters.fc_sizes", None, self._namespace)
- hidden_acts = envs.get_global_env("hyper_parameters.fc_acts", None, self._namespace)
-
+ hidden_layers = envs.get_global_env("hyper_parameters.fc_sizes", None,
+ self._namespace)
+ hidden_acts = envs.get_global_env("hyper_parameters.fc_acts", None,
+ self._namespace)
+
def fc(data, hidden_layers, hidden_acts, names):
fc_inputs = [data]
- for i in range(len(hidden_layers)):
- xavier=fluid.initializer.Xavier(uniform=True, fan_in=fc_inputs[-1].shape[1], fan_out=hidden_layers[i])
- out = fluid.layers.fc(input=fc_inputs[-1],
- size=hidden_layers[i],
- act=hidden_acts[i],
- param_attr=xavier,
- bias_attr=xavier,
- name=names[i])
- fc_inputs.append(out)
- return fc_inputs[-1]
-
- query_fc = fc(self.query, hidden_layers, hidden_acts, ['query_l1', 'query_l2', 'query_l3'])
- doc_pos_fc = fc(self.doc_pos, hidden_layers, hidden_acts, ['doc_pos_l1', 'doc_pos_l2', 'doc_pos_l3'])
- self.R_Q_D_p = fluid.layers.cos_sim(query_fc, doc_pos_fc)
+ for i in range(len(hidden_layers)):
+ xavier = fluid.initializer.Xavier(
+ uniform=True,
+ fan_in=fc_inputs[-1].shape[1],
+ fan_out=hidden_layers[i])
+ out = fluid.layers.fc(input=fc_inputs[-1],
+ size=hidden_layers[i],
+ act=hidden_acts[i],
+ param_attr=xavier,
+ bias_attr=xavier,
+ name=names[i])
+ fc_inputs.append(out)
+ return fc_inputs[-1]
+
+ query_fc = fc(self.query, hidden_layers, hidden_acts,
+ ['query_l1', 'query_l2', 'query_l3'])
+ doc_pos_fc = fc(self.doc_pos, hidden_layers, hidden_acts,
+ ['doc_pos_l1', 'doc_pos_l2', 'doc_pos_l3'])
+ self.R_Q_D_p = fluid.layers.cos_sim(query_fc, doc_pos_fc)
if is_infer:
return
R_Q_D_ns = []
- for i, doc_neg in enumerate(self.doc_negs):
- doc_neg_fc_i = fc(doc_neg, hidden_layers, hidden_acts, ['doc_neg_l1_' + str(i), 'doc_neg_l2_' + str(i), 'doc_neg_l3_' + str(i)])
+ for i, doc_neg in enumerate(self.doc_negs):
+ doc_neg_fc_i = fc(doc_neg, hidden_layers, hidden_acts, [
+ 'doc_neg_l1_' + str(i), 'doc_neg_l2_' + str(i),
+ 'doc_neg_l3_' + str(i)
+ ])
R_Q_D_ns.append(fluid.layers.cos_sim(query_fc, doc_neg_fc_i))
- concat_Rs = fluid.layers.concat(input=[self.R_Q_D_p] + R_Q_D_ns, axis=-1)
- prob = fluid.layers.softmax(concat_Rs, axis=1)
-
- hit_prob = fluid.layers.slice(prob, axes=[0,1], starts=[0,0], ends=[4, 1])
+ concat_Rs = fluid.layers.concat(
+ input=[self.R_Q_D_p] + R_Q_D_ns, axis=-1)
+ prob = fluid.layers.softmax(concat_Rs, axis=1)
+
+ hit_prob = fluid.layers.slice(
+ prob, axes=[0, 1], starts=[0, 0], ends=[4, 1])
loss = -fluid.layers.reduce_sum(fluid.layers.log(hit_prob))
self.avg_cost = fluid.layers.mean(x=loss)
@@ -91,20 +118,30 @@ class Model(ModelBase):
self.metrics()
def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
+ learning_rate = envs.get_global_env("hyper_parameters.learning_rate",
+ None, self._namespace)
optimizer = fluid.optimizer.SGD(learning_rate)
return optimizer
def infer_input(self):
- TRIGRAM_D = envs.get_global_env("hyper_parameters.TRIGRAM_D", None, self._namespace)
- self.query = fluid.data(name="query", shape=[-1, TRIGRAM_D], dtype='float32', lod_level=0)
- self.doc_pos = fluid.data(name="doc_pos", shape=[-1, TRIGRAM_D], dtype='float32', lod_level=0)
+ TRIGRAM_D = envs.get_global_env("hyper_parameters.TRIGRAM_D", None,
+ self._namespace)
+ self.query = fluid.data(
+ name="query", shape=[-1, TRIGRAM_D], dtype='float32', lod_level=0)
+ self.doc_pos = fluid.data(
+ name="doc_pos",
+ shape=[-1, TRIGRAM_D],
+ dtype='float32',
+ lod_level=0)
self._infer_data_var = [self.query, self.doc_pos]
- self._infer_data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._infer_data_var, capacity=64, use_double_buffer=False, iterable=False)
-
+ self._infer_data_loader = fluid.io.DataLoader.from_generator(
+ feed_list=self._infer_data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
+
def infer_net(self):
- self.infer_input()
+ self.infer_input()
self.net(is_infer=True)
- self.infer_results()
+ self.infer_results()
diff --git a/models/match/dssm/synthetic_evaluate_reader.py b/models/match/dssm/synthetic_evaluate_reader.py
index de49774084607a4d37406ad8667c8476a040f42b..97f50abf9720060b008b90c7729e93d13701bb3b 100755
--- a/models/match/dssm/synthetic_evaluate_reader.py
+++ b/models/match/dssm/synthetic_evaluate_reader.py
@@ -13,8 +13,7 @@
# limitations under the License.
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+from paddlerec.core.reader import Reader
class EvaluateReader(Reader):
diff --git a/models/match/dssm/synthetic_reader.py b/models/match/dssm/synthetic_reader.py
index 6eca228fd24da3603f69a822b81370f4a49fba13..13f57a6663ca372bc287386dd939214f362b503d 100755
--- a/models/match/dssm/synthetic_reader.py
+++ b/models/match/dssm/synthetic_reader.py
@@ -11,10 +11,10 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+from paddlerec.core.reader import Reader
class TrainReader(Reader):
@@ -37,7 +37,7 @@ class TrainReader(Reader):
neg_docs = []
for i in range(len(features) - 2):
feature_names.append('doc_neg_' + str(i))
- neg_docs.append(map(float, features[i+2].split(',')))
+ neg_docs.append(map(float, features[i + 2].split(',')))
yield zip(feature_names, [query] + [pos_doc] + neg_docs)
diff --git a/models/match/multiview-simnet/__init__.py b/models/match/multiview-simnet/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/match/multiview-simnet/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/recall/multiview-simnet/config.yaml b/models/match/multiview-simnet/config.yaml
similarity index 93%
rename from models/recall/multiview-simnet/config.yaml
rename to models/match/multiview-simnet/config.yaml
index 0fcc1fc2f18ed2588d4f9eb235a1dd619e99f337..53ac4c095c0d347cca8cba1afb9866c66ab85218 100755
--- a/models/recall/multiview-simnet/config.yaml
+++ b/models/match/multiview-simnet/config.yaml
@@ -12,7 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
evaluate:
- workspace: "fleetrec.models.recall.multiview-simnet"
+ workspace: "paddlerec.models.match.multiview-simnet"
reader:
batch_size: 2
class: "{workspace}/evaluate_reader.py"
@@ -24,7 +24,7 @@ train:
strategy: "async"
epochs: 2
- workspace: "fleetrec.models.recall.multiview-simnet"
+ workspace: "paddlerec.models.match.multiview-simnet"
reader:
batch_size: 2
diff --git a/models/recall/multiview-simnet/data/test/test.txt b/models/match/multiview-simnet/data/test/test.txt
similarity index 100%
rename from models/recall/multiview-simnet/data/test/test.txt
rename to models/match/multiview-simnet/data/test/test.txt
diff --git a/models/recall/multiview-simnet/data/train/train.txt b/models/match/multiview-simnet/data/train/train.txt
similarity index 100%
rename from models/recall/multiview-simnet/data/train/train.txt
rename to models/match/multiview-simnet/data/train/train.txt
diff --git a/models/match/multiview-simnet/data_process.sh b/models/match/multiview-simnet/data_process.sh
new file mode 100755
index 0000000000000000000000000000000000000000..c8633cc7a41f62a29eee1778251b72a6f3b601eb
--- /dev/null
+++ b/models/match/multiview-simnet/data_process.sh
@@ -0,0 +1,24 @@
+#!/bin/bash
+
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+set -e
+echo "begin to prepare data"
+
+mkdir -p data/train
+mkdir -p data/test
+
+python generate_synthetic_data.py
diff --git a/models/recall/multiview-simnet/evaluate_reader.py b/models/match/multiview-simnet/evaluate_reader.py
similarity index 83%
rename from models/recall/multiview-simnet/evaluate_reader.py
rename to models/match/multiview-simnet/evaluate_reader.py
index 63340ccd003589d6e4411f08ed8ffa554ee170fa..d77032f3ca4e07cbbf20874f79023dc4a6fed8b4 100755
--- a/models/recall/multiview-simnet/evaluate_reader.py
+++ b/models/match/multiview-simnet/evaluate_reader.py
@@ -1,4 +1,4 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -11,18 +11,17 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
-import io
-import copy
-import random
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
class EvaluateReader(Reader):
def init(self):
- self.query_slots = envs.get_global_env("hyper_parameters.query_slots", None, "train.model")
- self.title_slots = envs.get_global_env("hyper_parameters.title_slots", None, "train.model")
+ self.query_slots = envs.get_global_env("hyper_parameters.query_slots",
+ None, "train.model")
+ self.title_slots = envs.get_global_env("hyper_parameters.title_slots",
+ None, "train.model")
self.all_slots = []
for i in range(self.query_slots):
@@ -52,6 +51,7 @@ class EvaluateReader(Reader):
if visit:
self._all_slots_dict[slot][0] = False
else:
- output[index][1].append(padding)
+ output[index][1].append(padding)
yield output
+
return data_iter
diff --git a/models/recall/multiview-simnet/generate_synthetic_data.py b/models/match/multiview-simnet/generate_synthetic_data.py
similarity index 79%
rename from models/recall/multiview-simnet/generate_synthetic_data.py
rename to models/match/multiview-simnet/generate_synthetic_data.py
index 5ebb3a355f3904dc0a5cccff8e9d0b48b89f18f4..eb60e5c82f9decc2cfcd87da7bc6832ca98ee9d4 100755
--- a/models/recall/multiview-simnet/generate_synthetic_data.py
+++ b/models/match/multiview-simnet/generate_synthetic_data.py
@@ -1,4 +1,4 @@
-# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -14,12 +14,18 @@
import random
+
class Dataset:
def __init__(self):
pass
+
class SyntheticDataset(Dataset):
- def __init__(self, sparse_feature_dim, query_slot_num, title_slot_num, dataset_size=10000):
+ def __init__(self,
+ sparse_feature_dim,
+ query_slot_num,
+ title_slot_num,
+ dataset_size=10000):
# ids are randomly generated
self.ids_per_slot = 10
self.sparse_feature_dim = sparse_feature_dim
@@ -39,18 +45,25 @@ class SyntheticDataset(Dataset):
for i in range(self.query_slot_num):
qslot = generate_ids(self.ids_per_slot,
self.sparse_feature_dim)
- qslot = [str(fea) + ':' + str(i) for fea in qslot]
+ qslot = [str(fea) + ':' + str(i) for fea in qslot]
query_slots += qslot
for i in range(self.title_slot_num):
pt_slot = generate_ids(self.ids_per_slot,
self.sparse_feature_dim)
- pt_slot = [str(fea) + ':' + str(i + self.query_slot_num) for fea in pt_slot]
+ pt_slot = [
+ str(fea) + ':' + str(i + self.query_slot_num)
+ for fea in pt_slot
+ ]
pos_title_slots += pt_slot
if is_train:
for i in range(self.title_slot_num):
nt_slot = generate_ids(self.ids_per_slot,
self.sparse_feature_dim)
- nt_slot = [str(fea) + ':' + str(i + self.query_slot_num + self.title_slot_num) for fea in nt_slot]
+ nt_slot = [
+ str(fea) + ':' +
+ str(i + self.query_slot_num + self.title_slot_num)
+ for fea in nt_slot
+ ]
neg_title_slots += nt_slot
yield query_slots + pos_title_slots + neg_title_slots
else:
@@ -67,15 +80,17 @@ class SyntheticDataset(Dataset):
def test(self):
return self._reader_creator(False)
+
if __name__ == '__main__':
sparse_feature_dim = 1000001
query_slots = 1
title_slots = 1
dataset_size = 10
- dataset = SyntheticDataset(sparse_feature_dim, query_slots, title_slots, dataset_size)
+ dataset = SyntheticDataset(sparse_feature_dim, query_slots, title_slots,
+ dataset_size)
train_reader = dataset.train()
test_reader = dataset.test()
-
+
with open("data/train/train.txt", 'w') as fout:
for data in train_reader():
fout.write(' '.join(data))
diff --git a/models/recall/multiview-simnet/model.py b/models/match/multiview-simnet/model.py
similarity index 73%
rename from models/recall/multiview-simnet/model.py
rename to models/match/multiview-simnet/model.py
index c33c10033c55a2d95ce14ac7755eceba8a3a7dd1..f80a1cd0390f3c7aafc772ef535eb36b9657b439 100755
--- a/models/recall/multiview-simnet/model.py
+++ b/models/match/multiview-simnet/model.py
@@ -12,15 +12,13 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
-import math
import paddle.fluid as fluid
-import paddle.fluid.layers as layers
import paddle.fluid.layers.tensor as tensor
import paddle.fluid.layers.control_flow as cf
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
+
class BowEncoder(object):
""" bow-encoder """
@@ -97,79 +95,98 @@ class SimpleEncoderFactory(object):
rnn_encode = GrnnEncoder(hidden_size=enc_hid_size)
return rnn_encode
+
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
self.init_config()
-
+
def init_config(self):
- self._fetch_interval = 1
- query_encoder = envs.get_global_env("hyper_parameters.query_encoder", None, self._namespace)
- title_encoder = envs.get_global_env("hyper_parameters.title_encoder", None, self._namespace)
- query_encode_dim = envs.get_global_env("hyper_parameters.query_encode_dim", None, self._namespace)
- title_encode_dim = envs.get_global_env("hyper_parameters.title_encode_dim", None, self._namespace)
- query_slots = envs.get_global_env("hyper_parameters.query_slots", None, self._namespace)
- title_slots = envs.get_global_env("hyper_parameters.title_slots", None, self._namespace)
+ self._fetch_interval = 1
+ query_encoder = envs.get_global_env("hyper_parameters.query_encoder",
+ None, self._namespace)
+ title_encoder = envs.get_global_env("hyper_parameters.title_encoder",
+ None, self._namespace)
+ query_encode_dim = envs.get_global_env(
+ "hyper_parameters.query_encode_dim", None, self._namespace)
+ title_encode_dim = envs.get_global_env(
+ "hyper_parameters.title_encode_dim", None, self._namespace)
+ query_slots = envs.get_global_env("hyper_parameters.query_slots", None,
+ self._namespace)
+ title_slots = envs.get_global_env("hyper_parameters.title_slots", None,
+ self._namespace)
factory = SimpleEncoderFactory()
self.query_encoders = [
factory.create(query_encoder, query_encode_dim)
for i in range(query_slots)
]
- self.title_encoders = [
+ self.title_encoders = [
factory.create(title_encoder, title_encode_dim)
for i in range(title_slots)
]
- self.emb_size = envs.get_global_env("hyper_parameters.sparse_feature_dim", None, self._namespace)
- self.emb_dim = envs.get_global_env("hyper_parameters.embedding_dim", None, self._namespace)
- self.emb_shape = [self.emb_size, self.emb_dim]
- self.hidden_size = envs.get_global_env("hyper_parameters.hidden_size", None, self._namespace)
- self.margin = 0.1
+ self.emb_size = envs.get_global_env(
+ "hyper_parameters.sparse_feature_dim", None, self._namespace)
+ self.emb_dim = envs.get_global_env("hyper_parameters.embedding_dim",
+ None, self._namespace)
+ self.emb_shape = [self.emb_size, self.emb_dim]
+ self.hidden_size = envs.get_global_env("hyper_parameters.hidden_size",
+ None, self._namespace)
+ self.margin = 0.1
def input(self, is_train=True):
- self.q_slots = [
+ self.q_slots = [
fluid.data(
name="%d" % i, shape=[None, 1], lod_level=1, dtype='int64')
for i in range(len(self.query_encoders))
]
self.pt_slots = [
fluid.data(
- name="%d" % (i + len(self.query_encoders)), shape=[None, 1], lod_level=1, dtype='int64')
- for i in range(len(self.title_encoders))
+ name="%d" % (i + len(self.query_encoders)),
+ shape=[None, 1],
+ lod_level=1,
+ dtype='int64') for i in range(len(self.title_encoders))
]
- if is_train == False:
- return self.q_slots + self.pt_slots
+        if not is_train:
+ return self.q_slots + self.pt_slots
self.nt_slots = [
fluid.data(
- name="%d" % (i + len(self.query_encoders) + len(self.title_encoders)), shape=[None, 1], lod_level=1, dtype='int64')
- for i in range(len(self.title_encoders))
+ name="%d" %
+ (i + len(self.query_encoders) + len(self.title_encoders)),
+ shape=[None, 1],
+ lod_level=1,
+ dtype='int64') for i in range(len(self.title_encoders))
]
return self.q_slots + self.pt_slots + self.nt_slots
-
+
def train_input(self):
res = self.input()
self._data_var = res
- use_dataloader = envs.get_global_env("hyper_parameters.use_DataLoader", False, self._namespace)
+ use_dataloader = envs.get_global_env("hyper_parameters.use_DataLoader",
+ False, self._namespace)
if self._platform != "LINUX" or use_dataloader:
self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=256, use_double_buffer=False, iterable=False)
+ feed_list=self._data_var,
+ capacity=256,
+ use_double_buffer=False,
+ iterable=False)
def get_acc(self, x, y):
less = tensor.cast(cf.less_than(x, y), dtype='float32')
- label_ones = fluid.layers.fill_constant_batch_size_like(
+ label_ones = fluid.layers.fill_constant_batch_size_like(
input=x, dtype='float32', shape=[-1, 1], value=1.0)
correct = fluid.layers.reduce_sum(less)
- total = fluid.layers.reduce_sum(label_ones)
+ total = fluid.layers.reduce_sum(label_ones)
acc = fluid.layers.elementwise_div(correct, total)
- return acc
+ return acc
def net(self):
- q_embs = [
+ q_embs = [
fluid.embedding(
input=query, size=self.emb_shape, param_attr="emb")
for query in self.q_slots
@@ -184,16 +201,18 @@ class Model(ModelBase):
input=title, size=self.emb_shape, param_attr="emb")
for title in self.nt_slots
]
-
- # encode each embedding field with encoder
+
+ # encode each embedding field with encoder
q_encodes = [
self.query_encoders[i].forward(emb) for i, emb in enumerate(q_embs)
]
pt_encodes = [
- self.title_encoders[i].forward(emb) for i, emb in enumerate(pt_embs)
+ self.title_encoders[i].forward(emb)
+ for i, emb in enumerate(pt_embs)
]
nt_encodes = [
- self.title_encoders[i].forward(emb) for i, emb in enumerate(nt_embs)
+ self.title_encoders[i].forward(emb)
+ for i, emb in enumerate(nt_embs)
]
# concat multi view for query, pos_title, neg_title
@@ -201,7 +220,7 @@ class Model(ModelBase):
pt_concat = fluid.layers.concat(pt_encodes)
nt_concat = fluid.layers.concat(nt_encodes)
- # projection of hidden layer
+ # projection of hidden layer
q_hid = fluid.layers.fc(q_concat,
size=self.hidden_size,
param_attr='q_fc.w',
@@ -219,7 +238,7 @@ class Model(ModelBase):
cos_pos = fluid.layers.cos_sim(q_hid, pt_hid)
cos_neg = fluid.layers.cos_sim(q_hid, nt_hid)
- # pairwise hinge_loss
+ # pairwise hinge_loss
loss_part1 = fluid.layers.elementwise_sub(
tensor.fill_constant_batch_size_like(
input=cos_pos,
@@ -236,7 +255,7 @@ class Model(ModelBase):
loss_part2)
self.avg_cost = fluid.layers.mean(loss_part3)
- self.acc = self.get_acc(cos_neg, cos_pos)
+ self.acc = self.get_acc(cos_neg, cos_pos)
def avg_loss(self):
self._cost = self.avg_cost
@@ -252,20 +271,24 @@ class Model(ModelBase):
self.metrics()
def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
- optimizer = fluid.optimizer.Adam(learning_rate=learning_rate)
- return optimizer
+ learning_rate = envs.get_global_env("hyper_parameters.learning_rate",
+ None, self._namespace)
+ optimizer = fluid.optimizer.Adam(learning_rate=learning_rate)
+ return optimizer
def infer_input(self):
res = self.input(is_train=False)
- self._infer_data_var = res
+ self._infer_data_var = res
self._infer_data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._infer_data_var, capacity=64, use_double_buffer=False, iterable=False)
-
+ feed_list=self._infer_data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
+
def infer_net(self):
- self.infer_input()
- # lookup embedding for each slot
+ self.infer_input()
+ # lookup embedding for each slot
q_embs = [
fluid.embedding(
input=query, size=self.emb_shape, param_attr="emb")
@@ -276,14 +299,15 @@ class Model(ModelBase):
input=title, size=self.emb_shape, param_attr="emb")
for title in self.pt_slots
]
- # encode each embedding field with encoder
+ # encode each embedding field with encoder
q_encodes = [
self.query_encoders[i].forward(emb) for i, emb in enumerate(q_embs)
]
pt_encodes = [
- self.title_encoders[i].forward(emb) for i, emb in enumerate(pt_embs)
+ self.title_encoders[i].forward(emb)
+ for i, emb in enumerate(pt_embs)
]
- # concat multi view for query, pos_title, neg_title
+ # concat multi view for query, pos_title, neg_title
q_concat = fluid.layers.concat(q_encodes)
pt_concat = fluid.layers.concat(pt_encodes)
# projection of hidden layer
diff --git a/models/recall/multiview-simnet/reader.py b/models/match/multiview-simnet/reader.py
similarity index 84%
rename from models/recall/multiview-simnet/reader.py
rename to models/match/multiview-simnet/reader.py
index 34cabd415617bbbc5d4cfc942c5a48406e228d3d..4c0e42a44b0ea05272c832d65a6cfbc0d3f6c495 100755
--- a/models/recall/multiview-simnet/reader.py
+++ b/models/match/multiview-simnet/reader.py
@@ -1,4 +1,4 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -11,18 +11,17 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
-import io
-import copy
-import random
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
class TrainReader(Reader):
def init(self):
- self.query_slots = envs.get_global_env("hyper_parameters.query_slots", None, "train.model")
- self.title_slots = envs.get_global_env("hyper_parameters.title_slots", None, "train.model")
+ self.query_slots = envs.get_global_env("hyper_parameters.query_slots",
+ None, "train.model")
+ self.title_slots = envs.get_global_env("hyper_parameters.title_slots",
+ None, "train.model")
self.all_slots = []
for i in range(self.query_slots):
@@ -55,6 +54,7 @@ class TrainReader(Reader):
if visit:
self._all_slots_dict[slot][0] = False
else:
- output[index][1].append(padding)
+ output[index][1].append(padding)
yield output
+
return data_iter
diff --git a/models/match/readme.md b/models/match/readme.md
new file mode 100755
index 0000000000000000000000000000000000000000..d9f91b257d81ffde820a04cad49b56edbd903f6a
--- /dev/null
+++ b/models/match/readme.md
@@ -0,0 +1,39 @@
+# Matching Models
+
+## Introduction
+We provide PaddleRec implementations of model algorithms commonly used in matching tasks, along with single-machine training & prediction quality metrics and distributed training & prediction performance metrics. The implemented models include [DSSM](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/match/dssm) and [MultiView-Simnet](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/match/multiview-simnet).
+
+More models are being added to the library continuously; stay tuned.
+
+## Contents
+* [Overview](#overview)
+    * [Matching Model List](#matching-model-list)
+* [Tutorial](#tutorial)
+    * [Train & Predict](#train--predict)
+
+## Overview
+### Matching Model List
+
+|       Model        |       Description        |    Paper    |
+| :------------------: | :--------------------: | :---------: |
+| DSSM | Deep Structured Semantic Models | [Learning Deep Structured Semantic Models for Web Search using Clickthrough Data](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/cikm2013_DSSM_fullversion.pdf)(2013) |
+| MultiView-Simnet | Multi-view Simnet for Personalized recommendation | [A Multi-View Deep Learning Approach for Cross Domain User Modeling in Recommendation Systems](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/frp1159-songA.pdf)(2015) |
+
+A brief introduction to each model is given below (note: the figures are taken from the papers linked above).
+
+[DSSM](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/cikm2013_DSSM_fullversion.pdf):
+
+
+
+
+[MultiView-Simnet](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/frp1159-songA.pdf):
+
+
+
+
+## Tutorial
+### Train & Predict
+```shell
+python -m paddlerec.run -m paddlerec.models.match.dssm # dssm
+python -m paddlerec.run -m paddlerec.models.match.multiview-simnet # multiview-simnet
+```
diff --git a/models/multitask/__init__.py b/models/multitask/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/multitask/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/multitask/esmm/config.yaml b/models/multitask/esmm/config.yaml
index 5cc16a5d14ee077d962db1de39d7f4f405a43649..b1412515d4c751d0980eb128601cb08066562b41 100644
--- a/models/multitask/esmm/config.yaml
+++ b/models/multitask/esmm/config.yaml
@@ -12,33 +12,55 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
- epochs: 3
- workspace: "fleetrec.models.multitask.esmm"
+workspace: "paddlerec.models.multitask.esmm"
+
+dataset:
+- name: dataset_train
+ batch_size: 1
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/esmm_reader.py"
+- name: dataset_infer
+ batch_size: 1
+ type: QueueDataset
+ data_path: "{workspace}/data/test"
+ data_converter: "{workspace}/esmm_reader.py"
+
+hyper_parameters:
+ vocab_size: 10000
+ embed_size: 128
+ optimizer:
+ class: adam
+ learning_rate: 0.001
+ strategy: async
- reader:
- batch_size: 2
- class: "{workspace}/esmm_reader.py"
- train_data_path: "{workspace}/data/train"
+# switch to infer_runner mode and enable the 'infer' phase below to run inference
+mode: train_runner
+#mode: infer_runner
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- vocab_size: 10000
- embed_size: 128
- learning_rate: 0.001
- optimizer: adam
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 10
+- name: infer_runner
+ class: single_infer
+ init_model_path: "increment/0"
+ device: cpu
+ epochs: 3
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
+ #- name: infer
+ # model: "{workspace}/model.py"
+ # dataset_name: dataset_infer
+ # thread_num: 1
diff --git a/models/multitask/esmm/data/train/small.csv b/models/multitask/esmm/data/train/small.txt
similarity index 100%
rename from models/multitask/esmm/data/train/small.csv
rename to models/multitask/esmm/data/train/small.txt
diff --git a/models/multitask/esmm/esmm_reader.py b/models/multitask/esmm/esmm_reader.py
index a18702d39ae922ffd881c11966df0dfe8b713f51..5a3f3f916e1395a05b2f59a98132e5220dd224b9 100644
--- a/models/multitask/esmm/esmm_reader.py
+++ b/models/multitask/esmm/esmm_reader.py
@@ -11,21 +11,24 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
from collections import defaultdict
-import numpy as np
+
+from paddlerec.core.reader import Reader
class TrainReader(Reader):
def init(self):
- all_field_id = ['101', '109_14', '110_14', '127_14', '150_14', '121', '122', '124', '125', '126', '127', '128', '129',
- '205', '206', '207', '210', '216', '508', '509', '702', '853', '301']
+ all_field_id = [
+ '101', '109_14', '110_14', '127_14', '150_14', '121', '122', '124',
+ '125', '126', '127', '128', '129', '205', '206', '207', '210',
+ '216', '508', '509', '702', '853', '301'
+ ]
self.all_field_id_dict = defaultdict(int)
- for i,field_id in enumerate(all_field_id):
- self.all_field_id_dict[field_id] = [False,i]
+ for i, field_id in enumerate(all_field_id):
+ self.all_field_id_dict[field_id] = [False, i]
def generate_sample(self, line):
"""
@@ -37,30 +40,28 @@ class TrainReader(Reader):
This function needs to be implemented by the user, based on data format
"""
features = line.strip().split(',')
- #ctr = list(map(int, features[1]))
- #cvr = list(map(int, features[2]))
ctr = int(features[1])
cvr = int(features[2])
-
+
padding = 0
- output = [(field_id,[]) for field_id in self.all_field_id_dict]
+ output = [(field_id, []) for field_id in self.all_field_id_dict]
for elem in features[4:]:
- field_id,feat_id = elem.strip().split(':')
+ field_id, feat_id = elem.strip().split(':')
if field_id not in self.all_field_id_dict:
continue
self.all_field_id_dict[field_id][0] = True
index = self.all_field_id_dict[field_id][1]
- #feat_id = list(map(int, feat_id))
- output[index][1].append(int(feat_id))
-
+ output[index][1].append(int(feat_id))
+
for field_id in self.all_field_id_dict:
- visited,index = self.all_field_id_dict[field_id]
+ visited, index = self.all_field_id_dict[field_id]
if visited:
self.all_field_id_dict[field_id][0] = False
else:
- output[index][1].append(padding)
+ output[index][1].append(padding)
output.append(('ctr', [ctr]))
output.append(('cvr', [cvr]))
yield output
+
return reader
diff --git a/models/multitask/esmm/model.py b/models/multitask/esmm/model.py
index 0063cdfefd7abebb315750d13fa723207fb5eb08..b4b257ed8a74829d3619c3b07bbb0cfc8e69ddde 100644
--- a/models/multitask/esmm/model.py
+++ b/models/multitask/esmm/model.py
@@ -12,103 +12,119 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import math
+import numpy as np
import paddle.fluid as fluid
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
-import numpy as np
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
- def fc(self,tag, data, out_dim, active='prelu'):
-
- init_stddev = 1.0
- scales = 1.0 / np.sqrt(data.shape[1])
-
- p_attr = fluid.param_attr.ParamAttr(name='%s_weight' % tag,
- initializer=fluid.initializer.NormalInitializer(loc=0.0, scale=init_stddev * scales))
-
- b_attr = fluid.ParamAttr(name='%s_bias' % tag, initializer=fluid.initializer.Constant(0.1))
-
- out = fluid.layers.fc(input=data,
- size=out_dim,
- act=active,
- param_attr=p_attr,
- bias_attr =b_attr,
- name=tag)
- return out
-
- def input_data(self):
+ def _init_hyper_parameters(self):
+ self.vocab_size = envs.get_global_env("hyper_parameters.vocab_size")
+ self.embed_size = envs.get_global_env("hyper_parameters.embed_size")
+
+ def input_data(self, is_infer=False, **kwargs):
sparse_input_ids = [
- fluid.data(name="field_" + str(i), shape=[-1, 1], dtype="int64", lod_level=1) for i in range(0,23)
+ fluid.data(
+ name="field_" + str(i),
+ shape=[-1, 1],
+ dtype="int64",
+ lod_level=1) for i in range(0, 23)
]
label_ctr = fluid.data(name="ctr", shape=[-1, 1], dtype="int64")
label_cvr = fluid.data(name="cvr", shape=[-1, 1], dtype="int64")
inputs = sparse_input_ids + [label_ctr] + [label_cvr]
- self._data_var.extend(inputs)
-
- return inputs
-
- def net(self, inputs):
-
- vocab_size = envs.get_global_env("hyper_parameters.vocab_size", None, self._namespace)
- embed_size = envs.get_global_env("hyper_parameters.embed_size", None, self._namespace)
+        # train and infer share the same input layout
+        return inputs
+
+ def net(self, inputs, is_infer=False):
+
emb = []
+ # input feature data
for data in inputs[0:-2]:
- feat_emb = fluid.embedding(input=data,
- size=[vocab_size, embed_size],
- param_attr=fluid.ParamAttr(name='dis_emb',
- learning_rate=5,
- initializer=fluid.initializer.Xavier(fan_in=embed_size,fan_out=embed_size)
- ),
- is_sparse=True)
- field_emb = fluid.layers.sequence_pool(input=feat_emb,pool_type='sum')
+ feat_emb = fluid.embedding(
+ input=data,
+ size=[self.vocab_size, self.embed_size],
+ param_attr=fluid.ParamAttr(
+ name='dis_emb',
+ learning_rate=5,
+ initializer=fluid.initializer.Xavier(
+ fan_in=self.embed_size, fan_out=self.embed_size)),
+ is_sparse=True)
+ field_emb = fluid.layers.sequence_pool(
+ input=feat_emb, pool_type='sum')
emb.append(field_emb)
concat_emb = fluid.layers.concat(emb, axis=1)
-
+
# ctr
active = 'relu'
- ctr_fc1 = self.fc('ctr_fc1', concat_emb, 200, active)
- ctr_fc2 = self.fc('ctr_fc2', ctr_fc1, 80, active)
- ctr_out = self.fc('ctr_out', ctr_fc2, 2, 'softmax')
-
+ ctr_fc1 = self._fc('ctr_fc1', concat_emb, 200, active)
+ ctr_fc2 = self._fc('ctr_fc2', ctr_fc1, 80, active)
+ ctr_out = self._fc('ctr_out', ctr_fc2, 2, 'softmax')
+
# cvr
- cvr_fc1 = self.fc('cvr_fc1', concat_emb, 200, active)
- cvr_fc2 = self.fc('cvr_fc2', cvr_fc1, 80, active)
- cvr_out = self.fc('cvr_out', cvr_fc2, 2,'softmax')
-
+ cvr_fc1 = self._fc('cvr_fc1', concat_emb, 200, active)
+ cvr_fc2 = self._fc('cvr_fc2', cvr_fc1, 80, active)
+ cvr_out = self._fc('cvr_out', cvr_fc2, 2, 'softmax')
+
ctr_clk = inputs[-2]
ctcvr_buy = inputs[-1]
-
- ctr_prop_one = fluid.layers.slice(ctr_out, axes=[1], starts=[1], ends=[2])
- cvr_prop_one = fluid.layers.slice(cvr_out, axes=[1], starts=[1], ends=[2])
-
- ctcvr_prop_one = fluid.layers.elementwise_mul(ctr_prop_one, cvr_prop_one)
- ctcvr_prop = fluid.layers.concat(input=[1-ctcvr_prop_one,ctcvr_prop_one], axis = 1)
-
+
+ ctr_prop_one = fluid.layers.slice(
+ ctr_out, axes=[1], starts=[1], ends=[2])
+ cvr_prop_one = fluid.layers.slice(
+ cvr_out, axes=[1], starts=[1], ends=[2])
+
+ ctcvr_prop_one = fluid.layers.elementwise_mul(ctr_prop_one,
+ cvr_prop_one)
+ ctcvr_prop = fluid.layers.concat(
+ input=[1 - ctcvr_prop_one, ctcvr_prop_one], axis=1)
+
+ auc_ctr, batch_auc_ctr, auc_states_ctr = fluid.layers.auc(
+ input=ctr_out, label=ctr_clk)
+ auc_ctcvr, batch_auc_ctcvr, auc_states_ctcvr = fluid.layers.auc(
+ input=ctcvr_prop, label=ctcvr_buy)
+
+ if is_infer:
+ self._infer_results["AUC_ctr"] = auc_ctr
+ self._infer_results["AUC_ctcvr"] = auc_ctcvr
+ return
+
loss_ctr = fluid.layers.cross_entropy(input=ctr_out, label=ctr_clk)
- loss_ctcvr = fluid.layers.cross_entropy(input=ctcvr_prop, label=ctcvr_buy)
+ loss_ctcvr = fluid.layers.cross_entropy(
+ input=ctcvr_prop, label=ctcvr_buy)
cost = loss_ctr + loss_ctcvr
avg_cost = fluid.layers.mean(cost)
- auc_ctr, batch_auc_ctr, auc_states_ctr = fluid.layers.auc(input=ctr_out, label=ctr_clk)
- auc_ctcvr, batch_auc_ctcvr, auc_states_ctcvr = fluid.layers.auc(input=ctcvr_prop, label=ctcvr_buy)
-
self._cost = avg_cost
self._metrics["AUC_ctr"] = auc_ctr
self._metrics["BATCH_AUC_ctr"] = batch_auc_ctr
self._metrics["AUC_ctcvr"] = auc_ctcvr
self._metrics["BATCH_AUC_ctcvr"] = batch_auc_ctcvr
+ def _fc(self, tag, data, out_dim, active='prelu'):
+
+ init_stddev = 1.0
+ scales = 1.0 / np.sqrt(data.shape[1])
- def train_net(self):
- input_data = self.input_data()
- self.net(input_data)
+ p_attr = fluid.param_attr.ParamAttr(
+ name='%s_weight' % tag,
+ initializer=fluid.initializer.NormalInitializer(
+ loc=0.0, scale=init_stddev * scales))
+ b_attr = fluid.ParamAttr(
+ name='%s_bias' % tag, initializer=fluid.initializer.Constant(0.1))
- def infer_net(self):
- pass
+ out = fluid.layers.fc(input=data,
+ size=out_dim,
+ act=active,
+ param_attr=p_attr,
+ bias_attr=b_attr,
+ name=tag)
+ return out
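The core of the ESMM `net` above is the probability composition: the CTCVR head multiplies the positive-class probabilities of the CTR and CVR towers, then rebuilds a two-class distribution for the cross-entropy and AUC ops. A NumPy sketch of that composition (shapes and slicing mirror the `fluid.layers.slice`/`elementwise_mul`/`concat` calls, but this is an illustration, not the Paddle graph itself):

```python
import numpy as np

# Sketch of the ESMM probability composition in model.py:
# pCTCVR = pCTR * pCVR, expanded back to a [1 - p, p] distribution.
def ctcvr_prob(ctr_out, cvr_out):
    ctr_p = ctr_out[:, 1:2]   # slice the positive-class probability
    cvr_p = cvr_out[:, 1:2]
    p = ctr_p * cvr_p         # elementwise pCTR * pCVR
    return np.concatenate([1 - p, p], axis=1)

probs = ctcvr_prob(np.array([[0.8, 0.2]]), np.array([[0.5, 0.5]]))
# probs -> [[0.9, 0.1]]
```

Training on the composed CTCVR probability is what lets ESMM supervise the CVR tower with click-through data alone, without ever sampling from the post-click space.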
diff --git a/models/multitask/mmoe/census_reader.py b/models/multitask/mmoe/census_reader.py
index 323c15a40a29070285b89351612b1efdf162cd2c..d71133bd91692c8b17e7449aa305e5241db7777a 100644
--- a/models/multitask/mmoe/census_reader.py
+++ b/models/multitask/mmoe/census_reader.py
@@ -11,11 +11,10 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
-import numpy as np
+from paddlerec.core.reader import Reader
class TrainReader(Reader):
@@ -25,6 +24,7 @@ class TrainReader(Reader):
def generate_sample(self, line):
"""
Read the data line by line and process it as a dictionary
+
"""
def reader():
@@ -44,8 +44,8 @@ class TrainReader(Reader):
label_marital = [1, 0]
elif int(l[0]) == 1:
label_marital = [0, 1]
- #label_income = np.array(label_income)
- #label_marital = np.array(label_marital)
+ # label_income = np.array(label_income)
+ # label_marital = np.array(label_marital)
feature_name = ["input", "label_income", "label_marital"]
yield zip(feature_name, [data] + [label_income] + [label_marital])
diff --git a/models/multitask/mmoe/config.yaml b/models/multitask/mmoe/config.yaml
index ca6552d8045761792b362fa3345e886cde396abb..9f36f84991ea30ffeb1745bc2d769b19a9887ab2 100644
--- a/models/multitask/mmoe/config.yaml
+++ b/models/multitask/mmoe/config.yaml
@@ -12,36 +12,57 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
+workspace: "paddlerec.models.multitask.mmoe"
- epochs: 3
- workspace: "fleetrec.models.multitask.mmoe"
+dataset:
+- name: dataset_train
+ batch_size: 1
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/census_reader.py"
+- name: dataset_infer
+ batch_size: 1
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/census_reader.py"
+
+hyper_parameters:
+ feature_size: 499
+ expert_num: 8
+ gate_num: 2
+ expert_size: 16
+ tower_size: 8
+ optimizer:
+ class: adam
+ learning_rate: 0.001
+ strategy: async
- reader:
- batch_size: 2
- class: "{workspace}/census_reader.py"
- train_data_path: "{workspace}/data/train"
+# use infer_runner mode and modify 'phase' below for inference
+mode: train_runner
+#mode: infer_runner
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- feature_size: 499
- expert_num: 8
- gate_num: 2
- expert_size: 16
- tower_size: 8
- learning_rate: 0.001
- optimizer: adam
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 10
+- name: infer_runner
+ class: single_infer
+ init_model_path: "increment/0"
+ device: cpu
+ epochs: 3
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
+#- name: infer
+#  model: "{workspace}/model.py"
+#  dataset_name: dataset_infer
+#  thread_num: 1
diff --git a/models/multitask/mmoe/data/run.sh b/models/multitask/mmoe/data/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..b60d42b37057593b1c16aa5fd91b8217a5a71bbf
--- /dev/null
+++ b/models/multitask/mmoe/data/run.sh
@@ -0,0 +1,16 @@
+mkdir -p train_data
+mkdir -p test_data
+mkdir -p data
+train_path="data/census-income.data"
+test_path="data/census-income.test"
+train_data_path="train_data/"
+test_data_path="test_data/"
+pip install -r requirements.txt
+
+wget -P data/ https://archive.ics.uci.edu/ml/machine-learning-databases/census-income-mld/census.tar.gz
+tar -zxvf data/census.tar.gz -C data/
+
+python data_preparation.py --train_path ${train_path} \
+                           --test_path ${test_path} \
+                           --train_data_path ${train_data_path} \
+                           --test_data_path ${test_data_path}
diff --git a/models/multitask/mmoe/data/train/train_data.txt b/models/multitask/mmoe/data/train/train_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..ba385736663d5efd4321692d1fbafda8bbf585c1
--- /dev/null
+++ b/models/multitask/mmoe/data/train/train_data.txt
@@ -0,0 +1,30 @@
+0,0,73,0,0,0,0,1700.09,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,73,0,0,0,0,1700.09,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,73,0,0,0,0,1700.09,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,58,0,0,0,0,1053.55,1,0,2,52,94,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,18,0,0,0,0,991.95,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0
+1,0,9,0,0,0,0,1758.14,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,10,0,0,0,0,1069.16,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,48,1200,0,0,0,162.61,1,2,2,52,95,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,42,0,5178,0,0,1535.86,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,28,0,0,0,0,898.83,4,0,2,30,95,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,47,876,0,0,0,1661.53,5,0,2,52,95,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,34,0,0,0,0,1146.79,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,58,0,0,0,0,1053.55,1,0,2,52,94,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,18,0,0,0,0,991.95,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0
+1,0,9,0,0,0,0,1758.14,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,10,0,0,0,0,1069.16,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,48,1200,0,0,0,162.61,1,2,2,52,95,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,42,0,5178,0,0,1535.86,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,28,0,0,0,0,898.83,4,0,2,30,95,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,47,876,0,0,0,1661.53,5,0,2,52,95,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,34,0,0,0,0,1146.79,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,58,0,0,0,0,1053.55,1,0,2,52,94,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,18,0,0,0,0,991.95,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0
+1,0,9,0,0,0,0,1758.14,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,10,0,0,0,0,1069.16,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,48,1200,0,0,0,162.61,1,2,2,52,95,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,42,0,5178,0,0,1535.86,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+1,0,28,0,0,0,0,898.83,4,0,2,30,95,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,47,876,0,0,0,1661.53,5,0,2,52,95,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
+0,0,34,0,0,0,0,1146.79,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
diff --git a/models/multitask/mmoe/model.py b/models/multitask/mmoe/model.py
index 0b757d0270e7f5159bf4154e66b8e321fb9d9885..309da6a31e8754110fb8c9d50971bc4dc9aff364 100644
--- a/models/multitask/mmoe/model.py
+++ b/models/multitask/mmoe/model.py
@@ -12,93 +12,118 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import math
import paddle.fluid as fluid
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
- def MMOE(self):
+ def _init_hyper_parameters(self):
+ self.feature_size = envs.get_global_env(
+ "hyper_parameters.feature_size")
+ self.expert_num = envs.get_global_env("hyper_parameters.expert_num")
+ self.gate_num = envs.get_global_env("hyper_parameters.gate_num")
+ self.expert_size = envs.get_global_env("hyper_parameters.expert_size")
+ self.tower_size = envs.get_global_env("hyper_parameters.tower_size")
- feature_size = envs.get_global_env("hyper_parameters.feature_size", None, self._namespace)
- expert_num = envs.get_global_env("hyper_parameters.expert_num", None, self._namespace)
- gate_num = envs.get_global_env("hyper_parameters.gate_num", None, self._namespace)
- expert_size = envs.get_global_env("hyper_parameters.expert_size", None, self._namespace)
- tower_size = envs.get_global_env("hyper_parameters.tower_size", None, self._namespace)
+ def input_data(self, is_infer=False, **kwargs):
+ inputs = fluid.data(
+ name="input", shape=[-1, self.feature_size], dtype="float32")
+ label_income = fluid.data(
+ name="label_income", shape=[-1, 2], dtype="float32", lod_level=0)
+ label_marital = fluid.data(
+ name="label_marital", shape=[-1, 2], dtype="float32", lod_level=0)
+ if is_infer:
+ return [inputs, label_income, label_marital]
+ else:
+ return [inputs, label_income, label_marital]
+
+ def net(self, inputs, is_infer=False):
+ input_data = inputs[0]
+ label_income = inputs[1]
+ label_marital = inputs[2]
- input_data = fluid.data(name="input", shape=[-1, feature_size], dtype="float32")
- label_income = fluid.data(name="label_income", shape=[-1, 2], dtype="float32", lod_level=0)
- label_marital = fluid.data(name="label_marital", shape=[-1, 2], dtype="float32", lod_level=0)
-
- self._data_var.extend([input_data, label_income, label_marital])
# f_{i}(x) = activation(W_{i} * x + b), where activation is ReLU according to the paper
expert_outputs = []
- for i in range(0, expert_num):
- expert_output = fluid.layers.fc(input=input_data,
- size=expert_size,
- act='relu',
- bias_attr=fluid.ParamAttr(learning_rate=1.0),
- name='expert_' + str(i))
+ for i in range(0, self.expert_num):
+ expert_output = fluid.layers.fc(
+ input=input_data,
+ size=self.expert_size,
+ act='relu',
+ bias_attr=fluid.ParamAttr(learning_rate=1.0),
+ name='expert_' + str(i))
expert_outputs.append(expert_output)
expert_concat = fluid.layers.concat(expert_outputs, axis=1)
- expert_concat = fluid.layers.reshape(expert_concat,[-1, expert_num, expert_size])
-
-
+ expert_concat = fluid.layers.reshape(
+ expert_concat, [-1, self.expert_num, self.expert_size])
+
# g^{k}(x) = activation(W_{gk} * x + b), where activation is softmax according to the paper
output_layers = []
- for i in range(0, gate_num):
- cur_gate = fluid.layers.fc(input=input_data,
- size=expert_num,
- act='softmax',
- bias_attr=fluid.ParamAttr(learning_rate=1.0),
- name='gate_' + str(i))
+ for i in range(0, self.gate_num):
+ cur_gate = fluid.layers.fc(
+ input=input_data,
+ size=self.expert_num,
+ act='softmax',
+ bias_attr=fluid.ParamAttr(learning_rate=1.0),
+ name='gate_' + str(i))
# f^{k}(x) = sum_{i=1}^{n}(g^{k}(x)_{i} * f_{i}(x))
- cur_gate_expert = fluid.layers.elementwise_mul(expert_concat, cur_gate, axis=0)
+ cur_gate_expert = fluid.layers.elementwise_mul(
+ expert_concat, cur_gate, axis=0)
cur_gate_expert = fluid.layers.reduce_sum(cur_gate_expert, dim=1)
# Build tower layer
- cur_tower = fluid.layers.fc(input=cur_gate_expert,
- size=tower_size,
- act='relu',
- name='task_layer_' + str(i))
- out = fluid.layers.fc(input=cur_tower,
- size=2,
- act='softmax',
- name='out_' + str(i))
-
+ cur_tower = fluid.layers.fc(input=cur_gate_expert,
+ size=self.tower_size,
+ act='relu',
+ name='task_layer_' + str(i))
+ out = fluid.layers.fc(input=cur_tower,
+ size=2,
+ act='softmax',
+ name='out_' + str(i))
+
output_layers.append(out)
- pred_income = fluid.layers.clip(output_layers[0], min=1e-15, max=1.0 - 1e-15)
- pred_marital = fluid.layers.clip(output_layers[1], min=1e-15, max=1.0 - 1e-15)
-
- cost_income = fluid.layers.cross_entropy(input=pred_income, label=label_income,soft_label = True)
- cost_marital = fluid.layers.cross_entropy(input=pred_marital, label=label_marital,soft_label = True)
-
- label_income_1 = fluid.layers.slice(label_income, axes=[1], starts=[1], ends=[2])
- label_marital_1 = fluid.layers.slice(label_marital, axes=[1], starts=[1], ends=[2])
-
- auc_income, batch_auc_1, auc_states_1 = fluid.layers.auc(input=pred_income, label=fluid.layers.cast(x=label_income_1, dtype='int64'))
- auc_marital, batch_auc_2, auc_states_2 = fluid.layers.auc(input=pred_marital, label=fluid.layers.cast(x=label_marital_1, dtype='int64'))
-
+ pred_income = fluid.layers.clip(
+ output_layers[0], min=1e-15, max=1.0 - 1e-15)
+ pred_marital = fluid.layers.clip(
+ output_layers[1], min=1e-15, max=1.0 - 1e-15)
+
+ label_income_1 = fluid.layers.slice(
+ label_income, axes=[1], starts=[1], ends=[2])
+ label_marital_1 = fluid.layers.slice(
+ label_marital, axes=[1], starts=[1], ends=[2])
+
+ auc_income, batch_auc_1, auc_states_1 = fluid.layers.auc(
+ input=pred_income,
+ label=fluid.layers.cast(
+ x=label_income_1, dtype='int64'))
+ auc_marital, batch_auc_2, auc_states_2 = fluid.layers.auc(
+ input=pred_marital,
+ label=fluid.layers.cast(
+ x=label_marital_1, dtype='int64'))
+ if is_infer:
+ self._infer_results["AUC_income"] = auc_income
+ self._infer_results["AUC_marital"] = auc_marital
+ return
+
+ cost_income = fluid.layers.cross_entropy(
+ input=pred_income, label=label_income, soft_label=True)
+ cost_marital = fluid.layers.cross_entropy(
+ input=pred_marital, label=label_marital, soft_label=True)
+
avg_cost_income = fluid.layers.mean(x=cost_income)
avg_cost_marital = fluid.layers.mean(x=cost_marital)
-
- cost = avg_cost_income + avg_cost_marital
-
+
+ cost = avg_cost_income + avg_cost_marital
+
self._cost = cost
self._metrics["AUC_income"] = auc_income
self._metrics["BATCH_AUC_income"] = batch_auc_1
self._metrics["AUC_marital"] = auc_marital
self._metrics["BATCH_AUC_marital"] = batch_auc_2
-
- def train_net(self):
- self.MMOE()
-
-
def infer_net(self):
pass
diff --git a/models/multitask/readme.md b/models/multitask/readme.md
new file mode 100755
index 0000000000000000000000000000000000000000..07a6c01d77b72ed47153c3fad92521429a4769a2
--- /dev/null
+++ b/models/multitask/readme.md
@@ -0,0 +1,95 @@
+# 多任务学习模型库
+
+## 简介
+我们提供了常见的多任务学习中使用的模型算法的PaddleRec实现, 单机训练&预测效果指标以及分布式训练&预测性能指标等。实现的多任务模型包括 [MMoE](mmoe)、[Share-Bottom](share-bottom)、[ESMM](esmm)。
+
+模型算法库在持续添加中,欢迎关注。
+
+## 目录
+* [整体介绍](#整体介绍)
+ * [多任务模型列表](#多任务模型列表)
+* [使用教程](#使用教程)
+ * [数据处理](#数据处理)
+ * [训练](#训练)
+ * [预测](#预测)
+* [效果对比](#效果对比)
+ * [模型效果列表](#模型效果列表)
+
+## 整体介绍
+### 多任务模型列表
+
+| 模型 | 简介 | 论文 |
+| :------------------: | :--------------------: | :---------: |
+| Share-Bottom | 共享底层网络的多任务模型 | [Multitask learning](http://reports-archive.adm.cs.cmu.edu/anon/1997/CMU-CS-97-203.pdf)(1998) |
+| ESMM | Entire Space Multi-Task Model | [Entire Space Multi-Task Model: An Effective Approach for Estimating Post-Click Conversion Rate](https://arxiv.org/abs/1804.07931)(2018) |
+| MMoE | Multi-gate Mixture-of-Experts | [Modeling Task Relationships in Multi-task Learning with Multi-gate Mixture-of-Experts](https://dl.acm.org/doi/abs/10.1145/3219819.3220007)(2018) |
+
+下面是每个模型的简介(注:图片引用自链接中的论文)
+
+
+[ESMM](https://arxiv.org/abs/1804.07931):
+
+
+
+
+[Share-Bottom](http://reports-archive.adm.cs.cmu.edu/anon/1997/CMU-CS-97-203.pdf):
+
+
+
+
+[MMoE](https://dl.acm.org/doi/abs/10.1145/3219819.3220007):
+
+
+
+
+## 使用教程(快速开始)
+```shell
+python -m paddlerec.run -m paddlerec.models.multitask.mmoe # mmoe
+python -m paddlerec.run -m paddlerec.models.multitask.share-bottom # share-bottom
+python -m paddlerec.run -m paddlerec.models.multitask.esmm # esmm
+```
+
+## 使用教程(复现论文)
+### 注意
+为了方便使用者快速跑通每一个模型,我们在每个模型下都提供了样例数据,并调整了batch_size等超参,以便在样例数据上更加友好地显示训练&测试日志。如果需要复现readme中的效果,请按照如下表格调整batch_size等超参,并使用提供的脚本下载对应数据集以及进行数据预处理。
+
+| 模型 | batch_size | thread_num | epoch_num |
+| :------------------: | :--------------------: | :--------------------: | :--------------------: |
+| Share-Bottom | 32 | 1 | 400 |
+| MMoE | 32 | 1 | 400 |
+| ESMM | 64 | 2 | 100 |
+
+### 数据处理
+参考每个模型目录下的数据下载&预处理脚本
+
+```
+sh run.sh
+```
+
+### 训练
+```
+cd models/multitask/mmoe # 进入选定的多任务模型目录,以MMoE为例
+python -m paddlerec.run -m ./config.yaml # 自定义修改超参后,指定配置文件,使用自定义配置
+```
+
+### 预测
+```
+# 修改对应模型的config.yaml, workspace配置为当前目录的绝对路径
+# 修改对应模型的config.yaml,mode配置infer_runner
+# 示例: mode: train_runner -> mode: infer_runner
+# infer_runner中 class配置为 class: single_infer
+# 修改phase阶段为infer的配置,参照config注释
+
+# 修改完config.yaml后 执行:
+python -m paddlerec.run -m ./config.yaml # 以MMoE为例
+```
+
+
+## 效果对比
+### 模型效果列表
+
+| 数据集 | 模型 | loss | auc(任务1/任务2) |
+| :------------------: | :--------------------: | :---------: |:---------: |
+| Census-income Data | Share-Bottom | -- | 0.93120/0.99256 |
+| Census-income Data | MMoE | -- | 0.94465/0.99324 |
+| Ali-CCP | ESMM | -- | 0.97181/0.49967 |
diff --git a/models/multitask/share-bottom/census_reader.py b/models/multitask/share-bottom/census_reader.py
index 323c15a40a29070285b89351612b1efdf162cd2c..211e566882e5d8a7f50f22b0a1628307777099c8 100644
--- a/models/multitask/share-bottom/census_reader.py
+++ b/models/multitask/share-bottom/census_reader.py
@@ -11,11 +11,10 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
-import numpy as np
+from paddlerec.core.reader import Reader
class TrainReader(Reader):
@@ -44,8 +43,8 @@ class TrainReader(Reader):
label_marital = [1, 0]
elif int(l[0]) == 1:
label_marital = [0, 1]
- #label_income = np.array(label_income)
- #label_marital = np.array(label_marital)
+ # label_income = np.array(label_income)
+ # label_marital = np.array(label_marital)
feature_name = ["input", "label_income", "label_marital"]
yield zip(feature_name, [data] + [label_income] + [label_marital])
diff --git a/models/multitask/share-bottom/config.yaml b/models/multitask/share-bottom/config.yaml
index 432ebdf331f098580dfad2019c05af3e4e47553d..3a44b8e7b23a545e5daf67a789a0c3537f614c4e 100644
--- a/models/multitask/share-bottom/config.yaml
+++ b/models/multitask/share-bottom/config.yaml
@@ -12,35 +12,56 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
+workspace: "paddlerec.models.multitask.share-bottom"
- epochs: 3
- workspace: "fleetrec.models.multitask.share-bottom"
+dataset:
+- name: dataset_train
+ batch_size: 1
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/census_reader.py"
+- name: dataset_infer
+ batch_size: 1
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/census_reader.py"
+
+hyper_parameters:
+ feature_size: 499
+ bottom_size: 117
+ tower_nums: 2
+ tower_size: 8
+ optimizer:
+ class: adam
+ learning_rate: 0.001
+ strategy: async
- reader:
- batch_size: 2
- class: "{workspace}/census_reader.py"
- train_data_path: "{workspace}/data/train"
+#use infer_runner mode and modify 'phase' below if infer
+mode: train_runner
+#mode: infer_runner
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- feature_size: 499
- bottom_size: 117
- tower_nums: 2
- tower_size: 8
- learning_rate: 0.001
- optimizer: adam
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 5
+- name: infer_runner
+ class: single_infer
+ init_model_path: "increment/0"
+ device: cpu
+ epochs: 3
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
+ #- name: infer
+ # model: "{workspace}/model.py"
+ # dataset_name: dataset_infer
+ # thread_num: 1
diff --git a/models/multitask/share-bottom/data/train/train_data b/models/multitask/share-bottom/data/train/train_data
deleted file mode 100644
index 992314e443942c1b3e08a7db88bf2c1d7354c451..0000000000000000000000000000000000000000
--- a/models/multitask/share-bottom/data/train/train_data
+++ /dev/null
@@ -1,10 +0,0 @@
-0,0,73,0,0,0,0,1700.09,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-0,0,58,0,0,0,0,1053.55,1,0,2,52,94,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-1,0,18,0,0,0,0,991.95,0,0,2,0,95,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0
-1,0,9,0,0,0,0,1758.14,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-1,0,10,0,0,0,0,1069.16,0,0,0,0,94,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-0,0,48,1200,0,0,0,162.61,1,2,2,52,95,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-0,0,42,0,5178,0,0,1535.86,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-1,0,28,0,0,0,0,898.83,4,0,2,30,95,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-0,0,47,876,0,0,0,1661.53,5,0,2,52,95,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
-0,0,34,0,0,0,0,1146.79,6,0,2,52,94,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0
diff --git a/models/multitask/mmoe/data/train/train_data b/models/multitask/share-bottom/data/train/train_data.txt
similarity index 100%
rename from models/multitask/mmoe/data/train/train_data
rename to models/multitask/share-bottom/data/train/train_data.txt
diff --git a/models/multitask/share-bottom/model.py b/models/multitask/share-bottom/model.py
index a364f1b82b9f4a6849e67cd1ff34fa8d179b475c..0275d3a10b3dd4f35388da10b303d86421228695 100644
--- a/models/multitask/share-bottom/model.py
+++ b/models/multitask/share-bottom/model.py
@@ -12,78 +12,94 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import math
import paddle.fluid as fluid
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
- def train(self):
-
- feature_size = envs.get_global_env("hyper_parameters.feature_size", None, self._namespace)
- bottom_size = envs.get_global_env("hyper_parameters.bottom_size", None, self._namespace)
- tower_size = envs.get_global_env("hyper_parameters.tower_size", None, self._namespace)
- tower_nums = envs.get_global_env("hyper_parameters.tower_nums", None, self._namespace)
-
- input_data = fluid.data(name="input", shape=[-1, feature_size], dtype="float32")
- label_income = fluid.data(name="label_income", shape=[-1, 2], dtype="float32", lod_level=0)
- label_marital = fluid.data(name="label_marital", shape=[-1, 2], dtype="float32", lod_level=0)
-
- self._data_var.extend([input_data, label_income, label_marital])
-
- bottom_output = fluid.layers.fc(input=input_data,
- size=bottom_size,
- act='relu',
- bias_attr=fluid.ParamAttr(learning_rate=1.0),
- name='bottom_output')
-
-
+ def _init_hyper_parameters(self):
+ self.feature_size = envs.get_global_env(
+ "hyper_parameters.feature_size")
+ self.bottom_size = envs.get_global_env("hyper_parameters.bottom_size")
+ self.tower_size = envs.get_global_env("hyper_parameters.tower_size")
+ self.tower_nums = envs.get_global_env("hyper_parameters.tower_nums")
+
+ def input_data(self, is_infer=False, **kwargs):
+ inputs = fluid.data(
+ name="input", shape=[-1, self.feature_size], dtype="float32")
+ label_income = fluid.data(
+ name="label_income", shape=[-1, 2], dtype="float32", lod_level=0)
+ label_marital = fluid.data(
+ name="label_marital", shape=[-1, 2], dtype="float32", lod_level=0)
+        return [inputs, label_income, label_marital]
+
+ def net(self, inputs, is_infer=False):
+ input_data = inputs[0]
+ label_income = inputs[1]
+ label_marital = inputs[2]
+
+ bottom_output = fluid.layers.fc(
+ input=input_data,
+ size=self.bottom_size,
+ act='relu',
+ bias_attr=fluid.ParamAttr(learning_rate=1.0),
+ name='bottom_output')
+
# Build tower layer from bottom layer
output_layers = []
- for index in range(tower_nums):
+ for index in range(self.tower_nums):
tower_layer = fluid.layers.fc(input=bottom_output,
- size=tower_size,
- act='relu',
- name='task_layer_' + str(index))
+ size=self.tower_size,
+ act='relu',
+ name='task_layer_' + str(index))
output_layer = fluid.layers.fc(input=tower_layer,
- size=2,
- act='softmax',
- name='output_layer_' + str(index))
+ size=2,
+ act='softmax',
+ name='output_layer_' + str(index))
output_layers.append(output_layer)
-        pred_income = fluid.layers.clip(output_layers[0], min=1e-15, max=1.0 - 1e-15)
-        pred_marital = fluid.layers.clip(output_layers[1], min=1e-15, max=1.0 - 1e-15)
-
-        cost_income = fluid.layers.cross_entropy(input=pred_income, label=label_income,soft_label = True)
-        cost_marital = fluid.layers.cross_entropy(input=pred_marital, label=label_marital,soft_label = True)
-
-
-        label_income_1 = fluid.layers.slice(label_income, axes=[1], starts=[1], ends=[2])
-        label_marital_1 = fluid.layers.slice(label_marital, axes=[1], starts=[1], ends=[2])
-
-        auc_income, batch_auc_1, auc_states_1 = fluid.layers.auc(input=pred_income, label=fluid.layers.cast(x=label_income_1, dtype='int64'))
-        auc_marital, batch_auc_2, auc_states_2 = fluid.layers.auc(input=pred_marital, label=fluid.layers.cast(x=label_marital_1, dtype='int64'))
-
-        cost = fluid.layers.elementwise_add(cost_income, cost_marital, axis=1)
-
-        avg_cost = fluid.layers.mean(x=cost)
-
+        pred_income = fluid.layers.clip(
+            output_layers[0], min=1e-15, max=1.0 - 1e-15)
+        pred_marital = fluid.layers.clip(
+            output_layers[1], min=1e-15, max=1.0 - 1e-15)
+
+        label_income_1 = fluid.layers.slice(
+            label_income, axes=[1], starts=[1], ends=[2])
+        label_marital_1 = fluid.layers.slice(
+            label_marital, axes=[1], starts=[1], ends=[2])
+
+        auc_income, batch_auc_1, auc_states_1 = fluid.layers.auc(
+            input=pred_income,
+            label=fluid.layers.cast(
+                x=label_income_1, dtype='int64'))
+        auc_marital, batch_auc_2, auc_states_2 = fluid.layers.auc(
+            input=pred_marital,
+            label=fluid.layers.cast(
+                x=label_marital_1, dtype='int64'))
+
+        if is_infer:
+            self._infer_results["AUC_income"] = auc_income
+            self._infer_results["AUC_marital"] = auc_marital
+            return
+
+        cost_income = fluid.layers.cross_entropy(
+            input=pred_income, label=label_income, soft_label=True)
+        cost_marital = fluid.layers.cross_entropy(
+            input=pred_marital, label=label_marital, soft_label=True)
+        cost = fluid.layers.elementwise_add(cost_income, cost_marital, axis=1)
+        avg_cost = fluid.layers.mean(x=cost)
self._cost = avg_cost
self._metrics["AUC_income"] = auc_income
self._metrics["BATCH_AUC_income"] = batch_auc_1
self._metrics["AUC_marital"] = auc_marital
self._metrics["BATCH_AUC_marital"] = batch_auc_2
-
-
- def train_net(self):
- self.train()
-
-
- def infer_net(self):
- pass
diff --git a/models/rank/__init__.py b/models/rank/__init__.py
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..abf198b97e6e818e1fbe59006f98492640bcee54 100755
--- a/models/rank/__init__.py
+++ b/models/rank/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/rank/criteo_reader.py b/models/rank/criteo_reader.py
deleted file mode 100755
index c4930c8ddc628c13f5c958743673bd42c507161f..0000000000000000000000000000000000000000
--- a/models/rank/criteo_reader.py
+++ /dev/null
@@ -1,60 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-from __future__ import print_function
-
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
-
-
-class TrainReader(Reader):
- def init(self):
- self.cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
- self.cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
- self.cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
- self.hash_dim_ = envs.get_global_env("hyper_parameters.sparse_feature_number", None, "train.model")
- self.continuous_range_ = range(1, 14)
- self.categorical_range_ = range(14, 40)
-
- def generate_sample(self, line):
- """
- Read the data line by line and process it as a dictionary
- """
-
- def reader():
- """
- This function needs to be implemented by the user, based on data format
- """
- features = line.rstrip('\n').split('\t')
-
- dense_feature = []
- sparse_feature = []
- for idx in self.continuous_range_:
- if features[idx] == "":
- dense_feature.append(0.0)
- else:
- dense_feature.append(
- (float(features[idx]) - self.cont_min_[idx - 1]) /
- self.cont_diff_[idx - 1])
-
- for idx in self.categorical_range_:
- sparse_feature.append(
- [hash(str(idx) + features[idx]) % self.hash_dim_])
- label = [int(features[0])]
- feature_name = ["D"]
- for idx in self.categorical_range_:
- feature_name.append("S" + str(idx - 13))
- feature_name.append("label")
- yield zip(feature_name, [dense_feature] + sparse_feature + [label])
-
- return reader
diff --git a/models/rank/dcn/config.yaml b/models/rank/dcn/config.yaml
index d84adb748ae460512a5c31fa898adfda1a88da56..390b460a84d9e212867d372c6fd542c0f1f2b478 100755
--- a/models/rank/dcn/config.yaml
+++ b/models/rank/dcn/config.yaml
@@ -12,42 +12,66 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
-
- epochs: 10
- workspace: "fleetrec.models.rank.dcn"
-
- reader:
- batch_size: 2
- class: "{workspace}/criteo_reader.py"
- train_data_path: "{workspace}/data/train"
- feat_dict_name: "{workspace}/data/vocab"
-
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- cross_num: 2
- dnn_hidden_units: [128, 128]
- l2_reg_cross: 0.00005
- dnn_use_bn: False
- clip_by_norm: 100.0
- cat_feat_num: "{workspace}/data/cat_feature_num.txt"
- is_sparse: False
- is_test: False
- num_field: 39
- learning_rate: 0.0001
- act: "relu"
- optimizer: adam
-
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+
+# global settings
+debug: false
+workspace: "paddlerec.models.rank.dcn"
+
+dataset:
+ - name: train_sample
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "label C1 C2 C3 C4 C5 C6 C7 C8 C9 C10 C11 C12 C13 C14 C15 C16 C17 C18 C19 C20 C21 C22 C23 C24 C25 C26"
+ dense_slots: "I1:1 I2:1 I3:1 I4:1 I5:1 I6:1 I7:1 I8:1 I9:1 I10:1 I11:1 I12:1 I13:1"
+ - name: infer_sample
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/infer"
+ sparse_slots: "label C1 C2 C3 C4 C5 C6 C7 C8 C9 C10 C11 C12 C13 C14 C15 C16 C17 C18 C19 C20 C21 C22 C23 C24 C25 C26"
+ dense_slots: "I1:1 I2:1 I3:1 I4:1 I5:1 I6:1 I7:1 I8:1 I9:1 I10:1 I11:1 I12:1 I13:1"
+
+hyper_parameters:
+ optimizer:
+ class: Adam
+ learning_rate: 0.0001
+    # user-defined settings
+ cross_num: 2
+ dnn_hidden_units: [128, 128]
+ l2_reg_cross: 0.00005
+ dnn_use_bn: False
+ clip_by_norm: 100.0
+ cat_feat_num: "{workspace}/data/sample_data/cat_feature_num.txt"
+ is_sparse: False
+
+
+mode: train_runner
+# if infer, change mode to "infer_runner" and change phase to "infer_phase"
+
+runner:
+ - name: train_runner
+ trainer_class: single_train
+ epochs: 1
+ device: cpu
+ init_model_path: ""
+ save_checkpoint_interval: 1
+ save_inference_interval: 1
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 1
+ - name: infer_runner
+ trainer_class: single_infer
+ epochs: 1
+ device: cpu
+ init_model_path: "increment/0"
+ print_interval: 1
+
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+ dataset_name: train_sample
+ thread_num: 1
+#- name: infer_phase
+# model: "{workspace}/model.py"
+# dataset_name: infer_sample
+# thread_num: 1
diff --git a/models/rank/dcn/data/download.py b/models/rank/dcn/data/download.py
index b862988aef1802d8bfeb2554b7dce812e941f9e9..4203a3868a577757930ae848736c34bb4da376c7 100755
--- a/models/rank/dcn/data/download.py
+++ b/models/rank/dcn/data/download.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import os
import sys
import io
@@ -6,7 +20,7 @@ LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
TOOLS_PATH = os.path.join(LOCAL_PATH, "..", "..", "tools")
sys.path.append(TOOLS_PATH)
-from fleetrec.tools.tools import download_file_and_uncompress
+from paddlerec.tools.tools import download_file_and_uncompress
if __name__ == '__main__':
trainfile = 'train.txt'
diff --git a/models/rank/dcn/criteo_reader.py b/models/rank/dcn/data/get_slot_data.py
similarity index 76%
rename from models/rank/dcn/criteo_reader.py
rename to models/rank/dcn/data/get_slot_data.py
index 4b81e1fb66b6fac2ead7583ad93451e4822077d2..96d4448214d6a87092495326646a279657079f45 100755
--- a/models/rank/dcn/criteo_reader.py
+++ b/models/rank/dcn/data/get_slot_data.py
@@ -11,20 +11,32 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-from __future__ import print_function
import math
import sys
-
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+import yaml
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
try:
import cPickle as pickle
except ImportError:
import pickle
from collections import Counter
import os
+import paddle.fluid.incubate.data_generator as dg
+
+
-class TrainReader(Reader):
+class TrainReader(dg.MultiSlotDataGenerator):
+    def __init__(self, config):
+        dg.MultiSlotDataGenerator.__init__(self)
+
+        if os.path.isfile(config):
+            with open(config, 'r') as rb:
+                _config = yaml.load(rb.read(), Loader=yaml.FullLoader)
+        else:
+            raise ValueError("reader config only support yaml")
def init(self):
self.cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
self.cont_max_ = [
@@ -45,15 +57,15 @@ class TrainReader(Reader):
self.label_feat_names = target + dense_feat_names + sparse_feat_names
self.cat_feat_idx_dict_list = [{} for _ in range(26)]
-
+
# TODO: set vocabulary dictionary
- vocab_dir = envs.get_global_env("feat_dict_name", None, "train.reader")
+ vocab_dir = "./vocab/"
for i in range(26):
lookup_idx = 1 # remain 0 for default value
for line in open(
os.path.join(vocab_dir, 'C' + str(i + 1) + '.txt')):
self.cat_feat_idx_dict_list[i][line.strip()] = lookup_idx
- lookup_idx += 1
+ lookup_idx += 1
def _process_line(self, line):
features = line.rstrip('\n').split('\t')
@@ -78,13 +90,26 @@ class TrainReader(Reader):
idx - 14][features[idx]])
label_feat_list[0].append(int(features[0]))
return label_feat_list
-
+
def generate_sample(self, line):
"""
Read the data line by line and process it as a dictionary
"""
+
def data_iter():
label_feat_list = self._process_line(line)
- yield list(zip(self.label_feat_names, label_feat_list))
+ s = ""
+ for i in list(zip(self.label_feat_names, label_feat_list)):
+ k = i[0]
+ v = i[1]
+ for j in v:
+ s += " " + k + ":" + str(j)
+            print(s.strip())
+ yield None
+
-        return data_iter
\ No newline at end of file
+        return data_iter
+
+
+reader = TrainReader("../config.yaml")
+reader.init()
+reader.run_from_stdin()
diff --git a/models/rank/dcn/data/preprocess.py b/models/rank/dcn/data/preprocess.py
index b356607729eedd73854a77449ffda3cc3bb8050f..9a89df10ef42dcfa09faad66f409b21439f340a8 100755
--- a/models/rank/dcn/data/preprocess.py
+++ b/models/rank/dcn/data/preprocess.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
from __future__ import print_function, absolute_import, division
import os
diff --git a/models/rank/dcn/data/run.sh b/models/rank/dcn/data/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..32d653dc4113514740a9ff6de9ec1902aea4eeb1
--- /dev/null
+++ b/models/rank/dcn/data/run.sh
@@ -0,0 +1,14 @@
+python download.py
+python preprocess.py
+
+mkdir slot_train
+for i in `ls ./train`
+do
+ cat train/$i | python get_slot_data.py > slot_train/$i
+done
+
+mkdir slot_test_valid
+for i in `ls ./test_valid`
+do
+ cat test_valid/$i | python get_slot_data.py > slot_test_valid/$i
+done
diff --git a/models/rank/dcn/data/sample_data/cat_feature_num.txt b/models/rank/dcn/data/sample_data/cat_feature_num.txt
new file mode 100644
index 0000000000000000000000000000000000000000..0c75fc39b5890c323d1f94d6cc5c60fa151ef2e0
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/cat_feature_num.txt
@@ -0,0 +1,26 @@
+C1 139
+C2 422
+C3 1548
+C4 1965
+C5 54
+C6 10
+C7 3213
+C8 81
+C9 3
+C10 2402
+C11 2246
+C12 1583
+C13 1911
+C14 24
+C15 2011
+C16 1731
+C17 9
+C18 1197
+C19 584
+C20 3
+C21 1652
+C22 8
+C23 14
+C24 1770
+C25 40
+C26 1349
diff --git a/models/rank/dcn/data/sample_data/infer/infer_sample_data b/models/rank/dcn/data/sample_data/infer/infer_sample_data
new file mode 100644
index 0000000000000000000000000000000000000000..4aa6d249feecf542a5ce947f510bded60aa6414f
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/infer/infer_sample_data
@@ -0,0 +1,10 @@
+label:0 I1:0.69314718056 I2:1.60943791243 I3:1.79175946923 I4:0.0 I5:7.23201033166 I6:1.60943791243 I7:2.77258872224 I8:1.09861228867 I9:5.20400668708 I10:0.69314718056 I11:1.09861228867 I12:0 I13:1.09861228867 C1:95 C2:398 C3:0 C4:0 C5:53 C6:1 C7:73 C8:71 C9:3 C10:1974 C11:832 C12:0 C13:875 C14:8 C15:1764 C16:0 C17:5 C18:390 C19:226 C20:1 C21:0 C22:0 C23:8 C24:1759 C25:1 C26:862
+label:0 I1:1.09861228867 I2:1.38629436112 I3:3.80666248977 I4:0.69314718056 I5:4.63472898823 I6:2.19722457734 I7:1.09861228867 I8:1.09861228867 I9:1.60943791243 I10:0.69314718056 I11:0.69314718056 I12:0 I13:1.60943791243 C1:95 C2:200 C3:1184 C4:1929 C5:53 C6:4 C7:1477 C8:2 C9:3 C10:1283 C11:1567 C12:1048 C13:271 C14:6 C15:1551 C16:899 C17:1 C18:162 C19:226 C20:2 C21:575 C22:0 C23:8 C24:1615 C25:1 C26:659
+label:0 I1:1.09861228867 I2:1.38629436112 I3:0.69314718056 I4:2.7080502011 I5:6.64378973315 I6:4.49980967033 I7:1.60943791243 I8:1.09861228867 I9:5.50533153593 I10:0.69314718056 I11:1.38629436112 I12:1.38629436112 I13:3.82864139649 C1:123 C2:378 C3:991 C4:197 C5:53 C6:1 C7:689 C8:2 C9:3 C10:245 C11:623 C12:1482 C13:887 C14:21 C15:106 C16:720 C17:3 C18:768 C19:0 C20:0 C21:1010 C22:1 C23:8 C24:720 C25:0 C26:0
+label:0 I1:0 I2:6.79905586206 I3:0 I4:0 I5:8.38776764398 I6:0 I7:0.0 I8:0.0 I9:0.0 I10:0 I11:0.0 I12:0 I13:0 C1:95 C2:227 C3:0 C4:219 C5:53 C6:4 C7:3174 C8:2 C9:3 C10:569 C11:1963 C12:0 C13:1150 C14:21 C15:1656 C16:0 C17:6 C18:584 C19:0 C20:0 C21:0 C22:0 C23:8 C24:954 C25:0 C26:0
+label:0 I1:1.38629436112 I2:1.09861228867 I3:0 I4:0.0 I5:1.09861228867 I6:0.0 I7:1.38629436112 I8:0.0 I9:0.0 I10:0.69314718056 I11:0.69314718056 I12:0 I13:0.0 C1:121 C2:147 C3:0 C4:1356 C5:53 C6:7 C7:2120 C8:2 C9:3 C10:703 C11:1678 C12:1210 C13:1455 C14:8 C15:538 C16:1276 C17:6 C18:346 C19:0 C20:0 C21:944 C22:0 C23:10 C24:355 C25:0 C26:0
+label:0 I1:0 I2:1.09861228867 I3:0 I4:0 I5:9.45915167004 I6:0 I7:0.0 I8:0.0 I9:1.94591014906 I10:0 I11:0.0 I12:0 I13:0 C1:14 C2:75 C3:993 C4:480 C5:50 C6:6 C7:1188 C8:2 C9:3 C10:245 C11:1037 C12:1365 C13:1421 C14:21 C15:786 C16:5 C17:2 C18:555 C19:0 C20:0 C21:1408 C22:6 C23:7 C24:753 C25:0 C26:0
+label:0 I1:0 I2:1.60943791243 I3:1.09861228867 I4:0 I5:8.06117135969 I6:0 I7:0.0 I8:0.69314718056 I9:1.09861228867 I10:0 I11:0.0 I12:0 I13:0 C1:139 C2:343 C3:553 C4:828 C5:50 C6:4 C7:0 C8:2 C9:3 C10:245 C11:2081 C12:260 C13:455 C14:21 C15:122 C16:1159 C17:2 C18:612 C19:0 C20:0 C21:1137 C22:0 C23:1 C24:1583 C25:0 C26:0
+label:1 I1:0.69314718056 I2:2.07944154168 I3:1.09861228867 I4:0.0 I5:0.0 I6:0.0 I7:0.69314718056 I8:0.0 I9:0.0 I10:0.69314718056 I11:0.69314718056 I12:0 I13:0.0 C1:95 C2:227 C3:0 C4:1567 C5:21 C6:7 C7:2496 C8:71 C9:3 C10:1913 C11:2212 C12:0 C13:673 C14:21 C15:1656 C16:0 C17:5 C18:584 C19:0 C20:0 C21:0 C22:0 C23:10 C24:954 C25:0 C26:0
+label:0 I1:0 I2:3.87120101091 I3:1.60943791243 I4:2.19722457734 I5:9.85277303799 I6:5.52146091786 I7:3.36729582999 I8:3.4657359028 I9:4.9558270576 I10:0 I11:0.69314718056 I12:0 I13:2.19722457734 C1:14 C2:14 C3:454 C4:197 C5:53 C6:1 C7:1386 C8:2 C9:3 C10:0 C11:1979 C12:205 C13:214 C14:6 C15:1837 C16:638 C17:5 C18:6 C19:0 C20:0 C21:70 C22:0 C23:10 C24:720 C25:0 C26:0
+label:0 I1:0 I2:3.66356164613 I3:0 I4:0.69314718056 I5:10.4263800775 I6:3.09104245336 I7:0.69314718056 I8:1.09861228867 I9:1.38629436112 I10:0 I11:0.69314718056 I12:0 I13:0.69314718056 C1:14 C2:179 C3:120 C4:746 C5:53 C6:0 C7:1312 C8:2 C9:3 C10:1337 C11:1963 C12:905 C13:1150 C14:21 C15:1820 C16:328 C17:9 C18:77 C19:0 C20:0 C21:311 C22:0 C23:10 C24:89 C25:0 C26:0
diff --git a/models/rank/dcn/data/sample_data/train/sample_train.txt b/models/rank/dcn/data/sample_data/train/sample_train.txt
new file mode 100644
index 0000000000000000000000000000000000000000..4aa6d249feecf542a5ce947f510bded60aa6414f
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/train/sample_train.txt
@@ -0,0 +1,10 @@
+label:0 I1:0.69314718056 I2:1.60943791243 I3:1.79175946923 I4:0.0 I5:7.23201033166 I6:1.60943791243 I7:2.77258872224 I8:1.09861228867 I9:5.20400668708 I10:0.69314718056 I11:1.09861228867 I12:0 I13:1.09861228867 C1:95 C2:398 C3:0 C4:0 C5:53 C6:1 C7:73 C8:71 C9:3 C10:1974 C11:832 C12:0 C13:875 C14:8 C15:1764 C16:0 C17:5 C18:390 C19:226 C20:1 C21:0 C22:0 C23:8 C24:1759 C25:1 C26:862
+label:0 I1:1.09861228867 I2:1.38629436112 I3:3.80666248977 I4:0.69314718056 I5:4.63472898823 I6:2.19722457734 I7:1.09861228867 I8:1.09861228867 I9:1.60943791243 I10:0.69314718056 I11:0.69314718056 I12:0 I13:1.60943791243 C1:95 C2:200 C3:1184 C4:1929 C5:53 C6:4 C7:1477 C8:2 C9:3 C10:1283 C11:1567 C12:1048 C13:271 C14:6 C15:1551 C16:899 C17:1 C18:162 C19:226 C20:2 C21:575 C22:0 C23:8 C24:1615 C25:1 C26:659
+label:0 I1:1.09861228867 I2:1.38629436112 I3:0.69314718056 I4:2.7080502011 I5:6.64378973315 I6:4.49980967033 I7:1.60943791243 I8:1.09861228867 I9:5.50533153593 I10:0.69314718056 I11:1.38629436112 I12:1.38629436112 I13:3.82864139649 C1:123 C2:378 C3:991 C4:197 C5:53 C6:1 C7:689 C8:2 C9:3 C10:245 C11:623 C12:1482 C13:887 C14:21 C15:106 C16:720 C17:3 C18:768 C19:0 C20:0 C21:1010 C22:1 C23:8 C24:720 C25:0 C26:0
+label:0 I1:0 I2:6.79905586206 I3:0 I4:0 I5:8.38776764398 I6:0 I7:0.0 I8:0.0 I9:0.0 I10:0 I11:0.0 I12:0 I13:0 C1:95 C2:227 C3:0 C4:219 C5:53 C6:4 C7:3174 C8:2 C9:3 C10:569 C11:1963 C12:0 C13:1150 C14:21 C15:1656 C16:0 C17:6 C18:584 C19:0 C20:0 C21:0 C22:0 C23:8 C24:954 C25:0 C26:0
+label:0 I1:1.38629436112 I2:1.09861228867 I3:0 I4:0.0 I5:1.09861228867 I6:0.0 I7:1.38629436112 I8:0.0 I9:0.0 I10:0.69314718056 I11:0.69314718056 I12:0 I13:0.0 C1:121 C2:147 C3:0 C4:1356 C5:53 C6:7 C7:2120 C8:2 C9:3 C10:703 C11:1678 C12:1210 C13:1455 C14:8 C15:538 C16:1276 C17:6 C18:346 C19:0 C20:0 C21:944 C22:0 C23:10 C24:355 C25:0 C26:0
+label:0 I1:0 I2:1.09861228867 I3:0 I4:0 I5:9.45915167004 I6:0 I7:0.0 I8:0.0 I9:1.94591014906 I10:0 I11:0.0 I12:0 I13:0 C1:14 C2:75 C3:993 C4:480 C5:50 C6:6 C7:1188 C8:2 C9:3 C10:245 C11:1037 C12:1365 C13:1421 C14:21 C15:786 C16:5 C17:2 C18:555 C19:0 C20:0 C21:1408 C22:6 C23:7 C24:753 C25:0 C26:0
+label:0 I1:0 I2:1.60943791243 I3:1.09861228867 I4:0 I5:8.06117135969 I6:0 I7:0.0 I8:0.69314718056 I9:1.09861228867 I10:0 I11:0.0 I12:0 I13:0 C1:139 C2:343 C3:553 C4:828 C5:50 C6:4 C7:0 C8:2 C9:3 C10:245 C11:2081 C12:260 C13:455 C14:21 C15:122 C16:1159 C17:2 C18:612 C19:0 C20:0 C21:1137 C22:0 C23:1 C24:1583 C25:0 C26:0
+label:1 I1:0.69314718056 I2:2.07944154168 I3:1.09861228867 I4:0.0 I5:0.0 I6:0.0 I7:0.69314718056 I8:0.0 I9:0.0 I10:0.69314718056 I11:0.69314718056 I12:0 I13:0.0 C1:95 C2:227 C3:0 C4:1567 C5:21 C6:7 C7:2496 C8:71 C9:3 C10:1913 C11:2212 C12:0 C13:673 C14:21 C15:1656 C16:0 C17:5 C18:584 C19:0 C20:0 C21:0 C22:0 C23:10 C24:954 C25:0 C26:0
+label:0 I1:0 I2:3.87120101091 I3:1.60943791243 I4:2.19722457734 I5:9.85277303799 I6:5.52146091786 I7:3.36729582999 I8:3.4657359028 I9:4.9558270576 I10:0 I11:0.69314718056 I12:0 I13:2.19722457734 C1:14 C2:14 C3:454 C4:197 C5:53 C6:1 C7:1386 C8:2 C9:3 C10:0 C11:1979 C12:205 C13:214 C14:6 C15:1837 C16:638 C17:5 C18:6 C19:0 C20:0 C21:70 C22:0 C23:10 C24:720 C25:0 C26:0
+label:0 I1:0 I2:3.66356164613 I3:0 I4:0.69314718056 I5:10.4263800775 I6:3.09104245336 I7:0.69314718056 I8:1.09861228867 I9:1.38629436112 I10:0 I11:0.69314718056 I12:0 I13:0.69314718056 C1:14 C2:179 C3:120 C4:746 C5:53 C6:0 C7:1312 C8:2 C9:3 C10:1337 C11:1963 C12:905 C13:1150 C14:21 C15:1820 C16:328 C17:9 C18:77 C19:0 C20:0 C21:311 C22:0 C23:10 C24:89 C25:0 C26:0
diff --git a/models/rank/dcn/data/sample_data/vocab/C1.txt b/models/rank/dcn/data/sample_data/vocab/C1.txt
new file mode 100644
index 0000000000000000000000000000000000000000..4597891767100dcd85ac15caa833d119ba49bb05
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C1.txt
@@ -0,0 +1,139 @@
+f434fac1
+e6051457
+7e5c2ff4
+abca0bad
+3b509222
+340c148e
+48f8c5b9
+3c9d8785
+585b6ccc
+561bf9d4
+b474c2c2
+c1730738
+92fb1d87
+05db9164
+c35dc981
+ae82ea21
+824be517
+16a99cfb
+e8ef605b
+88abfaf6
+7ceef477
+17f69355
+1464facd
+f0a33555
+80e4d755
+3ec5d916
+f5c9f18c
+87552397
+5ebc3192
+426610d2
+eb6dcae0
+651f6a2d
+7f9f4eb6
+bd4b6d14
+3560b08b
+8068dc7e
+9660b97b
+9eb7531c
+2d4ea12b
+87773c45
+5a9ed9b0
+f473b8dc
+b19f768d
+70d60005
+89889f05
+c71ae391
+c6dce90e
+64e77ae7
+0e78bd46
+75ac2fe6
+42a16b9a
+19c5f803
+cbffbdad
+bfb430af
+127f4a6b
+6ca3af46
+2b3bff44
+8a033483
+45cb84c9
+554adfdb
+46300ee3
+a14cf13a
+d0d66375
+da4eff0f
+4265881a
+9684fd4d
+7382c353
+50d4de26
+60c68845
+e3493c7c
+09ca0b81
+3b65d647
+98237733
+fc9c62bb
+41edac3d
+dbfc8345
+39af2607
+581e410c
+55845e1c
+28e55712
+6bcf7a5b
+66651cdf
+2b92c0d2
+24eda356
+dbe63c2b
+9a89b36c
+489d0f96
+dac91c28
+dc5ebbd9
+1a5f926e
+885aeecb
+f1548e14
+6062d843
+c2a5852e
+68fd1e64
+be589b51
+b455c6d7
+cd3695ae
+291b7ba2
+2998a458
+5e53cc38
+dbe15b41
+ff5f3ab9
+49f631b8
+3b1bc654
+36a5b3ff
+fbc55dae
+467085ca
+06584483
+3f6e3c8b
+3cc2325b
+ff004ae3
+eb6ac63c
+0a16e1d4
+34f74dfd
+decf6fa6
+18988050
+c512b859
+a86f8721
+5bfa8ab5
+8cf07265
+dd14f377
+287e684f
+49c4b7c4
+2ebc17d3
+8c6ba407
+fb174e6b
+4615a3b6
+394fc830
+9e9d28f5
+241546e0
+4a4e85c4
+26428e51
+940683b1
+65aada8c
+ba454362
+d4b08d58
+49807078
+439a44a4
diff --git a/models/rank/dcn/data/sample_data/vocab/C10.txt b/models/rank/dcn/data/sample_data/vocab/C10.txt
new file mode 100644
index 0000000000000000000000000000000000000000..023078ebaaafbec5df4af3529a09e8b3fb8848f0
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C10.txt
@@ -0,0 +1,2402 @@
+e44ef203
+210e5b53
+21fa915a
+ff5a1549
+83a3b517
+e2549837
+f359604e
+814f97f4
+82932781
+4effc25c
+7f12aa11
+ceea4f75
+89794d42
+3de6dc67
+8dde7540
+fed0f64c
+2f510755
+e5330e23
+ddf5d47c
+16003d82
+4c783419
+a883869a
+6feef489
+81e63d1a
+9268adb2
+fba4e66d
+63c7ada0
+5e4f1c70
+b8fa4771
+d0ed569a
+e8957687
+c5fe5cb9
+b40a6cd7
+ef4fc845
+4edb5b9e
+8a502a35
+b0488b24
+6d7bbe0b
+ed8257bc
+e2abbf7d
+7c907dc3
+27b57225
+4f9a11fe
+4debc04c
+27f4bf82
+da337cc8
+5d61d71c
+70962768
+6bde15a2
+eaca4085
+7ffe4676
+26ed657b
+7934ffe6
+b1fe0e2e
+63081b29
+a8ebcfec
+fb2b96f1
+cfde4d9c
+1a5ba63e
+a5f77a53
+e1af44fa
+acce978c
+7e1aa72a
+ff05c3fb
+b8d25928
+ac9e0776
+a7b2c086
+74273dde
+7d06a816
+19c1d716
+d0ff5b05
+c7453e87
+5139ddc4
+733b729f
+4c1928d3
+a60210d9
+637a2733
+3df0c12e
+972359d0
+d40d5d74
+e6003298
+8601d04a
+edf1e23b
+7e2377e8
+52a0b310
+61efb79c
+6287b8a6
+fab83763
+467e9b7a
+391969e7
+8df97100
+0ada1061
+50af9b31
+c18c8582
+9b56218a
+3cb12840
+7d5e46eb
+d091f686
+888c57de
+bc6d6cf6
+32bd723b
+328057da
+6c9fd7d6
+267caf03
+bb851295
+72ce33ff
+27fdd7aa
+8c6b20b6
+dad22fce
+caa87280
+76c29aa8
+222ae0ea
+8e54038a
+9044f0e3
+d1ebaddf
+c9f77507
+c36f333e
+98649ae7
+bbc5072e
+6ab3d8b6
+823a2b16
+d38d058f
+c4bffb57
+230aa7a0
+81875c3f
+8f796047
+9dd065bd
+cce8f055
+5f49e872
+edd99521
+aed682d5
+402c08ae
+2e6e9bd1
+4c801c81
+cbd3f8f8
+a1680317
+79041558
+7e535333
+4fd5d54e
+ec4e2ecb
+3c149ae9
+32390b96
+a4b832ae
+d430c780
+36f76e5b
+35acd07e
+d077d4d4
+b78f3a55
+568b2298
+8661d897
+bec43bb3
+6a5a3b69
+b47dfc84
+bd256365
+687c8b31
+991fc4a4
+3dc9bc0f
+f9065d00
+b78b25b1
+955bb819
+bc5704d7
+d2d421de
+f050c183
+7f091e3d
+a866ff7f
+1b7b3a1c
+d9b71390
+b82dcdf9
+70726fe6
+f72dbcb1
+85a19788
+f24b3308
+7360544e
+b0bfa96e
+61283720
+fb1d3661
+97c795d2
+b883655e
+4b8a7639
+9743bf1f
+d3f2758d
+3ea987b7
+33677623
+0cbab662
+d6ea61a9
+59b0bdac
+087fd87e
+89ff09ee
+48294e1c
+7c0a503a
+4d9c06d9
+85f10390
+20d01e61
+8ac80ace
+328c6dd0
+8a48eec7
+d289a4e8
+65089c81
+d72f452e
+7f79890b
+8a05e080
+1f8282c4
+b0c25211
+d471ce0d
+b02a09a4
+2860ede1
+4a817a2d
+0529dce1
+eccd1549
+36fe2e15
+ceb10289
+5cad6330
+63ddc9f1
+3658ac68
+4de83a51
+f5ec1909
+ad75e5d3
+0042ccac
+17f2ac39
+1800d606
+7760f7bf
+ca2f139a
+534fc986
+d954cc10
+8f4737bc
+6100a91c
+0bd12a8c
+e1a2ef0f
+04dc09a2
+83ff688a
+8d45c464
+60a54006
+d2487d17
+98bd7a24
+d78294fb
+bc743bbf
+e6556a21
+b0ee15ff
+5db9389b
+fa907ddd
+e4fa8060
+f9f603ac
+621d488a
+05c0f465
+3b08e48b
+dea56e7c
+bde51b15
+9e6884ca
+b48bb0bd
+f8ae506d
+26204092
+bfedff8f
+2283dcdb
+b681243c
+63f14ce6
+94e68c1d
+ff513e20
+c86fc87f
+fc6ac8b3
+83e211e9
+90210e5a
+1725b2db
+6a5d2b37
+0affc0bc
+41f909be
+8886dc9c
+074e7523
+ddebe327
+c2ebde4f
+4dbf86ba
+d2182a9f
+0ddec944
+fffdffc3
+30e5979b
+4549ea1f
+837e840e
+0af798a0
+18908d58
+ecd4bc60
+31990058
+f36ff6fe
+041560b6
+2c0fb2f4
+7aa4e186
+34dd9626
+4681f55a
+8d3ab9e3
+c61d3ee0
+acccca1c
+21e99844
+a9f8d03b
+0a347d7d
+4ae9db84
+b7d3df60
+78243957
+2436c534
+9c48e3c3
+98fea455
+0611d5bb
+ce338497
+091fd245
+f918493f
+5c73d885
+21a95e8f
+80254878
+48beebf8
+2d3d7f00
+993d4041
+78a42779
+3139c7ed
+921af1f0
+1791bdb6
+5c0801f4
+0fbac2a4
+45a4f840
+5a2dbfb2
+85635bd0
+cd1a2f1a
+383878dc
+c484657f
+d2cb8f4b
+5162b19c
+f0bda286
+6f506765
+e14a3053
+bd69f3a2
+6705fa4e
+e643be1a
+6b8edada
+7318b5eb
+9ce4c96c
+30819844
+61a3f9a3
+d1c011c4
+be511b47
+18acc97b
+a5270a71
+7cc27cd3
+23c4b9b1
+dd02e40b
+a9157b48
+9ca0fba4
+39dc9e08
+0fa587cd
+257c4f93
+90af6b7f
+c6577552
+8e00c8ce
+416adc26
+8b00bbbb
+afb89f8f
+7ab0132a
+57eaa63f
+dc6c5c76
+0f9a018e
+1bd7be48
+9ab449df
+5a76905d
+03bf6378
+1c56cea6
+878f4678
+0b6ffaef
+a5f22153
+7ef432eb
+420eaf2c
+ee2c9f64
+72c2d088
+148b68a6
+1b4e8e72
+18139a78
+78cab4c1
+25e9e422
+6de79a03
+0f1a2599
+146d694a
+58380efe
+27431e6a
+def5e035
+2ce2764d
+864f07fa
+2209cb46
+e184588a
+e7ba2569
+4876401c
+ecdbd897
+cc291550
+b59aee26
+a77382a0
+ed2e4bcb
+b50b81d2
+8e9c1bc2
+46a7deb3
+562c7c45
+fb37e544
+597a7b20
+d9dbe799
+3e278c1c
+36c5cf0a
+f4404e5d
+11e1309c
+9d39065e
+9eaa24a1
+9dc8b302
+55299251
+9ad5d5af
+ef8f7268
+897188be
+687176eb
+82df227a
+551654b4
+c5374647
+d367b3d6
+266543bd
+7ffdf000
+458166fb
+e41fb5a0
+bd80ecee
+c82e45aa
+936b781e
+b84009e5
+5f50c86b
+e63ec6b1
+9b98a6fb
+129b3e93
+e49e3c36
+c6465bed
+84579e97
+a4205083
+2a1327e7
+811a2873
+408fa381
+966e9450
+461a7e93
+162688a8
+d4a82fb9
+a3dec2a4
+8b1d2aea
+cf7470a6
+8f48ce11
+5ffc7b25
+72ea02d9
+f7529429
+c514dac9
+b4a4028c
+7d41d504
+f4ea8681
+1b233b32
+adfc656f
+0466803a
+d18fd499
+593c3061
+71243c9d
+66d456a0
+7670fab5
+9a2bbc4f
+e851ff7b
+40fd233d
+0eac114d
+2aa66c47
+a6a8aa2f
+625caf4b
+2fbe07b2
+18c3f9c0
+3e05a4f2
+b6900243
+c244b9d2
+60cfb358
+5a148c42
+56c80038
+bc45f217
+f5073ae4
+0c70a731
+082cafb0
+4b4670a5
+594fca00
+ea0ac2a3
+2f3d84f4
+4e592255
+251cb79f
+2c443d6f
+f828214f
+db1b3edf
+37db0a21
+91f11edd
+077f4295
+d977db1c
+0af77012
+ae8a8ee0
+4d7d371a
+1a130480
+7d83f681
+d2886136
+0a2f1c99
+b57b827f
+aa91245c
+0e9ead52
+7effe9ee
+4b50516d
+65d27e99
+d6b11eb9
+aec22587
+05906beb
+d7026747
+34ccc264
+5b444efd
+38405c01
+6c3d14a4
+16a81a6c
+a863ec71
+c56fc6bb
+2d94a531
+2ed12647
+c4261527
+1f5d8474
+ab9e9acf
+e4c9f6ec
+c639d0e6
+1f6803dc
+1acc9833
+80f90445
+5e607e7a
+9ad4a825
+5ba575e7
+d35a37c2
+d5c9288e
+983552b8
+f710483a
+9491602e
+faaa1061
+70723a0a
+7a2f5e17
+a5b97a09
+ccb434c4
+f6837a23
+95bade81
+901bad3f
+6c370c54
+a8821a26
+e05e1a3b
+b5f7cd48
+7c421cbc
+64145819
+23beb4b7
+5ae197ee
+418c9c81
+305a0646
+59a08730
+86b46b2e
+ab9456b4
+0a164266
+64bbfb54
+54ae60c8
+b118f931
+493b74f2
+283a1656
+9f9346bd
+a8afcd13
+2decae57
+3325ba1f
+6dcd997e
+1f765f17
+498bf426
+01159052
+2324fd10
+8d34ddcd
+f322117a
+22a99f9d
+782fd71f
+efea433b
+f72caf8d
+4d0c7778
+7db4270f
+361eec86
+d3587737
+62f096f9
+d0766390
+51d6cda8
+27bab03d
+f6f942d1
+7f0f0e67
+960e3dcd
+317baeff
+205b0e16
+06012307
+85634890
+8a1a934c
+80463bc7
+1f20471e
+7baf2906
+845eb783
+dbfa89a7
+4f384fbf
+dc9f749b
+aa864043
+62e76cf6
+75542289
+6f55d865
+e7e608ce
+d34aff56
+4072f40f
+7950d1e1
+f83b2b33
+b8a94306
+c10eaf60
+ceff3ecd
+ffa150e0
+e7de00de
+11b0b64c
+ee88845f
+716e68e2
+5b0cd9c0
+901ad2a4
+9c0d33a0
+a30a7b4a
+ae66e0b0
+03ed27e7
+c364190c
+84fbca29
+293a5f98
+3db7e40d
+2f0da49f
+3f25d8e4
+24e4550a
+ba3f97f1
+2bb4d590
+5014682d
+27dc434a
+9e8fae15
+0aefb005
+3fcc3170
+233e3a0c
+5f3f6a5c
+3598de88
+d57d77ae
+f856eab2
+a5ad4326
+57cccf2c
+1449a3bf
+f77a4864
+87dcf8fc
+1397fdf7
+b99cb647
+d4e84cf7
+42878a50
+ee2a256a
+b594c154
+e9976153
+b3d657b8
+755e266a
+cea90bbb
+327c5dc6
+f273048c
+390c8ae2
+08fda2ad
+d4e7f371
+d273d652
+973598ed
+60ae1daf
+3dc1e140
+2155dcbf
+25b724cf
+0b16773a
+42adcf15
+b173a655
+379c488f
+466a312b
+b16556f1
+9e36b8d8
+cb3963cc
+e4058b6b
+808e1dff
+53a60f53
+c102eaa2
+38b63f8c
+d108fc83
+9ba3a252
+73258318
+a1a6e0a6
+3757d3fa
+43567237
+aeace694
+213889cd
+584ac46d
+36bdedca
+3b5c4057
+53549019
+25103694
+3ff10fb2
+b72cdc0a
+dcbc7c2b
+f6e8fa66
+ffc5e9fa
+885bbe5f
+7c621383
+a0005c5e
+ae769f5e
+a4390d6e
+c70e334d
+2ed68727
+b2ebcf4d
+50e2ba02
+741be51a
+5282c137
+302b11c0
+7af74a77
+34d2db5f
+f8a0d88b
+8b72213c
+e5ced7ab
+61cc0eca
+44ee1822
+58da7fdf
+be289b53
+1ea58694
+e1537bee
+38b51f5e
+5407c73b
+b916cb08
+ba06e67a
+bf3b6158
+7b4fbefc
+73c4af58
+f71087b9
+179c5e6e
+42459662
+9b1a8d44
+0022e234
+d33e0b1d
+16c80a87
+75d433aa
+441dd290
+f2fffa3c
+6dac4051
+cb8bd380
+edd13d14
+dfb81e79
+6a143158
+ae40a03b
+e5043974
+12bb8262
+c5abe1ae
+ced91602
+511b1945
+0d2f29be
+d199ede7
+1faebe63
+b22ede5f
+7fa80053
+643b1ea5
+c811e460
+6291463d
+c7009b63
+213fd432
+1238e21c
+9775be10
+8c444d53
+200e383b
+995f172b
+98728ad1
+5fd7dd92
+acc16d4c
+abd614f9
+b1e27fa4
+6b1e4999
+9950f018
+b50bddc2
+850b7500
+711ec2bc
+e6f40065
+420cf710
+671ae88f
+4f6357b0
+157764b8
+da272362
+508cc9f1
+b14f7b5c
+cf61ba90
+a5537d82
+10d616e4
+8d78680b
+92b37564
+0adf0424
+19fe7c0c
+31be9cd5
+4c9179ab
+3cba23a4
+087159c3
+9dac6e27
+7077ce03
+12e12a31
+a2cb676b
+0a9fb6a6
+376bbe93
+73218f57
+e37fd0e9
+6fd10037
+9dab6db6
+18f4eb3d
+b22fc48e
+a9177962
+f7d3cc4c
+b1442b2a
+6884de4c
+69c989e7
+dd5238b4
+6f14652c
+bd10c7fe
+1a7ec8bc
+c39b8790
+eee4532b
+07cfebed
+933960cf
+f2bc30ec
+e245e5ab
+53ae19bf
+51208b21
+4e1c8c97
+c97556d9
+78eaca07
+eba89aac
+ac25feb9
+1d508893
+e8584d35
+ae090aed
+88f474d4
+865b29d9
+76361f0d
+8248dbe2
+b37ce137
+9b8e7680
+85128389
+80550809
+1e387ea9
+d5b0cca2
+f6e56ae2
+1722d4c8
+cd481139
+bbfc3aed
+27d15133
+913264d0
+28c6ef79
+eb8c349b
+cdb1e959
+7cd5620d
+5e9d4e57
+fcd76f02
+7d3579e2
+e5cbd87c
+ddad80e3
+324851f0
+2796f46e
+69483786
+01281f02
+59e5ee8d
+9bed549c
+cd1ec860
+69c07ae0
+1cf80d48
+1665e5d5
+5543532b
+b127c679
+ce191313
+29d1046a
+474773a7
+8d3bf189
+2a339e36
+8627508e
+4aead435
+2c4945ea
+7239ba1f
+5115abf9
+81f943ba
+f26b2389
+d89121c6
+b10da5d1
+98d5faa2
+eab8822a
+9d6b7db5
+66ff5fff
+a023b042
+d888e259
+76ab342e
+f531e651
+a267e6af
+eea530d7
+07ae0d24
+de6c0d05
+494c8471
+f3b83678
+e3629c5f
+3c7f8715
+7676d9c1
+90eb15ca
+23176e12
+8bf500a5
+1c620c21
+91080ab6
+5ba3608f
+2e8e8e87
+401e07cb
+23f75311
+d13ff55c
+23157da8
+c4223f4f
+01e100dd
+cbf93977
+19a77b1d
+6cb0e696
+4ba9b410
+a87a9e02
+20fa8c86
+bcc8b4c6
+1f9b2c55
+244d313e
+dc5f6020
+0c02dd76
+13007dd6
+c7fcae9a
+ec308632
+9367457d
+c1bba512
+4b80a904
+0b43e34c
+0526eedb
+ebcc4ac8
+fa5f7df0
+30040193
+bfd36fb3
+d40f8fc0
+e157f846
+5f5c3083
+a791cb47
+6ca469d2
+6bebf69d
+5fcee6b1
+a9184f62
+e7a2a2d0
+c03541ea
+4e2d1b78
+1e23bdea
+3bc8bb1a
+56d7acb9
+adf94434
+309fa0b9
+34b289dd
+39d3ae3f
+f717eda0
+a15782e2
+bc67eb65
+f0d90bb8
+98787e95
+8c8662e4
+d2461b80
+7623f223
+9c8738a1
+5592aa50
+0d538fca
+aba09c45
+fa642b71
+73404786
+1e8c6c4d
+49fb0a90
+076897bf
+d6a4738e
+751f5193
+cfa5dd13
+bac024c9
+a6cc5a0b
+87b2167c
+b58c39b4
+ad0a58c1
+01c73aa1
+4b8a28d0
+377df5e7
+f6540b40
+7688173f
+472ef943
+cd5fd374
+48a94b2e
+e80d9cf3
+0d0ca0e7
+a4fd5f7b
+7eb572b4
+e8b9a804
+5141558f
+74359727
+603ff749
+28986229
+000e2f4b
+d7cb1343
+303a8aaf
+911ee817
+073bd468
+ced51bd9
+b024d419
+ef8ef338
+ecb84f3b
+7e3f556f
+c31e5ea3
+61409cc8
+c1a29f8d
+98b03c11
+620fe6e4
+e034d02e
+6814706f
+15b65bf5
+e63b8ef6
+772d6920
+151f69cc
+d8a1c4f1
+bb5d4103
+fb999b75
+3513d986
+adf17e4d
+d3cf8a36
+a98ba113
+028ddb7e
+66581c5b
+fbed4a4e
+98e605cf
+e058d4a3
+86cff510
+6434b75e
+0741dc88
+a9271c40
+3150b962
+43d2248f
+2181f1a9
+d2ae7518
+c1abb2bf
+0706314e
+700e5b5d
+f9850f55
+79eb2fa0
+998b0039
+c18f37b1
+868eadfb
+04238c7f
+962f3844
+d4eae210
+6ad0eb30
+8287cc43
+c57cf046
+ce8ec97b
+ce8e716d
+9ed66789
+62936561
+ea6f67a2
+758bcead
+f90ae47f
+c387fa77
+230a3832
+e93ce9d6
+0ba8fed5
+36264492
+0f38c1ef
+513d7738
+3044b0a2
+317a8bea
+5ab952ab
+447c5fc0
+1891e2f3
+9a1250bd
+a4688402
+b9930e98
+80da5522
+886c48e9
+3d6b0efe
+98bd362e
+9e7cf32d
+0969795c
+60bbdab0
+8534f6ff
+a83a1d81
+119ed2ef
+7c3cada7
+a9966b7b
+42103299
+0461dac0
+84bbf9c6
+851e1a96
+92ea8417
+81cb5a77
+fbbf2c95
+18e09007
+2f5d18f0
+94c7e002
+27700f23
+616ed314
+f17eed1c
+86bf49b0
+2ba4033c
+3823da6e
+cf77f74f
+d6133462
+82bcd67f
+cfa407de
+9af3cdd1
+bd841179
+854d88cd
+bf957bfd
+faa8cda2
+d38099bf
+a0172017
+88bc1874
+8bdd3111
+15fa156b
+de668ebd
+080fa7b3
+8aef4905
+f991ddf2
+14514389
+cd34b466
+e7ef8a26
+a11476d8
+d9997676
+86439eb8
+29cfc193
+34ae4e8c
+f8beff89
+099b68bd
+49a6c4e8
+cd5ad306
+30c090d5
+c92976ed
+ac82a793
+e7a687d9
+4f1c6ae7
+68c27a50
+29453ef9
+4d6d8af8
+cf9c4b61
+2cc3e168
+d9dc2828
+d71fbd3d
+2e546b3f
+5df036eb
+1a69db9b
+c51a7043
+c9373a92
+ac473633
+366090e9
+8924e76a
+c024829a
+4cc45dda
+d32b23f6
+32887328
+b90dbc4e
+99bf439d
+8cf4ed5d
+d8881c14
+40c7ccc3
+4ea0d483
+882d0608
+89a21b8d
+ca70d133
+7ae50bfd
+bd30f2d4
+7db8154d
+63af0a0a
+9a2a80f7
+6223753a
+b399d729
+1ce1e29d
+65aac74e
+47e80ca7
+6f0d561c
+23e7c11a
+9bb3a560
+50f4e28a
+558a2680
+adf4e701
+2244d16a
+79d88089
+6a002f59
+337b2014
+4be9ce03
+b12b0205
+cde7d5a2
+96fa0723
+0749e649
+46b1f339
+1a9dbfe9
+8a99abc1
+a46ac80b
+5e3553cc
+45ab2c55
+3afc13ce
+fbe2edba
+86a562f6
+a8c69066
+7f56dad6
+3e0e22f0
+db5eb19d
+393af595
+06ee81ba
+a5375493
+ffc981e6
+5c7c893d
+3753b9eb
+56ef22e9
+d87d491f
+e9b1b7e2
+e6cc2641
+c16dd063
+337f5af7
+f3678585
+f26b7f57
+b4a8738e
+46a09953
+22b0870f
+ce92c282
+90869172
+01047267
+08b8971d
+53550bd8
+17a0cc50
+d10e3a7b
+33a0531b
+a4f81e90
+58da4dc4
+ae210a42
+c4ffcd11
+3c982956
+20511efd
+2011f780
+aadb34d7
+58f611af
+b4f213ad
+3dab57ff
+b6b62d7c
+2a47dab8
+1585cd0d
+35940a6d
+7fdb06fe
+155e735d
+c21d422b
+0a197c54
+d35e1d60
+3f07fd24
+d7c62471
+33fb6281
+0bf31079
+2729f67b
+a1b65b52
+5612701e
+dec405d0
+67e4e8f5
+2b85a3f6
+afa26c81
+99009ad9
+0aba5e07
+bac95df6
+ce938a7c
+a8bdf9ae
+907110ab
+00f2b452
+ff4b22bb
+3afb9285
+773e9b96
+3c14599e
+f5b2855c
+3d12126c
+07134e2e
+95043468
+d8972ad3
+b29862ed
+1910742d
+3e26b10b
+359cb6b0
+28c9a1fe
+8aea823e
+8cc354a1
+814278fa
+f4a6fba6
+060196aa
+d31d362f
+42302f1e
+fa46949c
+402b6ab6
+1000fc95
+24e322b0
+2b53e5fb
+7b68cac5
+f1561b42
+0be656f9
+fe687d88
+b2eee74d
+24ff6dd9
+b5a4a1b8
+1a88cf9b
+ba85085b
+416b5b64
+46c32c26
+02ee778d
+cba9fce5
+28bd0575
+cde5d6cd
+40cd4d57
+84bde65b
+f027dfc0
+e70742b0
+c8e30f21
+e8454fac
+9a14870f
+6700e25a
+bf1ddb5d
+feb38f87
+56376c23
+37a40843
+1363daa1
+5dbaabb1
+d319dfed
+62ad6cac
+0fddc33f
+91c3f75e
+06965c3d
+0eed8297
+906d2709
+9b83013e
+a4602f88
+8d0f7d7d
+4b415bb3
+691fab4c
+16803b43
+5bd9a453
+bed00763
+3aadb51d
+a93f9684
+3da4e300
+663b5c9d
+d21c67cc
+69dad686
+ec4d75ea
+c77292d6
+29b00bcc
+015ac893
+3b351828
+663eefea
+1cd85054
+99afd380
+73f5c3b4
+4624c4e8
+94e616d0
+3a45da11
+ed3d3020
+d9318e91
+7bfe920f
+7636f6c8
+8a8cd8fb
+01a1468f
+5fe250bc
+9c86a61a
+5c49c210
+38b45203
+b7003483
+8e3b57d6
+b198a040
+160c6060
+e4559b2e
+eab78bab
+74b211ca
+75d852fc
+8dc24aa0
+db46288b
+b1d7a288
+2ed15862
+6698671f
+661e6d51
+2341c5e8
+de544024
+d5d5c8cc
+5d97f238
+3fbc8adf
+e79a19af
+dcc5ccee
+3a714417
+b7efa269
+f385050d
+943eca3a
+a1078166
+ffbc684a
+4e8d3607
+14781fa9
+f4878cd6
+26fa334d
+a08eee5a
+a3028b57
+7cf0f3d6
+e71f1259
+d0b8ef0e
+a6717a4b
+3723bd3a
+7746263f
+f66f7df7
+c116edf3
+fc3680e8
+3a523fc8
+da12b8e7
+6745ef4b
+33b64f41
+ae935818
+a251a8ff
+33a91747
+2b438e13
+ea1bdfa5
+afc4d756
+3fb38a44
+935a36f0
+2a8bc4d8
+b1ced7c4
+111557ff
+c6373fa0
+374bc53b
+edccdfed
+468632ac
+d6e13808
+3b9efc16
+0be3526a
+4c6078fc
+a3044e12
+4cd9b343
+da37bfee
+be7d01b3
+499f32dd
+b377cfd5
+89f9878d
+54b51cb9
+377138d3
+4adf420c
+d29027ae
+724b49e9
+e207354c
+b1e940ac
+e90a0d2c
+47e01053
+0113735c
+8702e9da
+4b343c04
+e0c337b4
+c8c4e285
+5a842825
+5d6f6fc3
+8bdf9bc6
+42429aab
+2ec6a85f
+7142ad68
+e76e53f5
+fa303997
+a9dd3a26
+0ed4b00d
+2407a361
+c69c38d4
+49f10262
+1e2ab9fa
+55f6ada0
+f62834dd
+ee4444a2
+8ce94bed
+b45daa5e
+58f020bc
+77fbe0ec
+7e0d83d4
+e9ad77b6
+6b2a72ca
+3d32f06a
+5feb1c1d
+01a07fd7
+f5edae7e
+bff849fa
+515ed6ca
+6b5ebd2e
+e625639c
+94b43404
+451bd4e4
+b95a6ad3
+45a2d21a
+7cadbe30
+5db9788f
+8f5a431e
+eff5602f
+01ba7bcb
+79913867
+950ef4aa
+21b9b0e3
+4f0c5ea9
+fea175a5
+1b8e7647
+9598fde8
+a4ce2b8b
+399a051b
+4fcd52f7
+2aeb54f9
+cfd60b98
+2a607812
+3af15bb9
+692d4004
+4d96c0fb
+8527e4b3
+76aa8efa
+7cfcb35e
+57a0d09e
+6024c84a
+dd55fb56
+8958d049
+50e29c88
+2fbe7353
+f476fbe3
+95871de0
+a9919676
+255f3655
+326fdf87
+5a01afad
+8df2b7d2
+a1618deb
+7eeb6445
+28daff0d
+4f2c396f
+cd8881c7
+8e97febe
+76534d58
+77f8290d
+0e14b5d4
+985ac50f
+1212582a
+86e94423
+f88af86a
+c7434eea
+ecf2a440
+c1c39cbf
+d37a911d
+456ee48e
+67af0a24
+23f11485
+706a704b
+13e9fdad
+d229fbfe
+d48065a4
+8944bc68
+3eecfbe4
+8d9d3025
+05f0143b
+6d30c0ba
+baf239e9
+1e2e737f
+eae262b7
+cd44503d
+6c4b6023
+5761d03c
+bb0bd08f
+8e123f97
+b0d6a2ae
+81dd15c2
+5e1cc39e
+6cc17fda
+cc7a7a5c
+8b437ed5
+cac43882
+48317e70
+870c83ec
+2134f605
+ef9084e1
+cd43cb3f
+3b76bfa9
+80512db7
+c67d0dd2
+ddf89415
+527e7d0c
+6976d08b
+eb30547e
+3905b732
+44a82060
+c9d23c54
+334397be
+69bfae5c
+7643ecbb
+82ab7978
+e0e9eeb8
+2618e2c5
+feb6db0b
+d62b39ca
+0822dd5e
+d24eeac1
+3ccfe0c0
+824c70df
+567ba666
+6bdb944c
+3ca00bf5
+54e4f35b
+cf2fe68a
+5fd9cdb3
+1d0ae71a
+09ff33b3
+c90ae1ba
+e3d58036
+7d2c334f
+72593448
+4458bd2f
+e502ac14
+e202b934
+ae07e31d
+c6cdbfae
+f665b5a4
+c8f646ea
+ab480cda
+43a46434
+72899ea3
+e037b52d
+f1317066
+832ea201
+9d7e4fe0
+e303eb3a
+6aa9a5b6
+8fcc403c
+0f086324
+d88b2b52
+cc912768
+6a69ea7f
+b9677080
+42635bfd
+b595d9a3
+6d3070fb
+d1374258
+a8fee414
+ca53fc84
+8e321faf
+938c8515
+15d48d19
+71c30a63
+c510044d
+fabfe70a
+38c8e16e
+31757b75
+5080de78
+165431e3
+b020056e
+9b2a83c5
+b95c890d
+ed6c3785
+24c14a4a
+3dd5f464
+e8438e24
+7a2cdcab
+d7304dc6
+3994a81b
+5c88b319
+6b97edd2
+7b7e43a5
+e48f8bf2
+f7276337
+2ec4dbbb
+5cd48b96
+c77f7d2f
+6c087261
+5208991a
+4ed88d64
+d7ae8050
+f1189ac6
+3d8bc3c8
+128d9bd0
+c0948268
+ecaf7c72
+e5edcbd4
+ca1bb880
+d12db685
+a88972b8
+ccbfe296
+baa1d4df
+78ed0c4d
+d393fb81
+42dfc7f7
+6c89307e
+55d80497
+c1e1b6f2
+2f45a7d3
+0446ed7f
+4182e17e
+d5e9553c
+4661c871
+015dced7
+4262b839
+8c98d8c4
+631ddef6
+492044d0
+9c8b2ced
+5be0d285
+853f0ce5
+3275d09a
+89907d9b
+a2dfcd9f
+a9043efc
+4c7842d5
+22d25595
+8bfda13d
+83d5c7e8
+72820433
+97786649
+fe8c0c37
+f7c01119
+a1060a95
+0a959b50
+dab57562
+70d3f60a
+2451587f
+02b3a27d
+a78c9a05
+bda4e331
+147a12f7
+d6104949
+bac719b6
+b3bfacba
+4684ca03
+14955006
+7bd64769
+887d9642
+60112853
+28d50dc5
+a2ffc7fa
+80b18b16
+9e5006cd
+c6c8dd7c
+1badc2d2
+0f91e8cc
+55c1d42a
+745ce906
+cfe44c97
+83fc275c
+739e6d7e
+20bb74cf
+54c4bae4
+8b7bc54c
+73a53e2f
+6d7007cb
+ec79b2d8
+72ea60f7
+dcaa1dd3
+5168789b
+f6c6d9f8
+bf13525e
+4f517438
+9bc1a7c1
+65993851
+82bb4986
+7259dc52
+1fc41e60
+f693249e
+686e97b9
+038785ca
+1a86275d
+9ef16dec
+a5188390
+f2376412
+6acfc6a6
+f8d69a87
+131267ca
+434f9ed9
+dc367c5f
+b2bb7085
+48bedfe7
+57f7d8b1
+5097d18e
+19b9435e
+33ecf57b
+73af8251
+e938fc82
+d04aae7d
+7ad4ea2c
+bc4c21ef
+a85e0b9b
+f605ef77
+03e48276
+13e7f44e
+739ff196
+669b0ac5
+d7cfb8e2
+599db241
+30fdb872
+69af56b9
+004b452f
+980d90f4
+89201d31
+cf368215
+1b74d531
+c0d5d38a
+8228dde1
+bbbd7e2b
+0506bfd4
+b64775cc
+e113fc4b
+f4233f25
+61a7526e
+0ea7c76e
+358f48c7
+c6c8271d
+8fda7cf8
+726f00fd
+5932d31f
+794fedab
+d2e3dec0
+294e4d47
+3a2c2ea8
+5ff2e290
+7b233b75
+d6c63567
+f6b9bbd2
+bfc44ba9
+f5e90c82
+304fa5ee
+54634f71
+5e1b1d0e
+6997b535
+34c6dc22
+cfa6c3bf
+da87807c
+cb3c0ba3
+beb8cbb0
+b5ce065f
+7f57646a
+6a39d150
+8d2bba78
+732b4cb5
+9f7517e0
+5c83c2dd
+ed086ca2
+d39b9463
+83544135
+f3e003c4
+c40f2e3f
+b2aeffc0
+f6888a32
+597ee6dc
+c4c151ed
+46087f7a
+15b532b1
+a567fd47
+4c89c3af
+88a43e6d
+afe4ade4
+0a263d38
+391ec177
+cc4b09df
+cd122c7e
+ee59a960
+8c827f7f
+98d8d031
+3107b950
+5769e63a
+8446dc39
+0a3a2cb6
+c5f36ac2
+2fb5429d
+35f2cb4c
+a21d9eab
+f1b45aab
+2b98ae01
+66c281d9
+d3d7a5ca
+e050cd8b
+3cd0db39
+9333e74c
+aca41307
+53f5f0ed
+649ab4dc
+a1613c35
+22ff76b7
+23de5a4a
+ee325ae9
+54d94652
+9de7c14e
+852273ac
+67eea4ef
+ad208ee6
+7f02edac
+f0c8b1be
+24ee718f
+e53bceb4
+fe28e8a8
+e13d8f54
+4c2180da
+0e63575c
+a799ed45
+3a7915a0
+e029047e
+2bf72324
+c54560e0
+3df44b8a
+33f67854
+92f83f4e
+7d48ae56
+372f26e1
+610f6d4c
+0cad5278
+dfeed14a
+74db6d7d
+9a09a6bb
+b1ed2e73
+a4074cba
+695b51dc
+eb4b8edb
+08658f3b
+229ba619
+df7b8074
+b3badaa9
+6f3d6efc
+0eca1729
+230405f5
+9c4dd39e
+4600bc29
+eefb9756
+a1f25462
+0d6c9034
+9a05ee56
+547c0ffe
+0945b3f6
+61fa6541
+8c4d4775
+479d08d5
+799b1690
+07704244
+f1311559
+ccf705a3
+e71a3721
+e4f51aba
+6c47047a
+2a55d52e
+2c318def
+f0fa18d0
+ff4776d6
+903f1f14
+6e7947ce
+a5bb26cf
+b320e88d
+aa4b3ff6
+d520251d
+392463fc
+622fc8eb
+abcafd02
+7e930516
+830ac546
+bd07a556
+ec98e31a
+d86da906
+29132dbe
+724beea1
+b9fe941e
+a53e3202
+af94b16c
+16a6e853
+401ced54
+422a3712
+8b7e0638
+6aea41c7
+b1b6c11f
+f51aaa12
+9996c0a6
+942bdf7d
+6cf2533c
+beee742f
+896ef94e
+392644bf
+e7766a34
+f902af47
+22eb9f3d
+45bf62cc
+57c97e3d
+63c8d3d5
+36c6971d
+9685e7f5
+bdad3f20
+cb90a903
+60290797
+de717989
+a3d4e5bf
+a8cd5504
+8902507b
+069be97c
+37aeaad2
+04a6cddd
+8a7eec4a
+55ecc706
+5a11102b
+14e49183
+7f518378
+2b24d61e
+67cebf16
+3fae39c5
+49d1ad89
+c45f801d
+13516e01
+5bee1ad7
+39046df2
+a6b6043a
+8332a54f
+7edea927
+9ec876ba
+398fb903
+b94d9d6d
+4bd70b9b
+456523e2
+3e8d379b
+8b8cd228
+2f829c7a
+fa7d0797
+ea23b002
+eaf136b6
+cf05f704
+97cd93b6
+5a460471
+d3787b55
+57b09f76
+189460b0
+c731d31c
+2c9992e3
+316007da
+aae04188
+9a5e0260
+40df519b
+2ce114bc
+4e4ecb8e
+465779f0
+85e276f4
+4e0a7911
+12f7fc9e
+122c7980
+8b4f0fad
+97d3ddaa
+10011256
+bbd117b4
+1704ec8f
+3562e97a
+02088995
+6bbfd4f3
+25517785
+762081fd
+81271509
+df361421
+cb89a731
+c477c997
+aed3d80e
+b71e883a
+1866b7d4
+cfc21884
+62e331b2
+891a72b6
+944ea55d
+b1f3f2b4
+106f7cda
+ef7bced1
+1ac6b94c
+637e5734
+09ead1d3
+a0d48449
+4ce8f99f
+7cda6c86
+752c830e
+ba12b984
+c557768a
+79556672
+dec9ee0b
+dac59909
+96fc0b70
+87237511
+8a176218
+d3d8086d
+f4b83f99
+f1cc6fa4
+79f15f43
+175d6c71
+2c58ad9b
+86aaf467
+02bf81ed
+22f136d5
+083944ee
+1b46e71c
+da950aa8
+89977be6
+88c5e4c8
+42a7764a
+1d56e466
+d8babd50
+ede207dc
+ab029c73
+9ea38862
+2e834f9a
+f7ab55a0
+514cac09
+487e84a5
+66ae8393
+2bf8bed1
+8b0b85e5
+89aa5c1d
+39cda501
+bd4827d9
+2188411d
+2262dd8a
+a02952bf
+bc082914
+6107a156
+dc790dda
+6739124c
+4d0f5c58
+1650c2ec
+bcecd637
+5772038b
+cd5f2acf
+272e7ece
+041aef9c
+1fcfcb5a
+c3af7909
+188b9fe1
+757f1081
+d092f850
+3ff5f6c9
+dfa3d0ad
+a7c6996e
+ecac5559
+0b64fe05
+b72eaafe
+54dee4bc
+d61cc293
+3611435b
+00925bbb
+0870f385
+c08e58a1
+b393caa5
+e8c6d5af
+8fa0b8fc
+33800760
+012bac1e
+b6168da6
+9206713a
+e9995d97
+c7af51a6
+30517549
+c3e69838
+9d109687
+84462a5b
+e623e09f
+eaa3a97b
+00ad84d8
+92135ede
+ce9fc368
+79d97513
+12fa2938
+299aecf1
+e38cbfac
+612f8227
+65a3f12e
+55384a15
+076295ce
+a3e2e7a5
+04078da6
+8b00e866
+4cd49990
+13af20e5
+b755fe49
+748e1ebc
+a3a12501
+343ac6ed
+e034d733
+d341fee5
+ff7d71b8
+d9bde05f
+5bb8abcb
+7e435bf1
+0a2e6d2e
+de89c3d2
+550727c0
+325780fb
+aa9347e0
+410a0a99
+2462946f
+7cb8561e
+f679a9ac
+02516dfc
+a0bf6af3
+e89812b3
+9343fa06
+a32a6c9e
+2bfbb2fc
+c4c22604
+91b5a423
+e4802c36
+083998ce
+1dda5fa3
+245361b7
+ceee5d18
+88a133e0
+fe01516c
+4e979b5e
+a115aaee
+85258115
+8567b763
+6f0b6a04
+4ec65f86
+df2138cf
+7fc0c2b2
+c295a1ce
+d5972498
+418b1fb1
+74ff6ea4
+814f28f7
+5717f8f7
+cc41a65c
+1dc67798
+0f2ec50d
+dbd3ed6d
+b5489539
+a8793b14
+b2d2f56a
+ae37a48e
+39f59c08
+753998f6
+51eac460
+62669db8
+1768414e
+343ca46a
+bdfd8a02
+bbab92de
+cec295e1
+5c309cee
+4f11d1f4
+2a1dd548
+cbae81ee
+23f3797c
+e5fc4a2b
+438a806d
+02705efe
+dfaaa501
+0bc4c053
+b941752a
+5c217e41
+e8f7c7e8
+0bcbbcd0
+084bb20d
+d93ced0c
+bc2b8616
+d023b3d2
+b7f7d54f
+e9f37f7e
+af4c380a
+eac68878
+30596e6f
+6a8de37e
+a37c8b45
+97acbbc2
+de551d2f
+72314724
+a1ee64a6
+fdf60c9c
+80fca57d
+bd08de2f
+60dca7a3
+b824973f
+79f7c195
+d55c6c33
+bbd5dea0
+45f00e25
+caff9963
+4459517c
+033c8168
+a7aea1ba
+2124a520
+b8b81ee6
+296f0d15
+7c3ba41b
+07c7b3f7
+4e56c58e
+37706054
+8ffa5d97
+37eb4c99
+a17fd261
+ba630905
+69b8df56
+3155f559
+6ffb4550
+d2f1c80c
+1a428761
+933c819b
+94adfbe7
+e8c8c3a9
+da609e27
+d422ad04
+18dacb0e
+7f314591
+4fc6fe06
+9f335e83
+7471f41c
+335254db
+3005cb05
+7c79c8f6
+dc650390
+d668848a
+13623384
+3c1e7151
+79793ca8
+e5c1c957
+b28c67ba
+c9e11adf
+d4f05169
+b05b0018
+58172e5a
+e51df783
+88b0ca3c
+92e70d0f
+99810933
+e2432229
+1e2bd8f8
+a384f95f
+afbc3455
+121038f7
+a8d1ae09
+aedbd1e7
+a27d7b04
+bd45ab0c
+7dab1649
+010e2c68
+89be0406
+3e07d661
+83b0b6d1
+f434c77a
+fd591517
+595e392d
+9e2d5050
+d31c4758
+ce544aa1
+7ca23b4a
+2af3fe82
+87f5c49d
+015dc527
+d56146bc
+54e99be5
+8975cf25
+ebb8102f
+63d165f3
+d649a368
+c195d620
+cbc7bc74
+6f07d986
+e949720a
+fe8687bd
+f3b0a575
+62cfc6bd
+a75928f0
+95b6ef60
+f206942a
+bbaa2887
+e3432089
+5b9d1e4d
+e29743a0
+e3a43a6f
+79a57263
+44535dd9
+22e03083
+e801aed0
+1853a68d
+3013a9ec
+0429de61
+5aecc062
+f71a9de3
+50636bef
+f956319a
+0ca624bc
+2b3a2e72
+36ea002f
+e328a4b5
+c9556546
+a17186da
+cc3d7a75
+793cb1f2
+de7c4f1b
+8ec317ae
+e5cadd10
+b50c2223
+1964d309
+a0eb88e1
+25ddaad7
+df6ef679
+645464f7
+5bd53d40
+b486d165
+df41254d
+8b7e21f6
+afa0822d
+6d7e8644
+5783071a
+38d73ba3
+cd8f34fb
+b0aed267
+37843f80
+c5566d9f
+f8fc9154
+74942bb0
+50c56209
+3440d43b
+ae30c15e
+8a8307e8
+bc283a64
+93aa528a
+48923fe5
+5a64c137
+ad1afe78
diff --git a/models/rank/dcn/data/sample_data/vocab/C11.txt b/models/rank/dcn/data/sample_data/vocab/C11.txt
new file mode 100644
index 0000000000000000000000000000000000000000..311221d155de8558a93f2f3f2f2ba652616f12fc
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C11.txt
@@ -0,0 +1,2246 @@
+fa6526a0
+0a76236e
+07128696
+facf05cc
+a739bbee
+a847377d
+61593534
+804df57f
+f72fff3d
+45e3ea2a
+f89fe102
+aa5e0431
+7970068e
+838d6a6a
+98bf6715
+b205dccf
+ef8b8995
+668d693c
+0524c7b4
+35d24a2f
+d08bad2a
+9ad3b3ec
+2de821fc
+dab547a5
+37428807
+f2537412
+9ca5c0f6
+8e05d0d5
+1a036e26
+3bcfd189
+e9cea188
+fd7856c1
+80cc1e76
+5f9d9fd9
+ef65befe
+15cd287d
+6743b177
+394e4e3f
+1e41b6a4
+4e8abdaa
+2566311a
+75a64bb4
+276be673
+68115e0b
+f29b9ed2
+6e067112
+c7435d5a
+6263d404
+ac416c77
+434d6c13
+0e2e5e48
+2b4132f2
+84eb4210
+d9244025
+7e2c5c15
+dcdf9753
+ee9e620f
+ce3dfeb8
+9f0003f4
+95fb4c31
+e81438fc
+2bfe015c
+67aa8b13
+1bde707f
+d20ffd8f
+f2a5d7d2
+034539cd
+60a1c175
+7845abb2
+5de52301
+86ca7dc8
+415f416d
+dcc84468
+05af0aa0
+ae15df0c
+2161e3dd
+5447972d
+d0727572
+93f5ebb7
+fbb27036
+f1b78ab4
+54649d62
+0601c4d9
+b4e53c60
+5db56e1f
+e973bfd7
+e92f89be
+4be2b24f
+474b4dbf
+e0c3f6d9
+5ef0d562
+dd0c7036
+1a0a36bd
+58c2a175
+52105669
+ff360943
+a1288914
+a33816cf
+a0a5e9d7
+bffe9c30
+512b93d4
+3bf272bf
+8d75f3f0
+868a9e47
+70caa7c8
+06d58ceb
+bf09be0e
+43d9f976
+843d8639
+fe951f20
+b73648e6
+087dfcfd
+f0d7528c
+ba1ff80a
+60d2afd7
+7373475d
+497f72ca
+407e9439
+2df02cf1
+46a894f7
+a8e564a4
+1313f20a
+ddf72fe8
+11fbf407
+7bc4afbf
+cd40d7db
+d70e2491
+793356a6
+62adf0d3
+9c50f4f2
+4a31f431
+bd56ba93
+dd542e6d
+5420373c
+f4aee513
+f499652b
+d5a66f22
+f293f2e6
+d79bad49
+9e12e146
+b925bab7
+be28d080
+2cc0193e
+65819b7a
+bf2008fa
+a47c5009
+9ffb3655
+248679bb
+0e4ebdac
+26e56083
+089d43ff
+e3baf8d4
+ac52bf19
+69628afd
+8924112e
+ad757a5a
+77212bd7
+fccfdad3
+2b9c7071
+df93ff85
+c959e5b8
+34c909fe
+12cf60ea
+043725ae
+a21d2994
+1d125cbb
+08b5d3b2
+c6dfa670
+62042a07
+0a6a17de
+7397d690
+1cba690a
+ba0f9e8a
+8a7e2934
+e3ee9d2e
+a95a8954
+d91d8560
+ff48b5b4
+12f01a32
+a1e02e8a
+9e1e00cf
+c026aaca
+a7645fc3
+723eb72b
+deb288a9
+c8e9dff6
+4c14b9bc
+4fa8c39d
+6e647667
+78d9666b
+b49d37b3
+ecb2338a
+88cbaede
+ef1bfbab
+86720f44
+31fbbd11
+3a3c960c
+9a024337
+f61e81dd
+e4fbabb4
+73be5cc3
+8b94178b
+e6f56a1b
+d556b556
+2996a71e
+eeb15007
+8bc6ab4e
+ac846092
+6cd6a35b
+87dc9425
+258020a4
+d4e10454
+5ef59ae3
+dceff55c
+f11e784f
+9166ec18
+5fa0c6ac
+4b602e6b
+e163daf6
+873a0039
+5e183c58
+5f0f014d
+d83caed6
+39f870ad
+be281e8c
+1a2065ce
+01df04b2
+761e668d
+b1a5e8a6
+dbfab1aa
+d39dfd5d
+d850f8dd
+35fea499
+7defe259
+745416a3
+6624dcd4
+4427adce
+c0edaa76
+315b651b
+4fcc6c51
+963a782f
+3af49b08
+a10c0fc9
+b06ac93b
+a2c1d2d9
+71265574
+85d57109
+c0e757b7
+4950c85b
+8f2a5d32
+9215e3c6
+c5bbf6a9
+28404bee
+75a786d9
+ec88dd34
+89f2ad37
+07c67b85
+f958dd70
+3f8283fe
+540c97b9
+e7c049c2
+0f736a0c
+27258c97
+54ca8d16
+dbdb7970
+6b0d4f04
+54e3c2f7
+e7c8dd10
+ad3de7df
+750c33d8
+c3ce49fb
+69926409
+83dba508
+b071da68
+8b851381
+c255f829
+41516dc9
+ae6dcfce
+26472930
+9925d6d3
+19ea7894
+1de95640
+10e6a64f
+1f14947f
+cfcea1c3
+ff694829
+d0ca2b2f
+c6377b31
+2b72fa0c
+d4c2039b
+7b61aa9b
+51eb4c2d
+43f6a84f
+8b92652b
+a631eb8f
+8a26f2e1
+a60de4e5
+38aca7c7
+69753b72
+d2b7c44b
+c5d99561
+731cd88c
+63b4ceea
+b38835a9
+b5939c49
+8c997030
+76120d9d
+86c05043
+2a7d9fca
+d66168dc
+c4a3755f
+69d26514
+e24ff9e7
+105496a0
+8888be74
+b176603f
+9a47e077
+4d99801f
+494bd436
+60adb56e
+4462494b
+36706e24
+1d351a39
+59b7f136
+5bee5497
+f2313205
+c9168a8c
+63c88067
+e16bba2e
+a3411756
+503d9588
+284551d1
+6df3cb7a
+7e40f08a
+97365bba
+e98d0c6a
+fef0266d
+61cb7f22
+acac17a6
+89073265
+19d27eca
+df80c9eb
+d912cbc4
+b750092e
+d83c9bd0
+8fa5dd63
+76dd75bd
+e09c447b
+56cb5b8a
+0044e5ec
+37e1e439
+6cac6e94
+89d1fc33
+6685ea28
+ab06ce90
+ef379d27
+8a79d845
+8f7b26d4
+e04f77e2
+c389b738
+cf690be6
+b1db274c
+925f62dd
+28b2a54d
+1c1abc64
+a4ea009a
+e6c92dd9
+55506080
+2b31063f
+e6c365aa
+d9466be4
+d433166d
+19de763a
+c24bcce6
+cacf7f71
+7c7199cb
+81cef21c
+caecb243
+04fa5bf2
+8ce8df41
+27a6c0d1
+d6ea7935
+e9cae52f
+be099a80
+f497a54f
+60e58dde
+348ca00b
+02ab188b
+64d5ed07
+24adbadc
+d87a8147
+4ebb5e92
+d74f7351
+391d1efe
+f629f438
+298f4138
+75b1673e
+0fd651a2
+d5df6cd3
+4aa47037
+ebd30041
+bef6a013
+e25a89f7
+63ab36be
+3e128095
+01a25c9a
+ce13c273
+29d227bd
+2397f9e6
+b4105d5a
+981a1f4f
+e07b8196
+3e794ff5
+94d2aad8
+e52152c0
+ee8f8975
+520c3688
+680bf6cb
+d24aec2b
+d34b988b
+149238ed
+93aed850
+c9ddcfd0
+7865b9ab
+88731e13
+f03a9de3
+1b723d3c
+1c4009ed
+a284f9c4
+38317f6d
+4e0b2bc3
+9ffc0b1a
+f00ad1d7
+86fae75f
+eb3851c7
+88030678
+6ebaee32
+f180b699
+709d0c1d
+785a24cb
+562ab4a7
+e469acef
+1c3fae87
+9fa1dee2
+9fc5181f
+7fb8555d
+defdb18a
+b4eb1891
+c05dff4b
+7beb451a
+e90cbbe1
+c82f1813
+fdd3a1fa
+17586bd8
+49aeb6a9
+3f6af40a
+193ecef0
+8bb35684
+d8d7567b
+319687c9
+c4bd1c72
+b0de0b18
+8fca7b8d
+d47df190
+33765c29
+b5565b1c
+3d566bf6
+5587bbc1
+7f21b03d
+42d37e0a
+9ba53fcc
+cb755bc4
+bc8d1448
+51ef0313
+0b048c07
+eacae3ce
+bfacd3e5
+0b9f69c5
+2ec0a569
+ef91e49c
+5ed6ddb4
+488e8c8d
+9ad95d23
+1c3f82f9
+6ed062c6
+17d8db0e
+9bcaeafe
+4ca1899c
+10b3128b
+064b8acf
+7622c80f
+5cab60cb
+ea602e11
+1d4f52a9
+4f1210cf
+68adf3f2
+98096bae
+431f1b36
+ba34ed14
+208d9dd6
+d3e650fb
+f697a983
+b7301ecc
+86f426fa
+cef3fa21
+48540920
+81a23494
+1a1d177e
+9e4e8906
+340c0959
+6d91e005
+efb18b6a
+6872a82d
+2db82d51
+1f36d28a
+cfdc314b
+c3c3306c
+70948dc7
+4827f801
+9a995116
+ff78732c
+50578e76
+7eda4554
+44482217
+c19406bc
+cb55be22
+3ac87d37
+69593bb9
+67360210
+fb5156e6
+45ea640a
+bb3750e8
+d1a4bd60
+4b0929e2
+c7164dd7
+1c850e0f
+7d87ae83
+f53ac662
+e6959f26
+b813ea6b
+4e258239
+1bb4f435
+3dfa865d
+8f68a279
+ff3e74e6
+9f7c4fc1
+a3d2f3d0
+1f84ae2e
+0014be64
+48876b80
+4ba74619
+13f669ec
+09ccd2da
+de035f75
+5b225578
+4352b29b
+acfbbebf
+e0c3cae0
+727af3e2
+1791efef
+780660db
+a15790a3
+364e8b48
+6fc9d950
+1ac15ae5
+5b20ba06
+a2460a9f
+39e1365f
+30a76a50
+d407af40
+f339b72e
+f2a195a2
+1294fec1
+d87ff710
+2dc2a28b
+82efeb82
+422750b7
+2b336a73
+0a8f4121
+911b463c
+7940fc2a
+8139d33e
+3179c622
+54bfa8d0
+ba2f8f9c
+b2914f93
+c3752eb8
+8d3d2b01
+e8da7aef
+a9b84bd5
+30b2881b
+36bccca0
+6c8d25d9
+5f3eb205
+2e1a735b
+6fc1ac4c
+98291270
+2ec02461
+1333e775
+45ca0a69
+46759dcb
+3722e006
+179de637
+eb94162a
+13429d76
+4eddcd83
+3db1963c
+b14be692
+740f5b4f
+ab60c4de
+a9f02987
+5fff2f45
+cd8f9ff6
+db1106ce
+3bb8e704
+e8f11449
+5298a972
+43327221
+b5bb9d63
+c7cf63e9
+8aa7f20a
+cdeb6d98
+5f5e6091
+6348db7c
+9d7e66c3
+88ac36d5
+f677069d
+5a539822
+873e6871
+a950f02e
+39ddd652
+9c4ff10d
+ac5b959b
+16faa766
+6ea0f130
+62750746
+6e76c47f
+6361f816
+23feec22
+d14484c6
+8ae133d8
+cdb97ac0
+2a0b79f8
+b28889f8
+3ad41aaa
+1ce24b7d
+ab9a95af
+60b755e3
+757868ef
+541383bd
+c08aeb9d
+cc4e770a
+d778c774
+8f410860
+50a20b02
+9e271da4
+4899e211
+94952d35
+76030d37
+703083f2
+7c53dc69
+6fd96e76
+8ca164ab
+4b31c286
+acc758fc
+ae19a197
+6fadc5f9
+afdd3fca
+0b60ef54
+07a46575
+e349711d
+7671c62f
+69c3827e
+e0811b9f
+e303df6d
+209ac897
+ee26f284
+00e16cd3
+e3205ff0
+02ab57a9
+bd46133b
+d4619bed
+ebcb3ba3
+cd797342
+3d7fac52
+5d1b7285
+659434fe
+f40d1e68
+fbeb00ee
+0cb221d0
+ea4adb47
+bda62d37
+779a5106
+3dcc32d2
+e5cd3d61
+9ddfe8c7
+562311c7
+2131198e
+2b5aa2bf
+4ebe7951
+9f73d324
+633b9489
+50349a3f
+4e218488
+c92f4124
+9a993c35
+f1b649a3
+d3246de4
+4160baab
+ef121faa
+c1700682
+a5a03018
+05286645
+a6cb287e
+74475d27
+9c9c7308
+c8b904a3
+6394ad05
+7b760927
+a396c8d9
+7158b034
+febe567a
+30f71c07
+ef817d57
+d2830bad
+ac2c5371
+6c9a3693
+ab855bcc
+034f08f3
+b67e50d8
+08bbd6df
+081c279a
+d4a6e08f
+e07776f0
+4cd7127a
+904aa145
+28929846
+47db1bd7
+595cd8f9
+6ee38365
+aa1ed092
+f3d551b3
+cfd9d2a6
+9d9b97f6
+bbd76afe
+c4525253
+7fd08581
+26dd1ee5
+feb49a68
+636405ac
+25f4f871
+a3fd26e0
+5d7dd95e
+bb40a095
+bc61decf
+4ce044d9
+1495215c
+6944f67f
+03eac08a
+cba091a3
+2700f421
+ba2ce7a2
+5330ca74
+6fcaf9ee
+df29f7bb
+6a5e1ce5
+33fa033a
+5738b0bf
+9839fbc5
+0e0ed736
+d75847b4
+93163196
+00ccd0c5
+161c0138
+d0c3ead8
+37deccff
+0a8fd877
+c3e44774
+c1a1a022
+2cae895d
+e7553038
+f295b28a
+19f5f5dd
+4b8a85bd
+07678d3e
+10c54501
+1eb61c99
+ad4f09de
+f5c42ad7
+a0085b3e
+8d689bd2
+adac42e2
+2271d551
+3e38ef40
+4829f487
+d13e1160
+83f1ad4a
+c19f03c7
+6e1654c5
+43bc35a8
+ad2bc6f4
+392b18ed
+44ccb891
+9a3e4210
+78664cf9
+79c3312f
+83baed7f
+75c39076
+defc0075
+6c0215d6
+38718806
+01e8b761
+72a65bcc
+1c9ad75d
+3547565f
+2ec2c272
+094d98ad
+b3410e99
+3d5fb018
+8f8d1f20
+02efa108
+61839c39
+98579192
+859b343f
+61422b9a
+6c429741
+773c8555
+4153544b
+c3516644
+9a42c09c
+4a77ddca
+b2cb9c98
+d0fb938a
+ed04080f
+a83e53ab
+7331bbf1
+1fc14089
+8f4f8f83
+f161ec47
+b902a565
+4be981e7
+1dd0d4a0
+746cc630
+a4e68497
+7975199d
+c813da1a
+21ebeddc
+26df63c3
+418037d7
+d52cdd9b
+9fc87b07
+753571c7
+70b31aec
+25647a7f
+1808565a
+e931c5cd
+cc0d592a
+620c8561
+860c302b
+38c9f469
+1acca553
+6ad82a5d
+7760d878
+8aabdae8
+507605d4
+fee58969
+a7b6a17e
+09263bd9
+d21494f8
+f5a125f1
+e93fcc6d
+44fa9a7f
+df9d9c50
+c6cb726f
+b7681f11
+e66005d3
+0b8d6fd9
+0dd82f01
+19a2ded8
+46febd4d
+f57856be
+11948fab
+57f01cf7
+59cd5ae7
+27446250
+13d4586f
+f1911847
+1739932b
+90b648e0
+d19fc8c6
+5ab334eb
+b785d136
+daeb791d
+530b0b2d
+89103b4b
+b007257e
+01196eee
+bbf0e61b
+8eb2eac5
+fea375b9
+7579b566
+d30a51a7
+b77c4857
+cec6a8ca
+8956b1d3
+2c44514d
+ec2b795a
+1ed3ae25
+3bfee234
+03458ded
+d962f011
+6cfd6e77
+e192b186
+f4c5992d
+00f107e8
+612c40c1
+65695d1f
+f7433a43
+60a1720f
+d02264a7
+4a1677bb
+70dcd184
+ea857ad7
+d206c325
+aef30e3e
+6803595d
+278636c9
+37fba4a7
+2140aa09
+608452cc
+dd244141
+7cb56051
+024eda9a
+864d33c2
+6a447eb3
+6033ad73
+b44883b5
+83cad30f
+553ebda3
+89f1b83e
+b7bb7a17
+9bc67cd3
+ae7000db
+bb1cf240
+720446f5
+f2f4d27c
+57784783
+3d1fed94
+41b3f655
+bc786d15
+222919a2
+7696b047
+ac60dfda
+20beca9a
+8cfaeec1
+d2676d80
+f3c8cee9
+c27c1197
+9841e6d9
+742b7c22
+cc665c85
+ea86da03
+371dae82
+ae84264a
+529e8447
+72719926
+84b20221
+24f22538
+2872a4bd
+b0f4c8f5
+8bd4b780
+a08945ca
+0ad34f2b
+0f1fa8b8
+9e28e80a
+96a54d80
+585a8b28
+16ef27ce
+32b2a883
+7f9d1b4e
+ada36e89
+555eeba5
+0741a7f1
+c3c8550f
+9c0e6cbb
+ed3ff64c
+afa98192
+6514ea2d
+6643a666
+70d4d706
+0d5c2942
+af48e415
+53dc99b0
+2d5077d8
+06474f17
+a0be29b1
+611103d4
+031ba22d
+d697c57a
+eb9eb939
+7ca01a9d
+2869bd0a
+88196a93
+3de09421
+2b2bf391
+a7b606c4
+547c304b
+5ba8124d
+a12fca95
+640d8b63
+da89cb9b
+cd581ca6
+606866a9
+a7b7cb1b
+5358eb25
+2e8b75e2
+d9085127
+b85b416c
+6aff7e87
+95cad465
+8147447e
+c1b6abff
+0205988f
+fa1b06e6
+41656eae
+cb70bc55
+eb07d8ec
+c4adf918
+68357db6
+0d24afc6
+6c19f56d
+222b3777
+b2b8c49c
+c30e7b00
+c5dfd786
+d2467c8e
+91e8fc27
+43113bd0
+81b9f538
+199045c0
+159499d1
+a277f761
+39dd23e7
+f60614cc
+cdca587b
+38a367ae
+0ad37b4b
+065c8439
+cf0467b4
+7f90c133
+eb9e7931
+6e9fe9a5
+7bff66d4
+2671d3af
+2591ca7a
+1a887928
+9ed2f758
+04910a35
+b63edf55
+90bf7fef
+42a4fc1b
+1fec5e61
+74e6b5a6
+910afbbb
+82f3fe17
+0be27447
+5ee4697e
+6c27619d
+873f4a8c
+81d42f77
+4a00b569
+550d5889
+5fdca18e
+0fa9165a
+4de0e5ca
+7a3651f5
+b5c67594
+7fb7db93
+26f221d3
+f47e21eb
+3ec9c616
+0ea2f0e7
+f4d0627d
+2aa56c5c
+2a6cc1e3
+ef271697
+661c2800
+2c9174a6
+c315d3e9
+979e94d6
+e4034ebf
+ecad1737
+307c692a
+a523f48a
+26800aa4
+18d1c966
+0d04ec40
+d59f04ab
+615eab37
+d68fa39d
+c1bfba9c
+0983d89c
+c8e7f509
+a72ac67d
+7ef210b7
+ce997386
+c17b047a
+48e4d52e
+d3bf1cde
+873e9732
+e71d6444
+730987d3
+35dc8759
+dcd28acb
+980e6880
+20aef7a0
+15eced00
+2e9d5aa6
+31b286c0
+de95524b
+2de24be4
+532530d1
+5368b8e5
+63903663
+0893f6c5
+c1ee56d0
+6d61e4a1
+5c2caadf
+23f20700
+4d8549da
+f7143717
+2a55306d
+5f2b6964
+6b5f6a88
+d08d4ff1
+d650f1bd
+7c934a0b
+d059cd92
+f7756ac1
+bc862fb6
+7cf13f16
+7bbe6c06
+21f0dfc7
+d01b247d
+f272f98e
+29df6324
+4ad5ddce
+4a3262ff
+21c80bff
+3db09674
+b0fdd070
+3f335b6c
+33e57ef5
+ea77d5e7
+9c2588ba
+92ee270c
+c81aba6a
+75ae185c
+41d056e9
+12e19cc5
+e08d8d9c
+731f1c84
+95eaf7a0
+6241e24a
+138856a5
+39bff5e5
+2cda6e6e
+365017bd
+fdc97314
+9790e63b
+7056d78a
+9e19e80d
+e048fc7a
+75a867b0
+7d7cbd2f
+586674c8
+09e4ffde
+30d2a079
+26fd9541
+e2a3d92c
+e015e3d6
+efd075d5
+14f42fa7
+8d5ad79c
+b8215e06
+6cc7718d
+07d372ee
+c9208cfe
+b846a6c8
+040c1843
+ba515693
+fd318026
+ab147b82
+41e42cbd
+6301d50c
+a35ba7a0
+7a4536bb
+3f3009e8
+effca8b9
+531d1ebe
+f1f039ef
+af411eaa
+700a0da7
+77e7d573
+3290a168
+a015981d
+85384859
+7aaef595
+332a8783
+f669ffb1
+3c9ac292
+9e511730
+002aed7e
+cb95800c
+20c37f5c
+b388c135
+fdc83259
+477c3e2b
+9ee336c5
+6b286918
+d58d68bb
+f53090e9
+50594cdc
+138b2402
+f3580cf0
+9ebaacb7
+32eed026
+73529f4e
+0fc6ec45
+643327e3
+f7188a67
+30b2a438
+27465d16
+96a52b15
+afedbaa0
+1c80d81c
+dc886240
+09b20c8a
+b4bb4248
+7315b319
+273b889f
+0d989d65
+ada5494c
+0f5b971d
+38914a66
+4c074d2a
+e8d690e8
+c649e270
+ab066900
+c6b5ff1d
+f52829dc
+00adbfbb
+64bbd6b2
+447a6784
+760b1970
+8715580b
+ac2ebe45
+7bc78da9
+7883ee47
+1d7bf26b
+3168dd4c
+b7a129a6
+511ec7f5
+9f423ce8
+477ac334
+245da965
+c851b930
+9091de16
+e819cafc
+3a403d8c
+3c264af5
+faede055
+7466b255
+6ae20392
+b865c9ad
+213da3f4
+9163f8f1
+8fe2ef61
+f82e5dd9
+b1d36dc3
+ba29baf1
+3289d3e1
+1cf3e7d2
+f5387bbc
+e0e79bd6
+60d383c1
+080b546b
+0bc63bd0
+2922698c
+bc0819f7
+eac2829e
+04589a14
+6823ba31
+7495273d
+5c81e974
+fcf1ea92
+3617b5f5
+671196e3
+9d15fbc8
+a39a3a3e
+3decbee2
+dd6fc8cb
+13754a9c
+67841877
+dff85499
+98ac1c4a
+8e46b8f3
+a45c6436
+69c8c8e5
+7fee217f
+f1f0b97e
+e73bed86
+b9ec9192
+a28aaafe
+ee53454a
+07806dbd
+9afc591f
+29de8560
+71fd20d9
+30b240f3
+c0f836a7
+0a524be1
+7e449acb
+31f2fafe
+68f54e9d
+6e8f1bd5
+2e676adb
+e6e0c2dc
+5aca3f1e
+5fb649d8
+6f0e70b7
+b6759853
+9625b211
+5874c9c9
+f8bed1fb
+c1face1a
+72db3cb0
+faf36938
+9c9d4957
+dcea998f
+bb202789
+4f9ddad6
+be8a7bc2
+d6004b54
+3ac1b508
+7b6cf84a
+16dc029b
+bbf9c1a0
+4c4ecb2b
+b87ef7f7
+922bbb91
+358a1187
+1c67f554
+682b9278
+2181d913
+a6f5e788
+609032c1
+19e80277
+4d38a97d
+a8d08fd9
+f0e08020
+251b26ad
+cff871dc
+2bbe08a0
+86428265
+25076b65
+5a0a81aa
+1c541241
+0b28362c
+2386466b
+f9d0f35e
+c2bb68d8
+f741cd0d
+9cd33fcf
+5d8204e3
+1314bfd8
+4ca43e84
+b900501a
+c05bd0b8
+aa209877
+20ec800a
+297e8e76
+8aa1c159
+165c12cf
+bbbf282d
+a17df47c
+36456e9f
+2d8db3aa
+7c430b79
+0eb7632e
+e40ee698
+6a3de4e2
+f173d5e6
+bfeac7fd
+9526c084
+bfd9d6e5
+75d349ab
+726a2eb8
+7c928cc9
+757a1e4d
+5097afa1
+af6a4ffc
+c22febf3
+50df38c2
+41a49f7a
+9906c8f3
+e3603bff
+c6c91669
+fe1bd1df
+b9d19939
+2666eb13
+20008692
+23eb0c3b
+d96a7426
+55065437
+248065ad
+72788f31
+7f8ffe57
+683e14e9
+f25fe7e9
+036d21f8
+de372071
+2d9eed4d
+6dc69f41
+6ef8c86f
+010265ac
+58676f28
+cb9c5260
+701c2666
+c21c44c8
+9076a36b
+1aa6cf31
+680d0797
+6fc6ad29
+c6240b97
+da28c392
+7fd9753a
+f2796a9d
+b5a9f90e
+b25d5c21
+2064200e
+8106fb8f
+a07dad6b
+5610b0e6
+5e4d7944
+620f5b56
+163b50ea
+63566efd
+82af9502
+ed062f7d
+ad0ca57b
+62f0bb34
+0f8baa15
+2714650d
+23cd7a8a
+dae7ef8b
+f7862358
+1054ae5c
+683a8b32
+06e24bf7
+df7e8e0b
+d4180b92
+b64edd04
+e216a695
+a4d2f869
+98a38529
+d700703a
+755e4a50
+28479331
+8b6b3f43
+24d10da7
+c3a20c8d
+22e2fd54
+3a372697
+5cce7078
+e4783e44
+9642755d
+a602cdb0
+544abd9b
+15a9c688
+a7b56d70
+c804061c
+71eceb0b
+68f8bdd3
+c04ef036
+9a422971
+66024b53
+4cf05f10
+1d132b5d
+b6923fc6
+aaa08406
+d027c970
+58d8ca47
+29473fc8
+20824485
+4a6661ff
+22a0f9a5
+ce0191f2
+ae7d5498
+22bec8ee
+21f4c305
+0edeb46b
+340a88d1
+af56fc8e
+a05a0d99
+5bd8a4ae
+6c78773b
+3898d718
+cfd02f48
+94e499b0
+fb7a7aa9
+560f6a86
+742e9bd6
+59256bd8
+a353f58d
+c2cea603
+5b906b78
+56919fc5
+b857291e
+5ae9bd3a
+90e43499
+b3a61313
+cd50497e
+e2217f93
+93bc0a6a
+9b340c14
+af0b7ceb
+607db0ed
+a5f8b24c
+2839b07a
+13b11a79
+004929ef
+3475a27a
+d54a5851
+c6efad65
+76c50a94
+df994e88
+070f2ece
+9700edac
+9bd51b96
+5491e76f
+66d5ea09
+c44f7d31
+65a8f40d
+87fe3e10
+2115d03b
+5f8383cb
+a0060bca
+2c5278c9
+2efe2214
+5d247f7d
+6153cf57
+4be3b87a
+37bb0bfe
+e7ce7f20
+f5f5efa3
+52497132
+1ca0a31b
+fd0dad89
+a74169ca
+9b86870a
+d5cf9352
+49a95fa1
+ee991e99
+d2c92194
+bc414ecc
+4f1b46f3
+bafae19c
+ecf21575
+f47c33aa
+fba7e52b
+0c94bc57
+cd496470
+2a80208f
+664b7e81
+94a1f0fa
+0746c3ee
+a4e98865
+cd2897e9
+9595e278
+f9c3d82c
+da148aea
+9a660f03
+46c74967
+e59c07c6
+fde1be1f
+05588f53
+0bd0c3b3
+b7bb9e4d
+ed632a6b
+058c0a32
+aa61920c
+02e43b2c
+a61f1fd6
+749e230e
+42e01668
+b6a644d6
+ccecf8bb
+42c897f4
+7cd75ed0
+1736789a
+bf4de37f
+4d1f7d97
+dcc0e16b
+e0be7968
+669a9e7b
+2dbe1596
+44e1a73d
+f36fc282
+2aa8b840
+1be1371e
+77aa7207
+38f692a7
+d7ee0177
+dd0eb023
+4ab39743
+eb8867e1
+f66047e5
+84203386
+687bc173
+689e4c36
+b6ac69d0
+7b4b217a
+3407cf7b
+29e4ad33
+7670a456
+85078508
+66eaddb8
+149170d2
+19fa30eb
+0a857eec
+383e77c6
+41475a84
+ad39ba86
+db9587fe
+d9634d7d
+f37efd0f
+b4dc63bf
+ccfdca2f
+087c5f3d
+7f09a3d1
+488b44e3
+6be5122d
+b04847d6
+78143a4a
+0ced815c
+efc5e2cf
+cf1ad8cf
+e0e98842
+1e409ab1
+d63b996a
+d79cc967
+46ed1e66
+873bfd46
+af763b4c
+b8deab54
+5307d8e2
+9a225f8d
+d42c213c
+412f3604
+2ee49d19
+4859ffe3
+0d8e34fa
+bd706b7f
+19aaafd4
+d6194705
+b99e9d9c
+f426f5c8
+cd3a0eb4
+a50ef3e5
+4402c53f
+1b895d3c
+0bb50cb2
+b3bbdaaa
+158e4baf
+0b14a45a
+5f0bd2ad
+e5d8af57
+4deff5cf
+132f13a7
+e8b75b63
+94fb1def
+73787f82
+00164ba4
+2fff84cb
+75ebee06
+c2089e3c
+f3cf617b
+97039a90
+7b973d22
+ac9c2e8f
+47ac6011
+21428b74
+eace49d7
+543dd239
+95f54440
+d1f0b2fd
+b642fddf
+ac5b9c54
+dd7956ab
+105d4c59
+5e43c7c4
+cd1b7031
+310e932c
+4088eea3
+d99a1a67
+05cb7a59
+14dfde81
+63248e0c
+bbd0e773
+f6e05074
+44997d97
+cd18416f
+78da5469
+9a086339
+a88c9743
+01a88896
+6859602e
+65db2d27
+753f8c08
+c58af4d9
+7905be9a
+89e133cd
+a0633758
+20da8513
+c2e51649
+2a031a88
+f4333fa1
+098f2b17
+bca79aeb
+a89c45cb
+7f0f9bb9
+35401f6c
+6a7453d3
+229db8f5
+35cda973
+b83708c2
+81cae03e
+c1519473
+1b783b86
+f536b86c
+9ddd72e9
+e4fb4d13
+3240ea7b
+79c57b7a
+64eb41f8
+aa566c09
+7a18edce
+3a5bf2d6
+577ff050
+c11709dc
+3b856384
+3a9c7259
+0fe5c671
+12eedc36
+e79fe85e
+568ff992
+99a8aa81
+8b1803e6
+e8977c13
+68b98396
+dbf405cd
+adfcc52a
+46d4b56a
+737d31ff
+7592da6b
+2b9f131d
+c8aad345
+46adcd72
+7934c105
+f09c4eef
+d7ee407e
+4f39b3c0
+c7a9f205
+1c844518
+192461f2
+2cad6d17
+19ecdf2c
+62aedd5c
+883c1eb7
+6b631d5d
+99bbf84a
+9099c04e
+ff2333c8
+91875c79
+59ced183
+868744ab
+c97b1b56
+8e1e7f74
+4a1b9088
+2c3edb8b
+e13a7974
+0bc0e6ed
+0ec1e215
+19ff9be1
+826459d0
+6f71ad27
+959236e9
+42474081
+55b266e1
+d520c3c7
+96736975
+c00f746e
+8f8cf05a
+7d4bba07
+99ab642b
+6bb90e1e
+1867a48c
+71ff2367
+aa8e09e8
+68471789
+1f2a2f8c
+fc867505
+43be6ab6
+bc256141
+e8120b24
+4e5bdb62
+43fa125b
+a04db730
+786751d8
+4ca0f167
+31267608
+ca4aa270
+c022d279
+0372bd68
+69efa7ec
+3f31bb3e
+8f999c5c
+d2e78b72
+8dd0a090
+83dc9956
+fc27ecc0
+05c4eeb4
+d74aabe6
+15e10f9c
+137fc00a
+03181d47
+aaaad9da
+2b0240af
+12d18ab5
+eaf88d59
+95c18539
+92e73eff
+cb1e4c6c
+248c347e
+53be0d4b
+7c4f062c
+4c60be6f
+974aaa98
+7e46d496
+07976b6d
+d3c977f5
+e649e0c1
+bce0c45c
+9ef3b60b
+ea796a4b
+bc346946
+1c315a80
+ceb943bc
+998b9bc2
+6fe71be1
+8f736c02
+d9b1e3ff
+e51bf5bd
+258875ea
+a9619aeb
+9d31f567
+3e97bb96
+5b83be43
+c5734ebc
+758732d3
+59da2976
+e6787c76
+26fc476a
+aadb87b9
+9d12ce9b
+40862c01
+0f565918
+ef969cb2
+73413f68
+ea26a3ee
+c708d1a1
+ec967dd8
+2a82c17f
+1d53f0ca
+a18c1138
+53bfd177
+9cf09d42
+1b75cb24
+13ef80fb
+62cdafdf
+88b7f981
+bd80a7ef
+4f5088d5
+1e79255f
+4c9e8313
+6407d951
+69cf5f6b
+d64f4ee5
+05766aa3
+913ce140
+abe793d5
+97977aae
+edd21768
+ba395776
+499e9dcb
+0f5eb504
+58d7f05a
+be9b0935
+9fd640ca
+fd14fb53
+b4192d04
+b7094596
+2202ebba
+fd0ccae7
+ac48f34e
+e3f1653b
+4e1f7b94
+db5d89ed
+b362cd99
+2010b191
+624048db
+4438e7f4
+5b97686e
+c012107d
+93cf7728
+71a572f0
+f51ceed5
+ae4c531b
+61115049
+fa365cf9
+bb168f01
+49bed3c6
+7ce809fb
+15cb968e
+5ba8ac16
+058b2e38
+f4a4fded
+6e390a8b
+d8448b98
+88206ea4
+1cf4bf0e
+283e5982
+3f3796ef
+6892d4c7
+0911832d
+c9c964c1
+e243c625
+ce8aed4f
+eb0d54ff
+577aa337
+a2482a8c
+0e438766
+b3930e07
+25c007f9
+2deff9ca
+2e420cd8
+15d51e52
+7f5de427
+90b202b5
+66a7c2c1
+419d31d4
+e51ddf94
+29377998
+e470ff63
+61af8052
+fdd5d8c2
+92b77f2e
+7d756b25
+20b05825
+daafdea3
+7833fda9
+8935467a
+6bb0db5c
+02965f5f
+c5638068
+c814511f
+4df7bf88
+086ac2d2
+82e4f226
+46031dab
+d7129972
+33f520c3
+32f3823e
+744161b3
+1564a011
+4f7de891
+db8bb103
+b26d847d
+1aeb3f3e
+f59c87f6
+80fba0d2
+ac31fe3c
+4955b0c0
+e180938c
+2e15139e
+4f2f828c
+bfd56fac
+3bf8298e
+2774fad3
+ee6845d2
+496d5192
+ee80ae70
+bfd3f296
+6a2ec4ec
+fe10053c
+d83c39bc
+7b5deffb
+52eb7ca7
+274c9977
+a42ebacb
+f117ca64
+7333d7a2
+b29110e2
+6929fdac
+5a859ab2
+48ca1b74
+7c482ac1
+fa256020
+c1ca0a1b
+e584c47c
+42880796
+08c4aa80
+5accb57d
+3b0a3499
+ed625ce6
+927d4649
+38df5213
+9d0e6c68
+f22957ba
+c33bcaba
+f6a9db63
+b6939878
+c3ef34ee
+d908d529
+0ee43fee
+3522bc6a
+bcb04c37
+2d3c9c71
+8e8b2ae0
+d257849e
+dc823071
+26c99d74
+0b5280b7
+d93e6010
+18783374
+ac043af0
+4cf337ff
+d93d5dc0
+2df79247
+995ff8f1
+efbc225d
+6c07e306
+7ca25fd2
+cbf66924
+90dfb174
+2ef45c3b
+208fb1a4
+2d6f299a
+a765ad4b
+4cdf6fb6
+34823d1c
+47791ba3
+d73ebc38
+2d1abcd4
+962f47a7
+d4587bb9
+83ef8061
+f87e56ab
+238057a8
+cebf7e5e
+4b41a8e8
+192b5981
+02718d7a
+78f92234
+1e0ad8ae
+1fc34e82
+15131181
+bf47c6ee
+aade0a7f
+a4609aab
+788bd9f4
+0d64b668
+95922fe2
+1ad9245a
+96777576
+07a94afb
+c0a94690
+7252cfd2
+9090b697
+7ce882d2
+d9a0967f
+940833f7
+2c7ca449
+39b4a686
+1bdc0607
+699034a0
+8838c381
+18a94864
+482eeb5e
+e9561d8b
+2c7b458c
+f560f2af
+a2a217b9
+2c841c8e
+176711a3
+5a750b25
+817ab7da
+8e932b67
+d481e8e2
+fa6608a5
+52fd9280
+eb4a9b83
+f971a7d6
+e1fb71c4
+f440d1c0
+d70d69ef
+b61df50f
+4552f2f6
+4dee99ee
+703dfdea
+0dc374d1
+58ab8266
+fd89d13f
+b6358cf2
+1c9f823c
+c69447fb
+3c3acf35
+25adbd2d
+53e86199
+48e01e3c
+938fe91f
+18bd2b95
+2bcfb78f
+144d9b96
+26a64614
+94b3084b
+78328c52
+c94a3c07
+85c5f849
+d1e26c69
+c658b52c
+719fe631
+f9e7c1e3
+a04e019f
+4f304956
+f045731b
+75e0d71b
+cb041195
+c01c1eb7
+b8f1b1b5
+7ff6230e
+d0069af4
+cc84adb3
+b0878dc1
+47296f4a
+90ccf098
+c31ac3e7
+3fc3e565
+6939835e
+566a7713
+c042f1df
+c83f81dd
+2f917555
+b3761bf6
+7a435b68
+06bd5e27
+f37be5c0
+f68c5128
+57b9523f
+10ddc5e2
+f72b4bd1
+825607b7
+dbafcd7e
+2770cb50
+a1c20e27
+015b13f0
+d84f0165
+14515151
+dbfc0eae
+65a22dc3
+620bf19a
+2974cc2d
+8cffe207
+68c9e006
+ef11661d
+10465598
+84bc66d0
+69afd526
+4fc18a28
+ab29ed12
+f064833f
+2dad6ba2
+2d3f0cc3
+52d28861
+e051f06f
+7cd4c2a5
+e671204e
+348e21cb
+e4eb05d4
+377633d6
+7467deef
+55795b33
+adf5875b
+3fddcd94
+0283b01a
+a35dd4d8
+8efb83f0
+92955a26
+96572ad0
+1f77092b
+bc8c9f21
+2c1a318d
+43b7cf5e
+adc21200
+175d5d07
+8e3de34d
+45922c00
+80da9312
+bd3d9389
+9002e6b7
+4618e030
+2a0683ab
+acf4a5f2
+3bfe9bf5
+8d68f0f6
+26879515
+f5204b1e
+4ef08845
+8c18d8d0
+a803a2ad
+bdb4b621
+b91c2548
+fe0b7755
+e2274ca7
+1c36a2e9
+ef9a7a47
+9b972b2e
+86068b3f
+ab6a04a6
+5adcba72
+342a96ca
+e9c32980
+c9819e4d
+040ce78a
+99177493
diff --git a/models/rank/dcn/data/sample_data/vocab/C12.txt b/models/rank/dcn/data/sample_data/vocab/C12.txt
new file mode 100644
index 0000000000000000000000000000000000000000..4553a544414464def71c643b5988b71614d6d338
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C12.txt
@@ -0,0 +1,1583 @@
+c1aef73d
+2d81ed2c
+a3bdcb7c
+9248a1e8
+a788321c
+66b1e155
+b5b8de53
+fdab8598
+4d9c9fa0
+f1ce2ed4
+fbbc41c2
+c65541d1
+a5de6c17
+94f5aaee
+c5d0e605
+84d2c673
+ee3501f0
+4f3b0399
+1606ff92
+424ba327
+192ffbec
+640cd77c
+9d0d5312
+f9138878
+fd461458
+6499063c
+5d65f22e
+ed204454
+dadde5ca
+42c3797e
+d4439b3f
+7f508e0d
+0aa6de84
+6b9c3fee
+7f658abc
+b3c6e177
+f88ba033
+49e68fcc
+140e9e27
+d89e699e
+826cb6b1
+f1efc5f6
+e22add65
+13f2b8f1
+06cf9db9
+f38b9685
+79b87c55
+be9ad4e7
+40d991f0
+131e5de2
+c198b273
+d7c5e6ba
+0a6fd594
+90dfb495
+5b97bb07
+b96752b6
+5b355b50
+cf7e278f
+84d46930
+76ad996b
+77fb35ab
+feee3a16
+957d6ee7
+ea14b165
+8c02ead9
+f65b69a3
+da27298a
+cf724373
+0d2a2c95
+5662d3e8
+b2fd56ef
+d72a8b65
+d8a69a76
+61104d70
+d674a6c9
+4310190f
+78a79932
+3aac3e1b
+35e0892f
+7239dd00
+9bf4fa6a
+c995314a
+e1756869
+28b4e105
+c385faef
+31668efd
+7eee8b4a
+0273523d
+061e59ac
+5d00ecad
+893f9e14
+ce875433
+087ef0e7
+591e3c11
+9aac7976
+0734e0df
+acb44480
+a27f34ff
+5989a764
+9f32b866
+3b90ab93
+58a787a7
+5595f556
+cadcc5cd
+9065c400
+b7a2276b
+827a0467
+c71493ed
+14d7c42b
+3af94af0
+867038d3
+07f02922
+3186644d
+b866cd75
+c534c129
+863e573f
+78d9706e
+90c76b3c
+ddd50acc
+0af1d7f7
+e145958a
+b99ddbc8
+f9de2371
+d79b39c1
+3df9cdd9
+5ba57bfa
+b9b911b1
+8fbe0072
+57c08194
+c7883ba5
+982104df
+c111ef8c
+28b79840
+e678a74d
+13bc7e19
+5a594cea
+adc24c45
+cd2085cf
+e2163351
+2bd984a5
+c5011072
+e37f4bc1
+a3e0d914
+94ee7692
+db60caf1
+055ab34c
+85a68f0f
+db21b797
+52b7a181
+7ca611fd
+ea734b1a
+c06d6429
+6532318c
+d6b4fe71
+b908eace
+72f158e9
+aa03db5c
+7ea9a2f4
+6dd83bc2
+fa34f1cb
+5bfa7585
+7a9b3053
+904e7ef6
+d6c5488b
+af0c1645
+3af38a9c
+6376dd6b
+549524c2
+e993816b
+d0624fed
+be29f7d4
+f72523f1
+877e0ebf
+0942bb43
+4e040fd9
+df0f4ef2
+cdd4f388
+d6cbb5fb
+5dac6850
+08903b9d
+a8ab776a
+6119e0d4
+d8399834
+6c83c219
+e539ae19
+8933ce59
+cbf608bb
+ae9cdcd4
+7a8f087d
+bd13437e
+90813faa
+b4e60c78
+e20558b3
+183261ff
+bafda429
+af3d699c
+317bfd7d
+69f7e502
+bb669e25
+3b3e6ba1
+e2c08cd9
+6ebafa83
+66372059
+94a1cc80
+dfbb09fb
+f898f1b8
+48bc5b52
+98c0e953
+6fd08f98
+f3ad49a9
+c66e830e
+8a098a5d
+a012b59f
+b79346d7
+7f78e6c7
+68944c23
+ba5aae2e
+a705b156
+a1364ca0
+a0cf5647
+a825c99b
+2c4d9c4c
+8125573a
+1a69f1c0
+e5b09bf6
+c8c1e8a4
+e2700d86
+5172ee67
+a389d767
+2e61058e
+6fb8f39a
+021c82e5
+6e76119f
+ad972965
+bef78a22
+8a925a1b
+4df84614
+1da0c261
+09200219
+879f4082
+f0be4a30
+2397259a
+b58812c1
+33b83378
+79d19519
+1b0c8aa3
+4d2af459
+4cce3f75
+0da1837b
+662d25fe
+3c7ba2bb
+870771b1
+673768e2
+1a30ae06
+362d4000
+9d254525
+b03f0955
+1f6b8745
+44f6116a
+14d63538
+49eb8265
+03dce2a4
+b1b76758
+a8d07d00
+2d2e6c55
+cac79b4d
+a0015d5d
+87e248e6
+cdd76771
+aa910cce
+e8c9d3fa
+271f5122
+5424fda7
+5c935f2a
+2598d8eb
+78458b47
+41d764b1
+4afe6861
+f25a8037
+02472b09
+803696a1
+5c28d6c6
+3a32a012
+9c95a0df
+79b98d3d
+96e99a54
+3c2d0e05
+bdd0bd34
+399cdaea
+93de0c8e
+76615e67
+b7fb9997
+7dcc3969
+424d315c
+7c09503b
+58eb8589
+2ad06856
+d6e3ab87
+c74f92a8
+6eade2cc
+e6deec50
+e977ae2f
+76951817
+a90d7d9f
+8b5867f2
+5ea2e48b
+6706ce51
+604f499b
+b9898409
+b0f12191
+982418bf
+eebf94aa
+ea50fcc9
+672a3bf9
+179a11e3
+4d69e263
+262ae33d
+3743d561
+0f80cdc7
+7ddda62f
+a3ca726d
+504f5db4
+84e5ac4e
+f8ce562a
+b67ac327
+1d698bc2
+1c77d5f1
+800b8573
+15d97f2a
+18922c00
+d000d519
+b87a052d
+d99fa921
+05ec3803
+7979221d
+ba9a9658
+2e1b4595
+9c5a3598
+6c7591c2
+67cb474c
+60e03064
+fb8fab62
+7b538f4b
+b51dc799
+5306e9ea
+b787d76f
+2eba67ae
+a6bfd75c
+77f29381
+b12396c8
+0706d4cc
+7ea94441
+b7f1d23a
+449c20d0
+ac612432
+a0f41a51
+bc1a3f28
+2ed7ed80
+39f1263d
+69ae4278
+5864de82
+d365a3e3
+f94df932
+b992a469
+bc69cf0b
+1ee870da
+cf7894f9
+a85f443c
+19fc0b79
+88abab80
+94368077
+784088f0
+cf30aac4
+c28f349d
+c8534259
+6d065bbd
+1b4ce856
+eacd174d
+1f54546e
+f10c9cfd
+5a92ccc8
+17c0e328
+17b9e35b
+36407983
+03afd96d
+153ff04a
+8229bc5b
+ff35e49e
+ca71e406
+ecf3c050
+05994a27
+18407a62
+d0275c5e
+f0261606
+560557e5
+f9507afe
+5cf1acf3
+2e364a21
+21a23bfe
+3e302d42
+863480c7
+59fcba41
+875b735e
+1a1f723b
+a6edc56f
+15782fd0
+553e02c3
+184d6c51
+e165d24e
+5292f047
+44776637
+b661e386
+5fc62500
+04621bb1
+58874c6c
+434c4893
+197dcb0e
+67f771ae
+a8fbe2f4
+f2ea9889
+7becd6e7
+af908315
+8ebd48c3
+bd1662fd
+e90a010f
+c9cc4cf0
+2a137d77
+34e567a8
+252162ec
+ef0c2022
+f7d97d7a
+1ac36d08
+2d878564
+1bad82f2
+9e2c7b68
+2defaf33
+af7fa246
+bc0eb380
+53becffa
+2a93094b
+9dedcb09
+7161e106
+a924e126
+dc5b89d3
+b293ab33
+f89eb8f6
+a1a65be6
+74ba00cb
+4a3850b0
+ec8dca4c
+13563125
+e221fdc6
+8a3b9b24
+c504cddb
+769e4c52
+a0ca5294
+bac9dcdb
+ba8679a2
+d54ff067
+8882c6cd
+5505dfb7
+d326786e
+2120ed69
+2055fa1e
+902ac8b1
+375d8bbd
+6647ec34
+636dcdf5
+251bc4a5
+ef151f20
+5b0180c6
+9fa694f3
+1205ef20
+82ee7fd9
+79cbc7f7
+5e5ca0f9
+c23bca28
+6c28a86e
+77602344
+cd31013f
+5594286f
+5a4db0a2
+b8f1a8df
+ab22eb88
+72e65cea
+47d2b89c
+467f06b7
+7f91056a
+76517c94
+4d934f18
+d3809c46
+f23432d5
+ee6c19b5
+bd8165fc
+e29f816a
+34765ee9
+8d33fe00
+1dae3163
+40edbbe8
+8cc98d00
+05c3b29a
+894896ed
+15ca9c06
+e0a57f94
+c8ace354
+56bf7f9e
+90edbc51
+c665cd2b
+c297ece7
+422ad4cf
+6ed1e4e2
+0f8779c5
+c12603f4
+2daad6f3
+2fe438ed
+0ee90fc2
+3ea9c523
+c05ae48e
+b191dfce
+82bfc352
+704ed80d
+8a48eb95
+f0f6a9c1
+f217c8b4
+8ace78d0
+f3adf8be
+6324e4bd
+0105634a
+0e3cae7e
+062350df
+9338777d
+eb675c3b
+fdc8950b
+1d17ca13
+6937c791
+5c45a578
+befd9d25
+34b25eee
+42778bea
+bf2f0a8c
+9548807f
+7937deba
+c79de9f7
+10536ff9
+3e72da54
+43e3a426
+63470841
+68a6d325
+893b7ae1
+1c04d4f5
+80becfff
+9f457abd
+865ad808
+8a78a25e
+3563ab62
+ac5ccef6
+50ed27ca
+3c5900b5
+96e389b3
+cd749c9b
+0dd7417d
+d2278cf5
+af5ce1ed
+63d3fb31
+709576ed
+2d72ffd1
+f4ec1778
+69a5083d
+34a2ebf1
+8b103cf2
+9ed3d0b9
+2c1e69b6
+ac6820fe
+45bbb0ec
+368c358b
+4f230359
+c703e271
+0bc49df7
+dc85594f
+c0e6ed5c
+257fcc4d
+104fabc4
+156f99ef
+8f4497cb
+ddc05636
+19728a2f
+45e9a7fa
+a312e1e2
+9ccb63a9
+d9b23502
+6663c4b0
+179183e1
+57676afe
+9fd03f62
+efa67886
+06b297d9
+41dd8d09
+34645d5b
+2dad7b23
+961e6d6b
+cc239583
+d319cb43
+b8ee36fc
+0e26b386
+c15e7f3e
+a5571b9f
+19ca8e3c
+f9cd9fa8
+e3c15540
+e2c291a5
+2a98cb06
+7364e701
+6a9dde60
+9158b5a5
+77348965
+9ed8d6bc
+39c0b4ea
+8481d649
+24d58844
+5f27a931
+b35573cd
+28f7eeac
+dcc9fc37
+f0d5cc59
+5aca94e6
+792e199e
+6169f967
+1a4ffe88
+fca38b4c
+8b7c2178
+c658f48e
+6453e163
+857a49b0
+df8a4f07
+47e8c514
+01d4b4db
+8c03a09b
+ebf6ae0a
+18b86fc6
+f6e5b9ef
+480bde65
+8c5ece2a
+1ce23264
+2949c943
+de898612
+4bc80e2a
+bca7e012
+6536f6f8
+d4fbf673
+230267f3
+aa03d2ee
+d1fb0874
+eb5ff98b
+f4afbdcb
+e31997ae
+7d8db404
+71345146
+3565df81
+ef6f097c
+e9715419
+adb4a533
+8ae5e221
+bb93f61b
+1ca7a526
+fa1eee27
+590dfbb8
+4fd67e13
+c886a342
+680d7261
+855210f5
+22cad86a
+0f89aec9
+fe9d0c5b
+9aff23d7
+8546f1ef
+0a1485c1
+4d360c97
+d75edbf5
+041d9426
+9be71e89
+f8ad4a41
+a5f9a198
+68a5e351
+95812a33
+4a67f833
+a93a2643
+0869abd6
+253c11a9
+fc75a704
+d754a848
+165dc2ce
+82c3b58f
+0da36732
+f2d5da00
+eff214f1
+cb635ae2
+67b31aac
+641593be
+33f5356f
+89f7067a
+38588c6b
+e97cae00
+45b12a68
+3bef6c32
+56e1a6c9
+d5d86dcf
+0a665a51
+29dfcef7
+80441957
+dfc096e9
+a3a9a46f
+8396e6c0
+50a702fb
+d9714c1b
+bf1f4a02
+9a6d5824
+a2a80116
+2211576e
+b94e4985
+536a6954
+bbe97cb9
+7d7c6076
+fa8e4000
+ba2755e8
+3493a15d
+d9372f55
+42bee2f2
+a1cca232
+bfd22f6f
+719229e3
+ff80ecd5
+9b37f3c5
+21161865
+af97a54d
+a6f57f5c
+2098d925
+fb885aa5
+edd5a0ae
+8e15c24e
+c12fdf81
+a719b8eb
+38fe7ce8
+bf474c97
+b770657b
+41b57a7c
+304cec1b
+6cae5a9d
+1a5d089b
+9f32d017
+3bcac28d
+e1f3056f
+eb83af8a
+2fd26f96
+4eb9c398
+a779839d
+342476eb
+a0a1b9a9
+389eb0dc
+5c55d0e0
+4b715f92
+e09a8161
+92a663ff
+16c8ad7b
+da7b77d1
+b411cbc7
+f47c2c6a
+7441161f
+eeb07caf
+1d27e688
+1fa058ba
+d60edc65
+935327e3
+c6e66003
+bf2e5f33
+6f2d0b37
+4248353d
+904fba4d
+dde73338
+3e86dc36
+90db28fc
+bee3806d
+f42f28fd
+5808459b
+6aaba33c
+1ea443ff
+e2848a9a
+72ca6191
+8c92f967
+fd352d95
+32b1d348
+276f3fdd
+e5c1db3f
+1e5bcb5d
+f2bf9229
+b7c15dd5
+23977c44
+cfc86806
+5a1f6612
+3ae289b3
+71c32035
+cead3a62
+0a48382f
+a3c23ad1
+62ed9a91
+ea83607f
+942e2302
+129cc160
+2df201af
+f5d84254
+2f6a7c52
+d8b234f6
+4f4316df
+c7a69a3b
+7a27d4e1
+cfe25cb7
+83202629
+34289160
+21895686
+24da5932
+24d54eae
+a0202926
+81f69a8f
+9132d455
+424e28fe
+3beaf220
+2a6bd999
+11fcf7fa
+75587697
+703226ec
+bae6c746
+8827de87
+5b119c47
+fec0c6ef
+94235c77
+20622e06
+254ca0d7
+e4e111a1
+ed397d6b
+5c4009e6
+d083c277
+23bc90a1
+4efceca3
+50165667
+43c46d83
+e4734e87
+450f33a6
+12711fa0
+61ab0c4f
+18b125c6
+0826f297
+76c72828
+1731f3db
+0e43ec3b
+2630b570
+977d7916
+d7028959
+ed9ffce7
+ebb533c7
+8cdc4941
+4b625655
+0ff31973
+30111d93
+11f44afd
+f8b55668
+1d6bfcff
+e0cc99f3
+ed5b84d4
+eafd7f0c
+4c31606a
+fdb247ed
+516c3bad
+2a6d37f8
+174d825f
+69d2de5e
+176858ad
+2014c1b3
+5332e3fb
+10e9872f
+afafa62a
+5c84927d
+9f2dff26
+995e4a74
+e0b0d930
+3b917db0
+54cb84ab
+2f671908
+0301901e
+8249520b
+6d4472d8
+2c5e6524
+7127a7d6
+18e40a04
+92d569ba
+a0c32c81
+27987cb9
+376a4f52
+95b585f9
+8527be14
+82665b78
+07bdbe6b
+2c7c4bba
+25ca9201
+7a97a313
+aaef34d6
+951fe4a9
+0e528718
+22608499
+b015ed6c
+22142f68
+e657c595
+e9562baf
+d19c068c
+f68514e9
+539c5644
+4757d03e
+d69079e6
+8a4d8e46
+8066b103
+ac639f12
+b0fc60da
+a63d1ffa
+07f030ae
+514fccea
+39fff66b
+4f25198d
+99c1c8d3
+574a31af
+848b4d84
+75c79158
+167ba71f
+71d55d49
+d3f30591
+55ffe9ca
+255699b9
+234f825d
+18b9928d
+64c7b338
+0f4f2db4
+8065cc64
+76a79c33
+f39ad7ae
+b5d8545c
+e9521d94
+c9272d17
+a5b4b78b
+1c9ce10b
+260ef2b6
+ff838771
+a6554629
+4f879f5d
+49fee879
+8a11f111
+aadcb74e
+15b28eec
+fe18e0cf
+0096db77
+ff8c6fd9
+f6fe1d50
+eb2a7aad
+0f660539
+8e838324
+838b6876
+4750aa00
+6340457b
+5ca5ea59
+43183389
+d010a692
+1d29cbf8
+97ad9659
+7023b7d2
+78022e82
+e7b4ee44
+a7d3279d
+5f6ade89
+0d261508
+548118bd
+88a8f5f6
+23ae741f
+44a0cf7f
+940b6e42
+ce197608
+b4928074
+98eb92dc
+dd8781ec
+26ffc51b
+74906f90
+676b1292
+25644e7d
+4ea4e9d5
+83d7d5c2
+834b5edc
+32e1a215
+06bd1916
+a648ca66
+f5d19c1c
+aa787f00
+5e78a000
+28283f53
+6953b65a
+dbc14f56
+154f316b
+11d928f6
+4d144b7d
+34a77493
+7c74405f
+02d20151
+52e91b9e
+2c5269ac
+a3391ca5
+f6d35a1e
+30e909b6
+3ad1f48d
+2eaa6a44
+611d855e
+e7378c05
+cbc58cbc
+0705b078
+c889eb07
+e333d643
+511e6df7
+5deda06e
+72df8b8b
+5091db5f
+4866f28f
+af487b63
+b70bfe13
+77145fc1
+ada7568a
+7452a48c
+7b9e9937
+d8bf293a
+266666ac
+13859eb7
+7307c19e
+cc99b33c
+b938df15
+f7cbe917
+a6ae1271
+7414723a
+c47972c1
+8d6a9f16
+ce7ec713
+9e7aafc2
+397675e9
+d712779e
+afdf9a65
+fd685f34
+623049e6
+5f4d1c67
+fe2ae07d
+3f66c36e
+e25bdedd
+fa6b2a51
+b2526fea
+54b7a508
+98d78b2b
+e2e599a7
+479cc2e1
+16191617
+f2c35aba
+9a103204
+0bcd01e8
+59f3245b
+dc31f3dc
+c228f276
+8c5ebb31
+3b9ae062
+688190f3
+ce2957ad
+fde18531
+b3e118e3
+fcaae253
+870620e6
+ee4cd37c
+7af65151
+15b7ec34
+ca193645
+d7f5cb55
+0e431092
+4c6ddb1a
+0afc6bf4
+23121637
+7eaa8ae0
+8287dc29
+7b570c07
+87e6874d
+c5825e99
+a1dbab30
+381dd9fd
+476973dd
+6ee4e8a3
+7ec456cd
+ec90808e
+80fff3f2
+f3a55aa0
+4d72208f
+1f27c37d
+12a4b15a
+6f92f1c3
+55f78a0c
+9eae5dcd
+f3be137d
+3b7201ca
+e4c074db
+374b2880
+2a411f2a
+6074c21f
+4c000893
+e0d76380
+8eb3f772
+c198896f
+f96cc0b9
+21a0ef88
+7b79c094
+d28c687a
+c697eb6d
+299582d9
+c35a10e3
+f36c0ae8
+a8700c60
+a53ad6d6
+2e4c88c2
+b853b799
+14776f23
+2479f13d
+6a767367
+23f7905b
+4f47c10e
+661c7493
+2bf06664
+62f43136
+eec0e5f1
+619f887c
+7e01f09a
+f919769c
+d8c29807
+20ea8abf
+fc95c453
+c1054c52
+8e1de7db
+183aee3b
+a2722ce4
+75ed459e
+17ebf5d3
+b9736368
+4802d2e5
+bf5cf2ca
+4132f6f1
+cf681365
+eb865f73
+ea803b7e
+7aaa871c
+ce298f58
+e8934d81
+87408e45
+23253c33
+171c9373
+4dd1cabd
+343da6e2
+513555c7
+2a064dba
+f5d83e6f
+cd9f20fa
+ea260e89
+1edfa625
+8b82c64b
+1d00cbc4
+d2a980cd
+80c43ab4
+d73d5e92
+c2bc8f73
+98922d75
+d898dd7f
+3ee41b39
+93bab460
+39852814
+241692b0
+4f31b8af
+4f7b022c
+7e1106ce
+7a09282f
+496d0124
+b8b41318
+34a238e0
+d07f4262
+9e7f897d
+7078d5bd
+97bcec27
+f91fd7b0
+0a3254f1
+422e8212
+58d90787
+5a962bf4
+18fe96f5
+33ad9b8a
+993f2992
+9b665b9c
+ed5cfa27
+91bc2f51
+19199681
+7e6c78cd
+b438fc5b
+325a69ed
+e58930d5
+eee3943b
+4f1124f1
+69d0b693
+1fd56e82
+336bb1d7
+66a76a26
+a9580b00
+7b8c2a24
+8376bb1e
+8ff467ea
+a70fb4d6
+985464e6
+cad8f137
+349450bc
+981eb85d
+dcfd94f4
+93c5771e
+50ec33a6
+4c8de1d7
+e2cc7c06
+3953854a
+c78e8461
+e23a52b4
+1a3c8178
+b456a550
+d9455394
+c07fefdc
+bb0b487b
+6fbed051
+ae31f81d
+05ccc530
+9da25024
+716265a1
+961d324d
+2edf6ee5
+80111036
+21551ee3
+43fa7203
+25b02fad
+795646b5
+50b7166d
+2a4ef823
+ec985270
+7951b860
+91f646f3
+1cbc1420
+cb0b50a9
+4fb6247b
+a9d84aa8
+67995a64
+d9a2039a
+e1b40567
+ac0e3f62
+de1e5036
+d818f210
+c6cd927c
+c857bb2b
+c5244b96
+47a0e9ff
+e23891db
+ebfb225c
+6c6c6fff
+3d40540f
+ca4409a4
+2828a546
+276c03a1
+8fa23f87
+8e9d3e55
+659daacd
+d0c76b17
+548cd6f7
+9a841f69
+bf413137
+1d36bbc0
+64e10296
+657e3250
+bd7db808
+8e5c8813
+6a3d6e52
+58418528
+fd6e6bcb
+87c5daf8
+df423e6d
+0956dfee
+488af1d3
+27d47540
+cdac3d6f
+659ed597
+5debd38d
+e4df8c0b
+7ac21686
+fb9aaa0b
+df78683f
+2b0a63d7
+38562f67
+7e2167d5
+92969770
+0648c174
+5b24eb53
+be170224
+a4d0248c
+176cf3e5
+729ecc4c
+5f15d820
+b519c595
+8cd04d30
+06a5d61b
+720361aa
+ae1bb660
+ede54a1f
+1d4d1a02
+a48e62ea
+46591921
+96e4de4d
+a4425bd8
+f34e8f6a
+8747d4c8
+6741a372
+46e4b83f
+9f8dbf10
+2215dddc
+f3b2e496
+4fc9d001
+adf02f65
+a7217873
+79560450
+ca9f3db8
+f1760b27
+bd7eec69
+e3ea3d05
+ac7833d4
+c64bd5c0
+875e8b1c
+47fa5313
+3d19618a
+92f519ec
+2d15871c
+9699b949
+218cd107
+ed357f1d
+936a332f
+e2a965be
+ba233d4f
+d19e10db
+dd727b99
+3bec5d45
+414d3dc9
+bdd6affd
+50391d84
+38176faa
+06ec1805
+a9842147
+e0b2c12a
+e1a883d2
+d9fc673a
+37c28e63
+defeb71b
+d24c389e
+18b3794f
+07cecd0e
+be45b877
+732c8db2
+a9ecf335
+3bc47171
+6803e296
+d6f4cc32
+5a276398
+06fa8096
+a0d2c974
+ab5549ec
+eb8ded57
+c1dfa649
+34db1f06
+24c21652
+ee11f1eb
+1938e765
+2a5bad22
+66da8e94
+83a0714b
+63b34abd
+43de72f6
+d26682f5
+e29f2fcf
+06862b4a
+e32bae69
+33bea160
+8a8a06bb
+a325f22b
+b9bee1c2
+ad1492b0
+534b9be3
+f87e889e
+f7e43e31
+a2f4e8b5
+ca0011c3
+277cb5a2
+b9f28c33
+931a220d
+3f9cbb6d
+4d9cca5a
+bf28da20
+bde06ba1
+14a72480
+9cc98ead
+160e1be5
+d17f38c6
+507ba343
+2f06501b
+7fc6984c
+6f115481
+e5e1ca92
+0917c88e
+74563ec1
+9efba788
+fa6037b9
+28c51437
+4454ea3a
+f30da081
+9378c2f0
+2a73af0a
+8ea37200
+4919711a
+7c5bf905
+c2a85cd2
+a2418cb1
+e3af97ba
+e7edcdba
+22b8ec76
+f1313990
+ad365bb8
+8529d3b4
+1e087995
+69873012
+2436ff75
+53bfa6f5
+c47979b9
+49507531
+48f165b8
+3dcc2635
+a08bb4d7
+6328e826
+f161c883
+f1ffcff0
+31dad9c2
+21084397
+4cacc5df
+dea90c39
+8a582d5b
+92ac7e35
+4c6ad4f5
+6bca71b1
+ef8ef09d
+3263408b
+fe7ca040
+28156fd4
+3de41f4f
+aa6b7700
+c30bbcd1
+22d6a7d8
+5565c8ef
+1bdd5433
+f5b1b647
+5f1d379c
+90d80d1a
+c3cdaf85
+fb991bf5
+70a58a12
+e8a073ad
+75c17e23
+28540af6
+8c30321c
+520c7935
+ea374c10
+7d9b89ef
+8b8cacf5
+77120ffb
+71fcc39d
+8fe001f4
+18979573
+3512d101
+9712b4e7
+ace843df
+610be298
+47a3aa35
+1c940221
+4e67d4fa
+3f8aec46
+5c942915
+048d525a
+ffcecf14
+cae2b00f
+d08dd3b6
+44f0eaa2
+612905e5
+1b2022a0
+a5ee56ac
+3096136c
+3da52313
+cb7fa42d
+ed5c5acb
+654bb16a
+bb8c28a0
+7609a1e0
+15ce4793
+b4339699
+7ce3c66f
+d8988024
+63e498fc
+394b6320
+75529ad8
+aef7b37a
+d89a0026
+ca4a4da9
+24e81557
+e8107cf4
+83be0de2
+d4dbce44
+f9d5effa
+f3dbd9b0
+f24b551c
+e599f97e
+0cdb9a18
+3020b608
+d05bb1f7
+7d5a4791
+0ac9dec6
+97cd01be
+10ae0788
+55b4ed2d
+a7069b57
+c3962347
+e7750f14
+526f00c4
+a7ee0769
+73c7614d
+c260f168
+138527f4
+867f94c0
+f56cc84a
+f3d25ecf
+b1121caf
+b77b9c57
+04047220
+c9669737
+dc906891
+3f90e6f7
+32528850
+91f87a19
+3a09994f
+9441b6e1
+291a0d2a
+c60752a9
+1241e747
+136babe3
+9556ca08
+5f27bc59
+3d6857f7
+d267d56c
+78a16776
+11c577b0
+ae70dc88
+3fe585ab
+3cc6c8de
+17bcc684
+7aeafd62
+416d6970
+05e13793
+8d42130e
+9311a066
+b79a0855
+1bb74bfa
+7f0bbac3
+fd2387f8
+612652a9
+a165a27c
+be0e156a
+b3a93317
+02c27dec
+8bb91717
diff --git a/models/rank/dcn/data/sample_data/vocab/C13.txt b/models/rank/dcn/data/sample_data/vocab/C13.txt
new file mode 100644
index 0000000000000000000000000000000000000000..204761319cc5f7a33fed867a375c6c512247cf53
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C13.txt
@@ -0,0 +1,1911 @@
+8f94bdef
+59be1107
+3b03d76e
+ebf437b6
+20819d96
+74c6dd28
+615fcd3b
+02c899e1
+93624a8e
+a85a252e
+fab2664b
+f6292965
+474aef6e
+0d8d4492
+60d783cc
+dbe5226f
+85fd1cb8
+b17372a1
+c56bd951
+5d75dfcf
+d30547aa
+3eb9539d
+2b30823d
+8cde53cf
+9e1f2dd1
+a5975b1d
+fc5c7b5d
+ca051874
+6212bf27
+83d6c557
+1d36488f
+b2db9c22
+cece4673
+613b2c28
+63f28c33
+769a1844
+30735474
+12d7c50f
+a1354fa8
+43881ace
+0159bf9f
+8a6ad3da
+f7177056
+d3cef96f
+9196db4a
+4fd5719b
+768f6658
+878084bf
+1d0f2da8
+7f3c5570
+a39c6d4f
+6b5d07b4
+13708b2b
+3bd6c21d
+b081141b
+917e9bf3
+3a129657
+d7c5b499
+7ffad306
+b2e5689c
+a4b04123
+47727147
+0464ad73
+00783c5a
+60d87974
+a3fda569
+f5b0cee8
+e1fb2a68
+61f6145a
+3c69efcc
+da8bc9a8
+077640f4
+3a15d952
+6c24bc52
+706a14b0
+0463405f
+5f43eed6
+18a329a1
+b59ccc82
+8828a59c
+9af2727b
+2ea53cd2
+bc1e82c6
+186ee908
+01305c64
+6df6ef3a
+30efa172
+d7339e81
+0ed79272
+d2420e4c
+86c79eb0
+f61e5576
+8978af5c
+cd13043e
+9325eab4
+592afbbb
+5166abc7
+84550f35
+d66aea7c
+ea4ec703
+bb7a2c12
+52358aab
+85fcb68e
+9be66b48
+7773290c
+a5d233a8
+3600cc6b
+e3543236
+b6c4d6d8
+859523d3
+d06e4993
+3301b467
+f05a093b
+45cbedc7
+8159a24c
+79128231
+383a5973
+888b8320
+e9332a03
+2c6b3cbd
+8b6e2cdb
+010656c3
+4f1b6347
+c42f523d
+f405e2e8
+4e997c84
+247dab93
+0092602c
+10114fb4
+a4fafa5b
+40f3111a
+ba5dfacf
+a2d0d3c2
+18fc2b1e
+252ee845
+62a53816
+4f8e2224
+5c7593a3
+7700c0a3
+7ca7ae3b
+d54f0ddc
+7150d0d7
+cc6beff1
+cededcbb
+61e43922
+ce5114a2
+934b2395
+d7ce3abd
+b3ee1907
+20c03ad2
+07fdb6cc
+4799f4bd
+14da3023
+e21429e2
+c6ce90af
+3f17c1d7
+ab30ccf7
+e6d6213e
+ff3ffc9f
+42a6d751
+8ef1b3d5
+a22b1944
+ef6a4d03
+ed738fad
+605bbc24
+bf568607
+2bf15304
+576d0fbd
+581ec7a9
+edcf17ce
+8846a878
+7301027a
+369e2611
+0e02bbfa
+991fb45f
+0c66bf77
+5bed21b5
+65977207
+bd21b34a
+dbffa30d
+ba781108
+f2f583e0
+a4fb9828
+44b2ca79
+6e1e209e
+722a41c8
+a15f7072
+eead0438
+a4b5da60
+7f58d895
+ab71b9e2
+a97efc05
+7f9db67d
+553803cb
+14e6fd48
+098e937e
+3b1c934a
+c5403e45
+7f3c0396
+07e4bbfd
+799a3beb
+58c02306
+78ef55d4
+174e4cac
+e581365b
+bd91360a
+b7f43038
+519a3843
+7fdb534a
+d6d04484
+4c968cf7
+3691e3e3
+0ee45b82
+41a6ae00
+02e4b2e5
+6b4b1a1d
+7fca81e5
+739ec2d6
+6abad6bd
+ee568de7
+6e2d6a15
+a4c5d6dd
+456bfa31
+115d29f4
+1ffceffc
+e97e78ef
+950971dd
+075ffbfa
+0a8a3875
+f7af2412
+0e3b5f2a
+e0230d57
+7d1f1fa0
+7bbf93ce
+0c7dd611
+cc19e17a
+a3704113
+1a7b6bdf
+c23c531f
+fea69ce3
+a0874a81
+7b903000
+e332aa6e
+f0e0f335
+2d0ad352
+0b29a94e
+9a02f9bb
+cbbfff55
+df5886ca
+fe8714f8
+8145d857
+f562c5da
+c8c687ed
+e67cdf97
+a60a20e4
+c5ee4e0e
+a536f2e1
+4f270104
+1211c647
+d1e37101
+5c97559f
+9259d03d
+034e5f3b
+aa52b3b1
+849a0a56
+8407d8f7
+41ac5c91
+1bfa372b
+3bcc4b70
+7203f04e
+d7020589
+b998fb9d
+0f565535
+079c7d25
+7ed1487f
+67414029
+036dfdef
+881b53f8
+4a83d7b3
+b118c64a
+954f731f
+bcc725fc
+8b216f7b
+d332dc5c
+4940d667
+f65030f6
+d48de876
+0a9ac04c
+9b656adc
+ce418dc9
+a3b89afc
+470effde
+ae73a7bb
+17560ca3
+c2e887fc
+d15a5853
+cef1cfbd
+566e88c0
+e99759ed
+07828d4a
+24e01e63
+5e1dbb3c
+7aa2166f
+fe3942e2
+dd2ba37e
+2f6d429e
+fb22a91a
+ea519e47
+1e9339bc
+a0b8a352
+4fce4e51
+7380f035
+78b1bfe4
+9da67560
+aa6a9855
+95059b3e
+ad5bcc65
+9f53c224
+e8d73025
+1498fbcf
+951771f8
+654099a3
+596a2dcd
+f745e01e
+afd04e3a
+7db6a946
+1577a179
+f59fd52b
+269889be
+9de14311
+18041128
+ad7be027
+f6cf7834
+b331dd8e
+e9b9711d
+b6ce287d
+1d92c5d0
+cbd38716
+80467802
+4422e246
+49183eee
+8d02a2e3
+1b63c4d8
+68a51fd9
+d5e6f555
+e1d6ecb6
+01c73616
+9bbdb8bd
+1503f844
+1fdfd0bf
+1e8bfb9a
+cb87a1bf
+6077fd1f
+7410a24c
+cbe5db28
+2a2faae1
+71b17693
+03232503
+90cf8234
+6f2df0d6
+c95c9034
+f61c8891
+18a5133f
+e1b62f8f
+38087489
+4ac491a9
+706d0b7b
+969475b3
+03d0ba0e
+7a7b8db5
+131341f8
+96fa211f
+51351dd6
+be029482
+99b38f98
+d66dc019
+bd45bc75
+ef800ef3
+c0bff1ae
+0ccfefd1
+42ef23bb
+43de99de
+37ad9ef4
+35bacbaa
+aa1eb12e
+19fe98bf
+3f9a68d9
+f82c43c2
+0d5dd3ea
+45820f61
+4dc59597
+163b0c2d
+2180053c
+2839ca0f
+6e8ef725
+278e48a9
+d55eeb0a
+7f5bf282
+7777d16b
+e8e7aace
+d2bd7bb7
+d9b4659c
+533bbf82
+0110dc36
+0b968df6
+47cb697a
+971b4ca9
+9a88e2e2
+09cd9f24
+4bca2fbb
+412be499
+eae197fd
+1d27b635
+30ce0b45
+78644930
+983037a4
+36b96ed0
+905d1cca
+d70433ea
+dc1d72e4
+f1c59e29
+80dcea18
+679fcee8
+814907b3
+b2e0e821
+bd727667
+882005a9
+5f3a0c1b
+18e04ccf
+e1448645
+fe528cd1
+2b590d5f
+345cf499
+8e7ad399
+f3437620
+e630cd5b
+5e419718
+9c7a975e
+05325702
+e2bab9cb
+822b54e1
+a36387e6
+9ec97065
+c1b2cf01
+ec644921
+7351d888
+7928af93
+94bdca27
+df2c6750
+d69427e9
+5b30510f
+a2f2ba1d
+d1f9e505
+52ec5ece
+772a00d7
+7eb73375
+b017b046
+a2484c90
+81621307
+31078fbf
+3f5498ce
+cb2e33ed
+42148b26
+31e0d702
+5fffcbe5
+11fa2c12
+d7ccab4e
+7fc3134f
+9106f5bd
+01f32ac8
+39a06276
+c3871809
+ce8b8587
+094e10ad
+df957573
+d1be539d
+f8525b14
+05fc8d90
+0b1e410e
+d06bbcfe
+cfbfce5c
+fdd4a668
+0067ac1b
+47d6a934
+b71128bf
+3a59b47b
+f59826bf
+07a271be
+d5d2b430
+158b69db
+31319767
+bb97f4d0
+4d4c02d1
+d100e0ca
+60d930f7
+86a3034d
+27dc8af3
+031ce614
+4632bcdc
+c9154658
+34e72b37
+9faf125a
+9085af3c
+6969cd35
+2d0b0698
+2fe66320
+9291ca36
+ebc0bf67
+fc371461
+e5086096
+3796b047
+1ddad6aa
+5fcd3498
+b558cfc7
+e6fc496d
+3a9dafb8
+2bcf8620
+2648f8ef
+6879d1cf
+f04d3c1f
+d86616b0
+0e5bc979
+aca10c14
+4ceb7e09
+0afbe2b3
+aa581d26
+b7165d79
+e7c6a03d
+78864591
+33a45f17
+5667b6ce
+d13ca58c
+943ed02c
+b61fbc43
+fc5dea81
+65addca8
+07ee0156
+93b18cb5
+8d02f360
+c4d1c70a
+3f813a5c
+eb11180c
+f849e1ee
+ad2b09a0
+c576c612
+d6b339c9
+c4bf1f3a
+daee1f01
+67dfb6b2
+a341d3ba
+b96c41d3
+527cd8b9
+e5643e9a
+b0bfed6d
+519f294e
+37a1d9ea
+552f61a7
+32569bcf
+eba35887
+619a03e9
+92f01d0c
+9999f835
+1c1cf101
+2d451cad
+427c3888
+df708737
+bd251a95
+72a5be57
+c0bc5873
+96077837
+b484ff1f
+e0a5ea97
+87483b16
+c8cde1ed
+dee21837
+bae8a0ab
+a4b00bfe
+7bfc137a
+ad650332
+97fd470f
+9ded12ab
+4c628bcc
+6ed5acef
+afc0f047
+ab8c2919
+9233128b
+18a32ed4
+51caee91
+ac44a253
+e506d458
+406194ca
+d6b5c454
+dd2f45db
+bf8a142f
+5978055e
+d999ee7c
+40736183
+245827de
+2ecea536
+3dffcbec
+1750e95b
+74838342
+8da63ccd
+17c1f5eb
+9971fe3a
+35323fda
+ab04d8fe
+715b22a3
+6af44dc9
+8f176d1a
+58e74fa3
+b079c917
+df69ace5
+ea1f21b7
+1e06fca2
+a7ccf9b4
+0d60a93e
+af8f7f71
+1c448f8a
+bcf6a386
+f47f13e4
+2ec4b007
+11aa1375
+ec2c5cdc
+4fd35e8f
+a107e633
+90a568bc
+8752b356
+422ba909
+5774a041
+7e7395ca
+253f7c47
+5c069cee
+a4c7bffd
+140595a0
+8abf8160
+4fd925c1
+49fe3d4e
+bcc112ba
+8dab0422
+373ce97c
+522119d6
+6580b282
+55f96f80
+93d61944
+57960a15
+510f15b3
+f0b9fae0
+f9cef5cc
+e40e52ae
+ddef3914
+e5cf62b4
+5579ddc3
+aec8a59d
+af495ff9
+31a4e9cd
+5b56b1f0
+9ccdbd12
+f61cb57e
+9619e9ff
+d4098941
+03cd10d0
+18a5e4b8
+8885f5a6
+d52bc10f
+a4456f70
+4e492ec5
+b9465c69
+9de8c1aa
+6205d58a
+627b2207
+2cbd8bbc
+46f42a63
+ed469c28
+c08ec9d3
+c1dacb89
+c42f98df
+d8c94760
+f0ba98ce
+41a35133
+2a922a5b
+e7a57442
+493fa4dc
+395ab872
+103a86a6
+129be74a
+538fab05
+0721132d
+b6b1bbdb
+c278016c
+a4586960
+740c210d
+3a6d18b1
+5b6ee19d
+5e50db79
+4d482f76
+1c63bbf4
+86684160
+cb5e9ff4
+5a2964f9
+3cfb0207
+2985dfa2
+2e83bdac
+a0abbb5b
+569507a4
+04490a65
+1444038b
+208257bb
+65b6dc96
+4eb5dabc
+d52d8fc4
+3a09834b
+772fda73
+d14c9212
+da9ee8bd
+bcb2e77c
+f0f0e3cf
+09ee4b41
+f81d2620
+5707d7ee
+1f88b095
+98e78e7a
+633b9be7
+f88bdc06
+b6eb5d96
+057e2dd3
+e7cb1184
+a03da696
+5a012d1a
+4e6891fc
+b8065f15
+a984ac48
+d90d259c
+05287a9b
+72b6061f
+b8eec0b1
+79f7177d
+9347b4a3
+5029598e
+ec43a8f7
+751c7a99
+e411c4db
+30cbe961
+4d8e7d98
+2dfcb44a
+ee79db7b
+b95f83fa
+a8374791
+2f8b3560
+4956d4e5
+b59b744b
+6bdee646
+685b846d
+64533206
+0495019f
+3af886ff
+c3f71b59
+bcb03896
+56b1ebb5
+bccbbffe
+34fc0029
+e7d6ff27
+1bf19848
+aaa80b97
+89dfa9ff
+8c1a3ad8
+9f24464b
+1d1029fe
+4c288eb0
+329fa3ab
+9d8dd43d
+095af3d6
+e987b058
+274f4e12
+aa46e155
+c0e6befc
+8e6dec32
+09fe0564
+fa17cc68
+7846e329
+72364a9d
+dd183b4c
+ea089f5d
+fde1645d
+609cd654
+22591fcf
+a553219e
+27d6e0dd
+3955da49
+ab6afcff
+5366bced
+53b60829
+74f5ab05
+d83ca122
+7dae0941
+65e5e5e5
+779f824b
+0fb78b80
+28499975
+24d85889
+9a9c8717
+f66b043c
+9669f51a
+f6aeec90
+81b27955
+7eb57a21
+f7ad89e1
+d0967f41
+f27ed3ab
+af545dd9
+870796cf
+7f0d7407
+02254f5f
+9cab1003
+21e58fe4
+84def884
+ee341e2f
+d87c8d9c
+61c65daf
+18f84563
+19e8bfd1
+6b1f1538
+38016f21
+839d50cb
+f30f7842
+e8f6ccfe
+4a2b4a54
+5b2b6068
+814c91f1
+4e4dd817
+6671dc76
+5ba7dfaa
+5bfca2aa
+5d235d7a
+16aa5daa
+5eaab985
+a6116275
+a22449ca
+5d111255
+a8840224
+153f0382
+2891c67c
+63db155e
+4db253a1
+5a2f4542
+cde3ec68
+a09402a6
+19e253e4
+bad22647
+e65ecb20
+c9dbd0bf
+ac5efd91
+f3a2b604
+fe546f75
+78d8f465
+978940b9
+90c7f9d1
+ec8fc416
+12880350
+8f7e5dc7
+3cf672d1
+2c6a1e9a
+9b66fd14
+328d63f5
+73012b1a
+a638f0a0
+c9971f88
+4af5f8c2
+bc2d81a8
+a57cffd3
+cbb8fa8b
+1e18519e
+c0d12152
+4ad8a654
+2824a5f6
+c8e0a818
+2afe74a2
+1f9d2c38
+c0c5f46b
+aeff95e3
+54377e0b
+11b529d9
+0d86025d
+2b54e95d
+7df3a6c1
+9c5fcfd8
+aa655a2f
+33c5a752
+c8c105ca
+0fd466ce
+dde839bf
+0de3b27a
+e55dbe27
+f9074aef
+4c8b03dc
+23cab7d3
+bd664eee
+8f33b365
+addc3db7
+055c40a7
+76dfc898
+dc2ea919
+58fe6044
+4004a1be
+953707c2
+e0614341
+976896e2
+70728bf9
+5d00fa56
+b132c838
+d53bbf5c
+2e551bbe
+cd2e6eaf
+95793ef0
+66e6836c
+6f1ddc34
+952c94a3
+31b42deb
+cb655a00
+f7be54b0
+852588b4
+13b025c3
+2e21cea2
+f695651f
+2fad1153
+468f0632
+b1c8c31e
+e19a92ae
+9416a599
+bb567ffd
+dd52a028
+9b706dc0
+2fc3058f
+9e571700
+9f2f9c89
+b808e73f
+db4c1ae9
+30f114f1
+9b73c1ea
+5a197d48
+e1e391fd
+63f0b970
+d38ebad8
+17bb51e8
+4350b107
+679527fa
+2d9819ce
+e392d05d
+79101aa3
+837d93f2
+096b841f
+9d722607
+4ff47081
+ad978c4a
+649c08d2
+51cb7f1b
+916be09b
+4cc724bd
+9e70b72b
+6665daff
+b0afc252
+23a62dd5
+86ac5919
+879fa878
+a98bc52f
+437a58d6
+c0ccec72
+446235be
+1930cae2
+4b7c8d43
+34098dd6
+b0c30eeb
+70e0d2b1
+9ff6dbed
+86c2fa69
+ee6824f6
+dbf42252
+59d8869e
+7ce5cdf0
+3b5e3853
+d2b8af4a
+1f72447b
+92f2a33b
+c058c33f
+c66b30f8
+949ea585
+0b5f5e58
+bdbbf16c
+176d07bc
+7b027a2b
+ac6687f6
+469f32d5
+61c8d479
+2dd1e24a
+ef007ecc
+df7a1f21
+fd24f170
+eb9ed7ff
+e8d4ea40
+79ae8b9a
+d7ccce7c
+2cefc7b0
+a8df3bc6
+fc0d867d
+d76cea6e
+42156eb4
+0b7c691d
+11939cba
+e62d6c68
+0a26070a
+79e86662
+cf4e333b
+37bfaf8b
+8c8f2bfd
+f303d0ba
+5e350f6e
+435355ef
+4818d449
+3966c8cd
+b1451e1a
+04422054
+3fe840eb
+95bc260c
+f6224065
+ddd66ce1
+abd69a9d
+f19108a9
+61d6d816
+86462f28
+c8e4b0c1
+f36daf65
+0db4a766
+b47bdc8c
+060905ec
+202a3015
+f8362c26
+0d2cad4c
+a59ea816
+94172618
+2e9a2599
+9b7d472e
+b0c3b696
+b87a829f
+0a76735e
+55be071f
+a05d649e
+5a504385
+1966ae4f
+bcd59fe5
+decd9980
+6a430a5b
+570a3ead
+c28e8ff8
+d3b3cfc1
+bcfc54a9
+46452df6
+58ff512b
+934f826a
+5e400308
+2506f746
+2cb77ec6
+f948ca5d
+fde2c9d7
+34786fb9
+41c8b580
+5dbbf8e4
+c69937ef
+c0885ca2
+b66e751a
+4fa1154e
+4f487d87
+a9171cc5
+07f4cb4c
+507b0e32
+1c717231
+54503c4c
+02fc7724
+390e6e02
+ea474a6b
+c612254f
+1e750733
+82f36f18
+3eb2f9dc
+00e20e7b
+ba6e0657
+c28589ee
+e853b835
+f90394cd
+0c639b84
+742493ed
+467fb8b0
+7be1c5df
+403e1842
+b7b2c5fe
+4809d853
+25512dff
+4b328e0a
+0ba7fd89
+1743148e
+38b5339a
+1ac91ec9
+52f64c84
+aef750b7
+4c80e3a6
+ea63400b
+89071a69
+abb448e9
+66110d1b
+34cbb1bc
+857a4197
+cfe11615
+580817cd
+69632b63
+0209b664
+611c569b
+be2bcc0e
+03f77fd2
+2da55e7d
+9af2cb2e
+1ec8e563
+f4c487c1
+c281c227
+97084f23
+781f4d92
+d628f6f4
+c3808654
+3b1d1322
+1fa0660e
+f9f84038
+d62903ca
+62036f49
+5a0aacc5
+ec44c97a
+09a25a13
+c888b255
+84c24bbd
+569b7369
+948f6009
+9a5d52f1
+5dd86246
+2e072c0d
+1528191d
+decaa31b
+4f3f2bb1
+dfce06f9
+696c3b11
+aca22cf9
+f98d9f6e
+ef2542c1
+3516f6e6
+fd36b258
+1cc1a03a
+ba344b31
+ee8fe4bf
+a85c4213
+0eca41f0
+05e73957
+8f265ae5
+38e7245c
+fc3dc255
+48a932f6
+0e2ac643
+f0a72e95
+6619af2b
+db8113cc
+b7a9e83c
+e39bd393
+c988094b
+c1d8cef1
+3b256bf8
+2723b688
+66815d59
+140b1e5d
+8e7af09c
+80acfd1a
+67b031b4
+6423bcac
+6d41cadf
+b0ff1de2
+ad61640d
+3e052f22
+873349e8
+bdafcab4
+e5186205
+5afd9e51
+73e186f6
+5b7186cf
+347e4e86
+4c19025b
+49e7b7d7
+86f63759
+f61f4d16
+a7caa7ae
+50447cd3
+c22de786
+6287329c
+156d3d40
+dabbd479
+648c9a0e
+e28fed1d
+798e96a3
+a70d1580
+f6fdda26
+07a906b4
+f8842eb7
+ef7e2c01
+85cbc79f
+2c26da42
+a5cf8381
+ab15c16e
+b94e5df6
+a8bd2ec0
+d22c7c9a
+34bc3a7e
+5552fe29
+5a777c73
+343367de
+9156816c
+a48ae04c
+0f39538f
+49cc0ddb
+37d97d89
+7eaf6f1a
+39ccb769
+a0edec24
+86dc4b63
+31d2ac00
+5d8f6443
+2e9930bc
+84620f02
+b57fa159
+304a5c96
+88fc4928
+39276314
+dfb2a8fa
+54be6cea
+ea9c146e
+39e0ed50
+e7c4888d
+0bbb5cbb
+df132e22
+b5b29c1f
+5c473d32
+025225f2
+6c2acc46
+8916f815
+97c509df
+2342c398
+e651250d
+708ad459
+04c73167
+5b01a374
+90f1367d
+08169c19
+72978071
+b657eb7f
+5cd1b6c0
+eaa11718
+552e5180
+8803181f
+c91ba988
+d7f1cc4d
+546a84fe
+b8fee572
+87762256
+5cc21877
+fa8359e2
+253a4ca5
+f3747bf1
+1bcf2acd
+bbf65b42
+be60a2af
+eb6f2560
+31c26a78
+c8dca410
+83827cf4
+8b266858
+16659efe
+eb4b1bfd
+9c1b4b40
+e11070c6
+89747540
+0932f78d
+aa902020
+08740150
+178ba87d
+bf596cfe
+3a96d406
+f589a6b4
+041d3f9f
+99b4a17d
+45c28793
+538ef950
+bb9f3872
+618a08a4
+132a0b80
+2e5a9f20
+c390fa9f
+3df169e6
+44cccd4b
+cd73097f
+4af686a3
+d4384424
+91a1b611
+d7679784
+dccbd94b
+dc154463
+f2edb1f7
+e920b070
+c4b75451
+8019075f
+7ff056a9
+0c53c0cc
+e2824235
+ef1c6319
+95e55a52
+ba3ed4e6
+ee3486ce
+26babcd1
+478ebe53
+d70487f2
+da3aa72d
+f9d99d81
+29142596
+1aa94af3
+1d7c612c
+0ac9d919
+91f9489d
+51aaa971
+004dc387
+593e603e
+0b7f85d0
+08775c1b
+1cc9ac51
+4a6575c1
+2d5fa3e9
+c4271032
+59f566f6
+66c4d623
+7eda22c5
+8a2b6e98
+ebfa42b2
+3f70c42f
+6cb5243a
+9d91960c
+d2f63f6d
+c6904240
+52542824
+a3970ae6
+e8400e63
+a7de95c2
+253e8774
+aa9cd64a
+2f3c8347
+d413ef3e
+b724aaae
+2324ef79
+1cf9c8dd
+221012f5
+d7034fe2
+cc241569
+6b7b3066
+e4fa2059
+ef9686d6
+d534e77a
+307352ac
+3a3bb406
+7aab7990
+d299b0dc
+3f48c30b
+083755dd
+40dfba03
+34ebd8fc
+9a86130f
+13615098
+6b8887cc
+38a4e3dc
+5747a0b1
+c95e66fd
+d2a6d800
+954d6797
+e1ba527d
+5b3fc509
+ca8dd3c6
+db381caa
+cdc2ccda
+377af8aa
+ab8a1a53
+3f158c90
+05781932
+8ac214dd
+b6a6a31e
+302977df
+9a7d4900
+dd6a9565
+69f2068b
+efbb2435
+f1a8396c
+4e73e6f9
+5f4de855
+2b9fb512
+00b2c8b2
+c247ca08
+c2d489b5
+a566e4af
+ec3484a7
+fca56425
+262ac3d4
+f5df7ab9
+a2691d6e
+433f9499
+17531d4f
+6f833c7a
+1a347339
+8b11c4b8
+e3ee2035
+611283ab
+c1618056
+0a291705
+9b9e44d2
+9ff13f22
+cc7759fc
+f55365b2
+c7176043
+7a3043c0
+d5aaf8c3
+5aae435a
+672d927b
+91fafaee
+dea83554
+9f2e7291
+95aa5a9d
+1d76104b
+8c9867f0
+8fa83e23
+04de9d96
+0c43934f
+90dca23e
+6e5da64f
+60ac57eb
+d3802338
+ff5626de
+b18f5a8c
+fec93d55
+cdc4ba5e
+2a1579a2
+2b9f0754
+5e5e993e
+5ba9ebe2
+61263ddd
+8c13f0bc
+82eca395
+97894d02
+009f5dd8
+f06c53ac
+d2999b77
+ae5ac833
+37e99bb7
+aebdb575
+c42aeabc
+bd9310c2
+9bbf3ee9
+be753812
+b0562e4a
+23d7f461
+2f3ee7fb
+39795005
+a197fb38
+0710c0b5
+5d4198ed
+43b30a24
+7889204d
+001fc921
+9621d71c
+83b2c411
+d8ac70b5
+1e152deb
+01c2bbc7
+855758bf
+921cb1a9
+138498c6
+08ba5c35
+04e4a7e0
+c5bc951e
+84c02464
+0a110528
+0defc36f
+fb0d3002
+fc885e3f
+dcc8f90a
+133353b8
+b55434a9
+e40f343d
+11f52a04
+d814ccda
+34f2dfec
+8c923da0
+165642be
+a62b6eb1
+3c31c135
+ac249cd4
+7a36be10
+ccfd4002
+434b8eb7
+00613319
+4e9bebb4
+669cb7a8
+7165d9e8
+00b0a3ec
+d8e8499b
+61537f27
+67e1f5b8
+09e3bbd5
+98401720
+36ab5831
+4e8bba73
+2ef8dcdd
+2f83c4ea
+94b87a00
+9f4437a4
+76b6f478
+0ff4db6a
+45db6793
+85dbe138
+306e7a68
+f9073f46
+370eceb9
+4aca365e
+f8320f48
+22d23aac
+b757ad4a
+5fe42e80
+6994d528
+022a36e3
+5c0c091d
+50fd89c3
+692883b5
+871eb035
+6d601735
+4f8670dc
+5317f239
+ec4b1d39
+ccb9cc75
+bf7d7467
+6b315457
+51cc5d51
+975bedc4
+5595948f
+93a059ae
+4ac844f1
+2e7b87f1
+dc90535b
+3e7d76a0
+08c3f986
+bbf281ce
+41c193e5
+3c4e22f2
+f171ba63
+66f2d604
+2c7bd6f7
+cce745f5
+9237b653
+d1b5e8c1
+b093e98d
+4e215042
+dcad2706
+31bdb569
+dec94acf
+c2e2250e
+6239dde3
+83e6ca2e
+8368e64b
+7f75d712
+073bd88e
+0ae1463b
+5434103d
+781c8327
+12c11b9c
+a7193268
+5c63c3ea
+a8af8c1d
+66e756a2
+739f373b
+4d8657a2
+2ac84021
+a4d84fb9
+3a06fb3d
+b6433e34
+b7f1ef1d
+79db54f6
+54c30969
+de1618b9
+de7b14a2
+b688506c
+3c6e7a7d
+64f8e14f
+c6378246
+0c78dd7a
+e11162e6
+7fe85270
+44c96b6d
+9f125fdd
+6de37a8f
+1399de53
+9f792ab2
+fa305cd0
+c3e9876f
+e45d0295
+4d7749d6
+272d6290
+66e1cb62
+27aba7ec
+4abc6315
+6cc03d02
+e325e0dd
+cfbf9eaa
+ec3df57f
+48a23704
+e051ab3c
+b952cc38
+7edc047a
+2a0e88c8
+1d8cfec6
+58b27a4b
+5ef5257f
+f9a980ae
+0eb69562
+da3c8447
+b0484887
+c6881550
+9b537a5e
+e93e5897
+028f45bf
+24b0ac45
+a615211e
+ace88beb
+cb72c230
+c7a109eb
+b029ebab
+e17b3043
+d67a8fd5
+16563886
+37880c21
+18539b7f
+07f1a7eb
+4ca13ee8
+8e2ee35f
+7d5ee5b6
+9dfda2b9
+31045073
+c2ba94b6
+4ab361e1
+d84743cc
+75b05ddf
+439cd4cc
+31493a18
+39e37558
+8b5b6b7a
+32d38cb5
+fb0920f2
+b5f7ce2d
+cd98af01
+c63ea0b4
+6726c9f7
+56568181
+32720067
+5a378215
+d77a196a
+45ad6166
+f830b131
+d596a58a
+c3dacdef
+4f2491fe
+e6855cac
+19c0a3be
+09cf6853
+ca9ed493
+510ad963
+e65a5fc3
+26739be1
+1907e19c
+a7fa7627
+747bcd4a
+ea18ebd8
+2b2a1789
+08961fd0
+d0d71a10
+3c7dc0d9
+13c89cc4
+61fca60c
+da693158
+265ea3d8
+6179f7b7
+fc6c221e
+a7474f62
+c467e219
+1eb0f8f0
+beaa48ab
+98ffd96f
+ae9d290a
+70452055
+4dfac58e
+5085b39c
+df941d33
+ca9b769f
+cc0946cf
+5106540c
+64d8d024
+751fbd5f
+5218d824
+4c9ff09f
+969e14fd
+32d3f313
+bd031e78
+881eaccc
+10cccf24
+c9059ff0
+1a1cefef
+43fc2d8c
+a4857795
+96e0fe79
+232ffde2
+b2d8bbb5
+8205703d
+0f3d4e02
+dd40f08c
+386226ef
+a2158803
+428332cf
+7d30c3f0
+fac4c83d
+bf4cebb4
+9a62ee76
+5deef092
+1761eac4
+1cd94349
+572bcfdf
+82863df1
+aa3f2e5b
+155ff7d9
+94ebed39
+7f9f5628
+863f8f8a
+954029f8
+59dd51b4
+208f4a9c
+628738d3
+4a647359
+af22fb96
+bf50c711
+c679a49f
+50a56f08
+ab2497dc
+f8b2e505
+568aa0f3
+d3442f6b
+48b975db
+fa9f89ca
+61ddcfc1
+d53ac30e
+18e5bb32
+97d749c9
+82e0647d
+62555ac3
+7462682f
+3787a7a6
+9f16a973
+a950a59f
+0de6e038
+0c9cc756
+06384f84
+e4e9ce3a
+c905df7d
+4738a95a
+919200e1
+d2243160
+00eb1d81
+2b763576
+e63f43cb
+fd8358ed
+bb4fe764
+7ca53f33
+4840c1ab
+24f97fa1
+b72482f5
+e1f044a7
+311e4f94
+6e2907f1
+3562872b
+3608dc17
+81fcad9c
+81f76635
+51a8c80d
+bdd4dc4e
+34d253f7
+2c8d8e3e
+2f4fe83b
+b8a76289
+d4b0fcb7
+08c03dfd
+12ae6811
+68637c0d
+fb4bc60c
+7253ac71
+0369d47b
+9f60c586
+780bcb50
+f561018e
+36078a65
+4727e381
+b50e2ed0
+b22a5f52
+ea31804b
+97c8f8e5
+e206ac9b
+44af41ef
+fe04b02f
+b5ab7347
+0087e728
+7e728009
+8aa132a6
+6a13bc13
+eb1944fd
+8e648136
+5aceb3b4
+41eb38a4
+f5ff33d9
+0b7ab2cf
+baf9aeb5
+58fede9b
+1f96eb41
+b722e1f5
+1668f3c7
+89affafc
+51b97b8f
+75b85a30
+d7f84e3c
+4730fb52
+e86c4329
+2e4547f5
+f4dc288e
+8c2b39b2
+70df6b3b
+d4b85d8d
+7e16ced2
+cb606c19
+86c652c6
+1956fbbd
+f23a3825
+d3379e7a
+90a941f6
+7591f682
+499d401f
+ab160bba
+9e911356
+5bdcb1fa
+c0ed8bfc
+01a64571
+2429d596
+6a70a42a
+3d3449d7
+41777878
+6821d031
+94881fc3
+2a268e4e
+4c8986a3
+cf2c2521
+062065a8
+92988254
+33f7d746
+5bfcd826
+e06150a9
+21894a89
+215c69d5
+bff89401
+f522015f
+370914c1
+ebd756bd
+88a225ed
+2e94d3f7
+0b9e06f0
+216d45ac
+b91cf382
+8a430f81
+b3f7fb1d
+7c30251b
+733247ac
+57d7b6a5
+9bacc95c
+5edd0633
+41071de5
+f0fe287d
+ff4d50db
+77f6ab2f
+565a196f
+0cf56582
+e3e04aed
+d96d7793
+784618a5
+3a4e700b
+3950253e
+167b0a48
+88d07896
+86e074fe
+665d1a83
+e8df3343
+89334965
+15bba33a
+3c12244d
+c9ae71af
+86f2e6a7
diff --git a/models/rank/dcn/data/sample_data/vocab/C14.txt b/models/rank/dcn/data/sample_data/vocab/C14.txt
new file mode 100644
index 0000000000000000000000000000000000000000..a4a5a781ba6b052deb66e0dc44bdf98bfb1bbe24
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C14.txt
@@ -0,0 +1,24 @@
+26ac7cf4
+cf1fc48d
+ad1cc976
+e8dce07a
+5aebfb83
+b28479f6
+64c94865
+1adce6ef
+32813e21
+dcd762ee
+051219e6
+243a4e68
+687dfaf4
+f862f261
+ab7390e9
+f7c1b33f
+d2dfe871
+91233270
+ec19f520
+8ceecbc8
+07d13a8f
+0601d3b5
+cfef1c29
+0bc7c8c2
diff --git a/models/rank/dcn/data/sample_data/vocab/C15.txt b/models/rank/dcn/data/sample_data/vocab/C15.txt
new file mode 100644
index 0000000000000000000000000000000000000000..93dfd19da6bf50714d87348bfd97b59857a281c4
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C15.txt
@@ -0,0 +1,2011 @@
+19da88a6
+2f71ef4e
+dd244129
+ad47695e
+72d05a1c
+c8beaa0f
+14d05219
+98c93ca1
+09d98d76
+f27e752d
+6ec20f2e
+413cc8c6
+1df0792c
+4c6befe4
+46218630
+844c9d15
+402ce7c7
+8cd0367d
+f455b8e3
+f5e3cff6
+ab3d548b
+311b40f3
+a4508665
+53b51623
+4ce39685
+fd888b80
+dbbde166
+c357ecb7
+73438c3b
+6ea6312e
+d44f0339
+210dd677
+f44ef062
+2d751612
+f3002fbd
+29a18ba0
+73b98472
+df3426f3
+4595ddb7
+040981c2
+3d58d8c8
+d268f1ff
+6b3737ff
+11b901c4
+bf020cef
+2cd997f8
+e8ffeeb4
+32330105
+c99f9716
+62eca3c0
+f2d9bf39
+867d99df
+0f7acee9
+06809048
+343e5724
+d420da19
+007ac9f6
+0ad47a49
+65f5fb36
+24c5daaf
+5c2e109b
+4882e4b3
+e23edeb4
+21da7395
+76835fdf
+b62843be
+e28388cc
+0d146d27
+dd42a670
+fa321567
+dcd06253
+7fb65da3
+81d3f724
+6951674a
+adc5d179
+40a510d5
+5534a300
+d345cbde
+0b49e24d
+b812f9f2
+8cf98699
+c6ceab4d
+1ca2ec64
+b6eceb93
+6d9472f9
+bda24cf5
+0d054fb9
+f767920c
+4a7ca0f5
+60fa10e5
+2223bcd1
+11da3cff
+88a47777
+db586845
+c2b7aaa6
+24e96cd2
+c83563ad
+054ebda1
+2f2dc9ba
+f846ffbe
+0eb906ef
+eb0f6b30
+fdb8cdb7
+bfb03e99
+3e704e66
+6dc710ed
+9d0aa133
+17a3bcd8
+b73b928d
+3438297f
+a56a115e
+d3ac08f7
+37a9f717
+55dc357b
+102e49b8
+11b2ae92
+50e1ee80
+df2f73e9
+554c4a03
+0bf0feff
+091737ad
+f9d1382e
+30dbdead
+3e5b3f6e
+fe54d1df
+815790d1
+30be2b36
+52d2c108
+86f221f7
+bc5419d4
+84eb9934
+5535e22f
+73e2709e
+b8acb23c
+081a721d
+24ff9452
+59e11c14
+eab6c8bd
+520cb89e
+4b8e3335
+7801874e
+d959e25d
+ada14dd8
+0efe0a20
+798a3785
+8d016df5
+281363b2
+ee203128
+19f03519
+3908325d
+84203dfc
+716ac5c0
+300cab0d
+993ec99d
+a89936e1
+d5adea3d
+0169c5a1
+fa9ea4ca
+0739b998
+61a7397e
+6fab2bd0
+162f3329
+ae3a9888
+f188ba83
+708e7409
+d120f347
+316da2ff
+a509c337
+1addf65e
+330bda5b
+6d74487d
+59a58e86
+46df822a
+3dbfb847
+8921eade
+a45dafc6
+33d2c881
+2e85e7c0
+5ab7247d
+5489614d
+993a7719
+fdec7f51
+a9bfbf99
+dd94570a
+6c34b86f
+972b922a
+91126f30
+0ac64405
+7c54ccb7
+c9ad4d07
+39a6addf
+bc01d9a4
+739f0723
+9e1f851e
+55d28d38
+df2ad21b
+89407d7d
+68b71149
+4a610ec9
+950307af
+35f69a8e
+03db9d8a
+78e3b025
+b4512bcd
+f8f73c49
+d37d4a5e
+007fa274
+467f93ff
+9cc57c4d
+42ac678d
+a80707b4
+fbdfc6b5
+c869a687
+dbc04c15
+549f5d5d
+188301e5
+59fb5783
+da8cbfac
+123b2f29
+e68feea8
+5fda82c8
+afbdbcfc
+33f4c205
+be562b57
+7ac43a46
+4ac81a35
+0f9c4124
+b775a16c
+af75c9b2
+d1320b47
+b1502433
+a7769648
+4feed03c
+de3c58c9
+b760dcb7
+376a23f2
+f775a6d5
+18e53827
+e3ea55fa
+fdf07338
+91246cad
+4f3b3616
+a46bf7c6
+f9d8f165
+10935a85
+23bf2849
+bcdb9b50
+bd352426
+003cf364
+4165c23c
+e4d19743
+693190ee
+9c55db9a
+2ed5bdad
+30915654
+e046903d
+0ee9165e
+85fd7306
+5be89da3
+98995c3b
+3967be37
+5182f694
+aa5bc3c9
+089dd1eb
+94812b8a
+d355626a
+9fa7b46b
+70eb0ad8
+bbd6edf5
+3a6fb6d1
+79a98024
+c6ca3eaf
+4fda094b
+6813f031
+72fdc13c
+50cb82bf
+51d40f16
+e3dc98de
+62ecc9ff
+a299ce75
+31a2c464
+111db041
+4499b524
+72fbc65c
+b927d122
+d27eed0e
+205c1982
+222f4227
+83d46128
+9606b30f
+3581cad7
+05a584e4
+dd21eb00
+5ace6c00
+2683ec1f
+95c6c51f
+91f600b3
+457839f9
+b62ec7c9
+c1ddc990
+55ef2202
+10dcd6ab
+194c42a4
+5f2d5a3a
+9917ad07
+9fedb890
+40fcbacb
+1302f720
+d83cf239
+7de4908b
+16d478e7
+4dad26dd
+c80d508c
+31a82566
+2d3e18d9
+2277633a
+fc42663d
+cd8fe2de
+bdca2d56
+cea0ad99
+e3fd574c
+d5690a93
+e289fdf3
+899da9d5
+31c3734b
+e0a2f482
+f2756ea2
+4d5e718e
+fb2e22de
+89e43c50
+64748fbe
+6177250c
+521b9165
+c68ba31d
+7cd8225b
+9da6bb5f
+4587d8cc
+3dc3c930
+aa05a951
+f08ed40b
+8154d418
+719e4e9b
+d345b1a0
+bfa6d08a
+9c382f7a
+8b945211
+8cbe05fd
+602e38b0
+7f1c4567
+b19775cf
+9a0b7e16
+b6dcf31f
+d4dc65dd
+27252522
+d816fd68
+18c35602
+62092ade
+f8526149
+be5a5ec6
+b685835f
+e5d3c510
+5d922427
+571f6c76
+05728e3f
+bb91e6f3
+32ec6582
+a5118040
+c417161a
+57bbde4b
+98fca9df
+484cbf06
+1718dff1
+11dade36
+90453bc2
+ede0309e
+753d0ff5
+1d778de3
+729eed5d
+c06af98c
+53ea480e
+1715a999
+17194aa4
+042c782a
+c0f21d30
+4db058fe
+d054d062
+4fe21ad9
+de57fa3f
+13f8263b
+f81cb17b
+23cf85dc
+4f7854df
+0e8eba3b
+b30b312f
+bb0bf84d
+4a0b6d3a
+1d0f5921
+d30dcf14
+4e47e13c
+7c4b3602
+8028822d
+dcea681c
+ef6b7bdf
+5d1e64c6
+549dd7ab
+693474a8
+827fae3a
+b96e7224
+cd48a236
+a7541387
+630f96bc
+033862ee
+022e018a
+36777332
+a21446b7
+943169c2
+1a6b624e
+d1056dd6
+f891b70f
+ebe321fd
+655b6b35
+013a3bdb
+67596d53
+4290cfe5
+7ba31d46
+80df7c9d
+785edb93
+c57dc31f
+5e495eba
+5ecabfeb
+0f04b39f
+8a1efad5
+2e7889d3
+df360709
+655fad18
+c94a9d2c
+87f18530
+f8aab0bc
+55741344
+75b12e6a
+ac98936b
+d2fba5f5
+9a0b5832
+4a3a5069
+3f652258
+63ac89c1
+00631f93
+e1c12056
+68fc4e7e
+d337bd03
+1dca7862
+040ec437
+b708086d
+d50f20a7
+b5de5956
+830017af
+38f77bf2
+6caba1cd
+a39d772c
+3c767806
+b9df93b4
+fec19144
+f5f22330
+bc5b2cda
+846fb5bd
+03970cbd
+efb45515
+dd72d95b
+4903dd2e
+4cdbbe0a
+8f0f791f
+610165b2
+ebccecf0
+4dd467c3
+6cf5975a
+aa25ba63
+1c4aefab
+4adaefcf
+6c802764
+6b43d7e2
+0c12be69
+ae0c3875
+1be668f2
+1f70852b
+cf995508
+ab659b6f
+6fda8f60
+a58e7e22
+e8ffebe5
+7ae598b5
+f033fb80
+344bf25d
+c6d46efe
+757fef46
+b7e33e82
+78d17716
+2f51688f
+19015106
+21fc3685
+8d015bd8
+3565b499
+784d56d3
+9c03c866
+8c532c12
+28aed80d
+a64d32bb
+be4cdb56
+51348d8a
+9aba6ed7
+ee77976f
+9b96f3bb
+3b3ddb03
+2a3ad26e
+5d69c8e9
+d2c173f9
+7267469c
+8d847668
+e985e88a
+2e1209d7
+e1572535
+0ca2a578
+e60ec714
+fb78f559
+486c8b7c
+5ff145a2
+2aa2bbdd
+bfcf91a0
+7d4df900
+aa0c8851
+9dc1156f
+4bf4361a
+986b9531
+70a601bc
+13da90fe
+9221b8f3
+4bce7416
+7e2a59b4
+8ff4b403
+48a5d003
+cf25f9a5
+f5799c5c
+75a4c7b9
+4ad96776
+cfa9df5b
+1613a442
+addd37ac
+d9d2424f
+1ecad5f4
+1150f5ed
+5a8b8943
+8cb6186c
+b27dd6c7
+f6b23a53
+ba34ea3c
+64c3f190
+b5fa159b
+14d0a096
+2c14c412
+d1128331
+558590b3
+08cec6f8
+dcd1fc03
+d5eb6e15
+8168d844
+68822b03
+89fc04f3
+d4525f76
+7f882989
+4f5fbd7d
+49d2606e
+ac847e0e
+9538b54b
+4b572351
+5f105900
+ae44460a
+396a6cf8
+1d432c1e
+43e828a4
+57a9c379
+db102b26
+f9fbbd0a
+cef498a5
+bcf15323
+f38153dd
+bde7754a
+90af1d37
+5507c0e6
+2c684cfd
+8d772a8d
+e24ff4c6
+3ac25d07
+7a0ac0eb
+22094a91
+e296eaf8
+77efb64f
+55f982ca
+151f2153
+aa600b94
+28b468d1
+cc31eb0c
+aadaf558
+425fb9bc
+37f2f6dc
+a08d6f8d
+9bd565d9
+2735b5ff
+c4d7f010
+22223d6c
+c31e31ed
+91da17f1
+f3a94039
+1e744fde
+3b973320
+82c3f59d
+1ac46c26
+3a129fb8
+7da29044
+7d7be910
+617d13f9
+234efa3a
+5d6805b7
+aa18cf81
+e768ae05
+74438fb6
+16d2748c
+9d7d0b99
+d9e19f11
+24564fe5
+6ec1dd51
+9ee9d394
+9038720c
+2d8c0895
+8f92e3a1
+e6696727
+68ae28de
+21879282
+00e52733
+84f7a6dc
+6ddbba94
+5525889d
+2915894f
+ad31dfdb
+10139ce3
+9703aa2f
+59184f4e
+1e4809ec
+eec7af60
+7cf626ab
+623e9b47
+df998728
+0879961c
+6f1ab4eb
+d6da9624
+c5d3e789
+bfef54b3
+4805ca48
+ec278e6a
+e9d8fdbd
+e3eccb04
+83763c20
+34cce7d2
+6fbef07b
+89585e18
+ea8d4f05
+ea3f53df
+25078362
+40dae2c5
+79aae8a5
+2f8d7add
+2cd24ac0
+916e9a2c
+d15f4d76
+72a9d1ce
+c7ec25fb
+a36eb32c
+5e4152f7
+095f3c05
+742fffca
+ecd9dd20
+71aee2bd
+b16d44e6
+babd0f1f
+74c49480
+217d542a
+a4e2caab
+af094307
+c60b6bbf
+d867f5df
+61f73647
+a8035554
+06894a0e
+b5907fa8
+bffbd637
+cddd56a1
+ad062d55
+0e79f9fd
+bd50245a
+6253e3cb
+cbffe0e5
+c367b077
+f77979f3
+2d5d7e11
+1381bcdb
+c8591a9a
+588b40b2
+90460e03
+5c0a1507
+65afeec4
+0bf25e0d
+c1041ab3
+93414bca
+c008c9df
+625dc429
+2f5df569
+89bdfdab
+4dba65af
+6b4d7a3a
+c8ea88d9
+5c595008
+5e163f0f
+9fa29433
+2ebbf26a
+b2353559
+555cd321
+813bda68
+ff933175
+fc574fb2
+32c6ddd8
+7f360330
+b2ff8c6b
+3af33a20
+99153e7d
+4962a9e7
+4c1df281
+2f3d2683
+ee76936d
+79266ab2
+25475cb3
+75351cbf
+102fc449
+d2da00f1
+0d598aa0
+83ebd498
+ad1f1760
+ea906d6f
+fa467a7a
+ca8b2a1a
+4e70dc14
+8c46ffb9
+f189ac4e
+eb8600ba
+c74ee7fc
+a90e5e79
+a9d1ba1a
+b98be2c0
+f8071e98
+234191d3
+e3209fc2
+de829bed
+2eb18840
+e9374fe9
+2b982426
+a2ccf91a
+2de5271c
+c4316e0b
+d18f16f9
+022c81dc
+607f4bc1
+01f71796
+e509a942
+084b5d3d
+647eea15
+6f8d08e8
+2e7bc615
+59e23b95
+1db94996
+2cff2cc9
+407f71c8
+14108df6
+a0602981
+d6c04afa
+207e3a4f
+74056b5a
+1f9b2a33
+af8db00e
+06969a20
+0a509540
+c251e774
+6dc1c4d1
+a54fca2b
+1f3f71a4
+a4861bb9
+e0ee3953
+5340cb84
+a9e21d3e
+e8b24d08
+f412e004
+b21f08fe
+5b0e9927
+76e67c5c
+5db1ef75
+daf40beb
+1a2412ac
+bac304f0
+543dbc6f
+d295078c
+35e4d583
+ce0ff07e
+681a3f32
+e0ee18f5
+6cdc8bfd
+131d3346
+a6b876ce
+6cf5aed1
+675cae34
+78ebcaf1
+b5c889a8
+45d3b115
+8dbc001a
+bf94b88d
+446be5fb
+23287566
+d2160250
+fd8c09af
+75803ac3
+24a39c8a
+f88f2bff
+559229a0
+a240ff99
+569913cf
+af0dfabe
+8b3a08e7
+5dd684c2
+e89210ee
+a6ddd360
+5ff90ea5
+c0b54b27
+04ef2985
+492d8335
+9810119d
+2d49999f
+5a9e81d6
+e2876dac
+fc9ac63e
+292452ce
+e2c18d5a
+f0d65d05
+4be6e448
+fd8464ad
+7c1f836c
+9014f0f9
+4929aa91
+812ea57a
+ae97ecc3
+bc566d75
+4d30aa09
+f566116c
+2415f957
+ac52fb15
+42793602
+37663ab0
+924a3a81
+f8d15366
+ebd27968
+f717b29a
+d9429614
+f0260896
+f492e179
+1f29ec61
+9009bc6a
+0e1257cc
+53d9e231
+9ea2e321
+e89eb8ca
+1545f0d7
+17d9b759
+ab3069a3
+23a9a76d
+07c3d3a0
+413978ff
+15827352
+1886172b
+e57866d4
+8187184a
+f9175ecf
+dd039517
+ff824c52
+f74f7152
+8b556421
+31fdfed8
+6f73304a
+aff315dd
+038f8759
+d68a7fd4
+2a892173
+46ed0b3c
+abc790fd
+6aa9f74a
+e6250fde
+71d7ea29
+1cd2cc85
+f320caf6
+8aceb51f
+4431dc14
+0728c213
+4df3da6b
+57e70a79
+6e897ecc
+8c674b00
+023d3891
+51c5d5ca
+bb1e9ca8
+16b21935
+e2275836
+b8596134
+a2327e9a
+337d8d28
+3a8c68b7
+42b3012c
+75d3c990
+7eaf5074
+aada1423
+87b94fec
+1317e91a
+a4bc0595
+1d858abc
+97cdf314
+03b0a8e3
+0d91d386
+43da1ff4
+48ed53c5
+cceaad58
+53c5f305
+3f2e2dbf
+5784e3d4
+bea58ab4
+c610e71f
+3a5cfb00
+39f048ad
+5258bd01
+d8107d42
+a84cf4a2
+b53db43e
+18dadcec
+e38a76a7
+55ea1fa2
+883a7c31
+abcca5c1
+ade1227c
+7901647f
+144050ec
+ae9c6510
+2559d9b6
+559cd202
+f0d9127f
+69f825dd
+e815112f
+90ddbf7c
+4a593892
+ae28cea7
+12412a67
+6b94b61c
+ac04e0e8
+a473257f
+7d20a3fb
+b19c1e74
+7736a2bf
+4e1234cc
+10040656
+7e7dc5e4
+a10cf729
+c169c458
+f29e2024
+217b70eb
+2c5989f1
+df2987c2
+93a4aa35
+4a393d04
+ae74ec5a
+e8f4b767
+6632d26f
+c6a089a8
+e8b76093
+f554f7ca
+210e69f1
+d4a5a2be
+fc8e4662
+ad730b96
+d9c759ef
+247f84ab
+4424a0a4
+8c169bd7
+978a6920
+cc4a49ab
+4e06592a
+25a9dbc6
+f511c49f
+dad721df
+0cfbc5df
+88e3c6af
+0a069322
+cf2096e6
+e6cf16bc
+d58f740b
+16beb12e
+912970ed
+63210625
+95d8c83c
+b593a63b
+6b8ac38e
+9bc78fa1
+d14edd20
+a3c5833f
+1e3c8e25
+df0f0cd8
+fdb1071f
+53a64a06
+fa622b73
+b6ff67f9
+82d260d1
+6a805a0e
+88cdcba1
+edb12304
+a8e45647
+889bd31d
+ddf40270
+dddd963f
+9d9f6832
+78fc1b0e
+9b45d6e8
+5622df07
+db5f813f
+d7bba1ae
+05758ed1
+1936a526
+422c8577
+ef97a140
+ebd8f6fe
+26c4ae32
+4640d8d7
+dee34d3a
+cd764ff2
+892166f0
+3048f058
+e0ef7640
+0c32c2c5
+e81555c8
+837782ed
+5fe41dba
+1e27c4c8
+b1302089
+35f1d5a6
+126c9dd2
+0507b832
+290e3042
+a08bd07d
+eb8f48c0
+b16ae607
+68909f00
+e036ff59
+81875423
+00b8dc89
+7d48b3c1
+7cfbc0ba
+e439dd9b
+f50033e5
+4622e3e7
+a33da14d
+24177fab
+1c51f095
+6d68e99c
+adfbda5b
+8aa7e8d4
+48240945
+27ee6db8
+d8b631cd
+2e2bd969
+b4f76d13
+60a23d23
+e70e26cb
+14be02cc
+aef6344d
+83d11398
+06613aa4
+92d8ec64
+02f01e20
+25c6ebba
+6d818e07
+69d56794
+f1ab5b11
+89620128
+7c2269b8
+3a6fbb6d
+97b8cec6
+74e301c3
+ddf559be
+1f7d09e9
+368a1c96
+79d279e1
+de65dc97
+6da7d68c
+0cb182b4
+3c96d3ef
+d262b21e
+42d81087
+fec218c0
+77f7d01a
+b7062484
+aad865a9
+14674f9b
+ae1edc05
+548c2cfa
+21ad9ca1
+e0182bae
+9aa55138
+c38116c9
+f77a8d3d
+16829d18
+18847041
+d52c7ad1
+60538e81
+d998d55d
+a8fdb472
+9a82459a
+b64212a7
+9b8c4db5
+d3e39ddb
+9559bea6
+44504dbb
+0c3fc478
+0b5ece76
+ef81bc7d
+4bd13a09
+a513eed0
+e952ec4a
+f7a61767
+fb2772ea
+310d155b
+12573c8e
+7bebfcd1
+50340d14
+13ec578a
+516195be
+0efac1d3
+c3b8c6da
+e5481fc7
+815908d7
+6074d864
+12e792d8
+dfab705f
+25ea17b3
+f2be9bcb
+e80ae76f
+8d28db42
+b4fc11b8
+8ebf193d
+bbc04dff
+6cc0bdf7
+404fadc5
+1ce3ecc3
+cccdd69e
+91b9f97f
+14ae7bf2
+3dfcad8f
+d3dd56d3
+b36ca009
+d2a5ca11
+9b5a6a83
+8485aaf7
+5b1e9201
+4a4e8baf
+688f359a
+80fdf058
+4e16937c
+9406dc72
+fb00c8e2
+5d87c02d
+9e17b0b2
+3a216454
+217d99f2
+5e7a356b
+2fed2c4d
+b20f9276
+79595843
+8f76a345
+99e7a41b
+3038bf01
+a8e4fe6e
+eb17a35f
+903024b9
+0ae958e9
+0ccbb29a
+e0e76958
+14fa210e
+fda2c275
+f3cdac21
+fb67e61d
+d8f0318c
+c535a0ec
+ccb85bf3
+c26b5dd4
+02651efc
+cb9f8edc
+3d5d2969
+fba62a10
+356fa1ec
+1a211c92
+5f2506af
+fa461c15
+35679327
+0e7f48ca
+12a6592e
+a9a633a3
+d07fce85
+4d2c9b0e
+6efcae21
+c11477f0
+0d1b9366
+7e1a4d8a
+cee9f178
+c8389df7
+8691120a
+1620a246
+c9b266fc
+9d08aa8e
+9c95787b
+24505cf1
+d82fb770
+77660bba
+3a2b2190
+dc96c4b0
+40e29d2a
+1e68f872
+c82d8eb2
+3e25e5f5
+673c0d56
+2d0bb053
+faa78901
+9a6e9a96
+ad334f54
+1cc16de8
+d37c2a33
+1a277242
+6d737967
+a8e0f0c6
+33b0f0e1
+8ebd9227
+750d1068
+0ebc8ddf
+78530a26
+3f565406
+9a89c28d
+79fcb5cb
+f238735a
+fb11a931
+cf1221e8
+884478a4
+94bf744c
+2b9e95e8
+28582d10
+2f49c59e
+765b5f71
+54747c3e
+1605250d
+90cfbb67
+704bcd37
+454fe7a2
+f2f8a7e2
+00b78dd1
+0d26bcd2
+df97efd0
+722bd39d
+747ea14a
+6e521569
+b02eef70
+c3cd9e9c
+4ca91e1e
+93625cba
+c8d259cd
+9b3255a6
+d2aade62
+e1ac77f7
+ee9ed69d
+50996ffd
+bac307cc
+9c25c3f3
+c7e6b6f3
+80de833e
+b4a435f2
+cca21054
+02436a51
+f5baaaa1
+1cbc3cca
+c6a9d95a
+4b0401e8
+126216a0
+794f6f8b
+a552a47a
+cef18e1d
+f0bf9094
+cbfe96a9
+0af7c64c
+715f1291
+c888d015
+fec781bb
+de877c2a
+99b06d74
+ef3290e2
+573054c2
+7f758956
+ceb8574f
+e0f1de3a
+2223bbe1
+ac4acf1b
+f3c64936
+5c4a968b
+6afa614f
+7cad642c
+da315591
+49c15c8d
+db23286c
+1ac8e354
+9c792869
+f6114366
+342f00e9
+9f44248d
+41dfe8bd
+ac79e11f
+a66dcf27
+18dc3782
+cf539875
+259ecb43
+744c8e3e
+2f2a444e
+7da04584
+a43baafd
+a723edcb
+aa322bcf
+6aaa8dbc
+733cd612
+68d4cd79
+633f7ea7
+de781d57
+0f762d27
+67f18454
+e8b148e7
+453545f8
+99e01c3b
+2b8f96c1
+baa2eb66
+178b0ad8
+9adf3328
+ed807c25
+72a87a31
+661da0ed
+2dc7d066
+50b07d60
+18517138
+9ca59173
+98eddd86
+a6bf53df
+717db705
+74d258b9
+4e9e3b1e
+a772a8e2
+4724c85b
+fe83a0f3
+384cd4c8
+4267a81c
+554efdc2
+9add9e7b
+d601c43f
+8a8ba5d8
+a27f29ac
+117b1660
+085e9824
+908c84c6
+15eb8c62
+58be2b32
+d038e464
+f7864b4d
+68bcec11
+36721ddc
+81baefbb
+033b0f45
+298421a5
+d5223973
+b4f53f74
+35a3a286
+14b3c33e
+43a44453
+883b58ab
+1afdece0
+8ec16239
+6fc31e43
+b28d1085
+38645649
+9dd8631f
+91f74a64
+5846a832
+bde42a15
+027bbb3e
+58251aab
+764e251f
+90ff0d8d
+c39914bd
+e5133863
+7f7458c8
+8ae3cee1
+12642bae
+44773e40
+02443eae
+41f10449
+2547a09c
+e9c61033
+f3996583
+7af2211a
+aad60a5d
+eccc2578
+d83fb924
+a888f201
+73c54e3e
+55e4f717
+3bfd73d1
+3de8e0f8
+962bbefe
+cb1612e3
+2c00caf2
+9c34b09b
+e5a5467f
+3bcc995c
+03259d67
+f9df5022
+c31d207c
+46175d31
+733e1a5b
+f5a6875b
+778f5086
+876a7e78
+42e20bc6
+e38b602f
+6bdcea99
+78c64a1d
+7d9f3a10
+95275a51
+36673c74
+e0cb89f2
+d6c9a527
+7b403f46
+370f28ad
+b43b1e88
+fbf35bf3
+e0052e65
+1ee67fb5
+e92521f5
+1a015fe7
+e84d9f7a
+e8d4033b
+fd16dc03
+4b11a292
+e7dfec07
+4db04ed5
+d36539aa
+4cc22009
+48824041
+521b6787
+4e917140
+de9cb0c7
+d7b9be02
+f3bbc114
+298440f7
+e25cc91e
+3afef1d4
+06373944
+695aaab0
+a18e4f89
+cae64906
+50a9ce4a
+f75bc6c8
+099b839c
+b29e0819
+b15b8172
+04ad44dc
+618ab562
+7e319349
+e01367cb
+a94ba589
+beac2588
+f1e1df0a
+1e578945
+feb7c7c2
+c7b09696
+a428132e
+cbbe08bc
+33b4066f
+d2f03b75
+9efd8b77
+f28cbdb6
+8c7816d3
+e6863a8e
+94199158
+70ccb37b
+6c2fcd3f
+78fa6c90
+6900c8e2
+80d1ee72
+4c97d3ba
+586a2aab
+bd7b7794
+4b2ba50a
+ff18df4c
+d3e4765e
+81f86262
+bf6a9963
+04318a63
+99637b25
+413759ad
+bb06390e
+6d4dbed1
+adf0052b
+29f33fcb
+355e8a3f
+1cb4f44f
+7d775e9e
+0268068c
+2880af3a
+902a109f
+0c67c4ca
+c72eb942
+ea6f6206
+4d4745a0
+40f9f2b8
+801ee1ae
+58ae23c0
+77366052
+08805adb
+f728aebb
+cdc6b72b
+d7e9854f
+c6438ddb
+7457283f
+10e14b33
+b4316eb3
+1ddb1d9a
+8abed8ff
+429860e3
+f0449815
+e02fd783
+9ebbad56
+5a7d5bd8
+e6c5b5cd
+6c947207
+808ff1bc
+30d3612f
+a1517d9c
+49e58eb2
+c44556e6
+63ce87b1
+b53e73b4
+936289f3
+5e877be7
+547b8c62
+bc5095c8
+184a3554
+25ba53ab
+24566b17
+3f488e91
+6ba03318
+0e444a54
+94dbd80a
+e914effc
+d4696a42
+0d604f7e
+58d23284
+c94f6bb0
+876f9708
+027e6d5c
+ef6fe5a5
+731c0cbb
+856048a0
+afa1b545
+1daee501
+e974eb9b
+8d1e25de
+15c5d1cb
+0b409f8e
+70884403
+bee1e977
+3c237d18
+062f70fd
+6641807d
+0ce49139
+cf4a3e0f
+5c5f4dd6
+796a1a2e
+7683d651
+62615981
+4ddb1170
+5bcd6532
+b302b2e1
+c5bbd8ea
+fc8350a5
+cbd68e8d
+364f5bb8
+38f1e55f
+c1563774
+f193d2f6
+15cf35f3
+131566f7
+36cc336d
+7da7b9b2
+9bc78475
+5163515b
+31b59ad3
+3b23ee68
+bbee52f9
+6045815c
+85e5b07c
+b2db654e
+b22663dc
+3403e98c
+7501d6be
+61f52294
+0bda6c6c
+7d7542ee
+2aaebd23
+71c460b5
+21879cf5
+1e82594c
+865841c8
+2a0f2dfa
+b49353dd
+43275f13
+b88d2fea
+603a2e9e
+18993625
+65b8074c
+25753fb1
+56aba88b
+e8973e1f
+3628a186
+15677940
+53a91212
+63cf9103
+a8830f2c
+6e3a19f2
+e04ac6ba
+d002b6d9
+f679d5d0
+5ac40597
+8af1dd15
+d8524628
+59b212e4
+19378d26
+6cb56b0f
+18231224
+622c34d8
+754487ee
+5f7a33fa
+e4e3526d
+d81d048b
+fc45bcbf
+96231dc6
+c9e013aa
+4f648a87
+dd751531
+c06bba41
+1cbd1882
+faabdac3
+8c1d85be
+5502ed6b
+a3443e75
+91b8fb2b
+e857f574
+ac8848fa
+1726d6ca
+63cfcd4e
+4909d3b5
+11384998
+633f1661
+0e8af873
+5a205a23
+e7a9991a
+d53709e0
+b4e23a79
+b25845fd
+89b076fc
+36fdeaa3
+8d787bee
+80afa366
+97e04114
+d7a661f9
+1ba61dfb
+cb0f0e06
+1b47b808
+818651cf
+af56328b
+d493d6b4
+eaa2a917
+e20a93dc
+f31180cd
+ced5be3a
+c82b028d
+e87e1df4
+3b12d492
+6c39197b
+7c42278d
+1396e916
+608ca4b2
+2a7a147e
+6cfa4ac6
+02319a52
+1e33dfc1
+f137c9c9
+a8c1b20b
+42e41d92
+1e6bef34
+78916971
+8ff759e2
+6aae217b
+b0009333
+4f2e5eea
+11817ae8
+a733d362
+74f82d5d
+135df24d
+33e5b3c4
+eff62e32
+2eef01bc
+f7b14c36
+67daf98c
+f3635baf
+137346bd
+62487356
+59621a99
+c7cb28b6
+a785131a
+25f5cee6
+ddcd2653
+1247bc49
+e63b30de
+5cedaf14
+b9c9993c
+543c0413
+9608911d
+ca419ead
+3b87bbd6
+61b4e99f
+54973283
+fa7b7e5c
+cac2f8ca
+04595d90
+b3b12b81
+862dcf90
+e5655411
+725cbd5d
+6a170b88
+5e396db7
+60403b20
+4e505ea3
+e2dd9a77
+3068b94c
+f9bd7db7
+8ba8b39a
+075f843b
+850cdec4
+79e800bf
+11b334da
+631b539f
+d22224d7
+08812651
+b4569567
+1d06b1eb
+1068335d
+907a361f
+1c703d37
+102957f2
+de2f758d
+c4b55f96
+028b3996
+f9605f71
+b046231a
+f107d273
+2703247e
+4a378089
+35a6d30f
+a1e267e4
+b42692ff
+63f3dbe1
+c7532bdb
+f7aa54a4
+e083c32b
+ff4a3566
+0b528994
+387ececc
+ac182643
+009c8235
+231f3923
+a6d97bf2
+b5351783
+ada91bd2
+11984f7a
+9d9ef20e
+a1d90ff1
+c8eb437a
+e29aa995
+3aa85cb3
+12f48803
+d51e7608
+8941fc47
+a8db6590
+21bdb810
+4957164f
+3706a538
+c27f6c76
+1bc32d5c
+9ead8b78
+1042ca77
+0816fba2
+2ae4121c
+f3e7ba3b
+5568e7b8
+53175e74
+3d620870
+870efc17
+ee569ce2
+286d9690
+ce0f2958
+bd2c909f
+e3cfab30
+98674466
+6f6d80ab
+a67c19b7
+47656a7d
+f8ebf901
+8f3ef960
+e2502ec9
+b00d57a8
+b9c4a1cb
+4f55eb8e
+ae717b6c
+a8e962af
+fc1f50d6
+345a7624
+3fe8c1d0
+2718f7a2
+a5007c7b
+7c5bcff3
+e458ca69
+7c4f0ad1
+9ee86a69
+8736735c
+2e6376d2
+8776c080
+47b64ef2
+abd8f51e
+c1124d0c
+236888f1
+8db2662c
+badb3463
+dbc5e126
+c9e96939
+004dd4ed
+b79ee0f9
+f78f8a99
+1bb48b6c
+333b9800
+47a431f5
+2406b2d9
+a250a852
+ebc72f39
+6a13f121
+97620915
+97082ba3
+2d541419
+e7dd0bfc
+d0b3a21c
+ce3c65c0
+52b49730
+e8ba0304
+83258839
+5edc1a28
+dbf47116
+45e17a48
+aec1ba74
+ce39fa55
+7e6956e0
+87f76302
+ac2a3732
+97bbf6e5
+ca6db340
+7b9bdd9c
+234453bf
+65ccc4ef
+5726b2dc
+008efb75
+28883800
+87f8fcde
+a1fa3e9c
+2a079683
+be77fbc9
+f0d27586
+732eea45
+3b2d8705
+d8c099d7
+2a63319f
+5307e616
+8ab5b746
+ad932be1
+7858f722
+307ba917
+a46c3543
+16a78aed
+b19c72b5
+d82a1750
+a8690d4a
+d2988321
+95e6862b
+0f942372
+d57668e2
+ce5ff47d
+bcdada2d
+311bf554
+231bbd2f
+ef1823bb
+7b973dd4
+53de4ce2
+b8437aeb
+52baadf5
+1050017a
+72f85ad5
+d1ea3c15
+acd50d4d
+487ddf17
+a1ac2b6d
+db12b98c
+064da126
+5c3dbd29
+cbdbab51
+16634bc3
+4755f655
+da946ae8
+dba8012a
+5e3f66c0
+3ae36386
+b3de59db
+947ae171
+856fb045
+7bc0f42b
+5fb54e84
+38d99054
+51ed9966
+eb1997cb
+b5aa9529
+bc55d84c
+64dc52bd
+e9910d7d
+02aa1879
+fc29c5a9
+32cff810
+143da7b9
+e40dc223
+465dad55
+a6d6a075
+5e4ae600
+c2041322
+54971505
+d7f255a7
+34ad39cf
+2ee9f086
+40318a15
+a2c9156e
+206c5a0c
+45f80c6f
+ba8b8b16
+1b9ea990
+1f6a1fec
+6a5222b4
+e83e8005
+1b856468
+f5fed91e
+bad3cc31
+aa39dd42
+d582d840
+11290d8c
+618b0ee5
+47fde388
+94eccbef
+cea3323b
+b7fe4065
+0f475426
+a85e7b5f
+d4c446d5
+500c45a8
+38dea4a0
+280a440c
+0065bb75
+20e0957a
+56e96b36
+0e78291e
+c9349f4b
+12e876c8
+487cbdb7
+b8db89ad
+c279efa6
+41415003
+3046a70a
+e4849971
+8c236762
+5d69436c
+7f6af6b0
diff --git a/models/rank/dcn/data/sample_data/vocab/C16.txt b/models/rank/dcn/data/sample_data/vocab/C16.txt
new file mode 100644
index 0000000000000000000000000000000000000000..8832f7160de90699622a3260b8c2d36467f833f7
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C16.txt
@@ -0,0 +1,1731 @@
+cae0eccf
+afd2b3a4
+5165dea3
+16b7b16b
+9bc7fff5
+8e63e610
+561c341b
+80f3dc53
+01fa3a2e
+bb01ab0a
+4d10556d
+447c553c
+df3e8e2f
+75d5b49a
+6615ffe6
+bd4c16e4
+c6ff2f13
+2a08cb76
+31ca40b6
+9c3ab443
+283d439e
+06f4ae56
+6db5537a
+1fc8b40e
+6fa1c2c7
+e7a6bfa7
+dce552fc
+cd27642f
+6d922e3b
+15f1882e
+0a6bdd7f
+751ef953
+234a28d2
+52b66428
+08c8970c
+d85a4e9e
+2498d843
+0c76e706
+32d8c0fc
+8e662061
+fd1bdc62
+ed067317
+f1eaaed6
+15da0096
+974c6edc
+1f2ccccc
+c0c9dd9d
+a323782c
+2a98c117
+a23a1df1
+21dc8df4
+f76dabef
+b0b60d11
+6a43d0e2
+c81066d4
+c9906857
+adbff367
+5eb9c4aa
+53850bd8
+99d3874a
+44e56dbe
+0f24f259
+d37efe8c
+2dd4e74f
+e4392c3f
+40de7f97
+2f4fe648
+3f6a5fd0
+7471db7b
+514e8248
+487a3acc
+489b1305
+c6b1e1b2
+af6fc4b8
+8ad23cab
+a9388658
+65a893c6
+9bbd716e
+f2a191bd
+35533465
+8430d68f
+82bc9210
+41055d57
+89052618
+f83a681d
+743db411
+fadee8c3
+e1dba062
+efd92064
+2b2e3ba5
+708f5460
+beb10f0c
+8a788b29
+f9be7ce7
+bc164e70
+85d9cfbf
+06f33c3d
+d617f1ff
+92101395
+ea748914
+d82ec2e6
+bd7b8828
+10f050cc
+10e0405a
+aa7d0863
+1478197a
+e8345353
+bd44a60b
+2b2ce127
+b13e0025
+6f4d535f
+ecfdc8a2
+b97f8f45
+6083e1d5
+11cd5ad9
+54dd60b2
+4c63632d
+34d3c2e4
+1279f5d1
+aafa191e
+6168daf0
+2bfe9822
+fcc06f9f
+01fed589
+f25d0dca
+7fd23f14
+8b4e38bb
+ee238677
+f81d403c
+0d7e5968
+4d612a0c
+f69d8369
+2d0bbe92
+9462f7bc
+8f4762b6
+cbc9bb5b
+d797b346
+3b87fa92
+b98c1201
+a73325b8
+41bec2fe
+30682b23
+e6b23bf9
+ce1efd5a
+2c562262
+cc498f79
+be2132af
+8dcfd2e9
+cc95908c
+828b488c
+2e5c410a
+1271e443
+ac01c54b
+8064deec
+1e8e1075
+f82fe53d
+b031d057
+71895e92
+b03f023d
+64db8f46
+571e4ba6
+c3352fb6
+fc7cf329
+69fed5ea
+e6bf8468
+2774343d
+cf393f9f
+97d75abd
+dff2640e
+37d72a2c
+8b15d58e
+299ed5ac
+601d6f98
+a7d2766d
+d2ee3e11
+dc1edaf3
+38008e4e
+20c3c861
+af3fa3f3
+006e616f
+e1253924
+73480f4a
+c28be541
+451c4d0b
+3b58b07a
+7c17a186
+dd81aa8f
+baf0ded6
+d0be961b
+840307c7
+2873aefc
+d79dd8b2
+cd0a01fe
+5d9228fd
+5e622e84
+ea3544d3
+a9ff631c
+9a3b8889
+0e903673
+e2b64862
+98358e3d
+196f40bf
+09954edb
+a4a21eea
+dacdab3c
+8bc8a1e1
+342c1437
+c39bb110
+156b58e6
+c84a8d24
+f7d36847
+4e939ce3
+470550df
+d2e70fe5
+208d9c1d
+26bf6b76
+ea6028b9
+1b9d47c7
+7d1732dd
+64bd637c
+81cf9a89
+cc81317c
+4df5d193
+85520cc5
+1a1231dc
+e82d76e7
+a8a75566
+1bf03082
+d53e706c
+72bb3f2c
+2c9d222f
+e3469046
+db283b1d
+8351b996
+1b5dadce
+d33e34b5
+179d9828
+8578e9a5
+c2f6e6e1
+5a6584e5
+ae4b68cf
+1b9c30fe
+f83a3710
+a88c79f0
+83e507e1
+6a6f7772
+2d3fb4ba
+8a00a9dd
+8ec308fc
+6f264602
+e26d4d55
+02f1ae6c
+38e9704f
+c7b50b55
+f4ead43c
+c77cde77
+4dab12d6
+58cde36e
+ff8dfb8b
+a1d12214
+979a0804
+9faa714c
+21d71b93
+fb8ca891
+7dd3ec13
+1210402b
+d114c516
+30de134e
+d4aed6bf
+f822c462
+745d0b21
+da635fd1
+49258147
+4e669708
+28cf6109
+7226c4ea
+90d6ddcd
+01589eb8
+ae58fe16
+940da9ec
+5ef8643d
+0a32f9ac
+f47fe81d
+09954283
+ed5f7ac0
+db2482b1
+af6edd0f
+9fc98721
+a06498ff
+f08ce1fa
+84a7032a
+dd35897e
+b64575c1
+0f655650
+ebbc78b6
+a5092e44
+f6de1122
+88078b95
+4b34a656
+880b4662
+7295d80b
+9e14a657
+ed6d847a
+2a7f62df
+e3ce80ff
+7e52dfcd
+edcaa3e7
+ce580091
+e18142d0
+687aedf2
+c0bd8d11
+4f4ee7ab
+77facfd0
+4d03fc5c
+291a7c04
+2cbbafa4
+afd51269
+ed168377
+eb288602
+dc2888cc
+d296d513
+d388d33c
+36cd32ed
+a8ed1a69
+b0f32bfe
+d9462a51
+1a06d925
+8ec71479
+5c0831b0
+c9503369
+27b159cf
+450d3c93
+23056e4f
+08514295
+49f8f6d0
+c65f695b
+f5c0df64
+39d3899e
+a16f9f0d
+46c94a1a
+d925fa94
+c26ce5c1
+e66d1649
+dfd505b1
+72719c09
+179c9c5b
+0a605846
+3dc4161e
+72d1790f
+e8b28b69
+f6e0a711
+008a0f1c
+0462dfe5
+0ada1635
+97f0f8fd
+3a0994f6
+4f07bb6c
+ef4073ea
+5f45d1f2
+f8b34416
+7d9b60c8
+f19a28e0
+64f2ada9
+a98ec356
+922f1718
+2cfeef71
+18e0b979
+56bfb1f9
+a09fe1df
+ba0eeab9
+eb8a8ef3
+bc05011a
+06bafb1b
+b9a86283
+31140703
+ef03d344
+3a753b84
+757d27e0
+9452ef6e
+4c05759c
+c304c76b
+0aa7c298
+8beaa75c
+bd5c70d6
+7444bc51
+12e989e9
+ac388dc0
+5db86fb1
+43a7c9a1
+eca39129
+e52b079d
+ca47fe98
+6c3c8c53
+20922717
+bb01ced1
+f67f1d00
+b4c7c2e9
+0a047b5f
+3a1a0a65
+fab4dda8
+fcb03a0d
+9d05a081
+bc1042d6
+656e512e
+20340c29
+c0cd21a7
+f294bed7
+46b5b94a
+849973b8
+25c23598
+e4419971
+55af5def
+6b421932
+3e8c1b09
+41d0f91b
+4e02cba8
+69601db8
+114789a5
+ac704b5d
+aef3b576
+1e6beedd
+a661a977
+2a3c2d0b
+36c4741f
+9ca51d92
+abf7cf2f
+e73fa8aa
+4b0bcbe8
+29b0229f
+5a9431f3
+4c99f67a
+bcf3b985
+860336c1
+1b3728ae
+7224bef5
+4c7535f3
+93c58c36
+2433f614
+1f69ccdc
+df1d6cc5
+692feca8
+713dc1b1
+affa8bde
+ac163d1c
+2b78e3a8
+7f034b52
+84cd5428
+fd358509
+4b3cf298
+db0cf8e1
+c70b9c18
+4cff2ef3
+c59b5981
+0c15d525
+a26e6c28
+877a0d08
+3eef319d
+cfec49b7
+8df716d2
+f38f1ad6
+8cf8002e
+e83517d4
+01a953e0
+20c50219
+ab0bc9c9
+732af477
+ef26fc9d
+889df0bf
+c7cbb0c6
+47cf0073
+afc0e5d0
+69f18a97
+f148c075
+d04d484f
+164e9b80
+696a9f28
+0607e9c3
+421f0c5e
+12b81da2
+86a29fb8
+bdbb8d34
+32f5183d
+e1533f50
+116648a6
+6ac19842
+820d735d
+34744eac
+6ea4b293
+c15b3b27
+ec676ace
+2376e033
+4161d52a
+a2d72a3f
+86575745
+bcaa621d
+39048973
+1b4d5a76
+a0b94b82
+de77f397
+13798f0b
+949b03ad
+b6d021e8
+6a3fd5ea
+ba79b5f7
+81b7a22d
+0cfecc91
+f0b7d70a
+fab26a41
+f370632c
+10a87365
+f48266eb
+68d8d3a3
+4868e47f
+2c52502a
+8ec07f77
+8075af0c
+d61e2a13
+d910c5eb
+e118a362
+332f7099
+b7a016ed
+8f2b49d3
+f4ae7edb
+8f13519e
+5421ac75
+ffb61047
+13a83624
+d0ffe660
+feb3c46e
+15b70811
+5ea15ec1
+233426cd
+85806c82
+3eca4adf
+7dc581df
+28de384b
+f6f8c63b
+0bbe4985
+a6a6ad50
+c336971a
+c170278e
+731fb6c9
+a08737b2
+c0d9bff4
+4562f4f5
+f5b1ed89
+0f8bf16a
+6dd5141b
+82708081
+071a7236
+893f2442
+d58d490f
+0fd6d3ca
+0c6b4ad6
+fd164c2b
+f1bad98d
+808c0b0d
+8acba4a8
+ddf2b9b0
+b4444678
+542c1f96
+5a84e8aa
+dd72f8c1
+8fdaecc1
+f2c276c3
+f1ae102c
+542b1cfe
+b4ac091d
+22531fa0
+ed7abfcd
+fe29a868
+6d9ef2d1
+754f444e
+eab0541f
+15816386
+3ccd9eb6
+08020620
+384d2714
+c62b15ac
+61b9a121
+88aa944f
+ef734ec3
+4de97669
+3671f003
+84449e99
+3d9023a4
+e0823772
+d290e290
+96d73731
+c4d74589
+d432823b
+74f15ec4
+f796bcb3
+15fcdb5a
+834b85f5
+33a1f420
+e8474477
+498519e1
+1689e4de
+3a24b9ab
+c37719a2
+aa6059e9
+925e59a8
+0fda2db5
+e408e1f7
+f4281abb
+d8971452
+ffda1616
+d4b48c2c
+def38fd9
+301d75b3
+fb6eda9a
+c6abf9f4
+e6b81728
+0ed962cb
+b8c2e9d2
+93da9544
+daa60e5c
+86e0d825
+46cd1499
+c8719c2b
+5e1b6b9d
+4aa45c2c
+a2c3e3bf
+6137bb93
+f80f36da
+808ebeba
+478c1d9d
+b556f548
+0ded9094
+921a13f5
+8d422b7e
+b4823cb8
+2f0415eb
+8836a52b
+0c794472
+a5bf2db0
+84898b2a
+f0d0056b
+930b788c
+3704b755
+982df0c4
+acafcfc8
+bddf8a43
+569ba1fc
+b4275cb2
+9137a27f
+478a3caa
+dac00324
+ab9b38cf
+8874810b
+61340085
+0239b329
+f1d8a21b
+6117a6e4
+32d7d6c9
+6e177038
+50b03903
+e4025117
+410dec23
+ff48ade9
+b2fe1c00
+c71d72ac
+e568843d
+5acccdad
+672761b3
+d54a8875
+9aa86a71
+ab5583e5
+55744cd4
+9e6dcadf
+100ca56a
+10b37127
+20debb53
+6515dca1
+4860e447
+7eb9657f
+45885e5a
+c64d548f
+e5bd6255
+b7a70408
+00f5b9eb
+4ce4d1c5
+19b5d667
+44671422
+d038a670
+e8e891a7
+44879595
+5a99ee77
+6e1fcea8
+424da7bc
+9226afe8
+2a14a268
+6683931b
+73c5cce6
+f9d663c0
+25b075e4
+b681902f
+4ae7939c
+6375532a
+baaf75c0
+55396b8c
+35f81edf
+a65f23a2
+386da210
+05480723
+9c54f8f7
+74284a43
+15469308
+7af48a61
+722ee340
+e9194f3c
+8f35bead
+6512dce6
+7c93429a
+b6bb85bc
+e047b86c
+482b97c0
+fb1d7fb9
+36103458
+d720af47
+971a416e
+bb5f54a0
+8884cf4a
+10d214c3
+354a37d9
+e3a7947c
+3e049d9b
+94a59915
+45830731
+64052f86
+48e669c4
+73bee060
+e77b3ec1
+a6a69939
+51b0c534
+3f177f53
+353b21f8
+e1c6cde7
+5766f62c
+7e28aee1
+85a68fde
+9bae805a
+3c39ef53
+8737ecc0
+cb120e85
+3ecdadf7
+ce06dc79
+15f867a5
+b9a485cb
+f10ee122
+0decd005
+b8996eba
+c41d1835
+80ec9d45
+51a8930b
+cbec743d
+04e3abd4
+c47e9279
+5614787a
+572b6b0e
+0a00b0a7
+229a113d
+4d5fd931
+73c3f6f6
+807b85d5
+ee045eff
+e5e5fb5c
+9c8f5069
+c8ebd6d1
+27e96fba
+4c3da0da
+169f1150
+f4fe237d
+1203a270
+324dfefb
+12e66851
+6dc8c52c
+4605797a
+72dbfd4a
+ed79a622
+31864932
+f8d7f193
+cf28e926
+461fd0bd
+10a66e28
+5015d391
+e3eb0683
+370172ca
+4f0a3b07
+6b98792b
+29fd6b7b
+28a15409
+ebc92e5f
+6f5e1c59
+49c430ef
+6364c8a3
+0167dbb3
+70908237
+ab8b968d
+aeabc7c5
+1d87e348
+5e431415
+d32b8b58
+316f8364
+f6f410c6
+d1a4e968
+eba16983
+7b31c46c
+ba92dd6b
+58fba838
+731b4fa6
+d8027e71
+e46d3bc4
+a56e8538
+9a14c887
+119cf460
+83928875
+3c0f748b
+6e092adf
+a9d89996
+9f56eb2f
+d099393e
+6f8c3cff
+20d25aa6
+446fa98b
+3e9b78fb
+53c3a448
+f09f93ab
+eea8dd9f
+66532548
+1eb90712
+7db2eaad
+a867f8e6
+0c0ca96d
+146a70fd
+8e98024d
+f13e0819
+c4036541
+5053c298
+756fd7f6
+541a7675
+5b987349
+d621ce3f
+e5157e36
+eba85660
+378b65de
+356f6a16
+05f98a59
+8d83ed6b
+ff674bdf
+d2b0336b
+32061fb4
+db29b42b
+208d4baf
+2cde4ca7
+5d32e679
+bad5ee18
+0f0d1d70
+f2480acd
+96e51d24
+d20f257a
+5d9dc78d
+d08de474
+84534f54
+15329ff3
+0e12e381
+bee50777
+101d02f3
+4a81344f
+c75db6e2
+a0613853
+a043360e
+a4676ba4
+7edc3aaa
+9243e635
+3141102a
+8213a764
+2de68aa8
+1057e51f
+f4917c7b
+e67ffd3e
+0a671bcc
+6a841a1b
+fc89d1b6
+e7eb8087
+30235be8
+92e4b1e3
+3241ec25
+fc53f85c
+bc19fa9c
+c408db33
+393fd56c
+559e0253
+c574de76
+8f24e8fd
+3f4322ac
+f2c6a810
+c92f3b61
+2e632a1d
+7b274b5b
+599bc9c9
+c0870772
+d037f349
+70cbea05
+1d0eb9f4
+7aa4d9a4
+1034ac0d
+80a5e23a
+67e2b595
+15204f2f
+1a00d73c
+fd81fbb4
+813b4439
+46938f5c
+f049917e
+978122ff
+3f8c9229
+4e6f7f94
+574feb13
+5f92b84a
+37a59704
+8e75b06d
+f050b1c2
+99d07cb8
+8d5674a4
+94b37237
+d6527b8c
+6194470b
+431c40cc
+12bf7f57
+236709b9
+f3a4510a
+99597fb8
+8cec3c5b
+79f223ae
+b08ca972
+ff5837b6
+1c72109f
+10d54ddd
+cb819dc1
+f84992d4
+5bacecf8
+c00ab85b
+8295d26f
+080e268d
+ba46c3a1
+c5c60ff1
+6ee8f7b4
+98761926
+cffc9d4b
+4a392807
+7505431d
+fe49babc
+f9a5de15
+9243b160
+b5699521
+e7f36786
+056d8866
+ebe989c7
+33bcdc9a
+c5abb254
+bf3da783
+8b4ddd30
+0614d25b
+acaa85f0
+6049b763
+53c569e0
+3311902e
+03ba499a
+4905a617
+a0da09ea
+32c4ab02
+80778e5f
+4da40ea2
+d22376fb
+39286441
+baf6534a
+5306324f
+26a7cd59
+8da176c7
+e2315f69
+d245456d
+0f3e52cd
+dedea45c
+99ebec87
+af42819f
+77ecf4f4
+f6505d8b
+c4ef13a9
+8fef7c31
+00a362bc
+994db95d
+afb79297
+8aaa31c1
+1174f1ac
+2bb0a227
+3cab5c13
+c14ad775
+2c58c4c9
+49619833
+ebfe78ce
+e9bf33b6
+071015ec
+83600e40
+69f67894
+6e8334e0
+e1d99b20
+3305e464
+98b478e6
+a152c6fa
+92a5822f
+5da8ab92
+85200fcb
+ff48f719
+cee8ad01
+22197936
+0bb0ac33
+dfdccb2b
+2538652b
+2d0324fe
+47ad791c
+377abb26
+31af4795
+460dec75
+fb914e97
+da1333b6
+2bbe2611
+111f9514
+878192fe
+d071c642
+271d5b6c
+0302196d
+5a05f38e
+de573b7a
+81246ba8
+c85dc1d7
+edab8953
+f356b4b9
+ff5a027e
+563c1077
+553d3322
+bef7b2dd
+1ed45831
+be9937f2
+ddbbeb4f
+4bd6a847
+cb1ebe81
+4d582443
+75c4ec95
+17b82863
+d54ecb0b
+206d3694
+eaead249
+32b77bca
+941c0bd2
+2d66e7d9
+9d3d0702
+8b199911
+3084c78b
+2b486b52
+f0910df6
+e2a38f90
+010b4748
+ecbb1f29
+12d5686b
+2d27b48e
+8845d54d
+d4e599f9
+e4f8613d
+aa7e0d02
+2d3aabbd
+93f15153
+e07ce4a6
+2210df91
+004352ac
+5f32239f
+7292beb4
+f05fa1de
+f744ce7e
+05e9cdcc
+b19f0f9c
+87140baa
+2dafbd7d
+047ebe19
+7a76439d
+bcfc5b8e
+568a6980
+1397bef2
+50d2b997
+ddfabc04
+99dffe5a
+98334731
+dc0c2eb5
+d07baf38
+ae6d9c5c
+9d3a34ed
+1656bd81
+5feef02b
+6de617d3
+b2ae3c75
+d7e9a5e7
+79b8cbbc
+73c60e2f
+b5595c57
+4a838997
+9fa7a7a0
+9bdf183e
+cd3ebf28
+cc1ad3ca
+4cff9ab8
+9f064e1a
+9e0269a2
+7494f9ca
+ea80b40f
+ac63c39c
+521c14b8
+feb10b8e
+bde37cbb
+30f8923d
+b151f39c
+c686a4e6
+4bc3e211
+310c45c8
+44f81422
+c9cea2f5
+a52c02de
+c63c4860
+c9e13b26
+b77d5417
+a8ea7dd4
+350805d3
+29042374
+ac2a6674
+d0b4477d
+4cf88e5c
+fa608050
+74ee75f0
+4b784c7f
+4fcd8a20
+5a56404e
+9606585f
+a029b14b
+2bf6da38
+5509d325
+afc96aa6
+56223e9e
+b0888b66
+e8cc6979
+aa2872ec
+5ad327f8
+5085ea22
+b74e1eb0
+3f0d7ff4
+92562f8f
+c94e5879
+ac6a6c69
+fb5401ca
+b00d3dc9
+13832203
+25f08141
+cbfc3b36
+41388a35
+76ffaff8
+06cbca6c
+2795dbe6
+33bb926b
+87582436
+603eff57
+0af60b43
+57eb7640
+eb81c23c
+c41ec047
+bb0456d9
+20ae1873
+5e228b63
+f036a5b6
+beef16d6
+48aa2ef9
+a7c73aea
+7c89058d
+0ff23d24
+3e54f2de
+5e7869e6
+fb361ef1
+81d0ea9e
+900ef7fc
+8309aa04
+2bde9f1b
+ee029c86
+8b9923a6
+6232d8a9
+a8d4711a
+a46acbe8
+9a5109f8
+5e416359
+389b217b
+8b30fd90
+13e2a30d
+e66306df
+a54711b4
+e1fa124e
+c9df465b
+07f05fbe
+9cf588a6
+f024e2e5
+def71843
+bd40ae8f
+3889905a
+d707e487
+18469f99
+8e47fca6
+005b3220
+659f2c5e
+3fc987f1
+9c5a6b67
+38d74309
+3357ae7e
+bbf76701
+684e7195
+42bb41b3
+019187b0
+0186e5e1
+fae7560f
+6077db2c
+5e5cf891
+01e777d6
+7a99cae1
+3aaae0a8
+5131d930
+62f60446
+b4df7a81
+3f7cda8a
+ef65250c
+ea95fc5f
+4a09aba8
+c4cde3e8
+b1054099
+743293f1
+1a7791b7
+f6e6e0c7
+2e96c2d2
+13b1cd06
+4a484ccb
+fd2045ee
+5627d7e0
+5efa8d38
+c9a5d168
+8fb26375
+a7633a2e
+07408417
+0f33b689
+a40234a0
+77d90ebb
+b1296b69
+731db326
+89e1e64c
+3c4ae2c9
+aeac1355
+0128620d
+eaee869e
+439f27b4
+710d9802
+cedd5bd6
+162db140
+c737b3d4
+59e681a4
+e86d1dc1
+777865fa
+5eea53aa
+776f5665
+b61f1a5b
+b6f28485
+b041b04a
+1ab2aab4
+01adbab4
+2d08259c
+1c16ef03
+69542bc5
+e15ad623
+6779aff7
+735b87b1
+4b6488bc
+b65bb5a6
+f46f5e2d
+5292ab7e
+02df538f
+d360dae2
+20e899d2
+2be3c2c6
+77c6c8cc
+02d816f5
+6596e6e9
+4563039f
+86cf30ad
+b399e966
+c92cc179
+c1f64e08
+9f2a6a96
+ae365d93
+7e348ca8
+0f99b5b5
+19f33e86
+1206a8a1
+9a9902d0
+29ea6d8e
+35b1ae1a
+bf27550d
+d87236ce
+37f6b7ba
+bd6532f7
+658a84f7
+e41271ab
+966d7566
+250c7973
+dbd6e947
+a9f1618d
+a398a2b8
+c9df9574
+647d5d9d
+2d816edd
+2a27c935
+51a8a5ef
+cb1662be
+c98799bb
+12daa519
+b706d7ea
+0c98f503
+fd94f924
+8e6c8cf8
+057d4aa9
+cf831015
+e5e1a10a
+2f6bcbc0
+ff770af1
+695998f1
+17329968
+2b0da700
+9fe6f065
+5b9c12f0
+1b5a24cc
+ee293b0f
+91e3510f
+78550b97
+ce7e3667
+c50c200c
+5f0f99d6
+bf67130a
+5d05774d
+348b5f6b
+c0673b44
+e51a1d95
+5fbf4a84
+056512da
+1cbd80aa
+6bb29970
+bf6033d4
+851ba1e9
+bc5d0fe6
+b592b5d7
+90772df6
+d99c0f09
+4acd10c7
+27c229c0
+b5f483fa
+6b423eed
+83bfc78c
+8ffccf5d
+d2173eba
+587267a3
+57325571
+10138641
+ad774107
+286e74e5
+841e145b
+cf445916
+91a6eec5
+b9bc9b86
+3f487a2a
+d8daf836
+0fb7893e
+860fc9ef
+8d04fe7d
+cbeda6bd
+c4f97e73
+946a6214
+8245f73c
+283066b5
+ad48368b
+2b524f2b
+e307e1b3
+1af9282c
+ecc5a996
+016308a2
+a78f25f7
+1fe58112
+c22e3268
+3292b8e0
+9393d331
+e05d680b
+56a2daae
+09948b41
+36e6f2fa
+b06f79e3
+118ad0d5
+a231edf3
+0ecc9ba3
+8a3074c9
+2b1b1104
+cf91a2ee
+94d7b373
+be928393
+6553ae28
+b8abbe53
+784a2d05
+0eefff36
+0fed94d1
+01171f7e
+0abe22ad
+3b5e40ae
+cac6d32b
+c8e21b43
+474808b8
+6d790484
+97131038
+788faa1b
+9ab4d6b1
+86d4ce4b
+56cff7f4
+7992413f
+a57495ac
+713ac685
+797c5039
+67b3c631
+16fe1dbd
+bbce9ffb
+bb26bef5
+a7cfe8b7
+232038bf
+e3463331
+526e49a9
+631f0045
+93b613d5
+1d4fe3c9
+981f7a30
+c66a5050
+8e2fc5b4
+65ac3c6a
+a59243a7
+e63161ce
+f10a7996
+bc3e9121
+831a2231
+ba2f32ea
+c0c7a39b
+755f8311
+db2d4359
+5ab5ddcd
+a9dcda12
+1a190c21
+6d87c0d4
+6c9b0432
+f089b621
+054b386f
+0b442a80
+f1cd3235
+8e8f030d
+4549ac92
+03b5b1e2
+9689b397
+0fd4fbad
+51168bf4
+b7f61016
+c47d8ce5
+f9e311f9
+778b5d90
+3a00c84b
+cb6c245c
+ac20c297
+b4023aaa
+a5b8b6b0
+565637ad
+967fe9c7
+7da9962b
+8c3aa0c2
+1e2a8a01
+8ac5e229
+05e5b5e6
+1ce2c3ab
+8759a2d9
+2d1e4c26
+8ec11ae8
+36a11f89
+9ff8d453
+005aafc1
+e638c51d
+023d28fa
+332e83f1
+551b167d
+bdb3fdc8
+e04d9bb6
+08de312b
+93fdfa71
+30bed612
+15109528
+9e0369ab
+288db732
+b2f2a0c7
+9872e04f
+8bfde9d4
+9bbfdd44
+a2f91f2d
+2c22cc36
+256b5c71
+7e14b290
+fc3c75c2
+9bd9d055
+c2807520
+c57bda3a
+e071b305
+14b1aae9
+9282da85
+153cc4e3
+f7b89495
+9c91bbc5
+a13c2bfb
+91c2360c
+c92dd02b
+f47fbf28
+3949fb6c
+bcae7dd8
+25195683
+07e39ac1
+ac13b0c9
+996376c8
+9be443fa
+d68aaefd
+99944c06
+355e0fc0
+6bc40863
+940a7930
+4bcf344c
+f073b53c
+5efb510e
+43f65e56
+858ebe08
+58bfa110
+ceb59ce6
+410c14ba
+267c028a
+477320e3
+e2ea5450
+210f4624
+c2b1bcd9
+04a1a565
+17bb48ef
+01b74094
+0b9df185
+88aba84c
+1d88db53
+86758547
+362e8de2
+afce40f1
+5151f2ca
+3770493f
+f98306d8
+07fc819e
+38fc4d35
+6399ea39
+23b497d2
+4f02a842
+2af5302a
+35da571c
+6aef3c1e
+a8903c45
+8a6d5f7f
+bcb792f1
+d8097e8c
+ca0962d9
+97c5a99d
+da441c7e
+d82d8ba6
+82f06a35
+03f89a73
+a14df6f7
+c1ad684c
+93068590
+dba66dd1
+13fed0f6
+a3625962
+03f7048f
+1a3b2c65
+4ade0efb
+ceee664b
+d5fea315
+74e69a7e
+3b35c4c2
+eb41785c
+978d609d
+c36529ce
+c1b18f73
+ab2b610c
+e334d594
+c4c8682a
+f8fb9f92
+ca6d17f6
+15f7d055
+c4de5bba
+de319688
+1748bb09
+6e159954
+b4851db5
+72491ac7
+b72d8ca1
+3ab25b0b
+5c646b1e
+e895d2cc
+fd90abe9
+64bda102
+2c37b1c7
+46973e83
+5feb4d80
+cf3ec61f
+5f704016
+dc671cac
+15b23f84
+be809e2d
+069df1e6
+dd39b141
+412b23b3
+0fbcd4cf
+097b7927
+84e63284
+5a0accb5
+8acda24f
+16f841da
+5bfb82fe
+c86a06ef
+ab444698
+e373e613
+e2e2fcd9
+25e6ec8c
+0d5067c8
+6cfa79fe
+41b4dd52
+449af417
+a2ac460c
+94025981
+23440ede
+644b2622
+3b233707
+efdd7154
+8ebd7aac
+15857827
+be7e65d8
+e832fae6
+ec6e1470
+97a55a46
+7ebe13a3
+1776d153
+a5d59761
+6a4fdb11
+2c1cea37
+3e0c4a55
+25810bea
+9364d972
+26f6b4ed
+cd13746a
+766102dd
+e7a94a1c
+b458da0e
+b93ac0ad
+7f43e4dc
+87acb535
+30f8aa32
+b688c8cc
+acf5f625
+dc9d47a7
+823a0874
+0b065b85
+a424b389
+06f8836b
+d7916960
+31d0d1d5
+83525307
+a5aff774
+7d594f0c
+266dfb7d
+11ff5139
+9e724f87
+7c8ae841
+a57124ef
+db8bf2c2
+bb6d240e
+ad5061b7
+58d622ca
+19254a71
+d458fea0
+34e85fee
+663b0827
+bc8707ae
+a847e1a6
+f327e411
+eeaabdba
+f09c47f4
+6c12a781
+62589caf
+0adebad3
+9e6ff465
+668f77c8
+c638c67b
+ee029252
+50038464
+f3c338af
+6e6350b2
+0757ce7c
+b9750c83
+84f42268
+b2da78a1
+3b6bb4ca
+7f787ad5
+49033934
+3bd3c82c
+3df1fda7
+5d09b37d
+47aac979
+6a3f7e72
+0835d27f
+7b6538b4
+13728406
+ab423a9b
diff --git a/models/rank/dcn/data/sample_data/vocab/C17.txt b/models/rank/dcn/data/sample_data/vocab/C17.txt
new file mode 100644
index 0000000000000000000000000000000000000000..95c0fccec81cab5dfc06bf906492566a043fbfe1
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C17.txt
@@ -0,0 +1,9 @@
+07c540c4
+776ce399
+8efede7f
+3486227d
+e5ba7672
+1e88c74f
+2005abd1
+27c07bd6
+d4bb7bd8
diff --git a/models/rank/dcn/data/sample_data/vocab/C18.txt b/models/rank/dcn/data/sample_data/vocab/C18.txt
new file mode 100644
index 0000000000000000000000000000000000000000..152a873922d041f038a54f78e11022c0624d4fb9
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C18.txt
@@ -0,0 +1,1197 @@
+236eaece
+c1b71324
+5ffc34e4
+c6c8223f
+9ede65d7
+42a2edb9
+ea3e5063
+3987fb8a
+44631fc4
+4ff1b968
+642f2610
+c3186a4a
+c61e82d7
+6ddae0d9
+cc0bf907
+afcf762b
+deab2e92
+e46d32b4
+449d6705
+f3058562
+744ad4a0
+199d5d68
+0f2f65b1
+3cbc29b4
+c2ce2fbb
+9d4b7baa
+fa5f2e4d
+181879d3
+522e8a7f
+ae19423f
+ae46962e
+9e066b21
+622ba7f1
+01f66046
+48c6d900
+9e6c2033
+dc593eab
+a1d0cc4f
+94d86dad
+ef97d1ee
+b6c1d748
+9a89698e
+416e8695
+f4b06ee7
+16f4f19b
+24de59c1
+63aa00dd
+a2157fe3
+7d461236
+7ba9340b
+082ff924
+0babc5cd
+be03c068
+93b0d1d7
+3009c5ce
+51b03af8
+be7bc2c7
+4e8be8d8
+e3f6ec41
+16631f54
+2a93f7c8
+e6f5e38e
+b7b8cd0d
+63d354f8
+9ef8c1e3
+bc55074b
+a8f42b59
+b37c34ba
+af0a538f
+2b829a72
+2f6ba926
+f2fc99b1
+f5d4d21c
+218cd882
+750192a8
+582152eb
+70d0f5f9
+2a5c1ed3
+90c88cee
+bbcfef98
+b0bcc1cd
+32d83fe2
+0f820978
+bc6f1d9c
+5d01bce4
+30228196
+205d1531
+ff87dac1
+b238b9bc
+72002960
+d51975d7
+63c4742e
+e589f353
+4cae8121
+53b5a4f2
+ea22fc4f
+566137f4
+222616cc
+6391198f
+8e8af94f
+f75ed668
+a2c1374b
+140bbba0
+e392c918
+ac02dc99
+fe49f530
+98c4d3e0
+abd741d5
+97f132c9
+51ec7a2a
+cbae5931
+6fe9da0c
+725fcae5
+42c62349
+c8d26de7
+5b5794f1
+306a7468
+5d961bca
+c04ce6df
+95f11b33
+63cdbb21
+75edcf1f
+fadc3645
+15f0df20
+e1cc12a2
+5916bc73
+e5213cd5
+95f5c722
+607d062b
+5a5b8bf9
+c63b394e
+1e4afada
+7a499c3c
+93e0e949
+f7be65bb
+def1cd2c
+66501661
+6de7a3ee
+ce500fd8
+31f7449a
+4885ae60
+891589e7
+e9a3d86d
+23fb8a75
+92c4efaf
+f1a8f10f
+e4ca448c
+f2becb37
+0e128609
+ce4d072d
+d73b34b3
+661a2801
+381bd833
+775e80fe
+38b82d9f
+c84cfab2
+385c471a
+2c6cb693
+0248f398
+157482f0
+1bae7658
+b04e4670
+2b0a9d11
+5ae61280
+bc33765c
+4427594e
+8aaa5b67
+97ec19bd
+31114d25
+e88ffc9d
+1ec1a272
+df903078
+e727949e
+12d119a8
+65a2ac26
+05f89946
+a7e6c628
+97235b4a
+28ebd359
+0b502ecd
+96549fa1
+701d695d
+b4c9444e
+4ac4fd60
+e5c40f39
+cd81e6cb
+b6d33a04
+04fdc63f
+71a94f1e
+52e44668
+b486119d
+00ce28f6
+269527d4
+409455e2
+474ecca0
+e37d2c6d
+7549f127
+c6fa25f8
+b3e92443
+06762f0b
+94ab5489
+7ccd2973
+62acb0f3
+a32b8daf
+26ea54dc
+be645006
+9f591d04
+ca8583cf
+e703a375
+d942f032
+f2f9ae5a
+b0a66c0c
+62c2b7f7
+74fc71da
+c786d1ea
+caf3c615
+335ea8bf
+c0424025
+e310ad06
+4aad5ba8
+b5c23ebf
+02607769
+e5f8f18f
+6d6312b5
+bc48b783
+fc923ad0
+658dca4c
+387f058a
+966c77d8
+5aed7436
+4172ab05
+e8623312
+38dce391
+be457d6e
+1866f3bc
+43a9e4b1
+135390ff
+cfa866e4
+4c6aba3c
+aab53bcd
+89a6dfcb
+bc5a0ff7
+d0e5eb07
+a3e3ec9a
+6c913fe0
+6e046711
+1ed5bdd8
+91e72260
+d632055f
+16bddfab
+a5ac4b1e
+35a9ed38
+df00d249
+e6f0d720
+f724634a
+0280dd56
+6ae8c28c
+4b17f8a2
+a9a6d3d6
+c1299c0b
+102bad49
+4bcc9449
+be2b722c
+aa6ed13f
+6fc84bfb
+57c8dedd
+a3d71525
+b8c1a741
+69a65434
+08d65fe0
+da9d18bb
+0b331314
+78808395
+b2d0d922
+8fc2e6f8
+5b22094b
+dbebbe86
+037e5c3d
+4873986c
+d5e4889c
+b21492d9
+79d7c2c9
+5b17ed49
+a863ac26
+0d40d1db
+717b8551
+9f169174
+c3a4b048
+d452c287
+874ecd52
+4b519f6d
+79a92e0a
+5e8bcf8b
+4623da58
+12195b22
+fd3919f9
+713e6091
+908eaeb8
+97954b01
+cb2feb6e
+5d1eff8d
+8ecf282f
+29b0e3e5
+0d6480a8
+af13ada3
+d6be9f7b
+7ac00b77
+0fc6cea4
+a2c19c3c
+d3d245d2
+90ad4528
+a086cf41
+b76fb0de
+3e50aadf
+38748bc3
+b4abdd09
+b0a0fca9
+babd84a5
+912c7e21
+240f5971
+0c52d809
+b79acaab
+f6942533
+f0ad6738
+f207eeeb
+7a824e68
+03364bd3
+1490697e
+0a106e05
+88416823
+4e6b896a
+6eeba0b0
+303a3ab3
+35176a17
+4390975d
+3c59b550
+06c23e12
+065917ca
+c79539f7
+cb5f60e5
+2a64e498
+779fb8f7
+e3d9ff6e
+c196a249
+d87588c2
+26b3c7a7
+f559cb8a
+213ffd91
+6477f501
+df4fffb7
+89b242a4
+abad9dca
+43e64261
+44e3f649
+86f93336
+45aa217a
+3d8fdbaf
+2e171ed0
+e74ab908
+d24f3c5b
+cb95e657
+ae288801
+bf4e216d
+2939761f
+2a92b119
+cdccdc82
+6edd3a0d
+ad19d8d8
+dd4c6357
+7d8c03aa
+45af748a
+ca533012
+86f1f0b5
+ed8965e3
+afa6ee61
+1263c077
+ffd53157
+cac48684
+ce18773b
+ba0e319f
+84eb7a34
+1cece7d0
+5a6878f5
+78db103b
+60a331ee
+8019e4db
+53d8aa6f
+30d1165e
+db2a6191
+f54016b9
+a69461b9
+ed43cc6d
+3ae505af
+16bee6bf
+53367220
+6ef500f4
+f8c14a22
+1f868fdd
+c342ea0e
+d29b1174
+61f32b0a
+73b57b6d
+281769c2
+48dc5aca
+ff809076
+fc16085c
+526e8765
+a628cd1b
+f8078ad8
+9cd2305d
+e1174b91
+69e1d4bd
+ade68c22
+665a060b
+8c6bc614
+fa0643ee
+f0524df0
+0e2b2aec
+be5810bd
+b49c9b63
+d9be4b29
+3e3ba623
+4b49363e
+6a74e011
+e01eacde
+b4562df2
+e712bb73
+d0505ef3
+51dc528b
+b3e4ddc9
+dcda5a9d
+b2b0c96e
+1cdbd1c5
+e43cd1f6
+d6301f08
+4cfda94a
+963139a7
+0c1cbf43
+41d2aeab
+f8bea632
+cec3268b
+4539f136
+4d641da4
+61d44c4a
+ecfc6e86
+49e22b38
+d60f3983
+63e4be9d
+24a3debb
+15bb350b
+108eb01c
+1292d860
+401f44c4
+09186e1c
+bdb9e553
+698d1c68
+fe74f288
+048d01f4
+83a18735
+f8babbef
+f93845d0
+06747363
+66fcfcc7
+ab78c7f7
+5742e45c
+341f23a1
+02e8d897
+817481a8
+8cd9b2d3
+371018ba
+5de94169
+deaa119d
+e015d952
+5214a8bc
+1c381aea
+1e9aa8c4
+c23adf9f
+7ae26406
+832bedbd
+003d4f4f
+1c130afa
+3f6bf69a
+c9da8737
+47ecd034
+f9edfcb8
+b133fcd4
+35edc425
+fe94fad1
+e32bf683
+dca44521
+8222ff64
+8e950a44
+257796a3
+c4eab330
+30f25b5e
+3e340673
+56ee3543
+6ffd3334
+8ef31929
+6830a76e
+cc693e93
+35cc0917
+3735c118
+47e4d79e
+70eff6a6
+bfeb231e
+a673f1fe
+5fc3b9ce
+34ab890d
+0b59e971
+627060bf
+2ae4f30d
+0b75865c
+31d0e6f6
+cb9ae335
+76ae8aa6
+f7c2843d
+02cf5d49
+d4328054
+c402f369
+3f10e4b9
+cbd13810
+997cd4bd
+0c4e94df
+43c6ac42
+eec58038
+da72d4a6
+96af95cc
+570391ac
+faa19495
+e14dc045
+3fb96179
+3ca7ad0b
+c6c687be
+27b1bfb9
+1b884e69
+2a40f0da
+6771b543
+82816173
+238d8de3
+30adf650
+d5d1ae1c
+23a483c8
+ab7d03ab
+a7e06874
+cd6c876a
+defd44e8
+32224310
+05c7f49f
+5dce8960
+be07c275
+901e19dc
+252f0dde
+ac7705cd
+92555263
+311e1ffe
+d4a314a2
+0a2d848b
+1f98b1e5
+f5f4ae5b
+e7a96a4b
+45e3284c
+9fb8fc2f
+3a2028fd
+0545cc03
+f65bbbd5
+8f9b4e88
+0b25643e
+21eb63af
+eb0f5533
+cc464611
+1996f15d
+a2926ce7
+bcc05e92
+a30a18f8
+cf1cde40
+04d863d5
+a1ff8a24
+44fa4e35
+4b0f5ddd
+ca5a79fa
+934abd6e
+9c9da140
+74ef3502
+8687deed
+5cd2e5a3
+e138f81d
+370c59e8
+7da6ea7e
+ab5ae14d
+14a8c0a7
+da507f45
+bc836583
+faeb6b69
+5681c2c1
+1a66fb6f
+97b81540
+51360aab
+65979fb7
+9378e2ce
+e85b49c7
+1b71d04b
+68de516c
+7e32f7a4
+b2e570f5
+c1c7adf1
+d4aabcd5
+ab194a92
+1e3d9f94
+cdf5b4bf
+562abdc5
+cdfa8259
+59718f2f
+9eed203d
+595e9ddb
+5162930e
+5cd35b65
+07ae81f0
+0a20b09c
+2ebcb279
+659bdb63
+9e4517be
+1616f155
+b1f2cae8
+eea3ab97
+87c6f83c
+2cad38b8
+824dcc94
+9f1e2f58
+d495a339
+3453b1e4
+da738eed
+808e7bc3
+40685634
+bbf70d82
+bfd76bad
+ea9f4495
+c24ac50d
+5cd0443b
+6a2d2873
+e24773fa
+d4df8873
+8f1ba373
+94d8f2d9
+718ffb5e
+4b340164
+7b06fafe
+b4cf6245
+9a0908ac
+8b61249a
+331653e6
+a78bd508
+a53934cb
+a70ca30e
+c1e3e8e7
+9dde83ca
+08154af3
+752d8b8a
+a62ebaa8
+0705fb3d
+e96a7df2
+f2f1286e
+e12da5c8
+7ab54100
+5ba7fffe
+bfc6a90e
+67007047
+46e9560c
+5911fc7e
+ba67aba9
+130ebfcd
+b2879faf
+41f61cc5
+d1c83925
+492675bb
+7c3c801a
+7c21218c
+9e8d3bce
+20b2cd03
+45e764d2
+19ef42ad
+d942e999
+e8f3ce77
+ab368164
+836a11e3
+b4270c3a
+641c905d
+f42c5fb7
+c9fb83bd
+395856b0
+9a1cdd7e
+191cff01
+8fbe065e
+7a71d8da
+98ff11f4
+f3644223
+893857db
+699f5baf
+e161d23a
+73a47204
+08d20fe7
+7a8e3247
+e84025ed
+637510ea
+6217c532
+9df49ecd
+5447182f
+f68751cd
+020c927a
+40aab586
+e261f8d8
+dff11f14
+d513f9e9
+b608c073
+14012d9c
+8cc913b0
+bbe2f8c3
+f190d96a
+9b82aca5
+426d4e67
+7181ccc8
+eb4d3f8a
+8d18bc02
+5f2d60ad
+c68ebaa0
+ca5a75f3
+7955b5c7
+cc7032eb
+58fd9158
+3593e33b
+43de85d3
+e7648a8f
+d20a7fea
+c90735c5
+135c06e9
+a5bb7b8a
+3e7350b8
+ae09efbe
+8a25126c
+e569651b
+a7a5c186
+7ce63c71
+5080e3d8
+3cb7e3f0
+86b4fc22
+5ccbf8bc
+3e9b1322
+04c62c3d
+8f445203
+2c65ec06
+c04654d0
+2bbd3980
+c9bfd921
+813e0639
+01890ebf
+66c3058a
+31941843
+df2589f6
+29907d97
+97029569
+6a756ddb
+423679c5
+2dad8067
+fb342121
+6067836d
+adad417c
+167708ca
+3412118d
+6e245777
+6e200add
+eef7297e
+a00a189e
+15a4e6dd
+bf6b118a
+f4373605
+d5288836
+e5477d35
+747ad3e2
+3d513ec7
+f8286ab3
+ac4787ed
+8151ce98
+cbadff99
+01cb4e6a
+62274968
+75a3693f
+44992f00
+8814ed47
+abc718ac
+1cb978cf
+ca6a63cf
+5d93f8ab
+0e393340
+2804effd
+807ea8b0
+d3303ea5
+05773134
+d8ce605f
+2b0916a3
+d245c200
+a7cf409e
+5ab59e5b
+d53925d9
+7eb5f96b
+1999bae9
+cd231a7e
+12869708
+07070d63
+42a7e7a3
+1f9656b8
+fad5dbfd
+be70385c
+25c88e42
+85910cb8
+0bcc943a
+f6707d4d
+f9b74a64
+f54fae70
+c235abed
+873af5c4
+fc35e8fe
+96d926ee
+f699ac01
+870cf59d
+6c62e002
+c7cf2414
+a33fb37b
+1576ec18
+c3854c72
+2f7ca015
+3b659b79
+124c6b00
+7face29f
+5705c078
+e52d145b
+fbd943f3
+195c811d
+43c2c7c4
+db8d9382
+98bf96cf
+3bce7100
+ff62f93f
+4cb86eeb
+bd17c3da
+1f9a787a
+795b8402
+36020264
+427f76d8
+a9cad496
+b9f82cf2
+ac2846a6
+955ed1fa
+e089d5fb
+686da8a3
+6c5555bd
+61efab44
+331176b7
+ff58a873
+a00e54d0
+9469f352
+d2f0bce2
+4ab7652a
+0c63bab5
+35ee3e9e
+452e336e
+d94f84e4
+26692966
+b5dbd3fb
+4fc40d0d
+f03e8b05
+84abbf61
+de2a126a
+3dde2dc8
+9d83e7f2
+2c698a26
+bd9a37e8
+f0e2be93
+366370e2
+b07781c8
+8832437e
+006aeba4
+f57e4608
+a6f5dd38
+a5b84a06
+005c6740
+0edc7d17
+a0f3a3dc
+58681afc
+dceb92f5
+99d64e00
+8abf93c8
+62b2e723
+3f9969b8
+3ff18c32
+7abb2837
+3182300e
+1e9e2790
+1ea311a2
+2efa89c6
+7ab25cfe
+431b9468
+4961a65b
+3cb7741f
+af035b0e
+e7e991cb
+6fba30f8
+1f8f8372
+bd246965
+616484f3
+2585827d
+112d0327
+9943b99f
+479030a6
+8b2c8140
+82e4ff9f
+53515e19
+a4667218
+c587eafc
+94008e9d
+618f779f
+9d3171e9
+fcea3412
+fdbdefe6
+88daae07
+ea3d03ad
+a4dd5669
+70e5bba7
+2e39068c
+3cbf782e
+15fa1f82
+01125673
+2c98cd83
+17a67eff
+1a9f6745
+b6c7c5c3
+9bf8ffef
+65bcc8d8
+15a36060
+68983471
+0b14a1aa
+540a1cbc
+a58f62db
+69d8c303
+7119e567
+37ad08a7
+60502883
+5ce4203e
+0533e21b
+ebcbb440
+24fead2a
+58b6f121
+7cf65c5f
+45de52e0
+5d50ce9d
+bc95fbcf
+2787cb2b
+4854928e
+e1bdfb13
+88b0e440
+bb983d97
+836a67dd
+f855e3f0
+1ba8cd7e
+f6a2fc70
+4771e483
+6e3d8cb0
+166a4729
+3d033dfe
+204e9bc1
+3ac30845
+cacb8db1
+88ff59ed
+d1464cb7
+f92d697a
+1e42ba17
+72175246
+3ffb873b
+7b49e3d2
+b182a697
+c25374fb
+b6b880ec
+5ff6b040
+a8df06fa
+acd948bb
+87fd936e
+9880032b
+35901cfb
+d2f29bd8
+f64c5d3a
+b34aa802
+f953a972
+e8bc4abf
+e1e3d16a
+908f310b
+0c425168
+0bcb7dd3
+1a47ef6b
+280c9cbb
+b81a5022
+989821a4
+0f4a15b0
+784f00eb
+a45ed116
+4431064e
+de8971ff
+8e8b535e
+416587b1
+436a5570
+002dcfdb
+95e4ca74
+ef981aa1
+de5b2875
+87105aa9
+36a1d942
+426a24d8
+3598c4a9
+35ecb99e
+fffe2a63
+43dfe9bd
+08ed8a1c
+fba30a05
+d3e54c7f
+07801acb
+ec24516a
+7458820a
+1b8d9b2b
+12c649f7
+a7ccaded
+90b7bec5
+519b56d4
+7f98f2d9
+f2a4ebbc
+3e608631
+09518cd0
+29b9d89a
+5a87d8e9
+3842635d
+c185129a
+7ef5affa
+5c4b15cd
+bb84f5b3
+ab9c686d
+c0b5f1cd
+13145934
+906ff5cb
+ae4db229
+118f3bce
+78e611cf
+51369abb
+63226d20
+e262a40b
+6777eb08
+fe623d4e
+1445a5e9
+65cebfa5
+47b9d9e5
+2b46823a
+fd673b92
+fd0ec440
+5e5b3998
+4903cd40
+93444c11
+15c0d0ad
+456d734d
+1ba38918
+4240e004
+c41887ad
+03afe4f8
+24d4558e
+05719980
+19fec6cc
+863eba0d
+d1605c46
+42076ccd
+f1a41ea7
+6c8360a8
+c15e5a62
+a573334c
+0ad1cc71
+2ef5e1bc
+e1b6ea80
+b75b4438
+67bd0ece
+cfdf1056
+04bd1bd7
+d981a095
+c08a0c90
+9397091a
+6ef5c311
+caebc32e
+c191a3ff
+c3c7460a
+dd489dad
+bdc06043
+a10c0817
+86c37593
+c4ba79ab
+42235923
+cfb045b5
+daaca6f6
+0f9cf934
+6ecd217b
+670f513e
+5bb2ec8e
+a56e3253
+9480132a
+ec7a9588
+83ef104a
+f5508183
+82c4bf6e
+9221d3c3
+821c30b8
+bbdd12dc
+fde97152
+87c9f30d
+bd6b4cf0
+2ab4da9c
+0a6e5453
+e7c97dee
+f59d3e4b
+57598e25
+0ebdbf46
+c2d5aae5
+befd6e69
+da7ab2f7
+c3d46189
+a896fd82
+004fdf10
+c21c3e4c
+d5a8843a
+8cf6783a
+45e58044
+1c4b4926
+3d6594d8
+65c9624a
+44a0d901
+c8219a71
+5cebaf83
+c7dbecd5
+2170da5e
+e05e805d
+1304f63b
+a1654f4f
+48970815
+6305316f
+20157126
+b3fe34a4
+20e28e86
+a866c2a1
+9d40190a
+25935396
+6f5ec0c1
+caaa7e73
+19f8590c
+3868b983
+cc793350
+faa6829e
+c351d4e3
+876521e0
+c9ac134a
+745169bd
+b37f5bce
+79c48790
+ede11b1b
+262c8681
+0ccbf1dd
+a30a2069
+8f0f692f
+fb299884
+e90118d1
+9efc6015
+9b3bdc6b
+7e2d7802
+09d9daf5
+64da4141
+d9942b4c
+4ae65742
+bab43a1a
+3fb55a52
+7d0a2593
+cc0b0790
+6a6c29a9
+8950eb26
+3a165c37
+86b4c7aa
+8db09f74
+0ac416c5
+d39e7a70
+0f2f9850
+ecf26b99
+52b872ed
+3c4f2d82
+9de259c3
+fc45d11e
+d2651d6e
+6a58e423
+561cabfe
diff --git a/models/rank/dcn/data/sample_data/vocab/C19.txt b/models/rank/dcn/data/sample_data/vocab/C19.txt
new file mode 100644
index 0000000000000000000000000000000000000000..fd141ed95d49f963bfd871d6dc3ac4219b57da0a
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C19.txt
@@ -0,0 +1,584 @@
+5520c45c
+b41b58af
+04de9d96
+1c2a036d
+609029d1
+2d6b8eb8
+3b422a71
+0f265be8
+f6a8df04
+3eb2f959
+b009ee50
+3be24715
+54591762
+65b79aa2
+7839a083
+f3c1a470
+3f7ee9bc
+e51f040f
+74f2710c
+3d9b755a
+c61b9367
+4632bcdc
+a04181c4
+7d7d0ad4
+5048dbb5
+82390ba4
+b85fd9dc
+6f0c4cf7
+b2e77b63
+d3ff1b16
+6a3f73bd
+872eb0d7
+b6799750
+7c629f16
+9908ba56
+a9d84a0c
+eb30f6ed
+2221c689
+389c11a8
+f1ed3100
+700b0c57
+c708d93f
+169e9700
+83a6a781
+9a9aaf67
+c361c1be
+6fdd6e61
+289ac443
+1d1eb838
+c453eb25
+8633b12e
+d630e5f7
+c60a4703
+97f9ffcc
+6f3756eb
+ec1695e7
+6678f597
+d785f37f
+fd577979
+9b19e0d9
+da6e9c74
+158cdff9
+c9414bdf
+3aae8792
+dbe199cf
+e3b5ceb7
+54156659
+8279049d
+841389cd
+cbe37417
+399d5c3f
+410d7406
+6c6e7412
+200779d9
+dd2d8ada
+23a4bfcf
+92524a76
+8c604d64
+c393dc22
+c4aab396
+5b1d6ed9
+34fce22d
+75647e0f
+7e8c642b
+d913d8f1
+18259a83
+0e8004a0
+f4cb0c7d
+5faa1322
+9151fd50
+511667fb
+77f48d1c
+a88b4268
+08dcc8dd
+d951bdce
+909bcde7
+0cc116d5
+4bd8c890
+1b1b9309
+3a1e0f5a
+f33fe217
+ae44ba4c
+5fed1d47
+1339d4a6
+3ed24bb5
+44a71869
+a02e9a1e
+920f5f20
+d271d52f
+b1fb78cc
+0421a5ee
+670dadca
+dfdebe48
+b971ae39
+db709382
+757bb18e
+75916440
+e0237f36
+73006510
+a35e3db3
+4b1019ff
+03a07a24
+f44bef3c
+f9730b28
+77129a30
+13c9db1f
+10a826e7
+0b8cd6e5
+28125b9c
+69578b6b
+7e57d0f3
+97fd470f
+566c492c
+064f1f80
+55f74743
+45d6674c
+83552c76
+d8f18599
+b7380686
+c79aad78
+e4b30de2
+05e4794e
+ea727e37
+062e2590
+72bdebf3
+2a56f4a7
+39e30682
+d989bbf7
+a73cd2e0
+423fab69
+9890ba64
+5b5de845
+733bf73d
+061feb43
+f7b3cf28
+a2c503f1
+7be4df37
+9d523618
+dd25bffc
+a8617aff
+58a67fbd
+473e5032
+153bb189
+69ff7363
+de2c6717
+fbf39fb5
+a34d2cf6
+8cd56d91
+25047f94
+4322165b
+6ef46a75
+72592995
+e14dee06
+e142ee46
+af8bed6d
+b1aed9b3
+8b3a2728
+03e9f0e8
+4a237258
+d13f30e6
+7d551238
+93d183db
+b795e71a
+641df83e
+edfe7dcd
+c68db44a
+fb3de65f
+c0a97214
+91d8fa7a
+650dc442
+a2a24dde
+738584ec
+1d474f25
+d9aa05dc
+bb2479f7
+54564c1f
+5a9b3868
+071c1933
+825112a1
+5b885066
+790d09cc
+ed2a0c35
+4e1ffee8
+89c06616
+3e0910c1
+7866b2f7
+72d4f58d
+5b9203c4
+72719375
+0bcf2bf3
+382b1e40
+b04164d1
+e5093397
+5fd56cf9
+dfc341f8
+45455b38
+ae25bd5f
+1439e83a
+0b86bc18
+2e30f394
+1bd5359f
+f005689f
+b6abc58c
+68c36492
+645ae380
+21ddcdc9
+afafffd7
+2ba39771
+a105e305
+08fffa8d
+46f1d1b5
+8eadf6d6
+cf99e5de
+91066f77
+9ff26bea
+df1de58f
+3d29bf8e
+185d5000
+2e631106
+d7a14759
+c651ec54
+e847a21e
+943674f2
+05802368
+083e89d9
+9437f62f
+479655e4
+aafe3a09
+f6a3e43b
+cbf94c6a
+4764bf77
+52f9e685
+a7ba9291
+19050a44
+f7090e1f
+95709483
+bf212c4c
+e22752c5
+34a10404
+55dd3565
+49175026
+c31bf66a
+32695916
+195e20a0
+449c7d20
+6506b94b
+5d097c5a
+a18beb4f
+966f1c31
+65a3ca24
+b287721a
+cb9d08a1
+e5e1bbb7
+2bb27f45
+b24872b9
+0d7273b6
+4c01641b
+c94fbfbd
+ec607592
+3dd03255
+41fd0e18
+555c64f6
+712d530c
+c63ea0b4
+37c0d2e7
+3fe5a574
+a2eea4d0
+a4038074
+c220703e
+9bc2b29a
+d1756250
+38ad7000
+09243db5
+a8ac0fc0
+b034efb6
+54f1657e
+5af15ee4
+2f4b9dd2
+8c0ec819
+8c2c7ffb
+b8a0b6d7
+c4209246
+1a7bd53d
+17e184e7
+fba3b75e
+463a84e9
+1f4e7222
+2b47c6cd
+a153cea2
+1e6ac7d4
+ba92e49d
+610c11f9
+3cc52e83
+6f86de17
+84fc6c03
+915c3457
+f0e46a5e
+d16737e3
+8817e1c3
+c5849342
+65c9d3c2
+fc134659
+4cc48856
+dcb7a0e1
+601f1f3b
+e77cc843
+02883f9d
+81551f75
+d80b0064
+55c3a60b
+4bc5ddcf
+102512e0
+c5cd97b1
+53d07655
+1a9b8580
+1ac39a24
+f51eccb3
+ed601ba0
+2c434f3e
+afd8410c
+e8a609b0
+8491d222
+2918b658
+f2edb1f7
+1ecb6a20
+4aabab64
+64d8d024
+8b067084
+050f9eb1
+b656d252
+a9084bcd
+26e97973
+5c2784c9
+8f80afeb
+4c3f8eb8
+5e5ff12b
+108e6f07
+881fb6ec
+e87bdfc8
+d0a7991d
+8cccb387
+fe22cc73
+c703dc9e
+a0dfc6e7
+71d82dea
+bb8d42f6
+96794921
+fbce4e45
+2c12be6b
+1ae44294
+b99b8dbb
+cd11c728
+c5f356b7
+1176e15b
+39c70805
+67b50e6d
+83072916
+be323b68
+d7b50177
+43cfe51c
+cf77ac68
+52f9ebfe
+361d380e
+9653bb65
+8a319e43
+2c6a63f5
+cdff2091
+d928a345
+e78e0c47
+b3824928
+aecf55cd
+1c3eb40c
+a84a01e1
+83b188af
+ada95f0f
+deca810a
+35c9aac8
+a32d11fa
+c2c9cc0c
+45372acf
+ff40648c
+e27c6abe
+b6862699
+8c8dd18c
+1bf8725c
+c9be8835
+ce199c86
+a4c44aaf
+898a0348
+38ecb055
+c7dc6720
+9b99c7f6
+84dfb479
+97b66ca8
+7734aa05
+d0dc53b4
+410499c1
+662641e6
+b87498e2
+ac1a2337
+77e4599d
+c5cce658
+49b8041f
+a8dbd7f0
+73ba0e1f
+8583a7da
+8733cf72
+5a7ebc3d
+f455cee3
+3fc9f59b
+1d04f4a4
+af7c4727
+647e63d4
+ec05e327
+dfb213f6
+c3c4c8af
+2b558521
+6e2e2cbd
+0bcb00bb
+efd98782
+83236299
+7a471cd5
+d74ff162
+49463d54
+65e326cb
+e809b8ed
+abfaf938
+db0b20dc
+4b403ba3
+db3c96e6
+11247e20
+f30f7842
+d3fd955c
+a231b7eb
+7324dc45
+5837ec60
+85684dc0
+338f20de
+16788d7e
+7368c400
+e9a984fa
+c09ad765
+6301e460
+315ba0e1
+82368759
+bfae7bcf
+656485cf
+f08320ef
+edb3d180
+a85e795a
+0053530c
+9bcb61cc
+1608be0c
+01c1c86b
+81d9a273
+6fb9a07c
+2808950c
+bb16f112
+07995af6
+ea132b7c
+d779b884
+3d1013ec
+3f2c5984
+f67dba39
+2030fcf8
+8e453acc
+ed9c25e8
+4b0c4f0e
+40af282d
+276c2365
+7a45f7f2
+d412b549
+8647a494
+26de7434
+18884750
+3a55d521
+b386900c
+7f638078
+7caa15fb
+19b31d2c
+6f62a118
+68628d59
+fb9c5daf
+b171f800
+65d4fce0
+bdffef68
+11f12e8f
+ffc90f70
+5e89f4c8
+28f6d2e0
+b8868b4a
+22cd7c5b
+9902b8f1
+4cafff97
+5ce524d1
+8aff387f
+4114b273
+115660c3
+bb71621a
+2e7e0472
+06035f73
+db0cca70
+35f2504b
+59e501fc
+60b01a98
+88a2cb01
+c584e779
+2efde463
+a3c895d5
+c48b5b0f
+cf2c2521
+b2166334
+c27239bd
+a8d7417a
+3014a4b1
+3edfe27a
+b077911b
+0ec8d23c
+f7dab263
+d0289910
+54e53a39
+c9b4dc46
+efa3470f
+24ebf1da
+79849f24
+e8a57b45
+99f90f6d
+7fb78749
+cb6b4a8b
+cfed19f3
+c10eb73c
+28e1676e
+2ed534da
+47e67bca
+5958dc97
+9616150a
+c51f3181
+ff6cdd42
+a9e22034
+4b46b36d
+1d78ea79
+6a342af1
+fefc1f4c
+af1445c4
+7a6e31dc
+69d52f7b
+f8443395
+2442feac
+11b8c1d4
+bdcd34dd
+cc4c70c1
+b8ec925a
+b93aa927
+245e2acd
+419b4cef
+a74245e7
+e29e5544
+c85b2f47
+790f389c
+c1b929fd
+5cb64673
+cdc8f390
+27ab133b
+b9a905aa
diff --git a/models/rank/dcn/data/sample_data/vocab/C2.txt b/models/rank/dcn/data/sample_data/vocab/C2.txt
new file mode 100644
index 0000000000000000000000000000000000000000..26f550d3753641756f72f4afbb6bb503e951f32e
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C2.txt
@@ -0,0 +1,422 @@
+6887a43c
+00ac063c
+c44e8a72
+58e67aaf
+f3139f76
+39dfaa0d
+a244fe99
+33424936
+1cfdf714
+cfcdf408
+2aee75a8
+3df44d94
+85ba09f0
+d833535f
+6697ea82
+c01b42bc
+2607540a
+1329e144
+46bbf321
+a0e12995
+d9060812
+b26462db
+064c8f31
+b71e0773
+b06bb7e8
+3458ac28
+6ffd49b0
+73b37f46
+59ab477c
+2fcf9bcd
+efb7db0e
+e1d6d0c5
+1550810c
+270cc1b8
+d8c4e58f
+102fc1ca
+a05721f4
+47e8ab98
+c6000c21
+bdaedcf5
+dde11b16
+f40dac87
+8e465f4d
+78ccd99e
+8ab240be
+cfb45bfd
+a90d6e47
+7772ab29
+bfdcfc4a
+56067431
+40ed0c67
+d5b3b632
+65265295
+3f0d3f28
+beea1002
+06174070
+a07503cc
+9af7d014
+89ddfee8
+fff6365b
+532da141
+55e0a784
+52e9ecfc
+e241319b
+fc1fa80d
+10dccf9d
+fe5f56f0
+7008ef6d
+d79dbf86
+4c2bc594
+f6f4fe4b
+0aadb108
+e6b41dab
+1cf8ffb9
+6c9c9cf3
+1cd31e7f
+e3b5d099
+051a26e5
+f988e09a
+802009dc
+8cc9c66e
+6f609dc9
+9e5ce894
+0d6ad090
+1cddd76d
+2705da39
+9ac2ee61
+0d3cb2b1
+094e1902
+999aae35
+e112a9de
+097e9399
+c5c1d6ae
+cbac4677
+e04f8851
+9adf4cf9
+62e9e9bf
+df6efd70
+3e4b7926
+d4bd9877
+637af2df
+30bf41e1
+a90d9954
+2ae0a573
+028bd518
+b56822db
+8e4f887c
+98269b3e
+4f7e8e40
+68b3edbf
+8947f767
+681c0354
+db2905e6
+cc4a4a1f
+9a82ab91
+57a04a1a
+77d5ad3a
+e6203a55
+bb9f48d7
+63e44a0b
+5dac953d
+30af1585
+8084ee93
+56eca72b
+a8382141
+302debf8
+13f25995
+31eb7ac1
+aaaeb481
+5d9e94d4
+bf9174dd
+f5384b37
+c8687797
+7e979632
+a8da270e
+a016abf8
+09e68b86
+980cc9df
+7cd19acc
+9b5fd12f
+dd8c896e
+92a5b341
+2efdbb44
+5f06ed0c
+023a27f8
+8ac34ffc
+ae46a29d
+7bd1349b
+6d4d5095
+91381efc
+f9875f50
+e18b1e61
+0b8e9caf
+8db5bc37
+93c5dc56
+5368c225
+0bf920cf
+8dbd550a
+403ea497
+bccb7a1a
+2f86bdee
+7e899df2
+ad828524
+9f7e1d07
+39c6751e
+ed7b1c58
+ef01ba63
+6c2cbbdc
+d4e5ee28
+9bf4fe35
+287130e0
+9b6b8959
+b7bbb864
+d4be07ad
+bf7a2333
+2796cdff
+e3db0bac
+14a8f8d6
+510b40a5
+a3659c14
+e9ac66c5
+cc8e236e
+8b0005b7
+78c51e9e
+c5fe64d9
+2a8248f5
+482fe41f
+a7e699d4
+26a88120
+6c713117
+26ece8a8
+f8c8e8f8
+0acbc0bb
+2fe85f57
+d8fc04df
+4bb4b657
+bce95927
+d4ef6e5b
+95e2d337
+f0cf0024
+9bcd4a15
+d57c0709
+42016cda
+8d42d2d4
+c1384774
+24d41293
+56273427
+84b4e42f
+c37e7940
+4950f9dd
+c3d483bc
+cb89a94d
+16edc335
+8cafcda6
+27f70843
+7a1bc0a3
+537e899b
+c1d8ea5b
+0c4bf847
+1287a654
+ea3a5818
+9819deea
+08d6d899
+e9b8a266
+603e7ec9
+f3b07830
+2c16a946
+90f4f36f
+b2659ff1
+a984ae94
+b0d4a6f6
+a3397841
+f3f84ffb
+70a1db74
+6496eea0
+99679f51
+cb3359a6
+8f5b4275
+09bf88cc
+b3f3d753
+24c700ac
+71ca0a25
+784f09f8
+faefeeb6
+876f5f5b
+92f101be
+c2004197
+31520db9
+558b4efb
+111631d5
+08c2f5df
+a7147fb4
+a0baa1e8
+c319674f
+5b7b33dc
+4883e3bc
+4bbb9391
+aa8fcc21
+f5e11606
+6ff18bcb
+791f3f76
+8b57fabc
+b723f84a
+d7988e72
+6e638bbc
+3c53160c
+960c983b
+a796837e
+069b6d24
+ca843edf
+bc84a2bc
+6f115d49
+a40adb47
+172de694
+d1f70341
+bf7159af
+2a69d406
+0363d860
+7b99bba3
+5ca60b73
+207b2d81
+221a0666
+e77e5e6e
+04440d29
+016cbb4f
+ce2a3d4e
+c1c79489
+ad88539c
+318d2c95
+3491b642
+2c2fd77c
+38d50e09
+33728ce9
+298d0556
+dcb2560b
+f234d60e
+f7ab4058
+b78edce9
+edf9ca84
+85af3139
+47feb5d7
+2eb7b10e
+46320fff
+9239554a
+ae51005c
+4f25e98b
+aa157e5d
+0eb070fa
+6deb1348
+af447d7a
+f53b0269
+73a46ff0
+cbafb605
+1e327ff6
+404660bb
+0468d672
+38c81d1a
+76c475b1
+71b67965
+942f9a8d
+ad61f1c8
+dda1caf9
+51d3128b
+52b2f87e
+b46aceb6
+bc478804
+b7ca2abd
+f1bd57c1
+9c6bc64b
+8d406027
+5df29557
+762b9a6f
+e3a0dc66
+ef69887a
+3e25b403
+1bc2d3e2
+2c8c5f5d
+5a88f1d5
+4e8d18ed
+e5fb1af3
+9e681c70
+247a1a11
+ad4527a2
+04e09220
+25231301
+3e50afd4
+453a17e8
+52edee5f
+54b0d681
+61e10608
+333137d9
+953c5ca5
+9b25e48b
+7f068eee
+c76014f5
+fc67db1d
+1caf43ad
+52d631d9
+9fae339f
+3cb90ca3
+df15595b
+b4ef78a3
+1612be27
+ab577bd1
+ed3ebcd1
+083aa75b
+0a765a7a
+c6fe9276
+a5b69ae3
+d0a34130
+e3ce8d54
+8aade191
+512fdf0c
+90081f33
+4322636e
+c41a84c8
+975247a0
+0a519c5c
+aa6dadc6
+23e64f61
+8e980788
+b0660259
+b06f9574
+2fecedeb
+3c232dee
+d97d4ce8
+e5857f7e
+f6ff9aa4
+e1696232
+0c0567c2
+c66fca21
+c5e4f7c9
+291579d4
+421b43cd
+2f659110
+4c7b80e7
+6582398b
+80e26c9b
+6cb60366
+825b3afb
+98159f6d
+68aede49
+86d4fccc
+014e4174
+876465ad
+b80912da
+8c0e1294
+4d554e60
+38a947a1
+3ab4d7f5
+0f8a625c
+669afbed
+b1f0fea7
+0ca4b7d7
+5e5a2f21
+02de4366
+b961056b
+bc6e3dc1
+854a10a0
+f2329666
+bf58e3e3
+dc1def19
diff --git a/models/rank/dcn/data/sample_data/vocab/C20.txt b/models/rank/dcn/data/sample_data/vocab/C20.txt
new file mode 100644
index 0000000000000000000000000000000000000000..7e2df90cf93e2c54611e62aef11f134357bdc613
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C20.txt
@@ -0,0 +1,3 @@
+b1252a9d
+5840adea
+a458ea53
diff --git a/models/rank/dcn/data/sample_data/vocab/C21.txt b/models/rank/dcn/data/sample_data/vocab/C21.txt
new file mode 100644
index 0000000000000000000000000000000000000000..3e3077b9b197194cd43e6b487489ea7d39ae66db
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C21.txt
@@ -0,0 +1,1652 @@
+f99850b4
+00dc22cf
+f6f86ffa
+6821b359
+ff8c5410
+96d57eeb
+b4dee110
+307b6d5e
+bd511f73
+58f22221
+4645d72c
+3e7ad7ee
+9179411e
+396fa1a2
+b28aec66
+59f17c24
+e1a7e876
+7e5b7cc4
+9e64f3d8
+d45dae63
+a8e82f5a
+07b818d7
+4a6e55a8
+206e5f33
+975f8b39
+4a490bff
+74be63ef
+97d028a6
+110e2e16
+db65a5eb
+26e9f66b
+5346761b
+aa1759a3
+4063500f
+e2a7a0f5
+b8e64bed
+ae950ff3
+067ffc55
+39429ae8
+6e73599a
+93ff20cd
+ee6cb673
+c9b0e118
+6a777618
+c576fe9d
+f638dc86
+fb8b0b45
+1988119e
+3a8607ea
+ed35ed93
+13e6792a
+bb829898
+1d970d19
+305e99e1
+1d0aeb7a
+12a583ed
+eca3c63c
+af5dc647
+deaf6b52
+bb776325
+9b6ed758
+19b534e9
+744c59d3
+a0f79e44
+b1aed7f3
+c0cd6339
+2dc77640
+f7a03bbc
+57454cfa
+0014c32a
+48a439b8
+3aead2c1
+23696c24
+a8f79430
+ee91f72a
+9877c7e1
+ddd879fb
+1a00a447
+541a022d
+413ca5e8
+c8297264
+f5b0a7ea
+c3c1485b
+700fd15f
+cfb6bd8a
+921fee9c
+1826f894
+486fd8a5
+c9927317
+06f9041b
+87cd3c7c
+3cf3f609
+802628f9
+d51c4861
+924dd56c
+7af6ef2a
+a6244f2d
+72f082ba
+e8fc62db
+abec5c93
+89c0bacc
+6f2b9e30
+81c5b96d
+37385fb8
+9e805f53
+405920eb
+cad88c3b
+c94c0fdb
+51fe7a92
+04458f69
+90bb0c94
+cd56cbd9
+96cb648e
+a006b999
+e106ec2a
+1c207144
+0429f84b
+a17a10b3
+f025509c
+256e0d3b
+351f6e0f
+0c48b21a
+1147ea33
+b0d89763
+f45f6fb4
+0eeb6fcd
+23259350
+f2beb768
+14785471
+44bba10c
+2c71b47a
+aec73e74
+290c14f6
+2abc4810
+2bce7b1d
+672b8599
+3dd7f9f0
+e5fcc02d
+fdc4f2dd
+04d0c894
+572bdde8
+5ba30e27
+866ae800
+aa305063
+c101ad6c
+4b1a8d75
+0aa67561
+15fb7955
+e784452e
+d0d58819
+66d1b210
+42a96311
+55274123
+a4b7004c
+adabb77c
+a92e4560
+1e65bd80
+41de37f5
+25609db7
+a0cdba9d
+933de4f7
+eaf9f45e
+08254b44
+05cee18f
+ff3ce4c0
+34dda59b
+73a05b01
+39fed57d
+ea6a0e31
+bafa34d7
+3ffa7e0b
+4d34277c
+ea1acc1f
+63580fba
+41917ca1
+fbdbc2c0
+6901eb2d
+d9d9202f
+5499cd34
+3892086c
+553a9125
+2dda5517
+03242e1b
+9065968e
+2272694a
+47ed3c6c
+f3e08856
+34906df6
+eb0fc6f8
+12383cde
+5a5953a2
+5d5e077c
+73911ceb
+512e309b
+58b10cf8
+89eff65d
+11a6817c
+2ce387f5
+aaf7d15f
+1cf1b565
+9148450b
+c3c2dca5
+4468d193
+8f4f775f
+eb177b04
+f49e3fc7
+8e584d7d
+4b8bc745
+605305ee
+d4703ebd
+632f9e80
+6ef07378
+0370bc83
+a1cc3401
+29911a67
+5fb6d34a
+17a49741
+88dae2b1
+87ea5d4e
+65131293
+845453b6
+1f29edb3
+1b28d570
+92259097
+9973aba4
+0f4d4f1a
+1a5b248f
+1521ff26
+9c7775b8
+ac628d47
+12f66f7f
+ceabc01a
+41e7a5c1
+b26219c3
+7c7d4827
+e1574bfc
+4d646eb5
+702c3a05
+ebe5064d
+80530232
+ca6512ef
+7dc1b537
+afa033f3
+72e47560
+783a5445
+5f24ddc8
+9249e83d
+cfe7812d
+147cf6df
+5c2357c6
+917a840b
+a013190d
+ff1c8c30
+98296e80
+66ccd171
+7d4a17a4
+81f8acaf
+0628293c
+635d4ac9
+bf8efd4c
+5e9baaea
+5c7c443c
+d4568750
+3e3b6b75
+bbf96cac
+fd741691
+180d6a3e
+51c2f103
+adc40847
+fb510e79
+6cce8b6e
+959bcfae
+dc55d6df
+634f9da4
+4923f51d
+ccc4f0e7
+e2a50a6a
+ebabd1c5
+5f771429
+38eae2e7
+fdb27279
+8d663382
+76ff3b70
+599d651c
+e9ba6c16
+dca9a28d
+e5191f27
+cf158609
+82f817cd
+cb9cef67
+f64ab18a
+33f24a65
+11e4edec
+d49b7689
+32f1f809
+29201b1a
+5b6b6b73
+ad110f35
+ffeab582
+d78d48f4
+e754c5e1
+29b4c0b1
+a670cf46
+a2d1f99e
+563e2490
+3349cebf
+a846a3a4
+1d170c31
+e9df034c
+7c76bd05
+0e63fca0
+552c638b
+62dbf1f8
+547c058f
+db863d15
+f22e0924
+404e2c40
+0a47a519
+18398c0b
+8247502d
+999f460c
+232b48ee
+e79470cf
+c576dc74
+a4dff8f6
+8d9a161e
+e530efc1
+08085eb3
+7b877178
+0586873f
+4b0ac19f
+c155211f
+42ce69dd
+2327e62c
+2de2ff11
+f426075e
+3ee29a07
+18d3b551
+ac4ebf7f
+31c11f6b
+0aa1048f
+5c1998a9
+c3d093fb
+bcff2af8
+e659ed0a
+1622c145
+f204ff8b
+206a3073
+6569e54b
+97e3049f
+d152ebf7
+0f78ab39
+2e246c86
+22a04a97
+fbe846ae
+e7c19539
+99a2ba60
+86a6dc71
+8db9be7c
+5ef8c5ab
+d991438c
+cc9703ad
+b7cd6fb5
+75bc7ac4
+7eefff0d
+1cf598c7
+3c677f78
+63ebd45e
+bd5d189f
+0be61dd1
+b72bc267
+d9c46345
+9752428f
+2063e5dc
+75bb61dd
+f5bcfc28
+5dc70c60
+24da1a62
+b9a9a70f
+334c0b32
+c2c1d8a1
+2de3ac85
+d479575f
+08aa4ec1
+01a0e18d
+fecb5e8c
+3e120d5e
+af53b446
+33e08fec
+ecb880ff
+ced91df1
+8c60fff8
+18ccde85
+68a036bd
+3b226dea
+bb959f9d
+024d1e8b
+ce332a97
+7633c7c8
+a090a37a
+7aa7103f
+b3826027
+5fe2e239
+eaacdba2
+7842225c
+b1770886
+6f107277
+a7d02174
+77799c4f
+fc573961
+70b08046
+e9412ec7
+3b6491cd
+70e737bb
+53cb3c2b
+7f34dceb
+450ef7e3
+1bb70ac8
+444c24bf
+5524f7c3
+53070487
+7b2b2bb6
+dc6ae5ad
+9c9532b9
+5d444249
+b0cbef3e
+bd074856
+6c581814
+28e6fcb1
+5af340e7
+695d27a4
+e58d8a84
+455df971
+0fcf7a27
+53fbd75e
+0cb0671d
+f52f289f
+2abe3412
+2c613179
+395eaa28
+eff0052d
+ba63e022
+8aa20324
+bc2a1fe3
+fd72cd8a
+48beb5b6
+122ecd28
+a3773d84
+d364e0b1
+a9dc70ee
+b221a02e
+d93de4d4
+a003d877
+2b81e06c
+996f4cd8
+614c649f
+f15fe1ee
+6e1963f6
+01d9669f
+fd1a30ae
+e7f0c6dc
+ac660668
+9bb8bade
+3a96e155
+a8e7b758
+20e129b8
+fa632b06
+bd2c82b1
+a3ae3fa5
+c17a2d4a
+20290c31
+dccddc19
+961cfe92
+c00b313b
+53682fb2
+c9af3868
+9069264b
+82d51b48
+7d24f355
+ee14fd3f
+83e5d1b6
+0303d915
+33b519e6
+8e60a921
+16554324
+d44b821a
+f3a50e6c
+893f5fc3
+f5d0a169
+287cc8d3
+9c3eb598
+52f5108f
+b3bfa7d4
+4ab4ce41
+69c7ea42
+5ed168cf
+4bef3539
+84ec2c79
+22265485
+abfaa6b8
+9e616bbb
+df4da2b7
+b1ae3ed2
+bcf105a8
+f9b789df
+2716996e
+8dbce355
+45838e05
+f0bb1194
+22850398
+cb66f1e5
+6d779133
+118624bb
+e9672021
+fb92f0ae
+8717ea07
+51a9f5bc
+7e274524
+ac3aac58
+c2a93b37
+9484a6ed
+aeefeccf
+a27e2a32
+f9860df8
+1a61934b
+c608f557
+09bdfd2f
+eaec45d0
+a2eb375d
+875bc056
+588029c7
+de8cff3a
+8aa27b9f
+7d27b4d7
+7f311475
+3ad09b4e
+7a8f7051
+53f975fd
+3b51048c
+a64dca2a
+613a7ed9
+558cc24f
+d8e97ca0
+da10527e
+2754aaf1
+86c3b59d
+8c9a0161
+3ec73293
+74cb7e67
+1bbb5c9c
+619247e4
+564bd00d
+ad4ddfa2
+960a7d7e
+b4e06297
+30eeb0b3
+7acf525b
+c3ec1415
+1df3ad93
+85699a97
+15f6ce3f
+d7a43622
+32a6d80d
+663788cc
+3101b9f6
+75f133b6
+4c7ffd03
+7072f0cc
+872c22d6
+deb9605d
+9e0bf7f3
+fc1b1382
+0baa810e
+b17d327a
+60f6221e
+56725ea3
+a97b62ca
+59056f22
+077336e5
+15fce809
+3294f8f3
+d4a99654
+5053994d
+d7e60254
+9c03188d
+49ddbd85
+01c82a58
+6c877358
+2a40b116
+ed2e8a12
+d6cb2886
+e7f43ad4
+d5a47947
+6aa6bc88
+078acbb4
+9337fb91
+64b47588
+bbac8571
+64001231
+0b841f38
+bea5fb63
+58c647e1
+951a607f
+523e61c5
+2548c38d
+55b5edae
+50d1b7ec
+865546f4
+f89f9b62
+a58d9ce3
+882e211a
+a4a5986c
+30519829
+3f829dee
+a2b7caec
+d9bd2fe2
+f27b638d
+cc9b27eb
+9780a253
+5baf84fe
+68a8ee0b
+fed8f4aa
+0c8b6f3b
+27ed5ecd
+6cee777e
+e3d9a25c
+34b4de25
+2b796e4a
+bd86ba66
+75d65efb
+6919decb
+a7a755dc
+90de4912
+deffff48
+565fcd66
+8f359be2
+38d14678
+1ed32d9d
+dc1b605a
+3103ff79
+42286ff9
+2aa7444f
+f563fde8
+cff06640
+e9ab8737
+ec0a74ba
+30ec1dcf
+1720a38e
+a32cf5e9
+67bb5322
+d4f60ad4
+bacd3941
+43809027
+1a0fc4bc
+5cc807cb
+71b9f31a
+f070ed3c
+153b1e66
+9988d803
+138231e4
+3796b028
+24c96cf8
+e5195a68
+76405b4c
+15414e28
+d8bb0075
+20f2f9c0
+11612e5d
+eeb3e1dd
+b6af5d81
+6e5a7a1d
+31d441b8
+3bbda4e7
+e25f2cb4
+c8c10603
+82e00bbc
+8a1d6051
+ec1594f7
+3de4b3fe
+e2e82c3c
+0bf1ddc4
+84e17141
+4c14738f
+5e52d78b
+d4b6b7e8
+f8729ba4
+92e2812e
+59a15e58
+de5ab50b
+dcd998bf
+7767274a
+e77e3775
+e91dcbc0
+f5f07930
+122d6055
+caa43aa7
+387a0a8e
+c7cf9eeb
+0187218b
+cbec39db
+78766d37
+a466b525
+45fdf300
+477e15f6
+c0d83e8b
+6226c3bc
+60b707f7
+723b4dfd
+a769dab3
+01fb5ffb
+6e0198a2
+3f5d7bc9
+dfe80055
+3ff9c11f
+b37c765b
+2af73b1d
+75ecdbc1
+f6afa8c2
+551fae35
+a716bbe2
+cf649f7f
+238dba10
+fce0707d
+fec2ad5d
+27125229
+dfcfc3fa
+c0428ebd
+ff4f1642
+9aed4330
+e1a91d60
+f15a8aa8
+11b3a893
+edd5bb8d
+f8c88eda
+fbaf98df
+47c22908
+eaa7dafb
+4e14161a
+2c18ba52
+97147821
+e4c72a38
+a40dde7b
+97855e81
+05861d16
+ad6ee353
+bb0240d2
+297d0f59
+ada328db
+5c859cae
+10236799
+6aa90754
+06c4ef72
+97f663de
+6f861cad
+f30c76bf
+cc097d36
+876b9abf
+72a8c407
+80a18360
+61e8eec5
+257324b7
+0cfc9935
+52ff833b
+ddefa1ef
+4457c150
+18039a1a
+0711daa5
+03f02464
+377dd8df
+4f1aa25f
+9529cd0f
+486d06b1
+16907e69
+3ea1f173
+a82c39a1
+0ac4575d
+3288603f
+d41d994b
+01805bdd
+dae8fcb9
+a262c259
+2620001c
+56842720
+e52eec5c
+f6f1de98
+74715d78
+86ca49cc
+32b712f8
+8912910d
+62e04ecf
+95b757a6
+83504e3b
+0ae83b6a
+fb236348
+9adaf9fb
+51152b76
+6a2d420e
+67ee088d
+879e47ce
+e42fa830
+4000298c
+23f3c59f
+db9e7457
+81a9f375
+c16366f9
+109457e4
+9f2bf83b
+c75676ad
+9beadb85
+c4d244b9
+b5cba562
+542b9f47
+3dd38d65
+a32eb4ee
+59fe1952
+62d3d585
+4b3ae1ff
+e7e29fe3
+e58f5b50
+c4073818
+73d06dde
+794c73bd
+0dd41d11
+52252402
+36a4f6c3
+0e4d0e2a
+932bbde4
+784d5c90
+6fb7987f
+6ec2bcf7
+d6dfe93e
+36c57c20
+e80431c2
+8b68c97d
+2f153e08
+12965bb8
+ae90cc8b
+4e37c79c
+0ee8d107
+9d45285b
+58a9b2cf
+f7bec017
+42ab8d60
+9ca57840
+03180670
+5754880c
+b7146468
+c5cea7f6
+d4fa2b9c
+834f61ef
+c815cc03
+68bcc2ad
+5a9032d6
+f2e48b5a
+66e40ec3
+b0782e7e
+02f8f889
+c60a5d27
+892ac502
+7a593b43
+df7c386a
+b6d56156
+1b245167
+27dd043b
+8185e4ff
+b6f517cc
+1db52038
+b0ecb78f
+c8f620ed
+c53c6f04
+ea48946f
+a19054a4
+f06a8211
+906d0191
+a3ec19e1
+f6ca7ac2
+d3ac1dc7
+e7258bec
+2580fd8b
+4a50b9c8
+f068ed61
+517618e0
+60cc6d6c
+2dccd904
+800b961b
+216374f4
+b1c7a442
+d3337aaa
+6387fda4
+a3405885
+80e90bce
+7edbd38b
+d5c325e0
+ae7b2d98
+d78066e7
+6157ed55
+9d8e02b3
+14195e38
+8be38803
+a17e0831
+fd50dab9
+c3444bea
+f0def462
+1c63c71e
+a6928003
+3b453869
+13530739
+91311aa2
+ba1ea4da
+52cfcb1e
+3b689f8a
+ab6399cc
+b964dee0
+d34c47e1
+6e244b6c
+7262c4e7
+bc97d93a
+f684da87
+f8e4bc88
+08168d73
+a62fa482
+ef335a31
+bac2090f
+88e2295c
+156fabc2
+38d30bf3
+69216694
+288eaded
+c17bc83f
+5155d8a3
+424af181
+5c5200e4
+17a800e7
+16a16949
+4aa405a1
+e05fa5d9
+e44c1409
+8d8572e4
+4ed90330
+2866d136
+fce7cb8c
+66bee02f
+4703a36c
+21c9516a
+0909cbc8
+28e194c1
+6f2f598f
+87bf569c
+d381852f
+a120230e
+6e963fc8
+634571e9
+6bbca3f3
+dd162b51
+199ebf08
+5f361005
+9cfdbc70
+f6e47def
+15e9dab8
+b2b4048c
+9824113b
+d002f067
+5337a20b
+1284f248
+c7be8078
+e5adfc2f
+ea03ca8b
+ac2c2b41
+45973cd5
+d3e31025
+0689e840
+278a31a1
+35c9cd92
+b566ad45
+826a8402
+df66957b
+c5a8c8bb
+49f40294
+d51b35e7
+d38433bd
+950d91c1
+cf4250c5
+4b4716a2
+14b530e0
+809daeb4
+1992c897
+36a862f1
+69e29ad5
+b0b1d4f4
+24e5131b
+17da6214
+a4e1425e
+9fd12735
+d37aea01
+7fcceb02
+15837f71
+241a1c48
+901b12ea
+362ad5a2
+f0987c72
+f48102c8
+31b4af04
+c9ffd163
+4a74c2f6
+0affe296
+f669e8c8
+397abb5b
+6d37a2fe
+598c5042
+e587c466
+17aef59e
+33c96fc7
+8eb451d6
+fbe278c3
+6630794a
+3561091d
+a13bd40d
+03a47151
+e17839cf
+d13fcba0
+233f0d38
+8b563cfc
+55e899dd
+34d9a8f4
+e800ac5e
+9dbb42af
+9c28a03d
+f96c2136
+78c86246
+35173275
+4090cd5f
+2ec8bfe3
+f2d01817
+e22a5394
+2bf9e48b
+c1a2053b
+6b012c1c
+98fd45a2
+25a45402
+1c108732
+f40613aa
+c39f166f
+f0dd122e
+33e00b0e
+a012ca78
+3b1a1662
+f1732901
+e22e102f
+211286a9
+ce247dc1
+435d7d92
+1dc95fcd
+3e711d83
+c2410e6c
+7c63db81
+780bdc55
+b15b2b2e
+af1869be
+c30dce78
+972d4080
+40acbeb3
+21f70572
+02a01d3d
+fdf90dad
+ace95b6e
+602ce342
+0272379f
+7f1bf047
+63bba41a
+16e2e3b3
+ddbb8a5e
+1762389f
+b6e2d4cc
+600ddaed
+112e52ec
+1ad431b2
+0e8585d2
+ecd6affd
+6a909d9a
+e2dcbcdf
+7e39d717
+5fe17899
+6234a766
+4ac46f2c
+11c700df
+b193bbca
+71d4501c
+d39acd52
+8638ccd1
+54ea07b7
+21dc9672
+193238e4
+ce88b997
+150518f2
+0d9d598b
+ec92df64
+a63c48b4
+42c08f72
+5e5902af
+62bb718d
+79725e40
+345e27b5
+912b264e
+0fe21b0e
+c171c0ad
+2c466a1d
+16741b68
+29670e19
+78ba0a03
+98040c74
+ec64c4d7
+e3b35e99
+6f393b20
+09fc6c82
+f200a8d6
+2e9b1af0
+0bf4a9b7
+c995c90e
+a1d5d41a
+0ee473fa
+d8ad7a42
+0f3f8951
+9153b994
+12edb6c8
+a84fb99e
+218fb84f
+662c923a
+9e7c55a8
+6dfd157c
+d70af1c6
+ad69ce75
+97f203aa
+d41a1a9f
+9ad47d25
+89a26962
+376f7f0d
+20062612
+58c5bcc1
+95f07388
+7c2f3adc
+dec13458
+1fe24a90
+9e063b0d
+a41bbd15
+c6008d7a
+bfd95a58
+cbfa4917
+500f132c
+3edc9deb
+0366ddd4
+4fcd9076
+55d7255a
+ae0ed514
+b691bac2
+d83181ad
+b300c516
+06f2027c
+024722f1
+54bf0a81
+8039e3b8
+e751a45e
+5a634328
+230fcdc9
+6c5b8354
+ae43f51a
+30f37c77
+3ba1c760
+2891c920
+65229b6e
+2bce5813
+95ee3d7a
+935e4b2f
+e6584790
+1ea17cde
+292c4646
+6d5cb8c6
+46af0015
+cc2ffb60
+dbfd7937
+784aa3ad
+9ea3b81e
+ce8373db
+09f4f5ca
+2b1c0330
+947cce72
+2802c63d
+c0213c31
+50699d39
+f5148ea7
+9b10698f
+e339163e
+90756215
+c430d55f
+54d8bb06
+c5cc8079
+26e36622
+3ff7a605
+17b90ef0
+5dcc9e3d
+f3e9f13d
+89272cb0
+cc6a9262
+d59ec484
+d67f1959
+6364be0b
+6842eb57
+08584528
+ead945fe
+b31e1f74
+f58dfaac
+c97af6e9
+1718ad45
+e27d6c43
+82dc787a
+9447c917
+1fb42d80
+6cbbdae5
+d1cd4aac
+f6f4a94c
+7b6393e8
+3de90f45
+6c01005f
+833c4620
+8b0abce6
+0c2239c7
+38483e30
+90d91a22
+9502af25
+99c09e97
+704031f9
+1aba3fd6
+958c71d4
+1f215d99
+55efc8ee
+b426bc93
+0f74a194
+b6fd7699
+fd6072e3
+8316999f
+fa77045e
+fc39341f
+94ed3ece
+87e212d4
+006a1016
+e1627e2c
+e0c6d533
+2d1e017e
+5911ddcb
+b6119319
+701e8d19
+de9fabc1
+15f80464
+a3bd0899
+5bcb843b
+3c84bfdb
+8e4884c0
+14c4474d
+21d7978f
+7279a3c5
+1ceb5991
+71e3dba8
+3315b01a
+166779ab
+c4e32492
+eeaa7077
+08b65540
+329e8c53
+a30afd76
+09730561
+b7bb1eac
+694efdce
+8a72cc70
+4db5f6e2
+f2b28cd7
+1e4963f9
+077bab1b
+c67620a8
+ea3a6228
+eee84f68
+4986aa94
+0cfb78de
+7f9fe834
+0ccb9bf3
+19657d0b
+8e1cb851
+39059201
+4a057340
+19e4f3a8
+85cc3ff1
+7cf93b74
+7b8c9565
+d53381c8
+7fa1816b
+db086a72
+c3a4c331
+f6e3bd9c
+ad789d5a
+3cff4e7f
+43d01030
+8cfc8f03
+44cb2862
+c6b16df2
+2834c793
+003add01
+1f5196c1
+4b3387f9
+0704b0b2
+9c23db80
+29e83164
+a1fe1c97
+053a45ab
+fc0524ce
+4c9b6081
+fad15dad
+e0a20aee
+2d1f75c0
+33241664
+3df2213d
+3c1adf80
+79ca6193
+bafb971e
+a948e2c1
+fa041ecd
+3c7fc748
+a5cf9110
+1da20bf8
+b853bc32
+3ac4edc2
+85b48dae
+1952b3e9
+6d327a6a
+07ff6f40
+1d1bec73
+a0c56c9d
+3b958bc0
+5e1fb081
+f4868def
+59c21cca
+faead7f6
+5a775d44
+3229ec62
+43215f00
+3db17de9
+92db3899
+bbcac452
+40d924bf
+12ad5e03
+d7c93a6d
+21756f76
+ddfc7f5c
+849d3db3
+3443a4eb
+6ee9862d
+0cba1665
+28fa11f6
+929192e3
+4784c2a4
+361a1080
+20cf75f6
+0fabec7d
+08119c8b
+4f45f117
+2510ab84
+2899d36c
+50ec0a1f
+40bcbcd9
+78c1dd4b
+f226d6a0
+4bac89df
+a921d7b8
+2f4978df
+ffe9bf4e
+9ae3e892
+09192c91
+9885a271
+bda00e85
+164ab357
+54acbe2a
+a7b99eae
+96eac89d
+f6936d17
+6ab09ef5
+cf744eda
+f3ddd519
+757536b1
+218633e2
+c8bb8a95
+b6c42b20
+76ef8858
+182fdd1a
+29bd6058
+28942d4f
+8c66acdc
+ef887cfb
+7db36402
+fbca226b
+f9bd8f54
+3316a698
+8a65310d
+e049c839
+c0f28cb1
+921c1039
+34a1e7bf
+9fb07dd2
+4a08ec4e
+85350345
+769c5847
+242bb710
+a4f13390
+af4c4f22
+e014f4ed
+72557d9e
+1f93dd86
+c3f2592f
+143de831
+5033304f
+9ac6b7e3
+8c7e4180
+27b10c4e
+3173b047
+30bb890d
+ddbeead2
+54e54a66
+eaa687a1
+6cb23951
+831d5286
+5387d45d
+d20113f6
+b5d13cfb
+8d1f6da9
+513ce97e
+7336716d
+f68b38fe
+cb105f80
+5f1ed37f
+9f80b735
+7ca98d3b
+7c2dd7b3
+19ad0d9d
+cc85abe7
+5c6c011e
+b87b65ef
+7fcb7db3
+8b7fb864
+46ccbbf4
+86b10dda
+8a3290e6
+714ac7d6
+f799acd9
+3dd09d1e
+360aedef
+58bf1e98
+f3cf62a6
+0e9478f8
+ce62560f
+90da9c54
+01394575
+a2d91f6e
+00cd7438
+ba8cf27c
+a69cb007
+2a4891ce
+066228d7
+e5b175af
+76851c4f
+7a4c0cb5
+fb5ec17f
+effa916f
+af460a0d
+4e1e9322
+b178dd37
+cb896c26
+646e302c
+5c3c860a
+bfeb50f6
+79682ce0
+6e51654c
+7401c2b0
+b92ba73d
+fb7b0a40
+fcbe92e8
+b5768eb0
+adcf1fd1
+f22a431d
+f420ee6f
+3631d5eb
+7a0873f3
+3f0af554
+858323e3
+cb3e5850
+b39b1608
+c526fd2b
+fac8a0a6
+b055c31b
+ecd4a633
+d90f665b
+af2a560c
+9a8a4df4
+f288712f
+ed01532f
+97ecc0b4
+5f957280
+ddf683eb
+2fed5554
+c4bababd
+055d5768
+8a54c4f7
+98818775
+159949d4
+7034bb6a
+5dcfd068
+15e1c508
+5f6bc5d6
+92cfc76a
+29b895b5
+a8c3795e
+7288f4fc
+69e15563
+cd6a523d
+22601592
+8ddf9112
+974c22e1
+85014ef4
+565b9e6e
+1529f99c
+29d21ab1
+39be81ed
+5d05ae85
+ee0dc527
+a43bec0d
+ba293893
+1280d805
+a8333d55
+c6047019
+2ab07a3f
+c6b806a0
+347fbbd1
+430d27ac
+636bf64c
+7702742d
+6a4bdd9b
+c57954d3
+98a79791
+a2408af4
+d7866de6
+7ba8d8f7
+57a5effb
+cf65ec08
+a59203c3
+b7a2973c
+8dc34570
+bf9670c3
+e0136b4c
+261a6a22
+dfc3e034
+561711eb
+2b337cce
+927a8cff
+d9e590ae
+eab99233
+74ea047f
+ee4fa92e
+6b7069c5
+ceac1e70
+5a8fe828
+310eecde
+514b7308
+5cd86926
+2d4c8386
+a10e2bab
+1e3227a8
+6791461d
+64e2daa1
+d9dc4f73
+adb3c670
+a50737e9
+c057c236
+e983f6ee
+55486e2e
+9deb011d
+c71bdbf9
+1e687180
+9874e03a
+783c7a2c
+194a09b7
+237b756b
+b4198b45
+c4583fc4
+03cd6ce5
+ab09fdba
+f229bf17
+ef8600f0
+888b7233
+c73e1386
+b73f278f
+2c6163fc
+654ae73b
+8ca007fc
+1c353974
+29cd4eef
+437c4394
+cf6b91c6
+ec5ac7c6
+4afd0cac
+a2321eae
+5ac15b45
+b2d90d0f
+52150d3c
+26ea29ca
+8bcea858
+05c27aaa
+9617eced
+5b08eee1
+fc0a0585
+c1849e4c
+fea3ba36
+887e0ae4
+9376d3ac
+9b27cd07
+11c5f92a
+c6d83625
+db6abe3c
+c6ee92c5
+ef486571
+df89d9e2
+2796fa99
+d5fb7e7c
+52fde8ea
+e4fd98aa
+336dac29
+c646942d
+d40e2611
+da2ff2a8
+755a7303
+80346042
+b7643ed7
+d1f36314
+2500f511
+5c26dd4b
+a0a725f0
+0057ee52
+01cd89b5
+0823ecf9
+00f9a917
+c46696a7
+1fe472e2
+0ae865be
+e5752394
+6e5ab00f
+98f5f4f7
+e6a746f3
+58cd488e
diff --git a/models/rank/dcn/data/sample_data/vocab/C22.txt b/models/rank/dcn/data/sample_data/vocab/C22.txt
new file mode 100644
index 0000000000000000000000000000000000000000..3b6f6a760862e95652851d248f4d7084eda46451
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C22.txt
@@ -0,0 +1,8 @@
+ad3062eb
+49e825c5
+ccfd4002
+c9d4222a
+78e2e389
+8ec974f4
+8651fddb
+c0061c6d
diff --git a/models/rank/dcn/data/sample_data/vocab/C23.txt b/models/rank/dcn/data/sample_data/vocab/C23.txt
new file mode 100644
index 0000000000000000000000000000000000000000..4cf222bf01de901752cede45e57ffbfb764c607e
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C23.txt
@@ -0,0 +1,14 @@
+93bad2c0
+c3dc6cef
+423fab69
+25e3c76b
+b264a060
+85d5a995
+be7c41b4
+3a171ecb
+c7dc6720
+32c7478e
+72592995
+dbb486d7
+bcdee96c
+55dd3565
diff --git a/models/rank/dcn/data/sample_data/vocab/C24.txt b/models/rank/dcn/data/sample_data/vocab/C24.txt
new file mode 100644
index 0000000000000000000000000000000000000000..ac7de137388ba7964349336c687cdd1dad0eed64
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C24.txt
@@ -0,0 +1,1770 @@
+3814fc66
+3599e91f
+74798b91
+ddc4fca0
+5a456be6
+498c3321
+8ecdb20f
+efc0a15e
+8bcb92ba
+1f68c81f
+59d5131b
+6a6f8ecc
+5627391f
+9dfcb3c8
+14806529
+a8380e43
+57e6fda0
+20c8320e
+03b8c7d3
+e010ad8a
+ff24030b
+0d147601
+25cb4641
+ef39b317
+cac07f27
+f96a556f
+f3e1b768
+6758ff18
+d2288502
+fae97cfd
+087dfd71
+c9170981
+2dcedde1
+29ce79bf
+fd0fd1bb
+f91651cd
+ab246148
+53085255
+0a42fb3f
+93a075b7
+5c6b280f
+a5725134
+6095f986
+b63141be
+25616ee5
+aebc75c0
+e33735a0
+f8d85724
+72c326ed
+d56dfb81
+15a9a163
+ec531568
+ec8639be
+4b3abb84
+7b80ab11
+1481ceb4
+078b017d
+a9b776de
+7851af43
+5a1a48d4
+75070ba6
+65f72381
+2cda5b6b
+d7b29d8c
+d96682aa
+45ab94c8
+b72b38a4
+87f61a7d
+5f986f38
+af1e90d8
+c049edcb
+0c3013ca
+97347d1d
+40b4aecc
+53fa3f5b
+891dbdde
+b61e3676
+39c28d2b
+ce327ac7
+41c913a8
+6a32b80c
+65e74c52
+4acb8523
+460ca618
+7dc1d687
+958bc15c
+6c4e776a
+71292dbb
+0e8fe315
+e4f45003
+55dea74e
+c651fc6a
+5a816fcd
+198d00e5
+ec498394
+cf23f0df
+4904c5a1
+fb1d9706
+67442292
+bf921720
+6566aa76
+3ad63d3f
+a5a7cc68
+c0ad5792
+eb251029
+2a475b60
+3a076b66
+2cb8e5cc
+dbb0f7fb
+d19832ad
+cd361e60
+17fc42dc
+c2fe6ca4
+a4bff677
+3c6cb018
+51b4beaa
+71fc35d2
+bc0cea8e
+34675a54
+a0cc244e
+3dd6d4c7
+2e661f08
+1d7321ce
+7cb64d22
+8cab152b
+a611a48d
+2261c9f9
+03a8e84d
+c440aa3c
+1a780b02
+b74a9cb2
+5bbc3e54
+78d01a90
+e457a564
+040519a8
+dc3b44d8
+ef15820d
+d70cd354
+9a87eb0a
+f828a610
+59e13581
+0ae78e0e
+a6ef0df0
+a183a9b5
+40da8cfc
+38ea3106
+ae98e1a2
+866b09e0
+340cc764
+524755a6
+e9b8f8ed
+d4a1b4cb
+99623d4b
+82f556e1
+941adbca
+517e5d60
+c6f82d6f
+701470f9
+4d1bb0f3
+44cab774
+9094dc99
+afaecfb9
+d73320ab
+5e03aeab
+8f62425f
+b0253af9
+29daad63
+61f7d033
+da89b7d5
+2fdc9048
+23b1600f
+40de02ec
+4001cb4f
+043a382b
+5d814ad9
+ff89b61c
+69cd673a
+5f13bb00
+528e9fa5
+12fd752e
+eb2b9fc7
+b9981895
+9fb93b83
+83d3323e
+5238e1a6
+671d300c
+cde7b0aa
+702a69f7
+882fa719
+3f55a9b6
+83a9835a
+c07abd1d
+775488dc
+be059b55
+b936bfbe
+6ee5593f
+6c24d6e9
+abda10be
+bc75d80b
+f6ecd438
+00de2b63
+e9403827
+32588652
+feec5924
+8e6654d0
+0877de73
+d91ea8bd
+3be94b64
+af647f02
+90b6276f
+fd82c312
+b3901f1e
+2836dcd1
+cc4cadb3
+b7c6f617
+38194a4f
+0d4a6d1a
+a1093ec2
+043ce596
+9ac59e2a
+cef0214e
+9b41a362
+6cef9284
+244a7d22
+f14d6396
+9e779944
+c258aec7
+16e901e7
+ed209d56
+31b11b9b
+fd926e96
+1b478bd9
+348e7ade
+8c7eb3ec
+c70a58f2
+cc518955
+8b1f3048
+3b425be9
+9e07eb4a
+4bf3b035
+382cd1f1
+5c59fd65
+c4803cb9
+abd6caea
+56800348
+9b786035
+e04988e4
+af55e227
+61dcdbeb
+f6becd93
+c4f35009
+40c02949
+66b0c3ea
+d7ff2a41
+d3b901a5
+2896ad66
+3d29547e
+84edfefe
+936da3dd
+ed3f20c3
+27b60b01
+489cbbc1
+26056640
+38b7753f
+39fe175c
+4b73039b
+56ddf355
+135c8b41
+d553a122
+8e183052
+68400de2
+7b837715
+2e617f4d
+b6b5bc47
+83f4339b
+7f686ab3
+1f022022
+34423196
+4f18f205
+04936173
+145ae095
+856ed6b0
+ccad99af
+f36b8c71
+32321167
+370676d3
+3c6f2df8
+c6d9fa35
+5fd07f39
+a5ce2d0d
+8ec48ff8
+4e9d9b37
+8dd87033
+81ef41e2
+c3408140
+721c37e3
+458af519
+0cae79aa
+70750a77
+e9bcc914
+4fe18e82
+5ef28b07
+6d14803a
+78d8fa88
+7ce74ce1
+c0b8dfd6
+672dcbe6
+88c909ea
+c9c24a79
+3a13327e
+953245e0
+b67cb69d
+cf6248f3
+8f8a745c
+b3caf022
+71b884cd
+f36969da
+42bf52c5
+07eae818
+e14245b5
+67a6d8b4
+8e84e831
+67a18c8c
+f54aa12a
+6b4dea61
+6042edf0
+a7be3080
+b0734835
+aee52b6f
+9d8b4082
+c8ea7afd
+7a28e855
+3371d710
+ebf21959
+62aa24c6
+226b0ab6
+4b018d91
+64fbb706
+82c1b580
+f2cbc463
+6c25dad0
+dfb22eaf
+2b702eb2
+cfaef5ef
+bb039b11
+0393be52
+531ba5cb
+9d214089
+d7d60a0c
+2e1eb18c
+3d146fbb
+5d2db8ad
+74cb42c6
+92a3affb
+b34f3128
+4087e42e
+0a10f385
+38fb6d0d
+465f835f
+d28d80ac
+e0a4d56d
+65c55747
+9e0bee34
+3294fab7
+f6c8a517
+abbf2dd0
+6bc2bf95
+87d71e01
+0a55c43c
+dbc81ff6
+48954301
+7058cbd3
+1b06d1e6
+c83e0347
+5fcc9829
+b6a6d491
+e20fd388
+c3924d09
+d3eeb97b
+2e3db81c
+d1fc6400
+4043f8a2
+f20c047e
+75b08a3d
+021c0acb
+d86e9fd9
+bbefa796
+8d5294cb
+5fd6be32
+d09e7a36
+1d987843
+08f469e6
+5e980bf0
+9c2bbbda
+db01eccd
+14a69cd1
+9a2fec5e
+70bf2569
+6bdf991d
+e446429a
+1d789fd9
+d6689ddc
+b94bd0ee
+dffccd89
+d278603e
+e4c99bb2
+6b8aaf76
+81820397
+6ddc15cc
+093bfc30
+cfc55279
+69661344
+84e146e3
+61842413
+2909a885
+808a3191
+158a894b
+5ecd6f9b
+66458243
+4ca51bd1
+b4f0f308
+ad21a9fc
+adf537c3
+3e022f4d
+ad80aaa7
+2c720b71
+82b7ca7e
+502f2493
+cd56213b
+48b11258
+a90e7e17
+cee0fca8
+d1d230cb
+0fde6d0a
+2a90c749
+7beffc84
+2af5bbab
+727a7cc7
+96e0508c
+de7d4fce
+b2455157
+2c4aceb1
+c954b96f
+7797fef5
+a7b94871
+8bb8d07d
+8d988431
+9c123ffc
+eeffd916
+ee70205e
+0663db38
+41c6a05d
+a3433af1
+a8833871
+c9a8db2a
+5ddc2c4c
+d256414b
+cad46f36
+347f7b3c
+200cf515
+61bf6e16
+2fd70e1c
+fb55adff
+887a9628
+1e48c981
+73496e83
+81940e09
+4e141a61
+9fc6df89
+6783c6e3
+8096c2cb
+d265ca2a
+af7e56e7
+73e86bff
+aab05942
+ed2c5ada
+47577e42
+f2c8ae40
+34842886
+79f9cf1b
+921c4069
+8a17bfab
+8c8c0a34
+f4e98c72
+ae39798e
+727923fa
+924e1861
+0109cb5b
+c67ffc42
+df0ed3eb
+0243a2a0
+9276a528
+7f0b4c99
+f4c62ae7
+46851bfd
+3f7eb911
+4d9888f5
+80a15ff2
+24db27bb
+cb8a00d8
+1cf05ea7
+b460c647
+698d3534
+06f6f124
+9f8b7b85
+841a44fc
+b3540901
+ccdc9cf6
+657f7d7a
+de0d7b75
+0b95f1c5
+a15e072a
+93935bcf
+99517c2c
+8b51e37e
+7ef78543
+daf43eb5
+7e60320b
+726bef9a
+4ed82ed7
+e7540df6
+d2456b98
+32ebc486
+8407b107
+a0ab2ce0
+196c454a
+aab9e637
+8c33950d
+ea2ce97d
+bacc4064
+5f44b012
+f3f5a940
+3fac8b50
+e1b26251
+72e0a24a
+0c55705d
+971d53a5
+b870aae5
+c673354c
+e448275f
+44e65d0d
+165e877f
+bee64424
+81e099cd
+4c5c856c
+570e3148
+8c882225
+4fcc135f
+f7d9662d
+187472c2
+cd1adb91
+22c5df8c
+55ae113f
+00747dbb
+47b52926
+7c4ba11e
+bffa4efc
+3212c2f4
+218c9ff7
+3ff8e180
+bcaac82f
+f87cca3f
+a9313cb6
+91367ef5
+0ab9c4b2
+f556f019
+f67c718f
+9c1163ab
+ac73f6cb
+70c6904e
+a1da5668
+2d3e3ac8
+6683fda4
+143b1e0d
+c640e7a4
+cc6ea5b0
+eda696dc
+299fa073
+0b8baf69
+812a6aff
+09691cbc
+8b7d995d
+d1edecda
+fa26a1ec
+f2d06cff
+3d87e5eb
+47c5aea3
+038f231a
+47620345
+36468341
+899c6ddb
+fc7f8260
+696cc26c
+6ca1f5e0
+ab99f948
+cbe16042
+3f1ffffa
+d5c410bd
+2cf2378d
+335a6a1e
+dbe0773a
+77f43467
+0dfdc7d2
+4a5825e5
+f689bb81
+ecbda592
+216a829e
+fad45c97
+cd1b1f96
+ef089725
+ebfb058a
+e525e48e
+56c60afc
+a1f8d915
+e397a120
+c5cbfccf
+cd9dfcdf
+e0d419cd
+38255568
+1260fbcb
+60a197ae
+e676df61
+13b52aee
+22dd4e42
+123fff3c
+3e1dcafc
+c3f77dcc
+b7d5a1f5
+92513de3
+331512c9
+79fc7b8a
+bba4462d
+9d58d744
+246f2e7f
+e5fe7725
+c43c8d37
+25b923d3
+2ca52429
+1b3a4e10
+89064851
+fa749baf
+1890c8c8
+246eec54
+e03ad56b
+039f930b
+58aebea7
+0a9d189e
+32dbef3c
+9153d898
+6c121858
+4c0a9db1
+a62d8862
+53575b1f
+b9e6ec5b
+a5278534
+35594d99
+49d074df
+2916886f
+29d9a0f2
+57ef7a21
+910fbaa8
+71b8baa3
+b5903154
+59c12dda
+8ed869e2
+413a4753
+850e26c7
+5751afd4
+ed62d872
+731311f7
+13628649
+6a2507e6
+2c502c1d
+9d96bacb
+c75adb94
+7aaa93f3
+20b0a854
+09163aa9
+b99e97fd
+b748fd41
+eedee301
+7a8e7ed6
+456bdb5f
+9a4f8abe
+b36618e6
+129e27b2
+50d994d6
+82437b86
+53951edc
+8fcfcb08
+dadea544
+4003a508
+1888f921
+0e4b6ccd
+9b18ad04
+673aed71
+1a495a8d
+6da13322
+b6616a66
+cafb4e4d
+e0e1e766
+716bdf99
+ca763252
+944ed2e9
+66ee6583
+75a8f3b7
+1eb46a94
+8ecc176a
+33642e3c
+a8b9ad80
+0acbced6
+e6971433
+85858d9b
+08b0ce98
+11cfe898
+590b856f
+1793a828
+a545e387
+87a038c3
+3b183c5c
+4a5cfcca
+96261dd5
+a6e8741e
+ae18bc5b
+7a39a7c1
+eb1ef865
+d18dc394
+052e29ff
+c201359d
+fe0d4fd8
+acbaacbd
+3fefd57e
+2a46f54f
+1b1571ac
+682a8518
+75d67d71
+fc7d5b23
+f839ad5f
+1c884a87
+e4e10900
+1625c147
+428df801
+2de969cd
+2437e831
+dc8e8f81
+92d45137
+799ebff0
+d24ccbf9
+af0cb2c3
+8eca1fe1
+5fd04b39
+bb502178
+72c78f11
+1c4b599d
+d36c7dbf
+a9d9c151
+03f412a3
+9f7f425a
+5622865b
+48e2b4fa
+5ab11cbe
+07e4d1cf
+3e90e1b2
+453471d8
+093f678c
+35adc05d
+507fac5c
+f271f41c
+c0762e38
+4c8e5aef
+4e69b127
+59aa86a7
+b4920404
+937f8adf
+f2b01507
+89fa735f
+f659d008
+0b351a52
+97dafa13
+c23c2e19
+dd5daac7
+a67b5cf1
+f7078ef3
+eaa1bb97
+c99dc8c3
+6138b4a7
+a5938ba2
+a0d302fd
+9a04d350
+1172291f
+6451efd5
+e630a04b
+65d7d87d
+1bbbdffb
+bffef508
+f0f70e8f
+c4426981
+7ab4ee7d
+7697d011
+a98f5ada
+e9513c86
+543fab60
+deda7b3f
+7e42b083
+f5f4756f
+d6b9343b
+fa4eb4dd
+27e81296
+65ca2360
+54a607b7
+5a9ffa19
+b839567a
+25394d2e
+03a47055
+c94ffa50
+8d365d3b
+01568b8c
+89ca0045
+fe6b762b
+71703c49
+a6a329a4
+09511edc
+919b984e
+89856b53
+d691765a
+4008e1ec
+4c9292a4
+daef26d8
+0c9ebda9
+519dc45d
+df487a73
+1a696ebf
+1370c56e
+98239188
+a57341ed
+792d292d
+8cc77276
+b9de745c
+a3bd4d33
+8f282db5
+307b6753
+70417824
+2ed60951
+a113c90e
+2ff7b3f8
+100dfaf5
+2663a8a8
+4a236577
+30d6ea16
+aaa9e441
+f7b7052d
+4b352104
+b6654c21
+bc5b0a35
+4ccb4ae2
+079b4da8
+8a07912c
+cedad179
+46008b41
+cb33dd43
+9287482e
+a67c8001
+4f7b7578
+52e75cce
+79869a4f
+b0fb6a50
+359dd977
+c60081c2
+f0c37e57
+1d4dcf28
+d288f01e
+0df4da2b
+d9dc8cc0
+8c6e501f
+c774a2e2
+eed7c069
+9b7eed78
+f2b0025b
+66b8103e
+a6e7d8d3
+d61a7d0a
+fa83cf95
+cca8c2f2
+f0e3ead3
+17f458f7
+b94e053e
+38be899f
+8d9d382d
+27b03081
+c4fd06c3
+03aefe56
+abba3671
+24d3b1d4
+88b2de92
+00c9134f
+3e387bd4
+68359ad4
+832d3547
+1b2208f8
+aa3637f6
+42998020
+e63dd16e
+2f15c94b
+65aca77d
+672b83e3
+2fea1d4d
+51e4f023
+21c9d296
+f1fc44b5
+e46634bf
+8decccb3
+0fab2d26
+6e311859
+b15e807d
+240e5a71
+479e2b59
+b8468a20
+a1d7a338
+dd95f77b
+05427247
+88fb0a1b
+32ebebe1
+69e4f188
+825ef36f
+ee448990
+45fc54df
+20f6d101
+7de79d0e
+f998e32f
+3e1aae16
+ded4aac9
+36325a86
+da45fa6c
+e9e62581
+d198e252
+c44bbc69
+dcba8699
+1dcb8764
+0505abc3
+2f647dfe
+cc0e3da8
+b43c75ff
+ce350354
+cc409594
+9fb5a9a2
+51c114aa
+5029cba6
+896b9d63
+5160a778
+7939255b
+e3c310aa
+a052b1ed
+c144fd6f
+9117a34a
+90b99a4b
+0916262c
+990a118a
+33044299
+7c2d72a4
+1a3b3d64
+748fa578
+5bed8759
+6e828bd5
+e22fed0c
+5bdb4bb6
+504fe8ec
+3fe205c8
+9f94762d
+d0cd5a75
+73897902
+b2df17ed
+b3f3cd63
+4fa16304
+becdd73c
+99c23ccf
+8bd236ae
+19b117ac
+9f0d87bf
+6a6f263b
+b1df4bf6
+646a7ba8
+d891c925
+05b86cfb
+4bc4a47f
+f7b109c0
+16bf8799
+4024af75
+4e513205
+69b371cf
+b3d84af2
+52f412c3
+3400540c
+bd95e5ed
+5bceb83a
+b9f94fdb
+cda41f10
+50bd4647
+487b8cd7
+0de4bad8
+71dc4ef2
+166ad104
+22ff028a
+22b2abe6
+ada925b2
+0ae4c07b
+4e877ebb
+b46381c0
+871dfd78
+fdf5ac25
+3037ff6a
+33d0ff9a
+b0deab67
+b1ec9c5d
+d6285a00
+c9bc2384
+7f6485ad
+0ff91809
+10864bee
+1acf8062
+89bd83a1
+54d76883
+996f5a43
+426c92eb
+b13a4767
+764af1e7
+9ff482d3
+7d44b04c
+8eb162c5
+4f129db5
+99296def
+786a0db5
+f6fcb7d2
+f53ea242
+a415643d
+a9a1fbd4
+41be4766
+59887b6f
+ef218b6e
+91de1848
+0f3002b4
+3b047130
+88ed380d
+71640730
+8548f4a1
+afb24112
+0ea6eb3c
+28cbfdd1
+dc13346e
+91f54db5
+53e0a237
+5991d363
+916f4113
+8ea22d26
+6f90ebe1
+7e99499d
+f86b306c
+151606b8
+8804f0e3
+80dea675
+cd8c216a
+98f3000d
+8c6caec8
+66450483
+56f742f2
+df739657
+42bab436
+340d03c3
+f6ebfd12
+a5862ce8
+2a0f32b0
+f4b7f89f
+6d179652
+ba986015
+68db1c55
+b258af68
+8ca3a6b7
+defc8ccf
+dd48603a
+62899acb
+6d31f611
+ab6c4f7c
+dc49cc4e
+0bb1263a
+9c71165e
+0a3cd755
+8e35571e
+66335bd5
+b44bd498
+1a5d7654
+b2a514db
+4921c033
+f36ddc52
+fd42e0df
+829944f0
+7553379a
+0174dd24
+6b4edae7
+8a73292d
+9807dde0
+c9beb5ca
+32adb9ae
+3e2c4cbd
+37cc789e
+f216590f
+7b888172
+00954a0e
+545ba04e
+e4ef8e56
+f80ced1c
+abf08f1b
+a43e2630
+98f2c2f9
+58e38a64
+cd7bf52a
+2cb14fec
+9134a4c2
+7100b65c
+1dd4e2ec
+5ee1762f
+63e142a0
+e4cf11c4
+16e042f7
+45b2acf4
+ad8e1407
+ce76fe7a
+d052bbbf
+ecc32110
+0784e21a
+47cbdaec
+f8bdffd6
+fa447a8d
+819b58b5
+78aca291
+aa9b9ab9
+df62ac33
+43237b56
+6788e7f4
+37821b83
+02772104
+dcda2e04
+56db18d8
+8a23eec0
+bf118644
+fe3df019
+7566c6ec
+1c6eb7ec
+60efe6e6
+8d49fa4b
+d104653f
+e138864a
+2a953b98
+7fb4ff91
+edc54ecb
+45fd5c7e
+38d9d896
+337985c3
+0e636f7b
+998fc200
+3f57fe68
+b2f178a3
+ff09b92e
+a9848e26
+7827330b
+8daa10bb
+2f7e98de
+3fc1428d
+b63e3fad
+0ce5cb39
+8b9120d0
+468af5eb
+db6a503b
+c6eccc02
+08cc5743
+18de9e19
+273937df
+3bae196c
+d1736fdc
+a6e96657
+fca4622f
+ad65a526
+540a7116
+0799870b
+037a0e10
+86b2dfe8
+0789ac40
+fb508a2a
+ea444225
+471f55fb
+c82232f3
+30d6a904
+42df8359
+c041f2d3
+261a4628
+9d919ce8
+c8213966
+ef38f1fc
+de065242
+03955d00
+adc58845
+4beb20da
+422f440e
+aa640e91
+ae2cd100
+6ace2d4d
+48056b77
+47b093e0
+86f85a4b
+78cb4e3f
+ba20a2af
+eed53290
+917815cf
+4943ec2d
+37d44daf
+32dfe8b6
+fa470cd3
+b889075b
+d57e7d03
+da363f9b
+cde6fafb
+bf253a99
+87889f51
+f1dd75f2
+0b7c4dff
+d2a38c68
+a636a323
+992bbff0
+58bf07a9
+7e4050f5
+adc85729
+0c95bf62
+cb461b40
+e7d518ab
+41377cb5
+d5f248b3
+7836b4d5
+d719d049
+b675522c
+7cff0908
+ad9125fe
+d3d40c0b
+3f4c58e8
+8f079aa5
+914b000f
+e9950559
+7e3ee667
+2497cf4c
+4e8956e8
+ea8b7f8a
+47794ca2
+1057b323
+80ff3b96
+5bedd47e
+6de25db6
+f5220a4d
+c0486dfb
+c48364d9
+ba38fa19
+d43818d8
+8055cecc
+a88bdc08
+b59ef8e3
+732e04a1
+94940b2e
+882001d1
+2a1e2684
+d1425cbe
+7357b104
+476d291e
+580c3edf
+206e2927
+c4566169
+e3beb64b
+9d8f9768
+b62c71e2
+992f954f
+0d9715c0
+10b3e56d
+c657e6e5
+37e9c759
+e9b3cd6f
+c64ce893
+e4ff7aad
+6a8e9007
+36125f5d
+79a1dedc
+e95018ca
+43fe299c
+5bdcd9c4
+921af7c4
+1c20ef10
+dd532e83
+e20d69d0
+78ec1d80
+51095639
+74591569
+81ca3223
+bc6e3c3a
+fad805fa
+4b927c56
+55d9e56c
+f966d341
+8aa244c5
+9fb5b007
+fcf34ec5
+5e1d54ab
+6b15bc38
+15c045d3
+b96588d2
+673790da
+73b6604e
+87eafdec
+7c25ac81
+d08d9a3d
+bc8b14b9
+fea28307
+791d26d0
+15556a9a
+f18e11cb
+c8983142
+e76eee7f
+f9f7eb22
+8dca7de3
+497c77b6
+a025bb5d
+cc92fe5a
+838037e2
+6a62f247
+4507844f
+8c6800d6
+91fce746
+830a0352
+c19679a3
+2d2ccfe8
+dfa876d8
+4e44852e
+d64be7b9
+1305f2b2
+aa0115d2
+76503798
+07cbc0f4
+3fdb382b
+88154d04
+0c130f98
+9f4f09e1
+6a22210d
+55dbbe40
+92866377
+52c59e65
+4d547c31
+00d5a681
+12f3c872
+772b286f
+98276f90
+fe4ff14f
+22ea1208
+faf5d8b3
+87a0bd15
+af931d1c
+08403048
+57b704c0
+1a691628
+2bb26daa
+ec542427
+97a1ca52
+05e39ce1
+52077998
+2401b8bc
+bbaa92f5
+3c65d693
+aeb61edd
+03a1fa24
+af6dadc2
+b21b5fb3
+462b471d
+5c4f7464
+195e7102
+3e8656f4
+212a56b2
+0dea7444
+55d1ba40
+f2e9f0dd
+7bb205ed
+bc491035
+16bb3de8
+575b0d69
+2debeac8
+c6bce007
+869caea3
+7e1145c7
+5fca8ab2
+d859b4dd
+c3a031e2
+197a3a69
+69e7316d
+4f778213
+a4b9e88f
+41213946
+dd892473
+afec87ac
+a994f5a2
+d0c103c8
+874db5d2
+87a3b049
+54baf4d1
+4ac791f9
+00ba4790
+38ba8d58
+78572248
+15b4ecfa
+25e1b26e
+677163ed
+01eb9c81
+7ed7f502
+d48ac163
+93dd3d3a
+45ca10b6
+b3b64833
+8fc66e78
+10edf4e4
+27c676b8
+bc069ab4
+e29de47e
+6ca3d42e
+1335030a
+60b881bf
+5db13492
+068be52a
+e91468a8
+93800554
+d982520c
+6216a5f6
+d8e17d82
+f8009a4f
+f773db90
+9be5c7a4
+e773f0cb
+01811a71
+cf01163c
+54100933
+8a3cfad4
+60a57787
+a024e830
+67dc6ad3
+5647ab57
+602d066e
+57b5bc3c
+82b18aae
+e944e245
+d9556584
+576efe81
+b77f253e
+f188b2de
+ad4c56a0
+e7bc1058
+15789c48
+b6ee835f
+6b3a653f
+e47759f6
+a3d6d29b
+29f1f7ad
+8d653a3e
+23c0aaf8
+4de83b96
+488ba19e
+5b11b876
+c7300032
+80f697dc
+b3b23293
+a78905c5
+1aaa0c5c
+56adeae3
+7cb5b4d7
+afe9396a
+061b6467
+375c3609
+61ebc718
+343afb5b
+6870bd60
+f420e5b9
+d8d2681b
+279ba740
+ee2c9e5e
+0a60f0b3
+0e5f0a94
+c646e587
+ca9e488d
+b9ca963f
+cf6d0b98
+692bc8a6
+3aebd96a
+4cd51db1
+757d996c
+79aabb00
+886696ea
+29ece3ed
+9f196282
+b2f8b1c8
+0b6472aa
+adc36d25
+7f5d4de7
+a6d762a0
+e21e6ce3
+8495beab
+962a6272
+fb6b5384
+16668505
+eff6c82f
+2de3edc9
+a283230e
+e21a5a56
+a9a2ac1a
+04db156f
+d9d34c25
+b6a0db5c
+d6d254a3
+b1febde7
+364bceeb
+494a0755
+6b06946f
+9ef1fbff
+afb22fb1
+4b871c6a
+d3f51405
+0b49b877
+fbd1a9a1
+237343db
+45dc0576
+db977bbb
+005872ee
+ac3194da
+9fa3e01a
+1aa84899
+c8965754
+79d26e5b
+5dc43b96
+7d2e67c5
+b92e237f
+6870b565
+202530fb
+ba835183
+12c269ca
+ffbe05dd
+f1a27f66
+5bc7bd41
+4941cf83
+e5fca70a
+3bbc13c4
+9568f0b0
+723f08fd
+b050fb87
+5a3afc28
+a63ac4ad
+7f132c47
+270dc187
+0ddb9ed3
+6c1cdd05
+edc3c618
+5188732d
+28cee3a1
+acd8372a
+41769d4f
+26dd4ba1
+f1193908
+cf9f8644
+8451bc6d
+53264305
+454b4db6
+f6f4eed5
+4bb4f28b
+fa1177fb
+10533421
+f50560c3
+23c88dd9
+2d794282
+798b43db
+e4f7a0a4
+85e4d73f
+d2623109
+6df5380a
+46648158
+ed8e1d46
+382c8572
+1b256e61
+d9c126e5
+30be0ce6
+3da47871
+5a9a5de4
+40a0947a
+29334d40
+984f840c
+9195f690
+aaef3712
+a259b40d
+553f4956
+8ab167ac
+d28e90ea
+e45cf762
+521834ee
+fe300890
+9af06ad9
+0ad882d8
+6efc9bbd
+e62aef2e
+3214afd4
+322142a3
+f676f61f
+2364c1ff
+ada1744c
+1e258e53
+8590a491
+fedcfba3
+810c1d25
+fb38505f
+c9e1b7a4
+43f13e8b
+e02e537a
+4aa354ec
+8cdd8b87
+ad4fcc41
+e3f7a6d2
+3822e7d5
+c05c98ab
+1a3ece10
+88cba9eb
+07fc91cd
+67d37917
+67371b0f
+63991703
+d8aa072f
+8418f43d
+c62ecec9
+416be1b4
+78db00ac
+da408463
+948f70f8
+8e8b7730
+fbe24bb0
+8118a66b
+bd635379
+5dd5bf0b
+21c632fc
+e4aec941
+a642674e
+75aae369
+91e03621
+f0765a1c
+54c91918
+48be0ede
+75c8ca05
+215dcc2f
+dc6c5bf5
+44a9ba11
+fdd7a5a0
+cf004f46
+fb96278a
+be2f0db5
+53786e59
+87f419c2
+8d4a9014
+195e2646
+585230bb
+00a359b5
+8b5bb744
+df470bd4
+f38cb55d
+4997082e
+1d3415b4
+6bad707c
+7d5cc1be
+f090fae7
+811853ba
+32b2300c
+7745a8a2
+c3f5ab54
+02d1a5f5
+8bfb698f
+6e9a987f
+e798ac81
+801e4eb9
+904cf83e
+d5b4ea7d
+2e0a0035
+5a4adb7d
+20621cdf
+a219426e
+ecb5aa0d
+135bb8eb
+71033737
+68f7a8c1
+b43d6cf7
+200b8e97
+56341849
+d968c351
+732eaaee
+34461b7d
+b001878b
+99aa8de5
+8d7ddeb6
+9149f647
+06c8336f
+9b82dd5e
+9e9a60e4
+d4af2638
+48a656c9
+7e862c7d
+74ccfe0d
+ffad9766
+b10c3a35
+4a4b30e2
+ca774bc9
+0b655ce1
+fab2a151
+05dd8229
+b71d0371
+a3da2fa0
+2b0975db
+aa1b807b
+34f0f59d
+90c9635e
+aea3714c
+24fb0c28
+4f272e57
+c85648db
+fbc5ce88
+0113e0ff
+5a3877a6
+3e30919e
+d8223456
+ddb359cc
+ddbb8d8b
+974a5c95
+9266d2b9
+8940103a
+12d17b0e
+0bab5d7e
+2cd44b01
+396a8f90
+c0d61a5c
+4a36b55b
+eaa0d62c
+f1b99840
+be673bbf
+f09f6f4e
+f14753fa
+cc80b3be
+0604c2c6
+79d2852f
+d339d9f9
+b150ea7f
+abe3a684
+eb8b03b7
+610a186d
+e00b3005
+84a27184
+51360e73
+88b17bc2
+7724aff3
+5e26f7a3
+c5c50484
+c69e1e8a
+005409f9
+a39c8d13
+a86c0565
+b9ef321b
+130b046a
+671c2b89
+ae362bdd
+7986b75d
+b8942a02
+b3398ce3
diff --git a/models/rank/dcn/data/sample_data/vocab/C25.txt b/models/rank/dcn/data/sample_data/vocab/C25.txt
new file mode 100644
index 0000000000000000000000000000000000000000..f23c2ee0cdd3de63cba3bd06d77e3042001f113f
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C25.txt
@@ -0,0 +1,40 @@
+e8b83407
+7a402766
+fd2fe0bd
+ea9a246c
+445bbe3b
+8b8de563
+9d93af03
+c0812fc5
+e13f3bf1
+e0f2931a
+07ee399f
+5c813496
+001f3601
+59e2d823
+9721386e
+875ea8a7
+82d3ae39
+24657b11
+2bf691b1
+8f8c5acd
+c243e98b
+3a6f6b59
+51c3d1d4
+9b3e8820
+ce62e669
+f55c04b6
+b9266ff0
+47907db5
+010f6491
+f7839e21
+46fbac64
+3d2bedd7
+724b04da
+1575c75f
+33d94071
+f5b6afe5
+c9f3bea7
+f0f449dd
+60c2b362
+cb079c2d
diff --git a/models/rank/dcn/data/sample_data/vocab/C26.txt b/models/rank/dcn/data/sample_data/vocab/C26.txt
new file mode 100644
index 0000000000000000000000000000000000000000..1cb7d644ce8688720752f3a74f7494df109d824e
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C26.txt
@@ -0,0 +1,1349 @@
+72b1f187
+a3efae54
+00095ee6
+c26860c7
+06a75e35
+3e01e343
+52769d8f
+47b6f269
+6ae9621c
+42e1f6ba
+03e419e8
+ef37ef57
+e85c38ab
+e3a60438
+d2bea05d
+90ff72b0
+9201d3e9
+10373592
+d2e794c1
+08b9244f
+60404332
+67ebe777
+691a6530
+35ac6843
+5cf0065d
+d1d45fc5
+0da76050
+c97d7481
+54ca28ff
+cfd96da1
+754857d9
+ec6b0e7a
+3f8f3f3e
+1e111169
+f3737bd0
+5cef228f
+d6ced323
+159759fc
+968f2b7d
+2fc5e3d4
+5382327d
+4a626f32
+4821aad6
+87d8715a
+5cd866a0
+22174682
+81be451e
+1c57d70a
+5036f81d
+9f96fcef
+1295442b
+f0fe762d
+0ba6ef32
+c132cd74
+c9afef22
+08f26f5a
+9502e7e8
+4eb653b2
+35599f97
+2f494e85
+a6749956
+584d8464
+0c1bbe81
+f3b1f00d
+cd0af79e
+b4bc8bff
+350a6bdb
+99f4f64c
+31c2f4d5
+bf09626b
+d9c73336
+cfe622de
+fe5e6399
+68e890c8
+244c712e
+9b8e4e84
+09dd415c
+98f2cb71
+5236e08c
+40cd9f3b
+09b3050a
+56be3401
+25e83f59
+ef472e96
+06ffc780
+3d186c66
+963a4122
+8221c0ef
+66b8aaa8
+ac26d563
+2de6f28a
+14b7c6f2
+226a47bf
+66301586
+51e206f9
+79beb634
+3464ae5c
+fe56b763
+ffef2a68
+6cf80e8d
+41d8ab16
+06878f6b
+9bce6de4
+dd830c09
+a5df7777
+210d4617
+4900cea2
+3fdeab66
+39f4914b
+b892ed1d
+9523b7f4
+4271e99f
+e45edfc2
+a53e3095
+8a657198
+af9e8348
+c68033de
+9863119c
+2674da8b
+a475662f
+b502d7f9
+c3dd86ad
+509bbc3b
+3d83c402
+418e75bf
+fc8ac6ea
+a35b580e
+a6d189cc
+7aff7640
+bac82070
+a5f6b352
+08b44c88
+6704f643
+ac9faae3
+316b9ae1
+199fd640
+e8048e9b
+00e78c76
+b1d427dc
+79bf1fca
+82503832
+e738d3f2
+63093459
+3a01ee68
+9af95efe
+c3546e32
+e914c1e9
+a1004638
+7587faaa
+db4d8c37
+99e71d4d
+30fe33ed
+92f24e76
+851a6ce9
+44de9504
+a034ca84
+c99deada
+01c838a0
+7d171dcc
+28ddfbd0
+8b80cbf0
+a8914830
+25a34e8a
+db1e2da7
+ec1bfd1a
+16ed7d69
+b29c74dc
+ea481fff
+f8d62db8
+0a5456c3
+a81ff309
+ea609d5c
+2c444dca
+d5e46346
+bb721794
+c1ac5aa9
+1206a3a7
+d90c05da
+c0608db0
+fd6c1cb5
+903e4adc
+8fd6bdd6
+d7af0b10
+a388434c
+99fe62d7
+693f2ab8
+61221532
+5cbce625
+515eed79
+741fbc61
+901912f0
+512c7a53
+0278ad53
+c5116a1c
+1cc46575
+b556b353
+dbc4d43f
+0015d4de
+6d7f6444
+aaed6d5d
+5df320f0
+826dc169
+4808a23d
+9e08f841
+8e0dad53
+1faf87ce
+0ba13929
+fcd456fa
+157c7745
+ceb009c6
+1d173a83
+2d056c0b
+51ef6654
+5a4d1669
+cad27838
+66045105
+bcf1a4a2
+d6fcf5cd
+c2d2389a
+e16aa324
+45eeb1ce
+776d35d1
+bf3ff876
+ba676e3c
+4238d56d
+e57ef3e6
+4d6d3594
+12e53712
+fa3124de
+9f348762
+61556511
+920f8c9a
+bde577f6
+70b6702c
+9c1b4a8f
+c4510344
+6077d3a5
+2fede552
+94c22011
+5222bdd5
+92c878de
+80a27288
+a8d546b1
+df576608
+6dff734f
+4450e3bb
+1d6cced3
+70451962
+3ad3379e
+d2473bc3
+748a713e
+8efc26f8
+03764a6b
+33757f80
+f69424a0
+e001324a
+213da196
+6ee7509a
+daa08341
+d0151bcf
+18d5df57
+b74c530e
+a4ab2614
+9fab4921
+7826e9ae
+1617d874
+705ee187
+fbc30346
+04442161
+93791661
+1b8c6698
+e9069368
+6d75a07c
+f159b6cb
+3ff1af9e
+efe23900
+315ad76b
+083a8da3
+904cfc1e
+cf995043
+e6cf77ec
+b69ff5ca
+bdf46dce
+a329b171
+2967ea07
+46d78034
+fadcf53b
+aa5f0a15
+4c6b94ab
+0afec7b1
+da9e13a5
+32280082
+fbe10aa8
+6023d3f5
+4e87ab18
+84c3d38c
+24420a37
+882f541d
+08a0303f
+22aa53d4
+1f477f83
+709dce56
+5a3c4cae
+f9a614bd
+44eced7b
+b4e0464b
+44af8b3d
+88b19c6f
+1f05f21f
+18fda1f8
+0f81083a
+294b0c0a
+fcd5a3f4
+5bdcba46
+224b88ae
+b8203831
+8b8b6204
+b2ddae6c
+3090e38b
+1a8cd4b6
+3956fc5e
+e225c8e7
+39b76061
+0cac137f
+a7c08b38
+e59b81f8
+00535fba
+0624d126
+e8cee7fa
+f0760dde
+3508d15a
+cf50627a
+f379f550
+e4b8a64c
+4cd8cb41
+1cbdcc4b
+2a30fcd4
+afd260f5
+3135c89d
+304c20c9
+a3a8e8f4
+5b729341
+d94377ca
+4a5cc2a6
+c5686cdc
+718a0860
+ca2ddb62
+bd422f81
+6167761d
+de50512a
+4d0c145c
+3826071d
+e3a8d2c3
+4c7ae747
+94525e91
+2139c64b
+cb5fbc1c
+9641e9f4
+7518c447
+48c7d77f
+f55e3520
+dc1553c4
+6b2ae7de
+a0d02864
+69148c9d
+7e6f9939
+c1f5c61c
+5a622c49
+37c5e077
+e75c9ae9
+4e64b6da
+65aa49a4
+ec75f0c4
+f868e7eb
+8cbc1758
+dd8b4f5c
+92888dc9
+cb5a2b38
+0b57b1ec
+f15edf91
+87da501e
+dbc93f04
+f06304e1
+da681e04
+9fcd80f8
+a602d432
+4d0197e4
+8769d55c
+f88aa0e7
+49e5480c
+68a2a837
+ee61903a
+e29f7e91
+0e35119c
+c470a649
+9b1e24b6
+779ff446
+b172a55e
+28f3d990
+68a56e9b
+a3f0086a
+1768030f
+4589f06a
+d19c4cc9
+3e0e39d6
+d4998978
+e222caad
+dda91ee8
+b2ab83fe
+e6a6ec4b
+293abd24
+5bd8d5fa
+fba4325a
+bb574173
+cc0e8b70
+4a31693b
+7a2fb9af
+1e8667cd
+3bb6f6fc
+864fc796
+69c18da4
+d4060883
+9a483882
+3d04cc90
+b655cbf8
+b6030675
+cc7b7077
+be1fac76
+45d0d806
+cfb3c3bd
+0e25d9c4
+7b840f23
+f9607f7a
+eedff58a
+2ae73226
+453fce6e
+510b928b
+723c999a
+819513af
+9d26dbb7
+27029e68
+adf2a43b
+16edf87e
+98ea1376
+ba14bbcb
+a6308e9b
+8d8eb391
+7fd6516e
+7dfcab64
+9636866f
+dc39e0d1
+03f7961c
+6527ade9
+146e6aa2
+bb3b40dd
+ba2e2d00
+af6e345c
+43363def
+0347f0d3
+db4bed22
+adf647f3
+b7ce06a8
+d5ca783a
+397f057c
+dd0ce319
+e1c088a6
+295b0bfa
+d4d7b05b
+73cca889
+9dd6ee43
+e740b2da
+fdeb9c92
+f8cffed2
+d15c0cc8
+49a2a201
+98279775
+2abadb4e
+cd888bfe
+d121595f
+7ee6b096
+eb178148
+1fc8fb5f
+14ca3f83
+e17643f0
+3df61e3d
+1731d5c5
+4ea41590
+1658f2f1
+5e24a43d
+6fdbca64
+c45afe49
+a8cf207e
+d37be573
+75bab1b7
+13e09a2a
+c8a4f9f8
+8aa85f14
+1f622150
+88eee740
+bde2bc96
+b4a4615f
+6e6b0fbb
+cea188cc
+708a7ad8
+e76e842a
+79994cbb
+1e370030
+8f72e5eb
+bddc0140
+aa662b71
+4fccd22c
+8f154cad
+409c7293
+140d9b7c
+0e808c19
+bb51a9f4
+1b220bb2
+35a05896
+0c57a22e
+6ab28812
+4be29bf3
+c4304c4b
+64a82175
+f25f4271
+2249063b
+42956c28
+d509cc5d
+77d7fd9f
+03219b28
+c15d6087
+f46b117c
+36b08c04
+7359befb
+47d6d9aa
+e86b2239
+02c5733a
+8b781ed5
+6a25b225
+b2a97390
+f5908455
+723e40c9
+26d20f88
+0653fd2d
+009abd69
+78ef31a6
+cedb4060
+cedbd4b5
+74373432
+984e0db0
+c1038b71
+84e33fb8
+6de18d63
+e10bbd8d
+cd93c731
+2b839123
+db0fa156
+4b6a03d5
+06f77b0d
+e9b68fcc
+8115a695
+001e34ee
+0cb77ebc
+355a8ecb
+cc7a9215
+65bbcf43
+2702fec5
+6bee583f
+4f47de0a
+fb7edec8
+3055b376
+93f6d392
+b7d9c3bc
+3662e24f
+cdd2b5b7
+1a7d228a
+c106e37b
+f19b4688
+ad8965e8
+61e54666
+f05fd399
+f231aa7f
+a08dac3e
+8aa73a74
+54b49e3f
+8538ac3f
+231fab9e
+965db6b1
+ff6f8ceb
+d14e41ff
+3292862a
+50d64b65
+c037ee57
+ce0bf6fc
+067308a3
+6c679105
+1e21c6b8
+b9baa3d9
+af7ece63
+674bfc41
+37a5dd0a
+aaf35019
+5a205e8e
+c9940502
+96c9cf5b
+873786f7
+b112057a
+ea98f275
+6700f0d1
+8ca33e04
+d3b1c827
+be8b4aee
+8811188d
+aa86a675
+100fe023
+efac4835
+c40a0966
+b95f2202
+402af921
+4ac03fcc
+62436b08
+eb9a9610
+f945c71c
+653aee86
+8c53c771
+c6468596
+7a6357a3
+c73ed234
+6598e0f7
+6ddc02f9
+afb16416
+14d40504
+608ebd3c
+d8abefe6
+456c12a0
+d745cf91
+d885a38e
+40ce7785
+6b7a26a4
+04f73185
+e5ca37e4
+17c3439f
+4671943d
+d8ecbc17
+e8990e6d
+84c60230
+69134261
+b80ee752
+95349313
+8d2deb5a
+029398e0
+d67a6f5b
+5ebffeaa
+14886693
+632dae8a
+8ebfb2ce
+ee926a57
+72ecd2e2
+cb451a6d
+731c3655
+997369c3
+7bdcd81a
+c79a3a65
+457ebcef
+074bb89f
+bf8367ba
+2b5727c4
+7438db2a
+1d8c941c
+0e8e05b3
+496b82a5
+a649b97a
+c97f0488
+9a7a203e
+f18a1a20
+b11e5179
+0b945011
+56133ed0
+0ca8dfbc
+4d75aff6
+60228615
+9bef54fd
+21db0a46
+59c54af3
+05c7ae70
+915daef3
+f524c586
+6ddf47a7
+41eb481f
+b818451c
+3aa72374
+b91a7c2c
+1f44c09d
+87db5143
+9569c251
+2de4bd28
+0355bcb6
+4640af2d
+30d2386b
+7ebbdf14
+8d8d9c8f
+70f53519
+c131e22f
+c6ee8096
+e93ac0e5
+58274fd2
+8379ffe4
+0d428e85
+bb4e2505
+67a53a1d
+2080aa8b
+e1cc3a15
+abbbe19d
+fa3deb10
+56009c93
+434a7801
+b9d5a998
+b91c0f78
+0f43bbd8
+21fba449
+9b5de197
+1aa2e416
+1ba54abc
+08e0e995
+d10d0f6e
+63d1e016
+a2a92e0e
+3b410cc4
+b13f4ade
+0dd4407a
+e539c901
+c3c038f6
+eac68b2b
+63c80f7e
+09929967
+b34f6f11
+655ff9ca
+6c1aedfd
+2a4caf7c
+7de5bac1
+7b1f0488
+83293b11
+cb545e02
+8aebbad7
+48b0a13a
+f89011c1
+6b0f3f1b
+be31c894
+9ff5fbc0
+25bf05c2
+447046b0
+dbd4e512
+2921e678
+5048bd93
+00cd7c8a
+247efbab
+a8e9d3b6
+f7b3eefb
+16cb50bb
+73edb4b5
+fa1d538c
+3fa6ef2a
+caf3d9aa
+dce6f283
+bc33111f
+fda55550
+fc14c8a4
+fce22553
+0610d914
+5b938117
+a412d5b0
+175120e4
+def102a7
+c55085ba
+c07c490e
+fc92bae1
+73128e71
+b371b4f6
+6aa7a624
+a2ee9095
+56d7abe3
+c4206c8f
+f1e9162d
+e9e3604a
+c16873c9
+2013a96b
+b820b6c5
+adb7a642
+d6f0caf4
+2bf605fd
+06016427
+8e1d68da
+0e7eaf4b
+b1617e25
+539454ce
+d6952d32
+b1863223
+ec4dcfa0
+bebc2875
+d12e1d0e
+2efac7d3
+6867cdf2
+89c50118
+60687331
+e914a548
+4612bd34
+d73f0a8f
+0e6e23fd
+d8e7b1a8
+02864ca9
+9bf02322
+7d704f34
+4a2664e0
+8f16a3b8
+64d6cfd3
+52bd701b
+ae5c4798
+80b0aeb9
+d25a68df
+8fc187f5
+d7fb715c
+89678cd6
+7814ac53
+7f3a7824
+c4dffc79
+e5475fb7
+d0cd2d37
+edb3e7c7
+abc00283
+4426ce6d
+a00dadcb
+b999534e
+1954e885
+b54e5d63
+ff86d5e0
+cc884ba5
+f3b2673e
+029d11c0
+ab67d6da
+988b0775
+35fe2911
+a2cbc80c
+cdfe5ab7
+2280f93c
+4e461000
+dbd747f5
+de298f4b
+57504b73
+b637e203
+9b48375d
+e493f174
+f7924e53
+12b59965
+375da37a
+e1282c94
+0b9b375e
+69e9d621
+006df4eb
+975c3ad2
+e8b90e7c
+0093c58d
+e564f221
+9727dd16
+4a6648b5
+86d002a4
+fc5e989c
+712ff32e
+0facb2ea
+124ee8f3
+47ee0e11
+1c2df582
+be8d43d8
+8a7c7f69
+6102ce2c
+d21d0b82
+24fbd9e7
+6aa23ef0
+86e082c8
+8510538a
+7a1ac642
+78b17569
+8aaf36cd
+9042adf0
+6797655d
+5fd064c9
+23caa300
+38e62aa3
+c90af692
+71ff9cf8
+77e5b96c
+04e822ec
+3ce84a7d
+7d5cc2a2
+877c5de5
+fa7237cc
+138f9d21
+5d579413
+03e819b2
+d4f283bd
+a9637a08
+a68ab18c
+f769f95c
+c1544f0c
+57737d26
+311f18a0
+b5170d16
+9f47d894
+2f49b025
+3c13f4a0
+3279611b
+2339f197
+d7c5d96b
+12d4e9a4
+15d33de6
+6c27a535
+a4ce2a3f
+834c6d31
+cfdd3db9
+e2f05ce0
+3433dcd5
+8ed8cc23
+7f6a8fa5
+64f08cc6
+13e48a90
+27924c6e
+c7c9b407
+35571de5
+d8a062c4
+78c19ed4
+602f0609
+45bc0805
+a00829e6
+c84c4aec
+777607e9
+08fd6ea6
+73aba5e4
+d31e0d71
+91116abe
+59b96d68
+b025bfb1
+03ed121b
+5694d529
+3a490508
+b08575a7
+2ddaef64
+6719eca5
+fc1f43e7
+db23eb1c
+4eff5f13
+def43eb3
+a91b75af
+04a4756f
+3b0cd992
+c43c3f58
+d9bcfc08
+d064b1aa
+f0670357
+7d7023e7
+5bc9845d
+c986348f
+1fc6a45a
+24d28202
+7ece26c3
+8e63e212
+0eabc199
+3547c540
+f921e5fe
+61826ff0
+8270b5de
+9973f80f
+db3c44a5
+c2e4ae15
+c19104a8
+df7c8cef
+cc7a24ff
+51bb992d
+6cbbcc23
+a4610e92
+17723a96
+54bb9bfc
+85efdf7f
+c8816bd2
+e8060a36
+9b648db5
+11616784
+0abd210a
+c338ef12
+adb5d234
+a6e524ab
+2176cd69
+fdd35dbf
+0fd820a6
+24185db6
+2ab86020
+80dd0a5b
+195abe7f
+93032663
+5f3ed85b
+3b93a75d
+4f001fa6
+9809272d
+52d71dab
+50eec8cd
+1c7f8927
+11a8c449
+f64d3d19
+30700dab
+b4aa4b3d
+33ced911
+c935b06f
+9fcea641
+f9a06465
+0d2ff409
+42f54cab
+a9e64f38
+ea0f3efc
+77c1aeed
+3dd87dd4
+64cb0863
+ebc95c58
+d43753e1
+42c1aa63
+a32ba1ec
+8200fe31
+0dac3b4c
+810e46b6
+42caded1
+b2671abd
+1da4142d
+b7047847
+a612b8b4
+55f266e8
+58bc84e7
+414c6af0
+65e4fe4f
+51714c65
+f91dfab6
+06421b62
+60132ddf
+d04d1ab5
+86d572b3
+bf7e4243
+ec20c783
+c00553f0
+46dc009d
+058ed731
+349c5cb3
+77152453
+cb1956a3
+71961874
+cdc744c1
+c1a1e60b
+dbcaded1
+05e40c8f
+b9809574
+c5aa6f0c
+311b370d
+61089298
+c36f2d3c
+b1966909
+0adbea63
+f5416171
+cf600c0d
+51751315
+c4773b81
+fe158ee6
+28f9781d
+7476f449
+3b9b1ea1
+ed6846b0
+75cf7ffd
+ed284d23
+39bf9f5b
+ba383c06
+b8bac2eb
+49d68486
+1ab47dae
+d7fc74d8
+9f6a34e7
+b1c17344
+895ec2a9
+0e74a422
+bcc9f601
+2d16df54
+fbd8ae1d
+2ba3ed14
+02ad7ae7
+938732a0
+3b76b414
+40392c4c
+99d96479
+7d493915
+5310c9ce
+fccdc224
+491eeeef
+4c0bb69a
+89e1abe5
+14e36524
+2e5068eb
+6888462e
+816e7c4f
+71236095
+d255f8ed
+a2de1476
+2f44e540
+69c04c92
+8903c4b2
+3d90e67b
+8060e460
+dd2b7392
+c27f155b
+196d3b7e
+bfcff13d
+85df71bb
+f5422ec9
+cf5f6c51
+55449beb
+8177a934
+478fade0
+3aa1a67f
+5fdad95a
+1e6c340b
+91002aff
+cdebf969
+87f23977
+9bdb2a8d
+cd499895
+9c48b948
+f2ca47cf
+912316ae
+1219b447
+87aaf011
+2841fe84
+875d7cb9
+93cca2a3
+a6dec5b6
+00dc9eff
+53d3a2ce
+8274ec13
+c7c83a0a
+d8dae084
+322cbe58
+d1e5f5bc
+e47d5fd6
+4361c4a9
+2ab494e0
+8e1ae331
+5f51d065
+65d78c0a
+26c64313
+57f5fbc7
+9c9c0cd4
+2cd125c5
+b6594d59
+a869683e
+aa636b9b
+22594278
+a8b79368
+192c2d47
+98b5582b
+df5cbf86
+275ffb27
+bae5dbd6
+0cfc1e76
+69dba424
+d5c1ba16
+ea2a173d
+fffa8e76
+7833e4df
+9bd47970
+115a51cb
+778164cb
+25b8934b
+ba003504
+93fd67b7
+91183804
+c26ea7d4
+01774abe
+d775bd12
+98f9ccac
+621c7d33
+13843d40
+d3d314f7
+75594775
+5fccd633
+c025847e
+ddf88ddd
+8b6cc5d8
+bb105ac0
+de8f997d
+7d045fa6
+f3fccfbe
+e8200f0f
+f8d04f33
+a9a5cd4f
+39dfbbfe
+4e7af834
+0f79cd7e
+90ea1eda
+69a16fdc
+cffab2b5
+04176629
+96861b56
+a39e1586
+d5eed791
+50884cba
+8d0e6adb
+3c437d2a
+56a92c60
+45e9d861
+11c29831
+5d379412
+718dad9f
+f07d0cfd
+2546059d
+1d15466a
+be62abfc
+45514bad
+b8adc4e2
+6eefd8e9
+d3b2f8c3
+6300d686
+f2e315d1
+849a1ff5
+270f2890
+87700c4f
+1d5d3a57
+c0db400f
+ec01bf7b
+d67dab89
+955db1a4
+1bced191
+b39f5eb3
+770ab1e0
+b1262c8d
+1790e73e
+1b0ebd59
+bad122a7
+bc84f389
+7c5de8f0
+07752963
+505dde7b
+054f795c
+56760dad
+caf035f0
+96911ece
+3f2113d2
+d6081aa8
+bdcdb638
+22a3ad95
+4fd13aa7
+3ec889ce
+4b66b782
+cd9ee518
+5103b5f2
+164d3259
+b3606536
+9a241526
+ec26ad35
+da9fe092
+ff7dd6cd
+8e08ab72
+f3ea27fd
+9fddce14
+e0671437
+a9f179db
+3c99d8c8
+6aba8db0
+f37f3967
+c250242d
+f496700c
+ee42dad0
+9df43148
+4a9edd30
+531b4491
+4abca469
+32d7ba4f
+058079ff
+404b6540
+ab7f43ef
+72b8d755
+6fb72cba
+831f44d7
+032e8f3b
+67d450ba
+02fa3dea
+ae34bc12
+8d4e6d8c
+a4e8e846
+3dddc622
+8b3aeb32
+31129d1d
+2d3a050a
+966cf5e5
+6242c732
+72b8d7dc
+2232b44f
+79ec5c37
+6f451ccf
+c004c6bb
+85f5af35
+9a556cfc
+fdd86175
+8bdda8d3
+c9bb7db4
+462a6b96
+9b10b2d8
+974bb6b3
+b1794f41
+1b462c8c
+58a43195
+54f6aff5
+bcdecf97
+6b5cead0
+81c5905f
+248f6630
+3738ef73
+fda9dac7
+7ba39386
+e9510b62
+c0c13be0
+d399af40
+7d50b7bf
+18ebc7f5
+1be6c7f1
+323aab6a
+38a0a8f1
+6321e1ac
+9904c656
+891dabaf
+79446d01
+55432fa3
+4ca64076
+eef24cfc
+ee99a821
+ba46a667
+0cc1543a
+422e7113
+e2c9ffae
+b6a3490e
+28793267
+6605962d
+3b8f3836
+3626d9ba
+15f5348d
+72966777
+6935065e
+0b35b5e2
+d1478842
+aee7bde0
diff --git a/models/rank/dcn/data/sample_data/vocab/C3.txt b/models/rank/dcn/data/sample_data/vocab/C3.txt
new file mode 100644
index 0000000000000000000000000000000000000000..a794b343b63ddc4d727a454a1485b5c557fdd679
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C3.txt
@@ -0,0 +1,1548 @@
+ddb1783b
+ab0c401c
+015450da
+b9769c83
+2ed12532
+fbead95c
+ff030570
+40a52a3b
+53f9aa51
+455c46fb
+c658dd74
+bc4ec02e
+9577a60f
+5fbbb62b
+5d2a373c
+9dabad1e
+847d4d09
+1bc3b471
+af709d76
+6ace624e
+948a3e21
+fbfa5f46
+2ceb366d
+727847c6
+196e2242
+5d3a12c5
+d0c01507
+e42cc8c2
+23aed453
+af5655e7
+b2a093d0
+2efd4490
+f3dd6a1f
+d577be04
+f89b52a7
+c0dd0705
+7da86e4b
+b9b98a33
+340745ec
+ed458a77
+9eeccff1
+41f71350
+2dbfa925
+6e5bddab
+f9f78e8c
+0662f9b6
+b3eea8ef
+bb6adea5
+06ec8bb9
+d507fb3f
+a1780056
+993aa5eb
+77b35cda
+850e7558
+3c548aa7
+2ebfb65b
+6de5c512
+74cc1f20
+ae06bf90
+9143c832
+2c2915ad
+af169d43
+ee6e4611
+c804b8dd
+bce3f26f
+4a75b52b
+d1825ce2
+a544b0a6
+8512d54d
+2279cb88
+0cd3102e
+b3770b31
+27c6cfe4
+a47ffdf1
+8dd2cbce
+948ee031
+3b318456
+d4f32f4e
+fd56a128
+31c3612f
+5cbc786d
+c5c5f657
+4e1476b1
+e52136bc
+cf59444f
+50b60277
+9f7c7c99
+0a1435c1
+0c38a323
+233378be
+9516176e
+9069ec83
+5a45eb29
+7a6d9b45
+0e4f7fa4
+23ee74f4
+97053b66
+80793437
+44b0e0a0
+81d4a8c7
+c711073e
+c81ab184
+09dc35ff
+9a44114c
+c3a3f4e5
+378e8302
+785f1adb
+5cca3efc
+f9506251
+f9e4de46
+e3a92241
+b542f5d0
+d1e7b5d9
+fecb26a1
+d8d869f2
+814a9e19
+a5079607
+e126dcf4
+90488a91
+d03e7c24
+63f00f19
+2bce25c8
+322488b3
+ababa78b
+a4330150
+1ae827c7
+0890aca7
+4824fcf4
+0ed56658
+bfadb215
+b1b6f323
+539fbf75
+597d596e
+a98d9b55
+e48e5552
+ea6b1af6
+2b8564f8
+2a4c390e
+1b3c04d7
+5bc87b59
+4552917c
+9c1c85e7
+3a864d1f
+7bf6b80a
+95c48c52
+ddda0e18
+7deeba7e
+7f15ef1f
+b9e3d20b
+d3386cca
+2c14b932
+46928ef3
+80e96ca4
+580235b1
+d8461118
+ed458acb
+063d90a8
+6eb4e2b4
+ec13b576
+2c77bee7
+834c6d29
+c92acf96
+e9015daf
+722f4985
+6f2bf6d4
+6035209d
+795c3cc3
+de211a17
+7153a1f2
+2632cc5e
+fc1cad4b
+7e3ce46e
+cb2e352d
+13f4e374
+146971d6
+c0ac520f
+eef2d500
+125bfa8f
+1967dbac
+704fe5b6
+1c239854
+31692ff1
+3c5f3c81
+a5781643
+82593e7a
+9972167d
+119bd655
+6d4629e0
+091f2f36
+1858be83
+f1c030c9
+4f9666c3
+c510e130
+77f6ea44
+02bd7bb3
+4c56abb0
+faf21a45
+e78bdebd
+c48cd8f8
+378112d3
+5d1ca1e5
+e74ea3bf
+2787247d
+9b792af9
+ef4b47aa
+40cb550b
+ea492f68
+edd5d2f5
+961d73a9
+6dace5b6
+2c727545
+e4bd2d4e
+1bc9054e
+4197e80a
+8f9e5a86
+9270e53b
+2401bdc8
+0e38eff1
+225cd5cb
+7b36f13b
+aec46440
+c798ded6
+8b1bfdec
+b34e125c
+24c93e37
+75746b02
+6ca29110
+341ad6e1
+79ec5d3c
+6f6cdf1c
+1a3f9411
+6b8d3b5f
+16b30097
+92c2a738
+adb580cc
+29bfb2cc
+7542ab52
+d20fdd45
+cdf6abb1
+aa6d27e5
+551dce64
+b7506183
+173284a4
+74e1a23a
+08de7b18
+d91a58ba
+b0874fd0
+502f417b
+05e27d26
+b642d4d1
+1fc15dd2
+ab654591
+8dcd646f
+e980a986
+b7810abb
+3c402e7d
+f82fff8c
+7ae80d0f
+d33de7e6
+df58a0e3
+ed0efebd
+efcccd46
+6c865690
+1f06b85d
+4385e7c4
+a5cea538
+d88496ea
+840eeb3a
+a56bed9e
+7623ee18
+d384736c
+9c64e4bb
+dca5d15a
+397bf656
+ad82a6f4
+68f5ccf5
+68b7d1bf
+1e88f1ba
+af21d90e
+c4f1b7d8
+03380d83
+b9d09879
+010a5a81
+2958f27a
+0a83c6af
+c4b406e2
+a3c9e878
+1c8bc12d
+d36105b0
+b6025941
+d1063f65
+050447a9
+8ae24953
+f544f869
+1704d8fe
+bab1512d
+ee9dd06a
+2b280564
+250f83a6
+962bdb57
+0021f343
+4470baf4
+293cd84e
+d6be853a
+8d9e5174
+d20dc4ff
+79b8616c
+6c04f392
+f84d40ab
+762fc25b
+b915db9f
+01bd58aa
+66b3c112
+c1102de8
+97469fb8
+b12880f3
+889981ce
+63f98b5e
+f652979e
+109ddc69
+dbd1e7b4
+bd4d1b8d
+b393df87
+daf652b9
+42db3232
+df00c4f0
+cf6e068e
+72e5eac0
+ae9684f6
+fb5c5031
+39800186
+ffe810c0
+29a24574
+e28faa26
+946aa7e7
+51f449fc
+bd45f0c1
+3f2cf26e
+04cc3bef
+1e08dae2
+0035792d
+b7a01d06
+82e8e720
+6d8e7880
+48cbfcc7
+e38f9fdf
+82059855
+5eccd0df
+f27a97c9
+a01554cf
+7ccc8a31
+e9c3a8e9
+5c05f1ab
+bdec0e5d
+d9429dcc
+7d0bcac6
+519ea5b3
+c47701be
+1e358e52
+0fc95866
+28677526
+9011c162
+4a48fe14
+d743235c
+d5e9e7d7
+53dd5463
+46818393
+c6ae415c
+5a1201eb
+60c37737
+93915f07
+505d7888
+5c7adc62
+69638440
+a5576c90
+5e975d57
+0a1e0060
+b009d929
+af709891
+1a9dbb50
+12da0bba
+223b0e16
+59ee5d15
+b2e50211
+93655629
+e7f030ce
+4387a5b3
+dde182a0
+9b9cd1bb
+46cbfec9
+1c81733f
+615cb3c8
+a65db9fb
+48a29f29
+6406abee
+be3b82f7
+7fee21cb
+7555338e
+9b47d5e7
+93656bc0
+517bfe1c
+f48a1b1e
+55f6b073
+2296d72b
+059b987c
+2ee4d4e6
+837d918a
+c8437bbc
+cef97273
+9c5f3db4
+75e2b65b
+78a62ae6
+dffca8ba
+332bfa7b
+67212b6a
+f9aed79a
+87c73e27
+20009f96
+c041ee3f
+c8d6ca0f
+60dc6bde
+382be412
+a816b43a
+58d4eb03
+2f60f9b9
+b0830d8d
+cbc2a46f
+15b27e1d
+dbab5bee
+f33f3202
+af2e6b8f
+e9ba3c02
+26dd9517
+0b064033
+724423ff
+5fdb3c8e
+6e82e332
+d3d01a24
+d594ce19
+a84c8242
+69866e1c
+e6319005
+4d564a1c
+2ca73647
+258e7f83
+d135e418
+a7ded28e
+cd2e49f6
+469781c4
+0739a78d
+489ec9a9
+ce2fc08d
+9a46383d
+d032c263
+7509c305
+f7e2684b
+aad619fc
+8205c476
+d43f535f
+996a90ad
+4ec501f8
+45f68c2a
+5a5cd8d6
+79f19a9c
+9db30a48
+b1ecc6c4
+253ec1f1
+ff92fbfe
+96cb9be2
+873cec9e
+ab223c32
+17b70ad0
+5aa043f5
+843d5113
+2c43536d
+d822fd07
+e56d2e98
+8679dd44
+90097620
+40e7e38c
+89ed150d
+0b0f3952
+a472032d
+f74669f4
+1967b0f8
+7802d793
+b015b8c5
+92ee4c97
+8160d721
+8b9d6c77
+9a5df585
+6e981d07
+3a192a8d
+8b5dc0cc
+83d8004d
+bc236a94
+80796f11
+e426872e
+35e8c904
+274507c6
+feeaa45b
+dcb20533
+d407d36d
+d2c61e57
+4b3e29f1
+a070f5e9
+75df6d36
+10a2b97d
+0961ef7f
+71a318e5
+1c3d2e8c
+561f09ea
+5da92388
+c470c841
+03942b3f
+66b90a69
+164421ea
+4b1bf593
+d35af343
+406c45fb
+3970719c
+e7290a18
+6c2a81ab
+915bbd8f
+6cea144c
+58128c6d
+ce013d83
+a0410512
+47d34037
+3e4271a4
+50d9a8f4
+4bab508d
+f00a9cf4
+5f169e93
+db080e48
+0d7d1680
+f794f100
+d5e7f4bb
+fd53d470
+a52e6cfa
+8e8980b5
+03f5e595
+34aec1f7
+3c368043
+4e1c9eda
+2810497b
+1bb8c9c7
+9c0ff811
+6bc1a9df
+353686f2
+c5aee658
+f4caa708
+c02372d0
+d73903c4
+04db0280
+ba2dadc3
+f772a31a
+3e50a6a4
+53a79aac
+15d7420a
+462a1c4a
+5432921d
+6e4b89cc
+333440d5
+10650d38
+58a6a982
+58ca7e87
+13cd0697
+232809ed
+bd840d0f
+1100da42
+38508800
+771a1642
+d7717a47
+bc33ba2c
+cf54e294
+a285b185
+2ed6ee9b
+e915f032
+2025a6db
+8e862e24
+96eb1060
+9827765c
+02b32b98
+adb5c476
+ca6213c9
+d57d404f
+51546964
+65e71023
+b063fe4e
+dad8b3db
+4a73e400
+c5d94b65
+83eaa0e0
+905a7b53
+de788080
+4c855e25
+2c9830bd
+8af97d79
+272d244a
+2603521e
+859e1d36
+72791039
+b3ee24fe
+abc41c67
+0d71b822
+6d0ceb43
+d03de95e
+1ef76781
+48783b80
+7b31f518
+1a31a0e8
+b7e9dd4b
+30da786a
+cecb37cf
+341322e5
+ce9f4996
+fb47f7d0
+b1d13717
+3fb8a9a7
+e37e0228
+dd17c91c
+1646cf1d
+f4725fdf
+6a061171
+98af366b
+052a2083
+1e0d238e
+3b3bf2a2
+8dbba1ce
+892fc718
+005f6097
+33c0cc8d
+0878b655
+68d09113
+3c43a549
+12807cb3
+34e8cf3d
+0b323616
+153d848d
+c97befcf
+13b6fed0
+7e17f920
+e05b1305
+b98694b1
+ecbb5a5c
+a1197891
+6419de89
+52ffbd0c
+ae843bfd
+5ca889bb
+ca9fc7b4
+ee98eed4
+071e949e
+d6f8e965
+65ee8faa
+74f7472e
+19dc63af
+9a26b596
+a578e02e
+6b8d0fd6
+10fe679e
+2c1222d0
+cebe3da8
+cd82408a
+e54b8399
+c32bb6da
+0c489a16
+34aa8ece
+f1d57af7
+50ef37c2
+84381448
+010e36f1
+8fe8dfe4
+4a3f7953
+dfd82d9b
+7d73dd56
+d83f5a10
+a7078efc
+484189a1
+f483d069
+435c3859
+d125aecd
+c23b2b9e
+68eda0d0
+d1b59691
+0e57238b
+7c8c0c03
+645d9349
+4098d6ac
+003f419b
+478fe7f6
+2fa542f3
+2b032f79
+2cbec47f
+12d978b8
+11e9ef25
+2a329a21
+bd24ac03
+e00f49da
+05e89637
+ff2375f9
+e2ae765d
+edc59ebe
+6ee9fd65
+78b2fb6f
+44e04519
+5cf913a1
+0e3e14a7
+9c28ed49
+16c47b7d
+f5a5e9d4
+25d46809
+89ae5da6
+829a8bcb
+1a02db2d
+b917b34f
+49bd9fb7
+a336644e
+53fc0f21
+cd856971
+7c1aa849
+9bd70a98
+ee96fc95
+df65428e
+637f263a
+21fc6ea0
+0611a1af
+f4e8f032
+e22844b2
+3273f1ee
+645f6a5a
+c906fb15
+8ac69fb0
+184635e7
+f95bfbb3
+b66067d8
+c740bca1
+614f2543
+3e25fde3
+04f0a0fb
+8ff58dd3
+7de0495b
+796aae85
+93aa4d81
+c634c9aa
+9d7dd0ad
+ce31dbea
+92d78a73
+2784d8df
+c57b763c
+fcd210eb
+c7cac1c4
+02391f51
+2f41ea3c
+999a54eb
+81789123
+f7b63909
+c9ceed41
+fb95c669
+1fd38a03
+ad4b77ff
+24f88ea0
+acd63214
+92eb3174
+149eea9f
+33405363
+db002ea8
+933d61bf
+ca5bc989
+d27bb610
+50bb5f20
+086446f0
+5a50b771
+268a0825
+8d8386d4
+ebe4edfa
+91788211
+7f0790b8
+7e45a604
+49b1e15c
+fd2eb00d
+a763a2ac
+2c4ab411
+e0759955
+cedcacac
+4266970d
+edc14e06
+4bccfea1
+3db0424c
+3b823e59
+c80c0855
+d4636800
+bbee4bea
+0fffb4b1
+14b7568f
+2260448d
+b954d064
+6e452d04
+94708009
+96f32d88
+6b744014
+43701d0f
+fa7b4b22
+77bb016d
+8af3494c
+c257d9f4
+f10f60c4
+f7564807
+22a198d9
+bba9a339
+b6d802d2
+d9b5f132
+23ef2401
+62c5c26e
+23005a99
+a8d6c8b0
+847d5737
+cebee428
+d7e4eede
+554bd1ff
+2f5180fc
+244c0ec2
+0d1c21d3
+bc7f5764
+bd19fe92
+f89db90f
+8c7412f7
+48c323a8
+8b3d7db2
+45d3b3c8
+9df940f1
+5faf0585
+d14555f7
+ccec842f
+622d2ce8
+efded18f
+24d63acf
+64094ddd
+910ade79
+115fb26b
+5336cae5
+77b96efa
+c024e760
+a377b99e
+2041209a
+6199c6fa
+76ed1dd5
+b844b308
+bb86f1da
+c2b008c5
+3ea86ae4
+5b478069
+59c2633b
+79359a6c
+9486af17
+3d8e2b79
+d8a48efa
+59ed2f60
+fc248c55
+356f7e0c
+eacb18f6
+95c6b43e
+a0569682
+f148b03b
+7eb22712
+f7ae16fe
+fdd14ae2
+e6cf47df
+9f39ec1f
+8248eb23
+88caff19
+cba72ec7
+5c4ee42c
+9201d724
+fa33d039
+424c24ac
+55a2a6b1
+a23dcba0
+73e60e15
+184dfa53
+87777475
+64d413a1
+5bb8027a
+d65a2dd6
+d22cca26
+c0788a71
+70168f62
+9263753e
+3cab1e66
+72ae71bd
+85481426
+6455af46
+45a3782e
+502bedec
+2b2f6a90
+39c6a6b5
+e945cc9a
+c46e7a6f
+332f8364
+16da655e
+58357e2f
+dc7cd51f
+83e01bf1
+aec3baa9
+04e51b4a
+c56b0117
+fd22e418
+d80ff8ff
+1f6c471f
+ad81dbe0
+d627c43e
+80b25c0d
+21d601df
+a9d15bf1
+145f2f75
+9c348e36
+e2772539
+26dd42ac
+34e62402
+01a0648b
+8962afa9
+dcab504e
+381d8ea3
+0f09a700
+04885e0b
+dfa7ce18
+90825897
+8018e37d
+2d9eb888
+f8b1330d
+74e971ba
+be7504db
+d12c9593
+506b8f15
+c4cdaf5a
+47f0b0d2
+e489c1e0
+8755b699
+34c3a840
+930c5680
+26c306dd
+5dc0c13c
+534511c4
+5c7d8ff6
+e55f823c
+288455b0
+66767e4b
+eece23d4
+4724f2c8
+b7942dde
+df1c11ee
+29bad92f
+8994dca3
+27d31a3b
+88af0782
+7ae8338d
+260d6a74
+98bb788f
+2289f682
+737c1470
+ea4345d8
+67802c30
+6d1bcf74
+ed23b64d
+b30d5393
+365d1d63
+f2336534
+0ac5c1a6
+4fd25998
+e95d4630
+8f113de9
+615e3e4e
+71269a66
+bdecd9fd
+6f6caad7
+a89fefff
+c0dcd56e
+548bf07a
+cca28e3d
+fd09bb1b
+6ef2aa66
+dc0d65af
+99b1c7f1
+236df6e5
+11a7ae61
+d7bd8733
+9d0a64b3
+47d2b0a5
+3b7a0f0e
+02cf9876
+f7b9956b
+2730ec9c
+95d14a00
+6abc79b0
+efbb5fb4
+02342845
+a2edc244
+25c5f247
+de29631e
+81f1f12a
+f1397040
+fe3a8624
+237370b5
+cbda6ee3
+b0922cc7
+e38b5735
+e97e9e53
+70654e99
+c8454cab
+021f3c35
+2b9362b9
+53e06f71
+2eb5869b
+012e0126
+79bc99b4
+d37f98ac
+39a7290a
+396df967
+4d0c7db6
+ec8fe644
+75cdab88
+52a05003
+1757640a
+96759526
+e6d0b609
+c81d20a1
+303c197b
+a366c0ca
+db47ece6
+d0871e18
+159e905e
+f254048d
+c1a6669f
+69085bd0
+2fbf2e7a
+b2896a41
+8b9ad909
+213fa071
+2e6ca0f0
+4b078021
+ee6c8c23
+11ec74e4
+d94d0a1b
+1fd630bb
+4079e3e2
+6a50850c
+1417521f
+70e66e63
+292ab5d7
+322bc855
+39ecd705
+ae45b212
+698ea994
+c3ae85bb
+be01b114
+a06031ef
+a72e728a
+1af0afd8
+98a1498a
+890610b3
+7e5fff77
+3a3780fe
+6f84a9c9
+90ba4a32
+77f2f2e5
+99c31d4a
+8cce4736
+71ae8ace
+5d4f34a3
+dbf8d845
+4cd4cd15
+468b8835
+a828bf0c
+9970467e
+443666b1
+90123821
+bcdcf8fa
+c15b73ac
+ffb0bd11
+e36f6eeb
+b323581a
+49dc1a80
+33430e23
+b3c5ca87
+8be6e8e0
+c99ca9d6
+46bf55fe
+bd8dcf8f
+a5881fe9
+75173be3
+28a8906c
+7eef46ee
+eabcac60
+056ac35c
+e211eb6b
+66e344ba
+0a4787b1
+9b953c56
+692f4de5
+1e1f4c64
+40933f2f
+3f7d0844
+ab5e44b3
+8fa47210
+f336432c
+90ad2a9e
+2c8c83b4
+45ffb426
+050aec9a
+a3829614
+0b2171e0
+aa653ce3
+c530a5ab
+2403714c
+bafad80a
+5e469e4a
+0d00ad2e
+8a679709
+cd690793
+8e5fc57c
+71f3d24b
+74972720
+a07196a2
+99a1431e
+9db1c427
+09c825b5
+95a04d25
+1aff2d87
+f90e3411
+83471565
+6e3d442c
+36e97f3a
+3cc84046
+c2b2b3f5
+ba1947d0
+eb29354e
+7befd969
+8c49f7b9
+aed5a45d
+30576b40
+8da968cd
+af74f993
+35823fda
+59de90c8
+9ca0b7fb
+0de7e3d7
+ba4559ea
+11f8270b
+476ce478
+474ee3e3
+9bce50ef
+2a7fa4bc
+d2fc8695
+b2705826
+450383ed
+74964023
+f586f263
+bdb4178e
+572bef58
+8446bff8
+157695d9
+17369e26
+1c79c988
+02ec2911
+578a14a1
+2bf592d7
+e8fa131d
+dcddb172
+6ef2fe94
+b00d1501
+5a5ff593
+e28682c3
+c40d0733
+6b5531ad
+515eef18
+32138dac
+5d076085
+30356fb1
+91dfee2e
+43bd7284
+6c890a10
+5dc4cc4d
+6f67f7e5
+5426b176
+73cef274
+e5b25273
+7ed93a63
+8e433f86
+90308960
+58c8748e
+4d3cbd76
+7ff4b9af
+a4b38ecc
+97a96e5a
+71947b86
+8edd83fd
+b602eddb
+6eacece1
+a7b0be35
+dd72e8c6
+75eb0ab9
+b5b41c62
+1aa7234a
+f6316f35
+210c632d
+c449cf49
+1701e1d8
+8ee58029
+01bdc7f0
+88a32828
+ad876a43
+f94ad0d8
+b605e025
+83a8fa4e
+0bab1155
+ad30ed0b
+4b0ad917
+398e1819
+30a8067b
+34eb0f96
+fdb50970
+2c993e23
+496224ea
+85f5bf39
+a55127b0
+48a5334b
+98427538
+61059204
+51a1ba8e
+aa92a9ec
+29042d0c
+af1e24ee
+fbf74047
+5e25fa67
+540fe858
+e123608d
+88290645
+e7637bdc
+7f27a8d3
+9d427ddf
+7fb9f9c5
+790f7e66
+d91d4f43
+523078c5
+097916d2
+342bec23
+1f047e79
+e2c7d8fb
+b64ac9a3
+f463435f
+5e53347d
+695a85e0
+60018c36
+cdccaa4a
+9ede0a40
+cbea9d17
+950a41a7
+fa1513f3
+ce44ec37
+c4a9c2c1
+d459136b
+f0e6d718
+c57275f1
+cf1e87d9
+74fa4c24
+928fec4e
+55870860
+3cc14b5b
+0f8b497f
+328c0325
+5e265c7d
+a17519ab
+f388c511
+da89f77a
+0496d5c7
+42a04a35
+ca11895c
+5846bc71
+13f58cbf
+a6cc32c5
+c9587ee7
+545c5287
+b1ba39d7
+15795d8e
+1c746f18
+069d54e6
+ab9b70d4
+d4f062b9
+9804030f
+866e2110
+37102ed8
+e039dc50
+8b57b943
+7017aaba
+47c40151
+6a25e5bb
+a027dac1
+41d6f01f
+14728af4
+239b957d
+8f363e8c
+1adc7742
+b35f900b
+67eaaabf
+a80263d5
+a293bc0f
+42e213cf
+2f8cdec2
+4eb6fa53
+15e1524a
+d60096f4
+9d372a2e
+bcafa3fb
+d0ecd5e2
+2d8004c4
+c2594e29
+10934eea
+0dd08f6f
+bb87fbf5
+a3188fdd
+d406ed6f
+9097a699
+e346a5fd
+48bc3864
+ab32166e
+a1f7e30b
+e5b0dedc
+4cbb2c0e
+19b57a8c
+066663fa
+00148a74
+4c020e70
+6dd570f4
+1b636f9b
+6470acd8
+43ad6a52
+c6b00883
+eded476a
+c77c40e3
+af48fb08
+e673c4fb
+5a70ec82
+f05c7f76
+9418ec47
+f9239e58
+eb89ae82
+61610e17
+3dd0c426
+2b036e4c
+63f8d4e4
+5d977fa7
+38522d18
+b72e2ed1
+4bbacdd4
+432ff8c1
+1a821b32
+d0484442
+5ed3ec1c
+15c00914
+99bce3c4
+0cfae832
+fff95cb4
+80306e2a
+1efd7f38
+79c84135
+21d13ddc
+21bc63e0
+55b0912e
+ddf8ef77
+df88f877
+88f7097f
+9dd3c4fc
+18b27f91
+c86b2d8d
+bc9e2c3f
+b2de8002
+0c59920e
+0f3c25dd
+792674a8
+9f63d649
+666432c5
+7ebe08b3
+dc1726bb
+111fd16e
+4aef7162
+5d18b92c
+5a4e7529
+862a1294
+bb6aa417
+62780ccd
+398802bf
+a5873bcb
+bd16453c
+1c637694
+ddf34c45
+4bd49689
+82c33bb4
+03c15597
+fe69ee4a
+d69361cf
+68040b80
+8302b068
+b7508dce
+bc2aea05
+8a48553d
+892ca2f7
+4385bbd8
+e058fc3c
+993bbe39
+cf7e6997
+5e64bdd3
+b6956a1b
+fef6fc8c
+ee9e2d79
+41831fec
+7b7af951
+03689820
+942817f5
+ac6fa568
+3f598f1f
+fb9ab580
+3b34d563
+be3b6a18
+2494313b
+6147a35d
+2acc1a0e
+3f0ffc77
+4bea9b89
+bccf5891
+cd9dbee4
+59c2f2a1
+6773f5b1
+b7ebbc9e
+c9205aff
+713c0f91
+fc83d197
+e1266b28
+eee1546d
+6a8a1217
+aa8c1539
+f5359d0c
+3470b40d
+79302b52
+41de735a
+c870a67f
+6783e03c
+ba64df94
+29ffa33c
+d3658d99
+a889401d
+1e41aecd
+9ef94bf8
+fc39fe56
+5f0a3857
+12c46593
+5037b88e
+99ab3e96
+403e8b23
+1eee82a0
+4530b12c
+67363f44
+84b83c16
+838c936d
+33f1d084
+f7df66e3
+b9f2c3d4
+ca008db9
+a684a111
+e3818eb2
+c725873a
+2273663d
+dd1e882e
+6b80d7c0
+5134adab
+477de29b
+53adb653
+5bb03b00
+3f4419b9
+2b8f3e48
+2f6383d6
+fe109641
+f55a4e08
+bae8ddd2
+46a2cf53
+bb40d3b4
+2c802917
+fe431481
+bc2e311b
+6770694a
+e159e1de
+984b1bd2
+2ac5d4c1
+95e13fd4
+d52980aa
+e78177cf
+7ca9a99d
+60a2cdee
+09adf3e4
+db44bc96
+b97d785f
+01638aca
+093ca40b
+69b028e3
+12970416
+bde9e399
+d44e8253
+10f163e6
+9187a4f9
+85954fc4
+eec92c7a
+50c09a82
+d2c64403
+0e08eb0e
+c01cf7e0
+8d7b3c67
+e773e626
+e6eab848
+67799c69
+404fa9e4
+c978d3e9
+179121ae
+2e4b14c7
+d0699d8c
+10c6bc1e
+883c3589
+15295acc
+861daaf7
+308b652a
+b4e79720
+07950225
+f715d8cc
+ed790185
+80702622
+67af58c5
+2b85cc69
+e200a84e
+b33f4466
+0e960dcf
+76dfd93b
+3b3ce734
+1bb7dd38
+43a795a8
+21a3ef2b
+9c1b40b2
+c56a75c1
+b3d4c34f
+a78bd8da
diff --git a/models/rank/dcn/data/sample_data/vocab/C4.txt b/models/rank/dcn/data/sample_data/vocab/C4.txt
new file mode 100644
index 0000000000000000000000000000000000000000..ed778fd9ca3eb6c3c22410e0ef9594bf26359303
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C4.txt
@@ -0,0 +1,1965 @@
+2e31cf94
+26aac878
+a4b13072
+07abc9f3
+da3462da
+0aa96788
+bbf460ff
+bf9e41b6
+800845b3
+1e0ec6a2
+c64ef081
+200b020d
+97ce69e9
+46ec0a38
+ece2faed
+ab75352d
+b30488a9
+a9d25980
+bb714275
+d2f72457
+6d8e8c37
+79aaad8a
+a7894c94
+728d8e1d
+bf26826f
+fab4c8e6
+0d0ae4e6
+a8ec061f
+68637816
+a9b5f0f3
+6a14f9b9
+f623717e
+2e3970d3
+45e7b9c6
+65d481a6
+6bb5a9c4
+5c5e2f2f
+17f1ba18
+1bf52f2f
+8b3b6b2e
+e4005c97
+800261f1
+c2d5cdf1
+35ef0160
+5d077156
+de785a3d
+75cf45d4
+5fd7e960
+0a4711b4
+562bbed5
+fcdc5174
+95f28cf4
+81e71114
+7ea693a9
+a7f56562
+765c507f
+3dc1afeb
+43bcdfb8
+9bbf6db9
+d7660276
+bf1c5fc8
+bde8476c
+d4e22528
+419729f4
+6c74d7a4
+41c583c9
+c2d204f0
+37718639
+57cd1f7b
+2ce13ec9
+f00be896
+2e4c7112
+beeacfa1
+ebc3fea2
+0c2cae44
+cc5a9904
+da88deee
+abe0875c
+ecbdadad
+0580defb
+2c3cb82a
+8c66cb26
+6d56089b
+f752dab0
+33a72095
+1a55b47b
+2f2d24fe
+7a26412a
+9a0487cc
+4ab0c6e1
+748540db
+397ce063
+cb705317
+dab52c65
+37d833ce
+db01fc03
+69bb6a95
+d2941349
+d74ca86d
+fadd820a
+62a6a4d6
+3ca2c63f
+6b9bd703
+a9be8183
+d1038a37
+d8660950
+54cbeeca
+a4c00986
+496a28d9
+fd143ae6
+5f7bee64
+2d3b257b
+8cc7b33b
+d88e08aa
+868a33b2
+a54c061b
+3fb81b62
+6bd9c814
+d16679b9
+83c88500
+8d8188a0
+9ddc492e
+4ca5b93b
+397511a3
+8eb9aec7
+428cff52
+b034af31
+d13a0547
+ef6ae422
+07f72e07
+ed346961
+9a6888fb
+1584a843
+06b65969
+62e53fd1
+a9cb248d
+cbd09844
+88d7a4fa
+a1644c3d
+8368ab98
+fb3a6ea6
+6fbea887
+8a68706a
+f92cb03a
+7eac96d7
+ac18055c
+f7263320
+e5e453f3
+aa5c12ed
+7b8d93e0
+5a115a36
+726af275
+66e2fea5
+00cf6715
+02bea2eb
+34dbb2cd
+dde6f934
+aaa00ab7
+9cde7791
+3fee2b13
+fc932c4b
+b696e406
+b4495b71
+780558c9
+8438921f
+b125f81c
+a6436ded
+8e9c10ae
+b14544e3
+907e4345
+caee1d0c
+7c7935a7
+1ac60e6a
+70fbc093
+0b77ee62
+175a314e
+fdeabd46
+2ad596ff
+a91445c2
+08c4403f
+71f59793
+9e05157c
+23fb4c3f
+d63f2e09
+940c70a2
+7017661b
+410e2607
+65866ff6
+dc8a4678
+a5cb361a
+5b017cc6
+ee1bb254
+f8c98289
+54b2d6b3
+fc5aec6c
+bdc79ef0
+c18be181
+39547932
+a44d75e2
+43dd5c5a
+a52b75e1
+7d0ca3f1
+f11f709b
+fb3c4833
+859489e4
+8617eea2
+84019174
+8467828d
+32f39e3d
+afe92929
+20716e3a
+7e42e0aa
+5d5ca56d
+a2c2c98b
+971b9e1f
+0a5a0e77
+6c5d0648
+27d8fdb5
+2e17d6f6
+211597c6
+0a5de462
+d691bb7d
+c57d3796
+7d953bb9
+32e70271
+e53b75cf
+6f0ed6cb
+e9370452
+53aa3ec9
+96cc0f03
+10a8cee8
+c9fa2fbd
+93ee47dd
+64f93c71
+68ed35d1
+1d9255ba
+8a48d514
+a3b26a06
+7cb07a1c
+90a05518
+f3608b1e
+8c806a40
+d04115f6
+3e2bfbda
+2a35d8c7
+ad5ffc6b
+4559bc04
+b10dae38
+5349d668
+b733e495
+1cf477ac
+4654fcab
+90b089a9
+c37e7225
+df51e19a
+86205315
+6c16053c
+ae18383d
+fd4c1a23
+22504558
+452d0b5c
+5024ec42
+f55645df
+b742bcc4
+98b100e5
+934c781b
+c1800226
+30e21ad2
+6c48d226
+491a7c8a
+0f7d010f
+f6164b8d
+0d0471a8
+6a79eb33
+a094fdfb
+fe72cef1
+8a8d0597
+befb3a98
+18931141
+79b6bcfc
+04e2444d
+6267c936
+ec142e08
+ada604fa
+0d02fead
+ded6a29a
+746afb58
+62eb4eb9
+0ad2ad70
+82b259b6
+e96617b3
+811c603b
+8a3c46b4
+b7ab56a2
+a21a58c6
+5d0afaff
+4e68109b
+72ca9f69
+c874c3e3
+85f5cdf6
+4d3fb9a7
+a8c1440e
+f4eea829
+ecb0d7f8
+8ae4eb4c
+1f320590
+ff441594
+f16599ee
+8f667b75
+bd179186
+dd248fab
+7ec57f46
+dae73f1c
+b457564b
+7ccee479
+cc936fd0
+327400cc
+f013f935
+d27e1744
+fdaff3de
+36f384e5
+d152dd75
+05cb1209
+ef4fd7f1
+1233b8ac
+e713408b
+745c2d1e
+47db54eb
+fa63e799
+9fe64608
+8b75ff29
+f921f21a
+a0b6c65f
+6a11ea31
+6f52a2db
+d87eb438
+10d51044
+6321bf30
+4f447b84
+84ce1573
+137b22f3
+f56b7dd5
+cfc23926
+e71d441e
+a19a89a7
+6ccb34a9
+8e745cb4
+7e911e7f
+5739bc45
+5a66b2e0
+9c083410
+41505292
+27dea862
+079955f5
+ebcce931
+d3e92866
+aa0b8fd6
+a59b22bc
+7ab86f88
+705aac57
+d5ed7272
+b3615b6f
+61c3fcec
+3e9ad633
+140dc137
+acc1e89e
+01df5b81
+9c92db7d
+849a9637
+27d7b676
+a349b61f
+8fd92cf7
+85bbe3d4
+a7ac9d32
+64337d7e
+d0c7032a
+36a05f39
+71d76203
+cf9d2c81
+96ba22aa
+99f80b6a
+a145d0d7
+e517f033
+9ce24ab2
+745c6c18
+bdbb8503
+36fe83c1
+84a30d9c
+da13cca9
+26ac5cc6
+86fe73ef
+718f932c
+25af3149
+1d134ad6
+e5b87686
+e900583c
+c771bf5c
+30d9fc77
+beeea64d
+e2159af3
+acd57dc9
+f158407d
+37dff460
+25865f0f
+b7aaf07f
+03138509
+7f00f8e1
+21289e5c
+620fad99
+1fbb595c
+dc64cb2c
+3a007444
+7c566b4e
+cd2ad5f0
+4183399e
+07fcd133
+9dde01fd
+ec520fcc
+bef1728d
+a374d428
+2945ec5d
+8554d277
+ae78390d
+2df0cf77
+9ad1d177
+94ea135f
+097de257
+aecea748
+34261634
+b5d0a0c3
+0c31dcbd
+5db066fb
+d0435d12
+5a6bc533
+01e2d043
+1b6b396f
+f705a113
+b5862d87
+c9e71a54
+9b096029
+3480babf
+d0c86811
+b165bd11
+d45e432f
+3beb8147
+59e43b2f
+4f0ab304
+d3a36822
+c3582109
+87abc25f
+f7469be2
+08a40877
+39452c42
+ddd25918
+4e9262c1
+059679d3
+859c2873
+1d87490d
+695e9665
+0e5acd1d
+149d562b
+26cb8980
+8835bc51
+1b1bbb94
+1bd4a9cc
+a089e42f
+2181fc9c
+fb36ec7d
+85dd697c
+88d9df28
+3c7f9843
+e03d937c
+b7b53a5e
+d3f49387
+66d81227
+cbcac95d
+2cd2fd5a
+2f0d1024
+a3cc9eaa
+5400db8b
+c694bf03
+c1dcebcb
+2c2de8b3
+608fe68f
+e257e34b
+d3c8a2de
+fed4d11a
+65f694a1
+5fb2af39
+4c3716da
+0271af42
+bf85f943
+95f6a680
+ebc42d91
+338188f0
+e1ad6dd1
+3b49800b
+b0dbe2c2
+7b7cd677
+dbff86cf
+b5f7c643
+f05677b6
+5928edab
+1a5a748e
+40ba66d9
+0fe3165a
+dd665070
+d420f43b
+23c9c12c
+21817e80
+4afa7420
+ea25cfe7
+35818b64
+b9c629a9
+e5320740
+7fb641bc
+64cdf833
+9f94f6ac
+7973d61f
+039346f1
+2a57f2fe
+ac975db6
+0a3d1a50
+14936ab6
+5cc8f91d
+8817de80
+e8b383b6
+9e9e557e
+cb4efbc9
+b503a661
+6a724007
+d98f5768
+01a3b3e6
+adef24d7
+dad66aab
+06ce4c4a
+151438cd
+7f8d6a7f
+09f22e1e
+899dc6ac
+99d22d4b
+d4fc654d
+807db0cb
+499ac831
+6cb7ac02
+81c1155a
+1e05994e
+b3205f72
+0139f0cc
+cc1a0a91
+8e586a8d
+7ce39f04
+6ffc8f28
+b1155a69
+b1c1e580
+3298f58b
+5a9257d6
+4962b7b8
+2b643690
+19fca40f
+50674731
+830ef509
+b6549ba4
+5df43724
+62b4602e
+b6e624e7
+b2e9811a
+bb192af3
+bbf3c209
+415e1074
+09f7d6db
+b531cb1f
+c31e0677
+65b2bfc7
+7ada4047
+73ad28a8
+7f358a34
+018d7b56
+112bd4d1
+9bc7c818
+ab90c1da
+7b72fe70
+56e59721
+2dd6c68d
+09003f7b
+bc952269
+379a7ea8
+b0a4d1e3
+4c2a19b4
+f09d399b
+8b661ccb
+29f9a9e6
+bf59c9a4
+94fb8c54
+d369058a
+1a4b1eb1
+f1d3d4bf
+b0793367
+b8a1d5a8
+03ce2803
+2f4614b5
+169ffff5
+003ceb8c
+0a1e415a
+19884508
+bdc301e2
+61187f24
+07c30eed
+f6f30123
+36028f8a
+ddb90d33
+87aa334e
+61b76e4e
+7bbfd67a
+ec23eec3
+db34ea49
+28625509
+c62785ab
+a7ddf150
+90c72fb4
+c2ae9c00
+cc2ba95a
+b248863d
+7f262f82
+2be241e1
+c3fecae9
+f9aabca3
+be100ff9
+d2fe9681
+10e91261
+c1cc8edf
+1a23ad1a
+7967fcf5
+46deb7c4
+5761eec8
+7a862506
+d8643573
+3bee9ada
+5974d6bc
+7031bb66
+5ef5cf67
+1032c319
+1fd32458
+603a65d2
+cbdd012d
+db2cc97b
+c24f736f
+6c02aa53
+6b6247cc
+f07af706
+b40012b1
+660082f3
+faab7dcf
+d298f071
+0ed59841
+5678eeef
+96f776c6
+d631e57b
+8f04f14e
+fe6e7431
+1296af9d
+b0ed6de7
+78916ad5
+e2833138
+93caa69d
+c6fdc148
+b6330cc4
+500c52be
+b2268c1c
+e8063698
+aafa41b8
+b024e21f
+948cb95a
+a4456f7e
+c1130a48
+750448d2
+6850950a
+d8e6aa86
+8638e8ae
+524639ce
+b562027f
+917bdb89
+a7ad003e
+2192038e
+6ad68ce1
+60db469b
+bb7d6eea
+116167fd
+5711a6aa
+e0a2ecca
+8e302e2c
+5cb843c4
+acec7b79
+1d25bf0c
+b025b0a6
+da123b4d
+84b037c2
+0ea8f0bc
+1c499678
+2c508449
+c5455c5f
+5ef4890d
+5139c88e
+de1dc0c1
+82584656
+70853bbd
+891853d4
+12100c35
+b931030b
+0b2f068a
+dc0a11c7
+927b3924
+106df148
+0481f0ba
+c1b10ea4
+dea812a4
+61a229f4
+666aef97
+f089159e
+98c00efa
+81e01c09
+96febccd
+69101c95
+e5b2a31b
+e4fd0a5b
+e15addd5
+470b58e3
+048ae0ae
+81b1d519
+8f98628f
+824f5618
+5c5707c1
+be5c30df
+6d8b3d83
+5c39400d
+b4b00886
+8d4208cd
+36375a46
+992f92cd
+91455f13
+df8fbf53
+1c83391e
+902872c9
+4b80d357
+650cc93b
+eb1fd928
+bb729a93
+c9145374
+771e2523
+8faf5407
+f0da0308
+38aca36b
+89755d05
+be13fbd1
+ae1a39d6
+da11631c
+456b4d8c
+22684a15
+0e833929
+ca97f951
+bedf102b
+d502349a
+209d0d06
+c5daf919
+c9d7eaf9
+ffc12480
+b0f95598
+14977b92
+3223d3ec
+691b0a82
+39cd8dfe
+65481888
+4801a450
+2b30dc64
+06f7386c
+1d97d7a2
+43c8dd6c
+b12693c7
+f96986ce
+8934d14a
+81f090b4
+96302ef8
+16fe249c
+d285863e
+39b2a715
+ce15223f
+a19a116e
+e3801c1b
+c0d96a67
+cd08b588
+e5d4c5ff
+b95e4218
+3604fb1e
+39b8b662
+52566d96
+985021a4
+a8bbb26b
+1e1acac0
+9d6607ae
+c60bb04c
+8ae8b275
+8d0c7214
+fa033aeb
+2e946ee2
+8eb681c0
+5bddd663
+37984208
+8b48e628
+f85c9252
+a0604a2b
+96dc7a08
+ecd34703
+14bfebf4
+3801b780
+e96bd736
+07f84acf
+60d5f5a7
+cb55c02a
+b6951e6b
+8c97a28c
+06148e59
+ec0cbaec
+ebcc64f3
+31248f9f
+529d73fd
+4b8137fe
+20af9140
+d34ebbaa
+b247c1ec
+b3e2674e
+7775ae9a
+85e9525f
+3db62e06
+b2a9214d
+58be6964
+c7533df5
+b6471837
+6a3f2218
+001cfd57
+d8d4c93b
+928f610c
+bf840629
+54ac6611
+67ca8dc5
+8bff9688
+3db5e097
+425b95ac
+e3ed620f
+1c88fa00
+d05c497e
+91bfe327
+b3dbc908
+aa5b3c1b
+eab838bb
+aa8227bf
+628afd69
+a793f053
+b5784755
+9b17f367
+bd4581c3
+de11c602
+8908db00
+b0c884f0
+8d164e53
+90fc8956
+ff852091
+f75d8f51
+70734f2f
+78a5bd6a
+1d4137e7
+92c42d51
+221fafeb
+dd99ca75
+ed371518
+d445635f
+7e5d5281
+a3661869
+826ec973
+e566ddbe
+8bcf4b56
+caa98880
+36df1f08
+f34203e3
+67dd8a70
+3fa4cf0e
+21a523a8
+8803e357
+8100472a
+c15ec51c
+cc89d489
+d3a2acfa
+90cd598c
+119cf591
+0c268e6b
+7be07df9
+4f9f86f7
+28964865
+d589819f
+66180dd2
+64712dc5
+0dac8445
+13cc7f10
+cf069647
+e5ab1c0f
+770e7839
+015d2650
+6fe3d88b
+962e2053
+8e6c17b2
+a18759b0
+19a8fcdb
+bb0a192f
+a3115372
+ceb863d3
+b4d02e53
+a4cf0025
+959a9ad0
+06d22039
+e6f5cc23
+f6f3f70f
+515c2bcc
+f98d05fa
+bf596fe6
+c31ff1cf
+3bece18f
+43ccd97e
+a170deb0
+4e0c8817
+953f038f
+fb1d7926
+3670ab58
+819b7ce8
+8049c529
+4564467c
+4de28b82
+7a5fc78c
+88887d97
+fed15caf
+8a2b280f
+d51317a5
+39ab344b
+7a739d42
+e230cfa2
+15c4817d
+46534650
+c686f482
+dec55681
+f9f6536b
+61f87be2
+19ae4fbd
+38262be5
+3aec9877
+2d618c4e
+113b853d
+5c4d808b
+591ce327
+bb9802c1
+7106a279
+b160944f
+0a552b97
+5dbf0cc5
+b180f466
+9ff76ed7
+e494a63d
+490b4f17
+2b9be9e0
+5fab3057
+77a160bd
+0317a84a
+4c942c6d
+061a659f
+868c2a8b
+d87200db
+b5e67941
+69a5b7b7
+e161fae2
+eee0e446
+04daee9a
+e5aa2f47
+ca55061c
+cfbf40a1
+ea483df3
+9581b80f
+95e5d0b6
+1502bcf9
+69040d07
+4b09bd8d
+3b5eac6d
+7db2c9d5
+9e2401c7
+9028f543
+29cd1ae1
+3663749b
+37f5949d
+f5da12b7
+ae574c8f
+560f248f
+ecb7ef0c
+5ad7dd0e
+fa2aceeb
+33585d7e
+73e48f47
+a1833656
+82a61820
+30b862e7
+bc5f5426
+ebe677ec
+5b392af8
+4518a475
+7c47cc94
+828df116
+90044821
+862b5ba0
+c2be1131
+ee36060f
+8a5020e9
+f4323577
+f4a59cf4
+c1246581
+66265d86
+41793c01
+ba02f1aa
+97faaf60
+3668803d
+02e957f5
+19664c72
+8f1819ac
+e1b6e14d
+56c0bd2f
+515db406
+c8cf221f
+a42fe8cb
+2788fed8
+aa400bde
+07fcc899
+29285e46
+dfe0c979
+5fd8c451
+8c131b96
+a9c1f26a
+3cdc525d
+79dd5901
+944cd645
+9c6d05a0
+73391cc9
+faeb53d1
+a2fc2c1b
+55c67188
+59f0bbca
+31709caf
+03206413
+a3048ee7
+878b088b
+ddfd0b31
+d392d940
+9cc94cae
+473f609b
+80019584
+be54e449
+66ca6f9c
+d14e01dc
+be42d44a
+290e326f
+f5a1d625
+285a096b
+328f9af8
+031bba14
+173df572
+9437973f
+98cbf0d9
+9bebbb37
+35b598a9
+f4f4fb94
+3e84bddd
+8149bb03
+75509a22
+ebdba02b
+34ba6c91
+5ec469f9
+9e5965b7
+924dcc0b
+45c4fdf6
+b368bad6
+bb2756c4
+680a1c2a
+7eb9a0ca
+9f10f3b2
+a09fab49
+7bfb40c0
+ec4dbfc1
+f1bf4afc
+61d17b28
+9b90d9a2
+c91fccd0
+fd9ad406
+55cf9503
+f6dcfdcf
+068d1303
+fceee742
+1655a41f
+c9fd810e
+2a173a8a
+99c30e24
+03e9d473
+4b972461
+8aeb71bb
+7790bfab
+f24d4506
+10ee14f5
+0b02a688
+4eadb673
+69dc1bd6
+8dbf6682
+1b9f91ce
+b1231540
+b83e6ee1
+6acfb8f9
+7c4b21bc
+a8925441
+aa7c1231
+b39b48aa
+612ccfd4
+bdcfffba
+d2677fc7
+5fe7b4d5
+641e8692
+4a8085c5
+c2fcecf6
+d96b5614
+398a0889
+1548e7da
+67b6a1f2
+8b3e76d0
+3bc43aaf
+e7fe9af5
+c2ef9737
+1e5e2162
+38f19f3f
+a0bbf881
+d0e94460
+fa30ea43
+3bb2ac36
+037a02c0
+501c4bfd
+04f5acf2
+a3bb605a
+1aa27b29
+a3f149af
+116c7f59
+d2cb670a
+60bac0fd
+c36d4980
+cbb4c12e
+4d85ccf2
+695eb72b
+35d9e6fe
+c6eaa22d
+a6286a8c
+65723153
+6917d100
+f4f9caca
+07d8cef8
+e6affb81
+dafdc657
+327970ed
+b34313f2
+610b81ba
+0aafb8b9
+3cf772ba
+da47dd29
+1601be24
+3dfa00aa
+a9d3b1b0
+9afe3039
+92b79a55
+12146618
+d00d0f35
+0676a23d
+faaba315
+dabaa4f7
+2a3fbbb1
+b89fd40f
+4733f0c2
+df074db9
+01a6363e
+5aa21c10
+5f83dbdb
+780d0d73
+cdbf4896
+c0186801
+f2159098
+2276df82
+45b357c1
+7736081b
+b373ad8c
+878dc046
+a99706ce
+9ba30a3c
+0fccb71f
+5b4584ea
+00347586
+10d65c35
+6aa667d1
+11fba85c
+94faf4f4
+86ebb3cb
+01bf33b8
+19371580
+bac6c60c
+ac9bd079
+91e6318a
+3ad61695
+54eeda33
+0575de32
+a7970f22
+4e2622db
+fbbfef37
+71c6c125
+5e820c2c
+96606633
+85a4691c
+6600e614
+0b3852c2
+2849adaa
+26d1c179
+4144f892
+0bc13f44
+ab1eede1
+afd1732a
+999276e2
+8082ebf2
+854ccc79
+170ac369
+b6d4a742
+0267c387
+59b89765
+30029de7
+cf596c87
+00b3f0d8
+bc9fc40e
+0594a503
+20d5b575
+13194a12
+10fcd416
+06b1cf6e
+14f195ab
+f9aff643
+1a978569
+7e5bf711
+6faba5fa
+3315ecc7
+0d8c5f24
+2ac9344f
+e98d8698
+e5a801dd
+c7043c4b
+d6b6e0bf
+8c3ca4dd
+0ab0cda9
+9cb0327f
+80d8555a
+d3f71b89
+5957096a
+68ac9e60
+81008c67
+dc2aae02
+b7ad4efd
+101bc0d0
+9a1fdd48
+e27fb4c6
+00e32fae
+5f3e1806
+d2352e66
+f0ee1171
+15801a38
+3c150a80
+6b67ba92
+1d812a04
+df9c9d2c
+5f851af2
+d49c74eb
+cf2f0a5b
+c24ec7b4
+a68ba4c2
+4aeb6732
+b04202d7
+dec9341a
+89fbf73e
+d6c23de6
+bee72785
+19b7e27c
+8c95fd86
+8acc70df
+8a49d676
+656a79e3
+e3d0459f
+3536f0a0
+1b966f3f
+5516165a
+83778e6d
+8b7d76a3
+ce71785b
+f8bef0a6
+2755e7fd
+533ef85a
+e155cbf6
+a09c594e
+4ff691a6
+9bf32005
+a093e90b
+65f36fbb
+430e4260
+67d4b901
+66b3876d
+44a40b37
+bebc99aa
+6d01028b
+73fbdef0
+c4f19e8d
+db9c39b1
+6ae61a42
+62169fb6
+899a7abd
+88e439d9
+3bee5b97
+5029cbc2
+cb3a592b
+2a648b12
+20f4bade
+c9add53e
+3da4685b
+b2565f71
+c7577387
+48bd2f24
+e31f97bb
+9ad60ebf
+a3479ac2
+e36ef5a0
+dc796779
+de57dee7
+ddeef97c
+f46f41df
+36c4c874
+f44af879
+02d9f1aa
+e44a010e
+f2657c82
+45000d9c
+32bebd7d
+6ff41c4e
+8510f416
+4967a4e4
+b244269a
+759c4a2e
+2a04d8c1
+f922efad
+ea5f6ca5
+e17eea0f
+64befbe5
+a3428239
+8fc50a0b
+5fbc73c3
+d7c58073
+6e2e4e7e
+81fc4ddd
+abef9a16
+08087ab3
+e8cfc6e1
+65e58ae6
+b2820985
+a6791c46
+dea65486
+4c96e381
+040c8aa9
+253a9987
+6fad4c40
+b0714e74
+794a45e4
+250a1f5b
+fd28c00f
+51c64c6d
+7dd10c27
+bdb2e9bb
+f72f3e11
+867d05be
+c70b09b5
+40e820d7
+bce72daa
+9a9bf803
+4e1c036b
+faefd679
+d0189e5a
+88825148
+5dff9b29
+792bc2c0
+77a972c7
+beab5630
+1440f066
+3d294693
+e416815e
+c3dd08e6
+97beb967
+1452b7c4
+63fb5989
+f3e1dcd0
+a01d631a
+23929bda
+baca34cf
+3ce1ebcf
+09e3cd5a
+48e1c623
+07f0dd75
+80d23df6
+681bd079
+c619b45b
+c9918a97
+6ab45cc6
+28d926b8
+88bf064b
+5fae3a7d
+1c180eb6
+f37ced8a
+1d8a14d0
+8f2f8a38
+7a0b30d1
+ea992411
+92f88b7f
+8ae9faaf
+c60d2bce
+3904e043
+291c308d
+0c6fb000
+2b14b0ab
+b4e66c7f
+e3cc371a
+2106cad5
+d63ceed4
+7352b613
+6d65b22f
+196060ff
+c0fc6d51
+7f4beed8
+2c2b7368
+6448e299
+5deac079
+a136ae32
+3401bb69
+eb45e6e4
+9078f4af
+43346490
+5623d87e
+10b5d63e
+5000c229
+a1198d00
+b16b1b14
+ec0f612c
+dd5aaf51
+d00a89cf
+2f1b2c1d
+d6eecd96
+a0158c0b
+631a0f79
+8edb102f
+02195b1d
+8dcfa982
+2f91f54d
+b40a5ae5
+cf3dc9c2
+13bc59b7
+54f76fbb
+5ce5db23
+0c036330
+358959df
+d1de383b
+17a25a2e
+41d91b64
+90aed9d9
+358b999c
+e4477d63
+fd4d6dc3
+7dc492cd
+5714e30c
+ea54d21e
+8681d92f
+ef51c5fd
+30f831be
+b0f3befd
+1bfb072a
+d50f8f77
+2798f61a
+7b81297f
+ca1739c3
+e1871028
+0c6cc498
+6eabba09
+eae1a4e6
+a6c4d6a6
+bfc7247f
+ad5a15d5
+c2be2cd6
+e6e1eada
+fe2819ec
+106063a9
+c35b84a2
+bea25674
+5a2b1097
+a92a2097
+6e91d0aa
+910d0c78
+08125ec3
+a0bf113f
+7d344ec1
+bc67dc86
+7d568898
+be4cb064
+51cab503
+05ae89ac
+9ebf27b2
+3e3b5cdd
+5e48f06d
+99cbea63
+34c9c2c8
+f441464a
+7646d219
+1ceec30c
+8c82d5ac
+60f8cfe9
+520e87c6
+e3679b95
+755bfdc0
+78f5dce1
+59dbabe4
+62cacb6d
+0d696b96
+1c648235
+2e68af6b
+c3b6b1ff
+3f647607
+fa16b189
+4120ff84
+76bbce8c
+1f2b62a4
+dc5a46d3
+13a91973
+c2f0597e
+2bb01002
+01e96a8b
+d8731300
+497cdc17
+fb7b61aa
+467803ff
+1425ca4f
+6a779687
+a754f0ee
+0ae08089
+f6cac145
+f25b1ce2
+ab52da0f
+8820c221
+2c20fcf8
+24031442
+c5ab649a
+bf0b19a8
+f2ecb6cd
+251a11e7
+6f3bebc6
+e4dbea90
+fc86bde0
+97476ff9
+c312e8ea
+7f1931b1
+1c5a7983
+9f43a1b5
+7e5a33a2
+e269cb85
+129fdc8a
+6d5dd203
+24e3cb6b
+ca94ceb5
+f97061f8
+d86c3243
+77d7145b
+0bd10f79
+f9e72488
+48513177
+3517881a
+6d69411c
+7d5591ce
+a1e6a194
+3e899880
+715dbf7b
+8b0fcf84
+7fec07ee
+20055f77
+493d9030
+46abb307
+794fc893
+638ff2c1
+68c542c6
+ed8dc2b9
+c95cee83
+f8b86b4d
+974c2ef9
+e05db802
+d46e345f
+660dd111
+f7f717d1
+329ba483
+a67afb96
+8c4b6740
+2a02f19f
+dfc67ca4
+13508380
+fb148005
+4e81cc05
+4384a030
+3d513154
+e029fd83
+9db2ef85
+dec79c89
+96b59e09
+195297b9
+871d46cc
+1acf0e98
+de15aafd
+c5ea822b
+684abf7b
+38c6a2ff
+0acd9a72
+1e4f76c8
+173df40d
+7edcb432
+4842a03d
+567c03d3
+a33c37d0
+63b9d452
+59b2577d
+513c4642
+55db6044
+cc402e51
+dc56a6c8
+3908de9b
+d65cfebe
+0cefb95f
+b5568233
+6bb27684
+3266012b
+8b9e4c43
+37ee624b
+fe031c1b
+8fa3275d
+ef102153
+d6ac1a99
+b49b8121
+8a77aa30
+352cefe6
+0720fa82
+19f38e39
+6af132a7
+7b9b4b4c
+e642456d
+5345e2ea
+10b1d801
+84f1c38f
+6898d7cb
+d6f9abfb
+8c1441bf
+967c1441
+dd867531
+6956727c
+4859631e
+b66d15e3
+f492df10
+a3ce95ab
+5bc95bbb
+afc54bd9
+a6ed4428
+02904c2e
+99944ac5
+bb466b96
+75f5605c
+8b78b687
+8cd4141c
+a2c4cd50
+7c15fa92
+64607668
+bcaa7baa
+27e168c4
+2240ccc9
+39cc9792
+fb7f035f
+29998ed1
+ab139dd1
+a6cb5175
+d3553154
+077ac770
+1f4710e9
+1f563ffa
+c19bed16
+2091baca
+10b2deb6
+b78cd77b
+e5b0eb70
+4d444fa8
+8030f7e0
+3a3446f1
+8ead245f
+4e353d3a
+465f621b
+3a636c38
+06f03557
+fb9d5150
+671429a1
+63412727
+8599e2a2
+34d81946
+2ec3f51f
+e5cbc185
+026c0584
+ad0533ab
+d46c3907
+a31ef0f4
+abd61de0
+d19f0185
+971fc9c9
+c3544e2f
+4a46af6a
+d6663a57
+032e9f84
+a2c5148d
+36f6f194
+2c0ce193
+f8b291b6
+230aba50
+f76ce723
+3a226f79
+4b930c6a
+88bd0a47
+d02ba8fa
+cba37588
+a38a359c
+74434d6c
+6fc4f0c4
+c79b46ea
+c9b7c37d
+3e1101de
+30d1c86c
+ae4b3923
+266a1de5
+57c0f490
+2cbcee90
+231e053c
+2e01ffc6
+f2ee08c0
+2fcd21f4
+12b193b5
+9810185a
+f47311d9
+39febaca
+ad20b269
+18fd2581
+1e10bd9f
+884c047f
+18c67922
+61cd98c8
+e0dd8846
+93627a01
+cb14f052
+4a8fb058
+2dbe3d18
+1258049c
+517ada1d
+1a0d796e
+89ebd875
+68ad052c
+bb43c97b
+f1c7f203
+111b671c
+1b40afef
+9fe1748a
+3a7f3731
+be3276fb
+f9a7e394
+e275ed08
+1f38109e
+4e0cb6db
+f83b1066
+4fcad2fb
+9a3cb908
+5b2a7e55
+c3c612d5
+815e3083
+cb2156cb
+89b9aa94
+598807ce
+9d09ea70
+b036469d
+ed70b62a
+d3fd4798
+a1b718fb
+f2e08e7d
+b724d6df
+585950a5
+d84e5d81
+c57235ae
+da4e4283
+9d72a4aa
+bba89f78
+25dc7829
+99292b0e
+c688113f
+cd6faf07
+e97210a9
+c7de6ae0
+f5da3a6f
+46fcaef1
+adc7cd76
+9ccd40b7
+cbbab2d5
+585ab217
+7f653a0b
+7040bbcf
+c5c59751
+1950acf1
+9dd430b5
+2d37c7db
+55efee58
+b8e42c74
+f46db46f
+b240e4bc
+47d43a7e
+49045e9d
+fd9fe48c
+3293c5f3
+49308009
+ea2b155b
+9a5d2597
+1fbad044
+66f43abd
+47784164
+470320bf
+76b52ce1
+41f900f4
+ce7b7115
+5d8ecd6a
+3ad308ed
+8e5b38d8
+32a55192
+bf536db3
+8808c4b5
+04ea5408
+1df4d824
+8aea8ae9
+56659b75
+a800f41a
+4fd17e73
+489e090c
+f438658e
+c7d47a71
+54cadf79
+d8534b2b
+dcc72519
+79b0147e
+42aafe6d
+d1baf1d4
+d4473940
+d3e15e1a
+cd9fe13e
+556dd4b6
+dae87672
+4ebf2637
+7a719f12
+e8d8ef8e
+392ccbb7
+ea124ac7
+610358b4
+c517a192
+0167d85c
+657dc3b9
+85a5c0e5
+d98d57ff
+a180116d
+d7091b25
+f8abdbcc
+4df3ba04
+6df7610f
+6f85b167
+e53a9b5b
+16c6592a
+b1a3f5da
+c8458f4f
+9124e6be
+994033a1
+6460fab0
+45a69eeb
+40ed41e5
+41afcc19
+dd47ba3b
+46663de6
+55699589
+65e97d66
+71c47f65
+a562e6fb
+dbac10ab
+8c8a4c47
+4e4c5a3d
+d772d0ec
+4ebf84f3
+2dfff8dc
+4df3cb9c
+f8f91ee1
+83d5ea21
+a63f2c9f
+1b255578
+fc9fd8f9
+00bed72f
+aa76e087
+d9c78f96
+e27d0414
+8c23df31
+a42c24d9
+41b7847c
+3c6752bd
+2fa8be0f
+8203c75e
+5ec783e8
+7b3460d9
+5916d5d9
+41274cd7
+9c31f3dc
+8877d8ee
+e26ca125
+89fb1807
+fbd91004
+c446f801
+e4188c35
+9e128e8a
+ab5bc535
+81c9e43a
+24d89f30
+dd416de1
+3197d543
+0cf0c892
+645fd4c0
+73fec7fb
+e3b2879d
+21dbbb54
+c3d6b7e4
+1b69e68d
+0c8cc302
+451fd92b
+0fa0d423
+466c08ee
+be70c76d
+5a9c9fa8
+9b3bebb3
+f26ad04e
+328b42c3
+d19a4440
+51d55e9c
+20eb7fc4
+0cd9d1eb
+55c7c029
+eeecd0cd
+5bee7f8a
diff --git a/models/rank/dcn/data/sample_data/vocab/C5.txt b/models/rank/dcn/data/sample_data/vocab/C5.txt
new file mode 100644
index 0000000000000000000000000000000000000000..93d3888f00ca9d40867074ebf2d529a5c6c456a6
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C5.txt
@@ -0,0 +1,54 @@
+08da92ce
+38ee9c99
+b0530c50
+8c837181
+492fbc44
+b4bae0ed
+bf9f7f48
+2319bb7d
+06afddf0
+b706ee81
+3ca0f876
+2c6b8ded
+28bce97a
+afcf7897
+4f8b7acc
+26eb6185
+e856df70
+ffab0078
+4ea20c7d
+47f98056
+f3474129
+0942e0a7
+db844843
+1524de30
+b2241560
+b974d47d
+503ebb08
+5a3e1872
+db679829
+384874ce
+f1d40cbe
+46dadd18
+a9411994
+d9131ab2
+dd2d8e4d
+4cf72387
+d3ae3ce1
+3597f508
+f7109724
+30903e74
+65be028e
+27bf8a17
+3bb20e22
+f281d2a7
+d5b7606b
+cadb6f23
+a93acb09
+229df405
+307e775a
+43b19349
+89ff5705
+a444653d
+25c83c98
+42d2cbf8
diff --git a/models/rank/dcn/data/sample_data/vocab/C6.txt b/models/rank/dcn/data/sample_data/vocab/C6.txt
new file mode 100644
index 0000000000000000000000000000000000000000..67c73128c2b885ee37dab1ede607a559edc8512f
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C6.txt
@@ -0,0 +1,10 @@
+7e0ccccf
+f1f2de2d
+c05778d5
+fe6b92e5
+c76aecf6
+6f6d9be8
+13718bbd
+e3520422
+3bf701e7
+fbad5c96
diff --git a/models/rank/dcn/data/sample_data/vocab/C7.txt b/models/rank/dcn/data/sample_data/vocab/C7.txt
new file mode 100644
index 0000000000000000000000000000000000000000..5335eedaaca0f062c3d7e548f0167824284adaf0
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C7.txt
@@ -0,0 +1,3213 @@
+175f4be6
+ac1e51fc
+2c7105eb
+3e2ef4e6
+5547e1f4
+52b4e012
+6716577d
+5f29da0e
+e3e35adf
+def296ff
+d7f3ff9f
+56f361f1
+33e2ab71
+b02ec8fd
+6c4b228c
+b28fa88b
+fc38ad4a
+97b87d6b
+9e6949e1
+3a9c29ce
+727e3bdb
+05276f0a
+0738547e
+c349b5ec
+7f2c5a6e
+d5500dd5
+9853338f
+94a113a4
+cdbfd303
+15ebc248
+34c90c45
+511c5ce8
+91992e62
+55b80a3b
+01eaa539
+59bbdf75
+15b2e600
+aafae983
+a879efe1
+2cce877b
+c1e20400
+b3e8889a
+4e222d7f
+8de2dfc7
+c642e324
+8f31f6ff
+18671b18
+f82e389f
+c75f034d
+7d251743
+39895754
+23995c53
+06a7d0ff
+a9fe36f4
+ab495772
+575686cf
+3edc13a7
+e5c32f14
+f5d20afe
+f48f6b93
+4bcc6397
+a178dc8f
+ff493eb4
+63b212f8
+01a0f3f6
+4f32010d
+48db53e5
+3b097cb3
+3fd38f3b
+3ffb655b
+a6af5059
+dd645c34
+de7995b8
+24e8ca9f
+8672200c
+69b885a7
+383a8b09
+49b74ebc
+8e974a77
+9ea2e0f0
+dc0aae79
+d18f8f99
+7a57d343
+27cc0b50
+4642c514
+e092b976
+9ec884dc
+815cb136
+a2152621
+0dab78da
+99706856
+9511db4b
+d135c004
+49a74e66
+4e784388
+e983795f
+5c116cc3
+c2d69606
+e05f159b
+d1900ca6
+baddb49d
+2047fed7
+5d4c7462
+8037ee10
+15f4d8fc
+9b8dbe4e
+3d067f68
+65d3801d
+061da6ec
+9c687a4b
+51a176d2
+6d490d7a
+e9e7564c
+28acc02a
+63310cdc
+ac652f55
+d6e25b09
+a3dac0df
+21518109
+5e21596e
+f68ba07b
+81bb0302
+f1f19b87
+472a591e
+3f8cc471
+0d339a25
+1ede2f0e
+bdf75bb3
+61beb1aa
+366a171d
+4bd812ed
+c7713316
+718ec8b1
+c5919aca
+7307f77d
+9115f277
+a44bd14e
+c23eb1cf
+0388fb8f
+8fb5446a
+3a34d722
+99ceaeb8
+6af7dea8
+15ce37bc
+18568021
+f417bf96
+7bd25275
+86b4a73b
+12df1262
+38a0b991
+23c51afa
+9034b6f4
+a86d9649
+9a68af50
+0cc37e77
+adb6fd9f
+57b4bd89
+b816804f
+dc2b40a4
+d4bd463d
+5cc3d947
+42f629ee
+5c8931c6
+1d2ec7e0
+87cf12c3
+1318627b
+431c274d
+a38bdcad
+e13ff1c4
+3e702f2e
+fd3483f3
+46e7b789
+c52b723b
+cb387efd
+ed433b46
+9bc5edff
+423ac2e7
+d0bdaa98
+8873d510
+7a019822
+4d9d55ae
+5b07414c
+f6619575
+b4ef5b52
+dc90e471
+5e4f7d2b
+27c992fb
+a3550aea
+6d917ca9
+82cfb145
+b8645299
+51ecf77f
+34ee90db
+5c4e3b7f
+c4efc2d5
+496b90e2
+0f26c0aa
+00f44282
+88afd773
+c5252395
+7c7f8ebd
+60daccf7
+21dba0b3
+c70acf1d
+5f1727f9
+4d0498a5
+56fb8166
+189f2643
+606a51c5
+082d52b1
+461e4520
+58385398
+d297f888
+c69fca72
+8a32280c
+cee7e902
+71a0c30f
+82460b2a
+9762f2be
+b7471958
+ed535358
+b94b41bd
+820c5119
+e05853b8
+0550a3ec
+e4a69765
+c1158194
+fa44c4cf
+afa309bd
+812b47ef
+0d5ac942
+c14e7c84
+fa9b0b5c
+befcdd25
+2b1607b7
+df5c2d18
+f4995d86
+75091be4
+2da6b2cd
+ab20436d
+3f4c6e44
+fc2c0a2a
+71ccc25b
+c96de117
+36e6bd59
+7dab17c2
+5abec7dc
+e30c82b2
+fb0b594b
+5adcc5c0
+0b507e32
+b5346873
+3c0bbea8
+62d6c184
+e7a3c783
+2b3ce8b7
+9bf0f6d8
+0dbf2675
+7a36f8ea
+2de87ccc
+835761fe
+8dbe18ee
+f914ec7a
+c0224b89
+9043dba2
+969b8e03
+cc5ed2f1
+e168afff
+14956523
+e1c95382
+6eacaa38
+095d6b9e
+2fef7371
+a33f91da
+a097ff18
+83070dcd
+af28b684
+85e1a170
+66d269da
+b3a5258d
+439821b4
+a2bea6d8
+9db7a15b
+7ebb604b
+5d7d417f
+0ed4d92c
+68d98d74
+ade953a9
+12e093fa
+0dedf29f
+c8e6227a
+c4b69de7
+e4c15eea
+417ee83b
+7a9ee4e9
+e09c981a
+96ce77ca
+8ea060ec
+fbef5b0e
+fe4e75fa
+b92ab722
+d7d892e2
+e02d15e5
+ce6020cc
+9b9962ac
+4b243124
+5d2bb5c5
+c644bfb4
+c23ffa25
+26904bf5
+0b0a1a1b
+271555b7
+297835a6
+28cd7a0b
+2395bf4a
+d252d044
+632a4eba
+97006d5e
+fc658742
+fcb51d01
+04eff078
+8a1c9c84
+f85fe9bc
+183ceacc
+8a6600b0
+fae8ca82
+0ad52247
+781b6b88
+ef10547e
+92ce5a7d
+4dfba5b0
+4c019ab3
+9882d927
+397def4e
+649c7ded
+b46e01f1
+2c6cfbf3
+4feec62c
+a337e95e
+8ee96e41
+670e4ae1
+59434e5e
+3a6d4c08
+087ff1dc
+b6a34586
+44ce0dc9
+a788607f
+cffe7bdc
+6253811b
+a85e71ff
+bd6361af
+112b9a9c
+5fb2890d
+040964e5
+753aa291
+5ce94b48
+84912e13
+85a54c46
+3bb417aa
+91758491
+c722e601
+0dcdd42b
+4da83d42
+81f76acc
+0afeae7c
+383764b0
+17b47bf9
+0c8aa424
+e717dc9a
+d3ce2398
+84eefcc9
+b9c51fff
+5a3586d1
+50afe1fb
+3cf1ccde
+1a95b4d0
+058a08aa
+408f0e75
+07d03e2a
+bcd25092
+9a75d128
+9f525672
+34c24f33
+ec945943
+ce8217f8
+5b051c5b
+df196ec8
+519d7a9a
+3249beb8
+3344da99
+2c7d33be
+db6d175c
+a14880c9
+3ceda19f
+1971812a
+830c88f3
+1c5f1bfc
+b000eee9
+77e91f62
+35c6016d
+5788f21d
+76b637bb
+29ad44e0
+16553469
+a4f02a4a
+7a341aab
+dfd65057
+ac7cb372
+6b34b7f2
+71ddaac7
+e45fa080
+d657f077
+24eeddcb
+03817bb3
+f40a4522
+d9bcb70c
+ce90adcf
+1035913d
+ecda0af7
+3e36122e
+ed0b84e8
+361fe71f
+d5e0ab97
+b2bec27c
+f9c471b3
+0697a6a6
+c5cd50b9
+3cbff1a3
+678e9b5b
+4e134c03
+af02c678
+88822b23
+372a0c4c
+60e1c24a
+53e14bd5
+9d38b00d
+65449064
+049ee846
+968690bc
+b00f5963
+f2a82962
+9caac933
+08e57a96
+2d3724a1
+959a03b7
+780710da
+5a5b6b26
+a2468490
+1adc3003
+086ae8db
+5e7fd737
+33393315
+e15d33f5
+1b2007fe
+320d425f
+d0e2f16a
+6b277f8d
+80a53d13
+363ef5da
+a70b42bf
+04cd1ca9
+67997a6a
+199c1eb3
+241e6c1c
+749f0591
+3086a9e9
+894562a9
+56f91e46
+ed67826c
+19cc4a94
+3d384ecf
+e1b5b276
+9140e6ca
+7493059b
+2823fac6
+fc9d1bd7
+5ed5f40d
+45bab68d
+0d273fab
+b724bd80
+3aaa5b43
+cb888fb1
+1c2172ce
+a90a5352
+58ca4e16
+cc8abb17
+08b9a859
+e7d1870f
+745ab0bd
+b9c2d157
+71458861
+52b7c976
+f819e175
+f626266a
+2f93c675
+f38736bd
+0515d6a2
+d4ab9344
+6fa3c1a7
+a0135c90
+968a6688
+e3082cb7
+7eb00b77
+bf9d4f90
+81f7f73c
+fb6ff985
+b423dd3e
+ea32d016
+675b6656
+0a07cc10
+720a6d84
+f376e33a
+1bc6d75a
+97169d40
+17cd3087
+d8ba4ee9
+9aba5215
+9d5117c6
+a4739bbf
+0fcb1581
+a7a8c75e
+fdd66c0a
+1a671428
+eee2db9e
+91fb7d50
+aafff5ff
+1554a783
+cafcd67e
+ab42cbc3
+5e4454aa
+34dc2e70
+dced2023
+b2a55fc8
+89571866
+dfb6e687
+e8486fac
+0a7b8169
+93d83d33
+885a157e
+b2b07c38
+0be0c9a6
+3562f8da
+ec258437
+e9396c09
+93b19353
+e84d3fde
+ecb24b52
+a868b53e
+22ff0182
+f9e7e1f7
+a51beb72
+6fe4c2c7
+136f2270
+3b4e72e9
+50a5390e
+06e43bba
+24bb62fb
+e746fe19
+c06f9e3c
+744033ad
+82e199c3
+4a7e820b
+1b19041f
+50024640
+b56d3d42
+928efe34
+acb8e1b9
+5636dd60
+22d01a7f
+1e638f87
+e807f153
+a09339e3
+95b0fb8d
+d3d2b34a
+f1565765
+8d8b98b5
+272fd25e
+804d2f11
+354d03e6
+b796c337
+d6348309
+cc0bd899
+01c31e6c
+f46926c1
+6edd3a80
+0ba783a9
+283a4abe
+d7087b39
+1c287f78
+39a57971
+038c3c09
+0d59e258
+1a30bbac
+b1ebef4c
+7b6e81f6
+05010f39
+8c2fedb1
+4bf86524
+c437f9a8
+71f8fa56
+bf115338
+0d62424f
+9a8798e6
+94224a91
+849ec99d
+0f13229f
+24c48926
+c895fed0
+bffbe599
+8ce2a590
+0641c280
+e5332619
+42d650fa
+2d99b1f3
+2e62d414
+ceadc1bd
+b4ac002c
+f2530a89
+1f0cb5c0
+40e38160
+6283fd6c
+ec1a1856
+a4870320
+9b364a3d
+f4c7a105
+b5e1898d
+0fba0dbc
+0fdf56d6
+799a5703
+72c0043a
+432ca5c8
+fdc48e74
+95f7bc50
+ae33f6b1
+f873b3b5
+abb4b4d7
+b7ad21e3
+48ed6f4d
+66b0ce37
+820e7ca9
+1238f7e7
+b7220c9e
+e76a087f
+8025502e
+40635b15
+5af90a82
+c714065c
+c0251c88
+f1d914a9
+55fc227e
+538dcead
+ee4cec21
+f913fba5
+a4113d19
+424fbb9a
+03bcb38d
+11401ad8
+52c84489
+7ae300e8
+efdc98b7
+1fdea705
+0808742e
+f3f95dc8
+65c9ee01
+36a49eb9
+bba1c324
+e300b4b3
+fd78c7c1
+1b76cf1e
+6ac49c28
+a255dd63
+7a8beb53
+0c0c2a5d
+7acae6b9
+bd9a3e0c
+1fc0346b
+7409a10e
+82f666b6
+d12723dd
+7fcb1ce3
+9099e7b5
+3b16ebba
+01f89068
+a5cda78a
+5547439e
+aabe15c8
+c9255aae
+7bec6ac1
+c78204a1
+fa19d92a
+0b28383c
+c1225605
+3527bb7c
+7bbf62b0
+61f2f170
+a7709056
+de9d892c
+8d156510
+d01ba955
+657bdb1a
+44424751
+8fb24933
+f14f1abf
+09adc20b
+019bb335
+e227b3d3
+89fad423
+09ccb863
+e88f1cec
+a6950954
+bcf6609d
+5e06a5e6
+6f63b155
+46fff0b1
+a3d51543
+8a37465b
+e465eb54
+2c34f582
+c51e7976
+7956c2ff
+0df4df10
+91282309
+4637152d
+5cfda1ab
+7bcc368f
+eeafc8da
+ce4f7f55
+04eaba96
+d5c1e844
+06a10ca4
+3b1ed307
+fbf976ea
+b38200e4
+adbcc874
+a5849528
+e6ca792d
+3a0b5107
+3ff49afe
+f606f91d
+01aa47c1
+fe872bde
+5d055371
+f188f978
+def688f3
+3296387a
+21ac4489
+b4c1a392
+daddc43a
+122c542a
+14ba4967
+33656cc3
+230105cd
+272d4827
+d59df9f7
+0465bdc8
+34459022
+df1c3c63
+176dc88e
+bb900cf3
+4be7ed03
+fd5c0ed6
+3ba5eec4
+e39d3fea
+7b26d3fe
+60b63001
+a012896e
+92132b65
+c62dda97
+a37d0015
+60db3a7e
+08145aa3
+6be1db7c
+5b3d71b3
+cd319e28
+8c327098
+4c1a8299
+79af222c
+50b860ee
+c8e48a82
+3b3a8277
+03a2a612
+2f61f432
+533e114c
+1cb331ef
+87e1825e
+0d00feb3
+52f754a9
+ab6c60f2
+19c9e23a
+02b42a34
+3f35b640
+516054a1
+8f801a1a
+b464b526
+d2800119
+d1e50403
+166a3711
+72cf945c
+2f5788d6
+ab06dceb
+3f758c52
+9b0d6173
+8b9aba09
+2fdd0477
+18d476b5
+099a6236
+1874479c
+93ec533b
+75861c28
+5df71803
+761bbc1e
+315e4292
+6b8a94e7
+94aa68fb
+a655f3db
+c0527891
+fd88893f
+7e516c92
+05254e29
+4b115a24
+3e47364f
+a464bc4c
+315c76f3
+a5f981f1
+2d91eedf
+4977709c
+d45bbf18
+08deaad1
+608eb4f4
+9d328a41
+fd8ca9eb
+124c72d9
+05782bd9
+caeaacb7
+d25406d5
+836cf3b8
+a25cceac
+e575438c
+d009ea70
+7d733ece
+49eb0b1a
+4ebdc6e2
+0889a633
+5417eef9
+16a2e9cb
+3625ff87
+ba0ca6c5
+9eec359f
+2489e185
+e0386b63
+1dac1752
+306a1d05
+9265028a
+e3075893
+368e23c1
+51794392
+24cf8b9b
+17c22666
+46352858
+d0519bab
+cc8ce7f3
+79e90973
+ed54b715
+bbdb3d58
+6aff3cf5
+5270a50f
+b471ac4f
+32079c61
+5489c59f
+d89d88cb
+21d34d1e
+dae33a8e
+1d94dd40
+ab34a858
+ebd71761
+484b01ec
+9589af59
+c7d96c76
+4af3481d
+86e5b4b0
+184fb3cf
+d45bf916
+f12fe6e9
+4bd57ddc
+177373db
+86aa3994
+c8b3d034
+ea98b75d
+0d3f4163
+65e1a370
+234b2838
+4757b5b9
+18ea2c75
+691193e4
+93b64cee
+b791f05d
+da33ebe6
+dcd8f4dd
+a3384403
+5a3fc03d
+8f02eb3e
+08e221ec
+052969f6
+f1ff45d6
+2a9234e6
+e2942acf
+60f43665
+dde8fbe1
+83b5d68f
+eed87f81
+07f0c740
+75344eff
+b5d27db4
+90b1c15b
+140343ba
+fc7be191
+e105792a
+874ed57c
+08155fd0
+59c03bda
+68bb64a9
+e0f2cfb1
+fc8b993a
+bde86fea
+02dfb6ec
+8b26c36f
+b097f76f
+08383675
+a870a74a
+e798d858
+b687a92f
+ffe5c96c
+26a81064
+276c620c
+87f71d3e
+a0e559da
+f4ae27b8
+7127d5fa
+9ca82575
+d70f015d
+38fde89e
+5186ec40
+d419754f
+e6c8d4cd
+f27f598d
+8379baa1
+83a81c7c
+6b157a36
+5c43c7a6
+d5484ce5
+b43266ec
+8323555d
+32bdb7cc
+463b5452
+e211bdcb
+eef6ea32
+0452b338
+7ee86170
+178a87e0
+5b10fdbf
+3447df37
+d0478358
+f839546c
+4f90c59c
+9e13c42a
+0d15142a
+78e68386
+3e42ce05
+9a4f2943
+cf3a5ef5
+34bc6d4e
+5879b64f
+4a055def
+631d3105
+bc05fd44
+fd38440a
+8b008629
+e90f312d
+19680529
+f0a40c67
+4b90fefd
+a7c6130a
+ffb26bfc
+bad6ca74
+394bde93
+caa2ce1b
+00583f05
+47dcdb4b
+02dabf8f
+c519c54d
+a40d2174
+6aa6cba4
+f36791d8
+0534d457
+5a46027e
+f49c4f98
+8ce3a35f
+f4b9d7ad
+d3aa6bbf
+201c9b7b
+4b3c7cfe
+f4ec3e93
+840c9106
+3bf1c027
+0e3faf03
+22009f3b
+3341567d
+0dd60320
+5ffd8025
+bd4eedd9
+82615655
+bcec77e9
+b45db245
+4d045f61
+e2ec9176
+f2487bdb
+a7ddc5e9
+3fe8889f
+eef6b4a6
+871caac6
+c147e81c
+ffca8f48
+21fe8907
+82aecd2d
+24562a27
+a4fea880
+cbf3e672
+66de5af0
+543d9801
+2f9aebae
+d82ee34a
+db8511f6
+5349e810
+6284da2d
+cf4e7204
+2ecc40d7
+996f470c
+7892e47d
+44b5f4db
+30af5eb4
+9ed4ccfa
+eba9a360
+e8e4d487
+e14d6a33
+d55d70ca
+3368987f
+b3f805c6
+4746e653
+def4a4d4
+1ce56d6e
+fee0031b
+5acc7d7b
+93315d47
+46067623
+7da52276
+59666442
+68e20378
+776ecf80
+5b77bea0
+bdb0698c
+18ffd618
+0db090eb
+b1537aa3
+7d18dd4b
+5b6df03a
+d161368b
+d0f8c04a
+90a2c015
+44abcdd2
+e700dac7
+ec2621cb
+0008893e
+da5b96c0
+add2820b
+7537857d
+10cfa4ce
+e31fb017
+4eac7c46
+fc44398f
+3c07cb63
+e0da76c6
+ef0d76b7
+2db80152
+cdcb58a8
+12c61956
+36df62f7
+3beeccff
+60902511
+4f27f3be
+c0698233
+c85d1754
+02d72eea
+a4586e44
+24349845
+9f27dd12
+d84de2eb
+216a1127
+78c0b2ff
+bcb9266a
+28f6409f
+b0142235
+ac8dee1b
+579e8498
+3a53b968
+12b681be
+2d87b336
+6b691b66
+b9ac220a
+8d452fe2
+f7bcbd6e
+f2f89ef6
+64341056
+62629a0c
+d50ab3e4
+9e5b069d
+270fb9d7
+ef519766
+0cc91c69
+a1bd68f9
+3625d5b8
+b09434e3
+14667edb
+ea259ef0
+19a30e1b
+5346551f
+860f4669
+6362d73a
+61bed5b3
+6f208241
+0fff68f0
+2ebce834
+43f947ab
+5ca19802
+db4141e4
+b72ec13d
+c9f171f9
+bd4d5d08
+cd1e2562
+a247195a
+98f5a60a
+d9e56639
+26e81396
+970f01b2
+c3ebd8ef
+6a871e6a
+ac0c5c8e
+bc324536
+4c3504c7
+b10db6c1
+22976346
+992f2706
+b8d2c80a
+05fd9bff
+78155071
+b25246ae
+5aef82b1
+e4b6f85a
+fba4eff1
+f929761f
+be166e51
+9af6e535
+130c01bb
+a1017c6f
+5ccd151c
+b80722d1
+d5a20cbc
+3ee4221b
+46388d36
+bc3882f6
+8963c187
+c03eb803
+17136be1
+ba16eb32
+d2c6a893
+00239ba4
+61ad49e0
+6d51a5b0
+01c5751b
+0adaf163
+e0e691db
+38ae26b9
+300886a8
+d8aca856
+6d23e366
+80330060
+19fa9bdd
+b54c1364
+53b5f978
+2c62f209
+4b0cab49
+0111f837
+7e46c875
+1183f3a9
+d1a94340
+be6ddaca
+6da2fbd6
+228c32d2
+64917feb
+af39d293
+e6ff8cc3
+a36dc9f7
+1d7f26b5
+f4bd06d3
+07d75b52
+ada39ce1
+b7e459b8
+e3b8f237
+36a88c96
+25d97f40
+ff242002
+99609181
+d48d8f92
+2dba0f83
+64df3d3e
+e1d3ed05
+f2744d04
+a3579031
+c5dce8fa
+3f125956
+3598a741
+0016c23d
+51700b94
+07ba4505
+4157815a
+e0f3a9fe
+1d6f0c12
+0b31ac6f
+5fd3419b
+412cb2ce
+be48ad1b
+24db17b4
+7a6960b8
+10ff3d28
+a9af10b0
+be8b19db
+df59a42d
+9c5b7db1
+fbe7539c
+99cb8fdb
+7c627251
+c74bf5e9
+3d63f4e6
+c48ba52f
+2975580b
+f2d80b52
+0b72a0e8
+0f4edcf3
+60143fb2
+6978304f
+6d174d7a
+b0da3612
+c70bb7c5
+97e8991a
+5801d8e0
+52307c44
+70f05750
+a60279aa
+925f31a7
+34737292
+19411fa1
+f5bd9017
+75550cdf
+73eca724
+336b43c1
+20552046
+2b2fa660
+82a0280f
+7f1d4b97
+a972360e
+b89d8b78
+83be4a6a
+cb984a3c
+2c7bd1f3
+90679357
+3f165d33
+a0845add
+e500bd33
+7e64d215
+e1250122
+d7224acb
+5d627b54
+289680f1
+ff64bb9d
+0b6d8b0c
+9676704c
+c6ce3f98
+0d43cf13
+846f2a52
+281b532b
+46cf5e29
+68fbb662
+282f8d99
+a19eefac
+355bbb36
+08bd9583
+13074a34
+a14c6a98
+605a00e8
+d5f2e4bd
+0fc3d825
+5b18f3d9
+7a3855e9
+177c6cda
+18dd2280
+fc8f52a9
+b28deb14
+58bfa98e
+b892a709
+5b9ec2a7
+c70d17e2
+cde71155
+52283d1c
+496612dd
+a7adf5cc
+8324bae5
+134011b9
+1adcf51c
+fff1d454
+7f91384a
+728ec517
+c151233f
+4c01b45b
+8f125065
+6698833c
+f2cb6404
+36b796aa
+ff68bab8
+860f347d
+40045b28
+75ed12c9
+59696d95
+e119f971
+bb3f8e2d
+1c2bde14
+7fafef37
+208249e2
+407b9c2c
+c68c5cf1
+b5f54d53
+d61751c6
+8e697795
+d600c2ba
+0823a1f7
+b00137a1
+fe2bda20
+75dcaaca
+dad4d7f9
+8589418a
+fb0e0635
+f23492bc
+7580882d
+297f698b
+a95494e7
+28476024
+f1eb9b03
+468f29c4
+ba8bba08
+a13be9ad
+b96ea7e3
+b10a99f1
+382d0109
+bc252bd0
+32da4b59
+e8a0f270
+c0a9460b
+88540f9f
+63e83fa0
+045eef77
+95e4914a
+6354ea7b
+ed70802a
+01620311
+d20b4953
+ad155974
+7bbe47bf
+410f6b58
+f84840a1
+40f7b9c2
+2b195774
+80de4e5b
+3448acb9
+8456bc5d
+d9aa9d97
+26a2de7b
+20826cf3
+d5b6acf2
+42695288
+4524852e
+47d9a620
+fceb837b
+7f14ae96
+1c3e7071
+19d53f72
+f90789ab
+293a661b
+dca5a2e4
+be6ce59c
+17024f49
+f84e6f4b
+7e72e3e1
+ea6998f9
+37b3a046
+95e45eef
+60be2adb
+2db71de9
+8b752de0
+25c64e98
+d40ca917
+cb0c1a8a
+1f76565a
+0de33c8a
+bf1d0717
+ea03829a
+e885c7a5
+1dc1a765
+cee47266
+d6293852
+962ea28e
+cd219a7b
+87e29668
+4813a129
+9fcaf15b
+5192dba2
+8d56bbe6
+74752b9b
+6fb62f1a
+dbc0d030
+5326a2ff
+3eec36fc
+8d1db846
+20481192
+fc25b411
+2505ae8d
+e56a4862
+b3e225a6
+da1b1519
+01aabf93
+7669afed
+e9bb803b
+1d42f25b
+3466f0e9
+3386c3fb
+c17559d3
+866e2a5e
+b8e888c2
+48032855
+346d97cf
+70c46fdc
+15055a89
+5af69d36
+c7fa17ac
+df9c8a42
+96deb8ad
+84c427f0
+5f45be7b
+ba24e6c2
+38850d41
+650f5583
+d7c52953
+e814f1fc
+80f9f5dc
+90078967
+e3bd67d4
+1346a1ab
+15b49232
+758cca13
+b430d61f
+e2bbffc6
+aa1c94e4
+93b82143
+c3a38a35
+7ff7f0e7
+b8af7109
+9498fc71
+bd3ca059
+fc6b47d9
+922afcc0
+7b73c084
+d7a32f49
+896d0c90
+c3d9de60
+20e2f986
+20784db5
+45607029
+a2b7a122
+76fc09f3
+e429020b
+7ee0fcbf
+b751550f
+7e5d650d
+78dcd87f
+35ff5092
+f6ce794a
+3e5a89e0
+ad32de6e
+fa5d9d01
+4a9d0dab
+1af58b2e
+b71cc18f
+938d6619
+02779f6a
+86946882
+3dd32b15
+5d96cba2
+7d47cc7d
+56cce13b
+ff207cd1
+4bd081bf
+7079e499
+a296768b
+dd028a70
+15f8eb35
+69bf8f43
+f5047e31
+380ff654
+1faa5ca1
+55992d5b
+c3080988
+5e212158
+2019efc6
+50db917d
+715bf772
+c9f2de31
+e80c2c96
+1c86e0eb
+d0b243fd
+a19ea455
+e824c09e
+cb338fa1
+a9e572b2
+27bc044b
+7840f00e
+0636947d
+01ea2333
+a57e209b
+0950d862
+316949b7
+cc168a10
+b7106cc5
+7848490d
+965ec239
+f94314d0
+736bf568
+a1eeac3d
+8ad7e7bb
+8818fa86
+ef77ce01
+74e550ac
+744efce0
+11c75000
+a49a14cd
+19be9f03
+22f0e375
+1ef3325b
+ce813de3
+62ecd427
+03b3628d
+e88b2a65
+85e7f6c9
+3bac37e5
+c9fe8af8
+87feb968
+97f8984c
+ac2d4799
+54b86f81
+2f24b2ba
+6b7b5a6c
+69a21119
+f1be50b4
+8be05aad
+c6b039d2
+06824fb6
+0054a97c
+4a669be7
+10b54add
+30067bb0
+e14874c9
+46f058e7
+d03697f9
+69079877
+45db7ec4
+2f6a74d9
+6e0566a4
+eb6d9fd1
+f20511e2
+41caf149
+bef3bee5
+301e3e06
+0f52e8d2
+e94e2844
+46e95fd4
+bf5752ee
+1e3bdb1b
+08a6c211
+43c04b41
+6dad974f
+70b61eb4
+130e5a79
+2e021363
+ad978c4a
+e72c88ea
+5e8b4856
+00ed31f1
+95b7a1e7
+36bc192e
+ddfd5c11
+1c11c85d
+2c015dd6
+a4b4a9f7
+f8748bfc
+1d794a16
+085dd100
+705feb79
+74f473d3
+d69f74b1
+4ac30148
+9f6e2821
+b793da4d
+3a4f29b8
+14ad5567
+94b76214
+0dec8040
+c3d4a686
+e3fb0ef1
+00f47310
+564c0fdf
+82968d75
+d96ae1f7
+39cc02c6
+d0792267
+15b50a28
+63282fe3
+559eb1e1
+d4edb6c0
+d3e02226
+8f99333a
+b3ddf65a
+8363bee7
+3f4ec687
+dc7659bd
+278fecb5
+a64ea28c
+def88231
+1e3ff2a7
+2da66872
+421b4ccc
+b5043bcd
+f970e59a
+c4e64dcf
+dff12936
+d385ea68
+b7208332
+b07a6d6c
+49ea3b45
+a088b320
+9df35e84
+fed3cb1d
+9e7ef243
+cc244126
+5f03e1f1
+1c52337a
+c32eaa9e
+15da22d0
+ac44ed7c
+019f06de
+df12968c
+5b0479b1
+1e1ea31c
+0f011b03
+d130ce5a
+f744980f
+d3c333c1
+a159cfab
+f98e692b
+ac07b602
+25b74c9b
+998526ec
+7817139b
+c0bd6312
+98444d02
+c80423ae
+1adfa886
+ba5d463e
+5c16df0c
+513bc7e9
+5f175e17
+33cef4a1
+2e440430
+cd8c0a8c
+68e0c539
+220efc6b
+1e43bd3d
+dd363e61
+df533a04
+9e5d3694
+d41c6069
+dcbfc6aa
+4dc8730f
+b8b6e920
+678fa00c
+b52bc475
+189c5ab6
+40a7d89f
+f77109f8
+c0914fca
+42e4941c
+62fc022b
+38eec06c
+cd20fd78
+f1e0eec7
+8f359715
+ff3f3dda
+0d51177c
+3cab4fc4
+170d6d09
+6f714715
+31be8a17
+0a8f8836
+db3b7bfc
+c42bfe5e
+fa2eb8fb
+3c5dbf5f
+8826c65c
+6c5b8266
+7195046d
+bce85ecf
+ed73f170
+052e75f4
+56274898
+0e352bf4
+5ba5b40e
+e677b4da
+2b99f7f4
+20a04274
+86651165
+8a9baceb
+9b7f373a
+815e3303
+1354087a
+ac4db851
+9a2c88ab
+89a13b6b
+b1c676fe
+d40710f9
+70a02d1e
+fdfda897
+0038e65c
+84c24e61
+5f2133cc
+218a739b
+36c0fcc5
+96152eb1
+4c5c7066
+40e2fd9f
+5992d1e4
+36b21dc8
+437e46f0
+d4134c20
+197b4575
+9d8d7034
+555d7949
+a6624a99
+0da96972
+52d62090
+8b9efd53
+9f35dfeb
+12343fcc
+bde13382
+36862899
+e99c5f57
+38377571
+9b98e9fc
+7651692b
+cc2b9a62
+3514a1ad
+88059e21
+d99f6c37
+65cab4e7
+2952cebf
+23202016
+99737b67
+05c5e3c0
+809e67f4
+09e42cac
+f4eba758
+5a91237e
+4b46e434
+bd1d21e2
+6d5c16c7
+de59dbae
+863329da
+27a7f9e6
+7f17566a
+357f5397
+2575d83f
+76c12254
+26d35971
+8271e3c6
+10b653ed
+b647358a
+95cfaa9c
+764a17a5
+a97488df
+63d6c0c3
+ef1eacfd
+b913131e
+44fb02c7
+9a62af90
+516ce6f9
+55e0ab5a
+c44012d0
+8e257943
+65242065
+08b48f3f
+9bcd1771
+1b1aa9ea
+709df051
+bbd3100d
+2dbe7afc
+3b47042e
+206fb529
+c5e09334
+26f5c8bf
+dd895aff
+4b219154
+df7005ec
+87283599
+f93c8e47
+d2bfca2c
+af6e2f81
+038685fb
+2f72cbb2
+64f227d9
+ee065f9d
+bdaf7920
+60d4eb86
+5fbd9170
+f54b0c69
+6d8017e7
+516d14cd
+f0b772eb
+8ff3a569
+2829f187
+efaaa07b
+bb3b7ab9
+d823e6e6
+91208258
+bd1a1b0b
+0d70c5c8
+21880383
+f879f7cc
+ec80180e
+4ae3fa5e
+741ad05c
+91cb2204
+f536e44a
+01b2a2e7
+78c469af
+9c9d3be7
+05940d6c
+0adb34fd
+4488e3cd
+7ff5559f
+ec3a7885
+9c82fead
+b4b47f1a
+2a266054
+ec874408
+10bc08c3
+ae0815e6
+fe06fd10
+78737263
+44b87d93
+dd423537
+cdc0ad95
+1c4d06eb
+3e579335
+9d44e906
+b593ebe0
+002fdf0c
+639c285a
+ed862cd5
+41e6f3d3
+3e60567c
+3625547a
+c9b65311
+d52bae7b
+ff6b8352
+969e0f2f
+2555b4d9
+19413c8b
+ed8893c3
+7bd17596
+28a88aa1
+62722182
+ef9f2f1b
+e7ed3eac
+2a8c42b0
+8d08cf90
+9221c533
+0cf5c887
+e98ad345
+bfa1a33f
+90cedade
+dbb74fdd
+f8077d16
+6e8d859c
+f95a897b
+c9ede706
+a600f4bc
+b01d50d5
+8c1351a3
+75e6112a
+c6d8ea00
+57707173
+15eea2bb
+9e77062a
+150f9628
+a9111ff3
+21f5b15a
+547c7f2c
+9da5eef1
+eb1a4bfa
+5bc5bb8d
+8d801483
+665f37b8
+6641b1eb
+d252c510
+4ee00c04
+311d5143
+646b4bca
+668ca1ce
+08b17996
+374ea08f
+45fccddd
+4a230e59
+668aef36
+71fd6dcd
+c4cf3750
+737174dc
+a1b6fdc0
+f3da12f3
+23cf3683
+3baecfcb
+a4ca1fb6
+c59871a4
+d5b22f0b
+d5527617
+df1a74df
+5aa59340
+ea96ac60
+b6f811dd
+2cb4dce6
+6dec97db
+89465d46
+1c60da66
+8c28e5b5
+72ff732c
+96e0bb89
+3c88480b
+1767d1c0
+afc91167
+6dee85e9
+126ebfa6
+3f472c27
+26f017be
+fcf2aea6
+e910ffbc
+4e98835c
+9f3e4cce
+ab1ad103
+313458f6
+36268542
+159f4fed
+6e6e841b
+1c701bf7
+be0a9688
+0932e23c
+3115c52f
+2be44e4e
+bd6c605d
+5ba7982a
+cec6003a
+87e287ce
+2598f805
+888c40fe
+f2ce13f1
+151f2ea0
+fde3abb7
+86b374da
+e377834d
+6a00a3da
+16401b7d
+e7351254
+0ec6f284
+f10cdbc0
+f2775c24
+32e43e65
+b0eae64e
+668f94ca
+637b5b1a
+cd13dd0c
+a5785c33
+e52d8b41
+df871e60
+03692f63
+ce76f037
+85b03970
+48deb408
+800354a7
+08b798dd
+4a6c02fb
+ad3508b1
+05c1e2a9
+752910a2
+d03756ed
+9e0ed189
+622305e6
+d5630477
+410c07be
+2903ead3
+319a3bc4
+987da766
+7d565721
+953b365c
+1771cc97
+74456ccd
+9043b403
+0e7dd402
+f6d03c1b
+f593b214
+405fef77
+964d1fdd
+3fbde16c
+04716793
+f1167b3d
+6c338953
+9fe517cb
+b80f388c
+476fb13b
+c31847f5
+cdd4e327
+a98972ab
+28fc6e4d
+c6fa13fe
+42cdd679
+a4bbd4f4
+390d0fe9
+f65b0103
+0d3aae8d
+3ce91fbf
+ca4fd8f8
+186ff29b
+276a5de5
+a015ed43
+7a38dae4
+a9087a98
+1171550e
+a09c9a76
+100f342c
+0bdc3959
+77270b8b
+1c0d3937
+28fb6c20
+17cdc396
+ea7a79db
+1747b194
+8cf87048
+e2f9eb0a
+1e9876db
+6645e274
+b2541a33
+6909bd0c
+537d79c4
+ee72596b
+613433a5
+268ac29a
+57a3620b
+c3e9476d
+d9d4299e
+eb5e10b3
+cd98cc3d
+86a057cd
+53e253dd
+52a0c096
+32b74dd9
+5c5f5aa0
+d0d29b5e
+f449c697
+37044a3d
+262c51d4
+cd175af1
+fba98a2e
+fa1400c7
+4d10903e
+c563a829
+34cbc0af
+003b71c9
+7e9ea410
+d3b8f746
+65c53f25
+d2d741ca
+82313a01
+25504ca6
+09f58120
+6ad82e7a
+81dfece6
+842250db
+7b6dbb70
+b9fb854c
+9aedd0cc
+7724ab49
+a56b946e
+c7e847cd
+de0abb66
+7fffde06
+00727e83
+62dcba95
+103c17bc
+ad9fa255
+be6faea2
+0de51a54
+6574331a
+ee756623
+ca025a55
+32cd8777
+b3bd1abb
+07feabcf
+f033ed65
+29cfae7f
+89391314
+2ce919c7
+b2a27ca4
+ad720bd5
+9a16a5ac
+2c561953
+0e43ec16
+4f9b38fe
+54111a43
+7dfc13fc
+afa3deff
+c6341482
+6fd0f1e6
+ce66ff94
+00c46cd1
+d469d881
+c2052828
+4f1f0075
+c1bec383
+7ba29dd1
+ffed5bfc
+19945f93
+b043878c
+20183968
+6cdb3998
+0832c824
+aab60b68
+648eac94
+0df5d32d
+97ceec1e
+81a70a4a
+b226f465
+dcfa4e1f
+1fd159fd
+61a3f3fb
+c415ae55
+2e67d537
+368f84ee
+fb5b865f
+6753ba04
+e2de05d6
+ec9bddf6
+f7b97391
+15b1b6a8
+ed0714a0
+a4f7afc8
+52afb1dd
+a4756aa0
+ba000501
+5b9b818d
+65ae2219
+98f4456a
+28620315
+c8d70ef9
+43e336b7
+de25e5cf
+19167572
+dbaf11b7
+8b64e1c6
+af54cdc5
+285ee98d
+66acf824
+e1aea7af
+9fb3e8f3
+b315263a
+00dd27a6
+468a0854
+6a3814dd
+2e0b99f0
+4c397c64
+68532a38
+0a189418
+f828f7fb
+a70805e4
+8324f342
+210cec56
+f33e4fa1
+cc936bac
+58a6956e
+7fc96aa2
+356a3686
+c8767556
+0ceee1bb
+f4710bb5
+4194f62e
+80bf0b09
+2b700eac
+3d6c6f30
+004876f7
+6772d022
+3a0ca13a
+a8cee9fa
+68e35846
+2d897be3
+845fb196
+a4d7f58c
+69013617
+4f93c5a7
+94aea708
+67537fdc
+9e36bb88
+06f31cbc
+4bf6def0
+81b62616
+02eb3c72
+a16bb584
+adc26f48
+0676ad19
+f6d99763
+f41cadbe
+e470f541
+2a9761ba
+4f88b699
+a77b6a38
+f8008800
+320d5d46
+a3411929
+7a4379aa
+1bd5e279
+8422994c
+f01581b2
+05adf7a2
+656b7974
+ec2174ab
+11d5a05b
+9b11d405
+1a4bcc41
+f21bfcf6
+41e1828d
+8d96084a
+a477a3ab
+3da890e4
+bab49c91
+66d52e53
+9d99aaa3
+8ac87f94
+8871fecc
+1712036c
+7c59aadb
+b6005a2a
+bd53d88a
+a04aae5a
+2d332991
+0284608a
+821b1762
+35dea70d
+53c2def9
+4ee46fd0
+7b466955
+4e1bdeed
+795f9ed0
+aed379a6
+21e1ff13
+cb9b1a64
+b295b95b
+aa239c24
+0e9645a9
+c6c200b2
+2f943ff5
+10d51d22
+b230f1da
+eb8cc0b4
+04679a14
+1913ac2e
+5595bfee
+eb05c406
+894ba00a
+447d84b9
+632a71f5
+aad8a7ac
+8ebe9f8b
+111121f4
+1821502d
+376cbe23
+f5385f9a
+257c542e
+b06857f8
+3c72f31d
+9963c37d
+3665bfd8
+d2ed984a
+6996831f
+67b7679f
+5536f925
+02914429
+527e32d6
+1f3230eb
+ca328b68
+934229d4
+95c3fea9
+9fd51c1a
+e487858a
+e3768f02
+e1a192f3
+e308d460
+a10916b9
+37fec982
+e6cdbcfb
+fc9ef69d
+346ade38
+d833acd1
+b848dbd2
+3e6ff6cf
+7a0ebef1
+c7502d84
+e702f4b9
+051659fe
+1c0357f3
+d130cbd1
+9f61ea5c
+53a18832
+09ad6a06
+209d1929
+a2c2ee6c
+0492c809
+7bd8ddfe
+f1d3558a
+6a858837
+7504367d
+f310fff0
+dee9c7d7
+39d672df
+d1208de2
+72683681
+9ebbd31c
+f01779eb
+1662de8f
+9ff9bbde
+88eb86ef
+cd846c62
+80d2263e
+1e3cba9d
+a90a99c5
+d9b2b2a2
+04fe0fe2
+40d4ce68
+a2f7459e
+08c4540f
+a916d565
+f15bee03
+e85e0eda
+af8f31b5
+dad9fff7
+f79f6778
+1e41894c
+1f2bade2
+cdca04f4
+7b48eb6b
+11ffbf5b
+ee47b323
+ce4181d1
+c7f4d077
+71b05ccb
+e24d7cb8
+dc1a35f3
+7b915d39
+f74420a3
+6b406125
+c25f0f44
+940f5c4c
+3d4f5cb7
+9e8dab66
+1c7fe85c
+2bbbed4f
+fe1695ff
+1f5b6f1d
+d1f88533
+1478592d
+65b155e9
+309c2ca1
+63f702b3
+4ac7e86f
+22ca49b1
+659ee50d
+f44a6a1c
+0123fa59
+dcc1b63d
+cf03f39e
+3ee65618
+aa6de869
+2b31dace
+69c4f51b
+9b6a4cc9
+9d084773
+9ffbc792
+822d177f
+f1a65d92
+966736bc
+56fb669f
+99ebb9c4
+511ab827
+dec4c984
+3d28a1a3
+7755a201
+9905f515
+54a60f79
+ebb5e7b8
+3965ff35
+392f3c3f
+5ff926ae
+d70a05b0
+d3dde3a9
+3d722699
+bca35340
+3bfb1097
+77676b2e
+f460b5ca
+dbac9f7d
+40274d95
+70daf390
+d7ccabfe
+636e44e1
+bfb6fd56
+84a150ff
+a0143475
+6aa33a0e
+b709e7a7
+dafd0cbd
+075b45e7
+eba0462b
+72c50170
+7d48c0ae
+7076d935
+73afe74f
+860b3171
+099d72d1
+e8281e85
+45f17cc2
+6c72d836
+38142205
+78d53f07
+30787e1b
+df623f3a
+eabf23ff
+a9b35bdb
+18be0b80
+3e32c00e
+0907d079
+b0130b6f
+b7b3e17f
+1d5e568b
+f927a0df
+a0f3f4b3
+3357b61c
+4fb73f5f
+4020111e
+d9268037
+fd1a46b9
+341fa342
+0c16cbfe
+86583230
+c7404d2c
+03bcb2df
+1e12a4d5
+b272de30
+2b322b66
+eb385583
+8e2c2068
+1b6b8747
+f6ddbdf9
+0dce61ea
+62e09dff
+d4ba3a12
+ead731f4
+a4695845
+38eb9cf4
+883cecab
+e0218d0b
+515fbe62
+7952c21f
+2ae656cc
+4b6ca883
+ab33723a
+39adc19d
+b0e3882e
+6ed7a87f
+04f80d3f
+dda1fed2
+511a53a9
+adb39613
+ddc1d62e
+d8a43e61
+fda682d3
+1f2924d9
+e8cdfd91
+65570340
+ccbac4d9
+9d88c00e
+6933dec1
+3b8b9179
+2314fade
+6a83ca1a
+501069e9
+1c77dc98
+5e074626
+c685df7b
+db9df920
+4ac9db9e
+21c0ea1a
+53370944
+945ff49e
+709a6b80
+c480abf1
+d686b3b3
+e512e19a
+16487bff
+05fe176a
+2fe9af09
+293d71dc
+5732a3f8
+c4939891
+9dbcad7a
+4597ba5c
+720a37fe
+33cca6fa
+88627e29
+484f647e
+963d99df
+9304dcc4
+5392de9d
+a431136b
+04277bf9
+efe2d2a2
+69014bf0
+5c65a92c
+1760a525
+4ad03979
+e6e845d1
+e4b9f8b4
+1a64627c
+45a6f2f6
+6f113338
+7f9907fe
+c86e8c6b
+14483af0
+493a39f7
+841f6a9d
+22b76649
+a1612bc9
+11754474
+96386eea
+a26e0679
+b77d7b90
+cd29414e
+4a92ca0f
+73064a3a
+54a5ff49
+78cc57bf
+40e21c71
+2e802650
+9d547ce0
+3833f734
+866a07e5
+e7698644
+512573b7
+31479de9
+834a2d0f
+e938c592
+fe4dce68
+3a24b646
+611a1d9d
+5668f2e7
+b933b82d
+1271124e
+de9167c0
+93ed8904
+01fef71a
+a4a8fd5a
+e66aab59
+f74ed3c0
+023184e0
+2aef1419
+fda88ae2
+c92058fb
+ea1366e6
+1d6dfac9
+fda1a50f
+107bdf69
+650921b9
+7ba8db6f
+6c5e14ec
+5ce3f5af
+e96bb299
+d9a747fe
+5b0a6891
+a36861b9
+28639f10
+63e91d1a
+22a1dbd1
+2b9b2aa3
+d5f62b87
+7415122a
+9326fe51
+6990f900
+387b156c
+b5ae52c2
+45e063a0
+d356c7e6
+4eb50e0e
+5a08ef7b
+34800de8
+89376183
+f9aa2b92
+66232500
+d31bbcd9
+5f585823
+6d0ca8d7
+094fc6a1
+274b6720
+8e26f624
+253e47fb
+b1ff5115
+4655cd35
+ee247b1b
+79fdcba2
+fb95450f
+a6d9bf82
+b920f7a4
+fcf0132a
+5e104492
+8ebf84b6
+0036f8a0
+8336c0bd
+8f89a2e1
+68b6fb29
+0d21cab2
+875679ca
+fab6d109
+57b8da1d
+80f99b7e
+ce3ca1b0
+388390e6
+ace07b3a
+f35ba0ab
+e4912d2c
+1540d583
+23cbab1b
+17f619bc
+6643174f
+be19a3cc
+f4752ad1
+8048b460
+9962ddf1
+d3c9e746
+4f33db5c
+b388a7a9
+13a0a6d2
+0720c358
+2220ee44
+23db6a7d
+8dd700bc
+c34c3067
+ce1b0bb1
+8de03920
+5e64ce5f
+8418b5d7
+ff129f93
+49042125
+b9bcbd5a
+ca280131
+867c8cfe
+3abbad17
+7f656efc
+51f248f7
+1919941b
+9da73374
+0c41b6a1
+d0f6a382
+55951b61
+ce396c2c
+c9a51835
+d8a67747
+543f351f
+6a862553
+10e3c01f
+858cd058
+1cebe213
+e131dc73
+7d56a723
+3c2cc096
+01549dc2
+af84702c
+45740e73
+fb901dae
+5c5e7cd3
+b4b7b907
+31488ae6
+7925e09b
+f653f22d
+b7dc9c31
+7c85f226
+480f877d
+b1f455fe
+1f8a5e2a
+b84dcfdb
+d9a9e04e
+5add3f47
+73dc4f54
+4a5f8682
+a398d4c8
+1151bc34
+7de93965
+01b4b465
+b2325088
+156d77f8
+d9f4e70f
+ebb19a5e
+1dcabd2a
+8ec9af25
+2e95f057
+5ad722b4
+795c2d33
+86528389
+f1a4971b
+ee30c9a0
+1ed993f4
+dbdaacd4
+a20faac1
+6c9b7105
+6828817a
+a6a575e6
+95402f9a
+dfc6e241
+e0835d8c
+daf51ef3
+1363aa92
+82af59e9
+a54c550f
+78c1db80
+fd1f42c5
+72334595
+fe19892c
+46dbd380
+88002ee1
+a093294b
+2ff0088e
+376f3f5c
+eecc2a17
+86e54348
+26817995
+b3b280b5
+4148ab9f
+5e92008d
+6df0a1b5
+2dd1fb40
+0f8518dc
+76d84582
+6855ef53
+a3202a0a
+504dd6d1
+6530f87a
+f61d4346
+7fdd4afb
+a84a4da1
+09636619
+64105c44
+69db5ac9
+ac93ddc6
+e6a5c204
+50b436c9
+283d5555
+50631f06
+e4c8cb25
+50370160
+0f59d328
+51f67be2
+2c46d30a
+360616eb
+e3e366c8
+3c38cb28
+d594a1c3
+48bc161d
+cef82434
+aeabbe50
+5d87968e
+a7456855
+0d15d3e4
+8ff6f5af
+dcdd8d42
+ab86a6e1
+03db5ecd
+da67c68a
+a01a8ffd
+7cb373df
+12e47959
+404415c2
+01ad6a22
+e2510ced
+b8a6d129
+3023ecc5
+4bbbd783
+f4ec8b1c
+681352e5
+b87f4a4a
+a24c8c8e
+d5141a06
+ca400082
+b5448ded
+29ed3d08
+22fd02f3
+bc7e9cb5
+ddeb3836
+203ce26c
+2ad73baf
+510593ac
+677d4d01
+01348ea8
+4284b2c3
+1468c1c9
+683d3783
+ce80bb26
+d1d5852c
+d712eae6
+c88e8d4f
+a361412b
+46d96a2f
+eb4aa055
+9acd7628
+18794ae8
+afa7711d
+bc7822a2
+76242a0a
+24e993b3
+4aeed845
+2f720426
+e05d4b01
+b449b553
+ddecb707
+6a955f37
+cd628e53
+f6ae482a
+ca4c27ef
+cdf61bd0
+8f8a62c3
+2773eaab
+f10470f2
+ee805808
+a8f97d40
+65fa47c9
+22fd2464
+bcb858e8
+9bbfbb90
+9023c343
+9971a939
+6e158752
+6cd97108
+2e6e298f
+cb07389f
+398e408c
+80162d04
+3f093f78
+a2084f28
+484cb3fb
+fb5440b6
+b7c924a4
+78f06ef9
+c37cc98c
+4ee3a75a
+d7ea84dc
+d9f2b5d2
+24d4f466
+9862df9c
+03dda6cf
+aa3a12d0
+7cc1c832
+25dbc3bd
+aa0ddc3c
+9fc5abd8
+34a36cd9
+43f40ff9
+81744249
+95667a0f
+5784bc71
+f610c0f7
+8caa4e5e
+5d017240
+8f572b5e
+0f772d97
+db5f2402
+af3851d3
+869243b9
+33811062
+ea95ef5e
+a30fb220
+679d9384
+f896f38f
+86d1cd0a
+54bbde6b
+579c293b
+81b3c14e
+880b3651
+33b15f2c
+8108ba36
+ba65d0b4
+e2dcd9b1
+83b75ed2
+3477c94d
+e46971ac
+7c0cb2a3
+c6ad90de
+c6f906ef
+90178ba3
+ea8d581d
+66ea8ad1
+7c88f89e
+a97f437d
+d10e063e
+1327309b
+b6913117
+029d716d
+4660f8df
+de822e2a
+d69b60fc
+6f57a6a4
+bc75d75f
+95bb412b
+8960a01e
+9e67150a
+6479b792
+f403f840
+7b01d9a0
+e02f2a89
+b38649b2
+ad17b5ad
+45307bb0
+2045039f
+e5d6e12c
+9557b824
+fb2108e5
+0ed4baad
+98feabd3
+f48ecf39
+b2ee0739
+fc19bfad
+1d081801
+d3f2ae29
+024a4c3d
+87b20ec6
+6da4fad5
+c52b5f8e
+2b89e85e
+1c63b114
+01074d39
+97cb803a
+cb901c0b
+63985851
+96e3d003
+c8b9d4a6
+9318e07c
+71c23d74
+dcab49d9
+ab6399dc
+cec38daa
+74d5c608
+49fc605a
+c18696ea
+15fe4a4e
+42226b85
+c37c44f9
+4a45f6c5
+a0fe1603
+ad82323c
+bd2770d1
+7fbc51ec
+4e94fd7f
+88fd1aaf
+583bc341
+fa760480
+22b00980
+81821353
+632ab46b
+603ed8a5
+0433a7db
+2c356176
+1c11202a
+bc6b8a36
+2c4fd9a4
+8ce00b40
+f20f34e3
+73e7ce67
+a61aeaec
+e773d3a8
+ccee292f
+bbfdb5e8
+8327af33
+9be69d87
+c577035a
+74506e62
+919b847a
+43ee8cf6
+2d7ba5c7
+29861d62
+67c7761c
+00c0cf3e
+780bcb50
+e6a1e2ea
+7a15bf06
+4aa938fc
+1de2c2da
+e5b7557e
+ef9ac0c5
+61f42546
+788ff59f
+f9cc53d7
+c8897aa6
+f00bddf8
+e0cbca17
+e83cac32
+403864c5
+762d27e5
+1f70796e
+24a360aa
+82a0b3e0
+7841e8c6
+8352926f
+fa2da417
+c62d54af
+b0de5825
+a7565058
+87add1a3
+ea60fe71
+f29947a2
+b1eaad65
+6a2e21be
+0108813d
+4f4c953d
+a49e5f93
+66118c34
+9a17f584
+0f95c4c5
+02b9c867
+71a4bb78
+b93919cc
+7b665f22
+7b460ca9
+cc8be72b
+5c4783a5
+7a889b50
+02ad17bd
+e2e78404
+e5848c97
+bd08bed4
+3a7402e7
+fd9d6ea5
+3cc924c0
+5aa098bf
+be65d029
+3f0b24b0
+124131fa
+bb578fab
+99298973
+be8e8a31
+f2d6912f
+fcbb6a11
+ec50554d
+b4ecbce4
+119d04bb
+15204412
+56c3bcdd
+26d9ea7d
+339dbe2a
+2e097cf0
+6951cde4
+407438c8
+b3083335
+5e3afe32
+ef73f0b4
+7622471a
+f505e07b
+4d6f4495
+707ec18a
+b67de336
+675e81f6
+a5a83bdd
+9a81f9dd
+ae70da65
+3b549cb8
+daf0b56a
+4a990549
+282b88fc
+23d26ff3
+8afc6cba
+2e7e4d26
+ac5d8d16
+fd92532f
+6de90931
+5039cc1d
+76931f79
+166e4822
+c5d2950a
+64c40353
+b2b7e021
+c39628fc
+05638cc7
+066186ba
+e3808c56
+042827b7
+7de67dda
+d8bd8f84
+6f441cf5
+aab5f469
+269d5991
+e1c5f21d
+1a4ee5ec
+2bbd9003
+41cde2fc
+f305ed62
+35a5c393
+de87ff87
+fb056459
+f500e1cb
+040a84da
+af0809a5
+88964150
+195ce981
+1452be55
+bf279e29
+cb5d6844
+168be022
+35f92e2c
+db045b48
+879ccac6
+affecfe1
+b9e744cb
+d6c6af4e
+635b4f70
+e307ab03
+85f287b3
+0649d59a
+50016374
+9c8ed289
+d9dbcb72
+56acedd9
+3015bd7a
+9606a33d
+bb37b5c3
+e675cfd6
+e62245eb
+fb82cfe5
+b1c33ffe
+6d389dca
+c1e7a56a
+71b9440e
+2e8a689b
+91a8a30b
+798ee394
+2f1a67ee
+29cfd71f
+f258c0eb
+2c1c829d
+5282678c
+76b42115
+d88cfb4d
+1e4be8d7
+41f01e8e
+89b08e39
+380ff345
+0ac20149
+7227c706
+5bd6584c
+d6ee7377
+7e2503a1
+ae1dfa39
+0bc1953a
+aa0d873c
+63e71e46
+7083ddcc
+55fcaa41
+b7ac0822
+6e895c0a
+1ae85837
+fdd404ad
+c52dd224
+17c5d911
+912c6d03
+19114224
+bd6afa2b
+b13433e4
+020941fb
+9163310a
+d2dbdfe6
+fd0b1667
+5ce839a9
diff --git a/models/rank/dcn/data/sample_data/vocab/C8.txt b/models/rank/dcn/data/sample_data/vocab/C8.txt
new file mode 100644
index 0000000000000000000000000000000000000000..61c11ee5c5aabab0f917ffd6e76d39c82ddc90e5
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C8.txt
@@ -0,0 +1,81 @@
+f504a6f4
+0b153874
+7b6fecd5
+235697d5
+64523cfa
+316074ea
+0017bc7c
+afc9ca6a
+a6d156f4
+13037314
+d7c4a8f5
+e350fe4f
+931c3bf7
+361384ce
+6a698541
+43383eb4
+73b7901e
+cb66451f
+f0e5818a
+0fb392dd
+4671807e
+da1ed842
+51d76abe
+5b392875
+e602701d
+f0298c3c
+a25968f2
+67b76963
+cbe011a6
+1296137f
+d3334ebc
+f1aa21a3
+23eefdc2
+322e63df
+a674580f
+d1b66f7a
+a8d6f709
+66c1ef42
+813607cc
+966033bc
+e7945bc1
+77b89023
+73ba467e
+169d7cc1
+8bedcc53
+9d01afb9
+bb170c38
+233428af
+07a0b7e5
+560f93f8
+9c376700
+d1aaef6c
+71f9a260
+449116d5
+c5e75280
+ba7cbdc6
+25611aba
+56563555
+cb69809d
+25239412
+093a9651
+a61cc0ef
+062b5529
+c7453af1
+b621aeb8
+45f7c2dd
+49dd1874
+8ee20c61
+62c159ed
+c8ddd494
+1f89b562
+0b5a4776
+37e4aa92
+985e3fcb
+efaa8b67
+66f29b89
+6c41e35e
+5b9f3341
+271b0642
+a04b3fa3
+e6210023
diff --git a/models/rank/dcn/data/sample_data/vocab/C9.txt b/models/rank/dcn/data/sample_data/vocab/C9.txt
new file mode 100644
index 0000000000000000000000000000000000000000..eddb3ff906a2aa4c5177be414452c9defbb1496b
--- /dev/null
+++ b/models/rank/dcn/data/sample_data/vocab/C9.txt
@@ -0,0 +1,3 @@
+7cc72ec2
+a18233ea
+a73ee510
diff --git a/models/rank/dcn/model.py b/models/rank/dcn/model.py
index c0395c27318cda335dd8271e28df1f8b0a9193b9..52764c3e2122c408078c65875427af74c4ae83da 100755
--- a/models/rank/dcn/model.py
+++ b/models/rank/dcn/model.py
@@ -1,56 +1,59 @@
-import paddle.fluid as fluid
-import math
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
from collections import OrderedDict
+import paddle.fluid as fluid
+
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
+
+
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
-
- def init_network(self):
- self.cross_num = envs.get_global_env("hyper_parameters.cross_num", None, self._namespace)
- self.dnn_hidden_units = envs.get_global_env("hyper_parameters.dnn_hidden_units", None, self._namespace)
- self.l2_reg_cross = envs.get_global_env("hyper_parameters.l2_reg_cross", None, self._namespace)
- self.dnn_use_bn = envs.get_global_env("hyper_parameters.dnn_use_bn", None, self._namespace)
- self.clip_by_norm = envs.get_global_env("hyper_parameters.clip_by_norm", None, self._namespace)
- cat_feat_num = envs.get_global_env("hyper_parameters.cat_feat_num", None, self._namespace)
- cat_feat_dims_dict = OrderedDict()
- for line in open(cat_feat_num):
- spls = line.strip().split()
- assert len(spls) == 2
- cat_feat_dims_dict[spls[0]] = int(spls[1])
- self.cat_feat_dims_dict = cat_feat_dims_dict if cat_feat_dims_dict else OrderedDict(
- )
- self.is_sparse = envs.get_global_env("hyper_parameters.is_sparse", None, self._namespace)
- self.dense_feat_names = ['I' + str(i) for i in range(1, 14)]
- self.sparse_feat_names = ['C' + str(i) for i in range(1, 27)]
-
- # {feat_name: dims}
- self.feat_dims_dict = OrderedDict(
- [(feat_name, 1) for feat_name in self.dense_feat_names])
- self.feat_dims_dict.update(self.cat_feat_dims_dict)
-
- self.net_input = None
- self.loss = None
-
- def _create_embedding_input(self, data_dict):
+ def _init_hyper_parameters(self):
+ self.cross_num = envs.get_global_env("hyper_parameters.cross_num",
+ None)
+ self.dnn_hidden_units = envs.get_global_env(
+ "hyper_parameters.dnn_hidden_units", None)
+ self.l2_reg_cross = envs.get_global_env(
+ "hyper_parameters.l2_reg_cross", None)
+ self.dnn_use_bn = envs.get_global_env("hyper_parameters.dnn_use_bn",
+ None)
+ self.clip_by_norm = envs.get_global_env(
+ "hyper_parameters.clip_by_norm", None)
+ self.cat_feat_num = envs.get_global_env(
+ "hyper_parameters.cat_feat_num", None)
+ self.is_sparse = envs.get_global_env("hyper_parameters.is_sparse",
+ None)
+
+ def _create_embedding_input(self):
# sparse embedding
- sparse_emb_dict = OrderedDict((name, fluid.embedding(
- input=fluid.layers.cast(
- data_dict[name], dtype='int64'),
- size=[
- self.feat_dims_dict[name] + 1,
- 6 * int(pow(self.feat_dims_dict[name], 0.25))
- ],
- is_sparse=self.is_sparse)) for name in self.sparse_feat_names)
+ sparse_emb_dict = OrderedDict()
+ for var in self.sparse_inputs:
+ sparse_emb_dict[var.name] = fluid.embedding(
+ input=var,
+ size=[
+ self.feat_dims_dict[var.name] + 1,
+ 6 * int(pow(self.feat_dims_dict[var.name], 0.25))
+ ],
+ is_sparse=self.is_sparse)
# combine dense and sparse_emb
- dense_input_list = [
- data_dict[name] for name in data_dict if name.startswith('I')
- ]
+ dense_input_list = self.dense_inputs
sparse_emb_list = list(sparse_emb_dict.values())
sparse_input = fluid.layers.concat(sparse_emb_list, axis=-1)
@@ -63,7 +66,7 @@ class Model(ModelBase):
net_input = fluid.layers.concat([dense_input, sparse_input], axis=-1)
return net_input
-
+
def _deep_net(self, input, hidden_units, use_bn=False, is_test=False):
for units in hidden_units:
input = fluid.layers.fc(input=input, size=units)
@@ -80,7 +83,7 @@ class Model(ModelBase):
[input_dim], dtype='float32', name=prefix + "_b")
xw = fluid.layers.reduce_sum(x * w, dim=1, keep_dim=True) # (N, 1)
return x0 * xw + b + x, w
-
+
def _cross_net(self, input, num_corss_layers):
x = x0 = input
l2_reg_cross_list = []
@@ -91,33 +94,46 @@ class Model(ModelBase):
fluid.layers.concat(
l2_reg_cross_list, axis=-1))
return x, l2_reg_cross_loss
-
+
def _l2_loss(self, w):
return fluid.layers.reduce_sum(fluid.layers.square(w))
-
- def train_net(self):
- self.init_network()
- self.target_input = fluid.data(
- name='label', shape=[None, 1], dtype='float32')
- data_dict = OrderedDict()
- for feat_name in self.feat_dims_dict:
- data_dict[feat_name] = fluid.data(
- name=feat_name, shape=[None, 1], dtype='float32')
-
- self.net_input = self._create_embedding_input(data_dict)
-
- deep_out = self._deep_net(self.net_input, self.dnn_hidden_units, self.dnn_use_bn, False)
+
+ def net(self, inputs, is_infer=False):
+ self.sparse_inputs = self._sparse_data_var[1:]
+ self.dense_inputs = self._dense_data_var
+ self.target_input = self._sparse_data_var[0]
+
+ cat_feat_dims_dict = OrderedDict()
+ for line in open(self.cat_feat_num):
+ spls = line.strip().split()
+ assert len(spls) == 2
+ cat_feat_dims_dict[spls[0]] = int(spls[1])
+ self.cat_feat_dims_dict = cat_feat_dims_dict if cat_feat_dims_dict else OrderedDict(
+ )
+
+ self.dense_feat_names = [i.name for i in self.dense_inputs]
+ self.sparse_feat_names = [i.name for i in self.sparse_inputs]
+
+ # {feat_name: dims}
+ self.feat_dims_dict = OrderedDict(
+ [(feat_name, 1) for feat_name in self.dense_feat_names])
+ self.feat_dims_dict.update(self.cat_feat_dims_dict)
+
+ self.net_input = None
+ self.loss = None
+
+ self.net_input = self._create_embedding_input()
+
+ deep_out = self._deep_net(self.net_input, self.dnn_hidden_units,
+ self.dnn_use_bn, False)
cross_out, l2_reg_cross_loss = self._cross_net(self.net_input,
- self.cross_num)
-
+ self.cross_num)
+
last_out = fluid.layers.concat([deep_out, cross_out], axis=-1)
logit = fluid.layers.fc(last_out, 1)
self.prob = fluid.layers.sigmoid(logit)
- self._data_var = [self.target_input] + [
- data_dict[dense_name] for dense_name in self.dense_feat_names
- ] + [data_dict[sparse_name] for sparse_name in self.sparse_feat_names]
# auc
prob_2d = fluid.layers.concat([1 - self.prob, self.prob], 1)
@@ -126,10 +142,14 @@ class Model(ModelBase):
input=prob_2d, label=label_int, slide_steps=0)
self._metrics["AUC"] = auc_var
self._metrics["BATCH_AUC"] = batch_auc_var
-
+
+ if is_infer:
+ self._infer_results["AUC"] = auc_var
# logloss
- logloss = fluid.layers.log_loss(self.prob, self.target_input)
+ logloss = fluid.layers.log_loss(
+ self.prob, fluid.layers.cast(
+ self.target_input, dtype='float32'))
self.avg_logloss = fluid.layers.reduce_mean(logloss)
# reg_coeff * l2_reg_cross
@@ -137,10 +157,7 @@ class Model(ModelBase):
self.loss = self.avg_logloss + l2_reg_cross_loss
self._cost = self.loss
- def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
- optimizer = fluid.optimizer.Adam(learning_rate, lazy_mode=True)
- return optimizer
-
- def infer_net(self, parameter_list):
- self.deepfm_net()
+ #def optimizer(self):
+ #
+ # optimizer = fluid.optimizer.Adam(self.learning_rate, lazy_mode=True)
+ # return optimizer
diff --git a/models/rank/deepfm/config.yaml b/models/rank/deepfm/config.yaml
index fd06406db481fdfb8e00aefecda7ca42a3c89353..d1d25c2c4c05c82f8ee0b9554563d2f310c2ac01 100755
--- a/models/rank/deepfm/config.yaml
+++ b/models/rank/deepfm/config.yaml
@@ -12,38 +12,65 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
-
- epochs: 10
- workspace: "fleetrec.models.rank.deepfm"
-
- reader:
- batch_size: 2
- class: "{workspace}/criteo_reader.py"
- train_data_path: "{workspace}/data/train_data"
- feat_dict_name: "{workspace}/data/aid_data/feat_dict_10.pkl2"
-
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- sparse_feature_number: 1086460
- sparse_feature_dim: 9
- num_field: 39
- fc_sizes: [400, 400, 400]
- learning_rate: 0.0001
- reg: 0.001
- act: "relu"
- optimizer: SGD
-
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+# global settings
+debug: false
+workspace: "paddlerec.models.rank.deepfm"
+
+
+dataset:
+ - name: train_sample
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "label feat_idx"
+ dense_slots: "feat_value:39"
+ - name: infer_sample
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "label feat_idx"
+ dense_slots: "feat_value:39"
+
+hyper_parameters:
+ optimizer:
+ class: SGD
+ learning_rate: 0.0001
+ sparse_feature_number: 1086460
+ sparse_feature_dim: 9
+ num_field: 39
+ fc_sizes: [400, 400, 400]
+ reg: 0.001
+ act: "relu"
+
+
+mode: train_runner
+# if infer, change mode to "infer_runner" and change phase to "infer_phase"
+
+runner:
+ - name: train_runner
+ trainer_class: single_train
+ epochs: 2
+ device: cpu
+ init_model_path: ""
+ save_checkpoint_interval: 1
+ save_inference_interval: 1
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 1
+ - name: infer_runner
+ trainer_class: single_infer
+ epochs: 1
+ device: cpu
+ init_model_path: "increment/0"
+ print_interval: 1
+
+
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+ dataset_name: train_sample
+ thread_num: 1
+#- name: infer_phase
+# model: "{workspace}/model.py"
+# dataset_name: infer_sample
+# thread_num: 1
diff --git a/models/rank/deepfm/data/download_preprocess.py b/models/rank/deepfm/data/download_preprocess.py
index 1f114bcfb7ec7ce5c93e676bb0467026eedf6f33..7a504b4f88e49d8b4f242d4d6b56f6f168464e5c 100755
--- a/models/rank/deepfm/data/download_preprocess.py
+++ b/models/rank/deepfm/data/download_preprocess.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import os
import shutil
import sys
@@ -6,7 +20,7 @@ LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
TOOLS_PATH = os.path.join(LOCAL_PATH, "..", "..", "tools")
sys.path.append(TOOLS_PATH)
-from fleetrec.tools.tools import download_file_and_uncompress, download_file
+from paddlerec.tools.tools import download_file_and_uncompress, download_file
if __name__ == '__main__':
url = "https://s3-eu-west-1.amazonaws.com/kaggle-display-advertising-challenge-dataset/dac.tar.gz"
diff --git a/models/rank/deepfm/criteo_reader.py b/models/rank/deepfm/data/get_slot_data.py
similarity index 71%
rename from models/rank/deepfm/criteo_reader.py
rename to models/rank/deepfm/data/get_slot_data.py
index a4c2587a507cd1744383c725e5a237950abffd42..6177c990d8ef0c8a1cf922dd9d50c6419cb8c1b7 100755
--- a/models/rank/deepfm/criteo_reader.py
+++ b/models/rank/deepfm/data/get_slot_data.py
@@ -11,16 +11,28 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+import os
+import yaml
+import paddle.fluid.incubate.data_generator as dg
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
try:
import cPickle as pickle
except ImportError:
import pickle
-class TrainReader(Reader):
+
+class TrainReader(dg.MultiSlotDataGenerator):
+ def __init__(self, config):
+ dg.MultiSlotDataGenerator.__init__(self)
+
+ if os.path.isfile(config):
+ with open(config, 'r') as rb:
+ _config = yaml.load(rb.read(), Loader=yaml.FullLoader)
+ else:
+ raise ValueError("reader config only support yaml")
+
def init(self):
self.cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
self.cont_max_ = [
@@ -34,8 +44,8 @@ class TrainReader(Reader):
self.continuous_range_ = range(1, 14)
self.categorical_range_ = range(14, 40)
# load preprocessed feature dict
- self.feat_dict_name = envs.get_global_env("feat_dict_name", None, "train.reader")
- self.feat_dict_ = pickle.load(open(self.feat_dict_name, 'rb'))
+ self.feat_dict_name = "aid_data/feat_dict_10.pkl2"
+ self.feat_dict_ = pickle.load(open(self.feat_dict_name, 'rb'))
def _process_line(self, line):
features = line.rstrip('\n').split('\t')
@@ -59,13 +69,27 @@ class TrainReader(Reader):
feat_value.append(1.0)
label = [int(features[0])]
return feat_idx, feat_value, label
-
+
def generate_sample(self, line):
"""
Read the data line by line and process it as a dictionary
"""
+
def data_iter():
feat_idx, feat_value, label = self._process_line(line)
- yield [('feat_idx', feat_idx), ('feat_value', feat_value), ('label', label)]
+ s = ""
+ for i in [('feat_idx', feat_idx), ('feat_value', feat_value),
+ ('label', label)]:
+ k = i[0]
+ v = i[1]
+ for j in v:
+ s += " " + k + ":" + str(j)
+            print(s.strip())
+ yield None
+
+ return data_iter
+
- return data_iter
\ No newline at end of file
+reader = TrainReader("../config.yaml")
+reader.init()
+reader.run_from_stdin()
diff --git a/models/rank/deepfm/data/preprocess.py b/models/rank/deepfm/data/preprocess.py
index 1fa4a5feae17bde64463d2f05beb3d053284dcda..9da3bdc3d93bfcd0dd98fddc64c870d20feddb38 100755
--- a/models/rank/deepfm/data/preprocess.py
+++ b/models/rank/deepfm/data/preprocess.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import os
import numpy
from collections import Counter
diff --git a/models/rank/deepfm/data/run.sh b/models/rank/deepfm/data/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..c2bc4ae8ce1d7ad7c89ebd48f993ab920fba0ba2
--- /dev/null
+++ b/models/rank/deepfm/data/run.sh
@@ -0,0 +1,13 @@
+python download_preprocess.py
+
+mkdir slot_train_data
+for i in `ls ./train_data`
+do
+ cat train_data/$i | python get_slot_data.py > slot_train_data/$i
+done
+
+mkdir slot_test_data
+for i in `ls ./test_data`
+do
+ cat test_data/$i | python get_slot_data.py > slot_test_data/$i
+done
diff --git a/models/rank/deepfm/data/sample_data/feat_dict_10.pkl2 b/models/rank/deepfm/data/sample_data/feat_dict_10.pkl2
new file mode 100644
index 0000000000000000000000000000000000000000..962d552ab2a094b01c1223b8359dc594aa5bbef5
Binary files /dev/null and b/models/rank/deepfm/data/sample_data/feat_dict_10.pkl2 differ
diff --git a/models/rank/deepfm/data/sample_data/train/sample_train.txt b/models/rank/deepfm/data/sample_data/train/sample_train.txt
new file mode 100644
index 0000000000000000000000000000000000000000..4b0308e17f74efa4272e1871e86d03c236b1945a
--- /dev/null
+++ b/models/rank/deepfm/data/sample_data/train/sample_train.txt
@@ -0,0 +1,100 @@
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:655161 feat_idx:0 feat_idx:1075467 feat_idx:314332 feat_idx:615411 feat_idx:733564 feat_idx:795081 feat_idx:148475 feat_idx:123424 feat_idx:582322 feat_idx:0 feat_idx:1082305 feat_idx:288355 feat_idx:328646 feat_idx:756244 feat_idx:13161 feat_idx:134834 feat_idx:734534 feat_idx:1047606 feat_idx:626828 feat_idx:0 feat_idx:476211 feat_idx:819217 feat_idx:502861 feat_idx:767167 feat_value:0.00017316017316 feat_value:1.55232499476e-05 feat_value:7.62951094835e-05 feat_value:0.0 feat_value:5.96732496653e-05 feat_value:9.27994580512e-06 feat_value:0.000266377794747 feat_value:0.000330742516951 feat_value:0.00623729280816 feat_value:0.0217391304348 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:328856 feat_idx:583609 feat_idx:356189 feat_idx:314332 feat_idx:404876 feat_idx:233441 feat_idx:144963 feat_idx:148475 feat_idx:954707 feat_idx:778340 feat_idx:598842 feat_idx:701804 feat_idx:223357 feat_idx:310528 feat_idx:805012 feat_idx:599055 feat_idx:683739 feat_idx:734534 feat_idx:94311 feat_idx:135625 feat_idx:0 feat_idx:476211 feat_idx:737768 feat_idx:502861 feat_idx:618666 feat_value:0.00034632034632 feat_value:1.16424374607e-05 feat_value:0.000671396963455 feat_value:0.00103199174407 feat_value:4.40424852812e-06 feat_value:1.85598916102e-05 feat_value:3.55170392996e-05 feat_value:0.000330742516951 feat_value:0.000137840725042 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:125230 feat_idx:244091 feat_idx:428972 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:655488 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:989454 feat_idx:789125 feat_idx:274685 feat_idx:59528 feat_idx:142028 feat_idx:791919 feat_idx:339114 feat_idx:12934 feat_idx:0 feat_idx:0 feat_idx:128761 feat_idx:925828 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.00034632034632 feat_value:1.16424374607e-05 feat_value:1.52590218967e-05 feat_value:0.0144478844169 feat_value:3.31182217752e-05 feat_value:0.000206478794164 feat_value:7.10340785992e-05 feat_value:0.000330742516951 feat_value:0.00844274440884 feat_value:0.0217391304348 feat_value:0.012987012987 feat_value:0.000748502994012 feat_value:0.00608683890166 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:695357 feat_idx:541890 feat_idx:0 feat_idx:1012660 feat_idx:314332 feat_idx:404876 feat_idx:1742 feat_idx:144963 feat_idx:148475 feat_idx:456917 feat_idx:220560 feat_idx:0 feat_idx:480237 feat_idx:59528 feat_idx:402233 feat_idx:0 feat_idx:763481 feat_idx:885529 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:476211 feat_idx:68781 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.00347720798826 feat_value:0.0 feat_value:0.0 feat_value:0.000189641760152 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:52223 feat_idx:0 feat_idx:610088 feat_idx:314332 feat_idx:85900 feat_idx:253972 feat_idx:144963 feat_idx:148475 feat_idx:581401 feat_idx:921618 feat_idx:374454 feat_idx:576858 feat_idx:288355 feat_idx:526081 feat_idx:597631 feat_idx:763481 feat_idx:468634 feat_idx:0 feat_idx:0 feat_idx:360559 feat_idx:0 feat_idx:122096 feat_idx:604513 feat_idx:0 feat_idx:0 feat_value:0.000519480519481 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:8.63578142768e-08 feat_value:0.0 feat_value:5.32755589494e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:844726 feat_idx:589259 feat_idx:34922 feat_idx:943087 feat_idx:831162 feat_idx:687817 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:160002 feat_idx:879363 feat_idx:979424 feat_idx:59528 feat_idx:844314 feat_idx:974289 feat_idx:197974 feat_idx:82573 feat_idx:0 feat_idx:0 feat_idx:4620 feat_idx:811639 feat_idx:441547 feat_idx:578537 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000553726305143 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000206761087563 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:74940 feat_idx:503640 feat_idx:888356 feat_idx:507702 feat_idx:943087 feat_idx:404876 feat_idx:1081499 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:202629 feat_idx:486504 feat_idx:981942 feat_idx:59528 feat_idx:404100 feat_idx:210897 feat_idx:197974 feat_idx:821035 feat_idx:0 feat_idx:0 feat_idx:627303 feat_idx:0 feat_idx:637620 feat_idx:409520 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.000136790777814 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:541890 feat_idx:0 feat_idx:175574 feat_idx:1022525 feat_idx:85900 feat_idx:114990 feat_idx:795081 feat_idx:148475 feat_idx:391150 feat_idx:172637 feat_idx:0 feat_idx:831202 feat_idx:59528 feat_idx:402233 feat_idx:0 feat_idx:13161 feat_idx:885529 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:68781 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:2.71656874083e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.77585196498e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:585875 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:453185 feat_idx:144963 feat_idx:148475 feat_idx:995582 feat_idx:409958 feat_idx:824386 feat_idx:745363 feat_idx:223357 feat_idx:782190 feat_idx:499188 feat_idx:13161 feat_idx:826986 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:122096 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000182398186884 feat_value:6.10360875868e-05 feat_value:0.00825593395253 feat_value:0.000820831024701 feat_value:0.000577676626369 feat_value:0.000497238550194 feat_value:0.00512650901273 feat_value:0.00485888555774 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:952850 feat_idx:444926 feat_idx:327161 feat_idx:314332 feat_idx:0 feat_idx:48165 feat_idx:144963 feat_idx:148475 feat_idx:408072 feat_idx:220560 feat_idx:313350 feat_idx:480237 feat_idx:59528 feat_idx:767941 feat_idx:274209 feat_idx:587215 feat_idx:49542 feat_idx:0 feat_idx:0 feat_idx:918027 feat_idx:0 feat_idx:122096 feat_idx:210681 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000147470874502 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.00145672679013 feat_value:4.87197154769e-05 feat_value:1.77585196498e-05 feat_value:0.000330742516951 feat_value:0.000103380543782 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:323969 feat_idx:1007141 feat_idx:1053419 feat_idx:314332 feat_idx:615411 feat_idx:926319 feat_idx:144963 feat_idx:31348 feat_idx:754940 feat_idx:35969 feat_idx:469428 feat_idx:394416 feat_idx:223357 feat_idx:878804 feat_idx:9647 feat_idx:197974 feat_idx:316785 feat_idx:734534 feat_idx:94311 feat_idx:409871 feat_idx:0 feat_idx:476211 feat_idx:755653 feat_idx:522503 feat_idx:379855 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:0.00964370183871 feat_value:0.0 feat_value:0.00245126655825 feat_value:0.0 feat_value:0.0 feat_value:0.000826856292376 feat_value:0.00223991178194 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:985125 feat_idx:0 feat_idx:0 feat_idx:360051 feat_idx:0 feat_idx:304911 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:887175 feat_idx:0 feat_idx:701330 feat_idx:59528 feat_idx:670083 feat_idx:0 feat_idx:587215 feat_idx:334296 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:3.49273123821e-05 feat_value:9.15541313802e-05 feat_value:0.0061919504644 feat_value:1.81783199053e-05 feat_value:0.000252878523189 feat_value:1.77585196498e-05 feat_value:0.00115759880933 feat_value:0.00368723939488 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000811578520222 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:685954 feat_idx:439682 feat_idx:0 feat_idx:983567 feat_idx:314332 feat_idx:404876 feat_idx:909239 feat_idx:795081 feat_idx:148475 feat_idx:36347 feat_idx:663689 feat_idx:0 feat_idx:398775 feat_idx:59528 feat_idx:996203 feat_idx:150509 feat_idx:13161 feat_idx:183924 feat_idx:0 feat_idx:0 feat_idx:379144 feat_idx:0 feat_idx:122096 feat_idx:604513 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:6.32570989578e-05 feat_value:0.0 feat_value:0.000301894834047 feat_value:0.0 feat_value:0.000137840725042 feat_value:0.0 feat_value:0.017316017316 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:181401 feat_idx:702327 feat_idx:0 feat_idx:334017 feat_idx:314332 feat_idx:0 feat_idx:191120 feat_idx:299805 feat_idx:148475 feat_idx:442554 feat_idx:480141 feat_idx:0 feat_idx:16042 feat_idx:288355 feat_idx:928072 feat_idx:0 feat_idx:599055 feat_idx:91753 feat_idx:297696 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:590863 feat_idx:525837 feat_idx:413413 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:0.000167849240864 feat_value:0.00515995872033 feat_value:0.000443101945054 feat_value:7.88795393435e-05 feat_value:3.55170392996e-05 feat_value:0.000661485033901 feat_value:0.000172300906303 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:655161 feat_idx:0 feat_idx:49997 feat_idx:1076285 feat_idx:85900 feat_idx:79619 feat_idx:144963 feat_idx:148475 feat_idx:817613 feat_idx:933612 feat_idx:0 feat_idx:733763 feat_idx:288355 feat_idx:565066 feat_idx:310463 feat_idx:854924 feat_idx:378884 feat_idx:734534 feat_idx:1047606 feat_idx:884047 feat_idx:0 feat_idx:241528 feat_idx:40100 feat_idx:502861 feat_idx:752176 feat_value:0.0 feat_value:0.000209563874293 feat_value:0.00128175783932 feat_value:0.00412796697626 feat_value:0.000156868969634 feat_value:6.03196477333e-05 feat_value:1.77585196498e-05 feat_value:0.000661485033901 feat_value:0.000275681450084 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:328239 feat_idx:910743 feat_idx:915614 feat_idx:360051 feat_idx:615411 feat_idx:49489 feat_idx:1007823 feat_idx:148475 feat_idx:754940 feat_idx:224964 feat_idx:235573 feat_idx:226878 feat_idx:693306 feat_idx:277510 feat_idx:277345 feat_idx:197974 feat_idx:969807 feat_idx:0 feat_idx:0 feat_idx:539201 feat_idx:0 feat_idx:476211 feat_idx:650546 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:1.52590218967e-05 feat_value:0.0185758513932 feat_value:0.000874588764088 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:0.0450049967263 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.00270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:211148 feat_idx:0 feat_idx:0 feat_idx:943087 feat_idx:615411 feat_idx:98894 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:683585 feat_idx:0 feat_idx:460786 feat_idx:59528 feat_idx:883086 feat_idx:0 feat_idx:587215 feat_idx:197941 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:537421 feat_idx:24736 feat_idx:962390 feat_value:0.00017316017316 feat_value:0.00384200436203 feat_value:0.0 feat_value:0.00206398348813 feat_value:4.53378524953e-06 feat_value:4.63997290256e-06 feat_value:1.77585196498e-05 feat_value:0.000330742516951 feat_value:6.89203625211e-05 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:894672 feat_idx:521506 feat_idx:105841 feat_idx:360051 feat_idx:108674 feat_idx:642013 feat_idx:144963 feat_idx:148475 feat_idx:165260 feat_idx:212992 feat_idx:1009370 feat_idx:775147 feat_idx:223357 feat_idx:274230 feat_idx:833849 feat_idx:13161 feat_idx:57230 feat_idx:0 feat_idx:0 feat_idx:844134 feat_idx:925828 feat_idx:122096 feat_idx:141692 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.0 feat_value:0.0 feat_value:0.000716640321776 feat_value:0.00129223245336 feat_value:5.32755589494e-05 feat_value:0.000826856292376 feat_value:0.00423860229505 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:328856 feat_idx:506639 feat_idx:78755 feat_idx:463568 feat_idx:108674 feat_idx:152478 feat_idx:888742 feat_idx:148475 feat_idx:14838 feat_idx:682657 feat_idx:993166 feat_idx:502067 feat_idx:288355 feat_idx:190674 feat_idx:472919 feat_idx:13161 feat_idx:683739 feat_idx:734534 feat_idx:1047606 feat_idx:768815 feat_idx:0 feat_idx:122096 feat_idx:1010006 feat_idx:522503 feat_idx:963757 feat_value:0.0 feat_value:0.000104781937146 feat_value:6.10360875868e-05 feat_value:0.00206398348813 feat_value:8.87758330766e-05 feat_value:2.78398374153e-05 feat_value:0.000106551117899 feat_value:0.00165371258475 feat_value:0.00286019504463 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:738089 feat_idx:606995 feat_idx:964206 feat_idx:269737 feat_idx:360051 feat_idx:85900 feat_idx:608469 feat_idx:144963 feat_idx:148475 feat_idx:307543 feat_idx:405000 feat_idx:65140 feat_idx:749745 feat_idx:218723 feat_idx:686050 feat_idx:594443 feat_idx:13161 feat_idx:96125 feat_idx:0 feat_idx:0 feat_idx:946269 feat_idx:0 feat_idx:943262 feat_idx:395579 feat_idx:0 feat_idx:0 feat_value:0.00121212121212 feat_value:0.000407485311125 feat_value:0.0 feat_value:0.0030959752322 feat_value:3.3679547568e-05 feat_value:3.47997967692e-05 feat_value:0.000124309637549 feat_value:0.00248056887713 feat_value:0.000516902718908 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:906706 feat_idx:439682 feat_idx:4257 feat_idx:430841 feat_idx:314332 feat_idx:615411 feat_idx:998076 feat_idx:66687 feat_idx:148475 feat_idx:754940 feat_idx:648531 feat_idx:779745 feat_idx:718037 feat_idx:288355 feat_idx:360204 feat_idx:944849 feat_idx:13161 feat_idx:631544 feat_idx:0 feat_idx:0 feat_idx:177363 feat_idx:0 feat_idx:122096 feat_idx:1072137 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000194040624345 feat_value:0.0 feat_value:0.0 feat_value:0.000276301826779 feat_value:8.81594851486e-05 feat_value:0.000337411873346 feat_value:0.00165371258475 feat_value:0.00492780592026 feat_value:0.0 feat_value:0.04329004329 feat_value:0.0 feat_value:0.000811578520222 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:0 feat_idx:388090 feat_idx:314332 feat_idx:615411 feat_idx:595457 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:298800 feat_idx:0 feat_idx:349549 feat_idx:59528 feat_idx:28300 feat_idx:0 feat_idx:587215 feat_idx:750233 feat_idx:832803 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:612991 feat_idx:502861 feat_idx:691775 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.00122072175174 feat_value:0.0 feat_value:7.97946203918e-05 feat_value:0.000665836111517 feat_value:1.77585196498e-05 feat_value:0.000661485033901 feat_value:0.00158516833799 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:439682 feat_idx:998375 feat_idx:373577 feat_idx:314332 feat_idx:108674 feat_idx:76428 feat_idx:66687 feat_idx:148475 feat_idx:636407 feat_idx:840978 feat_idx:221841 feat_idx:110276 feat_idx:223357 feat_idx:104371 feat_idx:535541 feat_idx:599055 feat_idx:892333 feat_idx:0 feat_idx:0 feat_idx:519737 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000213626306554 feat_value:0.0061919504644 feat_value:0.000307951965711 feat_value:0.000396717683169 feat_value:3.55170392996e-05 feat_value:0.000330742516951 feat_value:0.000206761087563 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000811578520222 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:507093 feat_idx:28898 feat_idx:1067105 feat_idx:314332 feat_idx:615411 feat_idx:875540 feat_idx:144963 feat_idx:148475 feat_idx:801559 feat_idx:965246 feat_idx:93410 feat_idx:648840 feat_idx:59528 feat_idx:63243 feat_idx:1041736 feat_idx:763481 feat_idx:206486 feat_idx:0 feat_idx:0 feat_idx:623203 feat_idx:0 feat_idx:377126 feat_idx:1017627 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:4.65697498428e-05 feat_value:0.00013733119707 feat_value:0.0175438596491 feat_value:0.000508388452648 feat_value:0.0 feat_value:0.0 feat_value:0.00380353894493 feat_value:0.00441090320135 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.00229947247396 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:195832 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:414506 feat_idx:144963 feat_idx:148475 feat_idx:127380 feat_idx:385804 feat_idx:824386 feat_idx:203621 feat_idx:59528 feat_idx:631370 feat_idx:499188 feat_idx:587215 feat_idx:855342 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:969590 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.000267277435187 feat_value:0.000194878861907 feat_value:1.77585196498e-05 feat_value:0.00446502397883 feat_value:0.0024466728695 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:704711 feat_idx:701980 feat_idx:42486 feat_idx:314332 feat_idx:0 feat_idx:786460 feat_idx:144963 feat_idx:148475 feat_idx:466556 feat_idx:775018 feat_idx:404666 feat_idx:1065844 feat_idx:39086 feat_idx:992008 feat_idx:506428 feat_idx:599055 feat_idx:750233 feat_idx:256242 feat_idx:330429 feat_idx:218251 feat_idx:0 feat_idx:122096 feat_idx:221229 feat_idx:502861 feat_idx:24246 feat_value:0.0 feat_value:2.71656874083e-05 feat_value:0.000244144350347 feat_value:0.0 feat_value:0.000255835024795 feat_value:4.63997290256e-06 feat_value:3.55170392996e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:518052 feat_idx:1049859 feat_idx:0 feat_idx:1096 feat_idx:314332 feat_idx:615411 feat_idx:714816 feat_idx:795081 feat_idx:148475 feat_idx:900313 feat_idx:855314 feat_idx:0 feat_idx:603555 feat_idx:59528 feat_idx:211559 feat_idx:0 feat_idx:379814 feat_idx:311468 feat_idx:734534 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:383498 feat_idx:917031 feat_idx:879752 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000305180437934 feat_value:0.0165118679051 feat_value:6.68409482503e-05 feat_value:0.000215758739969 feat_value:0.000745857825292 feat_value:0.00529188027121 feat_value:0.0314276853096 feat_value:0.0 feat_value:0.0649350649351 feat_value:0.000249500998004 feat_value:0.00216420938726 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:695357 feat_idx:439682 feat_idx:433159 feat_idx:217415 feat_idx:360051 feat_idx:615411 feat_idx:235834 feat_idx:144963 feat_idx:148475 feat_idx:343946 feat_idx:489781 feat_idx:168412 feat_idx:950158 feat_idx:59528 feat_idx:419036 feat_idx:782554 feat_idx:854924 feat_idx:502656 feat_idx:0 feat_idx:0 feat_idx:1082526 feat_idx:0 feat_idx:476211 feat_idx:972567 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:8.92586871988e-05 feat_value:3.05180437934e-05 feat_value:0.00206398348813 feat_value:0.000310369984511 feat_value:0.000394397696717 feat_value:3.55170392996e-05 feat_value:0.000496113775426 feat_value:0.000827044350253 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:983083 feat_idx:555506 feat_idx:311508 feat_idx:360051 feat_idx:831162 feat_idx:662893 feat_idx:144963 feat_idx:148475 feat_idx:453404 feat_idx:437228 feat_idx:866349 feat_idx:987534 feat_idx:223357 feat_idx:872276 feat_idx:719825 feat_idx:13161 feat_idx:146364 feat_idx:0 feat_idx:0 feat_idx:1083188 feat_idx:0 feat_idx:122096 feat_idx:33938 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000314345811439 feat_value:3.05180437934e-05 feat_value:0.015479876161 feat_value:0.000186144268674 feat_value:0.000197198848359 feat_value:7.10340785992e-05 feat_value:0.00297668265255 feat_value:0.00792584168993 feat_value:0.0 feat_value:0.012987012987 feat_value:0.0 feat_value:0.00202894630055 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:638696 feat_idx:232393 feat_idx:537609 feat_idx:314332 feat_idx:85900 feat_idx:158968 feat_idx:144963 feat_idx:148475 feat_idx:411650 feat_idx:220560 feat_idx:633471 feat_idx:480237 feat_idx:39086 feat_idx:611928 feat_idx:584121 feat_idx:13161 feat_idx:747604 feat_idx:0 feat_idx:0 feat_idx:204145 feat_idx:0 feat_idx:476211 feat_idx:485685 feat_idx:0 feat_idx:0 feat_value:0.000519480519481 feat_value:1.16424374607e-05 feat_value:6.10360875868e-05 feat_value:0.0134158926729 feat_value:9.672075199e-06 feat_value:6.49596206358e-05 feat_value:5.32755589494e-05 feat_value:0.00578799404663 feat_value:0.000930424894035 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00175842012715 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:245713 feat_idx:964221 feat_idx:976933 feat_idx:360051 feat_idx:404876 feat_idx:469669 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:496768 feat_idx:978607 feat_idx:788967 feat_idx:59528 feat_idx:717827 feat_idx:227446 feat_idx:13161 feat_idx:251726 feat_idx:0 feat_idx:0 feat_idx:2400 feat_idx:0 feat_idx:476211 feat_idx:942610 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.00108662749633 feat_value:0.0 feat_value:0.0030959752322 feat_value:0.000315983242439 feat_value:5.56796748307e-05 feat_value:0.000106551117899 feat_value:0.000496113775426 feat_value:0.00337709776353 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:181401 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:404876 feat_idx:286011 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:966589 feat_idx:824386 feat_idx:429895 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:197974 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:321110 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000213994663778 feat_value:0.0 feat_value:0.0 feat_value:0.00611873656359 feat_value:0.00334263758227 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:0 feat_idx:268086 feat_idx:83142 feat_idx:288162 feat_idx:1060646 feat_idx:360051 feat_idx:615411 feat_idx:714816 feat_idx:144963 feat_idx:148475 feat_idx:138291 feat_idx:855314 feat_idx:165496 feat_idx:603555 feat_idx:59528 feat_idx:224690 feat_idx:316295 feat_idx:854924 feat_idx:257823 feat_idx:0 feat_idx:0 feat_idx:704548 feat_idx:0 feat_idx:122096 feat_idx:782694 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:1.16424374607e-05 feat_value:1.52590218967e-05 feat_value:0.0 feat_value:6.16163004865e-05 feat_value:6.95995935384e-06 feat_value:0.000284136314397 feat_value:0.00181908384323 feat_value:0.00172300906303 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.000249500998004 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:563443 feat_idx:51995 feat_idx:49997 feat_idx:314332 feat_idx:0 feat_idx:595457 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:188162 feat_idx:721984 feat_idx:349549 feat_idx:199920 feat_idx:180762 feat_idx:310463 feat_idx:197974 feat_idx:319863 feat_idx:734534 feat_idx:330429 feat_idx:467968 feat_idx:0 feat_idx:122096 feat_idx:40100 feat_idx:502861 feat_idx:777305 feat_value:0.000692640692641 feat_value:1.16424374607e-05 feat_value:0.000839246204318 feat_value:0.00825593395253 feat_value:3.70906812319e-05 feat_value:3.01598238666e-05 feat_value:7.10340785992e-05 feat_value:0.0019844551017 feat_value:0.000447982356387 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:281207 feat_idx:430926 feat_idx:909211 feat_idx:314332 feat_idx:0 feat_idx:928918 feat_idx:144963 feat_idx:148475 feat_idx:904134 feat_idx:535335 feat_idx:327558 feat_idx:639245 feat_idx:223357 feat_idx:18380 feat_idx:471487 feat_idx:13161 feat_idx:188469 feat_idx:0 feat_idx:0 feat_idx:500616 feat_idx:0 feat_idx:122096 feat_idx:657898 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.00101677287157 feat_value:1.52590218967e-05 feat_value:0.00103199174407 feat_value:2.15894535692e-07 feat_value:2.31998645128e-06 feat_value:0.000106551117899 feat_value:0.000165371258475 feat_value:3.44601812606e-05 feat_value:0.0217391304348 feat_value:0.012987012987 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:87449 feat_idx:691591 feat_idx:466372 feat_idx:360051 feat_idx:108674 feat_idx:537959 feat_idx:144963 feat_idx:148475 feat_idx:882632 feat_idx:1037965 feat_idx:783604 feat_idx:521533 feat_idx:59528 feat_idx:185313 feat_idx:972394 feat_idx:339114 feat_idx:644343 feat_idx:603603 feat_idx:330429 feat_idx:722203 feat_idx:925828 feat_idx:377126 feat_idx:221229 feat_idx:343446 feat_idx:24246 feat_value:0.0 feat_value:0.000504505623297 feat_value:1.52590218967e-05 feat_value:0.0030959752322 feat_value:7.26701007139e-05 feat_value:4.40797425743e-05 feat_value:0.000461721510895 feat_value:0.00281131139408 feat_value:0.0163685860988 feat_value:0.0 feat_value:0.038961038961 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:87868 feat_idx:585875 feat_idx:143202 feat_idx:105841 feat_idx:314332 feat_idx:615411 feat_idx:685294 feat_idx:795081 feat_idx:148475 feat_idx:754940 feat_idx:853239 feat_idx:1062322 feat_idx:529712 feat_idx:223357 feat_idx:715789 feat_idx:334774 feat_idx:197974 feat_idx:339749 feat_idx:0 feat_idx:0 feat_idx:540979 feat_idx:0 feat_idx:122096 feat_idx:141692 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0010041254855 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:0.00251559323202 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:154881 feat_idx:664380 feat_idx:0 feat_idx:470673 feat_idx:314332 feat_idx:108674 feat_idx:610634 feat_idx:144963 feat_idx:148475 feat_idx:125722 feat_idx:153800 feat_idx:0 feat_idx:297062 feat_idx:223357 feat_idx:712970 feat_idx:124318 feat_idx:13161 feat_idx:521259 feat_idx:734534 feat_idx:330429 feat_idx:0 feat_idx:969590 feat_idx:217677 feat_idx:643925 feat_idx:24736 feat_idx:941404 feat_value:0.00103896103896 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:3.95087000316e-05 feat_value:9.27994580512e-05 feat_value:0.000461721510895 feat_value:0.00545725152968 feat_value:0.00248113305076 feat_value:0.0217391304348 feat_value:0.012987012987 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:245713 feat_idx:987054 feat_idx:399764 feat_idx:360051 feat_idx:615411 feat_idx:684605 feat_idx:144963 feat_idx:148475 feat_idx:874792 feat_idx:107682 feat_idx:879950 feat_idx:321212 feat_idx:288355 feat_idx:369087 feat_idx:762311 feat_idx:13161 feat_idx:879575 feat_idx:0 feat_idx:0 feat_idx:1086254 feat_idx:0 feat_idx:122096 feat_idx:942610 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:4.57770656901e-05 feat_value:0.0123839009288 feat_value:0.000315551453367 feat_value:0.000225038685774 feat_value:3.55170392996e-05 feat_value:0.00347279642798 feat_value:0.00310141631345 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00162315704044 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:714652 feat_idx:0 feat_idx:213479 feat_idx:314332 feat_idx:0 feat_idx:432079 feat_idx:144963 feat_idx:148475 feat_idx:666980 feat_idx:405740 feat_idx:0 feat_idx:705197 feat_idx:288355 feat_idx:104862 feat_idx:0 feat_idx:339114 feat_idx:679030 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:1057480 feat_idx:343446 feat_idx:502409 feat_value:0.00138528138528 feat_value:1.16424374607e-05 feat_value:0.00022888532845 feat_value:0.0206398348813 feat_value:4.96557432092e-06 feat_value:5.56796748307e-05 feat_value:0.000142068157198 feat_value:0.00380353894493 feat_value:0.000827044350253 feat_value:0.0434782608696 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:83142 feat_idx:460446 feat_idx:323226 feat_idx:360051 feat_idx:108674 feat_idx:714816 feat_idx:795081 feat_idx:148475 feat_idx:900313 feat_idx:855314 feat_idx:824386 feat_idx:603555 feat_idx:59528 feat_idx:95559 feat_idx:499188 feat_idx:339114 feat_idx:882666 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:122096 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000159113311963 feat_value:3.05180437934e-05 feat_value:0.00412796697626 feat_value:0.000134675011365 feat_value:0.000345677981241 feat_value:0.00113654525759 feat_value:0.00793782040681 feat_value:0.00478996519522 feat_value:0.0 feat_value:0.025974025974 feat_value:0.00149700598802 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:507093 feat_idx:968965 feat_idx:115714 feat_idx:314332 feat_idx:108674 feat_idx:585814 feat_idx:144963 feat_idx:148475 feat_idx:1067472 feat_idx:905164 feat_idx:292795 feat_idx:1053010 feat_idx:223357 feat_idx:460894 feat_idx:592287 feat_idx:339114 feat_idx:1024304 feat_idx:0 feat_idx:0 feat_idx:1006115 feat_idx:0 feat_idx:122096 feat_idx:831861 feat_idx:0 feat_idx:0 feat_value:0.0152380952381 feat_value:0.00124962162078 feat_value:0.0 feat_value:0.00412796697626 feat_value:2.15894535692e-07 feat_value:9.27994580512e-06 feat_value:0.00158050824883 feat_value:0.00661485033901 feat_value:0.00303249595093 feat_value:0.0652173913043 feat_value:0.017316017316 feat_value:0.00299401197605 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:160536 feat_idx:572549 feat_idx:314332 feat_idx:0 feat_idx:984584 feat_idx:144963 feat_idx:148475 feat_idx:120200 feat_idx:190379 feat_idx:768743 feat_idx:628725 feat_idx:288355 feat_idx:967940 feat_idx:824472 feat_idx:854924 feat_idx:575938 feat_idx:568485 feat_idx:330429 feat_idx:469863 feat_idx:0 feat_idx:122096 feat_idx:26849 feat_idx:502861 feat_idx:9838 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000274662394141 feat_value:0.00515995872033 feat_value:7.26701007139e-05 feat_value:0.000185598916102 feat_value:0.000674823746692 feat_value:0.000826856292376 feat_value:0.00327371721975 feat_value:0.0 feat_value:0.021645021645 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:439682 feat_idx:462322 feat_idx:892535 feat_idx:314332 feat_idx:615411 feat_idx:183327 feat_idx:66687 feat_idx:31348 feat_idx:754940 feat_idx:780959 feat_idx:1076845 feat_idx:127420 feat_idx:59528 feat_idx:1034303 feat_idx:3336 feat_idx:587215 feat_idx:786401 feat_idx:0 feat_idx:0 feat_idx:273839 feat_idx:0 feat_idx:476211 feat_idx:841950 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000116424374607 feat_value:0.0 feat_value:0.0 feat_value:0.00487394867997 feat_value:0.00488589146639 feat_value:0.0 feat_value:0.000330742516951 feat_value:0.00327371721975 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:943087 feat_idx:615411 feat_idx:646596 feat_idx:144963 feat_idx:148475 feat_idx:320091 feat_idx:786096 feat_idx:824386 feat_idx:708545 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:599055 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:969590 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000211317571535 feat_value:4.63997290256e-05 feat_value:1.77585196498e-05 feat_value:0.00115759880933 feat_value:0.000689203625211 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:585875 feat_idx:1083253 feat_idx:105841 feat_idx:314332 feat_idx:615411 feat_idx:183043 feat_idx:66687 feat_idx:148475 feat_idx:754940 feat_idx:785290 feat_idx:78319 feat_idx:769776 feat_idx:223357 feat_idx:715789 feat_idx:30992 feat_idx:854924 feat_idx:339749 feat_idx:0 feat_idx:0 feat_idx:87470 feat_idx:0 feat_idx:122096 feat_idx:141692 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000135828437042 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.000404802254423 feat_value:0.0 feat_value:0.0 feat_value:0.00611873656359 feat_value:0.00062028326269 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:34199 feat_idx:460446 feat_idx:323226 feat_idx:360051 feat_idx:615411 feat_idx:617010 feat_idx:1041627 feat_idx:148475 feat_idx:754940 feat_idx:224964 feat_idx:824386 feat_idx:226878 feat_idx:288355 feat_idx:303932 feat_idx:499188 feat_idx:13161 feat_idx:628988 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:122096 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:3.49273123821e-05 feat_value:9.15541313802e-05 feat_value:0.015479876161 feat_value:0.000872775249989 feat_value:0.0011762331308 feat_value:0.000124309637549 feat_value:0.00694559285596 feat_value:0.0124056652538 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:664380 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:108674 feat_idx:248083 feat_idx:144963 feat_idx:148475 feat_idx:804470 feat_idx:868888 feat_idx:0 feat_idx:797434 feat_idx:59528 feat_idx:747120 feat_idx:0 feat_idx:13161 feat_idx:521259 feat_idx:495815 feat_idx:330429 feat_idx:0 feat_idx:11923 feat_idx:407810 feat_idx:566713 feat_idx:24736 feat_idx:915104 feat_value:0.00536796536797 feat_value:7.7616249738e-05 feat_value:3.05180437934e-05 feat_value:0.0113519091847 feat_value:1.25218830701e-05 feat_value:5.33596883794e-05 feat_value:0.000550514109144 feat_value:0.00380353894493 feat_value:0.00223991178194 feat_value:0.0434782608696 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00148789395374 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:439682 feat_idx:661250 feat_idx:819482 feat_idx:314332 feat_idx:404876 feat_idx:173004 feat_idx:795081 feat_idx:148475 feat_idx:133411 feat_idx:790823 feat_idx:853868 feat_idx:963286 feat_idx:223357 feat_idx:961787 feat_idx:355708 feat_idx:13161 feat_idx:618619 feat_idx:0 feat_idx:0 feat_idx:542491 feat_idx:0 feat_idx:377126 feat_idx:320543 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.00925573778126 feat_value:0.000198367284657 feat_value:0.00412796697626 feat_value:1.72715628554e-06 feat_value:9.27994580512e-06 feat_value:0.00122533785584 feat_value:0.000496113775426 feat_value:0.0209862503877 feat_value:0.0217391304348 feat_value:0.047619047619 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:695357 feat_idx:881707 feat_idx:387392 feat_idx:38631 feat_idx:314332 feat_idx:0 feat_idx:608594 feat_idx:144963 feat_idx:148475 feat_idx:756085 feat_idx:879727 feat_idx:1083007 feat_idx:253536 feat_idx:223357 feat_idx:462961 feat_idx:367591 feat_idx:13161 feat_idx:144331 feat_idx:0 feat_idx:0 feat_idx:853418 feat_idx:0 feat_idx:122096 feat_idx:783958 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000748996809972 feat_value:0.0 feat_value:0.0 feat_value:7.01225451928e-05 feat_value:1.39199187077e-05 feat_value:0.000514997069844 feat_value:0.000992227550852 feat_value:0.00255005341328 feat_value:0.0 feat_value:0.038961038961 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:87449 feat_idx:536408 feat_idx:619856 feat_idx:729041 feat_idx:615411 feat_idx:689549 feat_idx:1041627 feat_idx:148475 feat_idx:754940 feat_idx:42362 feat_idx:181047 feat_idx:385295 feat_idx:223357 feat_idx:751650 feat_idx:367088 feat_idx:339114 feat_idx:644343 feat_idx:809973 feat_idx:330429 feat_idx:28648 feat_idx:0 feat_idx:217677 feat_idx:305383 feat_idx:343446 feat_idx:1083427 feat_value:0.0 feat_value:8.53778747118e-05 feat_value:0.000122072175174 feat_value:0.00928792569659 feat_value:6.50274341504e-05 feat_value:7.19195799897e-05 feat_value:5.32755589494e-05 feat_value:0.00115759880933 feat_value:0.00117164616286 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00121736778033 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:439682 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:108674 feat_idx:713567 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:963705 feat_idx:0 feat_idx:599643 feat_idx:59528 feat_idx:967283 feat_idx:0 feat_idx:587215 feat_idx:434748 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:925828 feat_idx:476211 feat_idx:753350 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.000128066812068 feat_value:0.0 feat_value:0.0030959752322 feat_value:5.00875322806e-06 feat_value:7.19195799897e-05 feat_value:1.77585196498e-05 feat_value:0.000496113775426 feat_value:0.000103380543782 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:432429 feat_idx:319665 feat_idx:183269 feat_idx:85674 feat_idx:463568 feat_idx:0 feat_idx:130525 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:392441 feat_idx:1050223 feat_idx:862081 feat_idx:288355 feat_idx:484086 feat_idx:1077738 feat_idx:339114 feat_idx:934587 feat_idx:734534 feat_idx:94311 feat_idx:548757 feat_idx:0 feat_idx:321110 feat_idx:686449 feat_idx:474802 feat_idx:789529 feat_value:0.0 feat_value:3.49273123821e-05 feat_value:3.05180437934e-05 feat_value:0.0030959752322 feat_value:0.000119994182938 feat_value:0.0 feat_value:0.0 feat_value:0.000496113775426 feat_value:0.000447982356387 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:702327 feat_idx:0 feat_idx:217102 feat_idx:314332 feat_idx:85900 feat_idx:331250 feat_idx:888742 feat_idx:148475 feat_idx:197667 feat_idx:872960 feat_idx:0 feat_idx:925332 feat_idx:223357 feat_idx:57227 feat_idx:0 feat_idx:339114 feat_idx:91753 feat_idx:305875 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:476211 feat_idx:117207 feat_idx:502861 feat_idx:866455 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:0.000335698481727 feat_value:0.0030959752322 feat_value:0.000202379537758 feat_value:0.00056143672121 feat_value:0.000106551117899 feat_value:0.000992227550852 feat_value:0.00630621317068 feat_value:0.0 feat_value:0.004329004329 feat_value:0.000998003992016 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:849120 feat_idx:982375 feat_idx:949507 feat_idx:82312 feat_idx:314332 feat_idx:615411 feat_idx:641839 feat_idx:66687 feat_idx:148475 feat_idx:351286 feat_idx:1067936 feat_idx:1021395 feat_idx:423678 feat_idx:288355 feat_idx:491071 feat_idx:210032 feat_idx:13161 feat_idx:384630 feat_idx:661313 feat_idx:330429 feat_idx:466643 feat_idx:0 feat_idx:407810 feat_idx:818126 feat_idx:35064 feat_idx:312157 feat_value:0.0 feat_value:0.00022508712424 feat_value:0.000244144350347 feat_value:0.00722394220846 feat_value:7.32314265067e-05 feat_value:0.000167039024492 feat_value:3.55170392996e-05 feat_value:0.00115759880933 feat_value:0.00327371721975 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000946841606925 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:702327 feat_idx:0 feat_idx:450730 feat_idx:314332 feat_idx:615411 feat_idx:491223 feat_idx:27549 feat_idx:148475 feat_idx:24666 feat_idx:283209 feat_idx:0 feat_idx:91978 feat_idx:59528 feat_idx:89255 feat_idx:282181 feat_idx:13161 feat_idx:91753 feat_idx:633602 feat_idx:94311 feat_idx:0 feat_idx:0 feat_idx:377126 feat_idx:26849 feat_idx:502861 feat_idx:989849 feat_value:0.00103896103896 feat_value:1.16424374607e-05 feat_value:0.000427252613107 feat_value:0.0 feat_value:1.33854612129e-06 feat_value:0.0 feat_value:0.000106551117899 feat_value:0.0 feat_value:0.0 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:181401 feat_idx:704711 feat_idx:1084300 feat_idx:958176 feat_idx:314332 feat_idx:615411 feat_idx:809683 feat_idx:536544 feat_idx:148475 feat_idx:197667 feat_idx:23597 feat_idx:771551 feat_idx:444756 feat_idx:59528 feat_idx:28300 feat_idx:351738 feat_idx:339114 feat_idx:750233 feat_idx:734534 feat_idx:330429 feat_idx:5418 feat_idx:0 feat_idx:476211 feat_idx:221229 feat_idx:1007264 feat_idx:24246 feat_value:0.0 feat_value:8.53778747118e-05 feat_value:0.00013733119707 feat_value:0.0030959752322 feat_value:0.000622380767493 feat_value:0.00313894166858 feat_value:5.32755589494e-05 feat_value:0.000165371258475 feat_value:0.0124745856163 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:746729 feat_idx:742925 feat_idx:205831 feat_idx:912022 feat_idx:0 feat_idx:653684 feat_idx:144963 feat_idx:148475 feat_idx:891197 feat_idx:122292 feat_idx:282954 feat_idx:561978 feat_idx:223357 feat_idx:222724 feat_idx:538143 feat_idx:599055 feat_idx:706003 feat_idx:729650 feat_idx:1047606 feat_idx:475068 feat_idx:0 feat_idx:122096 feat_idx:744639 feat_idx:530010 feat_idx:785927 feat_value:0.0 feat_value:8.14970622249e-05 feat_value:0.00018310826276 feat_value:0.00825593395253 feat_value:0.000387098902496 feat_value:0.000102079403856 feat_value:3.55170392996e-05 feat_value:0.0019844551017 feat_value:0.00196423033185 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:201945 feat_idx:631742 feat_idx:306726 feat_idx:186386 feat_idx:314332 feat_idx:615411 feat_idx:337962 feat_idx:989504 feat_idx:31348 feat_idx:1068694 feat_idx:746192 feat_idx:359807 feat_idx:597620 feat_idx:59528 feat_idx:834098 feat_idx:463498 feat_idx:13161 feat_idx:144824 feat_idx:734534 feat_idx:1047606 feat_idx:447900 feat_idx:0 feat_idx:476211 feat_idx:421203 feat_idx:24736 feat_idx:272262 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.00767176914691 feat_value:0.0 feat_value:0.0 feat_value:0.000496113775426 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:894961 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:615411 feat_idx:927764 feat_idx:144963 feat_idx:148475 feat_idx:967242 feat_idx:1062285 feat_idx:0 feat_idx:736367 feat_idx:59528 feat_idx:562438 feat_idx:0 feat_idx:587215 feat_idx:896897 feat_idx:960559 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:377126 feat_idx:428982 feat_idx:525837 feat_idx:697480 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000305180437934 feat_value:0.0 feat_value:0.000190505338295 feat_value:0.00198358841584 feat_value:0.0 feat_value:0.000661485033901 feat_value:0.017988214618 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:889703 feat_idx:428972 feat_idx:323226 feat_idx:314332 feat_idx:108674 feat_idx:731191 feat_idx:66687 feat_idx:31348 feat_idx:754940 feat_idx:639052 feat_idx:789125 feat_idx:318898 feat_idx:223357 feat_idx:275810 feat_idx:791919 feat_idx:189960 feat_idx:990004 feat_idx:0 feat_idx:0 feat_idx:128761 feat_idx:0 feat_idx:441547 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000228967936727 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:0 feat_idx:8 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:12 feat_idx:0 feat_idx:695357 feat_idx:702327 feat_idx:112382 feat_idx:364273 feat_idx:314332 feat_idx:615411 feat_idx:680585 feat_idx:144963 feat_idx:31348 feat_idx:776916 feat_idx:972993 feat_idx:307964 feat_idx:509894 feat_idx:59528 feat_idx:89255 feat_idx:498076 feat_idx:854924 feat_idx:91753 feat_idx:734534 feat_idx:94311 feat_idx:797195 feat_idx:0 feat_idx:377126 feat_idx:520021 feat_idx:522503 feat_idx:516793 feat_value:0.0 feat_value:0.000306584186465 feat_value:7.62951094835e-05 feat_value:0.0 feat_value:0.00199486550979 feat_value:0.0 feat_value:0.0 feat_value:0.00115759880933 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:711611 feat_idx:461913 feat_idx:1019942 feat_idx:360051 feat_idx:615411 feat_idx:1055981 feat_idx:948645 feat_idx:148475 feat_idx:754940 feat_idx:380775 feat_idx:858292 feat_idx:571110 feat_idx:288355 feat_idx:122497 feat_idx:986082 feat_idx:13161 feat_idx:87215 feat_idx:734534 feat_idx:94311 feat_idx:675199 feat_idx:0 feat_idx:122096 feat_idx:294199 feat_idx:522503 feat_idx:87571 feat_value:0.00675324675325 feat_value:4.26889373559e-05 feat_value:0.000640878919661 feat_value:0.0330237358101 feat_value:1.16583049274e-06 feat_value:7.65595528922e-05 feat_value:0.000692582266342 feat_value:0.00396891020341 feat_value:0.00110272580034 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00432841877452 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:72868 feat_idx:17848 feat_idx:314332 feat_idx:615411 feat_idx:363835 feat_idx:144963 feat_idx:31348 feat_idx:1069123 feat_idx:258719 feat_idx:753245 feat_idx:820316 feat_idx:39086 feat_idx:992008 feat_idx:325584 feat_idx:13161 feat_idx:750233 feat_idx:321110 feat_idx:94311 feat_idx:644181 feat_idx:0 feat_idx:476211 feat_idx:221229 feat_idx:502861 feat_idx:952230 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000839246204318 feat_value:0.00515995872033 feat_value:0.000625101038643 feat_value:0.0 feat_value:0.0 feat_value:0.000826856292376 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:31161 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:85900 feat_idx:834217 feat_idx:760883 feat_idx:148475 feat_idx:697060 feat_idx:390104 feat_idx:0 feat_idx:916053 feat_idx:59528 feat_idx:608516 feat_idx:0 feat_idx:587215 feat_idx:473726 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:476211 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:1.55232499476e-05 feat_value:7.62951094835e-05 feat_value:0.00825593395253 feat_value:3.02252349969e-07 feat_value:1.85598916102e-05 feat_value:1.77585196498e-05 feat_value:0.0013229700678 feat_value:0.000275681450084 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:746729 feat_idx:0 feat_idx:415419 feat_idx:314332 feat_idx:85900 feat_idx:341613 feat_idx:341430 feat_idx:148475 feat_idx:219803 feat_idx:273068 feat_idx:0 feat_idx:427647 feat_idx:59528 feat_idx:86971 feat_idx:85678 feat_idx:13161 feat_idx:706003 feat_idx:970598 feat_idx:94311 feat_idx:378304 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:502861 feat_idx:1082916 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000106813153277 feat_value:0.0030959752322 feat_value:0.000435545636305 feat_value:0.000155439092236 feat_value:0.000106551117899 feat_value:0.000496113775426 feat_value:0.00196423033185 feat_value:0.0 feat_value:0.012987012987 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:0 feat_idx:695357 feat_idx:655161 feat_idx:410781 feat_idx:572549 feat_idx:314332 feat_idx:615411 feat_idx:438251 feat_idx:1017442 feat_idx:148475 feat_idx:754940 feat_idx:939988 feat_idx:175321 feat_idx:940584 feat_idx:223357 feat_idx:400890 feat_idx:229140 feat_idx:13161 feat_idx:512136 feat_idx:734534 feat_idx:94311 feat_idx:59009 feat_idx:0 feat_idx:122096 feat_idx:26849 feat_idx:502861 feat_idx:602609 feat_value:0.00121212121212 feat_value:1.55232499476e-05 feat_value:0.000610360875868 feat_value:0.0 feat_value:6.12276903223e-05 feat_value:5.33596883794e-05 feat_value:0.00261050238852 feat_value:0.0 feat_value:0.000241221268824 feat_value:0.0 feat_value:0.017316017316 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:181401 feat_idx:563443 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:85900 feat_idx:1086355 feat_idx:66687 feat_idx:148475 feat_idx:754940 feat_idx:294725 feat_idx:0 feat_idx:937034 feat_idx:59528 feat_idx:827972 feat_idx:0 feat_idx:197974 feat_idx:319863 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:808702 feat_idx:502861 feat_idx:792764 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000152590218967 feat_value:0.00206398348813 feat_value:0.000153069225806 feat_value:0.0 feat_value:0.0 feat_value:0.000330742516951 feat_value:0.000103380543782 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:962300 feat_idx:623087 feat_idx:0 feat_idx:53376 feat_idx:314332 feat_idx:615411 feat_idx:264532 feat_idx:144963 feat_idx:148475 feat_idx:14838 feat_idx:682657 feat_idx:0 feat_idx:502067 feat_idx:59528 feat_idx:519185 feat_idx:0 feat_idx:854924 feat_idx:372673 feat_idx:764350 feat_idx:330429 feat_idx:0 feat_idx:925828 feat_idx:377126 feat_idx:383664 feat_idx:522503 feat_idx:14052 feat_value:0.000865800865801 feat_value:0.000209563874293 feat_value:0.0 feat_value:0.00515995872033 feat_value:1.97327605623e-05 feat_value:1.15999322564e-05 feat_value:8.8792598249e-05 feat_value:0.00115759880933 feat_value:0.000379061993866 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.000249500998004 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:244091 feat_idx:428972 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:253814 feat_idx:144963 feat_idx:148475 feat_idx:367991 feat_idx:359193 feat_idx:789125 feat_idx:173541 feat_idx:59528 feat_idx:433504 feat_idx:791919 feat_idx:587215 feat_idx:884062 feat_idx:0 feat_idx:0 feat_idx:128761 feat_idx:0 feat_idx:637620 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.00022888532845 feat_value:0.00206398348813 feat_value:0.000868414180368 feat_value:0.00070759586764 feat_value:1.77585196498e-05 feat_value:0.00711096411444 feat_value:0.00785692132741 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:631742 feat_idx:209780 feat_idx:691946 feat_idx:463568 feat_idx:404876 feat_idx:781648 feat_idx:66687 feat_idx:148475 feat_idx:294231 feat_idx:673759 feat_idx:780141 feat_idx:636360 feat_idx:223357 feat_idx:656844 feat_idx:720701 feat_idx:13161 feat_idx:284891 feat_idx:734534 feat_idx:330429 feat_idx:564494 feat_idx:0 feat_idx:122096 feat_idx:529367 feat_idx:24736 feat_idx:225414 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:6.10360875868e-05 feat_value:0.0030959752322 feat_value:7.29291741568e-05 feat_value:0.000426877507035 feat_value:0.000213102235798 feat_value:0.00760707788986 feat_value:0.00182638960681 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:87449 feat_idx:0 feat_idx:0 feat_idx:943087 feat_idx:615411 feat_idx:14123 feat_idx:128514 feat_idx:148475 feat_idx:338941 feat_idx:655530 feat_idx:0 feat_idx:945302 feat_idx:288355 feat_idx:1078572 feat_idx:0 feat_idx:587215 feat_idx:644343 feat_idx:215210 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:217677 feat_idx:830506 feat_idx:502861 feat_idx:560344 feat_value:0.000692640692641 feat_value:1.16424374607e-05 feat_value:0.00135805294881 feat_value:0.00412796697626 feat_value:2.09849488693e-05 feat_value:1.15999322564e-05 feat_value:7.10340785992e-05 feat_value:0.00115759880933 feat_value:0.000137840725042 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:76757 feat_idx:0 feat_idx:748549 feat_idx:729041 feat_idx:404876 feat_idx:897525 feat_idx:66687 feat_idx:148475 feat_idx:809357 feat_idx:739161 feat_idx:0 feat_idx:571774 feat_idx:223357 feat_idx:726585 feat_idx:450365 feat_idx:13161 feat_idx:1064696 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:925828 feat_idx:476211 feat_idx:381001 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.00016299412445 feat_value:3.05180437934e-05 feat_value:0.00103199174407 feat_value:0.000144347086564 feat_value:2.31998645128e-06 feat_value:0.000301894834047 feat_value:0.000330742516951 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:695357 feat_idx:702327 feat_idx:593344 feat_idx:1065368 feat_idx:463568 feat_idx:85900 feat_idx:669411 feat_idx:27549 feat_idx:148475 feat_idx:227359 feat_idx:1043530 feat_idx:320625 feat_idx:575561 feat_idx:223357 feat_idx:57227 feat_idx:1021160 feat_idx:854924 feat_idx:91753 feat_idx:943801 feat_idx:94311 feat_idx:758526 feat_idx:0 feat_idx:122096 feat_idx:154807 feat_idx:522503 feat_idx:406770 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:1.52590218967e-05 feat_value:0.00206398348813 feat_value:0.000346985697764 feat_value:0.00038047777801 feat_value:0.000319653353696 feat_value:0.00214982636018 feat_value:0.0126468865226 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:518052 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:108674 feat_idx:2775 feat_idx:144963 feat_idx:31348 feat_idx:892705 feat_idx:1040029 feat_idx:824386 feat_idx:524213 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:599055 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.00307174745383 feat_value:0.000329438076082 feat_value:0.0 feat_value:0.00115759880933 feat_value:0.00217099141941 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:328856 feat_idx:583609 feat_idx:356189 feat_idx:314332 feat_idx:0 feat_idx:407260 feat_idx:144963 feat_idx:148475 feat_idx:699806 feat_idx:967004 feat_idx:598842 feat_idx:676678 feat_idx:223357 feat_idx:310528 feat_idx:805012 feat_idx:599055 feat_idx:683739 feat_idx:734534 feat_idx:94311 feat_idx:135625 feat_idx:0 feat_idx:122096 feat_idx:737768 feat_idx:522503 feat_idx:618666 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000167849240864 feat_value:0.0030959752322 feat_value:0.000698807433128 feat_value:0.00028999830641 feat_value:3.55170392996e-05 feat_value:0.000496113775426 feat_value:0.00354939866984 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:0 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:849120 feat_idx:439682 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:615411 feat_idx:443349 feat_idx:1007823 feat_idx:31348 feat_idx:754940 feat_idx:1072328 feat_idx:0 feat_idx:321212 feat_idx:59528 feat_idx:163883 feat_idx:0 feat_idx:189960 feat_idx:1040747 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:925828 feat_idx:122096 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000554956185627 feat_value:3.05180437934e-05 feat_value:0.00206398348813 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000330742516951 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:738089 feat_idx:439682 feat_idx:374405 feat_idx:984218 feat_idx:943087 feat_idx:108674 feat_idx:884166 feat_idx:144963 feat_idx:148475 feat_idx:683571 feat_idx:374802 feat_idx:530646 feat_idx:826201 feat_idx:223357 feat_idx:43619 feat_idx:1001991 feat_idx:339114 feat_idx:603612 feat_idx:0 feat_idx:0 feat_idx:60686 feat_idx:0 feat_idx:122096 feat_idx:138318 feat_idx:0 feat_idx:0 feat_value:0.00034632034632 feat_value:1.16424374607e-05 feat_value:0.0 feat_value:0.00722394220846 feat_value:1.91282558623e-05 feat_value:8.58394986973e-05 feat_value:0.000124309637549 feat_value:0.00562262278816 feat_value:0.00971777111548 feat_value:0.0217391304348 feat_value:0.017316017316 feat_value:0.00174650698603 feat_value:0.000946841606925 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:1049859 feat_idx:420263 feat_idx:271401 feat_idx:360051 feat_idx:615411 feat_idx:714816 feat_idx:144963 feat_idx:148475 feat_idx:900313 feat_idx:855314 feat_idx:74337 feat_idx:603555 feat_idx:288355 feat_idx:650698 feat_idx:322858 feat_idx:339114 feat_idx:311468 feat_idx:489978 feat_idx:330429 feat_idx:101492 feat_idx:0 feat_idx:217677 feat_idx:221229 feat_idx:917031 feat_idx:24246 feat_value:0.00034632034632 feat_value:1.55232499476e-05 feat_value:0.000915541313802 feat_value:0.077399380805 feat_value:2.63391333544e-06 feat_value:0.000280718360605 feat_value:0.00092344302179 feat_value:0.00644947908054 feat_value:0.00854612495262 feat_value:0.0217391304348 feat_value:0.034632034632 feat_value:0.000249500998004 feat_value:0.0104152576762 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:541890 feat_idx:93486 feat_idx:892417 feat_idx:314332 feat_idx:0 feat_idx:870784 feat_idx:66687 feat_idx:148475 feat_idx:1064406 feat_idx:605532 feat_idx:908441 feat_idx:411003 feat_idx:223357 feat_idx:415710 feat_idx:177994 feat_idx:13161 feat_idx:721813 feat_idx:0 feat_idx:0 feat_idx:702388 feat_idx:0 feat_idx:122096 feat_idx:68781 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.000143590062015 feat_value:3.05180437934e-05 feat_value:0.0433436532508 feat_value:1.41626815414e-05 feat_value:0.000102079403856 feat_value:0.000266377794747 feat_value:0.00810319166529 feat_value:0.00199869051311 feat_value:0.0217391304348 feat_value:0.038961038961 feat_value:0.0 feat_value:0.00568104964155 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:463568 feat_idx:404876 feat_idx:679269 feat_idx:1007823 feat_idx:148475 feat_idx:754940 feat_idx:392943 feat_idx:824386 feat_idx:502022 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:763481 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.0 feat_value:0.0 feat_value:0.000644186115598 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:52223 feat_idx:0 feat_idx:610088 feat_idx:360051 feat_idx:108674 feat_idx:207287 feat_idx:144963 feat_idx:148475 feat_idx:198726 feat_idx:1050332 feat_idx:0 feat_idx:575881 feat_idx:863222 feat_idx:428650 feat_idx:56538 feat_idx:587215 feat_idx:520546 feat_idx:0 feat_idx:0 feat_idx:3328 feat_idx:0 feat_idx:321110 feat_idx:604513 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.00087290478671 feat_value:0.000153119105784 feat_value:1.77585196498e-05 feat_value:0.000165371258475 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:664380 feat_idx:464058 feat_idx:794391 feat_idx:314332 feat_idx:615411 feat_idx:1008575 feat_idx:144963 feat_idx:148475 feat_idx:811905 feat_idx:262025 feat_idx:792836 feat_idx:853632 feat_idx:863222 feat_idx:190922 feat_idx:989611 feat_idx:13161 feat_idx:402822 feat_idx:622170 feat_idx:94311 feat_idx:626744 feat_idx:925828 feat_idx:122096 feat_idx:423382 feat_idx:24736 feat_idx:1081226 feat_value:0.00225108225108 feat_value:6.20929997904e-05 feat_value:0.00122072175174 feat_value:0.0330237358101 feat_value:1.63216268983e-05 feat_value:0.000266798441897 feat_value:0.000266377794747 feat_value:0.00611873656359 feat_value:0.00196423033185 feat_value:0.0217391304348 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00649262816177 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:894961 feat_idx:594422 feat_idx:823711 feat_idx:360051 feat_idx:615411 feat_idx:919751 feat_idx:888742 feat_idx:148475 feat_idx:725649 feat_idx:522685 feat_idx:14144 feat_idx:242991 feat_idx:288355 feat_idx:645605 feat_idx:99736 feat_idx:379814 feat_idx:896897 feat_idx:734534 feat_idx:330429 feat_idx:710067 feat_idx:0 feat_idx:407810 feat_idx:474780 feat_idx:525837 feat_idx:815828 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.0013885709926 feat_value:0.00412796697626 feat_value:1.26514197916e-05 feat_value:0.000510397019281 feat_value:0.000621548187743 feat_value:0.000661485033901 feat_value:0.0022743719632 feat_value:0.0 feat_value:0.021645021645 feat_value:0.000249500998004 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:0 feat_idx:268086 feat_idx:704711 feat_idx:539260 feat_idx:133619 feat_idx:943087 feat_idx:108674 feat_idx:277955 feat_idx:795081 feat_idx:148475 feat_idx:46173 feat_idx:414978 feat_idx:796305 feat_idx:317564 feat_idx:59528 feat_idx:28300 feat_idx:252652 feat_idx:854924 feat_idx:750233 feat_idx:637425 feat_idx:330429 feat_idx:538163 feat_idx:0 feat_idx:122096 feat_idx:623412 feat_idx:917031 feat_idx:421993 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000244133540961 feat_value:0.0 feat_value:0.0 feat_value:0.000661485033901 feat_value:0.000516902718908 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:631742 feat_idx:0 feat_idx:618078 feat_idx:314332 feat_idx:831162 feat_idx:302234 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:683585 feat_idx:0 feat_idx:460786 feat_idx:59528 feat_idx:834098 feat_idx:0 feat_idx:13161 feat_idx:144824 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:225853 feat_idx:24736 feat_idx:83301 feat_value:0.0 feat_value:0.000217325499267 feat_value:0.0 feat_value:0.0103199174407 feat_value:0.000282821841757 feat_value:0.000227358672225 feat_value:0.000603789668093 feat_value:0.00181908384323 feat_value:0.0120266032599 feat_value:0.0 feat_value:0.038961038961 feat_value:0.0 feat_value:0.00135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:552317 feat_idx:56734 feat_idx:314332 feat_idx:615411 feat_idx:205494 feat_idx:66687 feat_idx:148475 feat_idx:721787 feat_idx:258719 feat_idx:1026950 feat_idx:820316 feat_idx:59528 feat_idx:28300 feat_idx:783420 feat_idx:13161 feat_idx:750233 feat_idx:505787 feat_idx:330429 feat_idx:515764 feat_idx:0 feat_idx:476211 feat_idx:221229 feat_idx:502861 feat_idx:24246 feat_value:0.00103896103896 feat_value:7.7616249738e-06 feat_value:0.000152590218967 feat_value:0.0061919504644 feat_value:0.0 feat_value:0.0 feat_value:0.000106551117899 feat_value:0.00148834132628 feat_value:0.000310141631345 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:294042 feat_idx:507045 feat_idx:549419 feat_idx:314332 feat_idx:0 feat_idx:1012202 feat_idx:795081 feat_idx:148475 feat_idx:68578 feat_idx:717684 feat_idx:462100 feat_idx:729242 feat_idx:59528 feat_idx:182004 feat_idx:253871 feat_idx:763481 feat_idx:256400 feat_idx:0 feat_idx:0 feat_idx:915751 feat_idx:0 feat_idx:122096 feat_idx:1030847 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:1.52590218967e-05 feat_value:0.0030959752322 feat_value:0.000125262009609 feat_value:0.0 feat_value:0.0 feat_value:0.000496113775426 feat_value:0.000310141631345 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:181401 feat_idx:439682 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:0 feat_idx:1027059 feat_idx:144963 feat_idx:148475 feat_idx:307216 feat_idx:1086145 feat_idx:0 feat_idx:784143 feat_idx:59528 feat_idx:127555 feat_idx:0 feat_idx:13161 feat_idx:757164 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.000100901124659 feat_value:1.52590218967e-05 feat_value:0.0144478844169 feat_value:2.41801879975e-06 feat_value:3.47997967692e-05 feat_value:0.000177585196498 feat_value:0.00578799404663 feat_value:0.00554808918295 feat_value:0.0217391304348 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00202894630055 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:849120 feat_idx:704711 feat_idx:160536 feat_idx:572549 feat_idx:360051 feat_idx:0 feat_idx:731718 feat_idx:66687 feat_idx:148475 feat_idx:31385 feat_idx:1047396 feat_idx:768743 feat_idx:258527 feat_idx:863222 feat_idx:866128 feat_idx:824472 feat_idx:599055 feat_idx:575938 feat_idx:568485 feat_idx:94311 feat_idx:469863 feat_idx:0 feat_idx:122096 feat_idx:26849 feat_idx:502861 feat_idx:9838 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.00114442664225 feat_value:0.0227038183695 feat_value:0.000255273699002 feat_value:0.000419917547682 feat_value:3.55170392996e-05 feat_value:0.00363816768646 feat_value:0.00234329232572 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00297578790748 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:447935 feat_idx:937213 feat_idx:905937 feat_idx:314332 feat_idx:404876 feat_idx:142618 feat_idx:144963 feat_idx:148475 feat_idx:750865 feat_idx:596218 feat_idx:919681 feat_idx:840670 feat_idx:59528 feat_idx:380839 feat_idx:380828 feat_idx:13161 feat_idx:197572 feat_idx:1030936 feat_idx:94311 feat_idx:827510 feat_idx:0 feat_idx:377126 feat_idx:288434 feat_idx:24736 feat_idx:933741 feat_value:0.0 feat_value:0.000504505623297 feat_value:3.05180437934e-05 feat_value:0.0237358101135 feat_value:0.000683824352351 feat_value:5.33596883794e-05 feat_value:7.10340785992e-05 feat_value:0.00396891020341 feat_value:0.000792584168993 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00311105099418 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:695357 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:404876 feat_idx:195437 feat_idx:144963 feat_idx:148475 feat_idx:303093 feat_idx:895160 feat_idx:824386 feat_idx:332768 feat_idx:288355 feat_idx:452911 feat_idx:499188 feat_idx:339114 feat_idx:1026477 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:407810 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.00100192336124 feat_value:0.0 feat_value:0.0 feat_value:0.00529188027121 feat_value:0.0013094868879 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:0 feat_idx:8 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:943087 feat_idx:615411 feat_idx:831536 feat_idx:144963 feat_idx:31348 feat_idx:1084149 feat_idx:472585 feat_idx:824386 feat_idx:1085274 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:13161 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:969590 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.0294215028194 feat_value:0.0 feat_value:0.0 feat_value:0.00181908384323 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:29151 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:615411 feat_idx:351823 feat_idx:144963 feat_idx:148475 feat_idx:633435 feat_idx:734591 feat_idx:0 feat_idx:346678 feat_idx:59528 feat_idx:246568 feat_idx:0 feat_idx:13161 feat_idx:669279 feat_idx:734534 feat_idx:94311 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:311968 feat_idx:1007264 feat_idx:210855 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000976577401389 feat_value:0.0113519091847 feat_value:6.45092872648e-05 feat_value:0.00019951883481 feat_value:0.000266377794747 feat_value:0.00214982636018 feat_value:0.00796030187119 feat_value:0.0 feat_value:0.017316017316 feat_value:0.0 feat_value:0.00148789395374 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:655161 feat_idx:160536 feat_idx:572549 feat_idx:943087 feat_idx:108674 feat_idx:179440 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:216593 feat_idx:768743 feat_idx:272886 feat_idx:288355 feat_idx:1059113 feat_idx:824472 feat_idx:599055 feat_idx:512136 feat_idx:734534 feat_idx:94311 feat_idx:469863 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:502861 feat_idx:507836 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:1.52590218967e-05 feat_value:0.0 feat_value:0.000125348367423 feat_value:4.63997290256e-06 feat_value:5.32755589494e-05 feat_value:0.0 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:439682 feat_idx:434330 feat_idx:626900 feat_idx:360051 feat_idx:615411 feat_idx:448250 feat_idx:66687 feat_idx:31348 feat_idx:621494 feat_idx:345898 feat_idx:171523 feat_idx:728643 feat_idx:288355 feat_idx:993766 feat_idx:479691 feat_idx:599055 feat_idx:786401 feat_idx:0 feat_idx:0 feat_idx:914361 feat_idx:0 feat_idx:407810 feat_idx:253237 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000419127748585 feat_value:1.52590218967e-05 feat_value:0.00103199174407 feat_value:0.00740600297347 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:0.000447982356387 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:599320 feat_idx:36543 feat_idx:348417 feat_idx:314332 feat_idx:615411 feat_idx:507688 feat_idx:795081 feat_idx:148475 feat_idx:1085001 feat_idx:538920 feat_idx:698736 feat_idx:914324 feat_idx:223357 feat_idx:726559 feat_idx:327135 feat_idx:13161 feat_idx:214732 feat_idx:324501 feat_idx:1047606 feat_idx:434899 feat_idx:0 feat_idx:377126 feat_idx:221229 feat_idx:522503 feat_idx:24246 feat_value:0.0 feat_value:0.000147470874502 feat_value:0.0013733119707 feat_value:0.00206398348813 feat_value:0.00178026634132 feat_value:0.00081663523085 feat_value:0.0 feat_value:0.000826856292376 feat_value:0.00151624797546 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:0 feat_idx:417270 feat_idx:314332 feat_idx:404876 feat_idx:180197 feat_idx:144963 feat_idx:148475 feat_idx:891898 feat_idx:832883 feat_idx:0 feat_idx:406751 feat_idx:59528 feat_idx:28300 feat_idx:80459 feat_idx:587215 feat_idx:750233 feat_idx:52536 feat_idx:1047606 feat_idx:584293 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:502861 feat_idx:983005 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.00119020370794 feat_value:0.00103199174407 feat_value:0.000683737994537 feat_value:0.000510397019281 feat_value:1.77585196498e-05 feat_value:0.000165371258475 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:506931 feat_idx:123566 feat_idx:961529 feat_idx:810019 feat_idx:314332 feat_idx:615411 feat_idx:475867 feat_idx:795081 feat_idx:148475 feat_idx:697060 feat_idx:1069621 feat_idx:370551 feat_idx:696973 feat_idx:69630 feat_idx:396064 feat_idx:95177 feat_idx:854924 feat_idx:488825 feat_idx:0 feat_idx:0 feat_idx:581782 feat_idx:0 feat_idx:476211 feat_idx:289148 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.0066672358525 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.000325784854359 feat_value:4.40797425743e-05 feat_value:0.000266377794747 feat_value:0.000165371258475 feat_value:0.00299803576967 feat_value:0.0 feat_value:0.030303030303 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:995515 feat_idx:139394 feat_idx:943087 feat_idx:0 feat_idx:546815 feat_idx:144963 feat_idx:148475 feat_idx:364765 feat_idx:552750 feat_idx:920037 feat_idx:816538 feat_idx:223357 feat_idx:790588 feat_idx:560935 feat_idx:13161 feat_idx:750233 feat_idx:734534 feat_idx:1047606 feat_idx:361734 feat_idx:0 feat_idx:122096 feat_idx:434883 feat_idx:502861 feat_idx:203213 feat_value:0.0 feat_value:0.000197921436832 feat_value:4.57770656901e-05 feat_value:0.00206398348813 feat_value:0.000625316933178 feat_value:0.000874634892132 feat_value:0.000142068157198 feat_value:0.000330742516951 feat_value:0.00975223129674 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
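The sample lines above follow a simple `key:value` token format: 39 `feat_idx` entries (13 dense slots followed by 26 hashed sparse slots), 39 matching `feat_value` entries, and a trailing `label`. A minimal illustrative parser (an assumption for clarity, not code from the repo) for one such line:

```python
# Illustrative parser for the "feat_idx:... feat_value:... label:..." lines
# shown above. The field names come from the sample data; the function
# itself is hypothetical and not part of the repository.
def parse_sample(line):
    feat_idx, feat_value, label = [], [], 0
    for token in line.strip().split():
        key, _, val = token.partition(":")
        if key == "feat_idx":
            feat_idx.append(int(val))      # feature id (0 means missing/OOV)
        elif key == "feat_value":
            feat_value.append(float(val))  # normalized dense value or 0/1 indicator
        elif key == "label":
            label = int(val)               # click label: 0 negative, 1 positive
    return feat_idx, feat_value, label


idx, val, label = parse_sample(
    "feat_idx:1 feat_idx:268086 feat_value:0.0 feat_value:1.0 label:0")
assert idx == [1, 268086]
assert val == [0.0, 1.0]
assert label == 0
```

Note that for the sparse slots the `feat_value` entries are simply 1.0 when the slot is present and 0.0 when it hashes to the missing id 0, which is why they multiply cleanly into the embedding lookup downstream.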
diff --git a/models/rank/deepfm/model.py b/models/rank/deepfm/model.py
index bd5e80e56de81cce0c92379a05797ca81027ac23..8ac8df134d08550c0db06e9aacbad21dbd74cfe9 100755
--- a/models/rank/deepfm/model.py
+++ b/models/rank/deepfm/model.py
@@ -1,77 +1,95 @@
-import paddle.fluid as fluid
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import math
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+import paddle.fluid as fluid
+
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
- def deepfm_net(self):
+ def _init_hyper_parameters(self):
+ self.sparse_feature_number = envs.get_global_env(
+ "hyper_parameters.sparse_feature_number", None)
+ self.sparse_feature_dim = envs.get_global_env(
+ "hyper_parameters.sparse_feature_dim", None)
+ self.num_field = envs.get_global_env("hyper_parameters.num_field",
+ None)
+ self.reg = envs.get_global_env("hyper_parameters.reg", 1e-4)
+ self.layer_sizes = envs.get_global_env("hyper_parameters.fc_sizes",
+ None)
+ self.act = envs.get_global_env("hyper_parameters.act", None)
+
+ def net(self, inputs, is_infer=False):
init_value_ = 0.1
is_distributed = True if envs.get_trainer() == "CtrTrainer" else False
- sparse_feature_number = envs.get_global_env("hyper_parameters.sparse_feature_number", None, self._namespace)
- sparse_feature_dim = envs.get_global_env("hyper_parameters.sparse_feature_dim", None, self._namespace)
-
+
# ------------------------- network input --------------------------
-
- num_field = envs.get_global_env("hyper_parameters.num_field", None, self._namespace)
- raw_feat_idx = fluid.data(name='feat_idx', shape=[None, num_field], dtype='int64') # None * num_field(defalut:39)
- raw_feat_value = fluid.data(name='feat_value', shape=[None, num_field], dtype='float32') # None * num_field
- self.label = fluid.data(name='label', shape=[None, 1], dtype='float32') # None * 1
- feat_idx = fluid.layers.reshape(raw_feat_idx,[-1, 1]) # (None * num_field) * 1
- feat_value = fluid.layers.reshape(raw_feat_value, [-1, num_field, 1]) # None * num_field * 1
-
- # ------------------------- set _data_var --------------------------
-
- self._data_var.append(raw_feat_idx)
- self._data_var.append(raw_feat_value)
- self._data_var.append(self.label)
- if self._platform != "LINUX":
- self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=64, use_double_buffer=False, iterable=False)
-
- #------------------------- first order term --------------------------
-
- reg = envs.get_global_env("hyper_parameters.reg", 1e-4, self._namespace)
+
+ raw_feat_idx = self._sparse_data_var[1]
+ raw_feat_value = self._dense_data_var[0]
+ self.label = self._sparse_data_var[0]
+
+ feat_idx = raw_feat_idx
+ feat_value = fluid.layers.reshape(
+ raw_feat_value, [-1, self.num_field, 1]) # None * num_field * 1
+
first_weights_re = fluid.embedding(
input=feat_idx,
is_sparse=True,
is_distributed=is_distributed,
dtype='float32',
- size=[sparse_feature_number + 1, 1],
+ size=[self.sparse_feature_number + 1, 1],
padding_idx=0,
param_attr=fluid.ParamAttr(
initializer=fluid.initializer.TruncatedNormalInitializer(
loc=0.0, scale=init_value_),
- regularizer=fluid.regularizer.L1DecayRegularizer(reg)))
+ regularizer=fluid.regularizer.L1DecayRegularizer(self.reg)))
first_weights = fluid.layers.reshape(
- first_weights_re, shape=[-1, num_field, 1]) # None * num_field * 1
- y_first_order = fluid.layers.reduce_sum((first_weights * feat_value), 1)
+ first_weights_re,
+ shape=[-1, self.num_field, 1]) # None * num_field * 1
+ y_first_order = fluid.layers.reduce_sum((first_weights * feat_value),
+ 1)
- #------------------------- second order term --------------------------
+ # ------------------------- second order term --------------------------
feat_embeddings_re = fluid.embedding(
input=feat_idx,
is_sparse=True,
is_distributed=is_distributed,
dtype='float32',
- size=[sparse_feature_number + 1, sparse_feature_dim],
+ size=[self.sparse_feature_number + 1, self.sparse_feature_dim],
padding_idx=0,
param_attr=fluid.ParamAttr(
initializer=fluid.initializer.TruncatedNormalInitializer(
- loc=0.0, scale=init_value_ / math.sqrt(float(sparse_feature_dim)))))
+ loc=0.0,
+ scale=init_value_ /
+ math.sqrt(float(self.sparse_feature_dim)))))
feat_embeddings = fluid.layers.reshape(
feat_embeddings_re,
- shape=[-1, num_field,
- sparse_feature_dim]) # None * num_field * embedding_size
+ shape=[-1, self.num_field, self.sparse_feature_dim
+ ]) # None * num_field * embedding_size
feat_embeddings = feat_embeddings * feat_value # None * num_field * embedding_size
-
+
# sum_square part
- summed_features_emb = fluid.layers.reduce_sum(feat_embeddings,
- 1) # None * embedding_size
+ summed_features_emb = fluid.layers.reduce_sum(
+ feat_embeddings, 1) # None * embedding_size
summed_features_emb_square = fluid.layers.square(
summed_features_emb) # None * embedding_size
@@ -82,21 +100,19 @@ class Model(ModelBase):
squared_features_emb, 1) # None * embedding_size
y_second_order = 0.5 * fluid.layers.reduce_sum(
- summed_features_emb_square - squared_sum_features_emb, 1,
+ summed_features_emb_square - squared_sum_features_emb,
+ 1,
keep_dim=True) # None * 1
+ # ------------------------- DNN --------------------------
- #------------------------- DNN --------------------------
-
- layer_sizes = envs.get_global_env("hyper_parameters.fc_sizes", None, self._namespace)
- act = envs.get_global_env("hyper_parameters.act", None, self._namespace)
- y_dnn = fluid.layers.reshape(feat_embeddings,
- [-1, num_field * sparse_feature_dim])
- for s in layer_sizes:
+ y_dnn = fluid.layers.reshape(
+ feat_embeddings, [-1, self.num_field * self.sparse_feature_dim])
+ for s in self.layer_sizes:
y_dnn = fluid.layers.fc(
input=y_dnn,
size=s,
- act=act,
+ act=self.act,
param_attr=fluid.ParamAttr(
initializer=fluid.initializer.TruncatedNormalInitializer(
loc=0.0, scale=init_value_ / math.sqrt(float(10)))),
@@ -113,35 +129,23 @@ class Model(ModelBase):
bias_attr=fluid.ParamAttr(
initializer=fluid.initializer.TruncatedNormalInitializer(
loc=0.0, scale=init_value_)))
-
- #------------------------- DeepFM --------------------------
- self.predict = fluid.layers.sigmoid(y_first_order + y_second_order + y_dnn)
-
- def train_net(self):
- self.deepfm_net()
-
- #------------------------- Cost(logloss) --------------------------
+ # ------------------------- DeepFM --------------------------
- cost = fluid.layers.log_loss(input=self.predict, label=self.label)
+ self.predict = fluid.layers.sigmoid(y_first_order + y_second_order +
+ y_dnn)
+ cost = fluid.layers.log_loss(
+ input=self.predict, label=fluid.layers.cast(self.label, "float32"))
avg_cost = fluid.layers.reduce_sum(cost)
-
+
self._cost = avg_cost
- #------------------------- Metric(Auc) --------------------------
-
predict_2d = fluid.layers.concat([1 - self.predict, self.predict], 1)
label_int = fluid.layers.cast(self.label, 'int64')
auc_var, batch_auc_var, _ = fluid.layers.auc(input=predict_2d,
- label=label_int,
- slide_steps=0)
+ label=label_int,
+ slide_steps=0)
self._metrics["AUC"] = auc_var
self._metrics["BATCH_AUC"] = batch_auc_var
-
- def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
- optimizer = fluid.optimizer.Adam(learning_rate, lazy_mode=True)
- return optimizer
-
- def infer_net(self, parameter_list):
- self.deepfm_net()
\ No newline at end of file
+ if is_infer:
+ self._infer_results["AUC"] = auc_var
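The `y_second_order` computation in the diff above relies on the classic FM identity: the sum of pairwise embedding dot products equals half of (square of the sum minus sum of the squares), which turns an O(num_field²) interaction into two reductions. A small NumPy sketch (illustrative only, not the model's Paddle code) verifying the identity against the direct pairwise computation:

```python
import numpy as np

# Toy embeddings: batch=2 samples, num_field=3 fields, embedding dim=4.
rng = np.random.default_rng(0)
feat_embeddings = rng.normal(size=(2, 3, 4))

# FM second-order term via the sum-square minus square-sum trick,
# mirroring the reduce_sum/square calls in the diff above.
summed = feat_embeddings.sum(axis=1)              # batch x dim
summed_square = summed ** 2                       # square of sum
squared_sum = (feat_embeddings ** 2).sum(axis=1)  # sum of squares
y_second_order = 0.5 * (summed_square - squared_sum).sum(axis=1, keepdims=True)

# Direct equivalent: sum of dot products over all field pairs i < j.
direct = np.zeros((2, 1))
for i in range(3):
    for j in range(i + 1, 3):
        direct[:, 0] += (feat_embeddings[:, i] * feat_embeddings[:, j]).sum(axis=1)

assert np.allclose(y_second_order, direct)
```

This is why the refactored `net()` only needs `reduce_sum` and `square` ops for the second-order term, with no explicit loop over field pairs.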
diff --git a/models/rank/din/config.yaml b/models/rank/din/config.yaml
index 1244c1639aae1065cde0c9e109a38b728af9ea3e..2885ba7a58083be470d9bc2f8d2d030c2c3207b5 100755
--- a/models/rank/din/config.yaml
+++ b/models/rank/din/config.yaml
@@ -12,40 +12,60 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
+# global settings
+debug: false
+workspace: "paddlerec.models.rank.din"
- epochs: 10
- workspace: "fleetrec.models.rank.din"
+dataset:
+ - name: sample_1
+ type: DataLoader
+ batch_size: 5
+ data_path: "{workspace}/data/train_data"
+ data_converter: "{workspace}/reader.py"
+ - name: infer_sample
+ type: DataLoader
+ batch_size: 5
+ data_path: "{workspace}/data/train_data"
+ data_converter: "{workspace}/reader.py"
- reader:
- batch_size: 2
- class: "{workspace}/reader.py"
- train_data_path: "{workspace}/data/train_data"
- dataset_class: "DataLoader"
+hyper_parameters:
+ optimizer:
+ class: SGD
+ learning_rate: 0.0001
+ use_DataLoader: True
+ item_emb_size: 64
+ cat_emb_size: 64
+ is_sparse: False
+ item_count: 63001
+ cat_count: 801
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- use_DataLoader: True
- item_emb_size: 64
- cat_emb_size: 64
- is_sparse: False
- config_path: "data/config.txt"
- fc_sizes: [400, 400, 400]
- learning_rate: 0.0001
- reg: 0.001
- act: "sigmoid"
- optimizer: SGD
+ act: "sigmoid"
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+
+mode: train_runner
+
+runner:
+ - name: train_runner
+ trainer_class: single_train
+ epochs: 1
+ device: cpu
+ init_model_path: ""
+ save_checkpoint_interval: 1
+ save_inference_interval: 1
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 1
+ - name: infer_runner
+ trainer_class: single_infer
+ epochs: 1
+ device: cpu
+ init_model_path: "increment/0"
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+ dataset_name: sample_1
+ thread_num: 1
+#- name: infer_phase
+# model: "{workspace}/model.py"
+# dataset_name: infer_sample
+# thread_num: 1
diff --git a/models/rank/din/data/build_dataset.py b/models/rank/din/data/build_dataset.py
index 34c053ccdb2686c10875740f72f1e0abf3cb4f10..b0ed187800b2f9f44d4dd0d34df204759059ac06 100755
--- a/models/rank/din/data/build_dataset.py
+++ b/models/rank/din/data/build_dataset.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
from __future__ import print_function
import random
import pickle
diff --git a/models/rank/din/data/convert_pd.py b/models/rank/din/data/convert_pd.py
index d7927c7ef1a9da28732cad9c44be24e72095983a..a66290e1561084a10756ab98c3d70b9a5ac5a6ed 100755
--- a/models/rank/din/data/convert_pd.py
+++ b/models/rank/din/data/convert_pd.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
from __future__ import print_function
import pickle
import pandas as pd
diff --git a/models/rank/din/data/remap_id.py b/models/rank/din/data/remap_id.py
index b110dac54de8f8d201ede7248d6a2844ac350c90..ee6983d7f0769a58352f61a0a05bbd81c6ccbc13 100755
--- a/models/rank/din/data/remap_id.py
+++ b/models/rank/din/data/remap_id.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
from __future__ import print_function
import random
import pickle
diff --git a/models/rank/din/model.py b/models/rank/din/model.py
index 1d7bcca35064912ca4596fed9755755413c54c3b..4f6099119fae745b3b0c975ddcef853d3dce35b8 100755
--- a/models/rank/din/model.py
+++ b/models/rank/din/model.py
@@ -1,21 +1,80 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import paddle.fluid as fluid
-import math
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
-
- def config_read(self, config_path):
- with open(config_path, "r") as fin:
- user_count = int(fin.readline().strip())
- item_count = int(fin.readline().strip())
- cat_count = int(fin.readline().strip())
- return user_count, item_count, cat_count
-
+
+ def _init_hyper_parameters(self):
+ self.item_emb_size = envs.get_global_env(
+ "hyper_parameters.item_emb_size", 64)
+ self.cat_emb_size = envs.get_global_env(
+ "hyper_parameters.cat_emb_size", 64)
+ self.act = envs.get_global_env("hyper_parameters.act", "sigmoid")
+ self.is_sparse = envs.get_global_env("hyper_parameters.is_sparse",
+ False)
+ #significant for speeding up the training process
+ self.use_DataLoader = envs.get_global_env(
+ "hyper_parameters.use_DataLoader", False)
+ self.item_count = envs.get_global_env("hyper_parameters.item_count",
+ 63001)
+ self.cat_count = envs.get_global_env("hyper_parameters.cat_count", 801)
+
+ def input_data(self, is_infer=False, **kwargs):
+ seq_len = -1
+ self.data_var = []
+ hist_item_seq = fluid.data(
+ name="hist_item_seq", shape=[None, seq_len], dtype="int64")
+ self.data_var.append(hist_item_seq)
+
+ hist_cat_seq = fluid.data(
+ name="hist_cat_seq", shape=[None, seq_len], dtype="int64")
+ self.data_var.append(hist_cat_seq)
+
+ target_item = fluid.data(
+ name="target_item", shape=[None], dtype="int64")
+ self.data_var.append(target_item)
+
+ target_cat = fluid.data(name="target_cat", shape=[None], dtype="int64")
+ self.data_var.append(target_cat)
+
+ label = fluid.data(name="label", shape=[None, 1], dtype="float32")
+ self.data_var.append(label)
+
+ mask = fluid.data(
+ name="mask", shape=[None, seq_len, 1], dtype="float32")
+ self.data_var.append(mask)
+
+ target_item_seq = fluid.data(
+ name="target_item_seq", shape=[None, seq_len], dtype="int64")
+ self.data_var.append(target_item_seq)
+
+ target_cat_seq = fluid.data(
+ name="target_cat_seq", shape=[None, seq_len], dtype="int64")
+ self.data_var.append(target_cat_seq)
+
+ train_inputs = [hist_item_seq] + [hist_cat_seq] + [target_item] + [
+ target_cat
+ ] + [label] + [mask] + [target_item_seq] + [target_cat_seq]
+ return train_inputs
+
def din_attention(self, hist, target_expand, mask):
"""activation weight"""
@@ -45,98 +104,63 @@ class Model(ModelBase):
out = fluid.layers.matmul(weight, hist)
out = fluid.layers.reshape(x=out, shape=[0, hidden_size])
return out
-
- def train_net(self):
- seq_len = -1
- self.item_emb_size = envs.get_global_env("hyper_parameters.item_emb_size", 64, self._namespace)
- self.cat_emb_size = envs.get_global_env("hyper_parameters.cat_emb_size", 64, self._namespace)
- self.act = envs.get_global_env("hyper_parameters.act", "sigmoid", self._namespace)
- #item_emb_size = 64
- #cat_emb_size = 64
- self.is_sparse = envs.get_global_env("hyper_parameters.is_sparse", False, self._namespace)
- #significant for speeding up the training process
- self.config_path = envs.get_global_env("hyper_parameters.config_path", "data/config.txt", self._namespace)
- self.use_DataLoader = envs.get_global_env("hyper_parameters.use_DataLoader", False, self._namespace)
- user_count, item_count, cat_count = self.config_read(self.config_path)
+ def net(self, inputs, is_infer=False):
+ hist_item_seq = inputs[0]
+ hist_cat_seq = inputs[1]
+ target_item = inputs[2]
+ target_cat = inputs[3]
+ label = inputs[4]
+ mask = inputs[5]
+ target_item_seq = inputs[6]
+ target_cat_seq = inputs[7]
item_emb_attr = fluid.ParamAttr(name="item_emb")
cat_emb_attr = fluid.ParamAttr(name="cat_emb")
- hist_item_seq = fluid.data(
- name="hist_item_seq", shape=[None, seq_len], dtype="int64")
- self._data_var.append(hist_item_seq)
-
- hist_cat_seq = fluid.data(
- name="hist_cat_seq", shape=[None, seq_len], dtype="int64")
- self._data_var.append(hist_cat_seq)
-
- target_item = fluid.data(name="target_item", shape=[None], dtype="int64")
- self._data_var.append(target_item)
-
- target_cat = fluid.data(name="target_cat", shape=[None], dtype="int64")
- self._data_var.append(target_cat)
-
- label = fluid.data(name="label", shape=[None, 1], dtype="float32")
- self._data_var.append(label)
-
- mask = fluid.data(name="mask", shape=[None, seq_len, 1], dtype="float32")
- self._data_var.append(mask)
-
- target_item_seq = fluid.data(
- name="target_item_seq", shape=[None, seq_len], dtype="int64")
- self._data_var.append(target_item_seq)
-
- target_cat_seq = fluid.data(
- name="target_cat_seq", shape=[None, seq_len], dtype="int64")
- self._data_var.append(target_cat_seq)
-
- if self.use_DataLoader:
- self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=10000, use_double_buffer=False, iterable=False)
-
hist_item_emb = fluid.embedding(
input=hist_item_seq,
- size=[item_count, self.item_emb_size],
+ size=[self.item_count, self.item_emb_size],
param_attr=item_emb_attr,
is_sparse=self.is_sparse)
hist_cat_emb = fluid.embedding(
input=hist_cat_seq,
- size=[cat_count, self.cat_emb_size],
+ size=[self.cat_count, self.cat_emb_size],
param_attr=cat_emb_attr,
is_sparse=self.is_sparse)
target_item_emb = fluid.embedding(
input=target_item,
- size=[item_count, self.item_emb_size],
+ size=[self.item_count, self.item_emb_size],
param_attr=item_emb_attr,
is_sparse=self.is_sparse)
target_cat_emb = fluid.embedding(
input=target_cat,
- size=[cat_count, self.cat_emb_size],
+ size=[self.cat_count, self.cat_emb_size],
param_attr=cat_emb_attr,
is_sparse=self.is_sparse)
target_item_seq_emb = fluid.embedding(
input=target_item_seq,
- size=[item_count, self.item_emb_size],
+ size=[self.item_count, self.item_emb_size],
param_attr=item_emb_attr,
is_sparse=self.is_sparse)
target_cat_seq_emb = fluid.embedding(
input=target_cat_seq,
- size=[cat_count, self.cat_emb_size],
+ size=[self.cat_count, self.cat_emb_size],
param_attr=cat_emb_attr,
is_sparse=self.is_sparse)
item_b = fluid.embedding(
input=target_item,
- size=[item_count, 1],
+ size=[self.item_count, 1],
param_attr=fluid.initializer.Constant(value=0.0))
- hist_seq_concat = fluid.layers.concat([hist_item_emb, hist_cat_emb], axis=2)
+ hist_seq_concat = fluid.layers.concat(
+ [hist_item_emb, hist_cat_emb], axis=2)
target_seq_concat = fluid.layers.concat(
[target_item_seq_emb, target_cat_seq_emb], axis=2)
target_concat = fluid.layers.concat(
@@ -144,21 +168,22 @@ class Model(ModelBase):
out = self.din_attention(hist_seq_concat, target_seq_concat, mask)
out_fc = fluid.layers.fc(name="out_fc",
- input=out,
- size=self.item_emb_size + self.cat_emb_size,
- num_flatten_dims=1)
+ input=out,
+ size=self.item_emb_size + self.cat_emb_size,
+ num_flatten_dims=1)
embedding_concat = fluid.layers.concat([out_fc, target_concat], axis=1)
fc1 = fluid.layers.fc(name="fc1",
- input=embedding_concat,
- size=80,
- act=self.act)
+ input=embedding_concat,
+ size=80,
+ act=self.act)
fc2 = fluid.layers.fc(name="fc2", input=fc1, size=40, act=self.act)
fc3 = fluid.layers.fc(name="fc3", input=fc2, size=1)
logit = fc3 + item_b
- loss = fluid.layers.sigmoid_cross_entropy_with_logits(x=logit, label=label)
-
+ loss = fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=logit, label=label)
+
avg_loss = fluid.layers.mean(loss)
self._cost = avg_loss
@@ -166,16 +191,9 @@ class Model(ModelBase):
predict_2d = fluid.layers.concat([1 - self.predict, self.predict], 1)
label_int = fluid.layers.cast(label, 'int64')
auc_var, batch_auc_var, _ = fluid.layers.auc(input=predict_2d,
- label=label_int,
- slide_steps=0)
+ label=label_int,
+ slide_steps=0)
self._metrics["AUC"] = auc_var
self._metrics["BATCH_AUC"] = batch_auc_var
-
-
- def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
- optimizer = fluid.optimizer.Adam(learning_rate, lazy_mode=True)
- return optimizer
-
- def infer_net(self, parameter_list):
- self.deepfm_net()
+ if is_infer:
+ self._infer_results["AUC"] = auc_var
diff --git a/models/rank/din/reader.py b/models/rank/din/reader.py
index dfaec833986d161b6706e101854afb3da00fa2e2..90d358b9f8122fd396bb1a6eb37cbb4d03b96143 100755
--- a/models/rank/din/reader.py
+++ b/models/rank/din/reader.py
@@ -13,25 +13,31 @@
# limitations under the License.
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
-import numpy as np
import os
import random
+
try:
import cPickle as pickle
except ImportError:
import pickle
+import numpy as np
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+
+
class TrainReader(Reader):
def init(self):
- self.train_data_path = envs.get_global_env("train_data_path", None, "train.reader")
+ self.train_data_path = envs.get_global_env(
+ "dataset.sample_1.data_path", None)
self.res = []
self.max_len = 0
-
+
data_file_list = os.listdir(self.train_data_path)
- for i in range(0, len(data_file_list)):
- train_data_file = os.path.join(self.train_data_path, data_file_list[i])
+ for i in range(0, len(data_file_list)):
+ train_data_file = os.path.join(self.train_data_path,
+ data_file_list[i])
with open(train_data_file, "r") as fin:
for line in fin:
line = line.strip().split(';')
@@ -40,12 +46,10 @@ class TrainReader(Reader):
fo = open("tmp.txt", "w")
fo.write(str(self.max_len))
fo.close()
- self.batch_size = envs.get_global_env("batch_size", 32, "train.reader")
+ self.batch_size = envs.get_global_env("dataset.sample_1.batch_size",
+ 32, "train.reader")
self.group_size = self.batch_size * 20
-
-
-
def _process_line(self, line):
line = line.strip().split(';')
hist = line[0].split()
@@ -54,22 +58,22 @@ class TrainReader(Reader):
cate = [int(i) for i in cate]
return [hist, cate, [int(line[2])], [int(line[3])], [float(line[4])]]
-
def generate_sample(self, line):
"""
Read the data line by line and process it as a dictionary
"""
+
def data_iter():
- #feat_idx, feat_value, label = self._process_line(line)
+ # feat_idx, feat_value, label = self._process_line(line)
yield self._process_line(line)
return data_iter
-
+
def pad_batch_data(self, input, max_len):
res = np.array([x + [0] * (max_len - len(x)) for x in input])
res = res.astype("int64").reshape([-1, max_len])
return res
-
+
def make_data(self, b):
max_len = max(len(x[0]) for x in b)
item = self.pad_batch_data([x[0] for x in b], max_len)
@@ -79,9 +83,11 @@ class TrainReader(Reader):
[[0] * x + [-1e9] * (max_len - x) for x in len_array]).reshape(
[-1, max_len, 1])
target_item_seq = np.array(
- [[x[2]] * max_len for x in b]).astype("int64").reshape([-1, max_len])
+ [[x[2]] * max_len for x in b]).astype("int64").reshape(
+ [-1, max_len])
target_cat_seq = np.array(
- [[x[3]] * max_len for x in b]).astype("int64").reshape([-1, max_len])
+ [[x[3]] * max_len for x in b]).astype("int64").reshape(
+ [-1, max_len])
res = []
for i in range(len(b)):
res.append([
@@ -89,7 +95,7 @@ class TrainReader(Reader):
target_item_seq[i], target_cat_seq[i]
])
return res
-
+
def batch_reader(self, reader, batch_size, group_size):
def batch_reader():
bg = []
@@ -111,7 +117,7 @@ class TrainReader(Reader):
yield self.make_data(b)
return batch_reader
-
+
def base_read(self, file_dir):
res = []
for train_file in file_dir:
@@ -122,10 +128,9 @@ class TrainReader(Reader):
cate = line[1].split()
res.append([hist, cate, line[2], line[3], float(line[4])])
return res
-
+
def generate_batch_from_trainfiles(self, files):
data_set = self.base_read(files)
random.shuffle(data_set)
- return self.batch_reader(data_set, self.batch_size, self.batch_size * 20)
-
-
\ No newline at end of file
+ return self.batch_reader(data_set, self.batch_size,
+ self.batch_size * 20)
diff --git a/models/rank/dnn/README.md b/models/rank/dnn/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..c307efcc255bc3d087de0cbcb2aaae39e65ce2f2
--- /dev/null
+++ b/models/rank/dnn/README.md
@@ -0,0 +1,261 @@
+# A DNN-based click-through rate prediction model
+
+## Introduction
+`CTR (Click-Through Rate)` is a key metric in recommender systems and computational advertising; estimating it is the basis for decisions such as which items to recommend and which ads to serve. Put simply, CTR prediction forecasts, for each ad impression, whether the user will click or not. A CTR model weighs many factors and features and is trained on large amounts of historical data to ultimately support business decisions. This model implements the DNN model proposed in the following paper:
+
+```text
+@inproceedings{guo2017deepfm,
+ title={DeepFM: A Factorization-Machine based Neural Network for CTR Prediction},
+ author={Huifeng Guo and Ruiming Tang and Yunming Ye and Zhenguo Li and Xiuqiang He},
+ booktitle={Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI)},
+ pages={1725--1731},
+ year={2017}
+}
+```
+
+## Data preparation
+### Data source
+The training and test sets are the Criteo dataset used in the [Display Advertising Challenge](https://www.kaggle.com/c/criteo-display-ad-challenge/). The dataset has two parts: the training set contains a portion of Criteo traffic over a period of time, and the test set contains the ad click traffic of the day following the training data.
+Each line of data is formatted as follows:
+```bash
+<label> <integer feature 1> ... <integer feature 13> <categorical feature 1> ... <categorical feature 26>
+```
+Here `<label>` indicates whether the ad was clicked: 1 for clicked, 0 for not clicked. `<integer feature>` denotes the numerical (continuous) features, 13 in total, and `<categorical feature>` denotes the categorical (discrete) features, 26 in total. Adjacent features are separated by `\t`, and missing features are left empty. The `<label>` field has been removed from the test set.
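To make the layout concrete, splitting one such line can be sketched in plain Python (the sample line below is fabricated for illustration):

```python
# Split one Criteo-format line into its label, 13 dense and 26 sparse fields.
# The sample line is fabricated for illustration only.
line = "0\t1\t4\t" + "\t".join([""] * 11) + "\t" + "\t".join(["68fd1e64"] * 26)
fields = line.rstrip("\n").split("\t")

label = int(fields[0])    # 1 = clicked, 0 = not clicked
dense = fields[1:14]      # 13 integer (continuous) features, "" means missing
sparse = fields[14:40]    # 26 categorical features

print(label, len(dense), len(sparse))  # 0 13 26
```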
+
+### Data preprocessing
+Preprocessing involves two steps:
+- Split the original training set 9:1 into a training set and a validation set.
+- Normalize the numerical (continuous) features. Note that for each feature `<integer feature>`, the maximum used for normalization is not the global maximum; instead, the value at the 95% position of the sorted feature column is taken as the maximum, while the extreme values themselves are kept.
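A minimal sketch of this percentile-style normalization, in plain Python (not the repository's actual preprocessing script; `percentile_max` and `normalize` are illustrative names):

```python
def percentile_max(values, pct=0.95):
    # Use the value at the 95% position of the sorted column as the
    # normalization maximum, instead of the global maximum.
    ordered = sorted(values)
    idx = min(int(len(ordered) * pct), len(ordered) - 1)
    return ordered[idx]

def normalize(value, feat_min, feat_max):
    # Extreme samples are kept as-is, so the result may exceed 1.0.
    return (value - feat_min) / float(feat_max - feat_min)

col = list(range(100)) + [10000]   # one extreme outlier
cap = percentile_max(col)
print(cap)                          # 95, not 10000
```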
+
+### One-click download of the training and test data
+```bash
+sh download_data.sh
+```
+Running this script downloads the Criteo dataset from a mirror server in China and extracts it into the designated folders. The full training data is placed in `./train_data_full/`, the full test data in `./test_data_full/`, and the smaller training and test sets for quick verification in `./train_data/` and `./test_data/`.
+
+The expected output of the script is:
+```bash
+> sh download_data.sh
+--2019-11-26 06:31:33-- https://fleet.bj.bcebos.com/ctr_data.tar.gz
+Resolving fleet.bj.bcebos.com... 10.180.112.31
+Connecting to fleet.bj.bcebos.com|10.180.112.31|:443... connected.
+HTTP request sent, awaiting response... 200 OK
+Length: 4041125592 (3.8G) [application/x-gzip]
+Saving to: “ctr_data.tar.gz”
+
+100%[==================================================================================================================>] 4,041,125,592 120M/s in 32s
+
+2019-11-26 06:32:05 (120 MB/s) - “ctr_data.tar.gz” saved [4041125592/4041125592]
+
+raw_data/
+raw_data/part-55
+raw_data/part-113
+...
+test_data/part-227
+test_data/part-222
+Complete data download.
+Full Train data stored in ./train_data_full
+Full Test data stored in ./test_data_full
+Rapid Verification train data stored in ./train_data
+Rapid Verification test data stored in ./test_data
+```
+At this point, all of the data preparation work is complete.
+
+## Data loading
+To run CTR model training at high speed, `PaddleRec` wraps the `dataset` and `dataloader` APIs for high-performance data reading.
+
+How do we adopt the dataset reading mode in our training? No change to the data format is needed; simply adding the following to the training code achieves efficiency comparable to reading binary data. A fairly complete flow looks like this:
+
+### Introducing the dataset
+
+1. Create a dataset object through the factory class `fluid.DatasetFactory()`.
+2. Pass the data input format we defined to the dataset via `dataset.set_use_var(inputs)`.
+3. Specify how the data is read; `dataset_generator.py` implements the reading rules, which are described below.
+4. Specify the batch_size for reading.
+5. Specify the number of reader threads; it should match the number of training threads, since the two are coupled.
+6. Specify the list of training files the dataset will read.
+
+```python
+def get_dataset(inputs, args):
+ dataset = fluid.DatasetFactory().create_dataset()
+ dataset.set_use_var(inputs)
+ dataset.set_pipe_command("python dataset_generator.py")
+ dataset.set_batch_size(args.batch_size)
+ dataset.set_thread(int(args.cpu_num))
+ file_list = [
+ str(args.train_files_path) + "/%s" % x
+ for x in os.listdir(args.train_files_path)
+ ]
+ logger.info("file list: {}".format(file_list))
+ return dataset, file_list
+```
+
+### How to specify the data reading rules
+
+We mentioned above that `dataset_generator.py` implements the concrete reading rules. So how do we define those rules for a dataset?
+The complete code of `dataset_generator.py` is given below; the flow is as follows:
+1. First import the dataset library, located at `paddle.fluid.incubate.data_generator`.
+2. Declare the variables used during reading, such as `cont_min_` and `categorical_range_` in the sample code.
+3. Create a subclass of one of the dataset base classes. If the data mixes several types and needs to be converted to numbers for preprocessing, `MultiSlotDataGenerator` is recommended; if preprocessing is already done and saved to data files, the data can be read directly as `string` with `MultiSlotStringDataGenerator`, which is even faster. In the sample code we subclass `MultiSlotDataGenerator` in a dataset class named `CriteoDataset`.
+4. Implement the base class's `generate_sample` function to read the data line by line. It should return an iterable reader method (a function containing `yield` is no longer an ordinary function but a generator, an iterable object comparable to an array, linked list, file, string, etc.).
+5. Inside this iterable function, `def reader()` in the sample code, define the data reading logic, e.g. slicing, converting, and preprocessing each line.
+6. Finally, the data must be arranged into a specific format for the dataset to read it correctly and feed it into the training network. In short, the output order must correspond strictly, one to one, with the `inputs` created in the network, converted into a dictionary-like form. In the sample code we use `zip` to build a list of tuples of parameter names and values, and yield it. Expanded, the output looks like `[('dense_feature',[value]),('C1',[value]),('C2',[value]),...,('C26',[value]),('label',[value])]`.
+
+
+```python
+import paddle.fluid.incubate.data_generator as dg
+
+cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
+cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+hash_dim_ = 1000001
+continuous_range_ = range(1, 14)
+categorical_range_ = range(14, 40)
+
+class CriteoDataset(dg.MultiSlotDataGenerator):
+
+ def generate_sample(self, line):
+
+ def reader():
+ features = line.rstrip('\n').split('\t')
+ dense_feature = []
+ sparse_feature = []
+ for idx in continuous_range_:
+ if features[idx] == "":
+ dense_feature.append(0.0)
+ else:
+ dense_feature.append(
+ (float(features[idx]) - cont_min_[idx - 1]) /
+ cont_diff_[idx - 1])
+ for idx in categorical_range_:
+ sparse_feature.append(
+ [hash(str(idx) + features[idx]) % hash_dim_])
+ label = [int(features[0])]
+ process_line = dense_feature, sparse_feature, label
+ feature_name = ["dense_feature"]
+ for idx in categorical_range_:
+ feature_name.append("C" + str(idx - 13))
+ feature_name.append("label")
+
+ yield zip(feature_name, [dense_feature] + sparse_feature + [label])
+
+ return reader
+
+d = CriteoDataset()
+d.run_from_stdin()
+```
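The per-sample output format described in step 6 can be checked on its own; this toy snippet (with made-up values and only 2 of the 26 sparse slots) shows the `(name, values)` pairs that `generate_sample` yields for one line:

```python
# Toy illustration of the (name, values) pairs yielded for one sample.
dense_feature = [0.05, 0.006]           # made-up normalized dense values
sparse_feature = [[715353], [817085]]   # made-up hashed sparse ids, 2 of 26 slots
label = [0]

feature_name = ["dense_feature", "C1", "C2", "label"]
sample = list(zip(feature_name, [dense_feature] + sparse_feature + [label]))
print(sample)
# [('dense_feature', [0.05, 0.006]), ('C1', [715353]), ('C2', [817085]), ('label', [0])]
```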
+### Quickly debugging the Dataset
+We can verify that the Dataset output meets our expectations independently of the network. Debug the dataset code with the command
+`cat <data file> | python <dataset reader python file>`:
+```bash
+cat train_data/part-0 | python dataset_generator.py
+```
+The output data format is:
+`dense_input:size ; dense_input:value ; sparse_input:size ; sparse_input:value ; ... ; sparse_input:size ; sparse_input:value ; label:size ; label:value`
+
+The expected output (an excerpt) is:
+```bash
+...
+13 0.05 0.00663349917081 0.05 0.0 0.02159375 0.008 0.15 0.04 0.362 0.1 0.2 0.0 0.04 1 715353 1 817085 1 851010 1 833725 1 286835 1 948614 1 881652 1 507110 1 27346 1 646986 1 643076 1 200960 1 18464 1 202774 1 532679 1 729573 1 342789 1 562805 1 880474 1 984402 1 666449 1 26235 1 700326 1 452909 1 884722 1 787527 1 0
+...
+```
+
+## Building the model
+### Declaring the data inputs
+As described in the data preparation section, the Criteo dataset contains both continuous data and discrete (sparse) data, so the CTR-DNN model has three kinds of data inputs. `dense_input` takes the continuous data; its dimension is set by the hyperparameter `dense_feature_dim`, and its data type is normalized floating point. `sparse_input_ids` records the discrete data; the Criteo dataset has 26 slots, so we create 26 sparse inputs named `C1~C26` and set `lod_level=1`, marking them as variable-length data of integer type. Finally, each sample's `label` indicates whether it was clicked; it is an integer, 0 for a negative example and 1 for a positive one.
+
+Declaring a data input in Paddle uses `paddle.fluid.data()`, which creates a placeholder of the specified type; data IO feeds data according to this definition.
+```python
+dense_input = fluid.data(name="dense_input",
+ shape=[-1, args.dense_feature_dim],
+ dtype="float32")
+
+sparse_input_ids = [
+ fluid.data(name="C" + str(i),
+ shape=[-1, 1],
+ lod_level=1,
+ dtype="int64") for i in range(1, 27)
+]
+
+label = fluid.data(name="label", shape=[-1, 1], dtype="int64")
+inputs = [dense_input] + sparse_input_ids + [label]
+```
+
+### Building the CTR-DNN network
+
+The CTR-DNN network is fairly straightforward; it is essentially a binary classification task (see `network_conf.py` for the code). The model consists of an `Embedding` layer, three `FC` layers, and the corresponding loss and AUC computation for the classification task.
+
+#### Embedding layer
+First, how the Embedding layer is built: its input is `sparse_input`, and its shape is defined by the hyperparameters `sparse_feature_dim` and `embedding_size`. The `is_sparse` argument deserves special explanation: when we set `is_sparse=True`, the computation graph treats the parameter as sparse, and both backward updates and distributed communication run in sparse mode, which greatly improves efficiency while keeping results identical.
+
+After each sparse input passes through the Embedding layer, the outputs are collected into a list for convenient concatenation.
+
+```python
+def embedding_layer(input):
+ return fluid.layers.embedding(
+ input=input,
+ is_sparse=True,
+ size=[args.sparse_feature_dim,
+ args.embedding_size],
+ param_attr=fluid.ParamAttr(
+ name="SparseFeatFactors",
+ initializer=fluid.initializer.Uniform()),
+ )
+
+sparse_embed_seq = list(map(embedding_layer, inputs[1:-1])) # [C1~C26]
+```
+
+#### FC layers
+The values looked up from the embedding tables for the discrete data are `concat`enated with the continuous input into a single joint input, which serves as the raw input to the fully connected layers. We stack 3 FC layers, each with an output dimension of 400 followed by a `relu` activation; each FC layer is initialized from a normal distribution whose standard deviation is inversely proportional to the square root of the previous layer's output dimension.
+```python
+concated = fluid.layers.concat(sparse_embed_seq + inputs[0:1], axis=1)
+
+fc1 = fluid.layers.fc(
+ input=concated,
+ size=400,
+ act="relu",
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(concated.shape[1]))),
+)
+fc2 = fluid.layers.fc(
+ input=fc1,
+ size=400,
+ act="relu",
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc1.shape[1]))),
+)
+fc3 = fluid.layers.fc(
+ input=fc2,
+ size=400,
+ act="relu",
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc2.shape[1]))),
+)
+```
+#### Loss and AUC computation
+- The prediction is produced by an FC layer with output shape 2 and a softmax activation, which gives the probability of each sample being a positive or negative example.
+- The per-sample loss is given by the cross entropy; its input has dimension [batch_size, 2] with dtype float, and the label input has dimension [batch_size, 1] with dtype int.
+- The loss of the batch, `avg_cost`, is the sum of the per-sample losses.
+- We also compute the prediction AUC via `fluid.layers.auc()`. This layer returns three values: the global AUC `auc_var`, the current batch's AUC `batch_auc_var`, and `auc_states`, which contains the `batch_stat_pos, batch_stat_neg, stat_pos, stat_neg` information. `batch_auc` is averaged over the most recent 20 batches, set by `slide_steps=20`; the number of thresholds for discretizing the ROC curve is 4096, set by `num_thresholds=2**12`.
+```python
+predict = fluid.layers.fc(
+ input=fc3,
+ size=2,
+ act="softmax",
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
+ scale=1 / math.sqrt(fc3.shape[1]))),
+ )
+
+cost = fluid.layers.cross_entropy(input=predict, label=inputs[-1])
+avg_cost = fluid.layers.reduce_sum(cost)
+accuracy = fluid.layers.accuracy(input=predict, label=inputs[-1])
+auc_var, batch_auc_var, auc_states = fluid.layers.auc(
+ input=predict,
+ label=inputs[-1],
+ num_thresholds=2**12,
+ slide_steps=20)
+```
+
+With the network above in place, training ultimately gives us the two key metrics, `avg_cost` and `auc`.
diff --git a/models/rank/dnn/config.yaml b/models/rank/dnn/config.yaml
index 66f5053a0f2364a9038e6b9b5f3ccaad6481d6ca..fd64935dd2080291fc13911befc0481604c3464a 100755
--- a/models/rank/dnn/config.yaml
+++ b/models/rank/dnn/config.yaml
@@ -12,36 +12,72 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
+# workspace
+workspace: "paddlerec.models.rank.dnn"
- epochs: 10
- workspace: "fleetrec.models.rank.dnn"
+# list of dataset
+dataset:
+- name: dataset_train # name of dataset to distinguish different datasets
+ batch_size: 2
+ type: DataLoader # or QueueDataset
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26"
+ dense_slots: "dense_var:13"
+- name: dataset_infer # name
+ batch_size: 2
+ type: DataLoader # or QueueDataset
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26"
+ dense_slots: "dense_var:13"
- reader:
- batch_size: 2
- class: "{workspace}/../criteo_reader.py"
- train_data_path: "{workspace}/data/train"
+# hyper parameters of user-defined network
+hyper_parameters:
+ # optimizer config
+ optimizer:
+ class: Adam
+ learning_rate: 0.001
+ strategy: async
+ # user-defined pairs
+ sparse_inputs_slots: 27
+ sparse_feature_number: 1000001
+ sparse_feature_dim: 9
+ dense_input_dim: 13
+ fc_sizes: [512, 256, 128, 32]
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- sparse_inputs_slots: 27
- sparse_feature_number: 1000001
- sparse_feature_dim: 9
- dense_input_dim: 13
- fc_sizes: [512, 256, 128, 32]
- learning_rate: 0.001
- optimizer: adam
+# select runner by name
+mode: runner1
+# config of each runner.
+# runner is a kind of paddle training class, which wraps the train/infer process.
+runner:
+- name: runner1
+ class: single_train
+ # num of epochs
+ epochs: 10
+ # device to run training or infer
+ device: cpu
+ save_checkpoint_interval: 2 # save model interval of epochs
+ save_inference_interval: 4 # save inference
+ save_checkpoint_path: "increment" # save checkpoint path
+ save_inference_path: "inference" # save inference path
+ save_inference_feed_varnames: [] # feed vars of save inference
+ save_inference_fetch_varnames: [] # fetch vars of save inference
+ init_model_path: "" # load model path
+ print_interval: 10
+- name: runner2
+ class: single_infer
+ # num of epochs
+ epochs: 10
+ # device to run training or infer
+ device: cpu
+ init_model_path: "increment/0" # load model path
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+# runner will run all the phase in each epoch
+phase:
+- name: phase1
+ model: "{workspace}/model.py" # user-defined model
+ dataset_name: dataset_train # select dataset by name
+ thread_num: 1
+#- name: phase2
+# model: "{workspace}/model.py" # user-defined model
+# dataset_name: dataset_infer # select dataset by name
+# thread_num: 1
diff --git a/models/rank/dnn/data/download.sh b/models/rank/dnn/data/download.sh
new file mode 100644
index 0000000000000000000000000000000000000000..56816f7d6be2227fbddafe49d3e24b1ef585a40c
--- /dev/null
+++ b/models/rank/dnn/data/download.sh
@@ -0,0 +1,13 @@
+wget --no-check-certificate https://fleet.bj.bcebos.com/ctr_data.tar.gz
+tar -zxvf ctr_data.tar.gz
+mv ./raw_data ./train_data_full
+mkdir train_data && cd train_data
+cp ../train_data_full/part-0 ../train_data_full/part-1 ./ && cd ..
+mv ./test_data ./test_data_full
+mkdir test_data && cd test_data
+cp ../test_data_full/part-220 ./ && cd ..
+echo "Complete data download."
+echo "Full Train data stored in ./train_data_full "
+echo "Full Test data stored in ./test_data_full "
+echo "Rapid Verification train data stored in ./train_data "
+echo "Rapid Verification test data stored in ./test_data "
diff --git a/models/rank/dnn/data/get_slot_data.py b/models/rank/dnn/data/get_slot_data.py
new file mode 100755
index 0000000000000000000000000000000000000000..f52447d06c297335685a704f688d71aa871328bc
--- /dev/null
+++ b/models/rank/dnn/data/get_slot_data.py
@@ -0,0 +1,71 @@
+# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import paddle.fluid.incubate.data_generator as dg
+
+cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
+cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
+hash_dim_ = 1000001
+continuous_range_ = range(1, 14)
+categorical_range_ = range(14, 40)
+
+
+class CriteoDataset(dg.MultiSlotDataGenerator):
+ """
+ DacDataset: inheritance MultiSlotDataGeneratior, Implement data reading
+ Help document: http://wiki.baidu.com/pages/viewpage.action?pageId=728820675
+ """
+
+ def generate_sample(self, line):
+ """
+ Read the data line by line and process it as a dictionary
+ """
+
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+ features = line.rstrip('\n').split('\t')
+ dense_feature = []
+ sparse_feature = []
+ for idx in continuous_range_:
+ if features[idx] == "":
+ dense_feature.append(0.0)
+ else:
+ dense_feature.append(
+ (float(features[idx]) - cont_min_[idx - 1]) /
+ cont_diff_[idx - 1])
+ for idx in categorical_range_:
+ sparse_feature.append(
+ [hash(str(idx) + features[idx]) % hash_dim_])
+ label = [int(features[0])]
+ process_line = dense_feature, sparse_feature, label
+ feature_name = ["dense_feature"]
+ for idx in categorical_range_:
+ feature_name.append("C" + str(idx - 13))
+ feature_name.append("label")
+ s = "click:" + str(label[0])
+ for i in dense_feature:
+ s += " dense_feature:" + str(i)
+ for i in range(1, 1 + len(categorical_range_)):
+ s += " " + str(i) + ":" + str(sparse_feature[i - 1][0])
+            print(s.strip())
+ yield None
+
+ return reader
+
+
+d = CriteoDataset()
+d.run_from_stdin()
diff --git a/models/rank/dnn/data/run.sh b/models/rank/dnn/data/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..f2d1fc9210d65e521cb7fa19cadab3ec95d95a31
--- /dev/null
+++ b/models/rank/dnn/data/run.sh
@@ -0,0 +1,25 @@
+sh download.sh
+
+mkdir slot_train_data_full
+for i in `ls ./train_data_full`
+do
+ cat train_data_full/$i | python get_slot_data.py > slot_train_data_full/$i
+done
+
+mkdir slot_test_data_full
+for i in `ls ./test_data_full`
+do
+ cat test_data_full/$i | python get_slot_data.py > slot_test_data_full/$i
+done
+
+mkdir slot_train_data
+for i in `ls ./train_data`
+do
+ cat train_data/$i | python get_slot_data.py > slot_train_data/$i
+done
+
+mkdir slot_test_data
+for i in `ls ./test_data`
+do
+ cat test_data/$i | python get_slot_data.py > slot_test_data/$i
+done
diff --git a/models/rank/dnn/data/sample_data/train/sample_train.txt b/models/rank/dnn/data/sample_data/train/sample_train.txt
new file mode 100644
index 0000000000000000000000000000000000000000..b2b2e2022a8601d60b6e47ea9665dcc314bd04b6
--- /dev/null
+++ b/models/rank/dnn/data/sample_data/train/sample_train.txt
@@ -0,0 +1,80 @@
+click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.05 dense_feature:0.08 dense_feature:0.207421875 dense_feature:0.028 dense_feature:0.35 dense_feature:0.08 dense_feature:0.082 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.08 1:737395 2:210498 3:903564 4:286224 5:286835 6:906818 7:906116 8:67180 9:27346 10:51086 11:142177 12:95024 13:157883 14:873363 15:600281 16:812592 17:228085 18:35900 19:880474 20:984402 21:100885 22:26235 23:410878 24:798162 25:499868 26:306163
+click:1 dense_feature:0.0 dense_feature:0.932006633499 dense_feature:0.02 dense_feature:0.14 dense_feature:0.0395625 dense_feature:0.328 dense_feature:0.98 dense_feature:0.12 dense_feature:1.886 dense_feature:0.0 dense_feature:1.8 dense_feature:0.0 dense_feature:0.14 1:715353 2:761523 3:432904 4:892267 5:515218 6:948614 7:266726 8:67180 9:27346 10:266081 11:286126 12:789480 13:49621 14:255651 15:47663 16:79797 17:342789 18:616331 19:880474 20:984402 21:242209 22:26235 23:669531 24:26284 25:269955 26:187951
+click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.08 dense_feature:0.06 dense_feature:0.14125 dense_feature:0.076 dense_feature:0.05 dense_feature:0.22 dense_feature:0.208 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.06 1:737395 2:952384 3:511141 4:271077 5:286835 6:948614 7:903547 8:507110 9:27346 10:56047 11:612953 12:747707 13:977426 14:671506 15:158148 16:833738 17:342789 18:427155 19:880474 20:537425 21:916237 22:26235 23:468277 24:676936 25:751788 26:363967
+click:0 dense_feature:0.0 dense_feature:0.124378109453 dense_feature:0.02 dense_feature:0.04 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.08 dense_feature:0.024 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.04 1:210127 2:286436 3:183920 4:507656 5:286835 6:906818 7:199553 8:67180 9:502607 10:708281 11:809876 12:888238 13:375164 14:202774 15:459895 16:475933 17:555571 18:847163 19:26230 20:26229 21:808836 22:191474 23:410878 24:315120 25:26224 26:26223
+click:0 dense_feature:0.1 dense_feature:0.0149253731343 dense_feature:0.34 dense_feature:0.32 dense_feature:0.016421875 dense_feature:0.098 dense_feature:0.04 dense_feature:0.96 dense_feature:0.202 dense_feature:0.1 dense_feature:0.2 dense_feature:0.0 dense_feature:0.32 1:230803 2:817085 3:539110 4:388629 5:286835 6:948614 7:586040 8:67180 9:27346 10:271155 11:176640 12:827381 13:36881 14:202774 15:397299 16:411672 17:342789 18:474060 19:880474 20:984402 21:216871 22:26235 23:761351 24:787115 25:884722 26:904135
+click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.13 dense_feature:0.04 dense_feature:0.246203125 dense_feature:0.108 dense_feature:0.05 dense_feature:0.04 dense_feature:0.03 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:737395 2:64837 3:259267 4:336976 5:515218 6:154084 7:847938 8:67180 9:27346 10:708281 11:776766 12:964800 13:324323 14:873363 15:212708 16:637238 17:681378 18:895034 19:673458 20:984402 21:18600 22:26235 23:410878 24:787115 25:884722 26:355412
+click:0 dense_feature:0.0 dense_feature:0.028192371476 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0245625 dense_feature:0.016 dense_feature:0.04 dense_feature:0.12 dense_feature:0.016 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:554760 3:661483 4:263696 5:938478 6:906818 7:786926 8:67180 9:27346 10:245862 11:668197 12:745676 13:432600 14:413795 15:751427 16:272410 17:342789 18:422136 19:26230 20:26229 21:452501 22:26235 23:51381 24:776636 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:1.95 dense_feature:0.28 dense_feature:0.092828125 dense_feature:0.57 dense_feature:0.06 dense_feature:0.4 dense_feature:0.4 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.4 1:371155 2:817085 3:773609 4:555449 5:938478 6:906818 7:166117 8:507110 9:27346 10:545822 11:316654 12:172765 13:989600 14:255651 15:792372 16:606361 17:342789 18:566554 19:880474 20:984402 21:235256 22:191474 23:700326 24:787115 25:884722 26:569095
+click:0 dense_feature:0.0 dense_feature:0.0912106135987 dense_feature:0.01 dense_feature:0.02 dense_feature:0.06625 dense_feature:0.018 dense_feature:0.05 dense_feature:0.06 dense_feature:0.098 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.04 1:230803 2:531472 3:284417 4:661677 5:938478 6:553107 7:21150 8:49466 9:27346 10:526914 11:164508 12:631773 13:882348 14:873363 15:523948 16:687081 17:342789 18:271301 19:26230 20:26229 21:647160 22:26235 23:410878 24:231695 25:26224 26:26223
+click:1 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.01 dense_feature:0.02 dense_feature:0.02153125 dense_feature:0.092 dense_feature:0.05 dense_feature:0.68 dense_feature:0.472 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.02 1:737395 2:532829 3:320762 4:887282 5:286835 6:25207 7:640357 8:67180 9:27346 10:695831 11:739268 12:835325 13:402539 14:873363 15:125813 16:168896 17:342789 18:374414 19:26230 20:26229 21:850229 22:26235 23:410878 24:480027 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.05 dense_feature:0.04 dense_feature:0.086125 dense_feature:0.098 dense_feature:0.15 dense_feature:0.06 dense_feature:0.228 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.04 1:210127 2:999497 3:646348 4:520638 5:938478 6:906818 7:438398 8:67180 9:27346 10:975902 11:532544 12:708828 13:815045 14:255651 15:896230 16:663630 17:342789 18:820094 19:687226 20:537425 21:481536 22:26235 23:761351 24:888170 25:250729 26:381125
+click:1 dense_feature:0.1 dense_feature:0.00331674958541 dense_feature:0.02 dense_feature:0.02 dense_feature:0.00078125 dense_feature:0.002 dense_feature:0.73 dense_feature:0.08 dense_feature:0.254 dense_feature:0.1 dense_feature:1.4 dense_feature:0.0 dense_feature:0.02 1:715353 2:342833 3:551901 4:73418 5:286835 6:446063 7:219517 8:67180 9:27346 10:668726 11:40711 12:921745 13:361076 14:15048 15:214564 16:400893 17:228085 18:393370 19:26230 20:26229 21:383046 22:26235 23:700326 24:369764 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.142620232172 dense_feature:0.04 dense_feature:0.1 dense_feature:0.08853125 dense_feature:0.028 dense_feature:0.01 dense_feature:0.1 dense_feature:0.028 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.1 1:737395 2:583707 3:519411 4:19103 5:286835 6:906818 7:801403 8:67180 9:27346 10:35743 11:626052 12:142351 13:988058 14:873363 15:617333 16:850339 17:276641 18:696084 19:26230 20:26229 21:121620 22:191474 23:468277 24:18340 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00995024875622 dense_feature:0.0 dense_feature:0.22 dense_feature:0.0251875 dense_feature:0.0 dense_feature:0.0 dense_feature:0.8 dense_feature:0.182 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.84 1:737395 2:19359 3:166075 4:381832 5:286835 6:446063 7:816009 8:67180 9:27346 10:708281 11:619790 12:524128 13:826787 14:202774 15:371495 16:392894 17:644532 18:271180 19:26230 20:26229 21:349978 22:26235 23:761351 24:517170 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.0149253731343 dense_feature:0.52 dense_feature:0.1 dense_feature:6.25153125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.3 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.1 1:230803 2:24784 3:519411 4:19103 5:843054 6:948614 7:529143 8:67180 9:502607 10:708281 11:430027 12:142351 13:529101 14:202774 15:618316 16:850339 17:644532 18:95370 19:880474 20:31181 21:121620 22:26235 23:744389 24:18340 25:269955 26:683431
+click:0 dense_feature:0.0 dense_feature:0.0480928689884 dense_feature:0.12 dense_feature:0.22 dense_feature:0.541703125 dense_feature:1.062 dense_feature:0.01 dense_feature:0.24 dense_feature:0.054 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:737395 2:378661 3:21539 4:552097 5:286835 6:553107 7:512138 8:67180 9:27346 10:708281 11:91094 12:516991 13:150114 14:873363 15:450569 16:353024 17:228085 18:539379 19:26230 20:26229 21:410733 22:26235 23:700326 24:272703 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.016583747927 dense_feature:0.06 dense_feature:0.0 dense_feature:0.209625 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.09 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:750131 3:807749 4:905739 5:286835 6:906818 7:11935 8:67180 9:27346 10:708281 11:505199 12:285350 13:724106 14:255651 15:625913 16:511836 17:644532 18:102288 19:26230 20:26229 21:726818 22:179327 23:744389 24:176417 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.05 dense_feature:0.14 dense_feature:0.226703125 dense_feature:0.12 dense_feature:0.05 dense_feature:0.14 dense_feature:0.112 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.14 1:736218 2:690313 3:757279 4:763330 5:286835 6:553107 7:89560 8:642551 9:27346 10:128328 11:281593 12:246510 13:200341 14:255651 15:899145 16:807138 17:342789 18:659853 19:26230 20:26229 21:399608 22:26235 23:669531 24:787115 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.3 dense_feature:0.2 dense_feature:0.021296875 dense_feature:0.83 dense_feature:0.2 dense_feature:0.56 dense_feature:1.122 dense_feature:0.0 dense_feature:0.5 dense_feature:0.0 dense_feature:0.2 1:715353 2:283434 3:523722 4:590869 5:286835 6:948614 7:25472 8:67180 9:27346 10:340404 11:811342 12:679454 13:897590 14:813514 15:578769 16:962576 17:342789 18:267210 19:310188 20:537425 21:746185 22:179327 23:761351 24:416923 25:253255 26:249672
+click:1 dense_feature:0.05 dense_feature:0.0149253731343 dense_feature:0.03 dense_feature:0.24 dense_feature:0.0 dense_feature:0.008 dense_feature:0.4 dense_feature:0.62 dense_feature:0.82 dense_feature:0.1 dense_feature:1.4 dense_feature:0.0 dense_feature:0.08 1:715353 2:532829 3:716475 4:940968 5:286835 6:948614 7:38171 8:67180 9:27346 10:619455 11:515541 12:779426 13:711791 14:255651 15:881750 16:408550 17:342789 18:612540 19:26230 20:26229 21:23444 22:26235 23:410878 24:88425 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.11 dense_feature:0.08 dense_feature:0.135265625 dense_feature:0.426 dense_feature:0.06 dense_feature:0.06 dense_feature:0.42 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.08 1:737395 2:817085 3:506158 4:48876 5:286835 6:948614 7:95506 8:67180 9:27346 10:75825 11:220591 12:613471 13:159874 14:255651 15:121379 16:889290 17:681378 18:532453 19:880474 20:537425 21:717912 22:26235 23:270873 24:450199 25:884722 26:382723
+click:0 dense_feature:0.0 dense_feature:0.0829187396352 dense_feature:0.0 dense_feature:0.0 dense_feature:0.555859375 dense_feature:0.318 dense_feature:0.03 dense_feature:0.0 dense_feature:0.02 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:715353 2:465222 3:974451 4:892661 5:938478 6:948614 7:651987 8:67180 9:27346 10:708281 11:229311 12:545057 13:875629 14:149134 15:393524 16:213237 17:681378 18:540092 19:26230 20:26229 21:483290 22:26235 23:700326 24:946673 25:26224 26:26223
+click:1 dense_feature:0.05 dense_feature:0.854063018242 dense_feature:0.01 dense_feature:0.04 dense_feature:0.000171875 dense_feature:0.004 dense_feature:0.01 dense_feature:0.04 dense_feature:0.004 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:737395 2:99294 3:681584 4:398205 5:914075 6:906818 7:620358 8:67180 9:27346 10:147441 11:364583 12:535262 13:516341 14:813514 15:281303 16:714384 17:276641 18:443922 19:26230 20:26229 21:948746 22:26235 23:700326 24:928903 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.45190625 dense_feature:0.048 dense_feature:0.01 dense_feature:0.16 dense_feature:0.044 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:792512 3:676584 4:995262 5:938478 6:906818 7:888723 8:67180 9:27346 10:708281 11:310529 12:951172 13:885793 14:873363 15:62698 16:672021 17:276641 18:11502 19:880474 20:984402 21:501083 22:191474 23:744389 24:398029 25:218743 26:991064
+click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.51 dense_feature:0.0 dense_feature:0.2689375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.02 dense_feature:0.006 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:230803 2:239052 3:323170 4:474182 5:140103 6:553107 7:757837 8:524745 9:27346 10:743444 11:883533 12:123023 13:621127 14:255651 15:570872 16:883618 17:924903 18:984920 19:964183 20:984402 21:260134 22:179327 23:410878 24:787860 25:269955 26:949924
+click:0 dense_feature:0.0 dense_feature:0.273631840796 dense_feature:0.0 dense_feature:0.0 dense_feature:0.066453125 dense_feature:0.052 dense_feature:0.04 dense_feature:0.06 dense_feature:0.01 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:531472 3:747313 4:362684 5:843054 6:553107 7:863980 8:718499 9:27346 10:881217 11:371751 12:168971 13:290788 14:202774 15:316669 16:269663 17:342789 18:136775 19:26230 20:26229 21:76865 22:26235 23:761351 24:441421 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.116086235489 dense_feature:0.43 dense_feature:0.36 dense_feature:0.000953125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.36 dense_feature:0.036 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.36 1:737395 2:24784 3:677469 4:820784 5:286835 6:553107 7:715520 8:718499 9:27346 10:708281 11:670424 12:122926 13:724619 14:873363 15:845517 16:488791 17:644532 18:183573 19:880474 20:31181 21:46761 22:26235 23:700326 24:629361 25:269955 26:862373
+click:0 dense_feature:2.55 dense_feature:0.0348258706468 dense_feature:0.01 dense_feature:0.38 dense_feature:0.001453125 dense_feature:0.046 dense_feature:1.11 dense_feature:0.44 dense_feature:2.312 dense_feature:0.2 dense_feature:1.1 dense_feature:0.0 dense_feature:0.46 1:594517 2:194636 3:496284 4:323209 5:286835 6:553107 7:259696 8:760861 9:27346 10:698046 11:478868 12:576074 13:635369 14:201966 15:926692 16:972906 17:342789 18:409802 19:26230 20:26229 21:395694 22:26235 23:410878 24:844671 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.144278606965 dense_feature:0.43 dense_feature:0.22 dense_feature:0.00309375 dense_feature:0.15 dense_feature:0.14 dense_feature:0.54 dense_feature:0.152 dense_feature:0.0 dense_feature:0.2 dense_feature:0.1 dense_feature:0.22 1:737395 2:239052 3:456744 4:736474 5:286835 6:948614 7:13277 8:67180 9:27346 10:958384 11:778183 12:497627 13:136915 14:201966 15:757961 16:747483 17:228085 18:984920 19:905920 20:537425 21:472149 22:179327 23:410878 24:709155 25:269955 26:618673
+click:0 dense_feature:0.0 dense_feature:0.0132669983416 dense_feature:0.4 dense_feature:0.3 dense_feature:0.36440625 dense_feature:1.492 dense_feature:0.07 dense_feature:0.3 dense_feature:1.048 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.3 1:737395 2:19959 3:661391 4:748753 5:286835 6:948614 7:848540 8:67180 9:27346 10:708281 11:703964 12:72024 13:336272 14:255651 15:835686 16:703858 17:342789 18:274368 19:26230 20:26229 21:765452 22:26235 23:700326 24:815200 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.0116086235489 dense_feature:0.01 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:210127 2:662691 3:334228 4:857003 5:286835 6:25207 7:280499 8:67180 9:502607 10:708281 11:195094 12:870026 13:783566 14:873363 15:139595 16:214259 17:555571 18:208248 19:880474 20:984402 21:471770 22:26235 23:744389 24:507551 25:383787 26:797121
+click:1 dense_feature:0.0 dense_feature:0.0348258706468 dense_feature:0.03 dense_feature:0.02 dense_feature:0.066140625 dense_feature:0.006 dense_feature:0.17 dense_feature:0.02 dense_feature:0.236 dense_feature:0.0 dense_feature:0.5 dense_feature:0.0 dense_feature:0.02 1:230803 2:999497 3:25361 4:892267 5:286835 6:906818 7:356528 8:67180 9:27346 10:5856 11:157692 12:554754 13:442501 14:255651 15:896230 16:248781 17:342789 18:820094 19:905920 20:984402 21:916436 22:26235 23:669531 24:26284 25:884722 26:187951
+click:0 dense_feature:0.0 dense_feature:4.62852404643 dense_feature:0.07 dense_feature:0.0 dense_feature:0.022671875 dense_feature:0.0 dense_feature:0.01 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:624252 2:344887 3:238747 4:308366 5:286835 6:553107 7:69291 8:67180 9:27346 10:781054 11:258240 12:546906 13:772337 14:873363 15:807640 16:525695 17:276641 18:613203 19:438655 20:984402 21:415123 22:191474 23:700326 24:729290 25:218743 26:953507
+click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.06 dense_feature:0.02 dense_feature:0.06878125 dense_feature:0.044 dense_feature:0.01 dense_feature:0.22 dense_feature:0.044 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:737395 2:7753 3:871178 4:183530 5:286835 6:906818 7:273988 8:507110 9:27346 10:708281 11:942072 12:775997 13:612590 14:873363 15:669921 16:639940 17:681378 18:421122 19:880474 20:984402 21:410471 22:26235 23:410878 24:228420 25:269955 26:616000
+click:0 dense_feature:0.0 dense_feature:0.212271973466 dense_feature:0.02 dense_feature:0.28 dense_feature:0.113421875 dense_feature:0.06 dense_feature:0.02 dense_feature:0.28 dense_feature:0.194 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.28 1:210127 2:228963 3:692240 4:389834 5:938478 6:948614 7:125690 8:507110 9:27346 10:708281 11:549232 12:308284 13:262461 14:255651 15:629185 16:280660 17:276641 18:886164 19:26230 20:26229 21:367919 22:191474 23:700326 24:520083 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:1.01658374793 dense_feature:0.01 dense_feature:0.02 dense_feature:0.11759375 dense_feature:0.08 dense_feature:0.02 dense_feature:0.02 dense_feature:0.024 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:230803 2:7753 3:194720 4:831884 5:286835 6:553107 7:620358 8:67180 9:27346 10:843010 11:424144 12:615986 13:516341 14:813514 15:782575 16:775856 17:342789 18:421122 19:880474 20:984402 21:110090 22:191474 23:700326 24:784174 25:269955 26:101161
+click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.59 dense_feature:0.06 dense_feature:0.04321875 dense_feature:0.192 dense_feature:0.02 dense_feature:0.08 dense_feature:0.014 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.06 1:230803 2:532829 3:26258 4:853241 5:938478 6:948614 7:877607 8:67180 9:27346 10:613723 11:246387 12:538673 13:377975 14:873363 15:659013 16:601478 17:681378 18:199271 19:26230 20:26229 21:300137 22:26235 23:410878 24:372458 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.06135986733 dense_feature:0.0 dense_feature:0.0 dense_feature:0.294671875 dense_feature:0.212 dense_feature:0.26 dense_feature:0.0 dense_feature:0.034 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:154478 3:982044 4:501457 5:819883 6:906818 7:445051 8:67180 9:27346 10:976970 11:783630 12:609883 13:358461 14:15048 15:409791 16:756307 17:342789 18:480228 19:26230 20:26229 21:845147 22:26235 23:669531 24:124290 25:26224 26:26223
+click:1 dense_feature:0.05 dense_feature:0.537313432836 dense_feature:0.0 dense_feature:0.02 dense_feature:0.018578125 dense_feature:0.016 dense_feature:0.16 dense_feature:0.22 dense_feature:0.192 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.02 1:737395 2:194636 3:274597 4:418981 5:286835 6:553107 7:553528 8:67180 9:27346 10:901359 11:110700 12:108037 13:915461 14:255651 15:951604 16:421384 17:342789 18:728110 19:26230 20:26229 21:772733 22:191474 23:761351 24:844671 25:26224 26:26223
+click:0 dense_feature:0.1 dense_feature:0.00663349917081 dense_feature:0.16 dense_feature:0.26 dense_feature:0.00509375 dense_feature:0.122 dense_feature:0.03 dense_feature:0.94 dense_feature:0.526 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:1.1 1:210127 2:344887 3:343793 4:917598 5:286835 6:948614 7:220413 8:67180 9:27346 10:912799 11:370606 12:722621 13:569604 14:255651 15:499545 16:159495 17:342789 18:613203 19:305384 20:984402 21:844602 22:26235 23:410878 24:695516 25:218743 26:729263
+click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.09 dense_feature:0.16 dense_feature:0.11221875 dense_feature:0.51 dense_feature:0.09 dense_feature:0.48 dense_feature:0.088 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.16 1:737395 2:532829 3:579624 4:980109 5:286835 6:948614 7:927736 8:67180 9:27346 10:970644 11:931289 12:377125 13:539272 14:873363 15:555779 16:405069 17:342789 18:701770 19:26230 20:26229 21:201088 22:26235 23:410878 24:113994 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.182421227197 dense_feature:0.01 dense_feature:0.02 dense_feature:0.000109375 dense_feature:0.978 dense_feature:0.01 dense_feature:0.02 dense_feature:0.062 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:478318 2:158107 3:508317 4:452336 5:286835 6:948614 7:620358 8:67180 9:27346 10:147441 11:364583 12:34025 13:516341 14:873363 15:502825 16:683439 17:681378 18:889198 19:26230 20:26229 21:234451 22:26235 23:700326 24:256238 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.469320066335 dense_feature:0.2 dense_feature:0.2 dense_feature:0.0705 dense_feature:0.102 dense_feature:0.05 dense_feature:0.22 dense_feature:0.194 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.2 1:715353 2:846239 3:573061 4:508181 5:286835 6:553107 7:892443 8:718499 9:27346 10:639370 11:866496 12:791636 13:895012 14:873363 15:362079 16:16082 17:228085 18:994402 19:880474 20:984402 21:35513 22:26235 23:669531 24:520197 25:934391 26:625657
+click:0 dense_feature:0.0 dense_feature:0.0729684908789 dense_feature:0.06 dense_feature:0.04 dense_feature:5.620296875 dense_feature:0.0 dense_feature:0.0 dense_feature:0.06 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.04 1:399845 2:239052 3:334610 4:593315 5:286835 6:948614 7:751495 8:67180 9:502607 10:111048 11:244081 12:115252 13:915518 14:873363 15:817451 16:296052 17:276641 18:984920 19:774721 20:984402 21:930636 22:26235 23:700326 24:975048 25:269955 26:266439
+click:1 dense_feature:0.05 dense_feature:0.0265339966833 dense_feature:0.07 dense_feature:0.22 dense_feature:1.5625e-05 dense_feature:0.008 dense_feature:0.04 dense_feature:0.36 dense_feature:0.088 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.08 1:737395 2:64837 3:534435 4:555449 5:286835 6:25207 7:661236 8:67180 9:27346 10:708281 11:785752 12:47348 13:524553 14:117289 15:776971 16:293528 17:681378 18:102169 19:758208 20:31181 21:27506 22:26235 23:410878 24:787115 25:884722 26:605635
+click:1 dense_feature:0.1 dense_feature:0.0464344941957 dense_feature:0.0 dense_feature:0.04 dense_feature:0.00059375 dense_feature:0.004 dense_feature:0.02 dense_feature:0.04 dense_feature:0.004 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:230803 2:7753 3:529866 4:437169 5:938478 6:948614 7:17274 8:67180 9:27346 10:461781 11:452641 12:302471 13:49621 14:873363 15:543432 16:858509 17:681378 18:402164 19:880474 20:984402 21:650184 22:191474 23:410878 24:492581 25:269955 26:217228
+click:0 dense_feature:0.55 dense_feature:0.00829187396352 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0014375 dense_feature:0.004 dense_feature:0.36 dense_feature:0.0 dense_feature:0.042 dense_feature:0.1 dense_feature:0.4 dense_feature:0.0 dense_feature:0.0 1:26973 2:817085 3:961160 4:355882 5:843054 6:906818 7:417593 8:67180 9:27346 10:708281 11:402889 12:899379 13:552051 14:202774 15:532679 16:545549 17:342789 18:562805 19:880474 20:31181 21:355920 22:26235 23:700326 24:787115 25:884722 26:115004
+click:1 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.01 dense_feature:0.02 dense_feature:0.089296875 dense_feature:0.362 dense_feature:0.23 dense_feature:0.04 dense_feature:0.338 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.02 1:230803 2:977337 3:853759 4:880273 5:515218 6:25207 7:414263 8:437731 9:27346 10:205124 11:108170 12:676869 13:388798 14:255651 15:247232 16:172895 17:228085 18:543219 19:26230 20:26229 21:860937 22:179327 23:669531 24:959959 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.0945273631841 dense_feature:0.62 dense_feature:0.24 dense_feature:0.11840625 dense_feature:0.368 dense_feature:0.07 dense_feature:0.24 dense_feature:0.144 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.48 1:737395 2:532829 3:805087 4:186661 5:286835 6:154084 7:468059 8:718499 9:27346 10:708281 11:968875 12:8177 13:47822 14:255651 15:979316 16:956543 17:342789 18:541633 19:26230 20:26229 21:646669 22:26235 23:410878 24:184909 25:26224 26:26223
+click:0 dense_feature:0.3 dense_feature:0.00497512437811 dense_feature:0.12 dense_feature:0.12 dense_feature:0.002890625 dense_feature:0.074 dense_feature:0.06 dense_feature:0.14 dense_feature:0.074 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.74 1:737395 2:64837 3:967865 4:249418 5:938478 6:948614 7:228716 8:67180 9:27346 10:627362 11:722606 12:193782 13:348283 14:255651 15:928582 16:221557 17:342789 18:895034 19:384556 20:984402 21:475712 22:26235 23:410878 24:492875 25:884722 26:468964
+click:0 dense_feature:0.0 dense_feature:0.177446102819 dense_feature:0.01 dense_feature:0.02 dense_feature:0.041859375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.16 dense_feature:0.036 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.02 1:154064 2:834620 3:25206 4:25205 5:938478 6:948614 7:134101 8:92608 9:27346 10:708281 11:505199 12:25711 13:724106 14:671506 15:42927 16:25723 17:644532 18:1957 19:26230 20:26229 21:26236 22:26235 23:744389 24:26233 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:5.61691542289 dense_feature:0.0 dense_feature:0.1 dense_feature:0.043796875 dense_feature:0.302 dense_feature:0.13 dense_feature:0.22 dense_feature:0.3 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:154184 2:19359 3:166075 4:381832 5:286835 6:906818 7:348227 8:49466 9:27346 10:645596 11:951584 12:524128 13:277250 14:255651 15:853732 16:392894 17:342789 18:619939 19:26230 20:26229 21:349978 22:26235 23:700326 24:517170 25:26224 26:26223
+click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.093234375 dense_feature:0.022 dense_feature:0.04 dense_feature:0.02 dense_feature:0.02 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.0 1:715353 2:485136 3:386313 4:208181 5:286835 6:25207 7:227715 8:49466 9:27346 10:437476 11:733250 12:721260 13:389832 14:255651 15:47178 16:761962 17:342789 18:813169 19:26230 20:26229 21:464938 22:26235 23:410878 24:833196 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.134328358209 dense_feature:0.0 dense_feature:0.14 dense_feature:0.00015625 dense_feature:0.0 dense_feature:0.0 dense_feature:0.14 dense_feature:0.014 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.14 1:737395 2:488655 3:221719 4:442408 5:286835 6:25207 7:898902 8:718499 9:27346 10:457066 11:290973 12:533168 13:949027 14:873363 15:270294 16:934635 17:924903 18:763017 19:880474 20:31181 21:517486 22:26235 23:410878 24:588215 25:499868 26:980179
+click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.023578125 dense_feature:0.0 dense_feature:0.04 dense_feature:0.0 dense_feature:0.046 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.0 1:737395 2:729012 3:691820 4:351286 5:938478 6:553107 7:21150 8:67180 9:27346 10:947459 11:164508 12:205079 13:882348 14:255651 15:178324 16:282716 17:342789 18:193902 19:880474 20:31181 21:604480 22:191474 23:669531 24:727223 25:499868 26:236426
+click:1 dense_feature:0.1 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.00859375 dense_feature:0.006 dense_feature:1.55 dense_feature:0.16 dense_feature:0.06 dense_feature:0.2 dense_feature:1.6 dense_feature:0.0 dense_feature:0.0 1:712372 2:235347 3:483718 4:382039 5:914075 6:906818 7:727609 8:154004 9:27346 10:116648 11:40711 12:658199 13:361076 14:15048 15:15058 16:644988 17:342789 18:544170 19:26230 20:26229 21:251535 22:26235 23:700326 24:114111 25:26224 26:26223
+click:1 dense_feature:0.25 dense_feature:0.192371475954 dense_feature:0.06 dense_feature:0.36 dense_feature:0.0 dense_feature:0.02 dense_feature:0.09 dense_feature:0.42 dense_feature:0.042 dense_feature:0.2 dense_feature:0.3 dense_feature:0.3 dense_feature:0.0 1:737395 2:288975 3:885137 4:368487 5:515218 6:906818 7:569753 8:799133 9:27346 10:635043 11:883202 12:780104 13:492605 14:873363 15:234451 16:94894 17:796504 18:653705 19:880474 20:984402 21:400692 22:26235 23:410878 24:767424 25:934391 26:958132
+click:1 dense_feature:0.15 dense_feature:0.0398009950249 dense_feature:0.02 dense_feature:0.04 dense_feature:1.5625e-05 dense_feature:0.0 dense_feature:0.06 dense_feature:0.04 dense_feature:0.026 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.0 1:715353 2:532829 3:721632 4:377785 5:286835 6:553107 7:959856 8:718499 9:27346 10:737746 11:432444 12:706936 13:169268 14:873363 15:896219 16:461005 17:342789 18:286597 19:26230 20:26229 21:602049 22:26235 23:700326 24:510447 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.05 dense_feature:0.08 dense_feature:0.155421875 dense_feature:0.55 dense_feature:0.08 dense_feature:0.24 dense_feature:1.73 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.08 1:737395 2:288975 3:385122 4:57409 5:286835 6:25207 7:339181 8:67180 9:27346 10:284863 11:531306 12:229544 13:32168 14:117289 15:632422 16:615549 17:342789 18:240865 19:880474 20:984402 21:253725 22:26235 23:410878 24:837371 25:934391 26:948190
+click:0 dense_feature:0.0 dense_feature:0.0398009950249 dense_feature:0.06 dense_feature:0.12 dense_feature:0.11359375 dense_feature:0.55 dense_feature:0.03 dense_feature:0.12 dense_feature:0.186 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.12 1:737395 2:158107 3:738359 4:343895 5:286835 6:948614 7:513189 8:760861 9:27346 10:741641 11:214926 12:142871 13:753229 14:873363 15:502825 16:864586 17:681378 18:889198 19:26230 20:26229 21:368414 22:191474 23:410878 24:256238 25:26224 26:26223
+click:1 dense_feature:0.25 dense_feature:0.00663349917081 dense_feature:0.03 dense_feature:0.04 dense_feature:7.8125e-05 dense_feature:0.0 dense_feature:0.48 dense_feature:0.06 dense_feature:0.004 dense_feature:0.2 dense_feature:1.3 dense_feature:0.0 dense_feature:0.0 1:737395 2:414770 3:100889 4:981572 5:286835 6:446063 7:600430 8:507110 9:27346 10:566014 11:40711 12:330691 13:361076 14:15048 15:176957 16:759140 17:342789 18:212244 19:26230 20:26229 21:688637 22:26235 23:634287 24:762432 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.04 dense_feature:0.02 dense_feature:0.109765625 dense_feature:0.202 dense_feature:0.13 dense_feature:0.02 dense_feature:0.078 dense_feature:0.0 dense_feature:0.1 dense_feature:0.1 dense_feature:0.02 1:737395 2:7753 3:871178 4:183530 5:286835 6:948614 7:358953 8:718499 9:27346 10:837400 11:432444 12:775997 13:169268 14:255651 15:250644 16:639940 17:342789 18:421122 19:880474 20:984402 21:410471 22:26235 23:410878 24:228420 25:269955 26:870795
+click:0 dense_feature:0.05 dense_feature:0.162520729685 dense_feature:0.28 dense_feature:0.16 dense_feature:0.001046875 dense_feature:0.028 dense_feature:1.03 dense_feature:0.84 dense_feature:0.534 dense_feature:0.1 dense_feature:2.3 dense_feature:0.0 dense_feature:0.28 1:737395 2:334074 3:108983 4:898979 5:286835 6:948614 7:600430 8:718499 9:27346 10:668726 11:40711 12:62821 13:361076 14:202774 15:722413 16:688170 17:342789 18:746785 19:957809 20:984402 21:96056 22:191474 23:410878 24:703372 25:129305 26:591537
+click:0 dense_feature:0.2 dense_feature:0.0945273631841 dense_feature:0.02 dense_feature:0.18 dense_feature:0.021078125 dense_feature:0.046 dense_feature:0.52 dense_feature:0.44 dense_feature:0.18 dense_feature:0.1 dense_feature:0.8 dense_feature:0.0 dense_feature:0.22 1:663372 2:532829 3:714247 4:673800 5:286835 6:906818 7:219517 8:67180 9:27346 10:161916 11:40711 12:441505 13:361076 14:255651 15:992961 16:137571 17:796504 18:395194 19:26230 20:26229 21:800938 22:179327 23:410878 24:719782 25:26224 26:26223
+click:1 dense_feature:0.15 dense_feature:0.24543946932 dense_feature:0.0 dense_feature:0.12 dense_feature:0.0001875 dense_feature:0.004 dense_feature:0.08 dense_feature:0.12 dense_feature:0.072 dense_feature:0.1 dense_feature:0.4 dense_feature:0.0 dense_feature:0.04 1:663372 2:70321 3:202829 4:415480 5:286835 6:553107 7:32934 8:67180 9:27346 10:1873 11:699999 12:55775 13:371214 14:873363 15:685332 16:719499 17:342789 18:135819 19:26230 20:26229 21:973542 22:852086 23:410878 24:635223 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.0679933665008 dense_feature:0.02 dense_feature:0.02 dense_feature:0.20015625 dense_feature:0.016 dense_feature:0.03 dense_feature:0.02 dense_feature:0.014 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:737395 2:229199 3:956202 4:475901 5:286835 6:948614 7:614385 8:718499 9:27346 10:171202 11:670646 12:566018 13:386065 14:873363 15:936716 16:825279 17:681378 18:758631 19:26230 20:26229 21:113534 22:26235 23:410878 24:551443 25:26224 26:26223
+click:1 dense_feature:0.05 dense_feature:0.00497512437811 dense_feature:0.04 dense_feature:0.22 dense_feature:0.015921875 dense_feature:0.022 dense_feature:0.04 dense_feature:0.4 dense_feature:0.182 dense_feature:0.1 dense_feature:0.2 dense_feature:0.0 dense_feature:0.22 1:737395 2:64837 3:751736 4:291977 5:286835 6:25207 7:377931 8:718499 9:27346 10:724396 11:433484 12:517940 13:439712 14:201966 15:628624 16:780717 17:342789 18:895034 19:880474 20:31181 21:463725 22:26235 23:410878 24:787115 25:884722 26:164940
+click:1 dense_feature:0.0 dense_feature:0.00995024875622 dense_feature:0.15 dense_feature:0.48 dense_feature:0.051375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.06 dense_feature:0.556 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.5 1:737395 2:532829 3:158777 4:112926 5:286835 6:948614 7:764249 8:67180 9:27346 10:795273 11:330644 12:524443 13:78129 14:873363 15:127209 16:146094 17:342789 18:976129 19:26230 20:26229 21:901094 22:26235 23:410878 24:259263 25:26224 26:26223
+click:1 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:1.75 dense_feature:0.0 dense_feature:0.922828125 dense_feature:1.078 dense_feature:0.0 dense_feature:0.0 dense_feature:0.112 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:26973 2:62956 3:428206 4:935291 5:286835 6:446063 7:360307 8:437731 9:502607 10:957425 11:626052 12:641189 13:988058 14:217110 15:637914 16:293992 17:342789 18:832710 19:774721 20:537425 21:516798 22:191474 23:700326 24:204648 25:884722 26:776972
+click:1 dense_feature:1.95 dense_feature:0.00829187396352 dense_feature:0.08 dense_feature:0.1 dense_feature:0.01878125 dense_feature:0.044 dense_feature:0.42 dense_feature:0.24 dense_feature:0.358 dense_feature:0.1 dense_feature:0.2 dense_feature:0.1 dense_feature:0.26 1:737395 2:638265 3:526671 4:362576 5:938478 6:948614 7:999918 8:67180 9:27346 10:806276 11:181589 12:688684 13:367155 14:255651 15:709602 16:386859 17:228085 18:204112 19:668832 20:537425 21:541553 22:191474 23:410878 24:606704 25:49230 26:68113
+click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.38159375 dense_feature:0.022 dense_feature:0.18 dense_feature:0.0 dense_feature:0.016 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:841163 3:284187 4:385559 5:286835 6:446063 7:311604 8:67180 9:27346 10:38910 11:76230 12:520869 13:429321 14:255651 15:296507 16:542357 17:342789 18:377250 19:880474 20:31181 21:325494 22:26235 23:410878 24:26284 25:499868 26:467348
+click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.08 dense_feature:0.0 dense_feature:0.077125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:238813 3:821667 4:209184 5:286835 6:906818 7:261420 8:67180 9:27346 10:748867 11:277196 12:790086 13:495408 14:873363 15:572266 16:281532 17:342789 18:99340 19:880474 20:537425 21:815896 22:26235 23:669531 24:17430 25:734238 26:251811
+click:0 dense_feature:0.0 dense_feature:0.210613598673 dense_feature:0.01 dense_feature:0.0 dense_feature:0.041375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.08 dense_feature:0.026 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:532829 3:559456 4:565823 5:286835 6:948614 7:48897 8:67180 9:27346 10:708281 11:214000 12:431427 13:477774 14:873363 15:637383 16:678446 17:276641 18:849284 19:26230 20:26229 21:758879 22:26235 23:410878 24:399458 25:26224 26:26223
+click:1 dense_feature:0.2 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.00440625 dense_feature:0.036 dense_feature:0.04 dense_feature:0.3 dense_feature:0.03 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:715353 2:532829 3:967094 4:707735 5:286835 6:948614 7:555710 8:154004 9:27346 10:708281 11:514992 12:158604 13:780149 14:255651 15:285282 16:149708 17:342789 18:553067 19:26230 20:26229 21:229985 22:26235 23:700326 24:777746 25:26224 26:26223
+click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.23178125 dense_feature:0.222 dense_feature:0.06 dense_feature:0.0 dense_feature:0.408 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.0 1:715353 2:227084 3:456811 4:828682 5:286835 6:948614 7:406567 8:67180 9:27346 10:66123 11:598531 12:527138 13:731439 14:813514 15:35257 16:43339 17:342789 18:918487 19:26230 20:26229 21:580653 22:26235 23:410878 24:495283 25:26224 26:26223
+click:0 dense_feature:0.15 dense_feature:0.462686567164 dense_feature:0.08 dense_feature:0.22 dense_feature:0.00015625 dense_feature:0.022 dense_feature:0.03 dense_feature:0.52 dense_feature:0.022 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:576931 2:99294 3:263211 4:501662 5:938478 6:154084 7:128918 8:67180 9:27346 10:912799 11:801006 12:506258 13:378182 14:201966 15:150934 16:240427 17:681378 18:393279 19:26230 20:26229 21:152038 22:26235 23:700326 24:551443 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.181484375 dense_feature:0.06 dense_feature:0.01 dense_feature:0.0 dense_feature:0.056 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:230803 2:283434 3:367596 4:197992 5:938478 6:948614 7:268098 8:67180 9:27346 10:870993 11:632267 12:139817 13:718764 14:255651 15:884839 16:80117 17:276641 18:556463 19:880474 20:537425 21:271358 22:26235 23:410878 24:488077 25:253255 26:584828
+click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.0 dense_feature:0.16 dense_feature:4.790078125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.28 dense_feature:0.016 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.2 1:737395 2:532829 3:158777 4:112926 5:286835 6:948614 7:277312 8:67180 9:502607 10:708281 11:755513 12:524443 13:4029 14:873363 15:503814 16:146094 17:644532 18:121590 19:26230 20:26229 21:901094 22:191474 23:744389 24:259263 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:3.30845771144 dense_feature:0.0 dense_feature:0.04 dense_feature:0.022671875 dense_feature:0.062 dense_feature:0.01 dense_feature:0.4 dense_feature:0.062 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:663372 2:529436 3:511823 4:942782 5:286835 6:906818 7:190054 8:67180 9:27346 10:708281 11:32527 12:494263 13:652478 14:873363 15:616057 16:17325 17:342789 18:325238 19:26230 20:26229 21:256747 22:179327 23:410878 24:169709 25:26224 26:26223
+click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.01 dense_feature:0.16 dense_feature:0.206765625 dense_feature:0.328 dense_feature:0.13 dense_feature:0.16 dense_feature:0.176 dense_feature:0.0 dense_feature:0.7 dense_feature:0.0 dense_feature:0.16 1:737395 2:552854 3:606082 4:267619 5:286835 6:948614 7:918889 8:67180 9:27346 10:708281 11:400024 12:972010 13:66330 14:255651 15:432931 16:650209 17:506108 18:212910 19:26230 20:26229 21:107726 22:26235 23:410878 24:718419 25:26224 26:26223
diff --git a/models/rank/dnn/data/test/sample_test.txt b/models/rank/dnn/data/test/sample_test.txt
deleted file mode 100755
index 3957a7ff04df61a450a8907d6f60e4f7d1ac2862..0000000000000000000000000000000000000000
--- a/models/rank/dnn/data/test/sample_test.txt
+++ /dev/null
@@ -1,100 +0,0 @@
-0 1 1 26 30 0 4 2 37 152 1 2 2 05db9164 38d50e09 ed5e4936 612ccfd4 25c83c98 38eb9cf4 1f89b562 a73ee510 2462946f 7f8ffe57 1d5d5b6e 46f42a63 b28479f6 7501d6be 6083e1d5 07c540c4 f855e3f0 21ddcdc9 5840adea 782e846e 32c7478e b2f178a3 001f3601 c4304c4b
-0 20 3 4 40479 444 0 1 157 0 4 68fd1e64 09e68b86 aa8c1539 85dd697c 25c83c98 fe6b92e5 e56a4862 5b392875 a73ee510 3b08e48b 5e183c58 d8c29807 1eb0f8f0 8ceecbc8 d2f03b75 c64d548f 07c540c4 63cdbb21 cf99e5de 5840adea 5f957280 55dd3565 1793a828 e8b83407 b7d9c3bc
-0 6 70 1 22 312 25 52 44 144 1 3 1 22 05db9164 04e09220 b1ecc6c4 5dff9b29 4cf72387 7e0ccccf d5f62b87 1f89b562 a73ee510 ce92c282 434d6c13 2436ff75 7301027a 07d13a8f f6b23a53 f4ead43c 3486227d 6fc84bfb 4f1aa25f c9d4222a 55dd3565 ded4aac9
-0 0 0 110 7 3251 44 1 32 39 0 1 31 05db9164 80e26c9b ba1947d0 85dd697c 25c83c98 85f287b3 0b153874 a73ee510 89270478 7c53dc69 34a238e0 4fd35e8f 1adce6ef 0f942372 da441c7e d4bb7bd8 005c6740 21ddcdc9 5840adea 8717ea07 423fab69 1793a828 e8b83407 9904c656
-0 0 29 19490 0 68fd1e64 287130e0 ba4559ea 33a72095 25c83c98 fbad5c96 ffdbd799 5b392875 a73ee510 60badee3 c72ca7a4 ebfb225c b9be5035 cfef1c29 655fad18 a9dcda12 d4bb7bd8 891589e7 419b4cef 5840adea 76ef8858 32c7478e 135c8b41 ea9a246c e3a60438
-0 2 2 20 43197 0 26 23 0 25 05db9164 9b6b8959 9c1c85e7 fd4d6dc3 25c83c98 7e0ccccf d2d741ca 0b153874 a73ee510 4e2d1b78 ea4adb47 cc239583 05781932 64c94865 de781d57 efd92064 e5ba7672 cac48684 4b0ac19f c9d4222a 3a171ecb 22dd4e42
-1 5 12 871 0 27 1 21 1 4 0 05db9164 e112a9de 29bb7bea d3e15e1a 25c83c98 7e0ccccf fd3483f3 0b153874 a73ee510 880e2781 9d7e66c3 bd5829ab df957573 07d13a8f 290e3042 390b7737 8efede7f 808e7bc3 af16dda0 ad3062eb 423fab69 a0ab2ce0
-0 6 263 41 53 0 44 42 0 42 05db9164 71ca0a25 ad876a43 0481f0ba 4cf72387 7e0ccccf bb0f47fb 5b392875 a73ee510 3b08e48b da3f45ff fde18531 a9fda8f5 07d13a8f a8e0f0c6 06f4ae56 776ce399 9bf8ffef 21ddcdc9 5840adea f5f07930 be7c41b4 62aa24c6 001f3601 1d5d3a57
-0 14 4301 48 2 3 51 2 68fd1e64 95e2d337 95c48c52 30b862e7 25c83c98 7e0ccccf b06857f8 0b153874 a73ee510 8228dde1 e4eb05d4 f0d5cc59 a4c5d6dd 1adce6ef 559cd202 e9194f3c 07c540c4 7b06fafe 21ddcdc9 a458ea53 cb105f80 423fab69 16bb3de8 2bf691b1 d3b2f8c3
-0 58 42 39 100 0 40 40 0 40 05db9164 207b2d81 25c37040 e8b2aee5 25c83c98 fe6b92e5 6e6e841b 1f89b562 a73ee510 3b08e48b dcc0e16b a04dc78a b093e98d b28479f6 c6438ddb 31da84fc 776ce399 fa0643ee 21ddcdc9 b1252a9d 931d653d c9d4222a be7c41b4 46f5e7df 001f3601 0e25d9c4
-0 0 2 3 46065 0 5 9 0 0 3 68fd1e64 b961056b 05eefcc3 65e58ae6 25c83c98 fbad5c96 68fbb662 0b153874 7cc72ec2 4aead435 922bbb91 10239ea6 ad61640d 1adce6ef 8187184a 551eb463 e5ba7672 5a6878f5 00018438 32c7478e 71292dbb
-1 1 0 224 0 4 0 3 4 27 1 2 0 05db9164 09e68b86 aa8c1539 85dd697c 25c83c98 7e0ccccf a4a8fd5a 0b153874 a73ee510 43a9b300 d13e1160 d8c29807 45820f61 b28479f6 2d49999f c64d548f e5ba7672 63cdbb21 cf99e5de 5840adea 5f957280 bcdee96c 1793a828 e8b83407 b7d9c3bc
-1 10 310 6 5 3 75 4 702 2 21 3 68fd1e64 3f0d3f28 4cf72387 7e0ccccf a097ff18 062b5529 a73ee510 ae07e31d 3407cf7b f0fe287d 1adce6ef 14108df6 27c07bd6 88416823 ad3062eb 3a171ecb
-0 0 0 19 2898 145 4 20 370 0 2 43 05db9164 38a947a1 0797f900 1da94763 25c83c98 fbad5c96 ba0ca6c5 64523cfa a73ee510 56ae5fb0 7ca01a9d c8ea9acc 97d749c9 1adce6ef a3dc522e d1079e54 e5ba7672 492bb129 828187a0 32c7478e 171ccf3e
-0 16 4 2 46248 0 2 49 0 2 05db9164 942f9a8d feafff7d d7b693da 25c83c98 7e0ccccf d9aa9d97 5b392875 7cc72ec2 3b08e48b c4adf918 4ebd8ffe 85dbe138 b28479f6 ac182643 48292aa0 776ce399 1f868fdd 21ddcdc9 b1252a9d be7cac53 32c7478e e3edc57b 9d93af03 7dfad416
-1 66 136 11 12 15 12 963 26 258 3 73 0 12 05db9164 89ddfee8 c314b537 e88cbfb4 4cf72387 7e0ccccf 1c86e0eb 0b153874 a73ee510 e9c971a2 755e4a50 bc8b54c7 5978055e b28479f6 25753fb1 fadc3903 e5ba7672 5bb2ec8e 5b1d6ed9 b1252a9d 8a903c79 32c7478e 7cb5b4d7 e8b83407 ec01bf7b
-0 1 34 29 2 10 2 1 2 2 1 1 2 05db9164 b80912da 7b467545 d0cbe447 0942e0a7 fbad5c96 fc8f52a9 0b153874 a73ee510 3b08e48b ad39ba86 dd94da95 751c7a99 b28479f6 79fcb5cb 169d489d e5ba7672 7119e567 3014a4b1 5840adea 23fcd679 3a171ecb de1e9c76 e8b83407 ce0bf6fc
-1 1 1 22 1 138 7 16 22 114 1 8 0 7 7e5c2ff4 287130e0 67fa93b5 1fa34039 43b19349 13718bbd f828f7fb 0b153874 a73ee510 b883655e ab066900 2eb927aa 5d4198ed 07d13a8f 10040656 6f930046 e5ba7672 891589e7 21ddcdc9 5840adea fce0d6a4 3a171ecb 1793a828 e8b83407 63093459
-0 2 8 4 4 2 4 8 6 55 2 5 0 4 05db9164 b80912da 02391f51 b9c629a9 b0530c50 7e0ccccf fd10f30e 0b153874 a73ee510 bfc44ba9 e3ee9d2e 2397259a 0d60a93e 07d13a8f ee76936d d37efe8c e5ba7672 30f25b5e 21ddcdc9 5840adea b6119319 423fab69 45ab94c8 ce62e669 b13f4ade
-1 15 2 88 27 4 1 21 49 124 1 3 1 5a9ed9b0 4f25e98b aee80afd ae78390d 25c83c98 fbad5c96 f00bddf8 6c41e35e a73ee510 16a81a6c 55795b33 12d1b214 39795005 1adce6ef fb2772ea 121f992b e5ba7672 bc5a0ff7 dfc341f8 a458ea53 b4847d32 32c7478e e7bc1058 001f3601 6b208992
-1 0 8 14551 26 2 0 22 2 0 87552397 80e26c9b 431913c5 85dd697c 25c83c98 fbad5c96 b46e01f1 0b153874 a73ee510 39cda501 7c53dc69 5798519c 4fd35e8f 07d13a8f e8f4b767 2d0bbe92 3486227d 005c6740 21ddcdc9 5840adea 91404954 3a171ecb 1793a828 e8b83407 b9809574
-0 0 12 9 4430 21 2 11 11 1 9 05db9164 333137d9 22fbf56a b92573a3 25c83c98 fe6b92e5 ad9b2639 0b153874 a73ee510 9c4dd39e e4034ebf 878d3428 ea089f5d b28479f6 a46bf7c6 7401a802 07c540c4 c61e82d7 21ddcdc9 a458ea53 634363f7 c9d4222a 32c7478e a2752662 445bbe3b fc1f43e7
-0 21 904 7 30 79 39 87 20 251 2 8 0 39 05db9164 f0cf0024 20009f96 73fec7fb 4cf72387 fbad5c96 a98972ab 0b153874 a73ee510 06363d2d a523f48a 57c08194 5cc21877 b28479f6 fdb1071f 054b386f 3486227d cc693e93 21ddcdc9 b1252a9d 0dd41d11 c9d4222a 32c7478e f9f7eb22 f0f449dd a3a8e8f4
-1 1 1 9 35 5 6 17 10 912 1 9 6 05db9164 09e68b86 21f56260 7cc584ad 89ff5705 fbad5c96 69b885a7 5b392875 a73ee510 b6900243 208d9dd6 252752f5 59dd51b4 07d13a8f 36721ddc e20cfabe e5ba7672 5aed7436 db0b20dc b1252a9d 3572f92c 423fab69 869261fd f0f449dd fb52e815
-0 0 3 47 10 1494 153 6 11 269 0 4 10 5a9ed9b0 39dfaa0d 86d9f7e6 77b5e5ed b2241560 7e0ccccf afa309bd 0b153874 a73ee510 c54560e0 77212bd7 04d776a9 7203f04e 07d13a8f 60fa10e5 465ae0d6 e5ba7672 df4fffb7 21ddcdc9 5840adea 8b9756be c9d4222a c7dc6720 c88bdcee 010f6491 4e7af834
-0 1 0 44 24 4 24 6 43 232 1 4 24 05db9164 c44e8a72 93655629 1b9f91ce 25c83c98 fbad5c96 a25cceac 67b76963 a73ee510 0b16773a 5bee5497 f0f6a9c1 a57cffd3 1adce6ef d6c04afa 6dc8c52c e5ba7672 456d734d 05e4794e a458ea53 dc1b605a bcdee96c 79fc7b8a 724b04da 0cc1543a
-1 18 0 37 20 28 20 18 19 20 1 1 0 20 05db9164 ad61f1c8 b64ac9a3 1df4d824 25c83c98 7e0ccccf ac2d4799 0b153874 a73ee510 da500e68 434d6c13 71d55d49 7301027a b28479f6 3403e98c ed6d847a e5ba7672 84eb7a34 1d0aeb7a ad3062eb c7dc6720 786a0db5
-1 4 88 4 20 14 27 357 31 874 2 41 13 05db9164 0eb070fa e75647d9 50912373 43b19349 7e0ccccf 1c86e0eb 0b153874 a73ee510 e7ba2569 755e4a50 a2337f7c 5978055e 07d13a8f 733cd612 2873175e e5ba7672 7ba9340b b4c77ec9 32c7478e 55cf97a5
-0 0 0 20 9 7441 13 4 9 12 0 1 9 05db9164 46320fff de0cea78 66d81227 25c83c98 604312d3 0b153874 a73ee510 3b08e48b 0f6f1a80 51f94b83 9077501d 07d13a8f 4b572351 3ec13e49 e5ba7672 d981a095 21ddcdc9 5840adea b1bb8218 32c7478e 4f272e57 c9f3bea7 25ae1dcc
-0 1 1 7 2 331 62 2 5 72 1 2 0 2 05db9164 8947f767 59f8a22b 16e92bee 25c83c98 7e0ccccf b471ac4f 1f89b562 a73ee510 4e56c58e e1ba038b 92352c1e e65a5fc3 07d13a8f 2c14c412 57ac7fda e5ba7672 bd17c3da 4b367914 b1252a9d e68624bc 3a171ecb c77fdeda 010f6491 0a798839
-1 25 16 11 11545 56 1 20 51 1 11 05db9164 8f5b4275 b009d929 c7043c4b 5a3e1872 fbad5c96 e76a087f 0b153874 a73ee510 3b08e48b 50349a3f 3563ab62 370eceb9 1adce6ef a6bf53df b688c8cc d4bb7bd8 65c9624a 21ddcdc9 5840adea 2754aaf1 93bad2c0 3b183c5c e8b83407 adb5d234
-0 1 20 3 6 1 2 2 8 8 1 2 2 5a9ed9b0 e5fb1af3 77f9d96e bc87885b 25c83c98 3bf701e7 6772d022 0b153874 a73ee510 9f7517e0 e0c3cae0 4ce8091c e8df3343 1adce6ef 60403b20 8fb0be40 07c540c4 13145934 21ddcdc9 b1252a9d c3f827f4 423fab69 f0f123e9 c243e98b 63ef9236
-0 -1 8 5 11535 32 0 7 13 0 5 5a9ed9b0 f8c8e8f8 74e1a23a 9a6888fb 25c83c98 fe6b92e5 93955fc0 1f89b562 a73ee510 7dab1649 5215184e fb8fab62 b8ae7766 07d13a8f d4696a42 c6b1e1b2 07c540c4 d2f0bce2 21ddcdc9 5840adea 99c09e97 3a171ecb 335a6a1e f55c04b6 68a2a837
-0 0 1 1 1 3755 124 6 8 198 0 3 1 5a9ed9b0 9819deea 533b1a61 f922efad 25c83c98 fe6b92e5 a4bbd4f4 0b153874 a73ee510 3b76bfa9 8d5ad79c b99ddbc8 4809d853 b28479f6 1150f5ed 87acb535 e5ba7672 7e32f7a4 a4b7004c 93bad2c0 b34f3128
-0 0 15 7 1 2652 57 5 40 55 0 1 1 8cf07265 8947f767 37722a24 8802788f 25c83c98 fda1a50f 0b153874 a73ee510 3b08e48b d2b7c44b e3caf087 68637c0d 64c94865 d120f347 42bc62e3 e5ba7672 bd17c3da 21ddcdc9 a458ea53 1891824e 32c7478e b7bf6986 010f6491 a6115607
-0 5 176 1 1 627 61 109 17 118 2 11 1 05db9164 38a947a1 1646cf1d fcdc5174 25c83c98 fe6b92e5 6fa3c1a7 1f89b562 a73ee510 5f50c86b b8deab54 c30bbcd1 efbb2435 07d13a8f 927edf61 ffb61047 e5ba7672 e73433e0 122d6055 423fab69 d8e17d82
-1 108 20 403 0 1 0 109 0 7 1 2 0 05db9164 942f9a8d 871b4299 25dd4760 4cf72387 7e0ccccf d70c05b1 7b6fecd5 a73ee510 7edea927 c4adf918 2f1be242 85dbe138 1adce6ef ae97ecc3 c637ec94 e5ba7672 1f868fdd 2e30f394 a458ea53 140ec002 ad3062eb bcdee96c b50e18f9 001f3601 f99af3bd
-0 15 0 14 10 609 35 29 12 419 1 3 3 10 05db9164 09e68b86 c86b9e6a e4fd0a5b 25c83c98 7e0ccccf a90a99c5 0b153874 a73ee510 e6003298 e9561d8b 906b3727 1cc9ac51 b28479f6 6f73304a a10da4c7 8efede7f 479030a6 7a1c9aad 5840adea c06c3736 32c7478e 41be4766 e8b83407 d8a062c4
-0 8 0 10 12 46 12 8 10 12 1 1 12 05db9164 b7ca2abd ee96fc95 68ad052c 25c83c98 7e0ccccf 968a6688 5b392875 a73ee510 e851ff7b f25fe7e9 ce875433 dd183b4c 64c94865 5f2d5a3a 5f92b84a e5ba7672 4771e483 95b757a6 3a171ecb 41be4766
-0 0 5 6 2 3021 151 6 10 18 0 1 2 be589b51 207b2d81 d0484442 68637816 25c83c98 7e0ccccf 12c61956 45f7c2dd a73ee510 29e50671 94d2aad8 3b9ae062 f23a3825 07d13a8f 0c67c4ca 3a1a0a65 07c540c4 395856b0 21ddcdc9 a458ea53 1720a38e 32c7478e 4de83b96 001f3601 8f16a3b8
-0 4 7954 19 2 6 17 1 68fd1e64 78ccd99e 0a1435c1 bdcfffba 25c83c98 7e0ccccf c4939891 0b153874 a73ee510 fbbf2c95 7d4bba07 5a276398 2fad1153 8ceecbc8 d5adea3d 4da40ea2 07c540c4 e7e991cb 21ddcdc9 5840adea 290c14f6 3a171ecb ded4aac9 2bf691b1 bdf46dce
-1 7 89 14 3 2 2 47 31 341 2 10 0 2 05db9164 421b43cd ced9477f 29998ed1 25c83c98 7e0ccccf 6bf83cdb 0b153874 a73ee510 89ff09ee 60adb56e 6aaba33c 53b60829 b28479f6 2d0bb053 b041b04a e5ba7672 2804effd 723b4dfd dbb486d7 b34f3128
-1 -1 27180 12 2 0 5 1 05db9164 46b01795 4cf72387 1dcabd2a 0b153874 a73ee510 1d56e466 9cf09d42 f66b043c 1adce6ef c830dc5e 07c540c4 e3a5430f 32c7478e
-0 1 1 39 15 119 18 1 18 15 1 1 15 05db9164 4f25e98b 01fefe29 e86b1560 25c83c98 7e0ccccf 0038e65c 0b153874 a73ee510 3b08e48b 7e728ed1 4676ac97 1ddad6aa 1adce6ef 17d9b759 3581aa7f d4bb7bd8 7ef5affa 9437f62f b1252a9d 745c79e6 bcdee96c 3fdb382b 001f3601 49d68486
-0 0 2 5 1284 0 23 24 0 5 05db9164 8084ee93 02cf9876 c18be181 0942e0a7 7e0ccccf 0b72a0e8 5b392875 a73ee510 3b08e48b 4950c85b 8fe001f4 1d27b635 b28479f6 16d2748c 36103458 776ce399 003d4f4f e587c466 bcdee96c 3b183c5c
-1 0 74 36 4 36375 8 0 4 68fd1e64 0468d672 08266a1d a3fc4871 4cf72387 7e0ccccf 5fd3419b 37e4aa92 a73ee510 972359d0 f69fd509 692521c3 c7176043 b28479f6 234191d3 dc3c41ba d4bb7bd8 9880032b 21ddcdc9 5840adea 10738086 3a171ecb e43a3efc ea9a246c 4e7af834
-1 4 5 8 35 1398 64 19 9 703 1 4 59 05db9164 2a69d406 30b6e3ea 13508380 4cf72387 7e0ccccf 579c293b 0b153874 a73ee510 b38bac58 f66047e5 4551eab3 13c89cc4 07d13a8f 3b2d8705 48f5ae81 e5ba7672 642f2610 55dd3565 b1252a9d de95351a c9d4222a 423fab69 45ab94c8 2bf691b1 c84c4aec
-0 7 48 41035 3 05db9164 6e638bbc 49a1cd79 cca79e1e 25c83c98 fe6b92e5 8f4478fe 0b153874 a73ee510 8ba6af1c 1cd8b8ae 0acdf55c 86b6351d b28479f6 c11477f0 f541ee61 d4bb7bd8 f6a2fc70 21ddcdc9 b1252a9d 1afb7d8e bcdee96c 75cfed80 445bbe3b e2f05ce0
-1 -1 14752 0 2 4 0 5bfa8ab5 38a947a1 e710f9eb ae6e2a08 25c83c98 fe6b92e5 56f361f1 0b153874 a73ee510 3b08e48b 6d91e005 d0649cfd 34098dd6 b28479f6 7160a164 6ffcab68 776ce399 82103027 9487db01 be7c41b4 f57138a8
-0 210 6 2 9072 0 2 12 0 2 05db9164 a07503cc 5d260103 13508380 25c83c98 987da766 0b153874 a73ee510 a9271c40 f37be5c0 519590f0 a59ea816 07d13a8f 77660bba 884b33b5 e5ba7672 912c7e21 1d1eb838 b1252a9d 353846c9 c7dc6720 45ab94c8 445bbe3b c84c4aec
-0 3 45 6 7 18 6 52 7 177 1 9 0 6 f5796c5b 80e26c9b 6e5bddab d3e92866 25c83c98 7e0ccccf 24e8ca9f 0b153874 a73ee510 5fd7dd92 94a1f0fa bf413137 153f0382 07d13a8f f3635baf af6fc4b8 3486227d f54016b9 21ddcdc9 5840adea a3405885 423fab69 b0fb6a50 e8b83407 61556511
-0 0 38 2 3 11664 0 6 3 0 0 0 3 68fd1e64 2c16a946 849cf586 b180f466 25c83c98 7e0ccccf 5547e1f4 0b153874 a73ee510 5db9788f 087dfcfd 48fc0800 5317f239 07d13a8f 18231224 9fbd58f8 e5ba7672 74ef3502 51c0191c 3a171ecb 9117a34a
-0 11 6 18 1632 0 19 21 0 19 5a9ed9b0 58e67aaf 381d8ea3 76bbce8c 25c83c98 7e0ccccf 9b7f373a 7b6fecd5 a73ee510 597e2a48 ec2b795a 732c8db2 a5975b1d 07d13a8f 10935a85 03f89a73 1e88c74f c21c3e4c 21ddcdc9 a458ea53 d83181ad c7dc6720 3fdb382b b9266ff0 25bf05c2
-0 180 35 1 31780 0 1 1 0 1 8cf07265 421b43cd bc27bcef 29998ed1 f281d2a7 fbad5c96 1d94dd40 0b153874 a73ee510 efea433b ccfdca2f 6aaba33c d76cea6e b28479f6 e1ac77f7 b041b04a d4bb7bd8 2804effd 723b4dfd 32c7478e b34f3128
-1 2 4 0 4 0 12 0 49 1 3 0 68fd1e64 38a947a1 cc9e717b 9ca2c15d 25c83c98 d5141a06 5b392875 a73ee510 af94b16c f2a5d7d2 37dfef2b a3b89afc b28479f6 a5118040 1cb7075e e5ba7672 b6b880ec 42dbeba8 32c7478e 88422d4d
-1 -1 6223 2 22 0 20 3 68fd1e64 38a947a1 6847b3c1 6cd6e51f 25c83c98 fbad5c96 93ec533b f0298c3c a73ee510 3b08e48b 9ffb3655 eed4a04f a0874a81 1adce6ef 4a591230 d4ca38be e5ba7672 e3c6d69d ba703820 32c7478e c50d808e
-1 3 153 3 3 1 0 4 4 13 1 2 0 05db9164 421b43cd 24146df6 29998ed1 25c83c98 7e0ccccf 4aa938fc 5b392875 a73ee510 451bd4e4 2b9c7071 6aaba33c 1aa94af3 b28479f6 e1ac77f7 b041b04a e5ba7672 2804effd 723b4dfd 3a171ecb b34f3128
-0 4 45 41 31 5 11 156 32 185 1 25 0 11 68fd1e64 89ddfee8 9732b11b 4c0dcfee 25c83c98 fbad5c96 1c86e0eb 5b392875 a73ee510 e7ba2569 755e4a50 ccb8af7d 5978055e b28479f6 25753fb1 19637c17 e5ba7672 5bb2ec8e ae44ba4c b1252a9d 0db71b18 32c7478e 5c960292 f0f449dd 45b5a9e7
-1 1 21 13 12 8 5 8 20 69 1 4 5 05db9164 e3db0bac 9cc6a4f1 9cd2a845 25c83c98 ab1ad103 0b153874 a73ee510 63c8d3d5 859b343f e68fa129 20819d96 07d13a8f 618b0ee5 3004a5f2 e5ba7672 a7ccaded 21ddcdc9 5840adea dc135e3f 8ec974f4 423fab69 08b0ce98 b9266ff0 b29c74dc
-0 2 3 14 9 5 9 2 10 9 1 1 9 8c6ba407 09e68b86 b976df14 0b839026 25c83c98 fbad5c96 cc5ed2f1 5b392875 a73ee510 3b08e48b e216a695 ab02884f 9f16a973 b28479f6 52baadf5 5fa439a6 e5ba7672 5aed7436 2aa4575d b1252a9d 32dcf845 32c7478e f8d85724 e8b83407 f643b6c5
-0 88 73 41 4420 0 46 47 0 46 05db9164 73a46ff0 c19a1e7a b7802d6b 25c83c98 fe6b92e5 28639f10 0b153874 a73ee510 3b08e48b 3a5bf2d6 0761d1a2 155ff7d9 b28479f6 4f648a87 079f48c0 776ce399 da507f45 21ddcdc9 b1252a9d a1fdd170 c9d4222a 3a171ecb a455dffb ea9a246c aa99435d
-0 2644 4 1 26246 0 1 14 0 1 05db9164 80e26c9b 7df8ac19 42cc30a8 25c83c98 fbad5c96 d2d741ca 0b153874 a73ee510 3b08e48b ea4adb47 6cf704b2 05781932 1adce6ef 8ba8b39a dbdb2c16 e5ba7672 f54016b9 21ddcdc9 a458ea53 a92be8d2 c9d4222a 3a171ecb 3037ff6a e8b83407 b112057a
-0 139 1 13556 79 1 13 59 1 0 1 68fd1e64 38a947a1 4fc317a6 6a14f9b9 25c83c98 fbad5c96 282b88fc 0b153874 a73ee510 0f1a2599 3e2feacf 9ff86c51 0e5bc979 07d13a8f 46df822a f8b34416 3486227d c9ac134a f3ddd519 32c7478e b34f3128
-0 1 13 2 12026 535 8 26 308 3 3 05db9164 90081f33 36e97f3a e96617b3 25c83c98 fbad5c96 7f9907fe 5b392875 a73ee510 a3e2e7a5 a7b606c4 ba5aae2e eae197fd 64c94865 eec7af60 23b497d2 d4bb7bd8 ef981aa1 36a4f6c3 3a171ecb 3e022f4d
-1 2 10 14 20 577 142 3 39 42 1 2 26 05db9164 08d6d899 9143c832 f56b7dd5 25c83c98 7e0ccccf dc7659bd 0b153874 a73ee510 efea433b e51ddf94 ae1bb660 3516f6e6 b28479f6 bfef54b3 bad5ee18 e5ba7672 87c6f83c 0429f84b 32c7478e c0d61a5c
-1 0 45 6 1584 37 10 28 228 0 6 11 5a9ed9b0 bce95927 b46f1f1d 13508380 25c83c98 fbad5c96 737174dc 0b153874 a73ee510 3b08e48b 3b0a3499 35dfe2c5 c8e4b0c1 07d13a8f fec218c0 9720e154 e5ba7672 04d863d5 b7380686 b1252a9d 2b0e5756 c9d4222a 32c7478e 45ab94c8 e8b83407 c84c4aec
-1 0 1214 4 20 2131 159 4 11 580 0 3 0 72 05db9164 4f25e98b 2d1ef417 68a5fcbb 4cf72387 7e0ccccf 5e64ce5f 0b153874 a73ee510 3ccfe0c0 4618e030 975c1c17 025225f2 b28479f6 8ab5b746 6720b72e 27c07bd6 7ef5affa 21ddcdc9 b1252a9d 722d167c 32c7478e 3fdb382b e8b83407 49d68486
-0 0 3 4553 49 1 0 0 1 5a9ed9b0 38a947a1 a16966ab 65803e5f 43b19349 fbad5c96 3b16ebba 0b153874 a73ee510 8edcd037 6803595d fc0ad095 2a2faae1 b28479f6 b593a63b fd97a107 d4bb7bd8 1263c077 392cde4b 32c7478e af55e227
-0 316 5 234 0 0 0 0 05db9164 38a947a1 3f5a37fe 1032bac8 25c83c98 7e0ccccf 1760a525 37e4aa92 a73ee510 3b08e48b 2d6f299a ce406f01 f0e0f335 b28479f6 77ef1e58 67f512fb 776ce399 b6b880ec c2b62b88 be7c41b4 c86755ff
-1 2040 14 54675 0 2 6 0 da4eff0f 09e68b86 5b8662c6 5bad2804 25c83c98 8c28e5b5 6a698541 7cc72ec2 feccf887 ae4c531b 8ee18973 01c2bbc7 b28479f6 52baadf5 d93ba614 e5ba7672 5aed7436 75916440 a458ea53 2554eed2 32c7478e 47577e42 e8b83407 89fa8140
-0 0 0 15 6 1512 18 15 10 215 0 6 6 05db9164 09e68b86 aa8c1539 85dd697c 43b19349 7e0ccccf af84702c c8ddd494 a73ee510 fa7d0797 ae19a197 d8c29807 7f0d7407 b28479f6 2d49999f c64d548f e5ba7672 63cdbb21 cf99e5de 5840adea 5f957280 3a171ecb 1793a828 e8b83407 b7d9c3bc
-0 39 9 9 3814 82 1 9 82 1 9 68fd1e64 421b43cd 3983c24c 29998ed1 4cf72387 fe6b92e5 dcc1b63d 1f89b562 a73ee510 d04aae7d 731cd88c 6aaba33c 34d253f7 b28479f6 2d0bb053 b041b04a d4bb7bd8 2804effd 723b4dfd 3a171ecb b34f3128
-0 0 32 13 35317 0 15 30 0 13 5a9ed9b0 09e68b86 39cbb726 afc54bd9 25c83c98 13718bbd d2d741ca 5b392875 a73ee510 3b08e48b ea4adb47 4f5c5791 05781932 07d13a8f 36721ddc 2f6bcbc0 d4bb7bd8 5aed7436 2442feac a458ea53 b215bc2d 3a171ecb 1793a828 e8b83407 02fa3dea
-0 45 11 15 40 44 1 15 44 1 15 64e77ae7 38d50e09 92eb3174 88e439d9 25c83c98 6f6d9be8 fc6b47d9 5b392875 a73ee510 5080de78 b3410e99 604f499b 0d2cad4c 07d13a8f e2275836 8e662061 d4bb7bd8 fffe2a63 21ddcdc9 b1252a9d 872c22d6 32c7478e df487a73 001f3601 c27f155b
-1 1122 41211 499 0 0 10 0 05db9164 207b2d81 d0484442 68637816 f281d2a7 12c61956 0b153874 a73ee510 48af2ba2 94d2aad8 3b9ae062 f23a3825 07d13a8f 0c67c4ca 3a1a0a65 d4bb7bd8 395856b0 21ddcdc9 a458ea53 1720a38e 32c7478e 4de83b96 001f3601 8f16a3b8
-1 1 -1 696 1 22 1 81 1 7 0 68fd1e64 537e899b 5037b88e 9dde01fd 25c83c98 7e0ccccf 17024f49 f504a6f4 a73ee510 f2a8242b ba0f9e8a 680d7261 4e4dd817 07d13a8f 6d68e99c c0673b44 e5ba7672 b34aa802 e049c839 c7dc6720 6095f986
-0 18 3 1480 340 9 3 26 2 0 3 05db9164 a796837e 08de7b18 97ce69e9 30903e74 7e0ccccf 12343fcc 0b153874 a73ee510 547c0ffe 9bcaeafe c5011072 46f42a63 cfef1c29 98eddd86 5a9431f3 27c07bd6 e90118d1 e754c5e1 3a171ecb 8fc66e78
-0 2 59 3 3 11 3 2 3 3 1 1 3 05db9164 09e68b86 27685115 a35ea34f 25c83c98 7e0ccccf 9b4ad590 1f89b562 a73ee510 3b08e48b 75b8e15e 92e9af0d ed43e458 1adce6ef dbc5e126 dc52e604 07c540c4 5aed7436 21ddcdc9 5840adea e5835dfb bcdee96c f89ffef1 e8b83407 a9637a08
-0 0 -1 5937 29 1 1 60 0 1 05db9164 09e68b86 d49019a8 8d5aa295 43b19349 13718bbd 89391314 0b153874 a73ee510 9372d502 608452cc 615e62e7 cbb8fa8b b28479f6 52baadf5 e606c6b3 e5ba7672 5aed7436 2b558521 b1252a9d 7440d805 32c7478e 18038694 e8b83407 7048bfb1
-1 0 0 2 2875 245 2 2 243 0 2 0 05db9164 86d4fccc 697f4e85 f2159098 4cf72387 fbad5c96 dc7659bd 5b392875 a73ee510 efea433b e51ddf94 35641a0a 3516f6e6 07d13a8f e87e1df4 c1eba210 e5ba7672 e727949e 21ddcdc9 5840adea 47e2c032 32c7478e 3b183c5c 001f3601 afd260f5
-0 1 0 70 6 135 27 14 2 45 1 2 0 6 68fd1e64 80e26c9b ba1947d0 85dd697c 4cf72387 16a2e9cb 1f89b562 a73ee510 1ce1e29d 44fa9a7f 34a238e0 f27ed3ab 1adce6ef 0f942372 da441c7e e5ba7672 005c6740 21ddcdc9 5840adea 8717ea07 423fab69 1793a828 e8b83407 9904c656
-1 80 25 2 3 2 3 80 3 3 1 1 1 3 05db9164 0b8e9caf 9b9cd1bb 5974d6bc 25c83c98 fbad5c96 4b815add 0b153874 a73ee510 3b08e48b 7cb56051 7364e701 1ac91ec9 b28479f6 5340cb84 1ab2aab4 3486227d ca6a63cf 91311aa2 bcdee96c 08b0ce98
-0 0 1 1 1801 14 0 0 05db9164 5dac953d d032c263 c18be181 384874ce 7e0ccccf 8363bee7 0b153874 a73ee510 efea433b bf09be0e dfbb09fb 3516f6e6 1adce6ef 32330105 84898b2a e5ba7672 24de59c1 0014c32a 32c7478e 3b183c5c
-0 1 0 6 8 18 8 1 8 8 1 1 8 5a9ed9b0 0468d672 c48cd8f8 24d89f30 25c83c98 24a360aa 5b392875 a73ee510 c8a342b9 2c9174a6 f25a8037 7eda22c5 b28479f6 234191d3 9ca51d92 d4bb7bd8 9880032b 21ddcdc9 5840adea 17b90ef0 32c7478e da89b7d5 ea9a246c 984e0db0
-1 7 1 5 1 1311 58 50 2 200 1 6 0 1 05db9164 6887a43c 9b792af9 9c6d05a0 25c83c98 7e0ccccf f367d44f 0b153874 a73ee510 3e3375c9 f68c5128 6532318c d86616b0 1adce6ef ef6b7bdf 2c9d222f 3486227d 8f0f692f 21ddcdc9 a458ea53 cc6a9262 ad3062eb 423fab69 a5862ce8 445bbe3b 0b89ae9f
-1 1 0 1 378 41 4 16 100 1 2 68fd1e64 38a947a1 75df6d36 b1c1e580 25c83c98 7e0ccccf 14ad5567 1f89b562 a73ee510 9dc8b302 9ddd72e9 6fbed051 37e99bb7 07d13a8f 6d74487d f10a7996 07c540c4 b3e92443 c576dc74 3a171ecb 67d37917
-0 183 3 3 27395 0 3 67 0 3 be589b51 f3139f76 1c8c8a04 bf0b19a8 30903e74 7e0ccccf 6d389dca 0b153874 a73ee510 98bd7a24 e4eb05d4 5b5ab0a8 a4c5d6dd b28479f6 28c50c84 5131d930 e5ba7672 df5475ca 3b226dea 3a171ecb 4fcc135f
-0 1 17 3 0 7 3 0 3 05db9164 083aa75b 88bd9da3 c235950d 25c83c98 7e0ccccf 0697a6a6 0b153874 7cc72ec2 3b08e48b 7fb7db93 f3ba84a1 208257bb 1adce6ef 84203dfc 30129ae3 2005abd1 06747363 21ddcdc9 b1252a9d 9ad721d6 be7c41b4 993d6982 f0f449dd 7eaed4be
-0 6 7 2 3003 0 42 8 0 0 9 241546e0 a796837e 42db3232 e3cc371a 25c83c98 7e0ccccf 11ffbf5b 37e4aa92 a73ee510 7ad4ea2c f2313205 c9669737 9c7a975e cfef1c29 f0bf9094 c4de5bba 8efede7f 1cdbd1c5 288eaded ad3062eb 3a171ecb 8fc66e78
-0 1 36771 112 1 0 77 1 05db9164 f3139f76 9d3adacf 28d926b8 43b19349 fe6b92e5 0cd2f08f 0b153874 a73ee510 3b08e48b 7592da6b 7b93a4c7 18f84563 b28479f6 28c50c84 fc53f85c d4bb7bd8 df5475ca ed35ed93 32c7478e 4fcc135f
-0 20 1 1 4841 20 3 5 16 2 1 68fd1e64 38d50e09 948ee031 b7ab56a2 4cf72387 fbad5c96 7d733ece 0b153874 a73ee510 3753b9eb 30b2a438 42bee2f2 aebdb575 b28479f6 06373944 67b3c631 07c540c4 fffe2a63 21ddcdc9 b1252a9d bd074856 32c7478e df487a73 001f3601 c27f155b
-1 7 1095 3 37 1 7 3 3 2 2 1 05db9164 85af3139 d032c263 c18be181 384874ce fe6b92e5 7195046d 1f89b562 a73ee510 f1b45aab 4d8549da dfbb09fb 51b97b8f b28479f6 af8db00e 84898b2a e5ba7672 d4328054 0014c32a bcdee96c 3b183c5c
-1 0 0 19 7 2193 41 9 18 199 0 4 0 9 05db9164 ef69887a 7007f08d f6131df0 4cf72387 7e0ccccf e8fc728b 0b153874 a73ee510 603ff749 e7ce7f20 2d936711 f522015f 07d13a8f b98be2c0 1c332795 e5ba7672 4bcc9449 abfaf938 a458ea53 caad4ae9 32c7478e 3fdb382b e8b83407 49d68486
-1 0 0 1 1 7571 57 19 1 16 0 7 0 1 05db9164 38a947a1 72e5eac0 eee0e446 25c83c98 fbad5c96 66a728c4 0b153874 a73ee510 d0ff5b05 dab547a5 673768e2 7aab7990 07d13a8f 613de492 d617f1ff 3486227d 7abb2837 72a8c407 ad3062eb 423fab69 375c3609
-0 156 2 25905 0 11 39 0 2 05db9164 08d6d899 9143c832 f56b7dd5 25c83c98 7e0ccccf 8ce3a35f 0b153874 a73ee510 3b08e48b c8e7f509 ae1bb660 6e8ef725 b28479f6 bffbd637 bad5ee18 776ce399 bbf70d82 0429f84b 32c7478e c0d61a5c
-0 0 102404 0 9a89b36c 38a947a1 b89c82b4 c10a6e59 25c83c98 7e0ccccf 04679a14 0b153874 7cc72ec2 975342c2 19a2ded8 15820680 90c7f9d1 64c94865 fd056e92 911ebe1c 07c540c4 b2e570f5 00cfee60 ad3062eb 3a171ecb 4904c5a1
-1 46 614 210 0 10 0 71 0 257 1 5 4 0 5a9ed9b0 942f9a8d d61e0f0a c2fcecf6 4cf72387 7e0ccccf 3f4ec687 45f7c2dd a73ee510 0e9ead52 c4adf918 f6f14c38 85dbe138 07d13a8f a8e962af 64c4c290 27c07bd6 1f868fdd 21ddcdc9 b1252a9d 06316f4c ad3062eb 32c7478e 38be899f e8b83407 9bef54fd
-1 0 9 2 1576 29 3 4 14 0 1 05db9164 6887a43c bce3f26f 1d8a14d0 43b19349 fe6b92e5 675e81f6 0b153874 a73ee510 a5bb26cf 4a77ddca 381dd9fd dc1d72e4 64c94865 004dd4ed c26ce5c1 1e88c74f 36a1d942 21ddcdc9 b1252a9d e22e102f c9d4222a 32c7478e 47c5aea3 445bbe3b 12d4e9a4
-0 -1 101295 0 05db9164 2ae0a573 b7810abb 65b2bfc7 25c83c98 fe6b92e5 ccbac4d9 0b153874 7cc72ec2 3b08e48b c012107d 82665b78 c8dca410 07d13a8f 413cc8c6 6399ea39 07c540c4 f2fc99b1 ea03ca8b ad3062eb be7c41b4 d91ea8bd
-0 -1 0 0 32 0 87552397 a8b6b751 25c83c98 7e0ccccf d9aa9d97 5b392875 7cc72ec2 3b08e48b 6e647667 85dbe138 b28479f6 694e45e3 2005abd1 d787f192 21ddcdc9 5840adea 32c7478e 001f3601 99f4f64c
\ No newline at end of file
diff --git a/models/rank/dnn/data/train/sample_train.txt b/models/rank/dnn/data/train/sample_train.txt
deleted file mode 100755
index e3468bf965e13cc0af29e38ccd4f5c69a8c857f0..0000000000000000000000000000000000000000
--- a/models/rank/dnn/data/train/sample_train.txt
+++ /dev/null
@@ -1,100 +0,0 @@
-0 0 5 4 13275 14 35 4 41 4 0 4 05db9164 f0cf0024 6f67f7e5 41274cd7 25c83c98 fbad5c96 25c8362c 0b153874 a73ee510 0e97bb27 ba0f9e8a 623049e6 4e4dd817 b28479f6 e6c5b5cd c92f3b61 3486227d b04e4670 21ddcdc9 b1252a9d 60f6221e 32c7478e 43f13e8b ea9a246c 731c3655
-1 0 559 2 7 2532 164 98 6 943 0 18 0 7 68fd1e64 bc478804 b96e826a 13508380 43b19349 7e0ccccf 8363bee7 0b153874 a73ee510 f322117a bf09be0e f53c5949 3516f6e6 07d13a8f 0af7c64c 170db6b2 e5ba7672 65a2ac26 21ddcdc9 b1252a9d f0ce5c73 c7dc6720 45ab94c8 001f3601 c84c4aec
-0 2 8 3 9040 38 5 11 104 2 3 05db9164 e5fb1af3 2c003e73 6eaa3680 25c83c98 7e0ccccf 860f347d 1f89b562 a73ee510 4b8a7639 9f0003f4 0962e10a 5afd9e51 f862f261 2a079683 59c31b64 e5ba7672 13145934 21ddcdc9 a458ea53 24a384ae bcdee96c f11826cf 3a6f6b59 25cb8912
-0 72 2 2 0 4 12 0 2 8cf07265 b0660259 31567fba 1a1efaf8 25c83c98 fbad5c96 88002ee1 0b153874 7cc72ec2 3b08e48b f1b78ab4 b6d5a886 6e5da64f 1adce6ef bd5431ee 7b977dd1 2005abd1 8ec3405f f0474b68 ad3062eb 32c7478e 53c37c32
-0 2 6 34 16 1051 49 4 48 101 1 2 16 5a9ed9b0 80e26c9b 09275b26 f2ee08c0 25c83c98 7e0ccccf 372a0c4c 0b153874 a73ee510 a08eee5a ec88dd34 4e99cf84 94881fc3 1adce6ef 91f5e393 6bc40863 e5ba7672 ce25450e 21ddcdc9 b1252a9d 5dc70c60 423fab69 1793a828 e8b83407 91116abe
-0 2 13 2 15757 54 5 2 15 1 0 2 05db9164 09e68b86 eb76bef2 804f7741 43b19349 13718bbd cc5ed2f1 0b153874 a73ee510 3b08e48b facf05cc f282fc98 9f16a973 b28479f6 52baadf5 eb62e551 07c540c4 5aed7436 c361c1be b1252a9d be7ab5d2 32c7478e 1793a828 e8b83407 e9938fed
-0 0 14 1572 8 4 6 8 0 1 05db9164 e112a9de 9db30a48 b3dbc908 4cf72387 fbad5c96 f2530a89 0b153874 a73ee510 671ae88f 2181d913 2598d8eb 1e750733 ad1cc976 f1e1df0a 9ab4d6b1 e5ba7672 fdbdefe6 bbf96cac c3dc6cef 8f079aa5
-0 0 195 14 5941 285 6 20 200 2 20 5bfa8ab5 80e26c9b 36984eba 85dd697c 4cf72387 fbad5c96 f6619575 1f89b562 a73ee510 73be1da8 d5cf9352 db02a7b5 09e3bbd5 07d13a8f e8f4b767 2d0bbe92 e5ba7672 005c6740 21ddcdc9 b1252a9d 6e55e022 ad3062eb 3a171ecb 1793a828 e8b83407 9904c656
-0 0 52 1 1 4240 9 5 3 49 0 4 2 5a9ed9b0 b961056b 2fe61b6b 3642dc05 4cf72387 fe6b92e5 81bb0302 062b5529 a73ee510 8b7e21f6 b7094596 4ab3cda1 1f9d2c38 b28479f6 7eaf5074 a4b0914f e5ba7672 5742e45c 789fddf7 32c7478e 3b047130
-1 1 1 1 1378 46 5 34 236 3 1 05db9164 38a947a1 a50fea16 0a8cd7bc 25c83c98 a601d936 0b153874 a73ee510 3fb38a44 348e21cb 2b2be35d 1d8cfec6 b28479f6 66106852 31cf393e e5ba7672 0458e647 58d08d44 32c7478e 355b6af8
-0 0 0 5 2 5512 49 15 3 114 0 2 2 8cf07265 78ccd99e 01d1b993 20bb14e7 4cf72387 fbad5c96 a1eeac3d 0b153874 a73ee510 5f49e872 2e9d5aa6 500d0b9a 0a9ac04c 07d13a8f 162f3329 f24599ab e5ba7672 e7e991cb 4b1019ff a458ea53 b49094cd 423fab69 dc73316d fd2fe0bd 60a86ddf
-1 2 -1 2 1 50 1 73 4 127 1 14 0 1 68fd1e64 2aee75a8 32c8cb11 c04614ba 25c83c98 3bf701e7 407438c8 0b153874 a73ee510 213889cd 755e4a50 fa20173a 5978055e 32813e21 6aa1d799 de53b24a 3486227d ad19d8d8 64c766b8 3a171ecb 747559ec
-0 83 4 5 5666 14 1 5 14 1 5 05db9164 85af3139 d032c263 c18be181 25c83c98 fbad5c96 7195046d 0b153874 a73ee510 686e97b9 4d8549da dfbb09fb 51b97b8f b28479f6 af8db00e 84898b2a d4bb7bd8 d4328054 0014c32a ad3062eb bcdee96c 3b183c5c
-0 3 11 1612 0 40 91 0 42 05db9164 537e899b 5037b88e 9dde01fd 25c83c98 3bf701e7 ac07b602 0b153874 a73ee510 3b08e48b 7ce882d2 680d7261 f5ff33d9 1adce6ef c535a0ec c0673b44 776ce399 b34aa802 e049c839 423fab69 6095f986
-0 6 52 5 400098 0 15 15 0 5 5a9ed9b0 38d50e09 d032c263 c18be181 384874ce 7e0ccccf 6cd97108 0b153874 7cc72ec2 3b08e48b 62cdafdf dfbb09fb 2e551bbe 1adce6ef e2c18d5a 84898b2a 776ce399 582152eb 21ddcdc9 5840adea 0014c32a be7c41b4 3b183c5c 001f3601 99f4f64c
-0 26 12 11 34669 531 1 12 27 1 0 11 05db9164 98159f6d 3cc4baf5 7b110c65 25c83c98 fe6b92e5 c03eb803 0b153874 a73ee510 3b08e48b d700703a 169e9533 bccbbffe b28479f6 b2db654e 16c48bd2 3486227d 4854928e 114ff696 3a171ecb 3599e91f
-0 7 6 13416 0 0 45 0 05db9164 247a1a11 896e7bb3 c2fcecf6 25c83c98 fbad5c96 c31847f5 0b153874 a73ee510 3b08e48b a12fca95 7fa9c0a1 9b9e44d2 07d13a8f 2559d9b6 ef01918c 776ce399 51360aab 5cc5adb2 c9d4222a be7c41b4 38be899f
-0 1 5 7 14509 60 5 7 56 1 7 75ac2fe6 3e25b403 7c7b6098 f00503da 25c83c98 fe6b92e5 ef0d76b7 51d76abe a73ee510 82bb4986 529e8447 c1b3491a 50a56f08 07d13a8f ae1edc05 ab50786f e5ba7672 1c381aea f6801a20 c7dc6720 1793a828
-0 0 2 30 10 1363 415 20 28 561 0 5 0 10 68fd1e64 95e2d337 8d85271d 69040d07 25c83c98 7e0ccccf 3603d925 0b153874 a73ee510 0065486b 7934c105 8b7685bd 4840c1ab 64c94865 7de4908b b1f23afa e5ba7672 701d695d 712d530c a458ea53 da0adeef c9d4222a 423fab69 4921c033 2bf691b1 80b0aeb9
-1 1 6 3 12 0 4 40 31 410 1 14 4 68fd1e64 38a947a1 8962afa9 28625509 25c83c98 7e0ccccf 5fbd9170 0b153874 a73ee510 dc9f749b 2bcfb78f 662d25fe e6fc496d 07d13a8f 022e018a 2dd4e74f e5ba7672 f5508183 c5cea7f6 32c7478e 38255568
-0 0 0 11 4 8657 213 6 3 210 0 1 4 05db9164 80e26c9b 0bd844ef aae30d38 25c83c98 7e0ccccf d2d741ca 0b153874 a73ee510 18139a78 ea4adb47 38a37d81 05781932 07d13a8f 856b2bc1 00c7a1bf 07c540c4 fdf644e0 21ddcdc9 a458ea53 45d05ca3 dbb486d7 3e1eed85 e8b83407 6a4b2388
-0 47 35575 159 3 0 10 1 68fd1e64 999aae35 79bc99b4 e5e453f3 4cf72387 7e0ccccf c88e8d4f 0b153874 a73ee510 3b08e48b a21d2994 424e28fe 2e94d3f7 243a4e68 39a6addf a6a69939 07c540c4 63aa00dd 424af181 3a171ecb 869caea3
-1 1 512 1 2 11 2 1 2 2 1 1 2 05db9164 b26462db 9d1d0933 ebdba02b f281d2a7 fbad5c96 12343fcc 0b153874 a73ee510 f6f942d1 7f8ffe57 c6a076d2 46f42a63 64c94865 f93f84eb d9e8fb80 d4bb7bd8 195c811d 306c202e 3a171ecb 340d03c3
-0 -1 28922 24 1 8 22 1 05db9164 c8687797 5c7d8ff6 902872c9 4cf72387 fbad5c96 3833f734 0b153874 a73ee510 3b08e48b c05bd0b8 79b87c55 e411c4db b28479f6 dc96c4b0 5627d7e0 d4bb7bd8 a7e06874 21ddcdc9 b1252a9d 4063500f ad3062eb be7c41b4 54baf4d1 010f6491 ba676e3c
-0 1 51 17212 0 1 3 0 5a9ed9b0 4f25e98b b5044e29 a311963e 307e775a fe6b92e5 fe4dce68 a6d156f4 a73ee510 75542289 68357db6 1290fbf4 768f6658 07d13a8f dfab705f d5a1b8fe 1e88c74f 7ef5affa 2b558521 b1252a9d b5074db5 c9d4222a 32c7478e c832486f 001f3601 f0353f67
-0 0 162 4253 26 4 3 5 0 1 05db9164 b961056b 502bedec 81b1d519 384874ce fe6b92e5 c52b5f8e 5b392875 a73ee510 8b349795 419d31d4 e23a52b4 08961fd0 1adce6ef addd37ac b4df7a81 e5ba7672 43de85d3 fdb27279 423fab69 71640730
-0 67 43 18 61 0 18 18 0 18 05db9164 38d50e09 c4205697 bbc8d361 25c83c98 fe6b92e5 165cb289 5b392875 a73ee510 3b08e48b b94c0f2d d8c2300e b9fa764b b28479f6 7501d6be bf300501 776ce399 f855e3f0 21ddcdc9 5840adea b59344cd 3a171ecb 17f458f7 001f3601 984e0db0
-0 51 18 1 19 93 23 111 22 1156 2 11 0 23 287e684f a796837e 42db3232 e3cc371a 25c83c98 fe6b92e5 ff493eb4 25239412 a73ee510 efea433b 0983d89c c9669737 1aa94af3 cfef1c29 0d054fb9 c4de5bba e5ba7672 70e5bba7 288eaded 32c7478e 8fc66e78
-0 0 84 43 11 198 75 14 27 76 0 2 1 11 05db9164 4f25e98b 23edf366 e4889f1e 25c83c98 7e0ccccf ac28d9ec 0b153874 a73ee510 6cb0e696 bc0819f7 92107e36 c9059ff0 cfef1c29 dddd963f 4e447cf7 3486227d 7ef5affa 55dd3565 a458ea53 9a91ae21 c9d4222a 32c7478e 54a607b7 001f3601 d568f27d
-0 5 40 15 23322 746 7 15 524 3 15 05db9164 dda1caf9 f83418e0 a44d75e2 25c83c98 7e0ccccf 7d09e065 0b153874 a73ee510 3b08e48b bf2008fa a9165671 c9ae71af 07d13a8f 24c5daaf 839572dd e5ba7672 65cebfa5 cd746367 3a171ecb a9a2ac1a
-0 4 1 0 8cf07265 38c81d1a 27388f4d 539558b1 25c83c98 ce8217f8 0b153874 7cc72ec2 3b08e48b 9d12ce9b cc83e10f 9dfda2b9 b28479f6 558590b3 ed956dff 2005abd1 a5ac4b1e 21ddcdc9 b1252a9d 1061dd07 be7c41b4 a79557ea b9266ff0 6ddc02f9
-1 18 3 1 4233 3 17 1 118 5 1 5a9ed9b0 78ccd99e ced2e736 13508380 25c83c98 fbad5c96 c8b3d034 0b153874 a73ee510 3275d09a 80da9312 c7fe806a d14c9212 07d13a8f 162f3329 d274b433 e5ba7672 e7e991cb 55dd3565 b1252a9d b46cb608 c7dc6720 45ab94c8 e8b83407 c84c4aec
-0 0 2788 7 1451 0 1 0 0 0 1 1464facd 8947f767 64a350ad 5e369129 25c83c98 fe6b92e5 a13be9ad 0b153874 a73ee510 4e56c58e 62aedd5c 70aaa25e e65a5fc3 b28479f6 a473257f 9bb1dfa5 d4bb7bd8 bd17c3da 083e89d9 b1252a9d d3a891c1 ad3062eb 3a171ecb b6b5bc47 010f6491 c4510344
-0 1 6 1 4402 22 1 11 22 1 1 05db9164 207b2d81 d52980aa b66d15e3 25c83c98 fbad5c96 6ce84868 1f89b562 a73ee510 3b08e48b 609032c1 b519c595 437a58d6 b28479f6 3c767806 7c8ae841 07c540c4 395856b0 21ddcdc9 b1252a9d 605305ee 32c7478e f090fae7 001f3601 6024c307
-0 125 2 14 7259 30 2 14 97 2 14 8cf07265 04e09220 b1ecc6c4 5dff9b29 4cf72387 7e0ccccf 543f351f 1f89b562 a73ee510 3b08e48b be8a7bc2 2436ff75 7d1f1fa0 07d13a8f cae64906 f4ead43c d4bb7bd8 e161d23a 4f1aa25f ad3062eb 3a171ecb ded4aac9
-0 610 1 1 7526 40 2 1 12 1 1 5a9ed9b0 207b2d81 8a48553d 1e10bd9f 25c83c98 fe6b92e5 12343fcc 0b153874 a73ee510 547c0ffe bc8c9f21 6803e296 46f42a63 64c94865 11b2ae92 ff48ade9 e5ba7672 395856b0 21ddcdc9 b1252a9d c3d093fb ad3062eb 3a171ecb 84a27184 001f3601 8d2deb5a
-0 0 1 59 3 2766 96 2 4 7 0 1 3 5a9ed9b0 38a947a1 cf1b3029 36b520dc 4cf72387 7e0ccccf 5e64ce5f 0b153874 a73ee510 d4a82fb9 8b94178b 0d74ab27 025225f2 b28479f6 77ef1e58 5dcf110f 07c540c4 b6b880ec dd70b3ec 32c7478e 8f282db5
-0 34 18859 106 26 0 17 1 0 05db9164 c1c79489 66fa4409 bdc253c8 5a3e1872 fbad5c96 8d51595c 0b153874 a73ee510 216f775a 7110c233 7e4627d4 bb7a2c12 32813e21 59b212e4 5f231427 e5ba7672 7549f127 50798fce c7dc6720 0f9697f0
-1 1 321 1 1189 8 16 11 96 1 3 1 05db9164 a796837e 5c05f1ab 97ce69e9 25c83c98 fe6b92e5 81b62616 0b153874 a73ee510 06ee81ba fa1b06e6 50ec33a6 0eb69562 07d13a8f 47a431f5 5a9431f3 e5ba7672 f1a8f10f e9672021 ad3062eb 423fab69 8fc66e78
-0 2 1 16 13 326 61 3 47 263 1 1 0 55 8cf07265 8947f767 999b4cd3 f862f65d 25c83c98 7e0ccccf 9e8dab66 0b153874 a73ee510 fbbf2c95 46febd4d ea486dc7 949ea585 07d13a8f 2c14c412 e51f35a7 e5ba7672 bd17c3da 83236299 b1252a9d 19bea55f 32c7478e 75aae369 010f6491 08e0e995
-0 0 9 8 7182 255 9 24 44 4 8 05db9164 38a947a1 7a2ffaba 8dcfa982 25c83c98 7e0ccccf c519c54d 0b153874 a73ee510 19fd5a0e 59cd5ae7 842e9873 8b216f7b b28479f6 67596d53 1e8e1075 e5ba7672 3fb55a52 8b5b9b68 32c7478e 10edf4e4
-0 107 1 1 7 489 1 1 31 1 1 24eda356 2c16a946 adf23330 17a25a2e 25c83c98 7e0ccccf 12343fcc 0b153874 a73ee510 f6f942d1 7f8ffe57 8a390857 46f42a63 b28479f6 3628a186 64f2ada9 07c540c4 e4ca448c 467f6a77 3a171ecb 9117a34a
-0 0 280 20 10 4512 51 5 11 97 0 1 0 10 68fd1e64 6c713117 f6f030bc 19d6ddb8 25c83c98 fe6b92e5 7c59aadb 5b392875 a73ee510 c5a978c5 ff78732c 7bea4a04 9b656adc b28479f6 73b98472 7afa5706 3486227d bf6b118a 21ddcdc9 b1252a9d aef05b30 c7dc6720 1caea946 445bbe3b 69a06689
-0 41 6 2 359699 3 2 87552397 4f25e98b 16958dc8 8dfe2376 25c83c98 7e0ccccf 8025502e 0b153874 7cc72ec2 b118f931 29e4ad33 ea50fad8 80467802 b28479f6 8ab5b746 0e4c86f8 d4bb7bd8 7ef5affa 5b885066 b1252a9d e103da3e 3a171ecb fcb2509d 001f3601 24488670
-1 1 13 7 11 1 4 4 18 44 1 3 4 05db9164 09e68b86 aa8c1539 85dd697c 25c83c98 41e1828d 0b153874 a73ee510 3b08e48b b6358cf2 d8c29807 61c65daf 8ceecbc8 d2f03b75 c64d548f 07c540c4 63cdbb21 cf99e5de 5840adea 5f957280 32c7478e 1793a828 e8b83407 b7d9c3bc
-1 2 25 2 38 2 2 2 2 1 1 2 5a9ed9b0 207b2d81 fb47f7d0 6c02aa53 4cf72387 7e0ccccf 6fb62f1a 0b153874 a73ee510 4f6357b0 e51ddf94 d9fc673a 3516f6e6 b28479f6 0739b998 7e14b290 07c540c4 934abd6e 21ddcdc9 b1252a9d 0a47a519 ad3062eb 32c7478e 47620345 001f3601 c36f2d3c
-0 11 2 3 92 2 36 0 21 1 4 f473b8dc 80e26c9b 1c791144 51d55e9c 384874ce fbad5c96 57b4bd89 0b153874 a73ee510 3b08e48b 71fd20d9 b49c9404 ddd66ce1 1adce6ef 8ba8b39a 2cbed9f7 e5ba7672 f54016b9 21ddcdc9 5840adea 80f3703a 3a171ecb 1793a828 e8b83407 dbd4e512
-1 0 1 1 1 5715 181 23 2 169 0 4 0 1 5a9ed9b0 7182b361 b2aa5dce 462749d8 43b19349 7de93965 37e4aa92 a73ee510 28c6ef79 9ba53fcc 05ce35fd 42156eb4 07d13a8f 47367e94 1a5c540a 3486227d ec9b0866 437ad2af c9d4222a c7dc6720 73338ee2
-0 54 62 12 7578 184 7 12 72 1 24 05db9164 38a947a1 bc2aea05 ac975db6 25c83c98 13718bbd 80162d04 5b392875 a73ee510 3b08e48b 5b97686e b67ac327 47727147 07d13a8f 22223d6c d388d33c e5ba7672 97b81540 0ac4575d 32c7478e d28d80ac
-0 6 0 12 6 185 37 6 7 37 1 1 37 05db9164 09e68b86 9596aa6c b26d2eda 4cf72387 7e0ccccf 1a95b4d0 0b153874 a73ee510 995f172b 507605d4 bb2b1b19 5f3a0c1b 07d13a8f 36721ddc 872b1c96 e5ba7672 5aed7436 338f20de b1252a9d 8f7b9fe2 32c7478e cad46f36 e8b83407 58a43195
-0 104 1 1 2679 0 8 18 0 1 9684fd4d 8dbd550a 4cf72387 7e0ccccf 8cf87048 c8ddd494 a73ee510 3b08e48b a12fca95 9b9e44d2 f862f261 b13d160c 776ce399 53d8aa6f be7c41b4
-0 3384 5 2803 151 13 11 150 1 11 65aada8c 537e899b 5037b88e 9dde01fd 25c83c98 fbad5c96 7bcc368f 062b5529 a73ee510 f26b2389 60d2afd7 680d7261 155ff7d9 07d13a8f 73c54e3e c0673b44 e5ba7672 86b4fc22 e049c839 3a171ecb 6095f986
-1 -1 5967 11 4 1 10 2 0 68fd1e64 510b40a5 d03e7c24 eb1fd928 25c83c98 ac902434 062b5529 a73ee510 e5da7278 1294fec1 951fe4a9 7bbf93ce 07d13a8f 67daf98c 8ec71479 e5ba7672 03364bd3 0e63fca0 32c7478e 0e8fe315
-0 78 7 10 0 7 7 0 7 05db9164 9f7e1d07 e3818eb2 a4456f7e 25c83c98 02d72eea 5b392875 a73ee510 c9e11adf e09c447b 3bec5d45 8dab0422 b28479f6 08812651 72d1790f 1e88c74f 6a58e423 21ddcdc9 5840adea 950d91c1 32c7478e 2f7e98de ea9a246c e7ecb821
-1 -1 1509 0 4 0 23 3 05db9164 cc8e236e 1c239854 cf3dc9c2 4cf72387 fe6b92e5 81bb0302 0b153874 a73ee510 983552b8 b7094596 98d78b2b 1f9d2c38 07d13a8f 3a8c68b7 da1333b6 e5ba7672 775e80fe 21ddcdc9 5840adea 3ee29a07 ad3062eb c7dc6720 c83e0347 ea9a246c 2fede552
-1 2 -1 550 3 155 8 30 2 16 0 ae82ea21 3f0d3f28 c2b2b3f5 77a160bd f281d2a7 fbad5c96 3625ff87 6c41e35e a73ee510 67eea4ef 755e4a50 db21b797 5978055e 32813e21 e8d4033b fae7560f e5ba7672 744ad4a0 a17a10b3 3a171ecb e5fca70a
-1 5 113 6 18 0 10 9 21 21 2 3 3 0 05db9164 6887a43c 1e361e58 825b2615 43b19349 fbad5c96 6d0ca8d7 0017bc7c a73ee510 666a1d31 6939835e 9b62c79b dc1d72e4 b28479f6 9cc57c4d fd420402 27c07bd6 2ae4f30d 21ddcdc9 b1252a9d d12542f8 32c7478e 488d4283 445bbe3b d20e4b7a
-1 3 21 2 2 1 0 6 2 13 1 3 0 68fd1e64 38a947a1 756e3a77 bd47cb50 25c83c98 fe6b92e5 09e42cac 5b392875 a73ee510 79877583 30b2a438 74a6216c aebdb575 b28479f6 b3547943 9e33c845 e5ba7672 04fdc63f 77cd58fc 3a171ecb 69e7316d
-0 0 1 5 4 9947 275 8 12 865 0 3 4 05db9164 6887a43c 9b792af9 9c6d05a0 25c83c98 84c427f0 0b153874 a73ee510 9bc1a7c1 41b3f655 6532318c ce5114a2 8ceecbc8 4e06592a 2c9d222f e5ba7672 8f0f692f 21ddcdc9 b1252a9d cc6a9262 32c7478e a5862ce8 445bbe3b c4c8f547
-0 0 21 6 6 7270 275 3 6 93 0 2 6 05db9164 2c16a946 f7ef15ea 6ad68ce1 25c83c98 7e0ccccf 5ff926ae 25239412 a73ee510 4497acf2 864d33c2 bfe72c91 34786fb9 b28479f6 3628a186 2d08259c 07c540c4 e4ca448c 96739728 ad3062eb 32c7478e 9117a34a
-1 5 1 3 2 5 0 48 3 2 2 13 0 05db9164 2efdbb44 88f1ca30 aece8ab6 25c83c98 3bf701e7 1c86e0eb 1f89b562 a73ee510 f7ab55a0 755e4a50 4cf7f85a 5978055e 32813e21 ff824c52 5a58ab6d e5ba7672 42076ccd 3991fb63 55dd3565 4721fd29
-0 0 1 4 1 7025 101 13 1 39 0 1 1 1 05db9164 207b2d81 d52980aa b66d15e3 25c83c98 7e0ccccf f6d03c1b 5b392875 a73ee510 fe687d88 30b2a438 b519c595 aebdb575 07d13a8f 0c67c4ca 7c8ae841 e5ba7672 395856b0 21ddcdc9 b1252a9d 605305ee 32c7478e f090fae7 001f3601 77e5b96c
-0 1 95 28 8 67 14 103 42 267 1 23 14 05db9164 89ddfee8 d8f59a85 f1d06e8a 25c83c98 7e0ccccf 1c86e0eb 5b392875 a73ee510 213889cd 755e4a50 64f3690c 5978055e 1adce6ef 34cce7d2 eeb76d70 e5ba7672 5bb2ec8e 0053530c b1252a9d 7f6bcbee ad3062eb 32c7478e 43fe299c f0f449dd f3b1f00d
-0 4 54 2 9 1349 23 52 22 90 1 8 0 11 be589b51 38a947a1 e28faa26 f44af879 25c83c98 fbad5c96 407438c8 0b153874 a73ee510 5df036eb 755e4a50 defeb71b 5978055e 07d13a8f 92b9a831 d4aed6bf 27c07bd6 2a870f7f fecb5e8c c9d4222a 32c7478e a9313cb6
-1 3 145 6 12 2 8 6 36 1 4 2 be589b51 0c0567c2 9424724f fa30ea43 25c83c98 fe6b92e5 52b4e012 0b153874 a73ee510 5ba3608f a739bbee f400be52 79128231 b28479f6 1e82594c efbacdc0 e5ba7672 98c4d3e0 cd86ac29 78e2e389 32c7478e 7fb4ff91
-0 38 2 1 12810 8 3 1 7 1 1 05db9164 af447d7a fc39fe56 3197d543 25c83c98 7e0ccccf 9aba5215 5b392875 a73ee510 46c32c26 8cfaeec1 ebf6ae0a ef800ef3 b28479f6 f0d27586 38fc4d35 07c540c4 98ff11f4 11e4edec 32c7478e 0ff91809
-1 1 0 4 11 1019 11 4 20 91 1 2 0 11 05db9164 09e68b86 b1ffdff4 aff068bc 25c83c98 b87f4a4a 5b392875 a73ee510 e70742b0 319687c9 ee94532f 62036f49 cfef1c29 18847041 17442b68 e5ba7672 5aed7436 21ddcdc9 5840adea acf8bce8 32c7478e 1793a828 e8b83407 63093459
-1 3 15 24 3288 0 3 278 0 25 05db9164 38a947a1 4470baf4 8c8a4c47 25c83c98 7e0ccccf 0dbf2675 0b153874 a73ee510 48a94b2e 88196a93 bb669e25 1211c647 b28479f6 59621a99 2b2ce127 e5ba7672 b133fcd4 2b796e4a 32c7478e 8d365d3b
-1 0 175 59061 539 0 0 56 0 f473b8dc 1cfdf714 43b964ee f089159e 25c83c98 3bf701e7 c86e8c6b 37e4aa92 7cc72ec2 eab78bab 4d8549da 031b2b3a 51b97b8f 051219e6 af56328b e162466d e5ba7672 e88ffc9d 5b885066 a458ea53 9f7d1d43 ad3062eb 3a171ecb b15e807d e8b83407 fcefd6a4
-1 39 2 8 5 1202 22 42 12 179 1 2 1 13 05db9164 942f9a8d 8658d326 2b884b66 4cf72387 7e0ccccf 3f4ec687 0b153874 a73ee510 0e9ead52 c4adf918 cf9c76af 85dbe138 07d13a8f a8e962af 12ff41b8 3486227d 1f868fdd 1d04f4a4 a458ea53 9e55b62d ad3062eb 32c7478e 3fdb382b 9d93af03 49d68486
-0 -1 24422 11 18 0 8 1 05db9164 9e5ce894 02391f51 b9c629a9 25c83c98 3bf701e7 22fd2464 0b153874 a73ee510 5aca218f d9085127 2397259a ef7e2c01 07d13a8f 8cf98699 d37efe8c e5ba7672 a5bb7b8a 21ddcdc9 5840adea b6119319 32c7478e 45ab94c8 ea9a246c b13f4ade
-0 1 8 4936 0 0 15 0 05db9164 ea3a5818 e33cc329 7a5aa046 25c83c98 fbad5c96 4a45f6c5 0b153874 a73ee510 fe01516c 2a0b79f8 aaa493f6 25512dff b28479f6 0a069322 ba35244c e5ba7672 a1d0cc4f 21ddcdc9 a458ea53 c1a3607e c7dc6720 a7084d70 1575c75f ef4df1dd
-0 124 1 2648 0 4 13 0 05db9164 38a947a1 7b9e7a93 49afffac 25c83c98 7e0ccccf bddc9773 0b153874 a73ee510 3b08e48b ff2333c8 5f12b145 140595a0 b28479f6 7c5bcff3 e699400f d4bb7bd8 876521e0 6d6ae2d8 32c7478e b258af68
-1 4 -1 282 18 4 15 15 1 1 68fd1e64 38a947a1 840eeb3a f7263320 25c83c98 7e0ccccf 44fb02c7 6c41e35e a73ee510 3b08e48b 2386466b 317bfd7d 45db6793 07d13a8f 6f1ab4eb 1689e4de e5ba7672 5d961bca dc55d6df 3a171ecb aa0115d2
-1 -1 14834 111 6 0 204 2 68fd1e64 08d6d899 6a8a1217 14bfebf4 25c83c98 7e0ccccf 9e0ed189 0b153874 a73ee510 f68bc089 c3e44774 5b355b50 c278016c 64c94865 a8e4fe6e 0ded9094 e5ba7672 9dde83ca 831d5286 32c7478e 9e9a60e4
-0 3 276 8 11 10 11 3 26 11 1 1 0 11 5e53cc38 b26462db b6025941 06b1cf6e 4cf72387 13718bbd 65ae2219 0b153874 a73ee510 fbbf2c95 447a6784 72e65cea 9be66b48 cfef1c29 fc8350a5 25b075e4 07c540c4 35ee3e9e ad6ee353 3a171ecb 0ff91809
-0 -1 11615 30 1 0 28 1 5a9ed9b0 95e2d337 086df0da 6262590b 4cf72387 7e0ccccf 72cf945c 0b153874 a73ee510 ef2fbb20 7b61aa9b 547c3f98 7f5bf282 07d13a8f 4e505ea3 7ac9f411 d4bb7bd8 7b06fafe 21ddcdc9 a458ea53 29ac833e 32c7478e 7c28ef9f 2bf691b1 b288bc0b
-0 0 8 306565 0 14 8 0 10 05db9164 38a947a1 4470baf4 8c8a4c47 25c83c98 7e0ccccf 2e85de94 0b153874 7cc72ec2 3b08e48b 8d6d03a0 bb669e25 86c652c6 b28479f6 091737ad 2b2ce127 776ce399 ade68c22 2b796e4a ad3062eb be7c41b4 8d365d3b
-0 0 1992 2 1451 31 1 20 31 0 1 2 be589b51 d833535f b00d1501 d16679b9 25c83c98 fbad5c96 9a75d128 0b153874 a73ee510 3b08e48b 90bf7fef e0d76380 a70d1580 b28479f6 a733d362 1203a270 e5ba7672 281769c2 73d06dde c9d4222a 32c7478e aee52b6f
-0 2 1 8 13233 164 13 8 88 7 0 8 05db9164 13f25995 0b0f3952 35d9e6fe 25c83c98 7e0ccccf 87e29668 0b153874 a73ee510 3b08e48b 0bc0e6ed ff8c6fd9 abd69a9d 07d13a8f 7cad642c 5015d391 8efede7f c7cf2414 3db17de9 32c7478e 4fe18e82
-0 50 15 6661 0 40 49 0 21 5bfa8ab5 08d6d899 77f2f2e5 d16679b9 25c83c98 7e0ccccf 7f2c5a6e a61cc0ef a73ee510 3b08e48b d21494f8 9f32b866 f47f13e4 b28479f6 bffbd637 31ca40b6 1e88c74f bbf70d82 dfcfc3fa c9d4222a 32c7478e aee52b6f
-1 0 13 2 10320 72 0 2 45 0 2 05db9164 09e68b86 e95580ff 653ee14f 25c83c98 fe6b92e5 26a81064 5b392875 a73ee510 dcbc7c2b 9e511730 49a381fa 04e4a7e0 1adce6ef dbc5e126 cf6ed269 d4bb7bd8 5aed7436 21ddcdc9 a458ea53 c9fcf5fd 3a171ecb 5e22c595 e8b83407 8e27cf04
-0 -1 11066 0 0 1 0 05db9164 38a947a1 a64c7bd9 67a8407c 25c83c98 fe6b92e5 71fd6dcd 0b153874 a73ee510 3b08e48b e5cd3d61 0a8d756f 08ba5c35 b28479f6 b7815e37 ef7d43b0 776ce399 a6bfeb0a 455f53fb 93bad2c0 928e948f
-0 107 8939 0 1 2 0 05db9164 a244fe99 25c83c98 7e0ccccf c8e48a82 0b153874 a73ee510 c6c8dd7c ae4c531b 01c2bbc7 07d13a8f 2f5df569 d4bb7bd8 35901cfb ad3062eb 423fab69
-0 0 2 15 12 2504 365 1 10 77 0 1 12 68fd1e64 08d6d899 03942b3f afe92929 25c83c98 7e0ccccf f4b9d7ad 0b153874 a73ee510 663eefea c1ee56d0 e977ae2f ebd756bd 07d13a8f 1a277242 82f06a35 d4bb7bd8 87c6f83c 08119c8b 55dd3565 f96a556f
-1 13 2153 1 25 37 3 13 9 29 2 2 3 68fd1e64 4f25e98b de211a17 a8925441 25c83c98 5e64ce5f 1f89b562 a73ee510 be630248 8b94178b fcaae253 025225f2 b28479f6 8ab5b746 3b58b07a e5ba7672 7ef5affa 9437f62f b1252a9d ce247dc1 32c7478e 3fdb382b 001f3601 0fd820a6
-1 37 72 2 3 4 2 49 42 222 1 5 2 05db9164 3f0d3f28 d73310fa b40012b1 4cf72387 fbad5c96 ad3508b1 0b153874 a73ee510 08658f3b ad757a5a 0e466d8f 93b18cb5 32813e21 3440b690 f4219d4b e5ba7672 7da064fc 0471db05 ad3062eb c7dc6720 e5fca70a
-0 0 5 11541 0 0 7 0 05db9164 89ddfee8 15d7420a ff441594 25c83c98 7e0ccccf bdaf7920 0b153874 a73ee510 fbbf2c95 4c074d2a 5f27bc59 f948ca5d 051219e6 d5223973 e2b64862 1e88c74f 5bb2ec8e 0053530c a458ea53 2f4978df 32c7478e 75c8ca05 f0f449dd d21d0b82
-0 15 2 2 87297 0 3 23 0 3 05db9164 a8b6b751 3e67fbbb 10056215 25c83c98 7e0ccccf d9aa9d97 5b392875 7cc72ec2 3b08e48b c4adf918 d9f32d8d 85dbe138 b28479f6 694e45e3 345db5a2 776ce399 d787f192 21ddcdc9 5840adea 7463465b ad3062eb 32c7478e 3d236c54 001f3601 984e0db0
-0 30 1 12 5 11 5 608 19 286 1 47 1 5 05db9164 89ddfee8 ab2fe4c8 428cff52 43b19349 3bf701e7 407438c8 1f89b562 a73ee510 0a164266 755e4a50 3989acff 5978055e b28479f6 25753fb1 cf445916 8efede7f 5bb2ec8e 21ddcdc9 b1252a9d d64ee25a 78e2e389 32c7478e 0b351a52 e8b83407 b1c17344
-1 5 7 2 2 414 21 83 33 925 1 36 2 68fd1e64 421b43cd 06ded108 29998ed1 43b19349 7e0ccccf 4aa938fc 5b392875 a73ee510 03ed27e7 2b9c7071 6aaba33c 1aa94af3 b28479f6 2d0bb053 b041b04a e5ba7672 2804effd 723b4dfd c9d4222a 3a171ecb b34f3128
-0 1 6 21905 0 15 49 0 6 05db9164 62e9e9bf 91c52fd6 89085a81 43b19349 fe6b92e5 e88f1cec 45f7c2dd a73ee510 3b08e48b 8f410860 5ad710aa b8eec0b1 cfef1c29 9a7936cb 9decb3fe 776ce399 d2651d6e c7d10c5e be7c41b4 6f90ebe1
-0 0 174 5 14718 10 0 5 5a9ed9b0 2fe85f57 b61789da 230aba50 25c83c98 fe6b92e5 3a6d4c08 0b153874 a73ee510 d108fc83 41656eae 24604d0c 66815d59 07d13a8f d8524628 78d9f0d0 e5ba7672 f4373605 ab303097 c9d4222a 32c7478e fab2a151
-0 0 7 1 15780 12 6 1 1 1 1 05db9164 8ab240be cedcacac 7967fcf5 25c83c98 7e0ccccf 5f29da0e 0b153874 a73ee510 f476fbe3 0ad37b4b 553e02c3 f9d99d81 1adce6ef 28883800 91a6eec5 1e88c74f ca533012 21ddcdc9 5840adea a97b62ca 423fab69 727a7cc7 445bbe3b 6935065e
-0 0 2 1 1540 44 4 4 268 0 4 5 05db9164 68b3edbf 77f2f2e5 d16679b9 25c83c98 7e0ccccf fcf0132a 1f89b562 a73ee510 aed3d80e d650f1bd 9f32b866 863f8f8a b28479f6 f511c49f 31ca40b6 e5ba7672 752d8b8a dfcfc3fa c7dc6720 aee52b6f
-0 7 31 1 239 1 8 9 49 1 2 0 1 68fd1e64 8084ee93 d032c263 c18be181 43b19349 fe6b92e5 cee47266 0b153874 a73ee510 14781fa9 87fe3e10 dfbb09fb 3bd6c21d b28479f6 16d2748c 84898b2a 27c07bd6 003d4f4f 0014c32a 32c7478e 3b183c5c
-0 -1 12674 4 26 0 73 2 05db9164 09e68b86 eecaacb9 d268ac84 25c83c98 13718bbd 33cca6fa 0b153874 a73ee510 401ced54 683e14e9 ce76d69d 2b9fb512 b28479f6 52baadf5 7bf10350 e5ba7672 5aed7436 55dd3565 b1252a9d 3d7cfd1b 3a171ecb 3fdb382b 3d2bedd7 49d68486
-0 259 4 103468 0 0 14 0 05db9164 8947f767 d8ec4c68 ac1667dd 4cf72387 7e0ccccf 3527bb7c 0b153874 7cc72ec2 3b08e48b 2b9f131d 2a63b3ee aca10c14 07d13a8f 2c14c412 11b43c2e 8efede7f bd17c3da 21ddcdc9 a458ea53 79a05ba5 32c7478e 4fb9fee0 010f6491 004f1180
-1 3 145 4 108 6 4 4 31 1 2 4 8cf07265 6c2cbbdc a42bd759 8b3b6b2e 25c83c98 f00bddf8 062b5529 a73ee510 0d538fca 55795b33 6bb7b021 39795005 64c94865 af094307 c3815fe3 e5ba7672 fb299884 987d0b7a 32c7478e 145ae095
-1 147 1 159966 0 1 1 0 1 68fd1e64 38d50e09 c86b2d8d 657dc3b9 25c83c98 7e0ccccf bc324536 1f89b562 7cc72ec2 474773a7 2bcfb78f 1ca7a526 e6fc496d b28479f6 06373944 ba46c3a1 e5ba7672 fffe2a63 21ddcdc9 b1252a9d eb0fc6f8 ad3062eb 32c7478e df487a73 001f3601 c27f155b
diff --git a/models/rank/dnn/model.py b/models/rank/dnn/model.py
index 71f88d627213d8ddab2f3ecaa97e105553d4a99a..f4425e3d9853b7f7decbc45ff607c4173901d0cf 100755
--- a/models/rank/dnn/model.py
+++ b/models/rank/dnn/model.py
@@ -13,90 +13,60 @@
# limitations under the License.
import math
+
import paddle.fluid as fluid
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
- def input(self):
- def sparse_inputs():
- ids = envs.get_global_env("hyper_parameters.sparse_inputs_slots", None, self._namespace)
-
- sparse_input_ids = [
- fluid.layers.data(name="S" + str(i),
- shape=[1],
- lod_level=1,
- dtype="int64") for i in range(1, ids)
- ]
- return sparse_input_ids
-
- def dense_input():
- dim = envs.get_global_env("hyper_parameters.dense_input_dim", None, self._namespace)
-
- dense_input_var = fluid.layers.data(name="D",
- shape=[dim],
- dtype="float32")
- return dense_input_var
-
- def label_input():
- label = fluid.layers.data(name="click", shape=[1], dtype="int64")
- return label
-
- self.sparse_inputs = sparse_inputs()
- self.dense_input = dense_input()
- self.label_input = label_input()
-
- self._data_var.append(self.dense_input)
-
- for input in self.sparse_inputs:
- self._data_var.append(input)
-
- self._data_var.append(self.label_input)
-
- if self._platform != "LINUX":
- self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=64, use_double_buffer=False, iterable=False)
-
- def net(self):
- is_distributed = True if envs.get_trainer() == "CtrTrainer" else False
- sparse_feature_number = envs.get_global_env("hyper_parameters.sparse_feature_number", None, self._namespace)
- sparse_feature_dim = envs.get_global_env("hyper_parameters.sparse_feature_dim", None, self._namespace)
+ def _init_hyper_parameters(self):
+ self.is_distributed = True if envs.get_trainer(
+ ) == "CtrTrainer" else False
+ self.sparse_feature_number = envs.get_global_env(
+ "hyper_parameters.sparse_feature_number")
+ self.sparse_feature_dim = envs.get_global_env(
+ "hyper_parameters.sparse_feature_dim")
+ self.learning_rate = envs.get_global_env(
+ "hyper_parameters.learning_rate")
+
+ def net(self, input, is_infer=False):
+ self.sparse_inputs = self._sparse_data_var[1:]
+ self.dense_input = [] #self._dense_data_var[0]
+ self.label_input = self._sparse_data_var[0]
def embedding_layer(input):
emb = fluid.layers.embedding(
input=input,
is_sparse=True,
- is_distributed=is_distributed,
- size=[sparse_feature_number, sparse_feature_dim],
+ is_distributed=self.is_distributed,
+ size=[self.sparse_feature_number, self.sparse_feature_dim],
param_attr=fluid.ParamAttr(
name="SparseFeatFactors",
- initializer=fluid.initializer.Uniform()),
- )
- emb_sum = fluid.layers.sequence_pool(
- input=emb, pool_type='sum')
+ initializer=fluid.initializer.Uniform()), )
+ emb_sum = fluid.layers.sequence_pool(input=emb, pool_type='sum')
return emb_sum
- def fc(input, output_size):
- output = fluid.layers.fc(
- input=input, size=output_size,
- act='relu', param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Normal(
- scale=1.0 / math.sqrt(input.shape[1]))))
- return output
-
sparse_embed_seq = list(map(embedding_layer, self.sparse_inputs))
- concated = fluid.layers.concat(sparse_embed_seq + [self.dense_input], axis=1)
+ concated = fluid.layers.concat(sparse_embed_seq, axis=1)
+ #sparse_embed_seq + [self.dense_input], axis=1)
fcs = [concated]
- hidden_layers = envs.get_global_env("hyper_parameters.fc_sizes", None, self._namespace)
+ hidden_layers = envs.get_global_env("hyper_parameters.fc_sizes")
for size in hidden_layers:
- fcs.append(fc(fcs[-1], size))
+ output = fluid.layers.fc(
+ input=fcs[-1],
+ size=size,
+ act='relu',
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Normal(
+ scale=1.0 / math.sqrt(fcs[-1].shape[1]))))
+ fcs.append(output)
predict = fluid.layers.fc(
input=fcs[-1],
@@ -107,30 +77,25 @@ class Model(ModelBase):
self.predict = predict
- def avg_loss(self):
- cost = fluid.layers.cross_entropy(input=self.predict, label=self.label_input)
- avg_cost = fluid.layers.reduce_mean(cost)
- self._cost = avg_cost
-
- def metrics(self):
auc, batch_auc, _ = fluid.layers.auc(input=self.predict,
label=self.label_input,
- num_thresholds=2 ** 12,
+ num_thresholds=2**12,
slide_steps=20)
+ if is_infer:
+ self._infer_results["AUC"] = auc
+ self._infer_results["BATCH_AUC"] = batch_auc
+ return
+
self._metrics["AUC"] = auc
self._metrics["BATCH_AUC"] = batch_auc
-
- def train_net(self):
- self.input()
- self.net()
- self.avg_loss()
- self.metrics()
+ cost = fluid.layers.cross_entropy(
+ input=self.predict, label=self.label_input)
+ avg_cost = fluid.layers.reduce_mean(cost)
+ self._cost = avg_cost
def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
- optimizer = fluid.optimizer.Adam(learning_rate, lazy_mode=True)
+ optimizer = fluid.optimizer.Adam(self.learning_rate, lazy_mode=True)
return optimizer
def infer_net(self):
- self.input()
- self.net()
+ pass
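In the refactored `net` above, every sparse slot passes through `embedding_layer`: an embedding lookup over the slot's variable-length id list (`lod_level=1` data), followed by `fluid.layers.sequence_pool(pool_type='sum')`, which collapses each sample's ids into one fixed-size vector before the concat and FC stack. A minimal pure-Python sketch of that embed-and-sum-pool step (the toy embedding table and id lists below are made up for illustration only):

```python
def embed_sum_pool(batch_ids, emb_table):
    """Look up an embedding for every id in a sample's variable-length
    id list and sum them into one fixed-size vector per sample."""
    dim = len(next(iter(emb_table.values())))
    pooled = []
    for ids in batch_ids:
        vec = [0.0] * dim
        for i in ids:
            for d, v in enumerate(emb_table[i]):
                vec[d] += v
        pooled.append(vec)
    return pooled

# Toy 3-row embedding table (dim=2) and a batch of 3 samples whose
# slot holds 2, 1 and 3 ids respectively (variable-length input).
table = {0: [0.1, 0.2], 1: [1.0, 1.0], 2: [0.5, -0.5]}
print(embed_sum_pool([[0, 1], [2], [1, 1, 2]], table))
```

Paddle performs the same reduction batched inside the framework; the sketch only mirrors the per-sample semantics of a sum-pooled `lod_level=1` input.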
diff --git a/models/rank/readme.md b/models/rank/readme.md
index e458bc04e43a20a4b802a705a55e4c490632c340..91e165b4edcedd72344bf1a57e134b0d037686e7 100755
--- a/models/rank/readme.md
+++ b/models/rank/readme.md
@@ -1,13 +1,13 @@
# Ranking Model Library
## Introduction
-We provide PaddleRec implementations of model algorithms commonly used in ranking tasks, along with single-machine training & prediction quality metrics and distributed training & prediction performance metrics. The implemented ranking models include [Multi-layer Neural Network](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/rank/dnn), [Deep Cross Network](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/rank/dcn), [DeepFM](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/rank/deepfm), [xDeepFM](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/rank/xdeepfm), [Deep Interest Network](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/rank/din), [Wide&Deep](http://gitlab.baidu.com/tangwei12/paddlerec/tree/develop/models/rank/wide_deep).
+We provide PaddleRec implementations of model algorithms commonly used in ranking tasks, along with single-machine training & prediction quality metrics and distributed training & prediction performance metrics. The implemented ranking models include [Multi-layer Neural Network](dnn), [Deep Cross Network](dcn), [DeepFM](deepfm), [xDeepFM](xdeepfm), [Deep Interest Network](din), [Wide&Deep](wide_deep).
The model algorithm library is continuously being extended; stay tuned.
## Table of Contents
* [Overview](#overview)
- * [Ranking Model List](#ranking-model-list)
+ * [Model List](#model-list)
* [Tutorial](#tutorial)
* [Data Processing](#data-processing)
* [Training](#training)
@@ -18,41 +18,115 @@
* [Model Performance List](#model-performance-list)
## Overview
-### Ranking Model List
+### Model List
| Model | Description | Paper |
| :------------------: | :--------------------: | :---------: |
| DNN | Multi-layer neural network | -- |
-| wide&deep | Deep + wide(LR) | [Wide & Deep Learning for Recommender Systems](https://dl.acm.org/doi/abs/10.1145/2988450.2988454)(2016) |
-| DeepFM | DeepFM | [DeepFM: A Factorization-Machine based Neural Network for CTR Prediction](https://arxiv.org/abs/1703.04247)(2017) |
-| xDeepFM | xDeepFM | [xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems](https://dl.acm.org/doi/abs/10.1145/3219819.3220023)(2018) |
-| DCN | Deep Cross Network | [Deep & Cross Network for Ad Click Predictions](https://dl.acm.org/doi/abs/10.1145/3124749.3124754)(2017) |
-| DIN | Deep Interest Network | [Deep Interest Network for Click-Through Rate Prediction](https://dl.acm.org/doi/abs/10.1145/3219819.3219823)(2018) |
+| wide&deep | Deep + wide(LR) | [Wide & Deep Learning for Recommender Systems](https://dl.acm.org/doi/pdf/10.1145/2988450.2988454)(2016) |
+| DeepFM | DeepFM | [DeepFM: A Factorization-Machine based Neural Network for CTR Prediction](https://arxiv.org/pdf/1703.04247.pdf)(2017) |
+| DCN | Deep Cross Network | [Deep & Cross Network for Ad Click Predictions](https://dl.acm.org/doi/pdf/10.1145/3124749.3124754)(2017) |
+| xDeepFM | xDeepFM | [xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems](https://dl.acm.org/doi/pdf/10.1145/3219819.3220023)(2018) |
+| DIN | Deep Interest Network | [Deep Interest Network for Click-Through Rate Prediction](https://dl.acm.org/doi/pdf/10.1145/3219819.3219823)(2018) |
-## Tutorial
+Below is a brief introduction to each model (note: the figures are taken from the papers linked above).
+
+[wide&deep](https://dl.acm.org/doi/pdf/10.1145/2988450.2988454):
+
+
+
+
+[DeepFM](https://arxiv.org/pdf/1703.04247.pdf):
+
+
+
+
+[XDeepFM](https://dl.acm.org/doi/pdf/10.1145/3219819.3220023):
+
+
+
+
+[DCN](https://dl.acm.org/doi/pdf/10.1145/3124749.3124754):
+
+
+
+
+[DIN](https://dl.acm.org/doi/pdf/10.1145/3219819.3219823):
+
+
+
+
+## Tutorial (Quick Start)
+To get started quickly on the sample data, see [Training](#training) & [Prediction](#prediction).
+## Tutorial (Reproducing Paper Results)
+To make it easy to run every model end to end, we ship sample data in each model's directory and tune hyperparameters such as batch_size so that training & test logs display nicely on that sample data. To reproduce the results in this readme, adjust batch_size and the other hyperparameters according to the table below, and use the provided scripts to download and preprocess the corresponding dataset.
+| Model | batch_size | thread_num | epoch_num |
+| :------------------: | :--------------------: | :--------------------: | :--------------------: |
+| DNN | 1000 | 10 | 1 |
+| DCN | 512 | 20 | 2 |
+| DeepFM | 100 | 10 | 30 |
+| DIN | 32 | 10 | 100 |
+| Wide&Deep | 40 | 1 | 40 |
+| xDeepFM | 100 | 1 | 10 |
### Data Processing
+See the data download & preprocessing script under each model's directory:
+
+```
+sh run.sh
+```
+
+Data reading uses core/reader.py by default.
+
### Training
+```
+cd models/rank/dnn # enter the chosen ranking model's directory, using DNN as the example
+python -m paddlerec.run -m paddlerec.models.rank.dnn # use the built-in config
+# To use a custom config, set workspace in config.yaml to the absolute path of this model's directory
+# After customizing the hyperparameters, specify the config file to use the custom config
+python -m paddlerec.run -m ./config.yaml
+```
### Prediction
+```
+# In the model's config.yaml, set mode to infer_runner
+# Example: mode: runner1 -> mode: infer_runner
+# In infer_runner, set class to: class: single_infer
+# If the model inputs are identical in training and inference, the phase section needs no change; reuse the one from train
+
+# After editing config.yaml, run:
+python -m paddlerec.run -m ./config.yaml # DNN as the example
+```
## Results Comparison
-### Model Results List
+### Model Results (Test)
-| Dataset | Model | loss | test auc | acc | mae |
+| Dataset | Model | loss | auc | acc | mae |
| :------------------: | :--------------------: | :---------: |:---------: | :---------: |:---------: |
| Criteo | DNN | -- | 0.79395 | -- | -- |
-| Criteo | DeepFM | 0.44797 | 0.8046 | -- | -- |
-| Criteo | DCN | 0.44703564 | 0.80654419 | -- | -- |
-| Criteo | xDeepFM | -- | -- | 0.48657 | -- |
-| Census-income Data | Wide&Deep | 0.76195(mean) | 0.90577(mean) | -- | -- |
-| Amazon Product | DIN | 0.47005194 | 0.863794952818 | -- | -- |
+| Criteo | DeepFM | 0.44797 | 0.80460 | -- | -- |
+| Criteo | DCN | 0.44704 | 0.80654 | -- | -- |
+| Criteo | xDeepFM | 0.48657 | -- | -- | -- |
+| Census-income Data | Wide&Deep | 0.76195 | 0.90577 | -- | -- |
+| Amazon Product | DIN | 0.47005 | 0.86379 | -- | -- |
+
## Distributed
-### Model Performance List
-| Dataset | Model | Single machine | Multi-machine (sync) | Multi-machine (async) | GPU |
-| :------------------: | :--------------------: | :---------: |:---------: |:---------: |:---------: |
-| Criteo | DNN | -- | -- | -- | -- |
-| Criteo | DeepFM | -- | -- | -- | -- |
-| Criteo | DCN | -- | -- | -- | -- |
-| Criteo | xDeepFM | -- | -- | -- | -- |
-| Census-income Data | Wide&Deep | -- | -- | -- | -- |
-| Amazon Product | DIN | -- | -- | -- | -- |
+### Training throughput (samples/s)
+| Dataset | Model | Single node | Sync (4 nodes) | Sync (8 nodes) | Sync (16 nodes) | Sync (32 nodes) |
+| :------------------: | :--------------------: | :---------: |:---------: |:---------: |:---------: |:---------: |
+| Criteo | DNN | 99821 | 148788 | 148788 | 507936 | 856032 |
+| Criteo | DeepFM | -- | -- | -- | -- | -- |
+| Criteo | DCN | -- | -- | -- | -- | -- |
+| Criteo | xDeepFM | -- | -- | -- | -- | -- |
+| Census-income Data | Wide&Deep | -- | -- | -- | -- | -- |
+| Amazon Product | DIN | -- | -- | -- | -- | -- |
+
+----
+
+| Dataset | Model | Single node | Async (4 nodes) | Async (8 nodes) | Async (16 nodes) | Async (32 nodes) |
+| :------------------: | :--------------------: | :---------: |:---------: |:---------: |:---------: |:---------: |
+| Criteo | DNN | 99821 | 316918 | 602536 | 1130557 | 2048384 |
+| Criteo | DeepFM | -- | -- | -- | -- | -- |
+| Criteo | DCN | -- | -- | -- | -- | -- |
+| Criteo | xDeepFM | -- | -- | -- | -- | -- |
+| Census-income Data | Wide&Deep | -- | -- | -- | -- | -- |
+| Amazon Product | DIN | -- | -- | -- | -- | -- |
diff --git a/models/rank/tagspace/config.yaml b/models/rank/tagspace/config.yaml
deleted file mode 100644
index ce0283e24e0298d20f6d1b5de7200009eb9c9f35..0000000000000000000000000000000000000000
--- a/models/rank/tagspace/config.yaml
+++ /dev/null
@@ -1,50 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-train:
- trainer:
- # for cluster training
- strategy: "async"
-
- epochs: 10
- workspace: "fleetrec.models.rank.tagspace"
-
- reader:
- batch_size: 5
- class: "{workspace}/reader.py"
- train_data_path: "{workspace}/train_data"
-
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- vocab_text_size: 11447
- vocab_tag_size: 4
- emb_dim: 10
- hid_dim: 1000
- win_size: 5
- margin: 0.1
- neg_size: 3
- num_devices: 1
-
-
- save:
- increment:
- dirname: "increment"
- epoch_interval: 1
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 100
- save_last: True
-
diff --git a/models/rank/tagspace/model.py b/models/rank/tagspace/model.py
deleted file mode 100644
index 46d3d7529db7a689d2769642014e25ea6e3b8142..0000000000000000000000000000000000000000
--- a/models/rank/tagspace/model.py
+++ /dev/null
@@ -1,95 +0,0 @@
-import paddle.fluid as fluid
-import math
-
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
-
-import paddle.fluid as fluid
-import paddle.fluid.layers.nn as nn
-import paddle.fluid.layers.tensor as tensor
-import paddle.fluid.layers.control_flow as cf
-
-class Model(ModelBase):
- def __init__(self, config):
- ModelBase.__init__(self, config)
- self.cost = None
- self.metrics = {}
- self.vocab_text_size = 11447#envs.get_global_env("vocab_text_size", None, self._namespace)
- self.vocab_tag_size = 4#envs.get_global_env("vocab_tag_size", None, self._namespace)
- self.emb_dim = 10#envs.get_global_env("emb_dim", None, self._namespace)
- self.hid_dim = 1000#envs.get_global_env("hid_dim", None, self._namespace)
- self.win_size = 5#envs.get_global_env("win_size", None, self._namespace)
- self.margin = 0.1#envs.get_global_env("margin", None, self._namespace)
- self.neg_size = 3#envs.get_global_env("neg_size", None, self._namespace)
- print self.emb_dim
-
- def train_net(self):
- """ network definition """
- text = fluid.data(name="text", shape=[None, 1], lod_level=1, dtype='int64')
- pos_tag = fluid.data(
- name="pos_tag", shape=[None, 1], lod_level=1, dtype='int64')
- neg_tag = fluid.data(
- name="neg_tag", shape=[None, 1], lod_level=1, dtype='int64')
-
- self._data_var = [text, pos_tag, neg_tag]
-
- text_emb = fluid.embedding(
- input=text, size=[self.vocab_text_size, self.emb_dim], param_attr="text_emb")
- text_emb = fluid.layers.squeeze(input=text_emb, axes=[1])
- pos_tag_emb = fluid.embedding(
- input=pos_tag, size=[self.vocab_tag_size, self.emb_dim], param_attr="tag_emb")
- pos_tag_emb = fluid.layers.squeeze(input=pos_tag_emb, axes=[1])
- neg_tag_emb = fluid.embedding(
- input=neg_tag, size=[self.vocab_tag_size, self.emb_dim], param_attr="tag_emb")
- neg_tag_emb = fluid.layers.squeeze(input=neg_tag_emb, axes=[1])
-
- conv_1d = fluid.nets.sequence_conv_pool(
- input=text_emb,
- num_filters=self.hid_dim,
- filter_size=self.win_size,
- act="tanh",
- pool_type="max",
- param_attr="cnn")
- text_hid = fluid.layers.fc(input=conv_1d,
- size=self.emb_dim,
- param_attr="text_hid")
- cos_pos = nn.cos_sim(pos_tag_emb, text_hid)
- mul_text_hid = fluid.layers.sequence_expand_as(x=text_hid, y=neg_tag_emb)
- mul_cos_neg = nn.cos_sim(neg_tag_emb, mul_text_hid)
- cos_neg_all = fluid.layers.sequence_reshape(
- input=mul_cos_neg, new_dim=self.neg_size)
- #choose max negtive cosine
- cos_neg = nn.reduce_max(cos_neg_all, dim=1, keep_dim=True)
- #calculate hinge loss
- loss_part1 = nn.elementwise_sub(
- tensor.fill_constant_batch_size_like(
- input=cos_pos, shape=[-1, 1], value=self.margin, dtype='float32'),
- cos_pos)
- loss_part2 = nn.elementwise_add(loss_part1, cos_neg)
- loss_part3 = nn.elementwise_max(
- tensor.fill_constant_batch_size_like(
- input=loss_part2, shape=[-1, 1], value=0.0, dtype='float32'),
- loss_part2)
- avg_cost = nn.mean(loss_part3)
- less = tensor.cast(cf.less_than(cos_neg, cos_pos), dtype='float32')
- correct = nn.reduce_sum(less)
- self.cost = avg_cost
-
- self.metrics["correct"] = correct
- self.metrics["cos_pos"] = cos_pos
-
- def get_cost_op(self):
- return self.cost
-
- def get_metrics(self):
- return self.metrics
-
- def optimizer(self):
- learning_rate = 0.01#envs.get_global_env("hyper_parameters.base_lr", None, self._namespace)
- sgd_optimizer = fluid.optimizer.Adagrad(learning_rate=learning_rate)
- #sgd_optimizer.minimize(avg_cost)
- return sgd_optimizer
-
-
- def infer_net(self, parameter_list):
- self.train_net()
diff --git a/models/rank/text_classification/config.yaml b/models/rank/text_classification/config.yaml
deleted file mode 100644
index 2104a6131523fe20540e243e1b42d82ae81a800f..0000000000000000000000000000000000000000
--- a/models/rank/text_classification/config.yaml
+++ /dev/null
@@ -1,40 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-train:
- trainer:
- # for cluster training
- strategy: "async"
-
- epochs: 10
- workspace: "fleetrec.models.rank.text_classification"
-
- reader:
- batch_size: 5
- class: "{workspace}/reader.py"
- train_data_path: "{workspace}/train_data"
-
- model:
- models: "{workspace}/model.py"
-
- save:
- increment:
- dirname: "increment"
- epoch_interval: 1
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 100
- save_last: True
-
diff --git a/models/rank/text_classification/model.py b/models/rank/text_classification/model.py
deleted file mode 100644
index 0218ab8c48c1508db623a27a73e611ba20257a36..0000000000000000000000000000000000000000
--- a/models/rank/text_classification/model.py
+++ /dev/null
@@ -1,60 +0,0 @@
-import paddle.fluid as fluid
-import math
-
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
-
-import paddle.fluid as fluid
-import paddle.fluid.layers.nn as nn
-import paddle.fluid.layers.tensor as tensor
-import paddle.fluid.layers.control_flow as cf
-
-class Model(ModelBase):
- def __init__(self, config):
- ModelBase.__init__(self, config)
-
- def train_net(self):
- """ network definition """
-
- data = fluid.data(name="input", shape=[None, max_len], dtype='int64')
- label = fluid.data(name="label", shape=[None, 1], dtype='int64')
- seq_len = fluid.data(name="seq_len", shape=[None], dtype='int64')
- # embedding layer
- emb = fluid.embedding(input=data, size=[dict_dim, emb_dim])
- emb = fluid.layers.sequence_unpad(emb, length=seq_len)
- # convolution layer
- conv = fluid.nets.sequence_conv_pool(
- input=emb,
- num_filters=cnn_dim,
- filter_size=cnn_filter_size,
- act="tanh",
- pool_type="max")
-
- # full connect layer
- fc_1 = fluid.layers.fc(input=[conv], size=hid_dim)
- # softmax layer
- prediction = fluid.layers.fc(input=[fc_1], size=class_dim, act="softmax")
- #if is_prediction:
- # return prediction
- cost = fluid.layers.cross_entropy(input=prediction, label=label)
- avg_cost = fluid.layers.mean(x=cost)
- acc = fluid.layers.accuracy(input=prediction, label=label)
-
- self.cost = avg_cost
- self.metrics["acc"] = cos_pos
-
- def get_cost_op(self):
- return self.cost
-
- def get_metrics(self):
- return self.metrics
-
- def optimizer(self):
- learning_rate = 0.01#envs.get_global_env("hyper_parameters.base_lr", None, self._namespace)
- sgd_optimizer = fluid.optimizer.Adagrad(learning_rate=learning_rate)
- #sgd_optimizer.minimize(avg_cost)
- return sgd_optimizer
-
-
- def infer_net(self, parameter_list):
- self.train_net()
diff --git a/models/rank/text_classification/reader.py b/models/rank/text_classification/reader.py
deleted file mode 100644
index 36d07b2c553e452a23b8d53ce4553a8f5245b84a..0000000000000000000000000000000000000000
--- a/models/rank/text_classification/reader.py
+++ /dev/null
@@ -1,34 +0,0 @@
-import re
-import sys
-import collections
-import os
-import six
-import time
-import numpy as np
-import paddle.fluid as fluid
-import paddle
-import csv
-import io
-
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
-
-class TrainReader(Reader):
- def init(self):
- pass
-
- def _process_line(self, l):
- l = l.strip().split(" ")
- data = l[0:10]
- seq_len = l[10:11]
- label = l[11:]
- return data, label, seq_len
-
- def generate_sample(self, line):
- def data_iter():
- data, label, seq_len = self._process_line(line)
- if data is None:
- yield None
- return
- yield [('data', data), ('label', label), ('seq_len', seq_len)]
- return data_iter
diff --git a/models/rank/wide_deep/config.yaml b/models/rank/wide_deep/config.yaml
index 4608715acc8ea8e1f02c2b623edd129c2453dd31..af9e106e24a6c9a6e985f671fabf0e60c4f8608f 100755
--- a/models/rank/wide_deep/config.yaml
+++ b/models/rank/wide_deep/config.yaml
@@ -12,36 +12,59 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
-
- epochs: 10
- workspace: "fleetrec.models.rank.wide_deep"
-
- reader:
- batch_size: 2
- class: "{workspace}/reader.py"
- train_data_path: "{workspace}/data/train_data"
-
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- hidden1_units: 75
- hidden2_units: 50
- hidden3_units: 25
- learning_rate: 0.0001
- reg: 0.001
- act: "relu"
- optimizer: SGD
-
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+# global settings
+debug: false
+workspace: "paddlerec.models.rank.wide_deep"
+
+
+dataset:
+ - name: sample_1
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "label"
+ dense_slots: "wide_input:8 deep_input:58"
+ - name: infer_sample
+ type: QueueDataset
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "label"
+ dense_slots: "wide_input:8 deep_input:58"
+
+hyper_parameters:
+ optimizer:
+ class: SGD
+ learning_rate: 0.0001
+ hidden1_units: 75
+ hidden2_units: 50
+ hidden3_units: 25
+
+
+mode: train_runner
+# if infer, change mode to "infer_runner" and change phase to "infer_phase"
+
+runner:
+ - name: train_runner
+ trainer_class: single_train
+ epochs: 1
+ device: cpu
+ init_model_path: ""
+ save_checkpoint_interval: 1
+ save_inference_interval: 1
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ - name: infer_runner
+ trainer_class: single_infer
+ epochs: 1
+ device: cpu
+ init_model_path: "increment/0"
+
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+ dataset_name: sample_1
+ thread_num: 1
+#- name: infer_phase
+# model: "{workspace}/model.py"
+# dataset_name: infer_sample
+# thread_num: 1
diff --git a/models/rank/wide_deep/create_data.sh b/models/rank/wide_deep/create_data.sh
deleted file mode 100755
index 3e5e2f4ef3ea38652302d81ef3441ce5e6f0e838..0000000000000000000000000000000000000000
--- a/models/rank/wide_deep/create_data.sh
+++ /dev/null
@@ -1,17 +0,0 @@
-mkdir train_data
-mkdir test_data
-mkdir data
-train_path="/home/yaoxuefeng/repos/models/models/PaddleRec/ctr/wide_deep/data/adult.data"
-test_path="/home/yaoxuefeng/repos/models/models/PaddleRec/ctr/wide_deep/data/adult.test"
-train_data_path="/home/yaoxuefeng/repos/models/models/PaddleRec/ctr/wide_deep/train_data/train_data.csv"
-test_data_path="/home/yaoxuefeng/repos/models/models/PaddleRec/ctr/wide_deep/test_data/test_data.csv"
-
-#pip install -r requirements.txt
-
-#wget -P data/ https://archive.ics.uci.edu/ml/machine-learning-databases/adult/adult.data
-#wget -P data/ https://archive.ics.uci.edu/ml/machine-learning-databases/adult/adult.test
-
-python data_preparation.py --train_path ${train_path} \
- --test_path ${test_path} \
- --train_data_path ${train_data_path}\
- --test_data_path ${test_data_path}
diff --git a/models/rank/wide_deep/data/create_data.sh b/models/rank/wide_deep/data/create_data.sh
new file mode 100755
index 0000000000000000000000000000000000000000..daf60cea46562a3d910177f509d51d261d69cf1d
--- /dev/null
+++ b/models/rank/wide_deep/data/create_data.sh
@@ -0,0 +1,16 @@
+mkdir train_data
+mkdir test_data
+train_path="adult.data"
+test_path="adult.test"
+train_data_path="./train_data/train_data.csv"
+test_data_path="./test_data/test_data.csv"
+
+pip install -r requirements.txt
+
+wget https://archive.ics.uci.edu/ml/machine-learning-databases/adult/adult.data
+wget https://archive.ics.uci.edu/ml/machine-learning-databases/adult/adult.test
+
+python data_preparation.py --train_path ${train_path} \
+ --test_path ${test_path} \
+ --train_data_path ${train_data_path}\
+ --test_data_path ${test_data_path}
diff --git a/models/rank/wide_deep/data/data_preparation.py b/models/rank/wide_deep/data/data_preparation.py
new file mode 100644
index 0000000000000000000000000000000000000000..885070096cd3fd084e9695919121f782505b9e77
--- /dev/null
+++ b/models/rank/wide_deep/data/data_preparation.py
@@ -0,0 +1,151 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import os
+import io
+import args
+import pandas as pd
+from sklearn import preprocessing
+
+
+def _clean_file(source_path, target_path):
+ """makes changes to match the CSV format."""
+ with io.open(source_path, 'r') as temp_eval_file:
+ with io.open(target_path, 'w') as eval_file:
+ for line in temp_eval_file:
+ line = line.strip()
+ line = line.replace(', ', ',')
+ if not line or ',' not in line:
+ continue
+ if line[-1] == '.':
+ line = line[:-1]
+ line += '\n'
+ eval_file.write(line)
+
+
+def build_model_columns(train_data_path, test_data_path):
+ # The column names are from
+ # https://www2.1010data.com/documentationcenter/prod/Tutorials/MachineLearningExamples/CensusIncomeDataSet.html
+ column_names = [
+ 'age', 'workclass', 'fnlwgt', 'education', 'education_num',
+ 'marital_status', 'occupation', 'relationship', 'race', 'gender',
+ 'capital_gain', 'capital_loss', 'hours_per_week', 'native_country',
+ 'income_bracket'
+ ]
+
+ # Load the dataset in Pandas
+ train_df = pd.read_csv(
+ train_data_path,
+ delimiter=',',
+ header=None,
+ index_col=None,
+ names=column_names)
+ test_df = pd.read_csv(
+ test_data_path,
+ delimiter=',',
+ header=None,
+ index_col=None,
+ names=column_names)
+
+ # First group of tasks according to the paper
+ #label_columns = ['income_50k', 'marital_stat']
+ categorical_columns = [
+ 'education', 'marital_status', 'relationship', 'workclass',
+ 'occupation'
+ ]
+ for col in categorical_columns:
+ label_train = preprocessing.LabelEncoder()
+ train_df[col] = label_train.fit_transform(train_df[col])
+ label_test = preprocessing.LabelEncoder()
+ test_df[col] = label_test.fit_transform(test_df[col])
+
+ bins = [18, 25, 30, 35, 40, 45, 50, 55, 60, 65]
+ train_df['age_buckets'] = pd.cut(train_df['age'].values.tolist(),
+ bins,
+ labels=False)
+ test_df['age_buckets'] = pd.cut(test_df['age'].values.tolist(),
+ bins,
+ labels=False)
+
+ base_columns = [
+ 'education', 'marital_status', 'relationship', 'workclass',
+ 'occupation', 'age_buckets'
+ ]
+
+ train_df['education_occupation'] = train_df['education'].astype(
+ str) + '_' + train_df['occupation'].astype(str)
+ test_df['education_occupation'] = test_df['education'].astype(
+ str) + '_' + test_df['occupation'].astype(str)
+ train_df['age_buckets_education_occupation'] = train_df[
+ 'age_buckets'].astype(str) + '_' + train_df['education'].astype(
+ str) + '_' + train_df['occupation'].astype(str)
+ test_df['age_buckets_education_occupation'] = test_df[
+ 'age_buckets'].astype(str) + '_' + test_df['education'].astype(
+ str) + '_' + test_df['occupation'].astype(str)
+ crossed_columns = [
+ 'education_occupation', 'age_buckets_education_occupation'
+ ]
+
+ for col in crossed_columns:
+ label_train = preprocessing.LabelEncoder()
+ train_df[col] = label_train.fit_transform(train_df[col])
+ label_test = preprocessing.LabelEncoder()
+ test_df[col] = label_test.fit_transform(test_df[col])
+
+ wide_columns = base_columns + crossed_columns
+
+ train_df_temp = pd.get_dummies(
+ train_df[categorical_columns], columns=categorical_columns)
+ test_df_temp = pd.get_dummies(
+ test_df[categorical_columns], columns=categorical_columns)
+ train_df = train_df.join(train_df_temp)
+ test_df = test_df.join(test_df_temp)
+
+ deep_columns = list(train_df_temp.columns) + [
+ 'age', 'education_num', 'capital_gain', 'capital_loss',
+ 'hours_per_week'
+ ]
+
+ train_df['label'] = train_df['income_bracket'].apply(
+ lambda x: 1 if x == '>50K' else 0)
+ test_df['label'] = test_df['income_bracket'].apply(
+ lambda x: 1 if x == '>50K' else 0)
+
+ with io.open('train_data/columns.txt', 'w') as f:
+ write_str = str(len(wide_columns)) + '\n' + str(len(
+ deep_columns)) + '\n'
+ f.write(write_str)
+ f.close()
+ with io.open('test_data/columns.txt', 'w') as f:
+ write_str = str(len(wide_columns)) + '\n' + str(len(
+ deep_columns)) + '\n'
+ f.write(write_str)
+ f.close()
+
+ train_df[wide_columns + deep_columns + ['label']].fillna(0).to_csv(
+ train_data_path, index=False)
+ test_df[wide_columns + deep_columns + ['label']].fillna(0).to_csv(
+ test_data_path, index=False)
+
+
+def clean_file(train_path, test_path, train_data_path, test_data_path):
+ _clean_file(train_path, train_data_path)
+ _clean_file(test_path, test_data_path)
+
+
+if __name__ == '__main__':
+ args = args.parse_args()
+ clean_file(args.train_path, args.test_path, args.train_data_path,
+ args.test_data_path)
+ build_model_columns(args.train_data_path, args.test_data_path)
diff --git a/models/rank/wide_deep/data/get_slot_data.py b/models/rank/wide_deep/data/get_slot_data.py
new file mode 100755
index 0000000000000000000000000000000000000000..831d05665b01649f22a3270ec949ebda2941928d
--- /dev/null
+++ b/models/rank/wide_deep/data/get_slot_data.py
@@ -0,0 +1,68 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import os
+
+import yaml
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+try:
+ import cPickle as pickle
+except ImportError:
+ import pickle
+import paddle.fluid.incubate.data_generator as dg
+
+
+class TrainReader(dg.MultiSlotDataGenerator):
+ def __init__(self, config):
+ dg.MultiSlotDataGenerator.__init__(self)
+
+ if os.path.isfile(config):
+ with open(config, 'r') as rb:
+ _config = yaml.load(rb.read(), Loader=yaml.FullLoader)
+ else:
+ raise ValueError("reader config only support yaml")
+
+ def init(self):
+ pass
+
+ def _process_line(self, line):
+ line = line.strip().split(',')
+ features = list(map(float, line))
+ wide_feat = features[0:8]
+ deep_feat = features[8:58 + 8]
+ label = features[-1]
+ return wide_feat, deep_feat, [label]
+
+ def generate_sample(self, line):
+ """
+ Read the data line by line and process it as a dictionary
+ """
+
+        def data_iter():
+            wide_feat, deep_feat, label = self._process_line(line)
+
+            s = ""
+            for i in [('wide_input', wide_feat), ('deep_input', deep_feat),
+                      ('label', label)]:
+                k = i[0]
+                v = i[1]
+                for j in v:
+                    s += " " + k + ":" + str(j)
+            print(s.strip())
+            yield None
+
+ return data_iter
+
+
+reader = TrainReader("../config.yaml")
+reader.init()
+reader.run_from_stdin()
diff --git a/models/rank/wide_deep/data/run.sh b/models/rank/wide_deep/data/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..7b4fb8492a05769377f388faece9e0dc0a82c6c0
--- /dev/null
+++ b/models/rank/wide_deep/data/run.sh
@@ -0,0 +1,13 @@
+sh create_data.sh
+
+mkdir slot_train_data
+for i in `ls ./train_data`
+do
+ cat train_data/$i | python get_slot_data.py > slot_train_data/$i
+done
+
+mkdir slot_test_data
+for i in `ls ./test_data`
+do
+ cat test_data/$i | python get_slot_data.py > slot_test_data/$i
+done
diff --git a/models/rank/wide_deep/data/sample_data/train/train_data.txt b/models/rank/wide_deep/data/sample_data/train/train_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..967b975d703d5aaa0a6e73e6fca74384c6e289dc
--- /dev/null
+++ b/models/rank/wide_deep/data/sample_data/train/train_data.txt
@@ -0,0 +1,500 @@
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:7.0 wide_input:1.0 wide_input:3.0 wide_input:203.0 wide_input:643.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:39.0 deep_input:13.0 deep_input:2174.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:4.0 wide_input:5.0 wide_input:211.0 wide_input:980.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:13.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:6.0 wide_input:3.0 wide_input:36.0 wide_input:519.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:6.0 wide_input:6.0 wide_input:101.0 wide_input:1054.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:10.0 wide_input:1.0 wide_input:204.0 wide_input:320.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:49.0 wide_input:531.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:6.0 wide_input:3.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:5.0 wide_input:172.0 wide_input:946.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:16.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:4.0 wide_input:6.0 wide_input:34.0 wide_input:1010.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:52.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:12.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:42.0 wide_input:362.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:14.0 deep_input:14084.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:211.0 wide_input:819.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:13.0 deep_input:5178.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:85.0 wide_input:553.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:80.0 label:1
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:10.0 wide_input:1.0 wide_input:204.0 wide_input:320.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:203.0 wide_input:153.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:2.0 wide_input:178.0 wide_input:459.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:195.0 wide_input:636.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:5.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:2.0 wide_input:152.0 wide_input:438.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:34.0 deep_input:4.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:6.0 wide_input:5.0 wide_input:0.0 wide_input:35.0 wide_input:24.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:7.0 wide_input:2.0 wide_input:37.0 wide_input:357.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:95.0 wide_input:562.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:12.0 wide_input:0.0 wide_input:4.0 wide_input:6.0 wide_input:4.0 wide_input:4.0 wide_input:49.0 wide_input:694.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:10.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:3.0 wide_input:16.0 wide_input:506.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:16.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:1
+wide_input:11.0 wide_input:5.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:6.0 wide_input:38.0 wide_input:1014.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:54.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:6.0 wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:5.0 wide_input:2.0 wide_input:169.0 wide_input:451.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:4.0 wide_input:97.0 wide_input:728.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:43.0 deep_input:7.0 deep_input:0.0 deep_input:2042.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:13.0 wide_input:7.0 wide_input:30.0 wide_input:1166.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:59.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:13.0 wide_input:7.0 wide_input:207.0 wide_input:1293.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:56.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:33.0 wide_input:22.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:6.0 wide_input:76.0 wide_input:1035.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:54.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:1
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:34.0 wide_input:517.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:39.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:80.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:33.0 wide_input:846.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:11.0 wide_input:0.0 wide_input:177.0 wide_input:128.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:52.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:49.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:3.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:211.0 wide_input:819.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:13.0 deep_input:0.0 deep_input:1408.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:3.0 wide_input:1.0 wide_input:1.0 wide_input:1.0 wide_input:77.0 wide_input:216.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:8.0 wide_input:0.0 wide_input:89.0 wide_input:58.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:15.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:7.0 wide_input:5.0 wide_input:102.0 wide_input:900.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:88.0 wide_input:57.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:21.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:1.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:26.0 wide_input:15.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:2.0 wide_input:80.0 wide_input:385.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 label:1
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:10.0 wide_input:5.0 wide_input:176.0 wide_input:949.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:6.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:2.0 wide_input:171.0 wide_input:453.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:10.0 wide_input:6.0 wide_input:204.0 wide_input:1133.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:13.0 wide_input:0.0 wide_input:207.0 wide_input:157.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:24.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:11.0 wide_input:5.0 wide_input:4.0 wide_input:4.0 wide_input:1.0 wide_input:5.0 wide_input:26.0 wide_input:840.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:36.0 wide_input:25.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:10.0 wide_input:7.0 wide_input:204.0 wide_input:1290.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:57.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:6.0 wide_input:37.0 wide_input:1013.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 label:0
+wide_input:12.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:49.0 wide_input:694.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:3.0 wide_input:4.0 wide_input:195.0 wide_input:804.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:8.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:1.0 wide_input:190.0 wide_input:306.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:4.0 wide_input:0.0 wide_input:85.0 wide_input:54.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:1483.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:18.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:14.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:10.0 wide_input:5.0 wide_input:66.0 wide_input:869.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:47.0 deep_input:15.0 deep_input:0.0 deep_input:1902.0 deep_input:60.0 label:1
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:1.0 wide_input:4.0 wide_input:5.0 wide_input:211.0 wide_input:980.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 label:1
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:5.0 wide_input:4.0 wide_input:5.0 wide_input:34.0 wide_input:847.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:47.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:13.0 wide_input:4.0 wide_input:81.0 wide_input:714.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:43.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:4.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:5.0 wide_input:143.0 wide_input:924.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:3.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:8.0 wide_input:2.0 wide_input:200.0 wide_input:480.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:26.0 wide_input:673.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:1.0 wide_input:37.0 wide_input:193.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:9.0 deep_input:5013.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:206.0 wide_input:322.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:13.0 deep_input:2407.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:5.0 wide_input:3.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:2.0 wide_input:146.0 wide_input:433.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:4.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:5.0 wide_input:31.0 wide_input:845.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:48.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:10.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:4.0 wide_input:16.0 wide_input:667.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:16.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:13.0 wide_input:1.0 wide_input:81.0 wide_input:220.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:29.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:58.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:33.0 wide_input:516.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:77.0 wide_input:216.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:6.0 wide_input:26.0 wide_input:1003.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:4.0 wide_input:5.0 wide_input:85.0 wide_input:885.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:76.0 wide_input:45.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:10.0 wide_input:0.0 wide_input:78.0 wide_input:47.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 label:0
+wide_input:9.0 wide_input:5.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:2.0 wide_input:206.0 wide_input:485.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:12.0 wide_input:1.0 wide_input:206.0 wide_input:322.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:70.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:88.0 wide_input:57.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:2.0 wide_input:4.0 wide_input:10.0 wide_input:0.0 wide_input:78.0 wide_input:1504.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:79.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:1.0 wide_input:38.0 wide_input:194.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:3.0 wide_input:175.0 wide_input:618.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:0.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:1452.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:67.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:2.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:103.0 wide_input:1527.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:18.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 label:0
+wide_input:5.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:5.0 wide_input:2.0 wide_input:155.0 wide_input:440.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:4.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:29.0 wide_input:1475.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:18.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:8.0 wide_input:6.0 wide_input:215.0 wide_input:1143.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:52.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:8.0 wide_input:5.0 wide_input:38.0 wide_input:851.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:7.0 wide_input:29.0 wide_input:1165.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:59.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:33.0 wide_input:679.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:9.0 deep_input:14344.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:0.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:6.0 wide_input:29.0 wide_input:1006.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:11.0 wide_input:5.0 wide_input:28.0 wide_input:842.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:42.0 wide_input:362.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:6.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:164.0 wide_input:282.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:10.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:10.0 wide_input:4.0 wide_input:16.0 wide_input:667.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:16.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:7.0 wide_input:190.0 wide_input:1280.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:57.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:84.0 wide_input:552.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:7.0 wide_input:1.0 wide_input:88.0 wide_input:227.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:29.0 wide_input:185.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:9.0 deep_input:0.0 deep_input:1573.0 deep_input:35.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:11.0 wide_input:2.0 wide_input:205.0 wide_input:484.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:6.0 wide_input:1.0 wide_input:87.0 wide_input:226.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:10.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:10.0 wide_input:5.0 wide_input:16.0 wide_input:834.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 deep_input:16.0 deep_input:0.0 deep_input:1902.0 deep_input:60.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:80.0 wide_input:549.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 label:1
+wide_input:7.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:5.0 wide_input:182.0 wide_input:955.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:8.0 wide_input:2.0 wide_input:38.0 wide_input:358.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:0.0 wide_input:49.0 wide_input:1491.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:76.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:211.0 wide_input:819.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:13.0 deep_input:15024.0 deep_input:0.0 deep_input:60.0 label:1
+wide_input:12.0 wide_input:4.0 wide_input:1.0 wide_input:6.0 wide_input:10.0 wide_input:5.0 wide_input:42.0 wide_input:855.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:47.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:13.0 wide_input:0.0 wide_input:81.0 wide_input:50.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:33.0 wide_input:189.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:3.0 wide_input:2.0 wide_input:33.0 wide_input:353.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:9.0 deep_input:7688.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:1452.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:6.0 deep_input:34095.0 deep_input:0.0 deep_input:32.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:6.0 wide_input:1.0 wide_input:101.0 wide_input:237.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:1.0 wide_input:2.0 wide_input:26.0 wide_input:347.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:6.0 wide_input:4.0 wide_input:36.0 wide_input:682.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:2.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:206.0 wide_input:156.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:14.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:3.0 wide_input:66.0 wide_input:540.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:15.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:8.0 wide_input:7.0 wide_input:38.0 wide_input:1173.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:56.0 deep_input:9.0 deep_input:0.0 deep_input:1887.0 deep_input:50.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:80.0 wide_input:219.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:10.0 deep_input:4064.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:7.0 wide_input:3.0 wide_input:37.0 wide_input:520.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:6.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:6.0 wide_input:6.0 wide_input:170.0 wide_input:1104.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:12.0 wide_input:7.0 wide_input:80.0 wide_input:1198.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:56.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:3.0 wide_input:5.0 wide_input:195.0 wide_input:967.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:6.0 wide_input:80.0 wide_input:1039.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:56.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:33.0 wide_input:22.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:89.0 wide_input:58.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:21.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:211.0 wide_input:650.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:4.0 wide_input:1.0 wide_input:211.0 wide_input:326.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:10.0 wide_input:1.0 wide_input:204.0 wide_input:320.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:89.0 wide_input:58.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:4.0 wide_input:5.0 wide_input:211.0 wide_input:980.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:47.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:1719.0 deep_input:28.0 label:0
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:175.0 wide_input:456.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:2.0 wide_input:25.0 wide_input:346.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:84.0 wide_input:552.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:39.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:178.0 wide_input:295.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:27.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:9.0 deep_input:0.0 deep_input:1762.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:3.0 wide_input:3.0 wide_input:33.0 wide_input:516.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:9.0 deep_input:4386.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:206.0 wide_input:646.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:7.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:13.0 wide_input:5.0 wide_input:179.0 wide_input:952.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:46.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:10.0 wide_input:3.0 wide_input:42.0 wide_input:525.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:3.0 wide_input:4.0 wide_input:33.0 wide_input:679.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:189.0 wide_input:305.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:49.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:84.0 wide_input:884.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:3.0 wide_input:8.0 wide_input:33.0 wide_input:1322.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:61.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:80.0 wide_input:219.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:89.0 wide_input:58.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:8.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:4.0 wide_input:190.0 wide_input:799.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:11.0 deep_input:0.0 deep_input:1564.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:2.0 wide_input:4.0 wide_input:13.0 wide_input:0.0 wide_input:81.0 wide_input:1507.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:70.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:14.0 wide_input:2.0 wide_input:31.0 wide_input:352.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:31.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:89.0 wide_input:58.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:11.0 wide_input:6.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:3.0 wide_input:38.0 wide_input:521.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:8.0 wide_input:98.0 wide_input:1366.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:64.0 deep_input:7.0 deep_input:0.0 deep_input:2179.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:4.0 wide_input:76.0 wide_input:709.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:4.0 wide_input:2.0 wide_input:1.0 wide_input:5.0 wide_input:77.0 wide_input:878.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:47.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:26.0 wide_input:347.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:2.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:211.0 wide_input:489.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:33.0 wide_input:22.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:21.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:6.0 wide_input:25.0 wide_input:1002.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:52.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:33.0 wide_input:846.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:213.0 wide_input:163.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:5.0 wide_input:4.0 wide_input:6.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:1506.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:71.0 deep_input:10.0 deep_input:0.0 deep_input:1816.0 deep_input:2.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:33.0 wide_input:189.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:9.0 wide_input:5.0 wide_input:2.0 wide_input:4.0 wide_input:8.0 wide_input:4.0 wide_input:215.0 wide_input:823.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:3.0 wide_input:0.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:120.0 wide_input:1542.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:68.0 deep_input:2.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:27.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:0.0 wide_input:4.0 wide_input:5.0 wide_input:4.0 wide_input:4.0 wide_input:49.0 wide_input:694.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:1.0 wide_input:34.0 wide_input:190.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:12.0 wide_input:4.0 wide_input:44.0 wide_input:690.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:3.0 wide_input:89.0 wide_input:557.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:182.0 wide_input:625.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:39.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 label:0
+wide_input:12.0 wide_input:6.0 wide_input:4.0 wide_input:7.0 wide_input:11.0 wide_input:5.0 wide_input:43.0 wide_input:856.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:101.0 wide_input:1525.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:18.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:16.0 label:0
+wide_input:8.0 wide_input:6.0 wide_input:1.0 wide_input:2.0 wide_input:10.0 wide_input:0.0 wide_input:190.0 wide_input:1593.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:66.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:1.0 wide_input:38.0 wide_input:194.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:9.0 deep_input:0.0 deep_input:1980.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:11.0 wide_input:1.0 wide_input:28.0 wide_input:184.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:6.0 wide_input:80.0 wide_input:1039.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:51.0 deep_input:10.0 deep_input:0.0 deep_input:1977.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:210.0 wide_input:325.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:10.0 wide_input:1.0 wide_input:204.0 wide_input:320.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:3.0 wide_input:4.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:29.0 wide_input:185.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:33.0 wide_input:22.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:21.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:6.0 wide_input:2.0 wide_input:36.0 wide_input:356.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:1483.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:18.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:12.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:204.0 wide_input:483.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:65.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:4.0 wide_input:4.0 wide_input:85.0 wide_input:717.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 label:1
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:77.0 wide_input:710.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:84.0 wide_input:223.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:5.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:3.0 wide_input:157.0 wide_input:604.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:4.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:1.0 wide_input:3.0 wide_input:77.0 wide_input:546.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:26.0 wide_input:347.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:6.0 wide_input:4.0 wide_input:4.0 wide_input:211.0 wide_input:819.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:6.0 wide_input:202.0 wide_input:1131.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:42.0 wide_input:362.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:10.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:10.0 wide_input:7.0 wide_input:16.0 wide_input:1157.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:58.0 deep_input:16.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 label:1
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:7.0 wide_input:3.0 wide_input:88.0 wide_input:556.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:88.0 wide_input:57.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:3.0 wide_input:4.0 wide_input:210.0 wide_input:818.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:13.0 wide_input:5.0 wide_input:81.0 wide_input:882.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:47.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:13.0 wide_input:4.0 wide_input:207.0 wide_input:816.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:41.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:4.0 wide_input:2.0 wide_input:2.0 wide_input:4.0 wide_input:14.0 wide_input:0.0 wide_input:138.0 wide_input:95.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:23.0 deep_input:3.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:84.0 wide_input:552.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:49.0 wide_input:531.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:14.0 deep_input:14084.0 deep_input:0.0 deep_input:55.0 label:1
+wide_input:12.0 wide_input:2.0 wide_input:2.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:42.0 wide_input:362.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:14.0 deep_input:7298.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:6.0 wide_input:12.0 wide_input:0.0 wide_input:29.0 wide_input:18.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:1.0 wide_input:42.0 wide_input:198.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:14.0 deep_input:0.0 deep_input:1876.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:25.0 wide_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:6.0 wide_input:211.0 wide_input:1139.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:51.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:10.0 wide_input:4.0 wide_input:78.0 wide_input:711.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:10.0 deep_input:0.0 deep_input:1340.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:1.0 wide_input:3.0 wide_input:26.0 wide_input:509.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:95.0 wide_input:1520.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:18.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:8.0 wide_input:3.0 wide_input:215.0 wide_input:654.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:13.0 deep_input:7298.0 deep_input:0.0 deep_input:36.0 label:1
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:33.0 wide_input:353.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:5.0 wide_input:12.0 wide_input:7.0 wide_input:29.0 wide_input:1165.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:58.0 deep_input:9.0 deep_input:15024.0 deep_input:0.0 deep_input:35.0 label:1
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:95.0 wide_input:1520.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:12.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:14.0 wide_input:4.0 wide_input:31.0 wide_input:678.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:44.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:33.0 wide_input:516.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:2.0 wide_input:80.0 wide_input:385.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:7.0 wide_input:33.0 wide_input:1168.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:5.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:7.0 wide_input:6.0 wide_input:157.0 wide_input:1093.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:54.0 deep_input:4.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:211.0 wide_input:650.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:1
+wide_input:7.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:5.0 wide_input:178.0 wide_input:951.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:206.0 wide_input:646.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:4.0 wide_input:204.0 wide_input:813.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 label:0
+wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:7.0 wide_input:6.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:6.0 wide_input:2.0 wide_input:87.0 wide_input:391.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:3.0 wide_input:0.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:8.0 wide_input:120.0 wide_input:1381.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:64.0 deep_input:2.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:1483.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:90.0 deep_input:9.0 deep_input:0.0 deep_input:2206.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:6.0 wide_input:26.0 wide_input:1003.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:54.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:13.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:7.0 wide_input:6.0 wide_input:61.0 wide_input:1029.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:53.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:29.0 wide_input:1475.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:18.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:0.0 wide_input:0.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:7.0 wide_input:0.0 wide_input:1144.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:10.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:5.0 wide_input:0.0 wide_input:35.0 wide_input:1480.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:66.0 deep_input:9.0 deep_input:1409.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:8.0 wide_input:6.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:189.0 wide_input:1592.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:75.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:8.0 wide_input:26.0 wide_input:1316.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:65.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:5.0 wide_input:1.0 wide_input:4.0 wide_input:14.0 wide_input:2.0 wide_input:97.0 wide_input:397.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:35.0 deep_input:7.0 deep_input:3674.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:12.0 wide_input:4.0 wide_input:29.0 wide_input:676.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:2.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:84.0 wide_input:388.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:14.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:1.0 wide_input:66.0 wide_input:210.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:15.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:8.0 wide_input:7.0 wide_input:38.0 wide_input:1173.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:59.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:7.0 wide_input:1.0 wide_input:3.0 wide_input:77.0 wide_input:546.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:211.0 wide_input:819.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:4.0 wide_input:3.0 wide_input:49.0 wide_input:531.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:70.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:14.0 wide_input:0.0 wide_input:31.0 wide_input:20.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:23.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:3.0 wide_input:204.0 wide_input:644.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:6.0 wide_input:8.0 wide_input:4.0 wide_input:38.0 wide_input:684.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:11.0 wide_input:0.0 wide_input:177.0 wide_input:128.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:76.0 wide_input:45.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:84.0 wide_input:552.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:10.0 deep_input:0.0 deep_input:1741.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:7.0 wide_input:26.0 wide_input:1162.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:56.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:7.0 wide_input:26.0 wide_input:1162.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:58.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:2.0 wide_input:38.0 wide_input:358.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 label:0
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:13.0 wide_input:3.0 wide_input:179.0 wide_input:622.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:40.0 deep_input:12.0 deep_input:0.0 deep_input:1977.0 deep_input:60.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:26.0 wide_input:673.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:14.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:10.0 wide_input:4.0 wide_input:66.0 wide_input:702.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:15.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:203.0 wide_input:812.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:0.0 wide_input:6.0 wide_input:4.0 wide_input:2.0 wide_input:8.0 wide_input:7.0 wide_input:12.0 wide_input:1155.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:59.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:14.0 wide_input:7.0 wide_input:82.0 wide_input:1200.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:58.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:5.0 wide_input:4.0 wide_input:35.0 wide_input:681.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:27.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:5.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:2.0 wide_input:38.0 wide_input:358.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:8.0 wide_input:6.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:196.0 wide_input:805.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:5.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:153.0 wide_input:931.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:4.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:3.0 wide_input:4.0 wide_input:10.0 wide_input:3.0 wide_input:204.0 wide_input:644.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:4.0 wide_input:29.0 wide_input:676.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:103.0 wide_input:1527.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:12.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:7.0 wide_input:80.0 wide_input:1198.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:59.0 deep_input:10.0 deep_input:4064.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:98.0 wide_input:234.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:4.0 wide_input:3.0 wide_input:85.0 wide_input:553.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:75.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:76.0 wide_input:45.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:34.0 wide_input:1323.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:64.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:204.0 wide_input:483.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:33.0 wide_input:353.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:34.0 wide_input:1323.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:61.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:6.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:172.0 wide_input:1584.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:5.0 deep_input:1055.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:5.0 wide_input:5.0 wide_input:50.0 wide_input:863.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:14.0 deep_input:2407.0 deep_input:0.0 deep_input:98.0 label:0
+wide_input:12.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:10.0 wide_input:1.0 wide_input:42.0 wide_input:198.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:3.0 wide_input:4.0 wide_input:10.0 wide_input:1.0 wide_input:27.0 wide_input:183.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:6.0 wide_input:4.0 wide_input:36.0 wide_input:682.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:84.0 wide_input:716.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:10.0 deep_input:7298.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:85.0 wide_input:389.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:13.0 wide_input:0.0 wide_input:81.0 wide_input:50.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:25.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:88.0 wide_input:57.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:10.0 deep_input:7298.0 deep_input:0.0 deep_input:48.0 label:1
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:0.0 wide_input:204.0 wide_input:154.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:15.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:3.0 wide_input:4.0 wide_input:84.0 wide_input:716.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:10.0 deep_input:5178.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:7.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:12.0 wide_input:2.0 wide_input:178.0 wide_input:459.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:7.0 wide_input:204.0 wide_input:1290.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:36.0 wide_input:25.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:21.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:12.0 wide_input:7.0 wide_input:44.0 wide_input:1179.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:57.0 deep_input:14.0 deep_input:15024.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:14.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:4.0 wide_input:66.0 wide_input:702.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:15.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:1
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:5.0 wide_input:89.0 wide_input:889.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:0.0 wide_input:211.0 wide_input:161.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:5.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:153.0 wide_input:931.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:4.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:3.0 wide_input:204.0 wide_input:644.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:27.0 wide_input:348.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:11.0 wide_input:1.0 wide_input:205.0 wide_input:321.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:56.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:49.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:21.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:13.0 wide_input:1.0 wide_input:207.0 wide_input:323.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:27.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:8.0 wide_input:31.0 wide_input:1321.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:65.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:16.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:5.0 wide_input:12.0 wide_input:3.0 wide_input:206.0 wide_input:646.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:5.0 wide_input:0.0 wide_input:0.0 wide_input:3.0 wide_input:40.0 wide_input:523.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:39.0 deep_input:14.0 deep_input:3464.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:33.0 wide_input:22.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:29.0 wide_input:512.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:80.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:5.0 wide_input:203.0 wide_input:973.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 deep_input:13.0 deep_input:7688.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:89.0 wide_input:58.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:21.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:2.0 wide_input:31.0 wide_input:352.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:31.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:6.0 wide_input:29.0 wide_input:1006.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 deep_input:9.0 deep_input:4386.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:26.0 wide_input:673.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:13.0 wide_input:1.0 wide_input:179.0 wide_input:296.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:26.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:5.0 wide_input:176.0 wide_input:949.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 label:0
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:175.0 wide_input:456.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:5.0 wide_input:4.0 wide_input:86.0 wide_input:718.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:54.0 label:1
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:206.0 wide_input:322.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:49.0 wide_input:368.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:14.0 deep_input:7298.0 deep_input:0.0 deep_input:35.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:76.0 wide_input:45.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:6.0 wide_input:10.0 wide_input:3.0 wide_input:204.0 wide_input:644.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:49.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:15.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:88.0 wide_input:57.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:1506.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:77.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:4.0 wide_input:2.0 wide_input:2.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:143.0 wide_input:99.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:3.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:13.0 wide_input:1.0 wide_input:81.0 wide_input:220.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:29.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:6.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:8.0 wide_input:38.0 wide_input:1327.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:62.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:4.0 wide_input:3.0 wide_input:34.0 wide_input:517.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:39.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:84.0 wide_input:716.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:10.0 deep_input:0.0 deep_input:1485.0 deep_input:50.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:14.0 wide_input:2.0 wide_input:31.0 wide_input:352.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:35.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:1.0 wide_input:99.0 wide_input:235.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:3.0 wide_input:0.0 wide_input:48.0 wide_input:1490.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:76.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:10.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:5.0 wide_input:8.0 wide_input:35.0 wide_input:1324.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:63.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:8.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:188.0 wide_input:138.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:15.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:10.0 wide_input:4.0 wide_input:78.0 wide_input:711.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:10.0 deep_input:0.0 deep_input:1887.0 deep_input:50.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:6.0 wide_input:7.0 wide_input:36.0 wide_input:1171.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:58.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:14.0 wide_input:0.0 wide_input:31.0 wide_input:1477.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:66.0 deep_input:9.0 deep_input:2050.0 deep_input:0.0 deep_input:55.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:84.0 wide_input:716.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:203.0 wide_input:319.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:8.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:5.0 wide_input:200.0 wide_input:971.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:47.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:0.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:1.0 wide_input:6.0 wide_input:1.0 wide_input:986.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:6.0 wide_input:31.0 wide_input:1008.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:53.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:144.0 wide_input:1558.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:3.0 deep_input:0.0 deep_input:0.0 deep_input:48.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:2.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:33.0 wide_input:189.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:5.0 wide_input:42.0 wide_input:855.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:29.0 wide_input:18.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:8.0 wide_input:4.0 wide_input:38.0 wide_input:684.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:1.0 wide_input:178.0 wide_input:295.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:3.0 wide_input:195.0 wide_input:636.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:11.0 deep_input:7298.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:11.0 wide_input:3.0 wide_input:28.0 wide_input:511.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:9.0 deep_input:7298.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:5.0 wide_input:2.0 wide_input:35.0 wide_input:355.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:7.0 wide_input:11.0 wide_input:0.0 wide_input:79.0 wide_input:48.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:6.0 wide_input:12.0 wide_input:4.0 wide_input:206.0 wide_input:815.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 label:0
+wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:91.0 wide_input:1516.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:67.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:8.0 label:0
+wide_input:8.0 wide_input:0.0 wide_input:4.0 wide_input:0.0 wide_input:0.0 wide_input:1.0 wide_input:188.0 wide_input:304.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:3.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:7.0 wide_input:186.0 wide_input:1276.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:56.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:204.0 wide_input:483.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:211.0 wide_input:489.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:8.0 wide_input:1.0 wide_input:38.0 wide_input:194.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:211.0 wide_input:489.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:14.0 wide_input:5.0 wide_input:97.0 wide_input:895.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:46.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:7.0 wide_input:202.0 wide_input:1288.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:59.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:1.0 wide_input:3.0 wide_input:26.0 wide_input:509.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:1.0 wide_input:6.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:8.0 wide_input:103.0 wide_input:1371.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:65.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:13.0 wide_input:3.0 wide_input:193.0 wide_input:634.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:40.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:26.0 wide_input:673.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:6.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:3.0 wide_input:1.0 wide_input:167.0 wide_input:285.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:0.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:8.0 wide_input:3.0 wide_input:12.0 wide_input:504.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:8.0 wide_input:42.0 wide_input:1331.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:62.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:4.0 wide_input:208.0 wide_input:817.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:43.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:211.0 wide_input:819.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:13.0 deep_input:0.0 deep_input:1564.0 deep_input:45.0 label:1
+wide_input:11.0 wide_input:3.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:29.0 wide_input:18.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:33.0 wide_input:189.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:12.0 wide_input:7.0 wide_input:80.0 wide_input:1198.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:56.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:178.0 wide_input:129.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:15.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:7.0 wide_input:211.0 wide_input:1296.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:57.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:29.0 wide_input:512.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:39.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:1.0 wide_input:77.0 wide_input:216.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:15.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:103.0 wide_input:1527.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:10.0 label:0
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:7.0 wide_input:3.0 wide_input:199.0 wide_input:640.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:4.0 wide_input:37.0 wide_input:683.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:33.0 wide_input:679.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:10.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:33.0 wide_input:353.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:49.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:0.0 wide_input:1.0 wide_input:6.0 wide_input:4.0 wide_input:5.0 wide_input:49.0 wide_input:862.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:29.0 wide_input:512.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:14.0 wide_input:6.0 wide_input:31.0 wide_input:1008.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:54.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:4.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:139.0 wide_input:922.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:3.0 deep_input:0.0 deep_input:2339.0 deep_input:45.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:84.0 wide_input:53.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:5.0 wide_input:80.0 wide_input:881.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:2.0 wide_input:14.0 wide_input:3.0 wide_input:82.0 wide_input:551.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:36.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:27.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:11.0 wide_input:5.0 wide_input:1.0 wide_input:4.0 wide_input:7.0 wide_input:1.0 wide_input:37.0 wide_input:193.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:77.0 wide_input:710.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:10.0 deep_input:0.0 deep_input:2415.0 deep_input:6.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:16.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:2.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:27.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:8.0 wide_input:2.0 wide_input:38.0 wide_input:358.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:210.0 wide_input:325.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:7.0 wide_input:5.0 wide_input:1.0 wide_input:6.0 wide_input:3.0 wide_input:5.0 wide_input:181.0 wide_input:954.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:4.0 wide_input:2.0 wide_input:211.0 wide_input:489.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:13.0 deep_input:7688.0 deep_input:0.0 deep_input:45.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:2.0 wide_input:38.0 wide_input:358.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:13.0 wide_input:4.0 wide_input:81.0 wide_input:714.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:44.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:6.0 wide_input:4.0 wide_input:87.0 wide_input:719.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:25.0 wide_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:0.0 wide_input:211.0 wide_input:161.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 label:1
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:4.0 wide_input:6.0 wide_input:211.0 wide_input:1139.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:52.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:87.0 wide_input:56.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:1.0 wide_input:89.0 wide_input:228.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:11.0 wide_input:5.0 wide_input:205.0 wide_input:975.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:8.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:13.0 wide_input:2.0 wide_input:193.0 wide_input:473.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:34.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:64.0 label:0
+wide_input:5.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:6.0 wide_input:1.0 wide_input:156.0 wide_input:276.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:4.0 deep_input:0.0 deep_input:2179.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:49.0 wide_input:694.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:14.0 deep_input:0.0 deep_input:1977.0 deep_input:65.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:26.0 wide_input:182.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:3.0 wide_input:5.0 wide_input:33.0 wide_input:846.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:0.0 wide_input:0.0 wide_input:1.0 wide_input:76.0 wide_input:215.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:10.0 deep_input:0.0 deep_input:1887.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:85.0 wide_input:389.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:1.0 wide_input:2.0 wide_input:8.0 wide_input:4.0 wide_input:89.0 wide_input:721.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:36.0 wide_input:25.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:85.0 wide_input:389.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:10.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:5.0 wide_input:21.0 wide_input:838.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:16.0 deep_input:15024.0 deep_input:0.0 deep_input:60.0 label:1
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:7.0 wide_input:7.0 wide_input:2.0 wide_input:88.0 wide_input:392.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:31.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:2.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:113.0 wide_input:412.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:8.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:3.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:6.0 wide_input:129.0 wide_input:1072.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:52.0 deep_input:2.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:1.0 wide_input:103.0 wide_input:239.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 label:0
+wide_input:1.0 wide_input:5.0 wide_input:1.0 wide_input:2.0 wide_input:7.0 wide_input:2.0 wide_input:102.0 wide_input:402.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:14.0 wide_input:2.0 wide_input:31.0 wide_input:352.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:34.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:49.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:101.0 wide_input:1525.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:7.0 deep_input:2176.0 deep_input:0.0 deep_input:18.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:5.0 wide_input:7.0 wide_input:4.0 wide_input:2.0 wide_input:211.0 wide_input:489.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:5.0 wide_input:1.0 wide_input:35.0 wide_input:191.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:7.0 wide_input:336.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:0.0 wide_input:31.0 wide_input:20.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:25.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:12.0 wide_input:4.0 wide_input:1.0 wide_input:1.0 wide_input:10.0 wide_input:3.0 wide_input:42.0 wide_input:525.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 deep_input:14.0 deep_input:0.0 deep_input:1408.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:36.0 wide_input:25.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:72.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:4.0 wide_input:8.0 wide_input:34.0 wide_input:1323.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:63.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:10.0 wide_input:5.0 wide_input:42.0 wide_input:855.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:47.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:6.0 wide_input:1.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:25.0 wide_input:1471.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:80.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:91.0 wide_input:1516.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:3.0 wide_input:1.0 wide_input:6.0 wide_input:10.0 wide_input:3.0 wide_input:204.0 wide_input:644.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:13.0 deep_input:2174.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:8.0 wide_input:1.0 wide_input:186.0 wide_input:302.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:84.0 wide_input:223.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:10.0 deep_input:0.0 deep_input:1980.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:42.0 wide_input:362.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:14.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:5.0 wide_input:2.0 wide_input:35.0 wide_input:355.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:9.0 deep_input:594.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:2.0 wide_input:11.0 wide_input:2.0 wide_input:79.0 wide_input:384.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:5.0 wide_input:2.0 wide_input:1.0 wide_input:4.0 wide_input:77.0 wide_input:710.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:1.0 wide_input:204.0 wide_input:320.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:29.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:26.0 wide_input:673.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:2.0 wide_input:0.0 wide_input:32.0 wide_input:21.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:4.0 wide_input:78.0 wide_input:711.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:6.0 wide_input:4.0 wide_input:2.0 wide_input:4.0 wide_input:3.0 wide_input:1.0 wide_input:167.0 wide_input:285.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:12.0 wide_input:0.0 wide_input:80.0 wide_input:49.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:6.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:34.0 wide_input:680.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:6.0 wide_input:33.0 wide_input:1009.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:51.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:37.0 wide_input:26.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:91.0 wide_input:1516.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:7.0 deep_input:0.0 deep_input:0.0 deep_input:5.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:37.0 wide_input:26.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:19.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:5.0 wide_input:3.0 wide_input:4.0 wide_input:84.0 wide_input:716.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:55.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:7.0 wide_input:33.0 wide_input:1168.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:182.0 wide_input:792.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:42.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:6.0 wide_input:12.0 wide_input:4.0 wide_input:29.0 wide_input:676.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:8.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:7.0 wide_input:3.0 wide_input:199.0 wide_input:640.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:4.0 wide_input:1.0 wide_input:34.0 wide_input:190.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:3.0 wide_input:82.0 wide_input:551.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:38.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:0.0 wide_input:31.0 wide_input:20.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:23.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:2.0 wide_input:76.0 wide_input:381.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:5.0 wide_input:4.0 wide_input:183.0 wide_input:793.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:44.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:14.0 wide_input:6.0 wide_input:31.0 wide_input:1008.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:54.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:204.0 wide_input:483.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:32.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 label:0
+wide_input:3.0 wide_input:3.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:125.0 wide_input:913.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:50.0 deep_input:2.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:13.0 wide_input:0.0 wide_input:207.0 wide_input:157.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:24.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:1.0 wide_input:3.0 wide_input:26.0 wide_input:509.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:37.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:14.0 wide_input:6.0 wide_input:31.0 wide_input:1008.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:52.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:14.0 wide_input:3.0 wide_input:82.0 wide_input:551.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:38.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:12.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:5.0 wide_input:49.0 wide_input:862.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:49.0 deep_input:14.0 deep_input:0.0 deep_input:1977.0 deep_input:45.0 label:1
+wide_input:9.0 wide_input:2.0 wide_input:5.0 wide_input:4.0 wide_input:13.0 wide_input:1.0 wide_input:207.0 wide_input:323.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:30.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:16.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:10.0 wide_input:7.0 wide_input:78.0 wide_input:1196.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:1
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:76.0 wide_input:45.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:33.0 wide_input:353.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:14.0 wide_input:1.0 wide_input:31.0 wide_input:187.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:30.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:8.0 wide_input:0.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:200.0 wide_input:1599.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:67.0 deep_input:11.0 deep_input:0.0 deep_input:0.0 deep_input:24.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:3.0 wide_input:5.0 wide_input:210.0 wide_input:979.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:6.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:172.0 wide_input:1584.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:17.0 deep_input:5.0 deep_input:0.0 deep_input:0.0 deep_input:6.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:6.0 wide_input:0.0 wide_input:87.0 wide_input:56.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:22.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:30.0 label:0
+wide_input:0.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:1.0 wide_input:12.0 wide_input:177.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:27.0 deep_input:6.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:11.0 wide_input:4.0 wide_input:4.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:38.0 wide_input:27.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:7.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:181.0 wide_input:462.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:15.0 wide_input:2.0 wide_input:0.0 wide_input:6.0 wide_input:3.0 wide_input:4.0 wide_input:84.0 wide_input:716.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:43.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:1.0 wide_input:7.0 wide_input:10.0 wide_input:1.0 wide_input:204.0 wide_input:320.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:28.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:3.0 wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:33.0 wide_input:679.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:2.0 wide_input:4.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:6.0 wide_input:105.0 wide_input:1058.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:52.0 deep_input:8.0 deep_input:594.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:3.0 wide_input:3.0 wide_input:4.0 wide_input:1.0 wide_input:0.0 wide_input:77.0 wide_input:46.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:3.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:8.0 wide_input:120.0 wide_input:1381.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:63.0 deep_input:2.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:2.0 wide_input:14.0 wide_input:7.0 wide_input:31.0 wide_input:1167.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:59.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 label:0
+wide_input:9.0 wide_input:0.0 wide_input:4.0 wide_input:2.0 wide_input:10.0 wide_input:4.0 wide_input:204.0 wide_input:813.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:45.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:9.0 wide_input:2.0 wide_input:0.0 wide_input:4.0 wide_input:12.0 wide_input:3.0 wide_input:206.0 wide_input:646.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:38.0 deep_input:13.0 deep_input:15024.0 deep_input:0.0 deep_input:60.0 label:1
+wide_input:11.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:8.0 wide_input:3.0 wide_input:38.0 wide_input:521.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:1.0 wide_input:1.0 wide_input:1.0 wide_input:5.0 wide_input:26.0 wide_input:840.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:46.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:7.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:4.0 wide_input:2.0 wide_input:182.0 wide_input:463.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 deep_input:12.0 deep_input:0.0 deep_input:0.0 deep_input:60.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:10.0 wide_input:2.0 wide_input:27.0 wide_input:348.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:34.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:25.0 label:0
+wide_input:9.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:3.0 wide_input:2.0 wide_input:210.0 wide_input:488.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:33.0 deep_input:13.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
+wide_input:11.0 wide_input:0.0 wide_input:4.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:26.0 wide_input:673.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:41.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:36.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:7.0 wide_input:0.0 wide_input:88.0 wide_input:57.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:4.0 wide_input:4.0 wide_input:3.0 wide_input:4.0 wide_input:8.0 wide_input:0.0 wide_input:144.0 wide_input:100.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:23.0 deep_input:3.0 deep_input:0.0 deep_input:0.0 deep_input:40.0 label:0
+wide_input:15.0 wide_input:4.0 wide_input:1.0 wide_input:4.0 wide_input:10.0 wide_input:1.0 wide_input:78.0 wide_input:217.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:26.0 deep_input:10.0 deep_input:0.0 deep_input:0.0 deep_input:35.0 label:0
+wide_input:11.0 wide_input:2.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:0.0 wide_input:25.0 wide_input:1471.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:1.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:0.0 deep_input:72.0 deep_input:9.0 deep_input:0.0 deep_input:0.0 deep_input:20.0 label:0
diff --git a/models/rank/wide_deep/model.py b/models/rank/wide_deep/model.py
index c260c90180b017eb777198203b6552d84730a038..e9d4da603e5abf6b44ce86873795695c7cfe150b 100755
--- a/models/rank/wide_deep/model.py
+++ b/models/rank/wide_deep/model.py
@@ -1,84 +1,121 @@
-import paddle.fluid as fluid
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import math
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+import paddle.fluid as fluid
+
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
-
+
+ def _init_hyper_parameters(self):
+ self.hidden1_units = envs.get_global_env(
+ "hyper_parameters.hidden1_units", 75)
+ self.hidden2_units = envs.get_global_env(
+ "hyper_parameters.hidden2_units", 50)
+ self.hidden3_units = envs.get_global_env(
+ "hyper_parameters.hidden3_units", 25)
+
def wide_part(self, data):
- out = fluid.layers.fc(input=data,
- size=1,
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.TruncatedNormal(loc=0.0, scale=1.0 / math.sqrt(data.shape[1])),
- regularizer=fluid.regularizer.L2DecayRegularizer(regularization_coeff=1e-4)),
- act=None,
- name='wide')
+ out = fluid.layers.fc(
+ input=data,
+ size=1,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=1.0 / math.sqrt(data.shape[1])),
+ regularizer=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=1e-4)),
+ act=None,
+ name='wide')
return out
-
+
def fc(self, data, hidden_units, active, tag):
- output = fluid.layers.fc(input=data,
- size=hidden_units,
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.TruncatedNormal(loc=0.0, scale=1.0 / math.sqrt(data.shape[1]))),
- act=active,
- name=tag)
-
+ output = fluid.layers.fc(
+ input=data,
+ size=hidden_units,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=1.0 / math.sqrt(data.shape[1]))),
+ act=active,
+ name=tag)
+
return output
-
+
def deep_part(self, data, hidden1_units, hidden2_units, hidden3_units):
l1 = self.fc(data, hidden1_units, 'relu', 'l1')
l2 = self.fc(l1, hidden2_units, 'relu', 'l2')
l3 = self.fc(l2, hidden3_units, 'relu', 'l3')
-
+
return l3
-
- def train_net(self):
- wide_input = fluid.data(name='wide_input', shape=[None, 8], dtype='float32')
- deep_input = fluid.data(name='deep_input', shape=[None, 58], dtype='float32')
- label = fluid.data(name='label', shape=[None, 1], dtype='float32')
- self._data_var.append(wide_input)
- self._data_var.append(deep_input)
- self._data_var.append(label)
-
- hidden1_units = envs.get_global_env("hyper_parameters.hidden1_units", 75, self._namespace)
- hidden2_units = envs.get_global_env("hyper_parameters.hidden2_units", 50, self._namespace)
- hidden3_units = envs.get_global_env("hyper_parameters.hidden3_units", 25, self._namespace)
+
+ def net(self, inputs, is_infer=False):
+ wide_input = self._dense_data_var[0]
+ deep_input = self._dense_data_var[1]
+ label = self._sparse_data_var[0]
+
wide_output = self.wide_part(wide_input)
- deep_output = self.deep_part(deep_input, hidden1_units, hidden2_units, hidden3_units)
-
- wide_model = fluid.layers.fc(input=wide_output,
- size=1,
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.TruncatedNormal(loc=0.0, scale=1.0)),
- act=None,
- name='w_wide')
-
- deep_model = fluid.layers.fc(input=deep_output,
- size=1,
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.TruncatedNormal(loc=0.0, scale=1.0)),
- act=None,
- name='w_deep')
-
+ deep_output = self.deep_part(deep_input, self.hidden1_units,
+ self.hidden2_units, self.hidden3_units)
+
+ wide_model = fluid.layers.fc(
+ input=wide_output,
+ size=1,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=1.0)),
+ act=None,
+ name='w_wide')
+
+ deep_model = fluid.layers.fc(
+ input=deep_output,
+ size=1,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=1.0)),
+ act=None,
+ name='w_deep')
+
prediction = fluid.layers.elementwise_add(wide_model, deep_model)
- pred = fluid.layers.sigmoid(fluid.layers.clip(prediction, min=-15.0, max=15.0), name="prediction")
+ pred = fluid.layers.sigmoid(
+ fluid.layers.clip(
+ prediction, min=-15.0, max=15.0),
+ name="prediction")
num_seqs = fluid.layers.create_tensor(dtype='int64')
- acc = fluid.layers.accuracy(input=pred, label=fluid.layers.cast(x=label, dtype='int64'), total=num_seqs)
- auc_var, batch_auc, auc_states = fluid.layers.auc(input=pred, label=fluid.layers.cast(x=label, dtype='int64'))
-
+ acc = fluid.layers.accuracy(
+ input=pred,
+ label=fluid.layers.cast(
+ x=label, dtype='int64'),
+ total=num_seqs)
+ auc_var, batch_auc, auc_states = fluid.layers.auc(
+ input=pred, label=fluid.layers.cast(
+ x=label, dtype='int64'))
+
self._metrics["AUC"] = auc_var
self._metrics["BATCH_AUC"] = batch_auc
self._metrics["ACC"] = acc
+ if is_infer:
+ self._infer_results["AUC"] = auc_var
+ self._infer_results["ACC"] = acc
- cost = fluid.layers.sigmoid_cross_entropy_with_logits(x=prediction, label=label)
+ cost = fluid.layers.sigmoid_cross_entropy_with_logits(
+ x=prediction, label=fluid.layers.cast(
+ label, dtype='float32'))
avg_cost = fluid.layers.mean(cost)
self._cost = avg_cost
-
- def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
- optimizer = fluid.optimizer.Adam(learning_rate, lazy_mode=True)
- return optimizer
-
- def infer_net(self, parameter_list):
- self.deepfm_net()
\ No newline at end of file
diff --git a/models/rank/xdeepfm/config.yaml b/models/rank/xdeepfm/config.yaml
index a93b61fbce0d835b478c86f0e572bc5c88ab6138..6274d58559f6eaf54549a8cc82b00c2c50684032 100755
--- a/models/rank/xdeepfm/config.yaml
+++ b/models/rank/xdeepfm/config.yaml
@@ -11,40 +11,61 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+debug: false
+workspace: "paddlerec.models.rank.xdeepfm"
-train:
- trainer:
- # for cluster training
- strategy: "async"
+dataset:
+ - name: sample_1
+    type: QueueDataset # or DataLoader
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "label feat_idx"
+ dense_slots: "feat_value:39"
+ - name: infer_sample
+    type: QueueDataset # or DataLoader
+ batch_size: 5
+ data_path: "{workspace}/data/sample_data/train"
+ sparse_slots: "label feat_idx"
+ dense_slots: "feat_value:39"
- epochs: 10
- workspace: "fleetrec.models.rank.xdeepfm"
+hyper_parameters:
+ optimizer:
+ class: SGD
+ learning_rate: 0.0001
+ layer_sizes_dnn: [10, 10, 10]
+ layer_sizes_cin: [10, 10]
+ sparse_feature_number: 1086460
+ sparse_feature_dim: 9
+ num_field: 39
+ fc_sizes: [400, 400, 400]
+ act: "relu"
- reader:
- batch_size: 2
- class: "{workspace}/criteo_reader.py"
- train_data_path: "{workspace}/data/train_data"
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- layer_sizes_dnn: [10, 10, 10]
- layer_sizes_cin: [10, 10]
- sparse_feature_number: 1086460
- sparse_feature_dim: 9
- num_field: 39
- fc_sizes: [400, 400, 400]
- learning_rate: 0.0001
- reg: 0.0001
- act: "relu"
- optimizer: SGD
+mode: train_runner
+# To run inference, change mode to "infer_runner" and switch the phase to "infer_phase"
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+runner:
+ - name: train_runner
+ trainer_class: single_train
+ epochs: 1
+ device: cpu
+ init_model_path: ""
+ save_checkpoint_interval: 1
+ save_inference_interval: 1
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ - name: infer_runner
+ trainer_class: single_infer
+ epochs: 1
+ device: cpu
+ init_model_path: "increment/0"
+
+phase:
+- name: phase1
+ model: "{workspace}/model.py"
+ dataset_name: sample_1
+ thread_num: 1
+#- name: infer_phase
+# model: "{workspace}/model.py"
+# dataset_name: infer_sample
+# thread_num: 1
diff --git a/models/rank/xdeepfm/data/download.py b/models/rank/xdeepfm/data/download.py
index d0483ea3f0d5ddfeb1ad5123bd91cf2d5b6e1331..e46a9ced4a69339f5c5f6c45067d34bbbfa39469 100755
--- a/models/rank/xdeepfm/data/download.py
+++ b/models/rank/xdeepfm/data/download.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import os
import shutil
import sys
@@ -6,7 +20,7 @@ LOCAL_PATH = os.path.dirname(os.path.abspath(__file__))
TOOLS_PATH = os.path.join(LOCAL_PATH, "..", "..", "tools")
sys.path.append(TOOLS_PATH)
-from fleetrec.tools.tools import download_file_and_uncompress, download_file
+from paddlerec.tools.tools import download_file_and_uncompress, download_file
if __name__ == '__main__':
url_train = "https://paddlerec.bj.bcebos.com/xdeepfm%2Ftr"
diff --git a/models/rank/xdeepfm/criteo_reader.py b/models/rank/xdeepfm/data/get_slot_data.py
similarity index 56%
rename from models/rank/xdeepfm/criteo_reader.py
rename to models/rank/xdeepfm/data/get_slot_data.py
index 1b5e4041625228f5ebaa0fefe9f2ada566a5cecb..4426e9647c080dce5debdcdbc3e039ac69a69935 100755
--- a/models/rank/xdeepfm/criteo_reader.py
+++ b/models/rank/xdeepfm/data/get_slot_data.py
@@ -11,19 +11,30 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+import os
+import yaml
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
try:
import cPickle as pickle
except ImportError:
import pickle
+import paddle.fluid.incubate.data_generator as dg
+
+
+class TrainReader(dg.MultiSlotDataGenerator):
+ def __init__(self, config):
+ dg.MultiSlotDataGenerator.__init__(self)
+ if os.path.isfile(config):
+ with open(config, 'r') as rb:
+ _config = yaml.load(rb.read(), Loader=yaml.FullLoader)
+ else:
+ raise ValueError("reader config only support yaml")
-class TrainReader(Reader):
def init(self):
pass
-
+
def _process_line(self, line):
features = line.strip('\n').split('\t')
feat_idx = []
@@ -33,11 +43,24 @@ class TrainReader(Reader):
feat_value.append(1.0)
label = [int(features[0])]
return feat_idx, feat_value, label
-
+
def generate_sample(self, line):
def data_iter():
feat_idx, feat_value, label = self._process_line(line)
- yield [('feat_idx', feat_idx), ('feat_value', feat_value), ('label',
- label)]
- return data_iter
\ No newline at end of file
+ s = ""
+ for i in [('feat_idx', feat_idx), ('feat_value', feat_value),
+ ('label', label)]:
+ k = i[0]
+ v = i[1]
+ for j in v:
+ s += " " + k + ":" + str(j)
+            print(s.strip())
+ yield None
+
+ return data_iter
+
+
+reader = TrainReader("../config.yaml")
+reader.init()
+reader.run_from_stdin()
diff --git a/models/rank/xdeepfm/data/run.sh b/models/rank/xdeepfm/data/run.sh
new file mode 100644
index 0000000000000000000000000000000000000000..e0e6780632153cd53b4de329b5500f944035b70a
--- /dev/null
+++ b/models/rank/xdeepfm/data/run.sh
@@ -0,0 +1,13 @@
+python download.py
+
+mkdir -p slot_train_data/tr
+for i in `ls ./train_data/tr`
+do
+ cat train_data/tr/$i | python get_slot_data.py > slot_train_data/tr/$i
+done
+
+mkdir -p slot_test_data/ev
+for i in `ls ./test_data/ev`
+do
+ cat test_data/ev/$i | python get_slot_data.py > slot_test_data/ev/$i
+done
diff --git a/models/rank/xdeepfm/data/sample_data/train/sample_train.txt b/models/rank/xdeepfm/data/sample_data/train/sample_train.txt
new file mode 100644
index 0000000000000000000000000000000000000000..4b0308e17f74efa4272e1871e86d03c236b1945a
--- /dev/null
+++ b/models/rank/xdeepfm/data/sample_data/train/sample_train.txt
@@ -0,0 +1,100 @@
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:655161 feat_idx:0 feat_idx:1075467 feat_idx:314332 feat_idx:615411 feat_idx:733564 feat_idx:795081 feat_idx:148475 feat_idx:123424 feat_idx:582322 feat_idx:0 feat_idx:1082305 feat_idx:288355 feat_idx:328646 feat_idx:756244 feat_idx:13161 feat_idx:134834 feat_idx:734534 feat_idx:1047606 feat_idx:626828 feat_idx:0 feat_idx:476211 feat_idx:819217 feat_idx:502861 feat_idx:767167 feat_value:0.00017316017316 feat_value:1.55232499476e-05 feat_value:7.62951094835e-05 feat_value:0.0 feat_value:5.96732496653e-05 feat_value:9.27994580512e-06 feat_value:0.000266377794747 feat_value:0.000330742516951 feat_value:0.00623729280816 feat_value:0.0217391304348 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:328856 feat_idx:583609 feat_idx:356189 feat_idx:314332 feat_idx:404876 feat_idx:233441 feat_idx:144963 feat_idx:148475 feat_idx:954707 feat_idx:778340 feat_idx:598842 feat_idx:701804 feat_idx:223357 feat_idx:310528 feat_idx:805012 feat_idx:599055 feat_idx:683739 feat_idx:734534 feat_idx:94311 feat_idx:135625 feat_idx:0 feat_idx:476211 feat_idx:737768 feat_idx:502861 feat_idx:618666 feat_value:0.00034632034632 feat_value:1.16424374607e-05 feat_value:0.000671396963455 feat_value:0.00103199174407 feat_value:4.40424852812e-06 feat_value:1.85598916102e-05 feat_value:3.55170392996e-05 feat_value:0.000330742516951 feat_value:0.000137840725042 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:125230 feat_idx:244091 feat_idx:428972 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:655488 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:989454 feat_idx:789125 feat_idx:274685 feat_idx:59528 feat_idx:142028 feat_idx:791919 feat_idx:339114 feat_idx:12934 feat_idx:0 feat_idx:0 feat_idx:128761 feat_idx:925828 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.00034632034632 feat_value:1.16424374607e-05 feat_value:1.52590218967e-05 feat_value:0.0144478844169 feat_value:3.31182217752e-05 feat_value:0.000206478794164 feat_value:7.10340785992e-05 feat_value:0.000330742516951 feat_value:0.00844274440884 feat_value:0.0217391304348 feat_value:0.012987012987 feat_value:0.000748502994012 feat_value:0.00608683890166 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:695357 feat_idx:541890 feat_idx:0 feat_idx:1012660 feat_idx:314332 feat_idx:404876 feat_idx:1742 feat_idx:144963 feat_idx:148475 feat_idx:456917 feat_idx:220560 feat_idx:0 feat_idx:480237 feat_idx:59528 feat_idx:402233 feat_idx:0 feat_idx:763481 feat_idx:885529 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:476211 feat_idx:68781 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.00347720798826 feat_value:0.0 feat_value:0.0 feat_value:0.000189641760152 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:52223 feat_idx:0 feat_idx:610088 feat_idx:314332 feat_idx:85900 feat_idx:253972 feat_idx:144963 feat_idx:148475 feat_idx:581401 feat_idx:921618 feat_idx:374454 feat_idx:576858 feat_idx:288355 feat_idx:526081 feat_idx:597631 feat_idx:763481 feat_idx:468634 feat_idx:0 feat_idx:0 feat_idx:360559 feat_idx:0 feat_idx:122096 feat_idx:604513 feat_idx:0 feat_idx:0 feat_value:0.000519480519481 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:8.63578142768e-08 feat_value:0.0 feat_value:5.32755589494e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:844726 feat_idx:589259 feat_idx:34922 feat_idx:943087 feat_idx:831162 feat_idx:687817 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:160002 feat_idx:879363 feat_idx:979424 feat_idx:59528 feat_idx:844314 feat_idx:974289 feat_idx:197974 feat_idx:82573 feat_idx:0 feat_idx:0 feat_idx:4620 feat_idx:811639 feat_idx:441547 feat_idx:578537 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000553726305143 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000206761087563 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:74940 feat_idx:503640 feat_idx:888356 feat_idx:507702 feat_idx:943087 feat_idx:404876 feat_idx:1081499 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:202629 feat_idx:486504 feat_idx:981942 feat_idx:59528 feat_idx:404100 feat_idx:210897 feat_idx:197974 feat_idx:821035 feat_idx:0 feat_idx:0 feat_idx:627303 feat_idx:0 feat_idx:637620 feat_idx:409520 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.000136790777814 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:541890 feat_idx:0 feat_idx:175574 feat_idx:1022525 feat_idx:85900 feat_idx:114990 feat_idx:795081 feat_idx:148475 feat_idx:391150 feat_idx:172637 feat_idx:0 feat_idx:831202 feat_idx:59528 feat_idx:402233 feat_idx:0 feat_idx:13161 feat_idx:885529 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:68781 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:2.71656874083e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.77585196498e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:585875 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:453185 feat_idx:144963 feat_idx:148475 feat_idx:995582 feat_idx:409958 feat_idx:824386 feat_idx:745363 feat_idx:223357 feat_idx:782190 feat_idx:499188 feat_idx:13161 feat_idx:826986 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:122096 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000182398186884 feat_value:6.10360875868e-05 feat_value:0.00825593395253 feat_value:0.000820831024701 feat_value:0.000577676626369 feat_value:0.000497238550194 feat_value:0.00512650901273 feat_value:0.00485888555774 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:952850 feat_idx:444926 feat_idx:327161 feat_idx:314332 feat_idx:0 feat_idx:48165 feat_idx:144963 feat_idx:148475 feat_idx:408072 feat_idx:220560 feat_idx:313350 feat_idx:480237 feat_idx:59528 feat_idx:767941 feat_idx:274209 feat_idx:587215 feat_idx:49542 feat_idx:0 feat_idx:0 feat_idx:918027 feat_idx:0 feat_idx:122096 feat_idx:210681 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000147470874502 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.00145672679013 feat_value:4.87197154769e-05 feat_value:1.77585196498e-05 feat_value:0.000330742516951 feat_value:0.000103380543782 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:323969 feat_idx:1007141 feat_idx:1053419 feat_idx:314332 feat_idx:615411 feat_idx:926319 feat_idx:144963 feat_idx:31348 feat_idx:754940 feat_idx:35969 feat_idx:469428 feat_idx:394416 feat_idx:223357 feat_idx:878804 feat_idx:9647 feat_idx:197974 feat_idx:316785 feat_idx:734534 feat_idx:94311 feat_idx:409871 feat_idx:0 feat_idx:476211 feat_idx:755653 feat_idx:522503 feat_idx:379855 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:0.00964370183871 feat_value:0.0 feat_value:0.00245126655825 feat_value:0.0 feat_value:0.0 feat_value:0.000826856292376 feat_value:0.00223991178194 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:985125 feat_idx:0 feat_idx:0 feat_idx:360051 feat_idx:0 feat_idx:304911 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:887175 feat_idx:0 feat_idx:701330 feat_idx:59528 feat_idx:670083 feat_idx:0 feat_idx:587215 feat_idx:334296 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:3.49273123821e-05 feat_value:9.15541313802e-05 feat_value:0.0061919504644 feat_value:1.81783199053e-05 feat_value:0.000252878523189 feat_value:1.77585196498e-05 feat_value:0.00115759880933 feat_value:0.00368723939488 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000811578520222 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:685954 feat_idx:439682 feat_idx:0 feat_idx:983567 feat_idx:314332 feat_idx:404876 feat_idx:909239 feat_idx:795081 feat_idx:148475 feat_idx:36347 feat_idx:663689 feat_idx:0 feat_idx:398775 feat_idx:59528 feat_idx:996203 feat_idx:150509 feat_idx:13161 feat_idx:183924 feat_idx:0 feat_idx:0 feat_idx:379144 feat_idx:0 feat_idx:122096 feat_idx:604513 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:6.32570989578e-05 feat_value:0.0 feat_value:0.000301894834047 feat_value:0.0 feat_value:0.000137840725042 feat_value:0.0 feat_value:0.017316017316 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:181401 feat_idx:702327 feat_idx:0 feat_idx:334017 feat_idx:314332 feat_idx:0 feat_idx:191120 feat_idx:299805 feat_idx:148475 feat_idx:442554 feat_idx:480141 feat_idx:0 feat_idx:16042 feat_idx:288355 feat_idx:928072 feat_idx:0 feat_idx:599055 feat_idx:91753 feat_idx:297696 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:590863 feat_idx:525837 feat_idx:413413 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:0.000167849240864 feat_value:0.00515995872033 feat_value:0.000443101945054 feat_value:7.88795393435e-05 feat_value:3.55170392996e-05 feat_value:0.000661485033901 feat_value:0.000172300906303 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:655161 feat_idx:0 feat_idx:49997 feat_idx:1076285 feat_idx:85900 feat_idx:79619 feat_idx:144963 feat_idx:148475 feat_idx:817613 feat_idx:933612 feat_idx:0 feat_idx:733763 feat_idx:288355 feat_idx:565066 feat_idx:310463 feat_idx:854924 feat_idx:378884 feat_idx:734534 feat_idx:1047606 feat_idx:884047 feat_idx:0 feat_idx:241528 feat_idx:40100 feat_idx:502861 feat_idx:752176 feat_value:0.0 feat_value:0.000209563874293 feat_value:0.00128175783932 feat_value:0.00412796697626 feat_value:0.000156868969634 feat_value:6.03196477333e-05 feat_value:1.77585196498e-05 feat_value:0.000661485033901 feat_value:0.000275681450084 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:328239 feat_idx:910743 feat_idx:915614 feat_idx:360051 feat_idx:615411 feat_idx:49489 feat_idx:1007823 feat_idx:148475 feat_idx:754940 feat_idx:224964 feat_idx:235573 feat_idx:226878 feat_idx:693306 feat_idx:277510 feat_idx:277345 feat_idx:197974 feat_idx:969807 feat_idx:0 feat_idx:0 feat_idx:539201 feat_idx:0 feat_idx:476211 feat_idx:650546 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:1.52590218967e-05 feat_value:0.0185758513932 feat_value:0.000874588764088 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:0.0450049967263 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.00270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:211148 feat_idx:0 feat_idx:0 feat_idx:943087 feat_idx:615411 feat_idx:98894 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:683585 feat_idx:0 feat_idx:460786 feat_idx:59528 feat_idx:883086 feat_idx:0 feat_idx:587215 feat_idx:197941 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:537421 feat_idx:24736 feat_idx:962390 feat_value:0.00017316017316 feat_value:0.00384200436203 feat_value:0.0 feat_value:0.00206398348813 feat_value:4.53378524953e-06 feat_value:4.63997290256e-06 feat_value:1.77585196498e-05 feat_value:0.000330742516951 feat_value:6.89203625211e-05 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:894672 feat_idx:521506 feat_idx:105841 feat_idx:360051 feat_idx:108674 feat_idx:642013 feat_idx:144963 feat_idx:148475 feat_idx:165260 feat_idx:212992 feat_idx:1009370 feat_idx:775147 feat_idx:223357 feat_idx:274230 feat_idx:833849 feat_idx:13161 feat_idx:57230 feat_idx:0 feat_idx:0 feat_idx:844134 feat_idx:925828 feat_idx:122096 feat_idx:141692 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.0 feat_value:0.0 feat_value:0.000716640321776 feat_value:0.00129223245336 feat_value:5.32755589494e-05 feat_value:0.000826856292376 feat_value:0.00423860229505 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:328856 feat_idx:506639 feat_idx:78755 feat_idx:463568 feat_idx:108674 feat_idx:152478 feat_idx:888742 feat_idx:148475 feat_idx:14838 feat_idx:682657 feat_idx:993166 feat_idx:502067 feat_idx:288355 feat_idx:190674 feat_idx:472919 feat_idx:13161 feat_idx:683739 feat_idx:734534 feat_idx:1047606 feat_idx:768815 feat_idx:0 feat_idx:122096 feat_idx:1010006 feat_idx:522503 feat_idx:963757 feat_value:0.0 feat_value:0.000104781937146 feat_value:6.10360875868e-05 feat_value:0.00206398348813 feat_value:8.87758330766e-05 feat_value:2.78398374153e-05 feat_value:0.000106551117899 feat_value:0.00165371258475 feat_value:0.00286019504463 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:738089 feat_idx:606995 feat_idx:964206 feat_idx:269737 feat_idx:360051 feat_idx:85900 feat_idx:608469 feat_idx:144963 feat_idx:148475 feat_idx:307543 feat_idx:405000 feat_idx:65140 feat_idx:749745 feat_idx:218723 feat_idx:686050 feat_idx:594443 feat_idx:13161 feat_idx:96125 feat_idx:0 feat_idx:0 feat_idx:946269 feat_idx:0 feat_idx:943262 feat_idx:395579 feat_idx:0 feat_idx:0 feat_value:0.00121212121212 feat_value:0.000407485311125 feat_value:0.0 feat_value:0.0030959752322 feat_value:3.3679547568e-05 feat_value:3.47997967692e-05 feat_value:0.000124309637549 feat_value:0.00248056887713 feat_value:0.000516902718908 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:906706 feat_idx:439682 feat_idx:4257 feat_idx:430841 feat_idx:314332 feat_idx:615411 feat_idx:998076 feat_idx:66687 feat_idx:148475 feat_idx:754940 feat_idx:648531 feat_idx:779745 feat_idx:718037 feat_idx:288355 feat_idx:360204 feat_idx:944849 feat_idx:13161 feat_idx:631544 feat_idx:0 feat_idx:0 feat_idx:177363 feat_idx:0 feat_idx:122096 feat_idx:1072137 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000194040624345 feat_value:0.0 feat_value:0.0 feat_value:0.000276301826779 feat_value:8.81594851486e-05 feat_value:0.000337411873346 feat_value:0.00165371258475 feat_value:0.00492780592026 feat_value:0.0 feat_value:0.04329004329 feat_value:0.0 feat_value:0.000811578520222 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:0 feat_idx:388090 feat_idx:314332 feat_idx:615411 feat_idx:595457 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:298800 feat_idx:0 feat_idx:349549 feat_idx:59528 feat_idx:28300 feat_idx:0 feat_idx:587215 feat_idx:750233 feat_idx:832803 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:612991 feat_idx:502861 feat_idx:691775 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.00122072175174 feat_value:0.0 feat_value:7.97946203918e-05 feat_value:0.000665836111517 feat_value:1.77585196498e-05 feat_value:0.000661485033901 feat_value:0.00158516833799 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:439682 feat_idx:998375 feat_idx:373577 feat_idx:314332 feat_idx:108674 feat_idx:76428 feat_idx:66687 feat_idx:148475 feat_idx:636407 feat_idx:840978 feat_idx:221841 feat_idx:110276 feat_idx:223357 feat_idx:104371 feat_idx:535541 feat_idx:599055 feat_idx:892333 feat_idx:0 feat_idx:0 feat_idx:519737 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000213626306554 feat_value:0.0061919504644 feat_value:0.000307951965711 feat_value:0.000396717683169 feat_value:3.55170392996e-05 feat_value:0.000330742516951 feat_value:0.000206761087563 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000811578520222 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:507093 feat_idx:28898 feat_idx:1067105 feat_idx:314332 feat_idx:615411 feat_idx:875540 feat_idx:144963 feat_idx:148475 feat_idx:801559 feat_idx:965246 feat_idx:93410 feat_idx:648840 feat_idx:59528 feat_idx:63243 feat_idx:1041736 feat_idx:763481 feat_idx:206486 feat_idx:0 feat_idx:0 feat_idx:623203 feat_idx:0 feat_idx:377126 feat_idx:1017627 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:4.65697498428e-05 feat_value:0.00013733119707 feat_value:0.0175438596491 feat_value:0.000508388452648 feat_value:0.0 feat_value:0.0 feat_value:0.00380353894493 feat_value:0.00441090320135 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.00229947247396 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:195832 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:414506 feat_idx:144963 feat_idx:148475 feat_idx:127380 feat_idx:385804 feat_idx:824386 feat_idx:203621 feat_idx:59528 feat_idx:631370 feat_idx:499188 feat_idx:587215 feat_idx:855342 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:969590 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.000267277435187 feat_value:0.000194878861907 feat_value:1.77585196498e-05 feat_value:0.00446502397883 feat_value:0.0024466728695 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:704711 feat_idx:701980 feat_idx:42486 feat_idx:314332 feat_idx:0 feat_idx:786460 feat_idx:144963 feat_idx:148475 feat_idx:466556 feat_idx:775018 feat_idx:404666 feat_idx:1065844 feat_idx:39086 feat_idx:992008 feat_idx:506428 feat_idx:599055 feat_idx:750233 feat_idx:256242 feat_idx:330429 feat_idx:218251 feat_idx:0 feat_idx:122096 feat_idx:221229 feat_idx:502861 feat_idx:24246 feat_value:0.0 feat_value:2.71656874083e-05 feat_value:0.000244144350347 feat_value:0.0 feat_value:0.000255835024795 feat_value:4.63997290256e-06 feat_value:3.55170392996e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:518052 feat_idx:1049859 feat_idx:0 feat_idx:1096 feat_idx:314332 feat_idx:615411 feat_idx:714816 feat_idx:795081 feat_idx:148475 feat_idx:900313 feat_idx:855314 feat_idx:0 feat_idx:603555 feat_idx:59528 feat_idx:211559 feat_idx:0 feat_idx:379814 feat_idx:311468 feat_idx:734534 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:383498 feat_idx:917031 feat_idx:879752 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000305180437934 feat_value:0.0165118679051 feat_value:6.68409482503e-05 feat_value:0.000215758739969 feat_value:0.000745857825292 feat_value:0.00529188027121 feat_value:0.0314276853096 feat_value:0.0 feat_value:0.0649350649351 feat_value:0.000249500998004 feat_value:0.00216420938726 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:695357 feat_idx:439682 feat_idx:433159 feat_idx:217415 feat_idx:360051 feat_idx:615411 feat_idx:235834 feat_idx:144963 feat_idx:148475 feat_idx:343946 feat_idx:489781 feat_idx:168412 feat_idx:950158 feat_idx:59528 feat_idx:419036 feat_idx:782554 feat_idx:854924 feat_idx:502656 feat_idx:0 feat_idx:0 feat_idx:1082526 feat_idx:0 feat_idx:476211 feat_idx:972567 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:8.92586871988e-05 feat_value:3.05180437934e-05 feat_value:0.00206398348813 feat_value:0.000310369984511 feat_value:0.000394397696717 feat_value:3.55170392996e-05 feat_value:0.000496113775426 feat_value:0.000827044350253 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:983083 feat_idx:555506 feat_idx:311508 feat_idx:360051 feat_idx:831162 feat_idx:662893 feat_idx:144963 feat_idx:148475 feat_idx:453404 feat_idx:437228 feat_idx:866349 feat_idx:987534 feat_idx:223357 feat_idx:872276 feat_idx:719825 feat_idx:13161 feat_idx:146364 feat_idx:0 feat_idx:0 feat_idx:1083188 feat_idx:0 feat_idx:122096 feat_idx:33938 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000314345811439 feat_value:3.05180437934e-05 feat_value:0.015479876161 feat_value:0.000186144268674 feat_value:0.000197198848359 feat_value:7.10340785992e-05 feat_value:0.00297668265255 feat_value:0.00792584168993 feat_value:0.0 feat_value:0.012987012987 feat_value:0.0 feat_value:0.00202894630055 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:638696 feat_idx:232393 feat_idx:537609 feat_idx:314332 feat_idx:85900 feat_idx:158968 feat_idx:144963 feat_idx:148475 feat_idx:411650 feat_idx:220560 feat_idx:633471 feat_idx:480237 feat_idx:39086 feat_idx:611928 feat_idx:584121 feat_idx:13161 feat_idx:747604 feat_idx:0 feat_idx:0 feat_idx:204145 feat_idx:0 feat_idx:476211 feat_idx:485685 feat_idx:0 feat_idx:0 feat_value:0.000519480519481 feat_value:1.16424374607e-05 feat_value:6.10360875868e-05 feat_value:0.0134158926729 feat_value:9.672075199e-06 feat_value:6.49596206358e-05 feat_value:5.32755589494e-05 feat_value:0.00578799404663 feat_value:0.000930424894035 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00175842012715 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:245713 feat_idx:964221 feat_idx:976933 feat_idx:360051 feat_idx:404876 feat_idx:469669 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:496768 feat_idx:978607 feat_idx:788967 feat_idx:59528 feat_idx:717827 feat_idx:227446 feat_idx:13161 feat_idx:251726 feat_idx:0 feat_idx:0 feat_idx:2400 feat_idx:0 feat_idx:476211 feat_idx:942610 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.00108662749633 feat_value:0.0 feat_value:0.0030959752322 feat_value:0.000315983242439 feat_value:5.56796748307e-05 feat_value:0.000106551117899 feat_value:0.000496113775426 feat_value:0.00337709776353 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:181401 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:404876 feat_idx:286011 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:966589 feat_idx:824386 feat_idx:429895 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:197974 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:321110 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000213994663778 feat_value:0.0 feat_value:0.0 feat_value:0.00611873656359 feat_value:0.00334263758227 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:0 feat_idx:268086 feat_idx:83142 feat_idx:288162 feat_idx:1060646 feat_idx:360051 feat_idx:615411 feat_idx:714816 feat_idx:144963 feat_idx:148475 feat_idx:138291 feat_idx:855314 feat_idx:165496 feat_idx:603555 feat_idx:59528 feat_idx:224690 feat_idx:316295 feat_idx:854924 feat_idx:257823 feat_idx:0 feat_idx:0 feat_idx:704548 feat_idx:0 feat_idx:122096 feat_idx:782694 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:1.16424374607e-05 feat_value:1.52590218967e-05 feat_value:0.0 feat_value:6.16163004865e-05 feat_value:6.95995935384e-06 feat_value:0.000284136314397 feat_value:0.00181908384323 feat_value:0.00172300906303 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.000249500998004 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:563443 feat_idx:51995 feat_idx:49997 feat_idx:314332 feat_idx:0 feat_idx:595457 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:188162 feat_idx:721984 feat_idx:349549 feat_idx:199920 feat_idx:180762 feat_idx:310463 feat_idx:197974 feat_idx:319863 feat_idx:734534 feat_idx:330429 feat_idx:467968 feat_idx:0 feat_idx:122096 feat_idx:40100 feat_idx:502861 feat_idx:777305 feat_value:0.000692640692641 feat_value:1.16424374607e-05 feat_value:0.000839246204318 feat_value:0.00825593395253 feat_value:3.70906812319e-05 feat_value:3.01598238666e-05 feat_value:7.10340785992e-05 feat_value:0.0019844551017 feat_value:0.000447982356387 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:281207 feat_idx:430926 feat_idx:909211 feat_idx:314332 feat_idx:0 feat_idx:928918 feat_idx:144963 feat_idx:148475 feat_idx:904134 feat_idx:535335 feat_idx:327558 feat_idx:639245 feat_idx:223357 feat_idx:18380 feat_idx:471487 feat_idx:13161 feat_idx:188469 feat_idx:0 feat_idx:0 feat_idx:500616 feat_idx:0 feat_idx:122096 feat_idx:657898 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.00101677287157 feat_value:1.52590218967e-05 feat_value:0.00103199174407 feat_value:2.15894535692e-07 feat_value:2.31998645128e-06 feat_value:0.000106551117899 feat_value:0.000165371258475 feat_value:3.44601812606e-05 feat_value:0.0217391304348 feat_value:0.012987012987 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:87449 feat_idx:691591 feat_idx:466372 feat_idx:360051 feat_idx:108674 feat_idx:537959 feat_idx:144963 feat_idx:148475 feat_idx:882632 feat_idx:1037965 feat_idx:783604 feat_idx:521533 feat_idx:59528 feat_idx:185313 feat_idx:972394 feat_idx:339114 feat_idx:644343 feat_idx:603603 feat_idx:330429 feat_idx:722203 feat_idx:925828 feat_idx:377126 feat_idx:221229 feat_idx:343446 feat_idx:24246 feat_value:0.0 feat_value:0.000504505623297 feat_value:1.52590218967e-05 feat_value:0.0030959752322 feat_value:7.26701007139e-05 feat_value:4.40797425743e-05 feat_value:0.000461721510895 feat_value:0.00281131139408 feat_value:0.0163685860988 feat_value:0.0 feat_value:0.038961038961 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:87868 feat_idx:585875 feat_idx:143202 feat_idx:105841 feat_idx:314332 feat_idx:615411 feat_idx:685294 feat_idx:795081 feat_idx:148475 feat_idx:754940 feat_idx:853239 feat_idx:1062322 feat_idx:529712 feat_idx:223357 feat_idx:715789 feat_idx:334774 feat_idx:197974 feat_idx:339749 feat_idx:0 feat_idx:0 feat_idx:540979 feat_idx:0 feat_idx:122096 feat_idx:141692 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0010041254855 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:0.00251559323202 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:154881 feat_idx:664380 feat_idx:0 feat_idx:470673 feat_idx:314332 feat_idx:108674 feat_idx:610634 feat_idx:144963 feat_idx:148475 feat_idx:125722 feat_idx:153800 feat_idx:0 feat_idx:297062 feat_idx:223357 feat_idx:712970 feat_idx:124318 feat_idx:13161 feat_idx:521259 feat_idx:734534 feat_idx:330429 feat_idx:0 feat_idx:969590 feat_idx:217677 feat_idx:643925 feat_idx:24736 feat_idx:941404 feat_value:0.00103896103896 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:3.95087000316e-05 feat_value:9.27994580512e-05 feat_value:0.000461721510895 feat_value:0.00545725152968 feat_value:0.00248113305076 feat_value:0.0217391304348 feat_value:0.012987012987 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:245713 feat_idx:987054 feat_idx:399764 feat_idx:360051 feat_idx:615411 feat_idx:684605 feat_idx:144963 feat_idx:148475 feat_idx:874792 feat_idx:107682 feat_idx:879950 feat_idx:321212 feat_idx:288355 feat_idx:369087 feat_idx:762311 feat_idx:13161 feat_idx:879575 feat_idx:0 feat_idx:0 feat_idx:1086254 feat_idx:0 feat_idx:122096 feat_idx:942610 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:4.57770656901e-05 feat_value:0.0123839009288 feat_value:0.000315551453367 feat_value:0.000225038685774 feat_value:3.55170392996e-05 feat_value:0.00347279642798 feat_value:0.00310141631345 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00162315704044 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:714652 feat_idx:0 feat_idx:213479 feat_idx:314332 feat_idx:0 feat_idx:432079 feat_idx:144963 feat_idx:148475 feat_idx:666980 feat_idx:405740 feat_idx:0 feat_idx:705197 feat_idx:288355 feat_idx:104862 feat_idx:0 feat_idx:339114 feat_idx:679030 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:1057480 feat_idx:343446 feat_idx:502409 feat_value:0.00138528138528 feat_value:1.16424374607e-05 feat_value:0.00022888532845 feat_value:0.0206398348813 feat_value:4.96557432092e-06 feat_value:5.56796748307e-05 feat_value:0.000142068157198 feat_value:0.00380353894493 feat_value:0.000827044350253 feat_value:0.0434782608696 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:83142 feat_idx:460446 feat_idx:323226 feat_idx:360051 feat_idx:108674 feat_idx:714816 feat_idx:795081 feat_idx:148475 feat_idx:900313 feat_idx:855314 feat_idx:824386 feat_idx:603555 feat_idx:59528 feat_idx:95559 feat_idx:499188 feat_idx:339114 feat_idx:882666 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:122096 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000159113311963 feat_value:3.05180437934e-05 feat_value:0.00412796697626 feat_value:0.000134675011365 feat_value:0.000345677981241 feat_value:0.00113654525759 feat_value:0.00793782040681 feat_value:0.00478996519522 feat_value:0.0 feat_value:0.025974025974 feat_value:0.00149700598802 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:507093 feat_idx:968965 feat_idx:115714 feat_idx:314332 feat_idx:108674 feat_idx:585814 feat_idx:144963 feat_idx:148475 feat_idx:1067472 feat_idx:905164 feat_idx:292795 feat_idx:1053010 feat_idx:223357 feat_idx:460894 feat_idx:592287 feat_idx:339114 feat_idx:1024304 feat_idx:0 feat_idx:0 feat_idx:1006115 feat_idx:0 feat_idx:122096 feat_idx:831861 feat_idx:0 feat_idx:0 feat_value:0.0152380952381 feat_value:0.00124962162078 feat_value:0.0 feat_value:0.00412796697626 feat_value:2.15894535692e-07 feat_value:9.27994580512e-06 feat_value:0.00158050824883 feat_value:0.00661485033901 feat_value:0.00303249595093 feat_value:0.0652173913043 feat_value:0.017316017316 feat_value:0.00299401197605 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:160536 feat_idx:572549 feat_idx:314332 feat_idx:0 feat_idx:984584 feat_idx:144963 feat_idx:148475 feat_idx:120200 feat_idx:190379 feat_idx:768743 feat_idx:628725 feat_idx:288355 feat_idx:967940 feat_idx:824472 feat_idx:854924 feat_idx:575938 feat_idx:568485 feat_idx:330429 feat_idx:469863 feat_idx:0 feat_idx:122096 feat_idx:26849 feat_idx:502861 feat_idx:9838 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000274662394141 feat_value:0.00515995872033 feat_value:7.26701007139e-05 feat_value:0.000185598916102 feat_value:0.000674823746692 feat_value:0.000826856292376 feat_value:0.00327371721975 feat_value:0.0 feat_value:0.021645021645 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:439682 feat_idx:462322 feat_idx:892535 feat_idx:314332 feat_idx:615411 feat_idx:183327 feat_idx:66687 feat_idx:31348 feat_idx:754940 feat_idx:780959 feat_idx:1076845 feat_idx:127420 feat_idx:59528 feat_idx:1034303 feat_idx:3336 feat_idx:587215 feat_idx:786401 feat_idx:0 feat_idx:0 feat_idx:273839 feat_idx:0 feat_idx:476211 feat_idx:841950 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000116424374607 feat_value:0.0 feat_value:0.0 feat_value:0.00487394867997 feat_value:0.00488589146639 feat_value:0.0 feat_value:0.000330742516951 feat_value:0.00327371721975 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:943087 feat_idx:615411 feat_idx:646596 feat_idx:144963 feat_idx:148475 feat_idx:320091 feat_idx:786096 feat_idx:824386 feat_idx:708545 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:599055 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:969590 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000211317571535 feat_value:4.63997290256e-05 feat_value:1.77585196498e-05 feat_value:0.00115759880933 feat_value:0.000689203625211 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:585875 feat_idx:1083253 feat_idx:105841 feat_idx:314332 feat_idx:615411 feat_idx:183043 feat_idx:66687 feat_idx:148475 feat_idx:754940 feat_idx:785290 feat_idx:78319 feat_idx:769776 feat_idx:223357 feat_idx:715789 feat_idx:30992 feat_idx:854924 feat_idx:339749 feat_idx:0 feat_idx:0 feat_idx:87470 feat_idx:0 feat_idx:122096 feat_idx:141692 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000135828437042 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.000404802254423 feat_value:0.0 feat_value:0.0 feat_value:0.00611873656359 feat_value:0.00062028326269 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:34199 feat_idx:460446 feat_idx:323226 feat_idx:360051 feat_idx:615411 feat_idx:617010 feat_idx:1041627 feat_idx:148475 feat_idx:754940 feat_idx:224964 feat_idx:824386 feat_idx:226878 feat_idx:288355 feat_idx:303932 feat_idx:499188 feat_idx:13161 feat_idx:628988 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:122096 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:3.49273123821e-05 feat_value:9.15541313802e-05 feat_value:0.015479876161 feat_value:0.000872775249989 feat_value:0.0011762331308 feat_value:0.000124309637549 feat_value:0.00694559285596 feat_value:0.0124056652538 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:664380 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:108674 feat_idx:248083 feat_idx:144963 feat_idx:148475 feat_idx:804470 feat_idx:868888 feat_idx:0 feat_idx:797434 feat_idx:59528 feat_idx:747120 feat_idx:0 feat_idx:13161 feat_idx:521259 feat_idx:495815 feat_idx:330429 feat_idx:0 feat_idx:11923 feat_idx:407810 feat_idx:566713 feat_idx:24736 feat_idx:915104 feat_value:0.00536796536797 feat_value:7.7616249738e-05 feat_value:3.05180437934e-05 feat_value:0.0113519091847 feat_value:1.25218830701e-05 feat_value:5.33596883794e-05 feat_value:0.000550514109144 feat_value:0.00380353894493 feat_value:0.00223991178194 feat_value:0.0434782608696 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00148789395374 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:439682 feat_idx:661250 feat_idx:819482 feat_idx:314332 feat_idx:404876 feat_idx:173004 feat_idx:795081 feat_idx:148475 feat_idx:133411 feat_idx:790823 feat_idx:853868 feat_idx:963286 feat_idx:223357 feat_idx:961787 feat_idx:355708 feat_idx:13161 feat_idx:618619 feat_idx:0 feat_idx:0 feat_idx:542491 feat_idx:0 feat_idx:377126 feat_idx:320543 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.00925573778126 feat_value:0.000198367284657 feat_value:0.00412796697626 feat_value:1.72715628554e-06 feat_value:9.27994580512e-06 feat_value:0.00122533785584 feat_value:0.000496113775426 feat_value:0.0209862503877 feat_value:0.0217391304348 feat_value:0.047619047619 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:695357 feat_idx:881707 feat_idx:387392 feat_idx:38631 feat_idx:314332 feat_idx:0 feat_idx:608594 feat_idx:144963 feat_idx:148475 feat_idx:756085 feat_idx:879727 feat_idx:1083007 feat_idx:253536 feat_idx:223357 feat_idx:462961 feat_idx:367591 feat_idx:13161 feat_idx:144331 feat_idx:0 feat_idx:0 feat_idx:853418 feat_idx:0 feat_idx:122096 feat_idx:783958 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000748996809972 feat_value:0.0 feat_value:0.0 feat_value:7.01225451928e-05 feat_value:1.39199187077e-05 feat_value:0.000514997069844 feat_value:0.000992227550852 feat_value:0.00255005341328 feat_value:0.0 feat_value:0.038961038961 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:87449 feat_idx:536408 feat_idx:619856 feat_idx:729041 feat_idx:615411 feat_idx:689549 feat_idx:1041627 feat_idx:148475 feat_idx:754940 feat_idx:42362 feat_idx:181047 feat_idx:385295 feat_idx:223357 feat_idx:751650 feat_idx:367088 feat_idx:339114 feat_idx:644343 feat_idx:809973 feat_idx:330429 feat_idx:28648 feat_idx:0 feat_idx:217677 feat_idx:305383 feat_idx:343446 feat_idx:1083427 feat_value:0.0 feat_value:8.53778747118e-05 feat_value:0.000122072175174 feat_value:0.00928792569659 feat_value:6.50274341504e-05 feat_value:7.19195799897e-05 feat_value:5.32755589494e-05 feat_value:0.00115759880933 feat_value:0.00117164616286 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00121736778033 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:439682 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:108674 feat_idx:713567 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:963705 feat_idx:0 feat_idx:599643 feat_idx:59528 feat_idx:967283 feat_idx:0 feat_idx:587215 feat_idx:434748 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:925828 feat_idx:476211 feat_idx:753350 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.000128066812068 feat_value:0.0 feat_value:0.0030959752322 feat_value:5.00875322806e-06 feat_value:7.19195799897e-05 feat_value:1.77585196498e-05 feat_value:0.000496113775426 feat_value:0.000103380543782 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:432429 feat_idx:319665 feat_idx:183269 feat_idx:85674 feat_idx:463568 feat_idx:0 feat_idx:130525 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:392441 feat_idx:1050223 feat_idx:862081 feat_idx:288355 feat_idx:484086 feat_idx:1077738 feat_idx:339114 feat_idx:934587 feat_idx:734534 feat_idx:94311 feat_idx:548757 feat_idx:0 feat_idx:321110 feat_idx:686449 feat_idx:474802 feat_idx:789529 feat_value:0.0 feat_value:3.49273123821e-05 feat_value:3.05180437934e-05 feat_value:0.0030959752322 feat_value:0.000119994182938 feat_value:0.0 feat_value:0.0 feat_value:0.000496113775426 feat_value:0.000447982356387 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:702327 feat_idx:0 feat_idx:217102 feat_idx:314332 feat_idx:85900 feat_idx:331250 feat_idx:888742 feat_idx:148475 feat_idx:197667 feat_idx:872960 feat_idx:0 feat_idx:925332 feat_idx:223357 feat_idx:57227 feat_idx:0 feat_idx:339114 feat_idx:91753 feat_idx:305875 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:476211 feat_idx:117207 feat_idx:502861 feat_idx:866455 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:0.000335698481727 feat_value:0.0030959752322 feat_value:0.000202379537758 feat_value:0.00056143672121 feat_value:0.000106551117899 feat_value:0.000992227550852 feat_value:0.00630621317068 feat_value:0.0 feat_value:0.004329004329 feat_value:0.000998003992016 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:849120 feat_idx:982375 feat_idx:949507 feat_idx:82312 feat_idx:314332 feat_idx:615411 feat_idx:641839 feat_idx:66687 feat_idx:148475 feat_idx:351286 feat_idx:1067936 feat_idx:1021395 feat_idx:423678 feat_idx:288355 feat_idx:491071 feat_idx:210032 feat_idx:13161 feat_idx:384630 feat_idx:661313 feat_idx:330429 feat_idx:466643 feat_idx:0 feat_idx:407810 feat_idx:818126 feat_idx:35064 feat_idx:312157 feat_value:0.0 feat_value:0.00022508712424 feat_value:0.000244144350347 feat_value:0.00722394220846 feat_value:7.32314265067e-05 feat_value:0.000167039024492 feat_value:3.55170392996e-05 feat_value:0.00115759880933 feat_value:0.00327371721975 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000946841606925 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:702327 feat_idx:0 feat_idx:450730 feat_idx:314332 feat_idx:615411 feat_idx:491223 feat_idx:27549 feat_idx:148475 feat_idx:24666 feat_idx:283209 feat_idx:0 feat_idx:91978 feat_idx:59528 feat_idx:89255 feat_idx:282181 feat_idx:13161 feat_idx:91753 feat_idx:633602 feat_idx:94311 feat_idx:0 feat_idx:0 feat_idx:377126 feat_idx:26849 feat_idx:502861 feat_idx:989849 feat_value:0.00103896103896 feat_value:1.16424374607e-05 feat_value:0.000427252613107 feat_value:0.0 feat_value:1.33854612129e-06 feat_value:0.0 feat_value:0.000106551117899 feat_value:0.0 feat_value:0.0 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:181401 feat_idx:704711 feat_idx:1084300 feat_idx:958176 feat_idx:314332 feat_idx:615411 feat_idx:809683 feat_idx:536544 feat_idx:148475 feat_idx:197667 feat_idx:23597 feat_idx:771551 feat_idx:444756 feat_idx:59528 feat_idx:28300 feat_idx:351738 feat_idx:339114 feat_idx:750233 feat_idx:734534 feat_idx:330429 feat_idx:5418 feat_idx:0 feat_idx:476211 feat_idx:221229 feat_idx:1007264 feat_idx:24246 feat_value:0.0 feat_value:8.53778747118e-05 feat_value:0.00013733119707 feat_value:0.0030959752322 feat_value:0.000622380767493 feat_value:0.00313894166858 feat_value:5.32755589494e-05 feat_value:0.000165371258475 feat_value:0.0124745856163 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:746729 feat_idx:742925 feat_idx:205831 feat_idx:912022 feat_idx:0 feat_idx:653684 feat_idx:144963 feat_idx:148475 feat_idx:891197 feat_idx:122292 feat_idx:282954 feat_idx:561978 feat_idx:223357 feat_idx:222724 feat_idx:538143 feat_idx:599055 feat_idx:706003 feat_idx:729650 feat_idx:1047606 feat_idx:475068 feat_idx:0 feat_idx:122096 feat_idx:744639 feat_idx:530010 feat_idx:785927 feat_value:0.0 feat_value:8.14970622249e-05 feat_value:0.00018310826276 feat_value:0.00825593395253 feat_value:0.000387098902496 feat_value:0.000102079403856 feat_value:3.55170392996e-05 feat_value:0.0019844551017 feat_value:0.00196423033185 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:201945 feat_idx:631742 feat_idx:306726 feat_idx:186386 feat_idx:314332 feat_idx:615411 feat_idx:337962 feat_idx:989504 feat_idx:31348 feat_idx:1068694 feat_idx:746192 feat_idx:359807 feat_idx:597620 feat_idx:59528 feat_idx:834098 feat_idx:463498 feat_idx:13161 feat_idx:144824 feat_idx:734534 feat_idx:1047606 feat_idx:447900 feat_idx:0 feat_idx:476211 feat_idx:421203 feat_idx:24736 feat_idx:272262 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.00767176914691 feat_value:0.0 feat_value:0.0 feat_value:0.000496113775426 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:894961 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:615411 feat_idx:927764 feat_idx:144963 feat_idx:148475 feat_idx:967242 feat_idx:1062285 feat_idx:0 feat_idx:736367 feat_idx:59528 feat_idx:562438 feat_idx:0 feat_idx:587215 feat_idx:896897 feat_idx:960559 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:377126 feat_idx:428982 feat_idx:525837 feat_idx:697480 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000305180437934 feat_value:0.0 feat_value:0.000190505338295 feat_value:0.00198358841584 feat_value:0.0 feat_value:0.000661485033901 feat_value:0.017988214618 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:506931 feat_idx:889703 feat_idx:428972 feat_idx:323226 feat_idx:314332 feat_idx:108674 feat_idx:731191 feat_idx:66687 feat_idx:31348 feat_idx:754940 feat_idx:639052 feat_idx:789125 feat_idx:318898 feat_idx:223357 feat_idx:275810 feat_idx:791919 feat_idx:189960 feat_idx:990004 feat_idx:0 feat_idx:0 feat_idx:128761 feat_idx:0 feat_idx:441547 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000228967936727 feat_value:3.05180437934e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:0 feat_idx:8 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:12 feat_idx:0 feat_idx:695357 feat_idx:702327 feat_idx:112382 feat_idx:364273 feat_idx:314332 feat_idx:615411 feat_idx:680585 feat_idx:144963 feat_idx:31348 feat_idx:776916 feat_idx:972993 feat_idx:307964 feat_idx:509894 feat_idx:59528 feat_idx:89255 feat_idx:498076 feat_idx:854924 feat_idx:91753 feat_idx:734534 feat_idx:94311 feat_idx:797195 feat_idx:0 feat_idx:377126 feat_idx:520021 feat_idx:522503 feat_idx:516793 feat_value:0.0 feat_value:0.000306584186465 feat_value:7.62951094835e-05 feat_value:0.0 feat_value:0.00199486550979 feat_value:0.0 feat_value:0.0 feat_value:0.00115759880933 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:711611 feat_idx:461913 feat_idx:1019942 feat_idx:360051 feat_idx:615411 feat_idx:1055981 feat_idx:948645 feat_idx:148475 feat_idx:754940 feat_idx:380775 feat_idx:858292 feat_idx:571110 feat_idx:288355 feat_idx:122497 feat_idx:986082 feat_idx:13161 feat_idx:87215 feat_idx:734534 feat_idx:94311 feat_idx:675199 feat_idx:0 feat_idx:122096 feat_idx:294199 feat_idx:522503 feat_idx:87571 feat_value:0.00675324675325 feat_value:4.26889373559e-05 feat_value:0.000640878919661 feat_value:0.0330237358101 feat_value:1.16583049274e-06 feat_value:7.65595528922e-05 feat_value:0.000692582266342 feat_value:0.00396891020341 feat_value:0.00110272580034 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00432841877452 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:72868 feat_idx:17848 feat_idx:314332 feat_idx:615411 feat_idx:363835 feat_idx:144963 feat_idx:31348 feat_idx:1069123 feat_idx:258719 feat_idx:753245 feat_idx:820316 feat_idx:39086 feat_idx:992008 feat_idx:325584 feat_idx:13161 feat_idx:750233 feat_idx:321110 feat_idx:94311 feat_idx:644181 feat_idx:0 feat_idx:476211 feat_idx:221229 feat_idx:502861 feat_idx:952230 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000839246204318 feat_value:0.00515995872033 feat_value:0.000625101038643 feat_value:0.0 feat_value:0.0 feat_value:0.000826856292376 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:31161 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:85900 feat_idx:834217 feat_idx:760883 feat_idx:148475 feat_idx:697060 feat_idx:390104 feat_idx:0 feat_idx:916053 feat_idx:59528 feat_idx:608516 feat_idx:0 feat_idx:587215 feat_idx:473726 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:476211 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:1.55232499476e-05 feat_value:7.62951094835e-05 feat_value:0.00825593395253 feat_value:3.02252349969e-07 feat_value:1.85598916102e-05 feat_value:1.77585196498e-05 feat_value:0.0013229700678 feat_value:0.000275681450084 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00108210469363 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:746729 feat_idx:0 feat_idx:415419 feat_idx:314332 feat_idx:85900 feat_idx:341613 feat_idx:341430 feat_idx:148475 feat_idx:219803 feat_idx:273068 feat_idx:0 feat_idx:427647 feat_idx:59528 feat_idx:86971 feat_idx:85678 feat_idx:13161 feat_idx:706003 feat_idx:970598 feat_idx:94311 feat_idx:378304 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:502861 feat_idx:1082916 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000106813153277 feat_value:0.0030959752322 feat_value:0.000435545636305 feat_value:0.000155439092236 feat_value:0.000106551117899 feat_value:0.000496113775426 feat_value:0.00196423033185 feat_value:0.0 feat_value:0.012987012987 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:0 feat_idx:695357 feat_idx:655161 feat_idx:410781 feat_idx:572549 feat_idx:314332 feat_idx:615411 feat_idx:438251 feat_idx:1017442 feat_idx:148475 feat_idx:754940 feat_idx:939988 feat_idx:175321 feat_idx:940584 feat_idx:223357 feat_idx:400890 feat_idx:229140 feat_idx:13161 feat_idx:512136 feat_idx:734534 feat_idx:94311 feat_idx:59009 feat_idx:0 feat_idx:122096 feat_idx:26849 feat_idx:502861 feat_idx:602609 feat_value:0.00121212121212 feat_value:1.55232499476e-05 feat_value:0.000610360875868 feat_value:0.0 feat_value:6.12276903223e-05 feat_value:5.33596883794e-05 feat_value:0.00261050238852 feat_value:0.0 feat_value:0.000241221268824 feat_value:0.0 feat_value:0.017316017316 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:181401 feat_idx:563443 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:85900 feat_idx:1086355 feat_idx:66687 feat_idx:148475 feat_idx:754940 feat_idx:294725 feat_idx:0 feat_idx:937034 feat_idx:59528 feat_idx:827972 feat_idx:0 feat_idx:197974 feat_idx:319863 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:808702 feat_idx:502861 feat_idx:792764 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000152590218967 feat_value:0.00206398348813 feat_value:0.000153069225806 feat_value:0.0 feat_value:0.0 feat_value:0.000330742516951 feat_value:0.000103380543782 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:962300 feat_idx:623087 feat_idx:0 feat_idx:53376 feat_idx:314332 feat_idx:615411 feat_idx:264532 feat_idx:144963 feat_idx:148475 feat_idx:14838 feat_idx:682657 feat_idx:0 feat_idx:502067 feat_idx:59528 feat_idx:519185 feat_idx:0 feat_idx:854924 feat_idx:372673 feat_idx:764350 feat_idx:330429 feat_idx:0 feat_idx:925828 feat_idx:377126 feat_idx:383664 feat_idx:522503 feat_idx:14052 feat_value:0.000865800865801 feat_value:0.000209563874293 feat_value:0.0 feat_value:0.00515995872033 feat_value:1.97327605623e-05 feat_value:1.15999322564e-05 feat_value:8.8792598249e-05 feat_value:0.00115759880933 feat_value:0.000379061993866 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.000249500998004 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:244091 feat_idx:428972 feat_idx:323226 feat_idx:314332 feat_idx:615411 feat_idx:253814 feat_idx:144963 feat_idx:148475 feat_idx:367991 feat_idx:359193 feat_idx:789125 feat_idx:173541 feat_idx:59528 feat_idx:433504 feat_idx:791919 feat_idx:587215 feat_idx:884062 feat_idx:0 feat_idx:0 feat_idx:128761 feat_idx:0 feat_idx:637620 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.00022888532845 feat_value:0.00206398348813 feat_value:0.000868414180368 feat_value:0.00070759586764 feat_value:1.77585196498e-05 feat_value:0.00711096411444 feat_value:0.00785692132741 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:518052 feat_idx:631742 feat_idx:209780 feat_idx:691946 feat_idx:463568 feat_idx:404876 feat_idx:781648 feat_idx:66687 feat_idx:148475 feat_idx:294231 feat_idx:673759 feat_idx:780141 feat_idx:636360 feat_idx:223357 feat_idx:656844 feat_idx:720701 feat_idx:13161 feat_idx:284891 feat_idx:734534 feat_idx:330429 feat_idx:564494 feat_idx:0 feat_idx:122096 feat_idx:529367 feat_idx:24736 feat_idx:225414 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:6.10360875868e-05 feat_value:0.0030959752322 feat_value:7.29291741568e-05 feat_value:0.000426877507035 feat_value:0.000213102235798 feat_value:0.00760707788986 feat_value:0.00182638960681 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:87449 feat_idx:0 feat_idx:0 feat_idx:943087 feat_idx:615411 feat_idx:14123 feat_idx:128514 feat_idx:148475 feat_idx:338941 feat_idx:655530 feat_idx:0 feat_idx:945302 feat_idx:288355 feat_idx:1078572 feat_idx:0 feat_idx:587215 feat_idx:644343 feat_idx:215210 feat_idx:330429 feat_idx:0 feat_idx:0 feat_idx:217677 feat_idx:830506 feat_idx:502861 feat_idx:560344 feat_value:0.000692640692641 feat_value:1.16424374607e-05 feat_value:0.00135805294881 feat_value:0.00412796697626 feat_value:2.09849488693e-05 feat_value:1.15999322564e-05 feat_value:7.10340785992e-05 feat_value:0.00115759880933 feat_value:0.000137840725042 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:76757 feat_idx:0 feat_idx:748549 feat_idx:729041 feat_idx:404876 feat_idx:897525 feat_idx:66687 feat_idx:148475 feat_idx:809357 feat_idx:739161 feat_idx:0 feat_idx:571774 feat_idx:223357 feat_idx:726585 feat_idx:450365 feat_idx:13161 feat_idx:1064696 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:925828 feat_idx:476211 feat_idx:381001 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.00016299412445 feat_value:3.05180437934e-05 feat_value:0.00103199174407 feat_value:0.000144347086564 feat_value:2.31998645128e-06 feat_value:0.000301894834047 feat_value:0.000330742516951 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:695357 feat_idx:702327 feat_idx:593344 feat_idx:1065368 feat_idx:463568 feat_idx:85900 feat_idx:669411 feat_idx:27549 feat_idx:148475 feat_idx:227359 feat_idx:1043530 feat_idx:320625 feat_idx:575561 feat_idx:223357 feat_idx:57227 feat_idx:1021160 feat_idx:854924 feat_idx:91753 feat_idx:943801 feat_idx:94311 feat_idx:758526 feat_idx:0 feat_idx:122096 feat_idx:154807 feat_idx:522503 feat_idx:406770 feat_value:0.0 feat_value:1.94040624345e-05 feat_value:1.52590218967e-05 feat_value:0.00206398348813 feat_value:0.000346985697764 feat_value:0.00038047777801 feat_value:0.000319653353696 feat_value:0.00214982636018 feat_value:0.0126468865226 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:518052 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:108674 feat_idx:2775 feat_idx:144963 feat_idx:31348 feat_idx:892705 feat_idx:1040029 feat_idx:824386 feat_idx:524213 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:599055 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.00307174745383 feat_value:0.000329438076082 feat_value:0.0 feat_value:0.00115759880933 feat_value:0.00217099141941 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:328856 feat_idx:583609 feat_idx:356189 feat_idx:314332 feat_idx:0 feat_idx:407260 feat_idx:144963 feat_idx:148475 feat_idx:699806 feat_idx:967004 feat_idx:598842 feat_idx:676678 feat_idx:223357 feat_idx:310528 feat_idx:805012 feat_idx:599055 feat_idx:683739 feat_idx:734534 feat_idx:94311 feat_idx:135625 feat_idx:0 feat_idx:122096 feat_idx:737768 feat_idx:522503 feat_idx:618666 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.000167849240864 feat_value:0.0030959752322 feat_value:0.000698807433128 feat_value:0.00028999830641 feat_value:3.55170392996e-05 feat_value:0.000496113775426 feat_value:0.00354939866984 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:0 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:849120 feat_idx:439682 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:615411 feat_idx:443349 feat_idx:1007823 feat_idx:31348 feat_idx:754940 feat_idx:1072328 feat_idx:0 feat_idx:321212 feat_idx:59528 feat_idx:163883 feat_idx:0 feat_idx:189960 feat_idx:1040747 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:925828 feat_idx:122096 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000554956185627 feat_value:3.05180437934e-05 feat_value:0.00206398348813 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000330742516951 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:738089 feat_idx:439682 feat_idx:374405 feat_idx:984218 feat_idx:943087 feat_idx:108674 feat_idx:884166 feat_idx:144963 feat_idx:148475 feat_idx:683571 feat_idx:374802 feat_idx:530646 feat_idx:826201 feat_idx:223357 feat_idx:43619 feat_idx:1001991 feat_idx:339114 feat_idx:603612 feat_idx:0 feat_idx:0 feat_idx:60686 feat_idx:0 feat_idx:122096 feat_idx:138318 feat_idx:0 feat_idx:0 feat_value:0.00034632034632 feat_value:1.16424374607e-05 feat_value:0.0 feat_value:0.00722394220846 feat_value:1.91282558623e-05 feat_value:8.58394986973e-05 feat_value:0.000124309637549 feat_value:0.00562262278816 feat_value:0.00971777111548 feat_value:0.0217391304348 feat_value:0.017316017316 feat_value:0.00174650698603 feat_value:0.000946841606925 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:1049859 feat_idx:420263 feat_idx:271401 feat_idx:360051 feat_idx:615411 feat_idx:714816 feat_idx:144963 feat_idx:148475 feat_idx:900313 feat_idx:855314 feat_idx:74337 feat_idx:603555 feat_idx:288355 feat_idx:650698 feat_idx:322858 feat_idx:339114 feat_idx:311468 feat_idx:489978 feat_idx:330429 feat_idx:101492 feat_idx:0 feat_idx:217677 feat_idx:221229 feat_idx:917031 feat_idx:24246 feat_value:0.00034632034632 feat_value:1.55232499476e-05 feat_value:0.000915541313802 feat_value:0.077399380805 feat_value:2.63391333544e-06 feat_value:0.000280718360605 feat_value:0.00092344302179 feat_value:0.00644947908054 feat_value:0.00854612495262 feat_value:0.0217391304348 feat_value:0.034632034632 feat_value:0.000249500998004 feat_value:0.0104152576762 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:541890 feat_idx:93486 feat_idx:892417 feat_idx:314332 feat_idx:0 feat_idx:870784 feat_idx:66687 feat_idx:148475 feat_idx:1064406 feat_idx:605532 feat_idx:908441 feat_idx:411003 feat_idx:223357 feat_idx:415710 feat_idx:177994 feat_idx:13161 feat_idx:721813 feat_idx:0 feat_idx:0 feat_idx:702388 feat_idx:0 feat_idx:122096 feat_idx:68781 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.000143590062015 feat_value:3.05180437934e-05 feat_value:0.0433436532508 feat_value:1.41626815414e-05 feat_value:0.000102079403856 feat_value:0.000266377794747 feat_value:0.00810319166529 feat_value:0.00199869051311 feat_value:0.0217391304348 feat_value:0.038961038961 feat_value:0.0 feat_value:0.00568104964155 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:463568 feat_idx:404876 feat_idx:679269 feat_idx:1007823 feat_idx:148475 feat_idx:754940 feat_idx:392943 feat_idx:824386 feat_idx:502022 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:763481 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.0 feat_value:0.0 feat_value:0.000644186115598 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:52223 feat_idx:0 feat_idx:610088 feat_idx:360051 feat_idx:108674 feat_idx:207287 feat_idx:144963 feat_idx:148475 feat_idx:198726 feat_idx:1050332 feat_idx:0 feat_idx:575881 feat_idx:863222 feat_idx:428650 feat_idx:56538 feat_idx:587215 feat_idx:520546 feat_idx:0 feat_idx:0 feat_idx:3328 feat_idx:0 feat_idx:321110 feat_idx:604513 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.00087290478671 feat_value:0.000153119105784 feat_value:1.77585196498e-05 feat_value:0.000165371258475 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:506931 feat_idx:664380 feat_idx:464058 feat_idx:794391 feat_idx:314332 feat_idx:615411 feat_idx:1008575 feat_idx:144963 feat_idx:148475 feat_idx:811905 feat_idx:262025 feat_idx:792836 feat_idx:853632 feat_idx:863222 feat_idx:190922 feat_idx:989611 feat_idx:13161 feat_idx:402822 feat_idx:622170 feat_idx:94311 feat_idx:626744 feat_idx:925828 feat_idx:122096 feat_idx:423382 feat_idx:24736 feat_idx:1081226 feat_value:0.00225108225108 feat_value:6.20929997904e-05 feat_value:0.00122072175174 feat_value:0.0330237358101 feat_value:1.63216268983e-05 feat_value:0.000266798441897 feat_value:0.000266377794747 feat_value:0.00611873656359 feat_value:0.00196423033185 feat_value:0.0217391304348 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00649262816177 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:268086 feat_idx:894961 feat_idx:594422 feat_idx:823711 feat_idx:360051 feat_idx:615411 feat_idx:919751 feat_idx:888742 feat_idx:148475 feat_idx:725649 feat_idx:522685 feat_idx:14144 feat_idx:242991 feat_idx:288355 feat_idx:645605 feat_idx:99736 feat_idx:379814 feat_idx:896897 feat_idx:734534 feat_idx:330429 feat_idx:710067 feat_idx:0 feat_idx:407810 feat_idx:474780 feat_idx:525837 feat_idx:815828 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.0013885709926 feat_value:0.00412796697626 feat_value:1.26514197916e-05 feat_value:0.000510397019281 feat_value:0.000621548187743 feat_value:0.000661485033901 feat_value:0.0022743719632 feat_value:0.0 feat_value:0.021645021645 feat_value:0.000249500998004 feat_value:0.000541052346815 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:0 feat_idx:268086 feat_idx:704711 feat_idx:539260 feat_idx:133619 feat_idx:943087 feat_idx:108674 feat_idx:277955 feat_idx:795081 feat_idx:148475 feat_idx:46173 feat_idx:414978 feat_idx:796305 feat_idx:317564 feat_idx:59528 feat_idx:28300 feat_idx:252652 feat_idx:854924 feat_idx:750233 feat_idx:637425 feat_idx:330429 feat_idx:538163 feat_idx:0 feat_idx:122096 feat_idx:623412 feat_idx:917031 feat_idx:421993 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.000244133540961 feat_value:0.0 feat_value:0.0 feat_value:0.000661485033901 feat_value:0.000516902718908 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:631742 feat_idx:0 feat_idx:618078 feat_idx:314332 feat_idx:831162 feat_idx:302234 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:683585 feat_idx:0 feat_idx:460786 feat_idx:59528 feat_idx:834098 feat_idx:0 feat_idx:13161 feat_idx:144824 feat_idx:734534 feat_idx:1047606 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:225853 feat_idx:24736 feat_idx:83301 feat_value:0.0 feat_value:0.000217325499267 feat_value:0.0 feat_value:0.0103199174407 feat_value:0.000282821841757 feat_value:0.000227358672225 feat_value:0.000603789668093 feat_value:0.00181908384323 feat_value:0.0120266032599 feat_value:0.0 feat_value:0.038961038961 feat_value:0.0 feat_value:0.00135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:552317 feat_idx:56734 feat_idx:314332 feat_idx:615411 feat_idx:205494 feat_idx:66687 feat_idx:148475 feat_idx:721787 feat_idx:258719 feat_idx:1026950 feat_idx:820316 feat_idx:59528 feat_idx:28300 feat_idx:783420 feat_idx:13161 feat_idx:750233 feat_idx:505787 feat_idx:330429 feat_idx:515764 feat_idx:0 feat_idx:476211 feat_idx:221229 feat_idx:502861 feat_idx:24246 feat_value:0.00103896103896 feat_value:7.7616249738e-06 feat_value:0.000152590218967 feat_value:0.0061919504644 feat_value:0.0 feat_value:0.0 feat_value:0.000106551117899 feat_value:0.00148834132628 feat_value:0.000310141631345 feat_value:0.0217391304348 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:294042 feat_idx:507045 feat_idx:549419 feat_idx:314332 feat_idx:0 feat_idx:1012202 feat_idx:795081 feat_idx:148475 feat_idx:68578 feat_idx:717684 feat_idx:462100 feat_idx:729242 feat_idx:59528 feat_idx:182004 feat_idx:253871 feat_idx:763481 feat_idx:256400 feat_idx:0 feat_idx:0 feat_idx:915751 feat_idx:0 feat_idx:122096 feat_idx:1030847 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:1.52590218967e-05 feat_value:0.0030959752322 feat_value:0.000125262009609 feat_value:0.0 feat_value:0.0 feat_value:0.000496113775426 feat_value:0.000310141631345 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000405789260111 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:1
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:181401 feat_idx:439682 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:0 feat_idx:1027059 feat_idx:144963 feat_idx:148475 feat_idx:307216 feat_idx:1086145 feat_idx:0 feat_idx:784143 feat_idx:59528 feat_idx:127555 feat_idx:0 feat_idx:13161 feat_idx:757164 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:0 feat_idx:0 feat_idx:0 feat_value:0.00017316017316 feat_value:0.000100901124659 feat_value:1.52590218967e-05 feat_value:0.0144478844169 feat_value:2.41801879975e-06 feat_value:3.47997967692e-05 feat_value:0.000177585196498 feat_value:0.00578799404663 feat_value:0.00554808918295 feat_value:0.0217391304348 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.00202894630055 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:849120 feat_idx:704711 feat_idx:160536 feat_idx:572549 feat_idx:360051 feat_idx:0 feat_idx:731718 feat_idx:66687 feat_idx:148475 feat_idx:31385 feat_idx:1047396 feat_idx:768743 feat_idx:258527 feat_idx:863222 feat_idx:866128 feat_idx:824472 feat_idx:599055 feat_idx:575938 feat_idx:568485 feat_idx:94311 feat_idx:469863 feat_idx:0 feat_idx:122096 feat_idx:26849 feat_idx:502861 feat_idx:9838 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.00114442664225 feat_value:0.0227038183695 feat_value:0.000255273699002 feat_value:0.000419917547682 feat_value:3.55170392996e-05 feat_value:0.00363816768646 feat_value:0.00234329232572 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00297578790748 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:695357 feat_idx:447935 feat_idx:937213 feat_idx:905937 feat_idx:314332 feat_idx:404876 feat_idx:142618 feat_idx:144963 feat_idx:148475 feat_idx:750865 feat_idx:596218 feat_idx:919681 feat_idx:840670 feat_idx:59528 feat_idx:380839 feat_idx:380828 feat_idx:13161 feat_idx:197572 feat_idx:1030936 feat_idx:94311 feat_idx:827510 feat_idx:0 feat_idx:377126 feat_idx:288434 feat_idx:24736 feat_idx:933741 feat_value:0.0 feat_value:0.000504505623297 feat_value:3.05180437934e-05 feat_value:0.0237358101135 feat_value:0.000683824352351 feat_value:5.33596883794e-05 feat_value:7.10340785992e-05 feat_value:0.00396891020341 feat_value:0.000792584168993 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.00311105099418 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:1
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:695357 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:314332 feat_idx:404876 feat_idx:195437 feat_idx:144963 feat_idx:148475 feat_idx:303093 feat_idx:895160 feat_idx:824386 feat_idx:332768 feat_idx:288355 feat_idx:452911 feat_idx:499188 feat_idx:339114 feat_idx:1026477 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:0 feat_idx:407810 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.00100192336124 feat_value:0.0 feat_value:0.0 feat_value:0.00529188027121 feat_value:0.0013094868879 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:0 feat_idx:5 feat_idx:0 feat_idx:0 feat_idx:8 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:569676 feat_idx:460446 feat_idx:323226 feat_idx:943087 feat_idx:615411 feat_idx:831536 feat_idx:144963 feat_idx:31348 feat_idx:1084149 feat_idx:472585 feat_idx:824386 feat_idx:1085274 feat_idx:863222 feat_idx:406685 feat_idx:499188 feat_idx:13161 feat_idx:251433 feat_idx:0 feat_idx:0 feat_idx:335421 feat_idx:969590 feat_idx:476211 feat_idx:686449 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:7.7616249738e-06 feat_value:0.0 feat_value:0.0 feat_value:0.0294215028194 feat_value:0.0 feat_value:0.0 feat_value:0.00181908384323 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:29151 feat_idx:0 feat_idx:0 feat_idx:314332 feat_idx:615411 feat_idx:351823 feat_idx:144963 feat_idx:148475 feat_idx:633435 feat_idx:734591 feat_idx:0 feat_idx:346678 feat_idx:59528 feat_idx:246568 feat_idx:0 feat_idx:13161 feat_idx:669279 feat_idx:734534 feat_idx:94311 feat_idx:0 feat_idx:0 feat_idx:122096 feat_idx:311968 feat_idx:1007264 feat_idx:210855 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:0.000976577401389 feat_value:0.0113519091847 feat_value:6.45092872648e-05 feat_value:0.00019951883481 feat_value:0.000266377794747 feat_value:0.00214982636018 feat_value:0.00796030187119 feat_value:0.0 feat_value:0.017316017316 feat_value:0.0 feat_value:0.00148789395374 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:0 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:0 feat_idx:268086 feat_idx:655161 feat_idx:160536 feat_idx:572549 feat_idx:943087 feat_idx:108674 feat_idx:179440 feat_idx:144963 feat_idx:148475 feat_idx:754940 feat_idx:216593 feat_idx:768743 feat_idx:272886 feat_idx:288355 feat_idx:1059113 feat_idx:824472 feat_idx:599055 feat_idx:512136 feat_idx:734534 feat_idx:94311 feat_idx:469863 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:502861 feat_idx:507836 feat_value:0.0 feat_value:1.55232499476e-05 feat_value:1.52590218967e-05 feat_value:0.0 feat_value:0.000125348367423 feat_value:4.63997290256e-06 feat_value:5.32755589494e-05 feat_value:0.0 feat_value:6.89203625211e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:0 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:439682 feat_idx:434330 feat_idx:626900 feat_idx:360051 feat_idx:615411 feat_idx:448250 feat_idx:66687 feat_idx:31348 feat_idx:621494 feat_idx:345898 feat_idx:171523 feat_idx:728643 feat_idx:288355 feat_idx:993766 feat_idx:479691 feat_idx:599055 feat_idx:786401 feat_idx:0 feat_idx:0 feat_idx:914361 feat_idx:0 feat_idx:407810 feat_idx:253237 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.000419127748585 feat_value:1.52590218967e-05 feat_value:0.00103199174407 feat_value:0.00740600297347 feat_value:0.0 feat_value:0.0 feat_value:0.000165371258475 feat_value:0.000447982356387 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:599320 feat_idx:36543 feat_idx:348417 feat_idx:314332 feat_idx:615411 feat_idx:507688 feat_idx:795081 feat_idx:148475 feat_idx:1085001 feat_idx:538920 feat_idx:698736 feat_idx:914324 feat_idx:223357 feat_idx:726559 feat_idx:327135 feat_idx:13161 feat_idx:214732 feat_idx:324501 feat_idx:1047606 feat_idx:434899 feat_idx:0 feat_idx:377126 feat_idx:221229 feat_idx:522503 feat_idx:24246 feat_value:0.0 feat_value:0.000147470874502 feat_value:0.0013733119707 feat_value:0.00206398348813 feat_value:0.00178026634132 feat_value:0.00081663523085 feat_value:0.0 feat_value:0.000826856292376 feat_value:0.00151624797546 feat_value:0.0 feat_value:0.0 feat_value:0.0 feat_value:0.000676315433518 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:0 feat_idx:417270 feat_idx:314332 feat_idx:404876 feat_idx:180197 feat_idx:144963 feat_idx:148475 feat_idx:891898 feat_idx:832883 feat_idx:0 feat_idx:406751 feat_idx:59528 feat_idx:28300 feat_idx:80459 feat_idx:587215 feat_idx:750233 feat_idx:52536 feat_idx:1047606 feat_idx:584293 feat_idx:0 feat_idx:476211 feat_idx:26849 feat_idx:502861 feat_idx:983005 feat_value:0.0 feat_value:1.16424374607e-05 feat_value:0.00119020370794 feat_value:0.00103199174407 feat_value:0.000683737994537 feat_value:0.000510397019281 feat_value:1.77585196498e-05 feat_value:0.000165371258475 feat_value:3.44601812606e-05 feat_value:0.0 feat_value:0.004329004329 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
+feat_idx:0 feat_idx:2 feat_idx:0 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:0 feat_idx:11 feat_idx:12 feat_idx:13 feat_idx:506931 feat_idx:123566 feat_idx:961529 feat_idx:810019 feat_idx:314332 feat_idx:615411 feat_idx:475867 feat_idx:795081 feat_idx:148475 feat_idx:697060 feat_idx:1069621 feat_idx:370551 feat_idx:696973 feat_idx:69630 feat_idx:396064 feat_idx:95177 feat_idx:854924 feat_idx:488825 feat_idx:0 feat_idx:0 feat_idx:581782 feat_idx:0 feat_idx:476211 feat_idx:289148 feat_idx:0 feat_idx:0 feat_value:0.0 feat_value:0.0066672358525 feat_value:0.0 feat_value:0.00103199174407 feat_value:0.000325784854359 feat_value:4.40797425743e-05 feat_value:0.000266377794747 feat_value:0.000165371258475 feat_value:0.00299803576967 feat_value:0.0 feat_value:0.030303030303 feat_value:0.0 feat_value:0.000135263086704 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:0.0 label:0
+feat_idx:1 feat_idx:2 feat_idx:3 feat_idx:4 feat_idx:5 feat_idx:6 feat_idx:7 feat_idx:8 feat_idx:9 feat_idx:10 feat_idx:11 feat_idx:0 feat_idx:13 feat_idx:268086 feat_idx:704711 feat_idx:995515 feat_idx:139394 feat_idx:943087 feat_idx:0 feat_idx:546815 feat_idx:144963 feat_idx:148475 feat_idx:364765 feat_idx:552750 feat_idx:920037 feat_idx:816538 feat_idx:223357 feat_idx:790588 feat_idx:560935 feat_idx:13161 feat_idx:750233 feat_idx:734534 feat_idx:1047606 feat_idx:361734 feat_idx:0 feat_idx:122096 feat_idx:434883 feat_idx:502861 feat_idx:203213 feat_value:0.0 feat_value:0.000197921436832 feat_value:4.57770656901e-05 feat_value:0.00206398348813 feat_value:0.000625316933178 feat_value:0.000874634892132 feat_value:0.000142068157198 feat_value:0.000330742516951 feat_value:0.00975223129674 feat_value:0.0 feat_value:0.00865800865801 feat_value:0.0 feat_value:0.000270526173407 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:0.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 feat_value:1.0 label:0
diff --git a/models/rank/xdeepfm/model.py b/models/rank/xdeepfm/model.py
index 6619c78bc718674f0efd12cf841efbe85f3cd729..4ca057bdcf9b858b7423b3fbaaa8e1e51d12ae86 100755
--- a/models/rank/xdeepfm/model.py
+++ b/models/rank/xdeepfm/model.py
@@ -1,96 +1,113 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import paddle.fluid as fluid
-import math
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
- def xdeepfm_net(self):
+ def _init_hyper_parameters(self):
+ self.sparse_feature_number = envs.get_global_env(
+ "hyper_parameters.sparse_feature_number", None)
+ self.sparse_feature_dim = envs.get_global_env(
+ "hyper_parameters.sparse_feature_dim", None)
+ self.num_field = envs.get_global_env("hyper_parameters.num_field",
+ None)
+ self.layer_sizes_cin = envs.get_global_env(
+ "hyper_parameters.layer_sizes_cin", None)
+ self.layer_sizes_dnn = envs.get_global_env(
+ "hyper_parameters.layer_sizes_dnn", None)
+ self.act = envs.get_global_env("hyper_parameters.act", None)
+
+ def net(self, inputs, is_infer=False):
+ raw_feat_idx = self._sparse_data_var[1]
+ raw_feat_value = self._dense_data_var[0]
+ self.label = self._sparse_data_var[0]
+
init_value_ = 0.1
initer = fluid.initializer.TruncatedNormalInitializer(
loc=0.0, scale=init_value_)
-
+
is_distributed = True if envs.get_trainer() == "CtrTrainer" else False
- sparse_feature_number = envs.get_global_env("hyper_parameters.sparse_feature_number", None, self._namespace)
- sparse_feature_dim = envs.get_global_env("hyper_parameters.sparse_feature_dim", None, self._namespace)
-
+
# ------------------------- network input --------------------------
-
- num_field = envs.get_global_env("hyper_parameters.num_field", None, self._namespace)
- raw_feat_idx = fluid.data(name='feat_idx', shape=[None, num_field], dtype='int64')
- raw_feat_value = fluid.data(name='feat_value', shape=[None, num_field], dtype='float32')
- self.label = fluid.data(name='label', shape=[None, 1], dtype='float32') # None * 1
- feat_idx = fluid.layers.reshape(raw_feat_idx, [-1, 1]) # (None * num_field) * 1
- feat_value = fluid.layers.reshape(raw_feat_value, [-1, num_field, 1]) # None * num_field * 1
+
+ feat_idx = raw_feat_idx
+ feat_value = fluid.layers.reshape(
+ raw_feat_value, [-1, self.num_field, 1]) # None * num_field * 1
feat_embeddings = fluid.embedding(
input=feat_idx,
is_sparse=True,
dtype='float32',
- size=[sparse_feature_number + 1, sparse_feature_dim],
+ size=[self.sparse_feature_number + 1, self.sparse_feature_dim],
padding_idx=0,
param_attr=fluid.ParamAttr(initializer=initer))
- feat_embeddings = fluid.layers.reshape(
- feat_embeddings,
- [-1, num_field, sparse_feature_dim]) # None * num_field * embedding_size
+ feat_embeddings = fluid.layers.reshape(feat_embeddings, [
+ -1, self.num_field, self.sparse_feature_dim
+ ]) # None * num_field * embedding_size
feat_embeddings = feat_embeddings * feat_value # None * num_field * embedding_size
-
- # ------------------------- set _data_var --------------------------
-
- self._data_var.append(raw_feat_idx)
- self._data_var.append(raw_feat_value)
- self._data_var.append(self.label)
- if self._platform != "LINUX":
- self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=64, use_double_buffer=False, iterable=False)
-
+
# -------------------- linear --------------------
weights_linear = fluid.embedding(
input=feat_idx,
is_sparse=True,
dtype='float32',
- size=[sparse_feature_number + 1, 1],
+ size=[self.sparse_feature_number + 1, 1],
padding_idx=0,
param_attr=fluid.ParamAttr(initializer=initer))
weights_linear = fluid.layers.reshape(
- weights_linear, [-1, num_field, 1]) # None * num_field * 1
+ weights_linear, [-1, self.num_field, 1]) # None * num_field * 1
b_linear = fluid.layers.create_parameter(
shape=[1],
dtype='float32',
default_initializer=fluid.initializer.ConstantInitializer(value=0))
y_linear = fluid.layers.reduce_sum(
(weights_linear * feat_value), 1) + b_linear
-
+
# -------------------- CIN --------------------
- layer_sizes_cin = envs.get_global_env("hyper_parameters.layer_sizes_cin", None, self._namespace)
Xs = [feat_embeddings]
- last_s = num_field
- for s in layer_sizes_cin:
+ last_s = self.num_field
+ for s in self.layer_sizes_cin:
# calculate Z^(k+1) with X^k and X^0
X_0 = fluid.layers.reshape(
fluid.layers.transpose(Xs[0], [0, 2, 1]),
- [-1, sparse_feature_dim, num_field,
- 1]) # None, embedding_size, num_field, 1
+ [-1, self.sparse_feature_dim, self.num_field,
+ 1]) # None, embedding_size, num_field, 1
X_k = fluid.layers.reshape(
fluid.layers.transpose(Xs[-1], [0, 2, 1]),
- [-1, sparse_feature_dim, 1, last_s]) # None, embedding_size, 1, last_s
+ [-1, self.sparse_feature_dim, 1,
+ last_s]) # None, embedding_size, 1, last_s
Z_k_1 = fluid.layers.matmul(
X_0, X_k) # None, embedding_size, num_field, last_s
# compresses Z^(k+1) to X^(k+1)
Z_k_1 = fluid.layers.reshape(Z_k_1, [
- -1, sparse_feature_dim, last_s * num_field
+ -1, self.sparse_feature_dim, last_s * self.num_field
]) # None, embedding_size, last_s*num_field
Z_k_1 = fluid.layers.transpose(
Z_k_1, [0, 2, 1]) # None, s*num_field, embedding_size
Z_k_1 = fluid.layers.reshape(
- Z_k_1, [-1, last_s * num_field, 1, sparse_feature_dim]
+ Z_k_1,
+ [-1, last_s * self.num_field, 1, self.sparse_feature_dim]
) # None, last_s*num_field, 1, embedding_size (None, channal_in, h, w)
X_k_1 = fluid.layers.conv2d(
Z_k_1,
@@ -101,7 +118,8 @@ class Model(ModelBase):
param_attr=fluid.ParamAttr(
initializer=initer)) # None, s, 1, embedding_size
X_k_1 = fluid.layers.reshape(
- X_k_1, [-1, s, sparse_feature_dim]) # None, s, embedding_size
+ X_k_1,
+ [-1, s, self.sparse_feature_dim]) # None, s, embedding_size
Xs.append(X_k_1)
last_s = s
@@ -119,16 +137,15 @@ class Model(ModelBase):
# -------------------- DNN --------------------
- layer_sizes_dnn = envs.get_global_env("hyper_parameters.layer_sizes_dnn", None, self._namespace)
- act = envs.get_global_env("hyper_parameters.act", None, self._namespace)
- y_dnn = fluid.layers.reshape(feat_embeddings,
- [-1, num_field * sparse_feature_dim])
- for s in layer_sizes_dnn:
- y_dnn = fluid.layers.fc(input=y_dnn,
- size=s,
- act=act,
- param_attr=fluid.ParamAttr(initializer=initer),
- bias_attr=None)
+ y_dnn = fluid.layers.reshape(
+ feat_embeddings, [-1, self.num_field * self.sparse_feature_dim])
+ for s in self.layer_sizes_dnn:
+ y_dnn = fluid.layers.fc(
+ input=y_dnn,
+ size=s,
+ act=self.act,
+ param_attr=fluid.ParamAttr(initializer=initer),
+ bias_attr=None)
y_dnn = fluid.layers.fc(input=y_dnn,
size=1,
act=None,
@@ -138,11 +155,10 @@ class Model(ModelBase):
# ------------------- xDeepFM ------------------
self.predict = fluid.layers.sigmoid(y_linear + y_cin + y_dnn)
-
- def train_net(self):
- self.xdeepfm_net()
-
- cost = fluid.layers.log_loss(input=self.predict, label=self.label, epsilon=0.0000001)
+ cost = fluid.layers.log_loss(
+ input=self.predict,
+ label=fluid.layers.cast(self.label, "float32"),
+ epsilon=0.0000001)
batch_cost = fluid.layers.reduce_mean(cost)
self._cost = batch_cost
@@ -150,15 +166,9 @@ class Model(ModelBase):
predict_2d = fluid.layers.concat([1 - self.predict, self.predict], 1)
label_int = fluid.layers.cast(self.label, 'int64')
auc_var, batch_auc_var, _ = fluid.layers.auc(input=predict_2d,
- label=label_int,
- slide_steps=0)
+ label=label_int,
+ slide_steps=0)
self._metrics["AUC"] = auc_var
self._metrics["BATCH_AUC"] = batch_auc_var
-
- def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
- optimizer = fluid.optimizer.Adam(learning_rate, lazy_mode=True)
- return optimizer
-
- def infer_net(self, parameter_list):
- self.xdeepfm_net()
\ No newline at end of file
+ if is_infer:
+ self._infer_results["AUC"] = auc_var
diff --git a/models/recall/__init__.py b/models/recall/__init__.py
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..abf198b97e6e818e1fbe59006f98492640bcee54 100755
--- a/models/recall/__init__.py
+++ b/models/recall/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/recall/gnn/config.yaml b/models/recall/gnn/config.yaml
index 19eeb9e4f8fcbf473e2ef399801fe4d2f85a468d..50c6d401153a607a88d5eba713fc439303aad868 100755
--- a/models/recall/gnn/config.yaml
+++ b/models/recall/gnn/config.yaml
@@ -12,7 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
evaluate:
- workspace: "fleetrec.models.recall.gnn"
+ workspace: "paddlerec.models.recall.gnn"
reader:
batch_size: 50
class: "{workspace}/evaluate_reader.py"
@@ -24,7 +24,7 @@ train:
strategy: "async"
epochs: 2
- workspace: "fleetrec.models.recall.gnn"
+ workspace: "paddlerec.models.recall.gnn"
reader:
batch_size: 100
diff --git a/models/recall/gnn/data_process.sh b/models/recall/gnn/data_process.sh
index 9aa009f03cf2595ea0cb54e691800486f26a21bf..fc7ed827e0368c59cab8134d22f78e2200980f18 100755
--- a/models/recall/gnn/data_process.sh
+++ b/models/recall/gnn/data_process.sh
@@ -1,5 +1,19 @@
#! /bin/bash
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
set -e
echo "begin to download data"
@@ -17,5 +31,3 @@ mv diginetica/train.txt train_data
mkdir test_data
mv diginetica/test.txt test_data
-
-
diff --git a/models/recall/gnn/evaluate_reader.py b/models/recall/gnn/evaluate_reader.py
index 113433986b79530aad1c8da8aae84e5d7ad3d60d..b26ea8fa9fc347ce402575104dcfa6de23aa80fc 100755
--- a/models/recall/gnn/evaluate_reader.py
+++ b/models/recall/gnn/evaluate_reader.py
@@ -1,4 +1,4 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -11,28 +11,32 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
-import io
+
import copy
import random
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+
+import numpy as np
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
class EvaluateReader(Reader):
def init(self):
- self.batch_size = envs.get_global_env("batch_size", None, "evaluate.reader")
-
+ self.batch_size = envs.get_global_env("batch_size", None,
+ "evaluate.reader")
+
self.input = []
self.length = None
def base_read(self, files):
res = []
for f in files:
- with open(f, "r") as fin:
+ with open(f, "r") as fin:
for line in fin:
- line = line.strip().split('\t')
- res.append(tuple([map(int, line[0].split(',')), int(line[1])]))
+ line = line.strip().split('\t')
+ res.append(
+ tuple([map(int, line[0].split(',')), int(line[1])]))
return res
def make_data(self, cur_batch, batch_size):
@@ -73,10 +77,8 @@ class EvaluateReader(Reader):
u_deg_out[np.where(u_deg_out == 0)] = 1
adj_out.append(np.divide(adj.transpose(), u_deg_out).transpose())
- seq_index.append(
- [[id, np.where(node == i)[0][0]] for i in e[0]])
- last_index.append(
- [id, np.where(node == e[0][last_id[id]])[0][0]])
+ seq_index.append([[id, np.where(node == i)[0][0]] for i in e[0]])
+ last_index.append([id, np.where(node == e[0][last_id[id]])[0][0]])
label.append(e[1] - 1)
mask.append([[1] * (last_id[id] + 1) + [0] *
(max_seq_len - last_id[id] - 1)])
@@ -99,10 +101,13 @@ class EvaluateReader(Reader):
def _reader():
random.shuffle(self.input)
group_remain = self.length % batch_group_size
- for bg_id in range(0, self.length - group_remain, batch_group_size):
- cur_bg = copy.deepcopy(self.input[bg_id:bg_id + batch_group_size])
+ for bg_id in range(0, self.length - group_remain,
+ batch_group_size):
+ cur_bg = copy.deepcopy(self.input[bg_id:bg_id +
+ batch_group_size])
if train:
- cur_bg = sorted(cur_bg, key=lambda x: len(x[0]), reverse=True)
+ cur_bg = sorted(
+ cur_bg, key=lambda x: len(x[0]), reverse=True)
for i in range(0, batch_group_size, batch_size):
cur_batch = cur_bg[i:i + batch_size]
yield self.make_data(cur_batch, batch_size)
@@ -120,10 +125,11 @@ class EvaluateReader(Reader):
else:
# Due to fixed batch_size, discard the remaining ins
return
- #cur_batch = remain_data[i:]
- #yield self.make_data(cur_batch, group_remain % batch_size)
+ # cur_batch = remain_data[i:]
+ # yield self.make_data(cur_batch, group_remain % batch_size)
+
return _reader
-
+
def generate_batch_from_trainfiles(self, files):
self.input = self.base_read(files)
self.length = len(self.input)
@@ -132,4 +138,5 @@ class EvaluateReader(Reader):
def generate_sample(self, line):
def data_iter():
yield []
+
return data_iter
diff --git a/models/recall/gnn/model.py b/models/recall/gnn/model.py
index e63945e8a424257e742c911e2ae2444d234a0805..027fbb721131e203ed22485b4d8f9bd96b8ed3a3 100755
--- a/models/recall/gnn/model.py
+++ b/models/recall/gnn/model.py
@@ -12,32 +12,39 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
import math
+import numpy as np
+
import paddle.fluid as fluid
import paddle.fluid.layers as layers
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
self.init_config()
-
+
def init_config(self):
self._fetch_interval = 1
- self.items_num, self.ins_num = self.config_read(envs.get_global_env("hyper_parameters.config_path", None, self._namespace))
- self.train_batch_size = envs.get_global_env("batch_size", None, "train.reader")
- self.evaluate_batch_size = envs.get_global_env("batch_size", None, "evaluate.reader")
- self.hidden_size = envs.get_global_env("hyper_parameters.sparse_feature_dim", None, self._namespace)
- self.step = envs.get_global_env("hyper_parameters.gnn_propogation_steps", None, self._namespace)
-
+ self.items_num, self.ins_num = self.config_read(
+ envs.get_global_env("hyper_parameters.config_path", None,
+ self._namespace))
+ self.train_batch_size = envs.get_global_env("batch_size", None,
+ "train.reader")
+ self.evaluate_batch_size = envs.get_global_env("batch_size", None,
+ "evaluate.reader")
+ self.hidden_size = envs.get_global_env(
+ "hyper_parameters.sparse_feature_dim", None, self._namespace)
+ self.step = envs.get_global_env(
+ "hyper_parameters.gnn_propogation_steps", None, self._namespace)
def config_read(self, config_path=None):
- if config_path is None:
- raise ValueError("please set train.model.hyper_parameters.config_path at first")
+ if config_path is None:
+ raise ValueError(
+ "please set train.model.hyper_parameters.config_path at first")
with open(config_path, "r") as fin:
item_nums = int(fin.readline().strip())
ins_nums = int(fin.readline().strip())
@@ -45,106 +52,114 @@ class Model(ModelBase):
def input(self, bs):
self.items = fluid.data(
- name="items",
- shape=[bs, -1],
- dtype="int64") #[batch_size, uniq_max]
+ name="items", shape=[bs, -1],
+ dtype="int64") # [batch_size, uniq_max]
self.seq_index = fluid.data(
- name="seq_index",
- shape=[bs, -1, 2],
- dtype="int32") #[batch_size, seq_max, 2]
+ name="seq_index", shape=[bs, -1, 2],
+ dtype="int32") # [batch_size, seq_max, 2]
self.last_index = fluid.data(
- name="last_index",
- shape=[bs, 2],
- dtype="int32") #[batch_size, 2]
+ name="last_index", shape=[bs, 2], dtype="int32") # [batch_size, 2]
self.adj_in = fluid.data(
- name="adj_in",
- shape=[bs, -1, -1],
- dtype="float32") #[batch_size, seq_max, seq_max]
+ name="adj_in", shape=[bs, -1, -1],
+ dtype="float32") # [batch_size, seq_max, seq_max]
self.adj_out = fluid.data(
- name="adj_out",
- shape=[bs, -1, -1],
- dtype="float32") #[batch_size, seq_max, seq_max]
+ name="adj_out", shape=[bs, -1, -1],
+ dtype="float32") # [batch_size, seq_max, seq_max]
self.mask = fluid.data(
- name="mask",
- shape=[bs, -1, 1],
- dtype="float32") #[batch_size, seq_max, 1]
+ name="mask", shape=[bs, -1, 1],
+ dtype="float32") # [batch_size, seq_max, 1]
self.label = fluid.data(
- name="label",
- shape=[bs, 1],
- dtype="int64") #[batch_size, 1]
+ name="label", shape=[bs, 1], dtype="int64") # [batch_size, 1]
- res = [self.items, self.seq_index, self.last_index, self.adj_in, self.adj_out, self.mask, self.label]
+ res = [
+ self.items, self.seq_index, self.last_index, self.adj_in,
+ self.adj_out, self.mask, self.label
+ ]
return res
-
+
def train_input(self):
res = self.input(self.train_batch_size)
self._data_var = res
- use_dataloader = envs.get_global_env("hyper_parameters.use_DataLoader", False, self._namespace)
+ use_dataloader = envs.get_global_env("hyper_parameters.use_DataLoader",
+ False, self._namespace)
if self._platform != "LINUX" or use_dataloader:
self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=256, use_double_buffer=False, iterable=False)
+ feed_list=self._data_var,
+ capacity=256,
+ use_double_buffer=False,
+ iterable=False)
def net(self, items_num, hidden_size, step, bs):
- stdv = 1.0 / math.sqrt(hidden_size)
+ stdv = 1.0 / math.sqrt(hidden_size)
- def embedding_layer(input, table_name, emb_dim, initializer_instance=None):
+ def embedding_layer(input,
+ table_name,
+ emb_dim,
+ initializer_instance=None):
emb = fluid.embedding(
input=input,
size=[items_num, emb_dim],
param_attr=fluid.ParamAttr(
- name=table_name,
- initializer=initializer_instance),
- )
- return emb
-
- sparse_initializer = fluid.initializer.Uniform(low=-stdv, high=stdv)
- items_emb = embedding_layer(self.items, "emb", hidden_size, sparse_initializer)
+ name=table_name, initializer=initializer_instance), )
+ return emb
+
+ sparse_initializer = fluid.initializer.Uniform(low=-stdv, high=stdv)
+ items_emb = embedding_layer(self.items, "emb", hidden_size,
+ sparse_initializer)
pre_state = items_emb
for i in range(step):
- pre_state = layers.reshape(x=pre_state, shape=[bs, -1, hidden_size])
+ pre_state = layers.reshape(
+ x=pre_state, shape=[bs, -1, hidden_size])
state_in = layers.fc(
input=pre_state,
name="state_in",
size=hidden_size,
act=None,
num_flatten_dims=2,
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv)),
- bias_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv))) #[batch_size, uniq_max, h]
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv)),
+ bias_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) # [batch_size, uniq_max, h]
state_out = layers.fc(
input=pre_state,
name="state_out",
size=hidden_size,
act=None,
num_flatten_dims=2,
- param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv)),
- bias_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv))) #[batch_size, uniq_max, h]
-
- state_adj_in = layers.matmul(self.adj_in, state_in) #[batch_size, uniq_max, h]
- state_adj_out = layers.matmul(self.adj_out, state_out) #[batch_size, uniq_max, h]
-
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv)),
+ bias_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) # [batch_size, uniq_max, h]
+
+ state_adj_in = layers.matmul(self.adj_in,
+ state_in) # [batch_size, uniq_max, h]
+ state_adj_out = layers.matmul(
+ self.adj_out, state_out) # [batch_size, uniq_max, h]
+
gru_input = layers.concat([state_adj_in, state_adj_out], axis=2)
-
- gru_input = layers.reshape(x=gru_input, shape=[-1, hidden_size * 2])
- gru_fc = layers.fc(
- input=gru_input,
- name="gru_fc",
- size=3 * hidden_size,
- bias_attr=False)
+
+ gru_input = layers.reshape(
+ x=gru_input, shape=[-1, hidden_size * 2])
+ gru_fc = layers.fc(input=gru_input,
+ name="gru_fc",
+ size=3 * hidden_size,
+ bias_attr=False)
pre_state, _, _ = fluid.layers.gru_unit(
input=gru_fc,
- hidden=layers.reshape(x=pre_state, shape=[-1, hidden_size]),
+ hidden=layers.reshape(
+ x=pre_state, shape=[-1, hidden_size]),
size=3 * hidden_size)
-
+
final_state = layers.reshape(pre_state, shape=[bs, -1, hidden_size])
seq = layers.gather_nd(final_state, self.seq_index)
last = layers.gather_nd(final_state, self.last_index)
-
+
seq_fc = layers.fc(
input=seq,
name="seq_fc",
@@ -152,34 +167,32 @@ class Model(ModelBase):
bias_attr=False,
act=None,
num_flatten_dims=2,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv))) #[batch_size, seq_max, h]
- last_fc = layers.fc(
- input=last,
- name="last_fc",
- size=hidden_size,
- bias_attr=False,
- act=None,
- num_flatten_dims=1,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv))) #[bathc_size, h]
-
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) # [batch_size, seq_max, h]
+ last_fc = layers.fc(input=last,
+ name="last_fc",
+ size=hidden_size,
+ bias_attr=False,
+ act=None,
+ num_flatten_dims=1,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Uniform(
+                                    low=-stdv, high=stdv)))  # [batch_size, h]
+
seq_fc_t = layers.transpose(
- seq_fc, perm=[1, 0, 2]) #[seq_max, batch_size, h]
- add = layers.elementwise_add(
- seq_fc_t, last_fc) #[seq_max, batch_size, h]
+ seq_fc, perm=[1, 0, 2]) # [seq_max, batch_size, h]
+ add = layers.elementwise_add(seq_fc_t,
+ last_fc) # [seq_max, batch_size, h]
b = layers.create_parameter(
shape=[hidden_size],
dtype='float32',
- default_initializer=fluid.initializer.Constant(value=0.0)) #[h]
- add = layers.elementwise_add(add, b) #[seq_max, batch_size, h]
-
- add_sigmoid = layers.sigmoid(add) #[seq_max, batch_size, h]
+ default_initializer=fluid.initializer.Constant(value=0.0)) # [h]
+ add = layers.elementwise_add(add, b) # [seq_max, batch_size, h]
+
+ add_sigmoid = layers.sigmoid(add) # [seq_max, batch_size, h]
add_sigmoid = layers.transpose(
- add_sigmoid, perm=[1, 0, 2]) #[batch_size, seq_max, h]
-
+ add_sigmoid, perm=[1, 0, 2]) # [batch_size, seq_max, h]
+
weight = layers.fc(
input=add_sigmoid,
name="weight_fc",
@@ -187,15 +200,16 @@ class Model(ModelBase):
act=None,
num_flatten_dims=2,
bias_attr=False,
- param_attr=fluid.ParamAttr(
- initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv))) #[batch_size, seq_max, 1]
+ param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
+ low=-stdv, high=stdv))) # [batch_size, seq_max, 1]
weight *= self.mask
- weight_mask = layers.elementwise_mul(seq, weight, axis=0) #[batch_size, seq_max, h]
- global_attention = layers.reduce_sum(weight_mask, dim=1) #[batch_size, h]
-
+ weight_mask = layers.elementwise_mul(
+ seq, weight, axis=0) # [batch_size, seq_max, h]
+ global_attention = layers.reduce_sum(
+ weight_mask, dim=1) # [batch_size, h]
+
final_attention = layers.concat(
- [global_attention, last], axis=1) #[batch_size, 2*h]
+ [global_attention, last], axis=1) # [batch_size, 2*h]
final_attention_fc = layers.fc(
input=final_attention,
name="final_attention_fc",
@@ -203,16 +217,17 @@ class Model(ModelBase):
bias_attr=False,
act=None,
param_attr=fluid.ParamAttr(initializer=fluid.initializer.Uniform(
- low=-stdv, high=stdv))) #[batch_size, h]
-
- # all_vocab = layers.create_global_var(
- # shape=[items_num - 1],
- # value=0,
- # dtype="int64",
- # persistable=True,
- # name="all_vocab")
+ low=-stdv, high=stdv))) # [batch_size, h]
+
+ # all_vocab = layers.create_global_var(
+ # shape=[items_num - 1],
+ # value=0,
+ # dtype="int64",
+ # persistable=True,
+ # name="all_vocab")
all_vocab = np.arange(1, items_num).reshape((-1)).astype('int32')
- all_vocab = fluid.layers.cast(x=fluid.layers.assign(all_vocab), dtype='int64')
+ all_vocab = fluid.layers.cast(
+ x=fluid.layers.assign(all_vocab), dtype='int64')
all_emb = fluid.embedding(
input=all_vocab,
@@ -220,13 +235,13 @@ class Model(ModelBase):
name="emb",
initializer=fluid.initializer.Uniform(
low=-stdv, high=stdv)),
- size=[items_num, hidden_size]) #[all_vocab, h]
-
+ size=[items_num, hidden_size]) # [all_vocab, h]
+
logits = layers.matmul(
x=final_attention_fc, y=all_emb,
- transpose_y=True) #[batch_size, all_vocab]
+ transpose_y=True) # [batch_size, all_vocab]
softmax = layers.softmax_with_cross_entropy(
- logits=logits, label=self.label) #[batch_size, 1]
+ logits=logits, label=self.label) # [batch_size, 1]
self.loss = layers.reduce_mean(softmax) # [1]
self.acc = layers.accuracy(input=logits, label=self.label, k=20)
@@ -239,17 +254,21 @@ class Model(ModelBase):
def train_net(self):
self.train_input()
- self.net(self.items_num, self.hidden_size, self.step, self.train_batch_size)
+ self.net(self.items_num, self.hidden_size, self.step,
+ self.train_batch_size)
self.avg_loss()
self.metrics()
def optimizer(self):
- learning_rate = envs.get_global_env("hyper_parameters.learning_rate", None, self._namespace)
+ learning_rate = envs.get_global_env("hyper_parameters.learning_rate",
+ None, self._namespace)
step_per_epoch = self.ins_num // self.train_batch_size
- decay_steps = envs.get_global_env("hyper_parameters.decay_steps", None, self._namespace)
- decay_rate = envs.get_global_env("hyper_parameters.decay_rate", None, self._namespace)
+ decay_steps = envs.get_global_env("hyper_parameters.decay_steps", None,
+ self._namespace)
+ decay_rate = envs.get_global_env("hyper_parameters.decay_rate", None,
+ self._namespace)
l2 = envs.get_global_env("hyper_parameters.l2", None, self._namespace)
- optimizer = fluid.optimizer.Adam(
+ optimizer = fluid.optimizer.Adam(
learning_rate=fluid.layers.exponential_decay(
learning_rate=learning_rate,
decay_steps=decay_steps * step_per_epoch,
@@ -257,18 +276,22 @@ class Model(ModelBase):
regularization=fluid.regularizer.L2DecayRegularizer(
regularization_coeff=l2))
- return optimizer
+ return optimizer
def infer_input(self):
self._reader_namespace = "evaluate.reader"
res = self.input(self.evaluate_batch_size)
- self._infer_data_var = res
+ self._infer_data_var = res
self._infer_data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._infer_data_var, capacity=64, use_double_buffer=False, iterable=False)
-
+ feed_list=self._infer_data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
+
def infer_net(self):
- self.infer_input()
- self.net(self.items_num, self.hidden_size, self.step, self.evaluate_batch_size)
+ self.infer_input()
+ self.net(self.items_num, self.hidden_size, self.step,
+ self.evaluate_batch_size)
self._infer_results['acc'] = self.acc
- self._infer_results['loss'] = self.loss
+ self._infer_results['loss'] = self.loss
diff --git a/models/recall/gnn/raw_data/convert_data.py b/models/recall/gnn/raw_data/convert_data.py
index 2e0e57f1f781f7210c46ef265e1189e99a6f7a96..dfe6bc49fcfca0b98ed5cb0ee9d41832dc5c2205 100755
--- a/models/recall/gnn/raw_data/convert_data.py
+++ b/models/recall/gnn/raw_data/convert_data.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import argparse
import time
import pickle
@@ -10,6 +24,7 @@ parser.add_argument(
help='dataset dir: diginetica/yoochoose1_4/yoochoose1_64/sample')
opt = parser.parse_args()
+
def process_data(file_type):
path = os.path.join(opt.data_dir, file_type)
output_path = os.path.splitext(path)[0] + ".txt"
@@ -23,6 +38,7 @@ def process_data(file_type):
fout.write(str(data[i][1]))
fout.write("\n")
+
process_data("train")
process_data("test")
diff --git a/models/recall/gnn/raw_data/download.py b/models/recall/gnn/raw_data/download.py
index 69a1ee20b2d634e9eca47c621dce82ac2d98b5f2..9bebdf1b37e2cd45369c14bb7446c206de8017a0 100755
--- a/models/recall/gnn/raw_data/download.py
+++ b/models/recall/gnn/raw_data/download.py
@@ -1,3 +1,17 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
import requests
import sys
import time
diff --git a/models/recall/gnn/reader.py b/models/recall/gnn/reader.py
index 89150d5d2faf241f132614ef825a409d600c7ec3..68170f09a7a7c84547a67f970b6e127de40b0ccc 100755
--- a/models/recall/gnn/reader.py
+++ b/models/recall/gnn/reader.py
@@ -1,4 +1,4 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -11,28 +11,32 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
-import io
+
import copy
import random
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+
+import numpy as np
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
class TrainReader(Reader):
def init(self):
- self.batch_size = envs.get_global_env("batch_size", None, "train.reader")
-
+ self.batch_size = envs.get_global_env("batch_size", None,
+ "train.reader")
+
self.input = []
self.length = None
def base_read(self, files):
res = []
for f in files:
- with open(f, "r") as fin:
+ with open(f, "r") as fin:
for line in fin:
- line = line.strip().split('\t')
- res.append(tuple([map(int, line[0].split(',')), int(line[1])]))
+ line = line.strip().split('\t')
+ res.append(
+ tuple([map(int, line[0].split(',')), int(line[1])]))
return res
def make_data(self, cur_batch, batch_size):
@@ -73,10 +77,8 @@ class TrainReader(Reader):
u_deg_out[np.where(u_deg_out == 0)] = 1
adj_out.append(np.divide(adj.transpose(), u_deg_out).transpose())
- seq_index.append(
- [[id, np.where(node == i)[0][0]] for i in e[0]])
- last_index.append(
- [id, np.where(node == e[0][last_id[id]])[0][0]])
+ seq_index.append([[id, np.where(node == i)[0][0]] for i in e[0]])
+ last_index.append([id, np.where(node == e[0][last_id[id]])[0][0]])
label.append(e[1] - 1)
mask.append([[1] * (last_id[id] + 1) + [0] *
(max_seq_len - last_id[id] - 1)])
@@ -99,10 +101,13 @@ class TrainReader(Reader):
def _reader():
random.shuffle(self.input)
group_remain = self.length % batch_group_size
- for bg_id in range(0, self.length - group_remain, batch_group_size):
- cur_bg = copy.deepcopy(self.input[bg_id:bg_id + batch_group_size])
+ for bg_id in range(0, self.length - group_remain,
+ batch_group_size):
+ cur_bg = copy.deepcopy(self.input[bg_id:bg_id +
+ batch_group_size])
if train:
- cur_bg = sorted(cur_bg, key=lambda x: len(x[0]), reverse=True)
+ cur_bg = sorted(
+ cur_bg, key=lambda x: len(x[0]), reverse=True)
for i in range(0, batch_group_size, batch_size):
cur_batch = cur_bg[i:i + batch_size]
yield self.make_data(cur_batch, batch_size)
@@ -120,10 +125,11 @@ class TrainReader(Reader):
else:
# Due to fixed batch_size, discard the remaining ins
return
- #cur_batch = remain_data[i:]
- #yield self.make_data(cur_batch, group_remain % batch_size)
+ # cur_batch = remain_data[i:]
+ # yield self.make_data(cur_batch, group_remain % batch_size)
+
return _reader
-
+
def generate_batch_from_trainfiles(self, files):
self.input = self.base_read(files)
self.length = len(self.input)
@@ -132,4 +138,5 @@ class TrainReader(Reader):
def generate_sample(self, line):
def data_iter():
yield []
+
return data_iter
diff --git a/models/recall/gru4rec/config.yaml b/models/recall/gru4rec/config.yaml
index 35e57e01fddf030905a4ca0ae3a94ac782ea1bb3..90cc2d2debca27a0a5e5e7c2fba512c2796a1b14 100644
--- a/models/recall/gru4rec/config.yaml
+++ b/models/recall/gru4rec/config.yaml
@@ -12,38 +12,59 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
+workspace: "paddlerec.models.recall.gru4rec"
- epochs: 3
- workspace: "fleetrec.models.recall.gru4rec"
+dataset:
+- name: dataset_train
+ batch_size: 5
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/rsc15_reader.py"
+- name: dataset_infer
+ batch_size: 5
+ type: QueueDataset
+ data_path: "{workspace}/data/test"
+ data_converter: "{workspace}/rsc15_reader.py"
+
+hyper_parameters:
+ vocab_size: 1000
+ hid_size: 100
+ emb_lr_x: 10.0
+ gru_lr_x: 1.0
+ fc_lr_x: 1.0
+ init_low_bound: -0.04
+ init_high_bound: 0.04
+ optimizer:
+ class: adagrad
+ learning_rate: 0.01
+ strategy: async
- reader:
- batch_size: 5
- class: "{workspace}/rsc15_reader.py"
- train_data_path: "{workspace}/data/train"
+# use infer_runner mode and modify 'phase' below to run inference
+mode: train_runner
+#mode: infer_runner
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- vocab_size: 1000
- hid_size: 100
- emb_lr_x: 10.0
- gru_lr_x: 1.0
- fc_lr_x: 1.0
- init_low_bound: -0.04
- init_high_bound: 0.04
- learning_rate: 0.01
- optimizer: adagrad
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 10
+- name: infer_runner
+ class: single_infer
+ init_model_path: "increment/0"
+ device: cpu
+ epochs: 3
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
+ #- name: infer
+ # model: "{workspace}/model.py"
+ # dataset_name: dataset_infer
+ # thread_num: 1
diff --git a/models/recall/gru4rec/hdfs.log b/models/recall/gru4rec/hdfs.log
deleted file mode 100644
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/models/recall/gru4rec/model.py b/models/recall/gru4rec/model.py
index 5b461d76b17bae5535bc945c1d4c9e988eb7e1a8..571deadf7d97c1010a03590d5360337528b25685 100644
--- a/models/recall/gru4rec/model.py
+++ b/models/recall/gru4rec/model.py
@@ -12,76 +12,82 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import math
import paddle.fluid as fluid
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
- def all_vocab_network(self):
- """ network definition """
- recall_k = envs.get_global_env("hyper_parameters.recall_k", None, self._namespace)
- vocab_size = envs.get_global_env("hyper_parameters.vocab_size", None, self._namespace)
- hid_size = envs.get_global_env("hyper_parameters.hid_size", None, self._namespace)
- init_low_bound = envs.get_global_env("hyper_parameters.init_low_bound", None, self._namespace)
- init_high_bound = envs.get_global_env("hyper_parameters.init_high_bound", None, self._namespace)
- emb_lr_x = envs.get_global_env("hyper_parameters.emb_lr_x", None, self._namespace)
- gru_lr_x = envs.get_global_env("hyper_parameters.gru_lr_x", None, self._namespace)
- fc_lr_x = envs.get_global_env("hyper_parameters.fc_lr_x", None, self._namespace)
+ def _init_hyper_parameters(self):
+ self.recall_k = envs.get_global_env("hyper_parameters.recall_k")
+ self.vocab_size = envs.get_global_env("hyper_parameters.vocab_size")
+ self.hid_size = envs.get_global_env("hyper_parameters.hid_size")
+ self.init_low_bound = envs.get_global_env(
+ "hyper_parameters.init_low_bound")
+ self.init_high_bound = envs.get_global_env(
+ "hyper_parameters.init_high_bound")
+ self.emb_lr_x = envs.get_global_env("hyper_parameters.emb_lr_x")
+ self.gru_lr_x = envs.get_global_env("hyper_parameters.gru_lr_x")
+ self.fc_lr_x = envs.get_global_env("hyper_parameters.fc_lr_x")
+
+ def input_data(self, is_infer=False, **kwargs):
+
# Input data
src_wordseq = fluid.data(
name="src_wordseq", shape=[None, 1], dtype="int64", lod_level=1)
dst_wordseq = fluid.data(
name="dst_wordseq", shape=[None, 1], dtype="int64", lod_level=1)
+ return [src_wordseq, dst_wordseq]
+
+ def net(self, inputs, is_infer=False):
+ src_wordseq = inputs[0]
+ dst_wordseq = inputs[1]
+
emb = fluid.embedding(
input=src_wordseq,
- size=[vocab_size, hid_size],
+ size=[self.vocab_size, self.hid_size],
param_attr=fluid.ParamAttr(
+ name="emb",
initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=emb_lr_x),
+ low=self.init_low_bound, high=self.init_high_bound),
+ learning_rate=self.emb_lr_x),
is_sparse=True)
fc0 = fluid.layers.fc(input=emb,
- size=hid_size * 3,
+ size=self.hid_size * 3,
param_attr=fluid.ParamAttr(
initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
+ low=self.init_low_bound,
+ high=self.init_high_bound),
+ learning_rate=self.gru_lr_x))
gru_h0 = fluid.layers.dynamic_gru(
input=fc0,
- size=hid_size,
+ size=self.hid_size,
param_attr=fluid.ParamAttr(
initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=gru_lr_x))
+ low=self.init_low_bound, high=self.init_high_bound),
+ learning_rate=self.gru_lr_x))
fc = fluid.layers.fc(input=gru_h0,
- size=vocab_size,
+ size=self.vocab_size,
act='softmax',
param_attr=fluid.ParamAttr(
initializer=fluid.initializer.Uniform(
- low=init_low_bound, high=init_high_bound),
- learning_rate=fc_lr_x))
+ low=self.init_low_bound,
+ high=self.init_high_bound),
+ learning_rate=self.fc_lr_x))
cost = fluid.layers.cross_entropy(input=fc, label=dst_wordseq)
- acc = fluid.layers.accuracy(input=fc, label=dst_wordseq, k=recall_k)
+ acc = fluid.layers.accuracy(
+ input=fc, label=dst_wordseq, k=self.recall_k)
+ if is_infer:
+ self._infer_results['recall20'] = acc
+ return
avg_cost = fluid.layers.mean(x=cost)
- self._data_var.append(src_wordseq)
- self._data_var.append(dst_wordseq)
self._cost = avg_cost
self._metrics["cost"] = avg_cost
self._metrics["acc"] = acc
-
-
- def train_net(self):
- self.all_vocab_network()
-
-
- def infer_net(self):
- pass
diff --git a/models/recall/gru4rec/rsc15_reader.py b/models/recall/gru4rec/rsc15_reader.py
index fbfc4b91c749eb0c603ac335932103e2dabe83b0..4fe9433a65e1ceec508891eecfaaa5e464bc9e24 100644
--- a/models/recall/gru4rec/rsc15_reader.py
+++ b/models/recall/gru4rec/rsc15_reader.py
@@ -11,10 +11,10 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+from paddlerec.core.reader import Reader
class TrainReader(Reader):
diff --git a/models/recall/multiview-simnet/data_process.sh b/models/recall/multiview-simnet/data_process.sh
deleted file mode 100755
index 15c6c908477cd3ba6a72a65bad039bb10295bd9c..0000000000000000000000000000000000000000
--- a/models/recall/multiview-simnet/data_process.sh
+++ /dev/null
@@ -1,10 +0,0 @@
-#! /bin/bash
-
-set -e
-echo "begin to prepare data"
-
-mkdir -p data/train
-mkdir -p data/test
-
-python generate_synthetic_data.py
-
diff --git a/models/recall/ncf/__init__.py b/models/recall/ncf/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/recall/ncf/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/recall/ncf/config.yaml b/models/recall/ncf/config.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..16d298b12fd551bd8421b44bc12d536fdc962e8b
--- /dev/null
+++ b/models/recall/ncf/config.yaml
@@ -0,0 +1,67 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+workspace: "paddlerec.models.recall.ncf"
+
+dataset:
+- name: dataset_train
+ batch_size: 5
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/movielens_reader.py"
+- name: dataset_infer
+ batch_size: 5
+ type: QueueDataset
+ data_path: "{workspace}/data/test"
+ data_converter: "{workspace}/movielens_infer_reader.py"
+
+hyper_parameters:
+ num_users: 6040
+ num_items: 3706
+ latent_dim: 8
+ fc_layers: [64, 32, 16, 8]
+ optimizer:
+ class: adam
+ learning_rate: 0.001
+ strategy: async
+
+# use infer_runner mode and modify 'phase' below to run inference
+mode: train_runner
+#mode: infer_runner
+
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 10
+- name: infer_runner
+ class: single_infer
+ init_model_path: "increment/0"
+ device: cpu
+ epochs: 3
+
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
+ #- name: infer
+ # model: "{workspace}/model.py"
+ # dataset_name: dataset_infer
+ # thread_num: 1
diff --git a/models/recall/ncf/data/test/small_data.txt b/models/recall/ncf/data/test/small_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c3c4cf5f84f66594e76603cce1f18d211ebd05a7
--- /dev/null
+++ b/models/recall/ncf/data/test/small_data.txt
@@ -0,0 +1,100 @@
+4764,174,1
+4764,2958,0
+4764,452,0
+4764,1946,0
+4764,3208,0
+2044,2237,1
+2044,1998,0
+2044,328,0
+2044,1542,0
+2044,1932,0
+4276,65,1
+4276,3247,0
+4276,942,0
+4276,3666,0
+4276,2222,0
+3933,682,1
+3933,2451,0
+3933,3695,0
+3933,1643,0
+3933,3568,0
+1151,1265,1
+1151,118,0
+1151,2532,0
+1151,2083,0
+1151,2350,0
+1757,876,1
+1757,201,0
+1757,3633,0
+1757,1068,0
+1757,2549,0
+3370,276,1
+3370,2435,0
+3370,606,0
+3370,910,0
+3370,2146,0
+5137,1018,1
+5137,2163,0
+5137,3167,0
+5137,2315,0
+5137,3595,0
+3933,2831,1
+3933,2881,0
+3933,2949,0
+3933,3660,0
+3933,417,0
+3102,999,1
+3102,1902,0
+3102,2161,0
+3102,3042,0
+3102,1113,0
+2022,336,1
+2022,1672,0
+2022,2656,0
+2022,3649,0
+2022,883,0
+2664,655,1
+2664,3660,0
+2664,1711,0
+2664,3386,0
+2664,1668,0
+25,701,1
+25,32,0
+25,2482,0
+25,3177,0
+25,2767,0
+1738,1643,1
+1738,2187,0
+1738,228,0
+1738,650,0
+1738,3101,0
+5411,1241,1
+5411,2546,0
+5411,3019,0
+5411,3618,0
+5411,1674,0
+638,579,1
+638,3512,0
+638,783,0
+638,2111,0
+638,1880,0
+3554,200,1
+3554,2893,0
+3554,2428,0
+3554,969,0
+3554,2741,0
+4283,1074,1
+4283,3056,0
+4283,2032,0
+4283,405,0
+4283,1505,0
+5111,200,1
+5111,3488,0
+5111,477,0
+5111,2790,0
+5111,40,0
+3964,515,1
+3964,1528,0
+3964,2173,0
+3964,1701,0
+3964,2832,0
diff --git a/models/recall/ncf/data/train/small_data.txt b/models/recall/ncf/data/train/small_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c3c4cf5f84f66594e76603cce1f18d211ebd05a7
--- /dev/null
+++ b/models/recall/ncf/data/train/small_data.txt
@@ -0,0 +1,100 @@
+4764,174,1
+4764,2958,0
+4764,452,0
+4764,1946,0
+4764,3208,0
+2044,2237,1
+2044,1998,0
+2044,328,0
+2044,1542,0
+2044,1932,0
+4276,65,1
+4276,3247,0
+4276,942,0
+4276,3666,0
+4276,2222,0
+3933,682,1
+3933,2451,0
+3933,3695,0
+3933,1643,0
+3933,3568,0
+1151,1265,1
+1151,118,0
+1151,2532,0
+1151,2083,0
+1151,2350,0
+1757,876,1
+1757,201,0
+1757,3633,0
+1757,1068,0
+1757,2549,0
+3370,276,1
+3370,2435,0
+3370,606,0
+3370,910,0
+3370,2146,0
+5137,1018,1
+5137,2163,0
+5137,3167,0
+5137,2315,0
+5137,3595,0
+3933,2831,1
+3933,2881,0
+3933,2949,0
+3933,3660,0
+3933,417,0
+3102,999,1
+3102,1902,0
+3102,2161,0
+3102,3042,0
+3102,1113,0
+2022,336,1
+2022,1672,0
+2022,2656,0
+2022,3649,0
+2022,883,0
+2664,655,1
+2664,3660,0
+2664,1711,0
+2664,3386,0
+2664,1668,0
+25,701,1
+25,32,0
+25,2482,0
+25,3177,0
+25,2767,0
+1738,1643,1
+1738,2187,0
+1738,228,0
+1738,650,0
+1738,3101,0
+5411,1241,1
+5411,2546,0
+5411,3019,0
+5411,3618,0
+5411,1674,0
+638,579,1
+638,3512,0
+638,783,0
+638,2111,0
+638,1880,0
+3554,200,1
+3554,2893,0
+3554,2428,0
+3554,969,0
+3554,2741,0
+4283,1074,1
+4283,3056,0
+4283,2032,0
+4283,405,0
+4283,1505,0
+5111,200,1
+5111,3488,0
+5111,477,0
+5111,2790,0
+5111,40,0
+3964,515,1
+3964,1528,0
+3964,2173,0
+3964,1701,0
+3964,2832,0
diff --git a/models/recall/ncf/model.py b/models/recall/ncf/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..bc8b71cd85af647e054dda38048da68703859c88
--- /dev/null
+++ b/models/recall/ncf/model.py
@@ -0,0 +1,124 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import math
+import paddle.fluid as fluid
+
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
+import numpy as np
+
+
+class Model(ModelBase):
+ def __init__(self, config):
+ ModelBase.__init__(self, config)
+
+ def _init_hyper_parameters(self):
+ self.num_users = envs.get_global_env("hyper_parameters.num_users")
+ self.num_items = envs.get_global_env("hyper_parameters.num_items")
+ self.latent_dim = envs.get_global_env("hyper_parameters.latent_dim")
+ self.layers = envs.get_global_env("hyper_parameters.fc_layers")
+
+ def input_data(self, is_infer=False, **kwargs):
+ user_input = fluid.data(
+ name="user_input", shape=[-1, 1], dtype="int64", lod_level=0)
+ item_input = fluid.data(
+ name="item_input", shape=[-1, 1], dtype="int64", lod_level=0)
+ label = fluid.data(
+ name="label", shape=[-1, 1], dtype="int64", lod_level=0)
+ if is_infer:
+ inputs = [user_input] + [item_input]
+ else:
+ inputs = [user_input] + [item_input] + [label]
+
+ return inputs
+
+ def net(self, inputs, is_infer=False):
+
+        num_layer = len(self.layers)  # number of layers in the MLP
+
+ MF_Embedding_User = fluid.embedding(
+ input=inputs[0],
+ size=[self.num_users, self.latent_dim],
+ param_attr=fluid.initializer.Normal(
+ loc=0.0, scale=0.01),
+ is_sparse=True)
+ MF_Embedding_Item = fluid.embedding(
+ input=inputs[1],
+ size=[self.num_items, self.latent_dim],
+ param_attr=fluid.initializer.Normal(
+ loc=0.0, scale=0.01),
+ is_sparse=True)
+
+ MLP_Embedding_User = fluid.embedding(
+ input=inputs[0],
+ size=[self.num_users, int(self.layers[0] / 2)],
+ param_attr=fluid.initializer.Normal(
+ loc=0.0, scale=0.01),
+ is_sparse=True)
+ MLP_Embedding_Item = fluid.embedding(
+ input=inputs[1],
+ size=[self.num_items, int(self.layers[0] / 2)],
+ param_attr=fluid.initializer.Normal(
+ loc=0.0, scale=0.01),
+ is_sparse=True)
+
+ # MF part
+ mf_user_latent = fluid.layers.flatten(x=MF_Embedding_User, axis=1)
+ mf_item_latent = fluid.layers.flatten(x=MF_Embedding_Item, axis=1)
+ mf_vector = fluid.layers.elementwise_mul(mf_user_latent,
+ mf_item_latent)
+
+ # MLP part
+ # The 0-th layer is the concatenation of embedding layers
+ mlp_user_latent = fluid.layers.flatten(x=MLP_Embedding_User, axis=1)
+ mlp_item_latent = fluid.layers.flatten(x=MLP_Embedding_Item, axis=1)
+ mlp_vector = fluid.layers.concat(
+ input=[mlp_user_latent, mlp_item_latent], axis=-1)
+
+ for i in range(1, num_layer):
+ mlp_vector = fluid.layers.fc(
+ input=mlp_vector,
+ size=self.layers[i],
+ act='relu',
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=1.0 / math.sqrt(mlp_vector.shape[1])),
+ regularizer=fluid.regularizer.L2DecayRegularizer(
+ regularization_coeff=1e-4)),
+ name='layer_' + str(i))
+
+ # Concatenate MF and MLP parts
+ predict_vector = fluid.layers.concat(
+ input=[mf_vector, mlp_vector], axis=-1)
+
+ # Final prediction layer
+ prediction = fluid.layers.fc(
+ input=predict_vector,
+ size=1,
+ act='sigmoid',
+ param_attr=fluid.initializer.MSRAInitializer(uniform=True),
+ name='prediction')
+ if is_infer:
+ self._infer_results["prediction"] = prediction
+ return
+
+ cost = fluid.layers.log_loss(
+ input=prediction,
+ label=fluid.layers.cast(
+ x=inputs[2], dtype='float32'))
+ avg_cost = fluid.layers.mean(cost)
+
+ self._cost = avg_cost
+ self._metrics["cost"] = avg_cost
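The NCF network added in `model.py` above combines a GMF branch (element-wise product of user/item embeddings) with an MLP branch over concatenated embeddings, then feeds both into a sigmoid output layer. A minimal NumPy sketch of the same forward pass (an illustration with randomly initialized weights and made-up dimensions, not the Paddle implementation) is:

```python
import numpy as np

rng = np.random.default_rng(0)

num_users, num_items, latent_dim = 6040, 3706, 8
layers = [64, 32, 16, 8]  # fc_layers from config.yaml

# Embedding tables, randomly initialized for illustration only
mf_user = rng.normal(0, 0.01, (num_users, latent_dim))
mf_item = rng.normal(0, 0.01, (num_items, latent_dim))
mlp_user = rng.normal(0, 0.01, (num_users, layers[0] // 2))
mlp_item = rng.normal(0, 0.01, (num_items, layers[0] // 2))


def ncf_predict(u, i):
    # GMF branch: element-wise product of the MF embeddings
    mf_vector = mf_user[u] * mf_item[i]
    # MLP branch: concatenate embeddings, then pass through ReLU layers
    x = np.concatenate([mlp_user[u], mlp_item[i]])
    for size in layers[1:]:
        w = rng.normal(0, 1.0 / np.sqrt(x.shape[0]), (x.shape[0], size))
        x = np.maximum(x @ w, 0.0)  # ReLU
    # Final layer: concat both branches, sigmoid gives a click probability
    v = np.concatenate([mf_vector, x])
    w_out = rng.normal(0, 1.0 / np.sqrt(v.shape[0]), (v.shape[0], 1))
    return float(1.0 / (1.0 + np.exp(-(v @ w_out))))


p = ncf_predict(4764, 174)  # a probability in (0, 1)
```

Training then minimizes the log loss between this probability and the 0/1 label, exactly as the `log_loss` call in the model above does.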
diff --git a/models/rank/wide_deep/reader.py b/models/recall/ncf/movielens_infer_reader.py
old mode 100755
new mode 100644
similarity index 57%
rename from models/rank/wide_deep/reader.py
rename to models/recall/ncf/movielens_infer_reader.py
index acb6d3cc21ce553f4f876622b3f0a3f749f619c0..148c8008eb058ee3a126b1ec3253f2893d2e7150
--- a/models/rank/wide_deep/reader.py
+++ b/models/recall/ncf/movielens_infer_reader.py
@@ -13,31 +13,29 @@
# limitations under the License.
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
-try:
- import cPickle as pickle
-except ImportError:
- import pickle
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+from collections import defaultdict
+import numpy as np
+
class TrainReader(Reader):
def init(self):
pass
- def _process_line(self, line):
- line = line.strip().split(',')
- features = list(map(float, line))
- wide_feat = features[0:8]
- deep_feat = features[8:58+8]
- label = features[-1]
- return wide_feat, deep_feat, [label]
-
def generate_sample(self, line):
"""
Read the data line by line and process it as a dictionary
"""
- def data_iter():
- wide_feat, deep_deat, label = self._process_line(line)
- yield [('wide_input', wide_feat), ('deep_input', deep_deat), ('label', label)]
- return data_iter
\ No newline at end of file
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+ features = line.strip().split(',')
+
+ feature_name = ["user_input", "item_input"]
+ yield zip(feature_name,
+ [[int(features[0])]] + [[int(features[1])]])
+
+ return reader
diff --git a/fleet_rec/core/reader.py b/models/recall/ncf/movielens_reader.py
old mode 100755
new mode 100644
similarity index 52%
rename from fleet_rec/core/reader.py
rename to models/recall/ncf/movielens_reader.py
index 81da9ebf82f5bfa9d409a9a17169f19b13c7716b..add9b6397cef93f3a8f416f19c6847c41537fb5f
--- a/fleet_rec/core/reader.py
+++ b/models/recall/ncf/movielens_reader.py
@@ -13,34 +13,29 @@
# limitations under the License.
from __future__ import print_function
-import abc
-import os
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+from collections import defaultdict
+import numpy as np
-import paddle.fluid.incubate.data_generator as dg
-import yaml
-from fleetrec.core.utils import envs
-
-
-class Reader(dg.MultiSlotDataGenerator):
- __metaclass__ = abc.ABCMeta
-
- def __init__(self, config):
- dg.MultiSlotDataGenerator.__init__(self)
-
- if os.path.isfile(config):
- with open(config, 'r') as rb:
- _config = yaml.load(rb.read(), Loader=yaml.FullLoader)
- else:
- raise ValueError("reader config only support yaml")
-
- envs.set_global_envs(_config)
- envs.update_workspace()
-
- @abc.abstractmethod
+class TrainReader(Reader):
def init(self):
pass
- @abc.abstractmethod
def generate_sample(self, line):
- pass
+ """
+ Read the data line by line and process it as a dictionary
+ """
+
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+ features = line.strip().split(',')
+
+ feature_name = ["user_input", "item_input", "label"]
+ yield zip(feature_name, [[int(features[0])]] +
+ [[int(features[1])]] + [[int(features[2])]])
+
+ return reader
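The reader above expects the comma-separated `user,item,label` lines shown in `small_data.txt`. The parsing step can be exercised standalone; the sketch below mirrors the body of `reader()` outside the `Reader` base class (the helper name is ours, for illustration):

```python
def parse_movielens_line(line):
    # Mirrors reader(): split "user,item,label" into (name, [value])
    # pairs, the per-slot format the Dataset pipeline expects.
    features = line.strip().split(',')
    feature_name = ["user_input", "item_input", "label"]
    return list(zip(feature_name,
                    [[int(features[0])], [int(features[1])], [int(features[2])]]))


sample = parse_movielens_line("4764,174,1\n")
# sample == [("user_input", [4764]), ("item_input", [174]), ("label", [1])]
```

The infer reader differs only in omitting the trailing `label` slot.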
diff --git a/models/recall/readme.md b/models/recall/readme.md
new file mode 100755
index 0000000000000000000000000000000000000000..421df1315dc22396f2ff3bb5aec99508435e2c8d
--- /dev/null
+++ b/models/recall/readme.md
@@ -0,0 +1,80 @@
+# Recall Model Zoo
+
+## Introduction
+We provide PaddleRec implementations of model algorithms commonly used in recall tasks, along with single-machine training & inference quality metrics and distributed training & inference performance metrics. The implemented recall models include [SR-GNN](gnn), [GRU4REC](gru4rec), [Sequence Semantic Retrieval Model](ssr), [Word2Vector](word2vec), [Youtube_DNN](youtube_dnn), and [ncf](ncf).
+
+More models are continuously being added; stay tuned.
+
+## Contents
+* [Overview](#overview)
+    * [Recall model list](#recall-model-list)
+* [Usage](#usage)
+    * [Training & inference](#training--inference)
+* [Benchmark](#benchmark)
+    * [Model results](#model-results)
+
+## Overview
+### Recall model list
+
+| Model | Description | Paper |
+| :------------------: | :--------------------: | :---------: |
+| Word2Vec | word2vector | [Distributed Representations of Words and Phrases and their Compositionality](https://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf)(2013) |
+| GRU4REC | SR-GRU | [Session-based Recommendations with Recurrent Neural Networks](https://arxiv.org/abs/1511.06939)(2015) |
+| Youtube_DNN | Youtube_DNN | [Deep Neural Networks for YouTube Recommendations](https://static.googleusercontent.com/media/research.google.com/zh-CN//pubs/archive/45530.pdf)(2016) |
+| SSR | Sequence Semantic Retrieval Model | [Multi-Rate Deep Learning for Temporal Recommendation](http://sonyis.me/paperpdf/spr209-song_sigir16.pdf)(2016) |
+| NCF | Neural Collaborative Filtering | [Neural Collaborative Filtering](https://arxiv.org/pdf/1708.05031.pdf)(2017) |
+| GNN | SR-GNN | [Session-based Recommendation with Graph Neural Networks](https://arxiv.org/abs/1811.00855)(2018) |
+
+Below is a brief introduction to each model (note: the figures are taken from the papers linked above).
+
+[Word2Vec](https://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf):
+
+
+
+
+[GRU4REC](https://arxiv.org/abs/1511.06939):
+
+
+
+
+[Youtube_DNN](https://static.googleusercontent.com/media/research.google.com/zh-CN//pubs/archive/45530.pdf):
+
+
+
+
+[SSR](http://sonyis.me/paperpdf/spr209-song_sigir16.pdf):
+
+
+
+
+[NCF](https://arxiv.org/pdf/1708.05031.pdf):
+
+
+
+
+[GNN](https://arxiv.org/abs/1811.00855):
+
+
+
+
+## Usage
+### Training & inference
+```shell
+python -m paddlerec.run -m paddlerec.models.recall.word2vec # word2vec
+python -m paddlerec.run -m paddlerec.models.recall.ssr # ssr
+python -m paddlerec.run -m paddlerec.models.recall.gru4rec # gru4rec
+python -m paddlerec.run -m paddlerec.models.recall.gnn # gnn
+python -m paddlerec.run -m paddlerec.models.recall.ncf # ncf
+python -m paddlerec.run -m paddlerec.models.recall.youtube_dnn # youtube_dnn
+```
+## Benchmark
+### Model results
+
+| Dataset | Model | HR@10 | Recall@20 |
+| :------------------: | :--------------------: | :---------: |:---------: |
+| DIGINETICA | GNN | -- | 0.507 |
+| RSC15 | GRU4REC | -- | 0.670 |
+| RSC15 | SSR | -- | 0.590 |
+| MOVIELENS | NCF | 0.688 | -- |
+| -- | Youtube | -- | -- |
+| 1 Billion Word Language Model Benchmark | Word2Vec | -- | 0.54 |
diff --git a/models/recall/ssr/config.yaml b/models/recall/ssr/config.yaml
index 640c93b9dfa9f10cb2de4092bc7d94c45ba45b85..7dcecde84d6119501dea9c84047b705e2a9ba410 100644
--- a/models/recall/ssr/config.yaml
+++ b/models/recall/ssr/config.yaml
@@ -12,34 +12,55 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-train:
- trainer:
- # for cluster training
- strategy: "async"
+workspace: "paddlerec.models.recall.ssr"
- epochs: 3
- workspace: "fleetrec.models.recall.ssr"
+dataset:
+- name: dataset_train
+ batch_size: 5
+ type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/ssr_reader.py"
+- name: dataset_infer
+ batch_size: 5
+ type: QueueDataset
+ data_path: "{workspace}/data/test"
+ data_converter: "{workspace}/ssr_infer_reader.py"
+
+hyper_parameters:
+ vocab_size: 1000
+ emb_dim: 128
+ hidden_size: 100
+ optimizer:
+ class: adagrad
+ learning_rate: 0.01
+ strategy: async
- reader:
- batch_size: 5
- class: "{workspace}/ssr_reader.py"
- train_data_path: "{workspace}/data/train"
+# To run inference, set mode to infer_runner and enable the infer 'phase' below
+mode: train_runner
+#mode: infer_runner
- model:
- models: "{workspace}/model.py"
- hyper_parameters:
- vocab_size: 1000
- emb_dim: 128
- hidden_size: 100
- learning_rate: 0.01
- optimizer: adagrad
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 10
+- name: infer_runner
+ class: single_infer
+ init_model_path: "increment/0"
+ device: cpu
+ epochs: 3
- save:
- increment:
- dirname: "increment"
- epoch_interval: 2
- save_last: True
- inference:
- dirname: "inference"
- epoch_interval: 4
- save_last: True
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
+ #- name: infer
+ # model: "{workspace}/model.py"
+ # dataset_name: dataset_infer
+ # thread_num: 1
diff --git a/models/recall/ssr/model.py b/models/recall/ssr/model.py
index 5fd94cf8763d79e4cc7edd92019aae1e8bf716a2..b97a5927f736e97c763fec177882f40097650011 100644
--- a/models/recall/ssr/model.py
+++ b/models/recall/ssr/model.py
@@ -12,15 +12,126 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import math
import paddle.fluid as fluid
-
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
import paddle.fluid.layers.tensor as tensor
-import paddle.fluid.layers.io as io
import paddle.fluid.layers.control_flow as cf
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
+
+
+class Model(ModelBase):
+ def __init__(self, config):
+ ModelBase.__init__(self, config)
+
+ def _init_hyper_parameters(self):
+ self.vocab_size = envs.get_global_env("hyper_parameters.vocab_size")
+ self.emb_dim = envs.get_global_env("hyper_parameters.emb_dim")
+ self.hidden_size = envs.get_global_env("hyper_parameters.hidden_size")
+
+ def input_data(self, is_infer=False, **kwargs):
+ if is_infer:
+ user_data = fluid.data(
+ name="user", shape=[None, 1], dtype="int64", lod_level=1)
+ all_item_data = fluid.data(
+ name="all_item", shape=[None, self.vocab_size], dtype="int64")
+ pos_label = fluid.data(
+ name="pos_label", shape=[None, 1], dtype="int64")
+ return [user_data, all_item_data, pos_label]
+ else:
+ user_data = fluid.data(
+ name="user", shape=[None, 1], dtype="int64", lod_level=1)
+ pos_item_data = fluid.data(
+ name="p_item", shape=[None, 1], dtype="int64", lod_level=1)
+ neg_item_data = fluid.data(
+ name="n_item", shape=[None, 1], dtype="int64", lod_level=1)
+ return [user_data, pos_item_data, neg_item_data]
+
+ def net(self, inputs, is_infer=False):
+ if is_infer:
+ self._infer_net(inputs)
+ return
+ user_data = inputs[0]
+ pos_item_data = inputs[1]
+ neg_item_data = inputs[2]
+ emb_shape = [self.vocab_size, self.emb_dim]
+ self.user_encoder = GrnnEncoder()
+ self.item_encoder = BowEncoder()
+ self.pairwise_hinge_loss = PairwiseHingeLoss()
+
+ user_emb = fluid.embedding(
+ input=user_data, size=emb_shape, param_attr="emb.item")
+ pos_item_emb = fluid.embedding(
+ input=pos_item_data, size=emb_shape, param_attr="emb.item")
+ neg_item_emb = fluid.embedding(
+ input=neg_item_data, size=emb_shape, param_attr="emb.item")
+ user_enc = self.user_encoder.forward(user_emb)
+ pos_item_enc = self.item_encoder.forward(pos_item_emb)
+ neg_item_enc = self.item_encoder.forward(neg_item_emb)
+ user_hid = fluid.layers.fc(input=user_enc,
+ size=self.hidden_size,
+ param_attr='user.w',
+ bias_attr="user.b")
+ pos_item_hid = fluid.layers.fc(input=pos_item_enc,
+ size=self.hidden_size,
+ param_attr='item.w',
+ bias_attr="item.b")
+ neg_item_hid = fluid.layers.fc(input=neg_item_enc,
+ size=self.hidden_size,
+ param_attr='item.w',
+ bias_attr="item.b")
+ cos_pos = fluid.layers.cos_sim(user_hid, pos_item_hid)
+ cos_neg = fluid.layers.cos_sim(user_hid, neg_item_hid)
+ hinge_loss = self.pairwise_hinge_loss.forward(cos_pos, cos_neg)
+ avg_cost = fluid.layers.mean(hinge_loss)
+ correct = self._get_correct(cos_neg, cos_pos)
+
+ self._cost = avg_cost
+ self._metrics["correct"] = correct
+ self._metrics["hinge_loss"] = hinge_loss
+
+ def _infer_net(self, inputs):
+ user_data = inputs[0]
+ all_item_data = inputs[1]
+ pos_label = inputs[2]
+
+ user_emb = fluid.embedding(
+ input=user_data,
+ size=[self.vocab_size, self.emb_dim],
+ param_attr="emb.item")
+ all_item_emb = fluid.embedding(
+ input=all_item_data,
+ size=[self.vocab_size, self.emb_dim],
+ param_attr="emb.item")
+ all_item_emb_re = fluid.layers.reshape(
+ x=all_item_emb, shape=[-1, self.emb_dim])
+
+ user_encoder = GrnnEncoder()
+ user_enc = user_encoder.forward(user_emb)
+ user_hid = fluid.layers.fc(input=user_enc,
+ size=self.hidden_size,
+ param_attr='user.w',
+ bias_attr="user.b")
+ user_exp = fluid.layers.expand(
+ x=user_hid, expand_times=[1, self.vocab_size])
+ user_re = fluid.layers.reshape(
+ x=user_exp, shape=[-1, self.hidden_size])
+
+ all_item_hid = fluid.layers.fc(input=all_item_emb_re,
+ size=self.hidden_size,
+ param_attr='item.w',
+ bias_attr="item.b")
+ cos_item = fluid.layers.cos_sim(X=all_item_hid, Y=user_re)
+ all_pre_ = fluid.layers.reshape(
+ x=cos_item, shape=[-1, self.vocab_size])
+ acc = fluid.layers.accuracy(input=all_pre_, label=pos_label, k=20)
+
+ self._infer_results['recall20'] = acc
+
+ def _get_correct(self, x, y):
+ less = tensor.cast(cf.less_than(x, y), dtype='float32')
+ correct = fluid.layers.reduce_sum(less)
+ return correct
class BowEncoder(object):
@@ -54,6 +165,7 @@ class GrnnEncoder(object):
bias_attr=self.param_name + ".bias")
return fluid.layers.sequence_pool(input=gru_h, pool_type='max')
+
class PairwiseHingeLoss(object):
def __init__(self, margin=0.8):
self.margin = margin
@@ -69,70 +181,3 @@ class PairwiseHingeLoss(object):
input=loss_part2, shape=[-1, 1], value=0.0, dtype='float32'),
loss_part2)
return loss_part3
-
-class Model(ModelBase):
- def __init__(self, config):
- ModelBase.__init__(self, config)
-
- def get_correct(self, x, y):
- less = tensor.cast(cf.less_than(x, y), dtype='float32')
- correct = fluid.layers.reduce_sum(less)
- return correct
-
- def train(self):
-
- vocab_size = envs.get_global_env("hyper_parameters.vocab_size", None, self._namespace)
- emb_dim = envs.get_global_env("hyper_parameters.emb_dim", None, self._namespace)
- hidden_size = envs.get_global_env("hyper_parameters.hidden_size", None, self._namespace)
- emb_shape = [vocab_size, emb_dim]
-
- self.user_encoder = GrnnEncoder()
- self.item_encoder = BowEncoder()
- self.pairwise_hinge_loss = PairwiseHingeLoss()
-
- user_data = fluid.data(
- name="user", shape=[None, 1], dtype="int64", lod_level=1)
- pos_item_data = fluid.data(
- name="p_item", shape=[None, 1], dtype="int64", lod_level=1)
- neg_item_data = fluid.data(
- name="n_item", shape=[None, 1], dtype="int64", lod_level=1)
- self._data_var.extend([user_data, pos_item_data, neg_item_data])
-
- user_emb = fluid.embedding(
- input=user_data, size=emb_shape, param_attr="emb.item")
- pos_item_emb = fluid.embedding(
- input=pos_item_data, size=emb_shape, param_attr="emb.item")
- neg_item_emb = fluid.embedding(
- input=neg_item_data, size=emb_shape, param_attr="emb.item")
- user_enc = self.user_encoder.forward(user_emb)
- pos_item_enc = self.item_encoder.forward(pos_item_emb)
- neg_item_enc = self.item_encoder.forward(neg_item_emb)
- user_hid = fluid.layers.fc(input=user_enc,
- size=hidden_size,
- param_attr='user.w',
- bias_attr="user.b")
- pos_item_hid = fluid.layers.fc(input=pos_item_enc,
- size=hidden_size,
- param_attr='item.w',
- bias_attr="item.b")
- neg_item_hid = fluid.layers.fc(input=neg_item_enc,
- size=hidden_size,
- param_attr='item.w',
- bias_attr="item.b")
- cos_pos = fluid.layers.cos_sim(user_hid, pos_item_hid)
- cos_neg = fluid.layers.cos_sim(user_hid, neg_item_hid)
- hinge_loss = self.pairwise_hinge_loss.forward(cos_pos, cos_neg)
- avg_cost = fluid.layers.mean(hinge_loss)
- correct = self.get_correct(cos_neg, cos_pos)
-
- self._cost = avg_cost
- self._metrics["correct"] = correct
- self._metrics["hinge_loss"] = hinge_loss
-
-
- def train_net(self):
- self.train()
-
-
- def infer_net(self):
- pass
diff --git a/models/recall/ssr/ssr_infer_reader.py b/models/recall/ssr/ssr_infer_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..1f94b1d21fbd428282d3e9faecd09a590588fbc9
--- /dev/null
+++ b/models/recall/ssr/ssr_infer_reader.py
@@ -0,0 +1,48 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import print_function
+
+import numpy as np
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+
+
+class EvaluateReader(Reader):
+ def init(self):
+ self.vocab_size = envs.get_global_env("vocab_size", 10,
+ "train.model.hyper_parameters")
+
+ def generate_sample(self, line):
+ """
+ Read the data line by line and process it as a dictionary
+ """
+
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+ ids = line.strip().split()
+ conv_ids = [int(i) for i in ids]
+ boundary = len(ids) - 1
+ src = conv_ids[:boundary]
+ pos_tgt = [conv_ids[boundary]]
+ feature_name = ["user", "all_item", "p_item"]
+ yield zip(
+ feature_name,
+ [src] + [np.arange(self.vocab_size).astype("int64").tolist()] +
+ [pos_tgt])
+
+ return reader
diff --git a/models/recall/ssr/ssr_reader.py b/models/recall/ssr/ssr_reader.py
index ba81e01f7b84a462938383a141eeb2bfe3f816d1..d2d35458d867bd560e7e0b751f61de83d0f822b6 100644
--- a/models/recall/ssr/ssr_reader.py
+++ b/models/recall/ssr/ssr_reader.py
@@ -11,12 +11,13 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+
from __future__ import print_function
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
import random
+from paddlerec.core.reader import Reader
+
class TrainReader(Reader):
def init(self):
@@ -25,7 +26,6 @@ class TrainReader(Reader):
def sample_neg_from_seq(self, seq):
return seq[random.randint(0, len(seq) - 1)]
-
def generate_sample(self, line):
"""
Read the data line by line and process it as a dictionary
diff --git a/models/recall/tdm/__init__.py b/models/recall/tdm/__init__.py
deleted file mode 100755
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..0000000000000000000000000000000000000000
diff --git a/models/recall/tdm/tree/layer_list.txt b/models/recall/tdm/tree/layer_list.txt
deleted file mode 100755
index d8606bc601202390bd9aa54197fac8f34e3c5b59..0000000000000000000000000000000000000000
--- a/models/recall/tdm/tree/layer_list.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-1,2
-3,4,5,6
-7,8,9,10,11,12,13
-14,15,16,17,18,19,20,21,22,23,24,25
\ No newline at end of file
diff --git a/models/recall/word2vec/config.yaml b/models/recall/word2vec/config.yaml
index a3591e73d447a1a51286fba295bc2eea26a097d0..9bb5c4d3fe42bc2385ff22ad150924ad5a3a59cd 100755
--- a/models/recall/word2vec/config.yaml
+++ b/models/recall/word2vec/config.yaml
@@ -12,7 +12,11 @@
# See the License for the specific language governing permissions and
# limitations under the License.
evaluate:
- workspace: "fleetrec.models.recall.word2vec"
+ workspace: "paddlerec.models.recall.word2vec"
+
+ evaluate_only: False
+ evaluate_model_path: ""
+
reader:
batch_size: 50
class: "{workspace}/w2v_evaluate_reader.py"
@@ -25,7 +29,7 @@ train:
strategy: "async"
epochs: 2
- workspace: "fleetrec.models.recall.word2vec"
+ workspace: "paddlerec.models.recall.word2vec"
reader:
batch_size: 100
diff --git a/models/recall/word2vec/model.py b/models/recall/word2vec/model.py
index 7899710a19740926fc43691276600611b515d1fb..fefc89043c2f926f37318e1094b9cdf98dd6235a 100755
--- a/models/recall/word2vec/model.py
+++ b/models/recall/word2vec/model.py
@@ -12,12 +12,11 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import math
import numpy as np
import paddle.fluid as fluid
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
@@ -25,46 +24,57 @@ class Model(ModelBase):
ModelBase.__init__(self, config)
def input(self):
- neg_num = int(envs.get_global_env(
- "hyper_parameters.neg_num", None, self._namespace))
- self.input_word = fluid.data(name="input_word", shape=[
- None, 1], dtype='int64')
- self.true_word = fluid.data(name='true_label', shape=[
- None, 1], dtype='int64')
+ neg_num = int(
+ envs.get_global_env("hyper_parameters.neg_num", None,
+ self._namespace))
+ self.input_word = fluid.data(
+ name="input_word", shape=[None, 1], dtype='int64')
+ self.true_word = fluid.data(
+ name='true_label', shape=[None, 1], dtype='int64')
self._data_var.append(self.input_word)
self._data_var.append(self.true_word)
- with_shuffle_batch = bool(int(envs.get_global_env(
- "hyper_parameters.with_shuffle_batch", None, self._namespace)))
+ with_shuffle_batch = bool(
+ int(
+ envs.get_global_env("hyper_parameters.with_shuffle_batch",
+ None, self._namespace)))
if not with_shuffle_batch:
- self.neg_word = fluid.data(name="neg_label", shape=[
- None, neg_num], dtype='int64')
+ self.neg_word = fluid.data(
+ name="neg_label", shape=[None, neg_num], dtype='int64')
self._data_var.append(self.neg_word)
if self._platform != "LINUX":
self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=64, use_double_buffer=False, iterable=False)
+ feed_list=self._data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
def net(self):
is_distributed = True if envs.get_trainer() == "CtrTrainer" else False
- neg_num = int(envs.get_global_env(
- "hyper_parameters.neg_num", None, self._namespace))
+ neg_num = int(
+ envs.get_global_env("hyper_parameters.neg_num", None,
+ self._namespace))
sparse_feature_number = envs.get_global_env(
"hyper_parameters.sparse_feature_number", None, self._namespace)
sparse_feature_dim = envs.get_global_env(
"hyper_parameters.sparse_feature_dim", None, self._namespace)
- with_shuffle_batch = bool(int(envs.get_global_env(
- "hyper_parameters.with_shuffle_batch", None, self._namespace)))
+ with_shuffle_batch = bool(
+ int(
+ envs.get_global_env("hyper_parameters.with_shuffle_batch",
+ None, self._namespace)))
- def embedding_layer(input, table_name, emb_dim, initializer_instance=None, squeeze=False):
+ def embedding_layer(input,
+ table_name,
+ emb_dim,
+ initializer_instance=None,
+ squeeze=False):
emb = fluid.embedding(
input=input,
is_sparse=True,
is_distributed=is_distributed,
size=[sparse_feature_number, emb_dim],
param_attr=fluid.ParamAttr(
- name=table_name,
- initializer=initializer_instance),
- )
+ name=table_name, initializer=initializer_instance), )
if squeeze:
return fluid.layers.squeeze(input=emb, axes=[1])
else:
@@ -74,35 +84,38 @@ class Model(ModelBase):
emb_initializer = fluid.initializer.Uniform(-init_width, init_width)
emb_w_initializer = fluid.initializer.Constant(value=0.0)
- input_emb = embedding_layer(
- self.input_word, "emb", sparse_feature_dim, emb_initializer, True)
- true_emb_w = embedding_layer(
- self.true_word, "emb_w", sparse_feature_dim, emb_w_initializer, True)
- true_emb_b = embedding_layer(
- self.true_word, "emb_b", 1, emb_w_initializer, True)
+ input_emb = embedding_layer(self.input_word, "emb", sparse_feature_dim,
+ emb_initializer, True)
+ true_emb_w = embedding_layer(self.true_word, "emb_w",
+ sparse_feature_dim, emb_w_initializer,
+ True)
+ true_emb_b = embedding_layer(self.true_word, "emb_b", 1,
+ emb_w_initializer, True)
if with_shuffle_batch:
neg_emb_w_list = []
for i in range(neg_num):
- neg_emb_w_list.append(fluid.contrib.layers.shuffle_batch(
- true_emb_w)) # shuffle true_word
+ neg_emb_w_list.append(
+ fluid.contrib.layers.shuffle_batch(
+ true_emb_w)) # shuffle true_word
neg_emb_w_concat = fluid.layers.concat(neg_emb_w_list, axis=0)
neg_emb_w = fluid.layers.reshape(
neg_emb_w_concat, shape=[-1, neg_num, sparse_feature_dim])
neg_emb_b_list = []
for i in range(neg_num):
- neg_emb_b_list.append(fluid.contrib.layers.shuffle_batch(
- true_emb_b)) # shuffle true_word
+ neg_emb_b_list.append(
+ fluid.contrib.layers.shuffle_batch(
+ true_emb_b)) # shuffle true_word
neg_emb_b = fluid.layers.concat(neg_emb_b_list, axis=0)
neg_emb_b_vec = fluid.layers.reshape(
neg_emb_b, shape=[-1, neg_num])
else:
- neg_emb_w = embedding_layer(
- self.neg_word, "emb_w", sparse_feature_dim, emb_w_initializer)
- neg_emb_b = embedding_layer(
- self.neg_word, "emb_b", 1, emb_w_initializer)
+ neg_emb_w = embedding_layer(self.neg_word, "emb_w",
+ sparse_feature_dim, emb_w_initializer)
+ neg_emb_b = embedding_layer(self.neg_word, "emb_b", 1,
+ emb_w_initializer)
neg_emb_b_vec = fluid.layers.reshape(
neg_emb_b, shape=[-1, neg_num])
@@ -118,7 +131,8 @@ class Model(ModelBase):
neg_matmul = fluid.layers.matmul(
input_emb_re, neg_emb_w, transpose_y=True)
neg_logits = fluid.layers.elementwise_add(
- fluid.layers.reshape(neg_matmul, shape=[-1, neg_num]),
+ fluid.layers.reshape(
+ neg_matmul, shape=[-1, neg_num]),
neg_emb_b_vec)
label_ones = fluid.layers.fill_constant_batch_size_like(
@@ -137,9 +151,17 @@ class Model(ModelBase):
neg_xent, dim=1))
self.avg_cost = fluid.layers.reduce_mean(cost)
global_right_cnt = fluid.layers.create_global_var(
- name="global_right_cnt", persistable=True, dtype='float32', shape=[1], value=0)
+ name="global_right_cnt",
+ persistable=True,
+ dtype='float32',
+ shape=[1],
+ value=0)
global_total_cnt = fluid.layers.create_global_var(
- name="global_total_cnt", persistable=True, dtype='float32', shape=[1], value=0)
+ name="global_total_cnt",
+ persistable=True,
+ dtype='float32',
+ shape=[1],
+ value=0)
global_right_cnt.stop_gradient = True
global_total_cnt.stop_gradient = True
@@ -156,12 +178,12 @@ class Model(ModelBase):
self.metrics()
def optimizer(self):
- learning_rate = envs.get_global_env(
- "hyper_parameters.learning_rate", None, self._namespace)
- decay_steps = envs.get_global_env(
- "hyper_parameters.decay_steps", None, self._namespace)
- decay_rate = envs.get_global_env(
- "hyper_parameters.decay_rate", None, self._namespace)
+ learning_rate = envs.get_global_env("hyper_parameters.learning_rate",
+ None, self._namespace)
+ decay_steps = envs.get_global_env("hyper_parameters.decay_steps", None,
+ self._namespace)
+ decay_rate = envs.get_global_env("hyper_parameters.decay_rate", None,
+ self._namespace)
optimizer = fluid.optimizer.SGD(
learning_rate=fluid.layers.exponential_decay(
learning_rate=learning_rate,
@@ -181,11 +203,15 @@ class Model(ModelBase):
name="analogy_c", shape=[None], dtype='int64')
self.analogy_d = fluid.data(
name="analogy_d", shape=[None], dtype='int64')
- self._infer_data_var = [self.analogy_a,
- self.analogy_b, self.analogy_c, self.analogy_d]
+ self._infer_data_var = [
+ self.analogy_a, self.analogy_b, self.analogy_c, self.analogy_d
+ ]
self._infer_data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._infer_data_var, capacity=64, use_double_buffer=False, iterable=False)
+ feed_list=self._infer_data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
def infer_net(self):
sparse_feature_dim = envs.get_global_env(
@@ -217,18 +243,28 @@ class Model(ModelBase):
dist = fluid.layers.matmul(
x=target, y=emb_all_label_l2, transpose_y=True)
values, pred_idx = fluid.layers.topk(input=dist, k=4)
- label = fluid.layers.expand(fluid.layers.unsqueeze(
- self.analogy_d, axes=[1]), expand_times=[1, 4])
+ label = fluid.layers.expand(
+ fluid.layers.unsqueeze(
+ self.analogy_d, axes=[1]),
+ expand_times=[1, 4])
label_ones = fluid.layers.fill_constant_batch_size_like(
label, shape=[-1, 1], value=1.0, dtype='float32')
- right_cnt = fluid.layers.reduce_sum(
- input=fluid.layers.cast(fluid.layers.equal(pred_idx, label), dtype='float32'))
+ right_cnt = fluid.layers.reduce_sum(input=fluid.layers.cast(
+ fluid.layers.equal(pred_idx, label), dtype='float32'))
total_cnt = fluid.layers.reduce_sum(label_ones)
global_right_cnt = fluid.layers.create_global_var(
- name="global_right_cnt", persistable=True, dtype='float32', shape=[1], value=0)
+ name="global_right_cnt",
+ persistable=True,
+ dtype='float32',
+ shape=[1],
+ value=0)
global_total_cnt = fluid.layers.create_global_var(
- name="global_total_cnt", persistable=True, dtype='float32', shape=[1], value=0)
+ name="global_total_cnt",
+ persistable=True,
+ dtype='float32',
+ shape=[1],
+ value=0)
global_right_cnt.stop_gradient = True
global_total_cnt.stop_gradient = True
diff --git a/models/recall/word2vec/prepare_data.sh b/models/recall/word2vec/prepare_data.sh
index 743ae99871ba1cc2d7309cda11117446613eb0e5..cfd067350ce1d33112806ab72ca78222381a86f4 100755
--- a/models/recall/word2vec/prepare_data.sh
+++ b/models/recall/word2vec/prepare_data.sh
@@ -1,5 +1,20 @@
#! /bin/bash
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
# download train_data
mkdir raw_data
wget --no-check-certificate https://paddlerec.bj.bcebos.com/word2vec/1-billion-word-language-modeling-benchmark-r13output.tar
@@ -20,6 +35,3 @@ wget --no-check-certificate https://paddlerec.bj.bcebos.com/word2vec/test_dir.ta
tar xzvf test_dir.tar -C raw_data
mv raw_data/data/test_dir test_data/
rm -rf raw_data
-
-
-
diff --git a/models/recall/word2vec/preprocess.py b/models/recall/word2vec/preprocess.py
index 31088efb90498e44ffd5fe4fe24942e1d52c2e6b..6c9ee16cd2d136006dc10e7ce0c970974e8bf2b5 100755
--- a/models/recall/word2vec/preprocess.py
+++ b/models/recall/word2vec/preprocess.py
@@ -1,11 +1,27 @@
# -*- coding: utf-8 -*
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import io
+import math
import os
import random
import re
import six
+
import argparse
-import io
-import math
+
prog = re.compile("[^a-z ]", flags=0)
@@ -33,8 +49,7 @@ def parse_args():
'--file_nums',
type=int,
default=1024,
- help="re-split input corpus file nums"
- )
+ help="re-split input corpus file nums")
parser.add_argument(
'--downsample',
type=float,
@@ -59,7 +74,7 @@ def parse_args():
def text_strip(text):
- #English Preprocess Rule
+ # English Preprocess Rule
return prog.sub("", text.lower())
@@ -101,7 +116,7 @@ def filter_corpus(args):
word_all_count = 0
id_counts = []
word_id = 0
- #read dict
+ # read dict
with io.open(args.dict_path, 'r', encoding='utf-8') as f:
for line in f:
word, count = line.split()[0], int(line.split()[1])
@@ -111,19 +126,21 @@ def filter_corpus(args):
id_counts.append(count)
word_all_count += count
- #write word2id file
+ # write word2id file
print("write word2id file to : " + args.dict_path + "_word_to_id_")
with io.open(
args.dict_path + "_word_to_id_", 'w+', encoding='utf-8') as fid:
for k, v in word_to_id_.items():
fid.write(k + " " + str(v) + '\n')
- #filter corpus and convert id
+ # filter corpus and convert id
if not os.path.exists(args.output_corpus_dir):
os.makedirs(args.output_corpus_dir)
for file in os.listdir(args.input_corpus_dir):
- with io.open(args.output_corpus_dir + '/convert_' + file + '.csv', "w") as wf:
+ with io.open(args.output_corpus_dir + '/convert_' + file + '.csv',
+ "w") as wf:
with io.open(
- args.input_corpus_dir + '/' + file, encoding='utf-8') as rf:
+ args.input_corpus_dir + '/' + file,
+ encoding='utf-8') as rf:
print(args.input_corpus_dir + '/' + file)
for line in rf:
signal = False
@@ -166,7 +183,8 @@ def build_dict(args):
for file in os.listdir(args.build_dict_corpus_dir):
with io.open(
- args.build_dict_corpus_dir + "/" + file, encoding='utf-8') as f:
+ args.build_dict_corpus_dir + "/" + file,
+ encoding='utf-8') as f:
print("build dict : ", args.build_dict_corpus_dir + "/" + file)
for line in f:
line = text_strip(line)
@@ -186,7 +204,7 @@ def build_dict(args):
for item in item_to_remove:
unk_sum += word_count[item]
del word_count[item]
- #sort by count
+ # sort by count
word_count[native_to_unicode('')] = unk_sum
word_count = sorted(
word_count.items(), key=lambda word_count: -word_count[1])
@@ -208,17 +226,19 @@ def data_split(args):
for file_ in files:
with open(os.path.join(raw_data_dir, file_), 'r') as f:
contents.extend(f.readlines())
-
+
num = int(args.file_nums)
lines_per_file = len(contents) / num
print("contents: ", str(len(contents)))
print("lines_per_file: ", str(lines_per_file))
-
- for i in range(1, num+1):
+
+ for i in range(1, num + 1):
with open(os.path.join(new_data_dir, "part_" + str(i)), 'w') as fout:
- data = contents[(i-1)*lines_per_file:min(i*lines_per_file,len(contents))]
+ data = contents[(i - 1) * lines_per_file:min(i * lines_per_file,
+ len(contents))]
for line in data:
- fout.write(line)
+ fout.write(line)
+
if __name__ == "__main__":
args = parse_args()
diff --git a/models/recall/word2vec/w2v_evaluate_reader.py b/models/recall/word2vec/w2v_evaluate_reader.py
index df6de9315360c1db0e6a2e4b3534a950390b6659..6350c960e61d8ef3580cc4cc605ba24cb5623b0b 100755
--- a/models/recall/word2vec/w2v_evaluate_reader.py
+++ b/models/recall/word2vec/w2v_evaluate_reader.py
@@ -1,4 +1,4 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -11,16 +11,19 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
+
import io
+
import six
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
class EvaluateReader(Reader):
def init(self):
- dict_path = envs.get_global_env("word_id_dict_path", None, "evaluate.reader")
+ dict_path = envs.get_global_env("word_id_dict_path", None,
+ "evaluate.reader")
self.word_to_id = dict()
self.id_to_word = dict()
with io.open(dict_path, 'r', encoding='utf-8') as f:
@@ -46,19 +49,16 @@ class EvaluateReader(Reader):
if isinstance(s, str):
return True
return False
-
-
+
def _to_unicode(self, s, ignore_errors=False):
if self._is_unicode(s):
return s
error_mode = "ignore" if ignore_errors else "strict"
return s.decode("utf-8", errors=error_mode)
-
-
+
def strip_lines(self, line, vocab):
return self._replace_oov(vocab, self.native_to_unicode(line))
-
-
+
def _replace_oov(self, original_vocab, line):
"""Replace out-of-vocab words with "".
This maintains compatibility with published results.
@@ -69,12 +69,17 @@ class EvaluateReader(Reader):
a unicode string - a space-delimited sequence of words.
"""
return u" ".join([
- word if word in original_vocab else u"" for word in line.split()
+ word if word in original_vocab else u""
+ for word in line.split()
])
def generate_sample(self, line):
def reader():
features = self.strip_lines(line.lower(), self.word_to_id)
features = features.split()
- yield [('analogy_a', [self.word_to_id[features[0]]]), ('analogy_b', [self.word_to_id[features[1]]]), ('analogy_c', [self.word_to_id[features[2]]]), ('analogy_d', [self.word_to_id[features[3]]])]
+ yield [('analogy_a', [self.word_to_id[features[0]]]),
+ ('analogy_b', [self.word_to_id[features[1]]]),
+ ('analogy_c', [self.word_to_id[features[2]]]),
+ ('analogy_d', [self.word_to_id[features[3]]])]
+
return reader
diff --git a/models/recall/word2vec/w2v_reader.py b/models/recall/word2vec/w2v_reader.py
index 4857a6b06e78f05503349ebaeee17f6d44d423a4..9b3e69127055118bbc16b30eaac63f9a282bd1eb 100755
--- a/models/recall/word2vec/w2v_reader.py
+++ b/models/recall/word2vec/w2v_reader.py
@@ -1,4 +1,4 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -11,10 +11,13 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-import numpy as np
+
import io
-from fleetrec.core.reader import Reader
-from fleetrec.core.utils import envs
+
+import numpy as np
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
class NumpyRandomInt(object):
@@ -37,10 +40,14 @@ class NumpyRandomInt(object):
class TrainReader(Reader):
def init(self):
- dict_path = envs.get_global_env("word_count_dict_path", None, "train.reader")
- self.window_size = envs.get_global_env("hyper_parameters.window_size", None, "train.model")
- self.neg_num = envs.get_global_env("hyper_parameters.neg_num", None, "train.model")
- self.with_shuffle_batch = envs.get_global_env("hyper_parameters.with_shuffle_batch", None, "train.model")
+ dict_path = envs.get_global_env("word_count_dict_path", None,
+ "train.reader")
+ self.window_size = envs.get_global_env("hyper_parameters.window_size",
+ None, "train.model")
+ self.neg_num = envs.get_global_env("hyper_parameters.neg_num", None,
+ "train.model")
+ self.with_shuffle_batch = envs.get_global_env(
+ "hyper_parameters.with_shuffle_batch", None, "train.model")
self.random_generator = NumpyRandomInt(1, self.window_size + 1)
self.cs = None
@@ -72,19 +79,21 @@ class TrainReader(Reader):
start_point = 0
end_point = idx + target_window
targets = words[start_point:idx] + words[idx + 1:end_point + 1]
- return targets
+ return targets
def generate_sample(self, line):
def reader():
word_ids = [w for w in line.split()]
for idx, target_id in enumerate(word_ids):
- context_word_ids = self.get_context_words(
- word_ids, idx)
+ context_word_ids = self.get_context_words(word_ids, idx)
for context_id in context_word_ids:
- output = [('input_word', [int(target_id)]), ('true_label', [int(context_id)])]
+ output = [('input_word', [int(target_id)]),
+ ('true_label', [int(context_id)])]
if not self.with_shuffle_batch:
- neg_array = self.cs.searchsorted(np.random.sample(self.neg_num))
- output += [('neg_label', [int(str(i)) for i in neg_array ])]
+ neg_array = self.cs.searchsorted(
+ np.random.sample(self.neg_num))
+ output += [('neg_label',
+ [int(str(i)) for i in neg_array])]
yield output
- return reader
+ return reader
diff --git a/models/recall/youtube_dnn/__init__.py b/models/recall/youtube_dnn/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/recall/youtube_dnn/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/recall/youtube_dnn/config.yaml b/models/recall/youtube_dnn/config.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..5bbc41a9e850044101fa844fca256db358dc1754
--- /dev/null
+++ b/models/recall/youtube_dnn/config.yaml
@@ -0,0 +1,54 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+workspace: "paddlerec.models.recall.youtube_dnn"
+
+dataset:
+- name: dataset_train
+ batch_size: 5
+ type: DataLoader
+ #type: QueueDataset
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/random_reader.py"
+
+hyper_parameters:
+ watch_vec_size: 64
+ search_vec_size: 64
+ other_feat_size: 64
+ output_size: 100
+ layers: [128, 64, 32]
+ optimizer:
+ class: adam
+ learning_rate: 0.001
+ strategy: async
+
+mode: train_runner
+
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+ print_interval: 10
+
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
diff --git a/models/recall/youtube_dnn/data/test/small_data.txt b/models/recall/youtube_dnn/data/test/small_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c3c4cf5f84f66594e76603cce1f18d211ebd05a7
--- /dev/null
+++ b/models/recall/youtube_dnn/data/test/small_data.txt
@@ -0,0 +1,100 @@
+4764,174,1
+4764,2958,0
+4764,452,0
+4764,1946,0
+4764,3208,0
+2044,2237,1
+2044,1998,0
+2044,328,0
+2044,1542,0
+2044,1932,0
+4276,65,1
+4276,3247,0
+4276,942,0
+4276,3666,0
+4276,2222,0
+3933,682,1
+3933,2451,0
+3933,3695,0
+3933,1643,0
+3933,3568,0
+1151,1265,1
+1151,118,0
+1151,2532,0
+1151,2083,0
+1151,2350,0
+1757,876,1
+1757,201,0
+1757,3633,0
+1757,1068,0
+1757,2549,0
+3370,276,1
+3370,2435,0
+3370,606,0
+3370,910,0
+3370,2146,0
+5137,1018,1
+5137,2163,0
+5137,3167,0
+5137,2315,0
+5137,3595,0
+3933,2831,1
+3933,2881,0
+3933,2949,0
+3933,3660,0
+3933,417,0
+3102,999,1
+3102,1902,0
+3102,2161,0
+3102,3042,0
+3102,1113,0
+2022,336,1
+2022,1672,0
+2022,2656,0
+2022,3649,0
+2022,883,0
+2664,655,1
+2664,3660,0
+2664,1711,0
+2664,3386,0
+2664,1668,0
+25,701,1
+25,32,0
+25,2482,0
+25,3177,0
+25,2767,0
+1738,1643,1
+1738,2187,0
+1738,228,0
+1738,650,0
+1738,3101,0
+5411,1241,1
+5411,2546,0
+5411,3019,0
+5411,3618,0
+5411,1674,0
+638,579,1
+638,3512,0
+638,783,0
+638,2111,0
+638,1880,0
+3554,200,1
+3554,2893,0
+3554,2428,0
+3554,969,0
+3554,2741,0
+4283,1074,1
+4283,3056,0
+4283,2032,0
+4283,405,0
+4283,1505,0
+5111,200,1
+5111,3488,0
+5111,477,0
+5111,2790,0
+5111,40,0
+3964,515,1
+3964,1528,0
+3964,2173,0
+3964,1701,0
+3964,2832,0
diff --git a/models/recall/youtube_dnn/data/train/samll_data.txt b/models/recall/youtube_dnn/data/train/samll_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c3c4cf5f84f66594e76603cce1f18d211ebd05a7
--- /dev/null
+++ b/models/recall/youtube_dnn/data/train/samll_data.txt
@@ -0,0 +1,100 @@
+4764,174,1
+4764,2958,0
+4764,452,0
+4764,1946,0
+4764,3208,0
+2044,2237,1
+2044,1998,0
+2044,328,0
+2044,1542,0
+2044,1932,0
+4276,65,1
+4276,3247,0
+4276,942,0
+4276,3666,0
+4276,2222,0
+3933,682,1
+3933,2451,0
+3933,3695,0
+3933,1643,0
+3933,3568,0
+1151,1265,1
+1151,118,0
+1151,2532,0
+1151,2083,0
+1151,2350,0
+1757,876,1
+1757,201,0
+1757,3633,0
+1757,1068,0
+1757,2549,0
+3370,276,1
+3370,2435,0
+3370,606,0
+3370,910,0
+3370,2146,0
+5137,1018,1
+5137,2163,0
+5137,3167,0
+5137,2315,0
+5137,3595,0
+3933,2831,1
+3933,2881,0
+3933,2949,0
+3933,3660,0
+3933,417,0
+3102,999,1
+3102,1902,0
+3102,2161,0
+3102,3042,0
+3102,1113,0
+2022,336,1
+2022,1672,0
+2022,2656,0
+2022,3649,0
+2022,883,0
+2664,655,1
+2664,3660,0
+2664,1711,0
+2664,3386,0
+2664,1668,0
+25,701,1
+25,32,0
+25,2482,0
+25,3177,0
+25,2767,0
+1738,1643,1
+1738,2187,0
+1738,228,0
+1738,650,0
+1738,3101,0
+5411,1241,1
+5411,2546,0
+5411,3019,0
+5411,3618,0
+5411,1674,0
+638,579,1
+638,3512,0
+638,783,0
+638,2111,0
+638,1880,0
+3554,200,1
+3554,2893,0
+3554,2428,0
+3554,969,0
+3554,2741,0
+4283,1074,1
+4283,3056,0
+4283,2032,0
+4283,405,0
+4283,1505,0
+5111,200,1
+5111,3488,0
+5111,477,0
+5111,2790,0
+5111,40,0
+3964,515,1
+3964,1528,0
+3964,2173,0
+3964,1701,0
+3964,2832,0
diff --git a/models/recall/youtube_dnn/model.py b/models/recall/youtube_dnn/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..a1203447c6a66404f270a8f65215eea5cd9e82c7
--- /dev/null
+++ b/models/recall/youtube_dnn/model.py
@@ -0,0 +1,94 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import math
+import numpy as np
+import paddle.fluid as fluid
+
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
+
+
+class Model(ModelBase):
+ def __init__(self, config):
+ ModelBase.__init__(self, config)
+
+ def _init_hyper_parameters(self):
+ self.watch_vec_size = envs.get_global_env(
+ "hyper_parameters.watch_vec_size")
+ self.search_vec_size = envs.get_global_env(
+ "hyper_parameters.search_vec_size")
+ self.other_feat_size = envs.get_global_env(
+ "hyper_parameters.other_feat_size")
+ self.output_size = envs.get_global_env("hyper_parameters.output_size")
+ self.layers = envs.get_global_env("hyper_parameters.layers")
+
+ def input_data(self, is_infer=False, **kwargs):
+
+ watch_vec = fluid.data(
+ name="watch_vec",
+ shape=[None, self.watch_vec_size],
+ dtype="float32")
+ search_vec = fluid.data(
+ name="search_vec",
+ shape=[None, self.search_vec_size],
+ dtype="float32")
+ other_feat = fluid.data(
+ name="other_feat",
+ shape=[None, self.other_feat_size],
+ dtype="float32")
+ label = fluid.data(name="label", shape=[None, 1], dtype="int64")
+ inputs = [watch_vec] + [search_vec] + [other_feat] + [label]
+
+ return inputs
+
+ def net(self, inputs, is_infer=False):
+ concat_feats = fluid.layers.concat(input=inputs[:-1], axis=-1)
+
+ l1 = self._fc('l1', concat_feats, self.layers[0], 'relu')
+ l2 = self._fc('l2', l1, self.layers[1], 'relu')
+ l3 = self._fc('l3', l2, self.layers[2], 'relu')
+ l4 = self._fc('l4', l3, self.output_size, 'softmax')
+
+ num_seqs = fluid.layers.create_tensor(dtype='int64')
+ acc = fluid.layers.accuracy(input=l4, label=inputs[-1], total=num_seqs)
+
+ cost = fluid.layers.cross_entropy(input=l4, label=inputs[-1])
+ avg_cost = fluid.layers.mean(cost)
+
+ self._cost = avg_cost
+ self._metrics["acc"] = acc
+
+ def _fc(self, tag, data, out_dim, active='relu'):
+ init_stddev = 1.0
+ scales = 1.0 / np.sqrt(data.shape[1])
+
+ if tag == 'l4':
+ p_attr = fluid.param_attr.ParamAttr(
+ name='%s_weight' % tag,
+ initializer=fluid.initializer.NormalInitializer(
+ loc=0.0, scale=init_stddev * scales))
+ else:
+ p_attr = None
+
+ b_attr = fluid.ParamAttr(
+ name='%s_bias' % tag, initializer=fluid.initializer.Constant(0.1))
+
+ out = fluid.layers.fc(input=data,
+ size=out_dim,
+ act=active,
+ param_attr=p_attr,
+ bias_attr=b_attr,
+ name=tag)
+ return out
diff --git a/models/recall/youtube_dnn/random_reader.py b/models/recall/youtube_dnn/random_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..cdb0add6dbb358dba52ba9c933c060fec3ddf516
--- /dev/null
+++ b/models/recall/youtube_dnn/random_reader.py
@@ -0,0 +1,50 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+from __future__ import print_function
+
+import numpy as np
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+from collections import defaultdict
+
+
+class TrainReader(Reader):
+ def init(self):
+ self.watch_vec_size = envs.get_global_env(
+ "hyper_parameters.watch_vec_size")
+ self.search_vec_size = envs.get_global_env(
+ "hyper_parameters.search_vec_size")
+ self.other_feat_size = envs.get_global_env(
+ "hyper_parameters.other_feat_size")
+ self.output_size = envs.get_global_env("hyper_parameters.output_size")
+
+ def generate_sample(self, line):
+ """
+ the file is not used
+ """
+
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+
+ feature_name = ["watch_vec", "search_vec", "other_feat", "label"]
+ yield zip(feature_name,
+ [np.random.rand(self.watch_vec_size).tolist()] +
+ [np.random.rand(self.search_vec_size).tolist()] +
+ [np.random.rand(self.other_feat_size).tolist()] +
+ [[np.random.randint(self.output_size)]])
+
+ return reader
diff --git a/models/rerank/__init__.py b/models/rerank/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/rerank/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/rerank/listwise/__init__.py b/models/rerank/listwise/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/rerank/listwise/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/rerank/listwise/config.yaml b/models/rerank/listwise/config.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..2ddfa32fe08aa8bece00727aefc46bb893b4d090
--- /dev/null
+++ b/models/rerank/listwise/config.yaml
@@ -0,0 +1,67 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+workspace: "paddlerec.models.rerank.listwise"
+
+dataset:
+- name: dataset_train
+ type: DataLoader
+ data_path: "{workspace}/data/train"
+ data_converter: "{workspace}/random_reader.py"
+- name: dataset_infer
+ type: DataLoader
+ data_path: "{workspace}/data/test"
+ data_converter: "{workspace}/random_reader.py"
+
+hyper_parameters:
+ hidden_size: 128
+ user_vocab: 200
+ item_vocab: 1000
+ item_len: 5
+ embed_size: 16
+ batch_size: 1
+ optimizer:
+ class: sgd
+ learning_rate: 0.01
+ strategy: async
+
+# use infer_runner mode and modify 'phase' below for inference
+mode: train_runner
+#mode: infer_runner
+
+runner:
+- name: train_runner
+ class: single_train
+ device: cpu
+ epochs: 3
+ save_checkpoint_interval: 2
+ save_inference_interval: 4
+ save_checkpoint_path: "increment"
+ save_inference_path: "inference"
+- name: infer_runner
+ class: single_infer
+ init_model_path: "increment/0"
+ device: cpu
+ epochs: 3
+
+phase:
+- name: train
+ model: "{workspace}/model.py"
+ dataset_name: dataset_train
+ thread_num: 1
+ #- name: infer
+ # model: "{workspace}/model.py"
+ # dataset_name: dataset_infer
+ # thread_num: 1
diff --git a/models/rerank/listwise/data/test/small_data.txt b/models/rerank/listwise/data/test/small_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c3c4cf5f84f66594e76603cce1f18d211ebd05a7
--- /dev/null
+++ b/models/rerank/listwise/data/test/small_data.txt
@@ -0,0 +1,100 @@
+4764,174,1
+4764,2958,0
+4764,452,0
+4764,1946,0
+4764,3208,0
+2044,2237,1
+2044,1998,0
+2044,328,0
+2044,1542,0
+2044,1932,0
+4276,65,1
+4276,3247,0
+4276,942,0
+4276,3666,0
+4276,2222,0
+3933,682,1
+3933,2451,0
+3933,3695,0
+3933,1643,0
+3933,3568,0
+1151,1265,1
+1151,118,0
+1151,2532,0
+1151,2083,0
+1151,2350,0
+1757,876,1
+1757,201,0
+1757,3633,0
+1757,1068,0
+1757,2549,0
+3370,276,1
+3370,2435,0
+3370,606,0
+3370,910,0
+3370,2146,0
+5137,1018,1
+5137,2163,0
+5137,3167,0
+5137,2315,0
+5137,3595,0
+3933,2831,1
+3933,2881,0
+3933,2949,0
+3933,3660,0
+3933,417,0
+3102,999,1
+3102,1902,0
+3102,2161,0
+3102,3042,0
+3102,1113,0
+2022,336,1
+2022,1672,0
+2022,2656,0
+2022,3649,0
+2022,883,0
+2664,655,1
+2664,3660,0
+2664,1711,0
+2664,3386,0
+2664,1668,0
+25,701,1
+25,32,0
+25,2482,0
+25,3177,0
+25,2767,0
+1738,1643,1
+1738,2187,0
+1738,228,0
+1738,650,0
+1738,3101,0
+5411,1241,1
+5411,2546,0
+5411,3019,0
+5411,3618,0
+5411,1674,0
+638,579,1
+638,3512,0
+638,783,0
+638,2111,0
+638,1880,0
+3554,200,1
+3554,2893,0
+3554,2428,0
+3554,969,0
+3554,2741,0
+4283,1074,1
+4283,3056,0
+4283,2032,0
+4283,405,0
+4283,1505,0
+5111,200,1
+5111,3488,0
+5111,477,0
+5111,2790,0
+5111,40,0
+3964,515,1
+3964,1528,0
+3964,2173,0
+3964,1701,0
+3964,2832,0
diff --git a/models/rerank/listwise/data/train/small_data.txt b/models/rerank/listwise/data/train/small_data.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c3c4cf5f84f66594e76603cce1f18d211ebd05a7
--- /dev/null
+++ b/models/rerank/listwise/data/train/small_data.txt
@@ -0,0 +1,100 @@
+4764,174,1
+4764,2958,0
+4764,452,0
+4764,1946,0
+4764,3208,0
+2044,2237,1
+2044,1998,0
+2044,328,0
+2044,1542,0
+2044,1932,0
+4276,65,1
+4276,3247,0
+4276,942,0
+4276,3666,0
+4276,2222,0
+3933,682,1
+3933,2451,0
+3933,3695,0
+3933,1643,0
+3933,3568,0
+1151,1265,1
+1151,118,0
+1151,2532,0
+1151,2083,0
+1151,2350,0
+1757,876,1
+1757,201,0
+1757,3633,0
+1757,1068,0
+1757,2549,0
+3370,276,1
+3370,2435,0
+3370,606,0
+3370,910,0
+3370,2146,0
+5137,1018,1
+5137,2163,0
+5137,3167,0
+5137,2315,0
+5137,3595,0
+3933,2831,1
+3933,2881,0
+3933,2949,0
+3933,3660,0
+3933,417,0
+3102,999,1
+3102,1902,0
+3102,2161,0
+3102,3042,0
+3102,1113,0
+2022,336,1
+2022,1672,0
+2022,2656,0
+2022,3649,0
+2022,883,0
+2664,655,1
+2664,3660,0
+2664,1711,0
+2664,3386,0
+2664,1668,0
+25,701,1
+25,32,0
+25,2482,0
+25,3177,0
+25,2767,0
+1738,1643,1
+1738,2187,0
+1738,228,0
+1738,650,0
+1738,3101,0
+5411,1241,1
+5411,2546,0
+5411,3019,0
+5411,3618,0
+5411,1674,0
+638,579,1
+638,3512,0
+638,783,0
+638,2111,0
+638,1880,0
+3554,200,1
+3554,2893,0
+3554,2428,0
+3554,969,0
+3554,2741,0
+4283,1074,1
+4283,3056,0
+4283,2032,0
+4283,405,0
+4283,1505,0
+5111,200,1
+5111,3488,0
+5111,477,0
+5111,2790,0
+5111,40,0
+3964,515,1
+3964,1528,0
+3964,2173,0
+3964,1701,0
+3964,2832,0
diff --git a/models/rerank/listwise/model.py b/models/rerank/listwise/model.py
new file mode 100644
index 0000000000000000000000000000000000000000..d588db0629439eec9396ec9b1f81f1988e99d51e
--- /dev/null
+++ b/models/rerank/listwise/model.py
@@ -0,0 +1,218 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import math
+import numpy as np
+import paddle.fluid as fluid
+
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
+
+
+class Model(ModelBase):
+ def __init__(self, config):
+ ModelBase.__init__(self, config)
+
+ def _init_hyper_parameters(self):
+        self.item_len = envs.get_global_env("hyper_parameters.item_len")
+ self.hidden_size = envs.get_global_env("hyper_parameters.hidden_size")
+ self.user_vocab = envs.get_global_env("hyper_parameters.user_vocab")
+ self.item_vocab = envs.get_global_env("hyper_parameters.item_vocab")
+ self.embed_size = envs.get_global_env("hyper_parameters.embed_size")
+
+ def input_data(self, is_infer=False, **kwargs):
+ user_slot_names = fluid.data(
+ name='user_slot_names',
+ shape=[None, 1],
+ dtype='int64',
+ lod_level=1)
+ item_slot_names = fluid.data(
+ name='item_slot_names',
+ shape=[None, self.item_len],
+ dtype='int64',
+ lod_level=1)
+ lens = fluid.data(name='lens', shape=[None], dtype='int64')
+ labels = fluid.data(
+ name='labels',
+ shape=[None, self.item_len],
+ dtype='int64',
+ lod_level=1)
+
+ inputs = [user_slot_names] + [item_slot_names] + [lens] + [labels]
+
+        # demo: how to use is_infer:
+ if is_infer:
+ return inputs
+ else:
+ return inputs
+
+ def net(self, inputs, is_infer=False):
+ # user encode
+ user_embedding = fluid.embedding(
+ input=inputs[0],
+ size=[self.user_vocab, self.embed_size],
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Xavier(),
+ regularizer=fluid.regularizer.L2Decay(1e-5)),
+ is_sparse=True)
+
+ user_feature = fluid.layers.fc(
+ input=user_embedding,
+ size=self.hidden_size,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=np.sqrt(1.0 / self.hidden_size))),
+ bias_attr=fluid.ParamAttr(initializer=fluid.initializer.Constant(
+ value=0.0)),
+ act='relu',
+ name='user_feature_fc')
+ # item encode
+ item_embedding = fluid.embedding(
+ input=inputs[1],
+ size=[self.item_vocab, self.embed_size],
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Xavier(),
+ regularizer=fluid.regularizer.L2Decay(1e-5)),
+ is_sparse=True)
+
+ item_embedding = fluid.layers.sequence_unpad(
+ x=item_embedding, length=inputs[2])
+
+ item_fc = fluid.layers.fc(
+ input=item_embedding,
+ size=self.hidden_size,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=np.sqrt(1.0 / self.hidden_size))),
+ bias_attr=fluid.ParamAttr(initializer=fluid.initializer.Constant(
+ value=0.0)),
+ act='relu',
+ name='item_fc')
+
+ pos = self._fluid_sequence_get_pos(item_fc)
+ pos_embed = fluid.embedding(
+ input=pos,
+ size=[self.user_vocab, self.embed_size],
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.Xavier(),
+ regularizer=fluid.regularizer.L2Decay(1e-5)),
+ is_sparse=True)
+
+ pos_embed = fluid.layers.squeeze(pos_embed, [1])
+
+ # item gru
+ gru_input = fluid.layers.fc(
+ input=fluid.layers.concat([item_fc, pos_embed], 1),
+ size=self.hidden_size * 3,
+ name='item_gru_fc')
+
+ # forward gru
+ item_gru_forward = fluid.layers.dynamic_gru(
+ input=gru_input,
+ size=self.hidden_size,
+ is_reverse=False,
+ h_0=user_feature)
+ # backward gru
+ item_gru_backward = fluid.layers.dynamic_gru(
+ input=gru_input,
+ size=self.hidden_size,
+ is_reverse=True,
+ h_0=user_feature)
+
+ item_gru = fluid.layers.concat(
+ [item_gru_forward, item_gru_backward], axis=1)
+
+ out_click_fc1 = fluid.layers.fc(
+ input=item_gru,
+ size=self.hidden_size,
+ param_attr=fluid.ParamAttr(
+ initializer=fluid.initializer.TruncatedNormal(
+ loc=0.0, scale=np.sqrt(1.0 / self.hidden_size))),
+ bias_attr=fluid.ParamAttr(initializer=fluid.initializer.Constant(
+ value=0.0)),
+ act='relu',
+ name='out_click_fc1')
+
+ click_prob = fluid.layers.fc(input=out_click_fc1,
+ size=2,
+ act='softmax',
+ name='out_click_fc2')
+
+ labels = fluid.layers.sequence_unpad(x=inputs[3], length=inputs[2])
+
+ auc_val, batch_auc, auc_states = fluid.layers.auc(input=click_prob,
+ label=labels)
+
+ if is_infer:
+ self._infer_results["AUC"] = auc_val
+ return
+
+ loss = fluid.layers.reduce_mean(
+ fluid.layers.cross_entropy(
+ input=click_prob, label=labels))
+ self._cost = loss
+ self._metrics['auc'] = auc_val
+
+ def _fluid_sequence_pad(self, input, pad_value, maxlen=None):
+ """
+ args:
+ input: (batch*seq_len, dim)
+ returns:
+ (batch, max_seq_len, dim)
+ """
+ pad_value = fluid.layers.cast(
+ fluid.layers.assign(input=np.array([pad_value], 'float32')),
+ input.dtype)
+ input_padded, _ = fluid.layers.sequence_pad(
+ input, pad_value,
+ maxlen=maxlen) # (batch, max_seq_len, 1), (batch, 1)
+ # TODO, maxlen=300, used to solve issues: https://github.com/PaddlePaddle/Paddle/issues/14164
+ return input_padded
+
+ def _fluid_sequence_get_pos(self, lodtensor):
+ """
+ args:
+ lodtensor: lod = [[0,4,7]]
+ return:
+ pos: lod = [[0,4,7]]
+                 data = [0,1,2,3,0,1,2]
+ shape = [-1, 1]
+ """
+ lodtensor = fluid.layers.reduce_sum(lodtensor, dim=1, keep_dim=True)
+        assert lodtensor.shape == (-1, 1), (lodtensor.shape)
+ ones = fluid.layers.cast(lodtensor * 0 + 1,
+ 'float32') # (batch*seq_len, 1)
+ ones_padded = self._fluid_sequence_pad(ones,
+ 0) # (batch, max_seq_len, 1)
+ ones_padded = fluid.layers.squeeze(ones_padded,
+ [2]) # (batch, max_seq_len)
+ seq_len = fluid.layers.cast(
+ fluid.layers.reduce_sum(
+ ones_padded, 1, keep_dim=True), 'int64') # (batch, 1)
+ seq_len = fluid.layers.squeeze(seq_len, [1])
+
+ pos = fluid.layers.cast(
+ fluid.layers.cumsum(
+ ones_padded, 1, exclusive=True), 'int64')
+ pos = fluid.layers.sequence_unpad(pos, seq_len) # (batch*seq_len, 1)
+ pos.stop_gradient = True
+ return pos
+
+ #def train_net(self):
+ # input_data = self.input_data()
+ # self.net(input_data)
+
+ #def infer_net(self):
+ # input_data = self.input_data(is_infer=True)
+ # self.net(input_data, is_infer=True)
diff --git a/models/rerank/listwise/random_infer_reader.py b/models/rerank/listwise/random_infer_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..4f93688e59dd2dc142a0ab79201278b25e18b468
--- /dev/null
+++ b/models/rerank/listwise/random_infer_reader.py
@@ -0,0 +1,68 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+from __future__ import print_function
+
+import numpy as np
+import paddle.fluid as fluid
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+from collections import defaultdict
+
+
+class EvaluateReader(Reader):
+ def init(self):
+ self.user_vocab = envs.get_global_env("hyper_parameters.user_vocab",
+ None, "train.model")
+ self.item_vocab = envs.get_global_env("hyper_parameters.item_vocab",
+ None, "train.model")
+ self.item_len = envs.get_global_env("hyper_parameters.item_len", None,
+ "train.model")
+ self.batch_size = envs.get_global_env("batch_size", None,
+ "train.reader")
+
+ def reader_creator(self):
+ def reader():
+ user_slot_name = []
+ for j in range(self.batch_size):
+ user_slot_name.append(
+ [int(np.random.randint(self.user_vocab))])
+ item_slot_name = np.random.randint(
+ self.item_vocab, size=(self.batch_size,
+ self.item_len)).tolist()
+ length = [self.item_len] * self.batch_size
+ label = np.random.randint(
+ 2, size=(self.batch_size, self.item_len)).tolist()
+ output = [user_slot_name, item_slot_name, length, label]
+
+ yield output
+
+ return reader
+
+ def generate_batch_from_trainfiles(self, files):
+ return fluid.io.batch(
+ self.reader_creator(), batch_size=self.batch_size)
+
+ def generate_sample(self, line):
+ """
+ the file is not used
+ """
+
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+ pass
+
+ return reader
diff --git a/models/rerank/listwise/random_reader.py b/models/rerank/listwise/random_reader.py
new file mode 100644
index 0000000000000000000000000000000000000000..aa7af3f083c720d35e9f11f5f5ec1bddd107cabc
--- /dev/null
+++ b/models/rerank/listwise/random_reader.py
@@ -0,0 +1,64 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+from __future__ import print_function
+
+import numpy as np
+import paddle.fluid as fluid
+
+from paddlerec.core.reader import Reader
+from paddlerec.core.utils import envs
+from collections import defaultdict
+
+
+class TrainReader(Reader):
+ def init(self):
+ self.user_vocab = envs.get_global_env("hyper_parameters.user_vocab")
+ self.item_vocab = envs.get_global_env("hyper_parameters.item_vocab")
+ self.item_len = envs.get_global_env("hyper_parameters.item_len")
+ self.batch_size = envs.get_global_env("hyper_parameters.batch_size")
+
+ def reader_creator(self):
+ def reader():
+ user_slot_name = []
+ for j in range(self.batch_size):
+ user_slot_name.append(
+ [int(np.random.randint(self.user_vocab))])
+ item_slot_name = np.random.randint(
+ self.item_vocab, size=(self.batch_size,
+ self.item_len)).tolist()
+ length = [self.item_len] * self.batch_size
+ label = np.random.randint(
+ 2, size=(self.batch_size, self.item_len)).tolist()
+ output = [user_slot_name, item_slot_name, length, label]
+
+ yield output
+
+ return reader
+
+ def generate_batch_from_trainfiles(self, files):
+ return fluid.io.batch(
+ self.reader_creator(), batch_size=self.batch_size)
+
+ def generate_sample(self, line):
+ """
+ the file is not used
+ """
+
+ def reader():
+ """
+ This function needs to be implemented by the user, based on data format
+ """
+ pass
+
+ return reader
diff --git a/models/rerank/readme.md b/models/rerank/readme.md
new file mode 100755
index 0000000000000000000000000000000000000000..6f698daf9f9a7529abcb8d18010965988838a940
--- /dev/null
+++ b/models/rerank/readme.md
@@ -0,0 +1,36 @@
+# 重排序模型库
+
+## 简介
+我们提供了重排序任务中常用模型算法的PaddleRec实现,以及单机训练&预测效果指标、分布式训练&预测性能指标等。目前实现的模型是 [Listwise](listwise)。
+
+模型算法库在持续添加中,欢迎关注。
+
+## 目录
+* [整体介绍](#整体介绍)
+ * [重排序模型列表](#重排序模型列表)
+* [使用教程](#使用教程)
+
+## 整体介绍
+### 重排序模型列表
+
+| 模型 | 简介 | 论文 |
+| :------------------: | :--------------------: | :---------: |
+| Listwise | Listwise | [Sequential Evaluation and Generation Framework for Combinatorial Recommender System](https://arxiv.org/pdf/1902.00245.pdf)(2019) |
+
+下面是每个模型的简介(注:图片引用自链接中的论文)
+
+
+[Listwise](https://arxiv.org/pdf/1902.00245.pdf):
+
+
+
+
+
+## 使用教程(快速开始)
+```shell
+python -m paddlerec.run -m paddlerec.models.rerank.listwise # listwise
+```
+
+## 使用教程(复现论文)
+
+listwise原论文没有给出训练数据,我们使用了随机生成的数据,使用方法可参考上面的快速开始。
diff --git a/models/treebased/README.md b/models/treebased/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..3ceb13b62eba8127aa0394397d141b2abe343a32
--- /dev/null
+++ b/models/treebased/README.md
@@ -0,0 +1,27 @@
+# Paddle-TDM
+本代码库提供了基于PaddlePaddle实现的TreeBased推荐搜索算法,主要包含以下组成:
+
+- 基于fake数据集,适用于快速调试的paddle-tdm模型。主要用于理解paddle-tdm的设计原理,高效上手设计适合您的使用场景的模型。
+
+以上内容将随paddle版本迭代不断更新,欢迎您关注该代码库。
+
+## TDM设计思路
+
+### 基本概念
+TDM是为大规模推荐系统设计的、能承载任意先进模型来高效检索用户兴趣的推荐算法解决方案。TDM基于树结构,提出了一套对用户兴趣度量进行层次化建模与检索的方法论,使得系统能直接利用先进的深度学习模型在全库范围内检索用户兴趣。其基本原理是使用树结构对全库item进行索引,然后训练深度模型以支持树上的逐层检索,从而将大规模推荐中全库检索的复杂度由O(n)(n为所有item的量级)下降至O(log n)。
+
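上面提到的复杂度差异可以用一段简单的Python估算直观感受一下(其中分支数、beam宽度均为示例假设,并非TDM的固定配置):

```python
import math

def full_scan_cost(n_items):
    # 全库暴力检索:每个item都要打一次分,复杂度O(n)
    return n_items

def tree_retrieval_cost(n_items, branch=2, beam=10):
    # 树上逐层检索:每层只对上一层topK(beam)个节点的孩子打分
    depth = math.ceil(math.log(n_items, branch))
    return depth * beam * branch

n = 1_000_000
print(full_scan_cost(n))       # 1000000
print(tree_retrieval_cost(n))  # 400
```

可以看到,在百万量级的item库上,树检索需要打分的节点数比全库检索低三个数量级以上。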
+### 核心问题
+
+1. 如何构建树结构?
+2. 如何基于树结构做深度学习模型的训练?
+3. 如何基于树及模型进行高效检索?
+
+### PaddlePaddle的TDM方案
+
+1. 树结构的数据,来源于各个业务的实际场景,构造方式各有不同,paddle-TDM一期暂不提供统一的树的构造流程,但会统一树构造好之后,输入paddle网络的数据组织形式。业务方可以使用任意工具构造自己的树,生成指定的数据格式,参与tdm网络训练。
+2. 网络训练中,有三个核心问题:
+
+ - 如何组网?答:paddle封装了大量的深度学习OP,用户可以根据需求设计自己的网络结构。
+ - 训练数据如何组织?答:tdm的训练数据主要为:`user/query emb` 加 `item`的正样本,`item`需要映射到树的某个叶子节点。用户只需准备符合该构成的数据即可。负样本的生成,会基于用户提供的树结构,以及paddle提供的`tdm-sampler op`完成高效的负采样,并自动添加相应的label,参与tdm中深度学习模型的训练。
+ - 大规模的数据与模型训练如何实现?答:基于paddle优秀的大规模参数服务器分布式能力,可以实现高效的分布式训练。基于paddle-fleet api,学习门槛极低,且可以灵活的支持增量训练,流式训练等业务需求。
+3. 训练好模型后,可以基于paddle,将检索与打分等流程都融入paddle的组网中,生成inference_model与参数文件,基于PaddlePaddle的预测库或者PaddleLite进行快速部署与高效检索。
diff --git a/models/treebased/__init__.py b/models/treebased/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/treebased/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/treebased/tdm/README.md b/models/treebased/tdm/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..220b90481962a157aa1bb348c040d35fbcda6fa7
--- /dev/null
+++ b/models/treebased/tdm/README.md
@@ -0,0 +1,532 @@
+# Paddle-TDM-DEMO
+
+本代码仅作tdm组网示例,使用fake数据集,用于快速调试与上手paddle-tdm。
+
+
+## 树结构的准备
+### 名词概念
+
+为防止概念混淆,让我们明确tdm中名词的概念:
+
+
+
+
+
+- **item**:具有实际物理含义,是我们希望通过tdm检索最后得到的结果,如一个商品,一篇文章,一个关键词等等,在tdm模型中,item位于树的叶子节点。item有其自身的ID,我们姑且称之为 `item_id`。
+- **节点(node)**:tdm树的任意节点。我们完成聚类后,会生成一棵树,树的叶子节点对应着item。而非叶子节点,则是一种类别上的概括,从大方向的兴趣,不断细分到小方向的兴趣。我们希望这棵树的结构满足最大堆树的性质。同样,节点也有其自身的ID,我们称之为node_id。如上图,最左下方的节点,它的node_id是14,而对应的item_id是0。
+- **Node-Embedding**:注意,此处的Embedding,并非我们已有的item-embedding,而是构建完成的树的节点对应的Embedding,由item-embedding通过规则生成,是我们的网络主要训练的目标。ID范围为0 ~ 节点数-1。我们同时也需准备一个映射表,来告诉模型item_id到node_id的映射关系。
+- **Travel**:是指叶子节点从root开始直到其自身的遍历路径,如上图,14号节点的Travel:0->1->3->7->14
+- **Layer**:指树的层,如上图,共有4层。
+
+> Paddle-TDM在训练时,不会改动树的结构,只会改动Node-Embedding。
+
+
+### 树的准备流程
+
+让我们以上图给出的简单树结构为例,来介绍TDM的模型准备流程。假设我们已经完成了树的聚类,并得到了如上图所示的树结构:
+
+- 问题一:叶子节点的Embedding值是多少?答:叶子节点的node-embedding与对应的item的embedding值一致
+- 问题二:非叶子节点的Embedding值是多少?答:非叶子节点的Embedding值初始化目前有两种方案:1、随机初始化。2、使用其子节点的emb均值。
+- 问题三:树一定要求是二叉树吗?答:没有这样的硬性要求。
+- 问题四:若有item不在最后一层,树不平衡怎么办?答:树尽量平衡,不在最后一层也没有关系,我们可以通过其他方式让网络正常训练。
+- 问题五:是每个用户都有一棵树,还是所有用户共享一棵树?答:只有一棵树,通过每一层的模型牵引节点的emb,使其尽量满足最大堆性质。
+
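其中方案2(子节点emb均值)可以自底向上计算,下面用numpy给出一个示意(树的编号方式与函数名均为示例假设):

```python
import numpy as np

def init_non_leaf_emb(node_emb, children):
    """用子节点emb的均值初始化非叶子节点的emb。

    node_emb: [node_num, emb_size],叶子节点行已填入对应item的emb
    children: dict,非叶子node_id -> 子节点id列表
    按node_id从大到小处理:层次遍历编号下父节点id小于子节点id,
    因此倒序遍历可以保证处理某个父节点时其子节点emb已经就绪
    """
    for parent in sorted(children, reverse=True):
        node_emb[parent] = node_emb[children[parent]].mean(axis=0)
    return node_emb

# 一棵3节点的小树:0是根,1、2是叶子
emb = np.zeros((3, 4), dtype=np.float32)
emb[1], emb[2] = 1.0, 3.0
emb = init_non_leaf_emb(emb, {0: [1, 2]})
print(emb[0])  # [2. 2. 2. 2.]
```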
+完成以上步骤,我们已经得到了树的结构,与每个节点的全部信息,现在让我们将其转换为Paddle-TDM训练所需的格式。我们需要产出四个数据:
+#### Layer_list
+
+记录了每一层都有哪些节点。训练用
+```bash
+# Layer list
+1,2
+3,4,5,6
+7,8,9,10,11,12,13
+14,15,16,17,18,19,20,21,22,23,24,25
+```
+#### Travel_list
+
+记录每个叶子节点的Travel路径。训练用
+```bash
+# Travel list
+1,3,7,14
+1,3,7,15
+1,3,8,16
+...
+2,5,12,25
+2,6,13,0
+```
+
+#### Tree_Info
+
+记录了每个节点的信息,主要为:是否是item/item_id,所在层级,父节点,子节点。检索用
+```bash
+# Tree info
+0,0,0,1,2
+0,1,0,3,4
+0,1,0,5,6
+0,2,1,7,8
+...
+10,4,12,0,0
+11,4,12,0,0
+```
+
+#### Tree_Embedding
+
+记录所有节点的Embedding,格式与普通的Embedding参数表一致。训练及检索用
+
+以上数据设计的初衷是为了高效,在paddle网络中以Tensor的形式参与训练,运行时,不再需要进行频繁的树的结构的遍历,直接根据已有信息进行快速查找与训练。以上数据可以明文保存,但最终都需要转成ndarray,参与网络的初始化。
+结合示例树,数据可以按上述示例组织,下面介绍一些细节:
+
+- Layer list从第2(index=1)层开始即可,因为0号节点不参与训练也不参与检索;
+- Travel list按照item_id的顺序组织,如第一行对应着item_id=0的遍历信息,同样,也不需要包含0号节点;
+- Travel_list每行的长度必须相等,遇到不在最后一层的item,需要padding 0 直至长度和其他item一致;
+- Tree_info包含了0号节点的信息,主要考量是,当我们拿到node_id查找其信息时,可以根据id在该数据中寻找第id行;
+- Tree_info各列的含义是:item_id(若无则为0),层级Layer,父节点node_id(若无则为0),子节点node_id(若无则为0,若子节点数量不满,则需要padding 0)
+
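以上述明文格式为例,下面给出一个将Layer_list与Travel_list解析为ndarray的Python示意(函数名与padding规则按上文约定,仅作参考):

```python
import numpy as np

def parse_layer_list(text):
    # 每行是一层的node_id;层与层节点数不同,保存为list of ndarray
    return [np.array(line.split(','), dtype='int64')
            for line in text.strip().splitlines()]

def parse_travel_list(text, max_layers):
    # 每行是一个item从第2层到叶子的路径,不足max_layers的padding 0
    travels = []
    for line in text.strip().splitlines():
        ids = [int(x) for x in line.split(',')]
        travels.append(ids + [0] * (max_layers - len(ids)))
    return np.array(travels, dtype='int64')

layer_list = parse_layer_list("1,2\n3,4,5,6\n7,8,9,10,11,12,13")
travel_list = parse_travel_list("1,3,7,14\n2,6,13", max_layers=4)
print(travel_list[1])  # [ 2  6 13  0]
```

解析得到的ndarray即可用于网络中对应参数(如`TDM_Tree_Travel`、`TDM_Tree_Layer`)的初始化。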
+## 数据准备
+如前所述,若我们关心的是输入一个user emb,得到他所感兴趣的item id,那我们就准备user_emb + 正样本item的格式的数据,负采样会通过paddle的tdm_sampler op得到。数据的准备不涉及树的结构,因而可以快速复用其他任务的训练数据来验证TDM效果。
+
+```bash
+# emb(float) \t item_id (int)
+-0.9480544328689575 0.8702829480171204 -0.5691063404083252 ...... -0.04391402751207352 -0.5352795124053955 -0.9972627758979797 0.9397293329238892 4690780
+```
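按照上面约定的`emb \t item_id`格式,读入一行样本的解析逻辑可以写成如下示意(函数名为示例假设):

```python
import numpy as np

def parse_sample(line):
    # 行格式:空格分隔的emb浮点数 + \t + item_id
    emb_str, item_str = line.rstrip('\n').split('\t')
    input_emb = np.array(emb_str.split(), dtype='float32')
    item_label = np.array([int(item_str)], dtype='int64')
    return input_emb, item_label

emb, label = parse_sample("0.1 -0.2 0.3\t4690780")
print(emb.shape, int(label[0]))  # (3,) 4690780
```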
+
+## TDM网络设计
+假设输入数据是 Emb + item_id,下面让我们开始介绍一个最简单的网络设计。
+
+
+
+
+
+上图给出了一个非常简单的TDM示例网络,没有添加任何复杂的逻辑,纯用DNN实现。
+TDM的组网,宏观上,可以概括为三个部分
+- 第一部分,输入侧的组网,如果想要对user/query进行一些预处理,或者添加Attention结构,通常都是在这一层次实现。
+- 第二部分,每层的输入与节点信息交互的组网,这一部分是将user/query的信息与node信息结合,在树的不同层下,进行不同粒度兴趣的学习。通常而言,第一部分与第二部分具有紧密的联系,可以统一为一个部分。
+- 第三部分,最终的判别组网,将每层交互得到的信息进行最终的概率判决。但这一层也不是必须的,并不要求所有层的信息都经过一个统一的分类器,可以各层拥有独立的概率判决器。为了逻辑划分更加清晰,我们在示例中添加了这个层次的组网,方便您更加直观的理解tdm网络。
+再次强调,该示例组网仅为展示tdm的基本运行逻辑,请基于这个框架,升级改进您自己的网络。
+
+## TDM组网细节
+
+### 训练组网
+
+训练组网中需要重点关注五个部分:
+1. 网络输入的定义
+2. 网络中输入侧的处理逻辑
+3. node的负采样组网
+4. input与node的交互网络
+5. 判别及loss计算组网
+
+
+#### 输入的定义
+
+首先简要介绍输入的定义:
+
+**demo模型,假设输入为两个元素:**
+> 一、user/query的emb表示,该emb应该来源于特征的组合在某个空间的映射(比如若干特征取emb后concat到一起),或其他预训练模型的处理结果(比如将明文query通过nlp预处理得到emb表示)
+
+> 二、item的正样本,是发生了实际点击/购买/浏览等行为的item_id,与输入的user/query emb强相关,是我们之后通过预测想得到的结果。
+
+在paddle组网中,我们这样定义上面两个变量:
+```python
+def input_data(self):
+ """
+ 指定tdm训练网络的输入变量
+ """
+ input_emb = fluid.data(
+ name="input_emb",
+ shape=[None, self.input_embed_size],
+ dtype="float32",
+ )
+
+ item_label = fluid.data(
+ name="item_label",
+ shape=[None, 1],
+ dtype="int64",
+ )
+
+ inputs = [input_emb] + [item_label]
+ return inputs
+```
+
+#### 输入侧的组网
+
+**输入侧组网由FC层组成**
+> 一、`input_fc`,主要功能是input_emb维度的压缩,只需一个fc即可。
+
+> 二、`layer_fc`,主要功能是将input_emb映射到不同的兴趣层空间,和当层的node学习兴趣关系。有多少层,就添加多少个fc。
+
+
+
+
+
+在paddle组网中,我们这样快速实现输入侧组网:
+```python
+def input_trans_layer(self, input_emb):
+ """
+ 输入侧训练组网
+ """
+ # 将input压缩到与node相同的维度
+ input_fc_out = fluid.layers.fc(
+ input=input_emb,
+ size=self.node_emb_size,
+ act=None,
+ param_attr=fluid.ParamAttr(name="trans.input_fc.weight"),
+ bias_attr=fluid.ParamAttr(name="trans.input_fc.bias"),
+ )
+
+ # 将input_emb映射到各个不同层次的向量表示空间
+ input_layer_fc_out = [
+ fluid.layers.fc(
+ input=input_fc_out,
+ size=self.node_emb_size,
+ act="tanh",
+ param_attr=fluid.ParamAttr(
+ name="trans.layer_fc.weight." + str(i)),
+ bias_attr=fluid.ParamAttr(name="trans.layer_fc.bias."+str(i)),
+ ) for i in range(self.max_layers)
+ ]
+
+ return input_layer_fc_out
+```
+
+#### node的负采样组网
+**tdm 负采样的核心是tdm_sampler OP**
+
+tdm_sampler的运行逻辑如下:
+
+1. 输入item_id,读travel_list,查表,得到该item_id对应的遍历路径(从靠近根节点的第一层一直往下直到存放该item的node)
+2. 读layer_list,查表,得到每层都有哪些node
+3. 循环:i = 0, 从第i层开始进行负采样
+ - 在item遍历路径上的node视为正样本,`positive_node_id`由`travel_list[item_id][i]`给出,其他同层的兄弟节点视为负样本,该层节点列表由`layer_list[i]`给出,如果`positive_node_id`不在`layer_list[i]`中,会提示错误。
+
+ - 在兄弟节点中进行随机采样,采样N个node,N由`neg_sampling_list[i]`的值决定,如果该值大于兄弟节点的数量,会提示错误。 采样结果不会重复,且不会采样到正样本。
+
+ - 如果`output_positive=True`,则会同时输出正负样本,否则只输出负采样的结果
+
+ - 生成该层`label`,shape与采样结果一致,正样本对应的label=1,负样本的label=0
+
+ - 生成该层`mask`,如果树是不平衡的,则有些item不会位于树的最后一层,所以遍历路径的实际长度会比其他item少,为了tensor维度一致,travel_list中padding了0。当遇到了padding的0时,tdm_sampler也会输出正常维度的采样结果,采样结果与label都为0。为了区分这部分虚拟的采样结果与真实采样结果,会给虚拟采样结果额外设置mask=0,如果是真实采样结果mask=1
+ - i += 1, 若i > layer_nums, break
+4. 对输出的采样结果、label、mask进行整理:
+    - 如果`output_list=False`,则会输出三个tensor(sampling_result, label, mask),shape形如`[batch_size, all_layer_sampling_nums, 1]`;
+ - 若`output_list=True`,则会输出三个`list[tensor,...,tensor]`,`sampling_result_list/label_list/mask_list`,`len(list)`等于层数,将采样结果按照分属哪一层进行拆分,每个tensor的shape形如`[batch_size, layer_i_sampling_nums,1]`
+
+```python
+# 根据输入的item的正样本在给定的树上进行负采样
+# sample_nodes 是采样的node_id的结果,包含正负样本
+# sample_label 是采样的node_id对应的正负标签
+# sample_mask 是为了保持tensor维度一致,padding部分的标签,若为0,则是padding的虚拟node_id
+sample_nodes, sample_label, sample_mask = fluid.contrib.layers.tdm_sampler(
+ x=item_label,
+ neg_samples_num_list=self.neg_sampling_list,
+ layer_node_num_list=self.layer_node_num_list,
+ leaf_node_num=self.leaf_node_num,
+ tree_travel_attr=fluid.ParamAttr(name="TDM_Tree_Travel"),
+ tree_layer_attr=fluid.ParamAttr(name="TDM_Tree_Layer"),
+ output_positive=self.output_positive,
+ output_list=True,
+ seed=0,
+ tree_dtype='int64',
+ dtype='int64'
+)
+
+# 查表得到每个节点的Embedding
+sample_nodes_emb = [
+ fluid.embedding(
+ input=sample_nodes[i],
+ is_sparse=True,
+ size=[self.node_nums, self.node_emb_size],
+ param_attr=fluid.ParamAttr(
+ name="TDM_Tree_Emb")
+ ) for i in range(self.max_layers)
+]
+
+# 此处进行reshape是为了之后层次化的分类器训练
+sample_nodes_emb = [
+ fluid.layers.reshape(sample_nodes_emb[i],
+ [-1, self.neg_sampling_list[i] +
+ self.output_positive, self.node_emb_size]
+ ) for i in range(self.max_layers)
+]
+
+```
+
+#### input与node的交互网络
+**交互网络由FC层组成**
+
+主要包含两个流程:
+> 一、将输入进行维度上的`expand`,与采样得到的node数量一致(当然也可以使用其他`broadcast`的网络结构)
+
+> 二、input_emb与node_emb进行`concat`,过FC,计算兴趣上的匹配关系
+
+
+
+
+
+在paddle的组网中,我们这样实现这一部分的逻辑:
+
+```python
+def _expand_layer(self, input_layer, node, layer_idx):
+ # 扩展input的输入,使数量与node一致,
+ # 也可以以其他broadcast的操作进行代替
+ # 同时兼容了训练组网与预测组网
+ input_layer_unsequeeze = fluid.layers.unsqueeze(
+ input=input_layer, axes=[1])
+ if self.is_test:
+ input_layer_expand = fluid.layers.expand(
+ input_layer_unsequeeze, expand_times=[1, node.shape[1], 1])
+ else:
+ input_layer_expand = fluid.layers.expand(
+ input_layer_unsequeeze, expand_times=[1, node[layer_idx].shape[1], 1])
+ return input_layer_expand
+
+def classifier_layer(self, input, node):
+ # 扩展input,使维度与node匹配
+ input_expand = [
+ self._expand_layer(input[i], node, i) for i in range(self.max_layers)
+ ]
+
+ # 将input_emb与node_emb concat到一起过分类器FC
+ input_node_concat = [
+ fluid.layers.concat(
+ input=[input_expand[i], node[i]],
+ axis=2) for i in range(self.max_layers)
+ ]
+
+ hidden_states_fc = [
+ fluid.layers.fc(
+ input=input_node_concat[i],
+ size=self.node_emb_size,
+ num_flatten_dims=2,
+ act="tanh",
+ param_attr=fluid.ParamAttr(
+ name="cls.concat_fc.weight."+str(i)),
+ bias_attr=fluid.ParamAttr(name="cls.concat_fc.bias."+str(i))
+ ) for i in range(self.max_layers)
+ ]
+
+ # 如果将所有层次的node放到一起计算loss,则需要在此处concat
+ # 将分类器结果以batch为准绳concat到一起,而不是layer
+ # 维度形如[batch_size, total_node_num, node_emb_size]
+ hidden_states_concat = fluid.layers.concat(hidden_states_fc, axis=1)
+ return hidden_states_concat
+```
+
+#### 判别及loss计算组网
+
+最终的判别组网会将所有层的输出打平放到一起,过`tdm.cls_fc`,再过`softmax_with_cross_entropy`层,计算cost,同时得到softmax的中间结果,计算acc或者auc。
+
+```python
+tdm_fc = fluid.layers.fc(input=layer_classifier_res,
+ size=self.label_nums,
+ act=None,
+ num_flatten_dims=2,
+ param_attr=fluid.ParamAttr(
+ name="tdm.cls_fc.weight"),
+ bias_attr=fluid.ParamAttr(name="tdm.cls_fc.bias"))
+
+# 将loss打平,放到一起计算整体网络的loss
+tdm_fc_re = fluid.layers.reshape(tdm_fc, [-1, 2])
+
+# 若想对各个层次的loss辅以不同的权重,则在此处无需concat
+# 支持各个层次分别计算loss,再乘相应的权重
+sample_label = fluid.layers.concat(sample_label, axis=1)
+sample_label.stop_gradient = True
+labels_reshape = fluid.layers.reshape(sample_label, [-1, 1])
+
+# 计算整体的loss并得到softmax的输出
+cost, softmax_prob = fluid.layers.softmax_with_cross_entropy(
+ logits=tdm_fc_re, label=labels_reshape, return_softmax=True)
+
+# 通过mask过滤掉虚拟节点的loss
+sample_mask = fluid.layers.concat(sample_mask, axis=1)
+sample_mask.stop_gradient = True
+mask_reshape = fluid.layers.reshape(sample_mask, [-1, 1])
+
+mask_index = fluid.layers.where(mask_reshape != 0)
+mask_cost = fluid.layers.gather_nd(cost, mask_index)
+
+# 计算该batch的均值loss,同时计算acc, 亦可在这里计算auc
+avg_cost = fluid.layers.reduce_mean(mask_cost)
+acc = fluid.layers.accuracy(input=softmax_prob, label=labels_reshape)
+```
+
+### 预测组网
+
+预测的整体流程类似于beamsearch。预测组网中需要重点关注三个部分:
+1. 网络输入的定义,及初始检索层的确定
+2. 串行的通过各层分类器的流程
+3. 每层选出topK及最终选出topK的方法
+
+#### 网络输入
+
+首先考虑这样一个问题,假设树是一棵十层的完全二叉树,我们要取topK=1024的召回结果,那么我们需不需要从树的第一层开始逐层计算fc与topK?答案显然是否定的,我们只需要计算最后一层的1024个节点即可。在开发的实验中,我们得到的结论是:tdm检索时,计算复杂度并不高,时间主要耗费在OP的调度上,通过模型裁剪、树剪枝、模型量化等都可以加快预测速度。
+
+因此infer组网中首先考虑了跳层与剪枝的实现:我们定义了三个变量,`input_emb`,`first_layer_node`,`first_layer_node_mask`作为网络的输入。
+- `input_emb`:预测时输入的user/query向量。
+- `first_layer_node`:检索时的起始node_id,是变长类型。这样设置的好处是:1、可以输入某一层所有节点node_id,从而在某一层开始进行检索。2、也可以设置为特定node_id,从指定的树分枝开始检索。输入的node_id应该位于同一层。
+- `first_layer_node_mask`:维度与`first_layer_node`相同,值一一对应。若node是叶子节点,则mask=1,否则设置mask=0。
+
+在demo网络中,我们设置为从某一层的所有节点开始进行检索。paddle组网对输入定义的实现如下:
+```python
+def input_data(self):
+ input_emb = fluid.layers.data(
+ name="input_emb",
+ shape=[self.input_embed_size],
+ dtype="float32",
+ )
+
+ # first_layer 与 first_layer_mask 对应着infer起始的节点
+ first_layer = fluid.layers.data(
+ name="first_layer_node",
+ shape=[1],
+ dtype="int64",
+ lod_level=1, #支持变长
+ )
+
+ first_layer_mask = fluid.layers.data(
+ name="first_layer_node_mask",
+ shape=[1],
+ dtype="int64",
+ lod_level=1,
+ )
+
+ inputs = [input_emb] + [first_layer] + [first_layer_mask]
+ return inputs
+```
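与上面定义的占位符相对应,起始层feed数据的构造可以这样示意(numpy实现,未涉及LoDTensor的具体构造;函数名与变量名为示例假设,且假设树是平衡的,即只有叶子层的节点mask=1):

```python
import numpy as np

def build_first_layer_feed(layer_list, start_layer_idx, leaf_layer_idx):
    # 取起始层的全部node_id作为检索起点
    nodes = np.array(layer_list[start_layer_idx], dtype='int64').reshape(-1, 1)
    # 平衡树下,只有起始层就是叶子层时,这些节点才是叶子,mask=1
    is_leaf = int(start_layer_idx == leaf_layer_idx)
    mask = np.full_like(nodes, is_leaf)
    return nodes, mask

layer_list = [[1, 2], [3, 4, 5, 6], [7, 8, 9, 10, 11, 12, 13]]
nodes, mask = build_first_layer_feed(layer_list, start_layer_idx=1, leaf_layer_idx=2)
print(nodes.ravel().tolist(), mask.ravel().tolist())  # [3, 4, 5, 6] [0, 0, 0, 0]
```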
+
+确定起始层的方式比较简单,比较topK的大小与当层节点数,选取第一个节点数大于等于topK的层作为起始层,取它的节点作为起始节点。代码如下:
+```python
+def create_first_layer(self, args):
+ """decide which layer to start infer"""
+ first_layer_id = 0
+    for idx, layer_node in enumerate(args.layer_node_num_list):
+ if layer_node >= self.topK:
+ first_layer_id = idx
+ break
+ first_layer_node = self.layer_list[first_layer_id]
+ self.first_layer_idx = first_layer_id
+ return first_layer_node
+```
+
+#### 通过各层分类器的流程
+
+tdm的检索逻辑类似beamsearch,简单来说:在每一层计算打分,得到topK的节点,将这些节点的孩子节点作为下一层的输入,如此循环,得到最终的topK。但仍然有一些需要注意的细节,下面将详细介绍。
+
+- 问题一:怎么处理`input_emb`?
+
+ - input_emb过`input_fc`,检索中,只需过一次即可:
+ ```python
+    input_trans_emb = self.input_trans_net.input_fc_infer(input_emb)
+ ```
+ - 在通过每一层的分类器之前,过`layer_fc`,指定`layer_idx`,加载对应层的分类器,将输入映射到不同的兴趣粒度空间
+ ```python
+ input_fc_out = self.input_trans_net.layer_fc_infer(
+ input_trans_emb, layer_idx)
+ ```
+
+- 问题二:怎样实现beamsearch?
+
+ 我们通过在每一层计算打分,计算topK并拿到对应的孩子节点,for循环这个过程实现beamsearch。
+ ```python
+ for layer_idx in range(self.first_layer_idx, self.max_layers):
+ # 确定当前层的需要计算的节点数
+ if layer_idx == self.first_layer_idx:
+ current_layer_node_num = len(self.first_layer_node)
+ else:
+ current_layer_node_num = current_layer_node.shape[1] * \
+ current_layer_node.shape[2]
+
+ current_layer_node = fluid.layers.reshape(
+ current_layer_node, [self.batch_size, current_layer_node_num])
+ current_layer_child_mask = fluid.layers.reshape(
+ current_layer_child_mask, [self.batch_size, current_layer_node_num])
+
+ # 查当前层node的emb
+ node_emb = fluid.embedding(
+ input=current_layer_node,
+ size=[self.node_nums, self.node_embed_size],
+ param_attr=fluid.ParamAttr(name="TDM_Tree_Emb"))
+
+ input_fc_out = self.input_trans_net.layer_fc_infer(
+ input_trans_emb, layer_idx)
+
+ # 过每一层的分类器
+ layer_classifier_res = self.layer_classifier.classifier_layer_infer(input_fc_out, node_emb, layer_idx)
+
+ # 过最终的判别分类器
+ tdm_fc = fluid.layers.fc(input=layer_classifier_res,
+ size=self.label_nums,
+ act=None,
+ num_flatten_dims=2,
+ param_attr=fluid.ParamAttr(
+ name="tdm.cls_fc.weight"),
+ bias_attr=fluid.ParamAttr(name="tdm.cls_fc.bias"))
+
+ prob = fluid.layers.softmax(tdm_fc)
+ positive_prob = fluid.layers.slice(
+ prob, axes=[2], starts=[1], ends=[2])
+ prob_re = fluid.layers.reshape(
+ positive_prob, [self.batch_size, current_layer_node_num])
+
+ # 过滤掉padding产生的无效节点(node_id=0)
+ node_zero_mask = fluid.layers.cast(current_layer_node, 'bool')
+        node_zero_mask = fluid.layers.cast(node_zero_mask, 'float32')
+ prob_re = prob_re * node_zero_mask
+
+ # 在当前层的分类结果中取topK,并将对应的score及node_id保存下来
+ k = self.topK
+ if current_layer_node_num < self.topK:
+ k = current_layer_node_num
+ _, topk_i = fluid.layers.topk(prob_re, k)
+
+ # index_sample op根据下标索引tensor对应位置的值
+ # 若paddle版本>2.0,调用方式为paddle.index_sample
+ top_node = fluid.contrib.layers.index_sample(
+ current_layer_node, topk_i)
+ prob_re_mask = prob_re * current_layer_child_mask # 过滤掉非叶子节点
+ topk_value = fluid.contrib.layers.index_sample(
+ prob_re_mask, topk_i)
+ node_score.append(topk_value)
+ node_list.append(top_node)
+
+ # 取当前层topK结果的孩子节点,作为下一层的输入
+ if layer_idx < self.max_layers - 1:
+ # tdm_child op 根据输入返回其 child 及 child_mask
+ # 若child是叶子节点,则child_mask=1,否则为0
+ current_layer_node, current_layer_child_mask = \
+ fluid.contrib.layers.tdm_child(x=top_node,
+ node_nums=self.node_nums,
+ child_nums=self.child_nums,
+ param_attr=fluid.ParamAttr(
+ name="TDM_Tree_Info"),
+ dtype='int64')
+ ```
+
+- 问题三:怎样得到最终的topK个叶子节点?
+
+ 在过每层分类器的过程中,我们保存了每层的topk节点,并将非叶子节点的打分置为了0,保存在`node_score`与`node_list`中。显然,我们需要召回的是topk个叶子节点,对所有层的叶子节点打分再计算一次topk,拿到结果。
+
+ ```python
+ total_node_score = fluid.layers.concat(node_score, axis=1)
+ total_node = fluid.layers.concat(node_list, axis=1)
+
+ # 考虑到树可能是不平衡的,计算所有层的叶子节点的topK
+ res_score, res_i = fluid.layers.topk(total_node_score, self.topK)
+ res_layer_node = fluid.contrib.layers.index_sample(total_node, res_i)
+ res_node = fluid.layers.reshape(res_layer_node, [-1, self.topK, 1])
+ ```
+
+- 问题四:现在拿到的是node_id,怎么转成item_id?
+
+  如果有额外的映射表,可以将node_id转为item_id,但也有更方便的方法。在生成`tree_info`时,我们保存了每个node_id对应的item_id信息,可以直接利用它:将node_id对应的tree_info行中第0列的item_id切出来即可。
+
+ ```python
+ # 利用Tree_info信息,将node_id转换为item_id
+ tree_info = fluid.default_main_program().global_block().var("TDM_Tree_Info")
+ res_node_emb = fluid.layers.gather_nd(tree_info, res_node)
+
+ res_item = fluid.layers.slice(
+ res_node_emb, axes=[2], starts=[0], ends=[1])
+ res_item_re = fluid.layers.reshape(res_item, [-1, self.topK])
+ ```
+以上,我们便完成了训练及预测组网的全部内容。
diff --git a/models/treebased/tdm/__init__.py b/models/treebased/tdm/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/models/treebased/tdm/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/models/recall/tdm/config.yaml b/models/treebased/tdm/config.yaml
similarity index 94%
rename from models/recall/tdm/config.yaml
rename to models/treebased/tdm/config.yaml
index bd870490f79646e7e250a65bda302d726ff61473..ec79017496c971d40d62f9f8a64769f2f39ccd9e 100755
--- a/models/recall/tdm/config.yaml
+++ b/models/treebased/tdm/config.yaml
@@ -18,13 +18,14 @@ train:
strategy: "async"
epochs: 2
- workspace: "fleetrec.models.recall.tdm"
+ workspace: "paddlerec.models.treebased.tdm"
reader:
batch_size: 32
class: "{workspace}/tdm_reader.py"
train_data_path: "{workspace}/data/train"
test_data_path: "{workspace}/data/test"
+ reader_debug_mode: False
model:
models: "{workspace}/model.py"
@@ -73,7 +74,7 @@ train:
save_last: True
evaluate:
- workspace: "fleetrec.models.recall.tdm"
+  workspace: "paddlerec.models.treebased.tdm"
reader:
batch_size: 1
class: "{workspace}/tdm_evaluate_reader.py"
diff --git a/models/recall/tdm/data/test/demo_fake_test.txt b/models/treebased/tdm/data/test/demo_fake_test.txt
similarity index 100%
rename from models/recall/tdm/data/test/demo_fake_test.txt
rename to models/treebased/tdm/data/test/demo_fake_test.txt
diff --git a/models/recall/tdm/data/train/demo_fake_input.txt b/models/treebased/tdm/data/train/demo_fake_input.txt
similarity index 100%
rename from models/recall/tdm/data/train/demo_fake_input.txt
rename to models/treebased/tdm/data/train/demo_fake_input.txt
diff --git a/models/treebased/tdm/img/demo_network.png b/models/treebased/tdm/img/demo_network.png
new file mode 100644
index 0000000000000000000000000000000000000000..3a9981f360ffaaaa337d0c87b2fd6135969bc2a3
Binary files /dev/null and b/models/treebased/tdm/img/demo_network.png differ
diff --git a/models/treebased/tdm/img/demo_tree.png b/models/treebased/tdm/img/demo_tree.png
new file mode 100644
index 0000000000000000000000000000000000000000..5e50d34ab7e53ce4bf505b5affc5c2e1d9ebfa73
Binary files /dev/null and b/models/treebased/tdm/img/demo_tree.png differ
diff --git a/models/treebased/tdm/img/dnn-net.png b/models/treebased/tdm/img/dnn-net.png
new file mode 100644
index 0000000000000000000000000000000000000000..3974bdc10e74ff80f738b4372947c7acc669a507
Binary files /dev/null and b/models/treebased/tdm/img/dnn-net.png differ
diff --git a/models/treebased/tdm/img/input-net.png b/models/treebased/tdm/img/input-net.png
new file mode 100644
index 0000000000000000000000000000000000000000..ca0633cc120367de1a7834a9987764ca4a092951
Binary files /dev/null and b/models/treebased/tdm/img/input-net.png differ
diff --git a/models/recall/tdm/model.py b/models/treebased/tdm/model.py
similarity index 77%
rename from models/recall/tdm/model.py
rename to models/treebased/tdm/model.py
index ee96b260c10e85901242ac527f653e40573fefb9..319a7b4f9a3695537b43c8f1078dc4e1b73549fb 100755
--- a/models/recall/tdm/model.py
+++ b/models/treebased/tdm/model.py
@@ -14,49 +14,49 @@
# See the License for the specific language governing permissions and
# limitations under the License.
"""
+
import paddle.fluid as fluid
-import math
-from fleetrec.core.utils import envs
-from fleetrec.core.model import Model as ModelBase
+from paddlerec.core.utils import envs
+from paddlerec.core.model import Model as ModelBase
class Model(ModelBase):
def __init__(self, config):
ModelBase.__init__(self, config)
# tree meta hyper parameters
- self.max_layers = envs.get_global_env(
- "tree_parameters.max_layers", 4, self._namespace)
- self.node_nums = envs.get_global_env(
- "tree_parameters.node_nums", 26, self._namespace)
+ self.max_layers = envs.get_global_env("tree_parameters.max_layers", 4,
+ self._namespace)
+ self.node_nums = envs.get_global_env("tree_parameters.node_nums", 26,
+ self._namespace)
self.leaf_node_nums = envs.get_global_env(
"tree_parameters.leaf_node_nums", 13, self._namespace)
self.output_positive = envs.get_global_env(
"tree_parameters.output_positive", True, self._namespace)
self.layer_node_num_list = envs.get_global_env(
- "tree_parameters.layer_node_num_list", [
- 2, 4, 7, 12], self._namespace)
- self.child_nums = envs.get_global_env(
- "tree_parameters.child_nums", 2, self._namespace)
- self.tree_layer_path = envs.get_global_env(
- "tree.tree_layer_path", None, "train.startup")
+ "tree_parameters.layer_node_num_list", [2, 4, 7,
+ 12], self._namespace)
+ self.child_nums = envs.get_global_env("tree_parameters.child_nums", 2,
+ self._namespace)
+ self.tree_layer_path = envs.get_global_env("tree.tree_layer_path",
+ None, "train.startup")
# model training hyper parameter
self.node_emb_size = envs.get_global_env(
"hyper_parameters.node_emb_size", 64, self._namespace)
self.input_emb_size = envs.get_global_env(
"hyper_parameters.input_emb_size", 768, self._namespace)
- self.act = envs.get_global_env(
- "hyper_parameters.act", "tanh", self._namespace)
+ self.act = envs.get_global_env("hyper_parameters.act", "tanh",
+ self._namespace)
self.neg_sampling_list = envs.get_global_env(
- "hyper_parameters.neg_sampling_list", [
- 1, 2, 3, 4], self._namespace)
+ "hyper_parameters.neg_sampling_list", [1, 2, 3,
+ 4], self._namespace)
# model infer hyper parameter
- self.topK = envs.get_global_env(
- "hyper_parameters.node_nums", 1, self._namespace)
- self.batch_size = envs.get_global_env(
- "batch_size", 1, "evaluate.reader")
+ self.topK = envs.get_global_env("hyper_parameters.node_nums", 1,
+ self._namespace)
+ self.batch_size = envs.get_global_env("batch_size", 1,
+ "evaluate.reader")
def train_net(self):
self.train_input()
@@ -76,21 +76,22 @@ class Model(ModelBase):
input_emb = fluid.data(
name="input_emb",
shape=[None, self.input_emb_size],
- dtype="float32",
- )
+ dtype="float32", )
self._data_var.append(input_emb)
item_label = fluid.data(
name="item_label",
shape=[None, 1],
- dtype="int64",
- )
+ dtype="int64", )
self._data_var.append(item_label)
if self._platform != "LINUX":
self._data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._data_var, capacity=64, use_double_buffer=False, iterable=False)
+ feed_list=self._data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
def tdm_net(self):
"""
@@ -116,8 +117,7 @@ class Model(ModelBase):
output_list=True,
seed=0,
tree_dtype='int64',
- dtype='int64'
- )
+ dtype='int64')
# 查表得到每个节点的Embedding
sample_nodes_emb = [
@@ -125,35 +125,34 @@ class Model(ModelBase):
input=sample_nodes[i],
is_sparse=True,
size=[self.node_nums, self.node_emb_size],
- param_attr=fluid.ParamAttr(
- name="TDM_Tree_Emb")
- ) for i in range(self.max_layers)
+ param_attr=fluid.ParamAttr(name="TDM_Tree_Emb"))
+ for i in range(self.max_layers)
]
# 此处进行Reshape是为了之后层次化的分类器训练
sample_nodes_emb = [
- fluid.layers.reshape(sample_nodes_emb[i],
- [-1, self.neg_sampling_list[i] +
- self.output_positive, self.node_emb_size]
- ) for i in range(self.max_layers)
+ fluid.layers.reshape(sample_nodes_emb[i], [
+ -1, self.neg_sampling_list[i] + self.output_positive,
+ self.node_emb_size
+ ]) for i in range(self.max_layers)
]
# 对输入的input_emb进行转换,使其维度与node_emb维度一致
input_trans_emb = self.input_trans_layer(input_emb)
# 分类器的主体网络,分别训练不同层次的分类器
- layer_classifier_res = self.classifier_layer(
- input_trans_emb, sample_nodes_emb)
+ layer_classifier_res = self.classifier_layer(input_trans_emb,
+ sample_nodes_emb)
# 最后的概率判别FC,将所有层次的node分类结果放到一起以相同的标准进行判别
# 考虑到树极大可能不平衡,有些item不在最后一层,所以需要这样的机制保证每个item都有机会被召回
- tdm_fc = fluid.layers.fc(input=layer_classifier_res,
- size=2,
- act=None,
- num_flatten_dims=2,
- param_attr=fluid.ParamAttr(
- name="tdm.cls_fc.weight"),
- bias_attr=fluid.ParamAttr(name="tdm.cls_fc.bias"))
+ tdm_fc = fluid.layers.fc(
+ input=layer_classifier_res,
+ size=2,
+ act=None,
+ num_flatten_dims=2,
+ param_attr=fluid.ParamAttr(name="tdm.cls_fc.weight"),
+ bias_attr=fluid.ParamAttr(name="tdm.cls_fc.bias"))
# 将loss打平,放到一起计算整体网络的loss
tdm_fc_re = fluid.layers.reshape(tdm_fc, [-1, 2])
@@ -202,7 +201,7 @@ class Model(ModelBase):
def metrics(self):
auc, batch_auc, _ = fluid.layers.auc(input=self._predict,
label=self.mask_label,
- num_thresholds=2 ** 12,
+ num_thresholds=2**12,
slide_steps=20)
self._metrics["AUC"] = auc
self._metrics["BATCH_AUC"] = batch_auc
@@ -218,8 +217,7 @@ class Model(ModelBase):
size=self.node_emb_size,
act=None,
param_attr=fluid.ParamAttr(name="trans.input_fc.weight"),
- bias_attr=fluid.ParamAttr(name="trans.input_fc.bias"),
- )
+ bias_attr=fluid.ParamAttr(name="trans.input_fc.bias"), )
# 将input_emb映射到各个不同层次的向量表示空间
input_layer_fc_out = [
@@ -229,8 +227,9 @@ class Model(ModelBase):
act=self.act,
param_attr=fluid.ParamAttr(
name="trans.layer_fc.weight." + str(i)),
- bias_attr=fluid.ParamAttr(name="trans.layer_fc.bias."+str(i)),
- ) for i in range(self.max_layers)
+ bias_attr=fluid.ParamAttr(
+ name="trans.layer_fc.bias." + str(i)), )
+ for i in range(self.max_layers)
]
return input_layer_fc_out
@@ -246,20 +245,22 @@ class Model(ModelBase):
input_layer_unsequeeze, expand_times=[1, node.shape[1], 1])
else:
input_layer_expand = fluid.layers.expand(
- input_layer_unsequeeze, expand_times=[1, node[layer_idx].shape[1], 1])
+ input_layer_unsequeeze,
+ expand_times=[1, node[layer_idx].shape[1], 1])
return input_layer_expand
def classifier_layer(self, input, node):
# 扩展input,使维度与node匹配
input_expand = [
- self._expand_layer(input[i], node, i) for i in range(self.max_layers)
+ self._expand_layer(input[i], node, i)
+ for i in range(self.max_layers)
]
# 将input_emb与node_emb concat到一起过分类器FC
input_node_concat = [
fluid.layers.concat(
- input=[input_expand[i], node[i]],
- axis=2) for i in range(self.max_layers)
+ input=[input_expand[i], node[i]], axis=2)
+ for i in range(self.max_layers)
]
hidden_states_fc = [
fluid.layers.fc(
@@ -268,9 +269,9 @@ class Model(ModelBase):
num_flatten_dims=2,
act=self.act,
param_attr=fluid.ParamAttr(
- name="cls.concat_fc.weight."+str(i)),
- bias_attr=fluid.ParamAttr(name="cls.concat_fc.bias."+str(i))
- ) for i in range(self.max_layers)
+ name="cls.concat_fc.weight." + str(i)),
+ bias_attr=fluid.ParamAttr(name="cls.concat_fc.bias." + str(i)))
+ for i in range(self.max_layers)
]
# 如果将所有层次的node放到一起计算loss,则需要在此处concat
@@ -285,12 +286,14 @@ class Model(ModelBase):
input_emb = fluid.layers.data(
name="input_emb",
shape=[self.input_emb_size],
- dtype="float32",
- )
+ dtype="float32", )
self._infer_data_var.append(input_emb)
self._infer_data_loader = fluid.io.DataLoader.from_generator(
- feed_list=self._infer_data_var, capacity=64, use_double_buffer=False, iterable=False)
+ feed_list=self._infer_data_var,
+ capacity=64,
+ use_double_buffer=False,
+ iterable=False)
def get_layer_list(self):
"""get layer list from layer_list.txt"""
@@ -318,10 +321,12 @@ class Model(ModelBase):
node_list = []
mask_list = []
for id in first_layer_node:
- node_list.append(fluid.layers.fill_constant(
- [self.batch_size, 1], value=int(id), dtype='int64'))
- mask_list.append(fluid.layers.fill_constant(
- [self.batch_size, 1], value=0, dtype='int64'))
+ node_list.append(
+ fluid.layers.fill_constant(
+ [self.batch_size, 1], value=int(id), dtype='int64'))
+ mask_list.append(
+ fluid.layers.fill_constant(
+ [self.batch_size, 1], value=0, dtype='int64'))
self.first_layer_node = fluid.layers.concat(node_list, axis=1)
self.first_layer_node_mask = fluid.layers.concat(mask_list, axis=1)
@@ -348,7 +353,7 @@ class Model(ModelBase):
current_layer_node_num = self.first_layer_node.shape[1]
else:
current_layer_node_num = current_layer_node.shape[1] * \
- current_layer_node.shape[2]
+ current_layer_node.shape[2]
current_layer_node = fluid.layers.reshape(
current_layer_node, [-1, current_layer_node_num])
@@ -359,28 +364,26 @@ class Model(ModelBase):
size=[self.node_nums, self.node_emb_size],
param_attr=fluid.ParamAttr(name="TDM_Tree_Emb"))
- input_fc_out = self.layer_fc_infer(
- input_trans_emb, layer_idx)
+ input_fc_out = self.layer_fc_infer(input_trans_emb, layer_idx)
# 过每一层的分类器
- layer_classifier_res = self.classifier_layer_infer(input_fc_out,
- node_emb,
- layer_idx)
+ layer_classifier_res = self.classifier_layer_infer(
+ input_fc_out, node_emb, layer_idx)
# 过最终的判别分类器
- tdm_fc = fluid.layers.fc(input=layer_classifier_res,
- size=2,
- act=None,
- num_flatten_dims=2,
- param_attr=fluid.ParamAttr(
- name="tdm.cls_fc.weight"),
- bias_attr=fluid.ParamAttr(name="tdm.cls_fc.bias"))
+ tdm_fc = fluid.layers.fc(
+ input=layer_classifier_res,
+ size=2,
+ act=None,
+ num_flatten_dims=2,
+ param_attr=fluid.ParamAttr(name="tdm.cls_fc.weight"),
+ bias_attr=fluid.ParamAttr(name="tdm.cls_fc.bias"))
prob = fluid.layers.softmax(tdm_fc)
positive_prob = fluid.layers.slice(
prob, axes=[2], starts=[1], ends=[2])
- prob_re = fluid.layers.reshape(
- positive_prob, [-1, current_layer_node_num])
+ prob_re = fluid.layers.reshape(positive_prob,
+ [-1, current_layer_node_num])
# 过滤掉padding产生的无效节点(node_id=0)
node_zero_mask = fluid.layers.cast(current_layer_node, 'bool')
@@ -395,11 +398,11 @@ class Model(ModelBase):
# index_sample op根据下标索引tensor对应位置的值
# 若paddle版本>2.0,调用方式为paddle.index_sample
- top_node = fluid.contrib.layers.index_sample(
- current_layer_node, topk_i)
+ top_node = fluid.contrib.layers.index_sample(current_layer_node,
+ topk_i)
prob_re_mask = prob_re * current_layer_node_mask # 过滤掉非叶子节点
- topk_value = fluid.contrib.layers.index_sample(
- prob_re_mask, topk_i)
+ topk_value = fluid.contrib.layers.index_sample(prob_re_mask,
+ topk_i)
node_score.append(topk_value)
node_list.append(top_node)
@@ -424,7 +427,8 @@ class Model(ModelBase):
res_node = fluid.layers.reshape(res_layer_node, [-1, self.topK, 1])
# 利用Tree_info信息,将node_id转换为item_id
- tree_info = fluid.default_main_program().global_block().var("TDM_Tree_Info")
+ tree_info = fluid.default_main_program().global_block().var(
+ "TDM_Tree_Info")
res_node_emb = fluid.layers.gather_nd(tree_info, res_node)
res_item = fluid.layers.slice(
@@ -442,8 +446,7 @@ class Model(ModelBase):
size=self.node_emb_size,
act=None,
param_attr=fluid.ParamAttr(name="trans.input_fc.weight"),
- bias_attr=fluid.ParamAttr(name="trans.input_fc.bias"),
- )
+ bias_attr=fluid.ParamAttr(name="trans.input_fc.bias"), )
return input_fc_out
def layer_fc_infer(self, input_fc_out, layer_idx):
@@ -458,8 +461,7 @@ class Model(ModelBase):
param_attr=fluid.ParamAttr(
name="trans.layer_fc.weight." + str(layer_idx)),
bias_attr=fluid.ParamAttr(
- name="trans.layer_fc.bias."+str(layer_idx)),
- )
+ name="trans.layer_fc.bias." + str(layer_idx)), )
return input_layer_fc_out
def classifier_layer_infer(self, input, node, layer_idx):
@@ -479,6 +481,7 @@ class Model(ModelBase):
num_flatten_dims=2,
act=self.act,
param_attr=fluid.ParamAttr(
- name="cls.concat_fc.weight."+str(layer_idx)),
- bias_attr=fluid.ParamAttr(name="cls.concat_fc.bias."+str(layer_idx)))
+ name="cls.concat_fc.weight." + str(layer_idx)),
+ bias_attr=fluid.ParamAttr(
+ name="cls.concat_fc.bias." + str(layer_idx)))
return hidden_states_fc
diff --git a/models/recall/tdm/tdm_evaluate_reader.py b/models/treebased/tdm/tdm_evaluate_reader.py
similarity index 89%
rename from models/recall/tdm/tdm_evaluate_reader.py
rename to models/treebased/tdm/tdm_evaluate_reader.py
index 915389ba3328e3e0ba60052532f4f93eca8faa4f..4e3b64770b42c05726dd3c90466d77e422e00902 100644
--- a/models/recall/tdm/tdm_evaluate_reader.py
+++ b/models/treebased/tdm/tdm_evaluate_reader.py
@@ -17,7 +17,7 @@
from __future__ import print_function
-from fleetrec.core.reader import Reader
+from paddlerec.core.reader import Reader
class EvaluateReader(Reader):
@@ -28,12 +28,14 @@ class EvaluateReader(Reader):
"""
Read the data line by line and process it as a dictionary
"""
+
def reader():
"""
This function needs to be implemented by the user, based on data format
"""
features = (line.strip('\n')).split('\t')
- input_emb = map(float, features[0].split(' '))
+ input_emb = features[0].split(' ')
+ input_emb = [float(i) for i in input_emb]
feature_name = ["input_emb"]
yield zip(feature_name, [input_emb])
diff --git a/models/recall/tdm/tdm_reader.py b/models/treebased/tdm/tdm_reader.py
similarity index 89%
rename from models/recall/tdm/tdm_reader.py
rename to models/treebased/tdm/tdm_reader.py
index 17413249c2adc54d450129c76ac0761e27ba27a4..709900649a03c3439cbf474781a5c0ae7b087dd7 100755
--- a/models/recall/tdm/tdm_reader.py
+++ b/models/treebased/tdm/tdm_reader.py
@@ -17,7 +17,7 @@
from __future__ import print_function
-from fleetrec.core.reader import Reader
+from paddlerec.core.reader import Reader
class TrainReader(Reader):
@@ -28,12 +28,14 @@ class TrainReader(Reader):
"""
Read the data line by line and process it as a dictionary
"""
+
def reader():
"""
This function needs to be implemented by the user, based on data format
"""
features = (line.strip('\n')).split('\t')
- input_emb = map(float, features[0].split(' '))
+ input_emb = features[0].split(' ')
+ input_emb = [float(i) for i in input_emb]
item_label = [int(features[1])]
feature_name = ["input_emb", "item_label"]
diff --git a/models/treebased/tdm/tree/layer_list.txt b/models/treebased/tdm/tree/layer_list.txt
new file mode 100755
index 0000000000000000000000000000000000000000..d1c6c50a10f1b40aa1fbdef7d57bdd600549fb11
--- /dev/null
+++ b/models/treebased/tdm/tree/layer_list.txt
@@ -0,0 +1,4 @@
+1,2
+3,4,5,6
+7,8,9,10,11,12,13
+14,15,16,17,18,19,20,21,22,23,24,25
diff --git a/models/recall/tdm/tree/travel_list.npy b/models/treebased/tdm/tree/travel_list.npy
similarity index 100%
rename from models/recall/tdm/tree/travel_list.npy
rename to models/treebased/tdm/tree/travel_list.npy
diff --git a/models/recall/tdm/tree/tree_emb.npy b/models/treebased/tdm/tree/tree_emb.npy
similarity index 100%
rename from models/recall/tdm/tree/tree_emb.npy
rename to models/treebased/tdm/tree/tree_emb.npy
diff --git a/models/recall/tdm/tree/tree_info.npy b/models/treebased/tdm/tree/tree_info.npy
similarity index 100%
rename from models/recall/tdm/tree/tree_info.npy
rename to models/treebased/tdm/tree/tree_info.npy
diff --git a/readme.md b/readme.md
deleted file mode 100644
index 6a37e1d30c5582a8ff6aa6236ec961b1ae8fa403..0000000000000000000000000000000000000000
--- a/readme.md
+++ /dev/null
@@ -1,155 +0,0 @@
-
-
-
-
-[](LICENSE)
-[](https://github.com/PaddlePaddle/PaddleRec/releases)
-
-PaddleRec是源于飞桨生态的搜索推荐模型一站式开箱即用工具,无论您是初学者,开发者,研究者均可便捷的使用PaddleRec完成调研,训练到预测部署的全流程工作。PaddleRec提供了搜索推荐任务中语义理解、召回、粗排、精排、多任务学习的全流程解决方案。
-
-PadlleRec以预置模型为核心,具备以下特点:
-- [易于上手,开箱即用](https://www.paddlepaddle.org.cn)
-- [灵活配置,个性调参](https://www.paddlepaddle.org.cn)
-- [分布式训练,大规模稀疏](https://www.paddlepaddle.org.cn)
-- [快速部署,一键上线](https://www.paddlepaddle.org.cn)
-
-
-
-
-
-# 目录
-* [特性](#特性)
-* [支持模型列表](#支持模型列表)
-* [文档教程](#文档教程)
- * [入门教程](#入门教程)
- * [环境要求](#环境要求)
- * [安装命令](#安装命令)
- * [快速开始](#快速开始)
- * [常见问题FAQ](#常见问题faq)
- * [进阶教程](#进阶教程)
- * [自定义数据集及Reader](#自定义数据集及reader)
- * [模型调参](#模型调参)
- * [单机训练](#单机训练)
- * [分布式训练](#分布式训练)
- * [预测部署](#预测部署)
-* [版本历史](#版本历史)
- * [版本更新](#版本更新)
- * [Benchamrk](#benchamrk)
-* [许可证书](#许可证书)
-* [如何贡献代码](#如何贡献代码)
- * [优化PaddleRec框架](#优化paddlerec框架)
- * [新增模型到PaddleRec](#新增模型到paddlerec)
-
-
-
-# 特性
-- 易于上手,开箱即用
-- 灵活配置,个性调参
-- 分布式训练,大规模稀疏
-- 快速部署,一键上线
-
-# 支持模型列表
-| 方向 | 模型 | 单机CPU训练 | 单机GPU训练 | 分布式CPU训练 | 分布式GPU训练 | 自定义数据集 | 服务器部署 |
-| :------------------: | :--------------------: | :---------: | :---------: | :-----------: | :-----------: | :----------: | :--------: |
-| ContentUnderstanding | [Text-Classifcation]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| ContentUnderstanding | [TagSpace]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| Recall | [Word2Vec]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| Recall | [TDM]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| Rank | [CTR-Dnn]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| Rank | [DeepFm]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| Rerank | [ListWise]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| MultiTask | [MMOE]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| MultiTask | [ESSM]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| Match | [DSSM]() | ✓ | x | ✓ | x | ✓ | ✓ |
-| Match | [Multiview-Simnet]() | ✓ | x | ✓ | x | ✓ | ✓ |
-
-# 文档教程
-## 入门教程
-### 环境要求
-* Python >= 2.7
-* PaddlePaddle >= 1.7.2
-* 操作系统: Windows/Mac/Linux
-
-### 安装命令
-
-- 安装方法一:
- ```bash
- python -m pip install fleet-rec
- ```
-
-- 安装方法二
-
- * 安装飞桨 **注:需要用户安装最新版本的飞桨<当前只支持Linux系统>。**
-
- ```bash
- python -m pip install paddlepaddle -i https://mirror.baidu.com/pypi/simple
- ```
-
- * 源码安装Fleet-Rec
-
- ```
- git clone https://github.com/seiriosPlus/FleetRec/
- cd FleetRec
- python setup.py install
- ```
-
-### 快速开始
-#### ctr-dnn示例使用
-目前框架内置了多个模型,简单的命令即可使用内置模型开始单机训练和本地1*1模拟训练
-
-##### 单机训练
-```bash
-cd FleetRec
-
-python -m fleetrec.run \
- -m fleetrec.models.rank.dnn \
- -d cpu \
- -e single
-
-# 使用GPU资源进行训练
-python -m fleetrec.run \
- -m fleetrec.models.rank.dnn \
- -d gpu \
- -e single
-```
-
-##### 本地模拟分布式训练
-
-```bash
-cd FleetRec
-# 使用CPU资源进行训练
-python -m fleetrec.run \
- -m fleetrec.models.rank.dnn \
- -d cpu \
- -e local_cluster
-```
-
-##### 集群提交分布式训练<需要用户预先配置好集群环境,本提交命令不包含提交客户端>
-
-```bash
-cd FleetRec
-
-python -m fleetrec.run \
- -m fleetrec.models.rank.dnn \
- -d cpu \
- -e cluster
-```
-
-### 常见问题FAQ
-
-## 进阶教程
-### 自定义数据集及Reader
-### 模型调参
-### 单机训练
-### 分布式训练
-### 预测部署
-
-# 版本历史
-## 版本更新
-## Benchamrk
-
-# 许可证书
-本项目的发布受[Apache 2.0 license](LICENSE)许可认证。
-# 如何贡献代码
-## 优化PaddleRec框架
-## 新增模型到PaddleRec
diff --git a/run.py b/run.py
new file mode 100755
index 0000000000000000000000000000000000000000..594801fcdd5edb1821799ef53994674aec6a934d
--- /dev/null
+++ b/run.py
@@ -0,0 +1,329 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import os
+import subprocess
+
+import argparse
+import tempfile
+import yaml
+import copy
+from paddlerec.core.factory import TrainerFactory
+from paddlerec.core.utils import envs
+from paddlerec.core.utils import util
+
+engines = {}
+device = ["CPU", "GPU"]
+clusters = ["SINGLE", "LOCAL_CLUSTER", "CLUSTER"]
+engine_choices = [
+ "SINGLE_TRAIN", "LOCAL_CLUSTER", "CLUSTER", "TDM_SINGLE",
+ "TDM_LOCAL_CLUSTER", "TDM_CLUSTER", "SINGLE_INFER"
+]
+custom_model = ['TDM']
+model_name = ""
+
+
+def engine_registry():
+ engines["TRANSPILER"] = {}
+ engines["PSLIB"] = {}
+
+ engines["TRANSPILER"]["SINGLE_TRAIN"] = single_train_engine
+ engines["TRANSPILER"]["SINGLE_INFER"] = single_infer_engine
+ engines["TRANSPILER"]["LOCAL_CLUSTER"] = local_cluster_engine
+ engines["TRANSPILER"]["CLUSTER"] = cluster_engine
+ engines["PSLIB"]["SINGLE"] = local_mpi_engine
+ engines["PSLIB"]["LOCAL_CLUSTER"] = local_mpi_engine
+ engines["PSLIB"]["CLUSTER"] = cluster_mpi_engine
+
+
+def get_inters_from_yaml(file, filters):
+ with open(file, 'r') as rb:
+ _envs = yaml.load(rb.read(), Loader=yaml.FullLoader)
+
+ flattens = envs.flatten_environs(_envs)
+ inters = {}
+ for k, v in flattens.items():
+ for f in filters:
+ if k.startswith(f):
+ inters[k] = v
+ return inters
+
+
+def get_all_inters_from_yaml(file, filters):
+ with open(file, 'r') as rb:
+ _envs = yaml.load(rb.read(), Loader=yaml.FullLoader)
+ all_flattens = {}
+
+ def fatten_env_namespace(namespace_nests, local_envs):
+ for k, v in local_envs.items():
+ if isinstance(v, dict):
+ nests = copy.deepcopy(namespace_nests)
+ nests.append(k)
+ fatten_env_namespace(nests, v)
+ elif (k == "dataset" or k == "phase" or
+ k == "runner") and isinstance(v, list):
+ for i in v:
+ if i.get("name") is None:
+ raise ValueError("list items must define a 'name' field: {}".format(v))
+ nests = copy.deepcopy(namespace_nests)
+ nests.append(k)
+ nests.append(i["name"])
+ fatten_env_namespace(nests, i)
+ else:
+ global_k = ".".join(namespace_nests + [k])
+ all_flattens[global_k] = v
+
+ fatten_env_namespace([], _envs)
+ ret = {}
+ for k, v in all_flattens.items():
+ for f in filters:
+ if k.startswith(f):
+ ret[k] = v
+ return ret
+
+
+def get_engine(args):
+ transpiler = get_transpiler()
+ with open(args.model, 'r') as rb:
+ envs = yaml.load(rb.read(), Loader=yaml.FullLoader)
+ run_extras = get_all_inters_from_yaml(args.model, ["train.", "runner."])
+
+ engine = run_extras.get("train.engine", None)
+ if engine is None:
+ engine = run_extras.get("runner." + envs["mode"] + ".class", None)
+ if engine is None:
+ engine = "single_train"
+ engine = engine.upper()
+ if engine not in engine_choices:
+ raise ValueError("train.engine must be one of {}".format(
+ engine_choices))
+
+ print("engines: \n{}".format(engines))
+
+ run_engine = engines[transpiler].get(engine, None)
+
+ return run_engine
+
+
+def get_transpiler():
+ FNULL = open(os.devnull, 'w')
+ cmd = [
+ "python", "-c",
+ "import paddle.fluid as fluid; fleet_ptr = fluid.core.Fleet(); [fleet_ptr.copy_table_by_feasign(10, 10, [2020, 1010])];"
+ ]
+ proc = subprocess.Popen(cmd, stdout=FNULL, stderr=FNULL, cwd=os.getcwd())
+ ret = proc.wait()
+ if ret == -11:
+ return "PSLIB"
+ else:
+ return "TRANSPILER"
+
+
+def set_runtime_envs(cluster_envs, engine_yaml):
+ if cluster_envs is None:
+ cluster_envs = {}
+
+ engine_extras = get_inters_from_yaml(engine_yaml, ["train.trainer."])
+ if "train.trainer.threads" in engine_extras and "CPU_NUM" in cluster_envs:
+ cluster_envs["CPU_NUM"] = engine_extras["train.trainer.threads"]
+
+ envs.set_runtime_environs(cluster_envs)
+ envs.set_runtime_environs(engine_extras)
+
+ need_print = {}
+ for k, v in os.environ.items():
+ if k.startswith("train.trainer."):
+ need_print[k] = v
+
+ print(envs.pretty_print_envs(need_print, ("Runtime Envs", "Value")))
+
+
+def get_trainer_prefix(args):
+ if model_name in custom_model:
+ return model_name.upper()
+ return ""
+
+
+def single_train_engine(args):
+ trainer = get_trainer_prefix(args) + "SingleTrainer"
+ single_envs = {}
+ single_envs["train.trainer.trainer"] = trainer
+ single_envs["train.trainer.threads"] = "2"
+ single_envs["train.trainer.engine"] = "single_train"
+ single_envs["train.trainer.platform"] = envs.get_platform()
+ print("use {} engine to run model: {}".format(trainer, args.model))
+ set_runtime_envs(single_envs, args.model)
+ trainer = TrainerFactory.create(args.model)
+ return trainer
+
+
+def single_infer_engine(args):
+ trainer = get_trainer_prefix(args) + "SingleInfer"
+ single_envs = {}
+ single_envs["train.trainer.trainer"] = trainer
+ single_envs["train.trainer.threads"] = "2"
+ single_envs["train.trainer.engine"] = "single_infer"
+ single_envs["train.trainer.platform"] = envs.get_platform()
+ print("use {} engine to run model: {}".format(trainer, args.model))
+ set_runtime_envs(single_envs, args.model)
+ trainer = TrainerFactory.create(args.model)
+ return trainer
+
+
+def cluster_engine(args):
+ def update_workspace(cluster_envs):
+ workspace = cluster_envs.get("engine_workspace", None)
+
+ if not workspace:
+ return
+ path = envs.path_adapter(workspace)
+ for name, value in cluster_envs.items():
+ if isinstance(value, str):
+ value = value.replace("{workspace}", path)
+ value = envs.windows_path_converter(value)
+ cluster_envs[name] = value
+
+ def master():
+ role = "MASTER"
+ from paddlerec.core.engine.cluster.cluster import ClusterEngine
+ with open(args.backend, 'r') as rb:
+ _envs = yaml.load(rb.read(), Loader=yaml.FullLoader)
+
+ flattens = envs.flatten_environs(_envs, "_")
+ flattens["engine_role"] = role
+ flattens["engine_run_config"] = args.model
+ flattens["engine_temp_path"] = tempfile.mkdtemp()
+ update_workspace(flattens)
+
+ envs.set_runtime_environs(flattens)
+ print(envs.pretty_print_envs(flattens, ("Submit Runtime Envs", "Value"
+ )))
+
+ launch = ClusterEngine(None, args.model)
+ return launch
+
+ def worker():
+ role = "WORKER"
+ trainer = get_trainer_prefix(args) + "ClusterTrainer"
+ cluster_envs = {}
+ cluster_envs["train.trainer.trainer"] = trainer
+ cluster_envs["train.trainer.engine"] = "cluster"
+ cluster_envs["train.trainer.threads"] = envs.get_runtime_environ(
+ "CPU_NUM")
+ cluster_envs["train.trainer.platform"] = envs.get_platform()
+ print("launch {} engine with cluster to run model: {}".format(
+ trainer, args.model))
+ set_runtime_envs(cluster_envs, args.model)
+
+ trainer = TrainerFactory.create(args.model)
+ return trainer
+
+ role = os.getenv("PADDLE_PADDLEREC_ROLE", "MASTER")
+
+ if role == "WORKER":
+ return worker()
+ else:
+ return master()
+
+
+def cluster_mpi_engine(args):
+ print("launch cluster engine with cluster to run model: {}".format(
+ args.model))
+
+ cluster_envs = {}
+ cluster_envs["train.trainer.trainer"] = "CtrCodingTrainer"
+ cluster_envs["train.trainer.platform"] = envs.get_platform()
+
+ set_runtime_envs(cluster_envs, args.model)
+
+ trainer = TrainerFactory.create(args.model)
+ return trainer
+
+
+def local_cluster_engine(args):
+ from paddlerec.core.engine.local_cluster import LocalClusterEngine
+
+ trainer = get_trainer_prefix(args) + "ClusterTrainer"
+ cluster_envs = {}
+ cluster_envs["server_num"] = 1
+ cluster_envs["worker_num"] = 1
+ cluster_envs["start_port"] = envs.find_free_port()
+ cluster_envs["log_dir"] = "logs"
+ cluster_envs["train.trainer.trainer"] = trainer
+ cluster_envs["train.trainer.strategy"] = "async"
+ cluster_envs["train.trainer.threads"] = "2"
+ cluster_envs["train.trainer.engine"] = "local_cluster"
+ cluster_envs["train.trainer.platform"] = envs.get_platform()
+
+ cluster_envs["CPU_NUM"] = "2"
+ print("launch {} engine with cluster to run model: {}".format(trainer,
+ args.model))
+
+ set_runtime_envs(cluster_envs, args.model)
+ launch = LocalClusterEngine(cluster_envs, args.model)
+ return launch
+
+
+def local_mpi_engine(args):
+ print("launch cluster engine with cluster to run model: {}".format(
+ args.model))
+ from paddlerec.core.engine.local_mpi import LocalMPIEngine
+
+ print("use 1X1 MPI ClusterTraining at localhost to run model: {}".format(
+ args.model))
+
+ mpi = util.run_which("mpirun")
+ if not mpi:
+ raise RuntimeError("cannot find mpirun, please check the environment")
+ cluster_envs = {}
+ cluster_envs["mpirun"] = mpi
+ cluster_envs["train.trainer.trainer"] = "CtrCodingTrainer"
+ cluster_envs["log_dir"] = "logs"
+ cluster_envs["train.trainer.engine"] = "local_cluster"
+
+ cluster_envs["train.trainer.platform"] = envs.get_platform()
+
+ set_runtime_envs(cluster_envs, args.model)
+ launch = LocalMPIEngine(cluster_envs, args.model)
+ return launch
+
+
+def get_abs_model(model):
+ if model.startswith("paddlerec."):
+ dir = envs.path_adapter(model)
+ path = os.path.join(dir, "config.yaml")
+ else:
+ if not os.path.isfile(model):
+ raise IOError("model config: {} invalid".format(model))
+ path = model
+ return path
+
+
+if __name__ == "__main__":
+ parser = argparse.ArgumentParser(description='paddle-rec run')
+ parser.add_argument("-m", "--model", type=str)
+ parser.add_argument("-b", "--backend", type=str, default=None)
+
+ abs_dir = os.path.dirname(os.path.abspath(__file__))
+ envs.set_runtime_environs({"PACKAGE_BASE": abs_dir})
+
+ args = parser.parse_args()
+
+ model_name = args.model.split('.')[-1]
+ args.model = get_abs_model(args.model)
+ engine_registry()
+
+ which_engine = get_engine(args)
+ engine = which_engine(args)
+ engine.run()
diff --git a/setup.py b/setup.py
index 80e39b5533e050e2307a436865ce14e44d797773..8ad1cc742434aa39513a1c618b56649c3530686a 100644
--- a/setup.py
+++ b/setup.py
@@ -1,25 +1,37 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
"""
-setup for fleet-rec.
+setup for paddle-rec.
"""
+
import os
+
from setuptools import setup, find_packages
-import tempfile
import shutil
+import tempfile
-requires = [
- "paddlepaddle == 1.7.2",
- "pyyaml >= 5.1.1"
-]
+requires = ["paddlepaddle == 1.7.2", "pyyaml >= 5.1.1"]
about = {}
-about["__title__"] = "fleet-rec"
+about["__title__"] = "paddle-rec"
about["__version__"] = "0.0.2"
-about["__description__"] = "fleet-rec"
-about["__author__"] = "seiriosPlus"
-about["__author_email__"] = "tangwei12@baidu.com"
-about["__url__"] = "https://github.com/seiriosPlus/FleetRec"
+about["__description__"] = "paddle-rec"
+about["__author__"] = "paddle-dev"
+about["__author_email__"] = "paddle-dev@baidu.com"
+about["__url__"] = "https://github.com/PaddlePaddle/PaddleRec"
-readme = "..."
+readme = ""
def run_cmd(command):
@@ -30,19 +42,36 @@ def run_cmd(command):
def build(dirname):
package_dir = os.path.dirname(os.path.abspath(__file__))
run_cmd("cp -r {}/* {}".format(package_dir, dirname))
- run_cmd("mkdir {}".format(os.path.join(dirname, "fleetrec")))
- run_cmd("mv {}/* {}".format(os.path.join(dirname, "fleet_rec"), os.path.join(dirname, "fleetrec")))
- run_cmd("mv {} {}".format(os.path.join(dirname, "doc"), os.path.join(dirname, "fleetrec")))
- run_cmd("mv {} {}".format(os.path.join(dirname, "models"), os.path.join(dirname, "fleetrec")))
- run_cmd("mv {} {}".format(os.path.join(dirname, "tools"), os.path.join(dirname, "fleetrec")))
+ run_cmd("mkdir {}".format(os.path.join(dirname, "paddlerec")))
+ run_cmd("mv {} {}".format(
+ os.path.join(dirname, "core"), os.path.join(dirname, "paddlerec")))
+ run_cmd("mv {} {}".format(
+ os.path.join(dirname, "doc"), os.path.join(dirname, "paddlerec")))
+ run_cmd("mv {} {}".format(
+ os.path.join(dirname, "models"), os.path.join(dirname, "paddlerec")))
+ run_cmd("mv {} {}".format(
+ os.path.join(dirname, "tests"), os.path.join(dirname, "paddlerec")))
+ run_cmd("mv {} {}".format(
+ os.path.join(dirname, "tools"), os.path.join(dirname, "paddlerec")))
+ run_cmd("mv {} {}".format(
+ os.path.join(dirname, "*.py"), os.path.join(dirname, "paddlerec")))
- packages = find_packages(dirname, include=('fleetrec.*'))
+ packages = find_packages(dirname, include=('paddlerec.*'))
package_dir = {'': dirname}
package_data = {}
- need_copy = ['data/*.txt', 'data/*/*.txt', '*.yaml', 'tree/*.npy','tree/*.txt']
+
+ models_copy = [
+ 'data/*.txt', 'data/*/*.txt', '*.yaml', '*.sh', 'tree/*.npy',
+ 'tree/*.txt', 'data/sample_data/*', 'data/sample_data/train/*',
+ 'data/sample_data/infer/*', 'data/*/*.csv'
+ ]
+
+ engine_copy = ['*/*.sh']
for package in packages:
- if package.startswith("fleetrec.models."):
- package_data[package] = need_copy
+ if package.startswith("paddlerec.models."):
+ package_data[package] = models_copy
+ if package.startswith("paddlerec.core.engine"):
+ package_data[package] = engine_copy
setup(
name=about["__title__"],
@@ -57,8 +86,7 @@ def build(dirname):
package_data=package_data,
python_requires=">=2.7",
install_requires=requires,
- zip_safe=False
- )
+ zip_safe=False)
dirname = tempfile.mkdtemp()
@@ -67,13 +95,13 @@ shutil.rmtree(dirname)
print('''
\033[32m
- _ _ _ _ _ _ _ _ _
- / \ / \ / \ / \ / \ / \ / \ / \ / \
-( F | L | E | E | T | - | R | E | C )
- \_/ \_/ \_/ \_/ \_/ \_/ \_/ \_/ \_/
+   _   _   _   _   _   _   _   _   _   _
+  / \ / \ / \ / \ / \ / \ / \ / \ / \ / \
+ ( P | A | D | D | L | E | - | R | E | C )
+  \_/ \_/ \_/ \_/ \_/ \_/ \_/ \_/ \_/ \_/
\033[0m
\033[34m
Installation Complete. Congratulations!
-How to use it ? Please visit our webside: https://github.com/seiriosPlus/FleetRec
+How to use it? Please visit our website: https://github.com/PaddlePaddle/PaddleRec
\033[0m
''')
diff --git a/tests/__init__.py b/tests/__init__.py
new file mode 100755
index 0000000000000000000000000000000000000000..abf198b97e6e818e1fbe59006f98492640bcee54
--- /dev/null
+++ b/tests/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/tools/__init__.py b/tools/__init__.py
index e69de29bb2d1d6434b8b29ae775ad8c2e48c5391..abf198b97e6e818e1fbe59006f98492640bcee54 100644
--- a/tools/__init__.py
+++ b/tools/__init__.py
@@ -0,0 +1,13 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
diff --git a/tools/build_script.sh b/tools/build_script.sh
new file mode 100755
index 0000000000000000000000000000000000000000..6fa779fac7b7e99f203d64fe69d339469f19d3bf
--- /dev/null
+++ b/tools/build_script.sh
@@ -0,0 +1,61 @@
+#!/usr/bin/env bash
+
+# Copyright (c) 2018 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+#=================================================
+# Utils
+#=================================================
+
+set -ex
+
+function init() {
+ RED='\033[0;31m'
+ BLUE='\033[0;34m'
+ BOLD='\033[1m'
+ NONE='\033[0m'
+
+ ROOT="$( cd "$( dirname "${BASH_SOURCE[0]}")/../../" && pwd )"
+}
+
+function check_style() {
+ set -e
+
+ export PATH=/usr/bin:$PATH
+ pre-commit install
+
+ if ! pre-commit run -a; then
+ git diff
+ exit 1
+ fi
+
+ exit 0
+}
+
+function main() {
+ local CMD=$1
+ init
+ case $CMD in
+ check_style)
+ check_style
+ ;;
+ *)
+ echo "build failed"
+ exit 1
+ ;;
+ esac
+ echo "check_style finished as expected"
+}
+
+main "$@"
diff --git a/tools/codestyle/copyright.hook b/tools/codestyle/copyright.hook
new file mode 100644
index 0000000000000000000000000000000000000000..23aaf38f6f9b97220a55b29c7d0e800fb1e86105
--- /dev/null
+++ b/tools/codestyle/copyright.hook
@@ -0,0 +1,121 @@
+from __future__ import absolute_import
+from __future__ import print_function
+from __future__ import unicode_literals
+
+import argparse
+import io, re
+import sys, os
+import subprocess
+import platform
+
+COPYRIGHT = '''
+Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+'''
+
+LANG_COMMENT_MARK = None
+
+NEW_LINE_MARK = None
+
+COPYRIGHT_HEADER = None
+
+if platform.system() == "Windows":
+ NEW_LINE_MARK = "\r\n"
+else:
+ NEW_LINE_MARK = '\n'
+ COPYRIGHT_HEADER = COPYRIGHT.split(NEW_LINE_MARK)[1]
+ p = re.search(r'(\d{4})', COPYRIGHT_HEADER).group(0)
+ process = subprocess.Popen(["date", "+%Y"], stdout=subprocess.PIPE)
+ date, err = process.communicate()
+ date = date.decode("utf-8").rstrip("\n")
+ COPYRIGHT_HEADER = COPYRIGHT_HEADER.replace(p, date)
+
+
+def generate_copyright(template, lang='C'):
+ if lang == 'Python':
+ LANG_COMMENT_MARK = '#'
+ else:
+ LANG_COMMENT_MARK = "//"
+
+ lines = template.split(NEW_LINE_MARK)
+ BLANK = " "
+ ans = LANG_COMMENT_MARK + BLANK + COPYRIGHT_HEADER + NEW_LINE_MARK
+ for lino, line in enumerate(lines):
+ if lino == 0 or lino == 1 or lino == len(lines) - 1: continue
+ if len(line) == 0:
+ BLANK = ""
+ else:
+ BLANK = " "
+ ans += LANG_COMMENT_MARK + BLANK + line + NEW_LINE_MARK
+
+ return ans + "\n"
+
+
+def lang_type(filename):
+ if filename.endswith(".py"):
+ return "Python"
+ elif filename.endswith(".h"):
+ return "C"
+ elif filename.endswith(".c"):
+ return "C"
+ elif filename.endswith(".hpp"):
+ return "C"
+ elif filename.endswith(".cc"):
+ return "C"
+ elif filename.endswith(".cpp"):
+ return "C"
+ elif filename.endswith(".cu"):
+ return "C"
+ elif filename.endswith(".cuh"):
+ return "C"
+ elif filename.endswith(".go"):
+ return "C"
+ elif filename.endswith(".proto"):
+ return "C"
+ else:
+ print("Unsupported filetype %s" % filename)
+ exit(0)
+
+
+PYTHON_ENCODE = re.compile("^[ \t\v]*#.*?coding[:=][ \t]*([-_.a-zA-Z0-9]+)")
+
+
+def main(argv=None):
+ parser = argparse.ArgumentParser(
+ description='Checker for copyright declaration.')
+ parser.add_argument('filenames', nargs='*', help='Filenames to check')
+ args = parser.parse_args(argv)
+
+ retv = 0
+ for filename in args.filenames:
+ fd = io.open(filename, encoding="utf-8")
+ first_line = fd.readline()
+ second_line = fd.readline()
+ if "COPYRIGHT (C)" in first_line.upper(): continue
+ if first_line.startswith("#!") or PYTHON_ENCODE.match(
+ second_line) != None or PYTHON_ENCODE.match(first_line) != None:
+ continue
+ original_contents = io.open(filename, encoding="utf-8").read()
+ new_contents = generate_copyright(
+ COPYRIGHT, lang_type(filename)) + original_contents
+ print('Auto Insert Copyright Header {}'.format(filename))
+ retv = 1
+ with io.open(filename, 'w') as output_file:
+ output_file.write(new_contents)
+
+ return retv
+
+
+if __name__ == '__main__':
+ exit(main())
diff --git a/tools/codestyle/pylint_pre_commit.hook b/tools/codestyle/pylint_pre_commit.hook
new file mode 100644
index 0000000000000000000000000000000000000000..150a3f5666bd39d30b7e6518e58a14fb5fe2f14b
--- /dev/null
+++ b/tools/codestyle/pylint_pre_commit.hook
@@ -0,0 +1,19 @@
+#!/bin/bash
+
+TOTAL_ERRORS=0
+
+
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+export PYTHONPATH=$DIR:$PYTHONPATH
+
+# The trick to remove deleted files: https://stackoverflow.com/a/2413151
+for file in $(git diff --name-status | awk '$1 != "D" {print $2}'); do
+ pylint --disable=all --load-plugins=docstring_checker \
+ --enable=doc-string-one-line,doc-string-end-with,doc-string-with-all-args,doc-string-triple-quotes,doc-string-missing,doc-string-indent-error,doc-string-with-returns,doc-string-with-raises $file;
+ TOTAL_ERRORS=$(expr $TOTAL_ERRORS + $?);
+done
+
+exit $TOTAL_ERRORS
+#For now, just warning:
+#exit 0
+
diff --git a/tools/tools.py b/tools/tools.py
index da34a027c027d6809603869f946499ec45edf8e6..8508a790bc8012954b53cc43b088d2e50655647d 100644
--- a/tools/tools.py
+++ b/tools/tools.py
@@ -1,12 +1,27 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import functools
import os
-import time
+import platform
+import sys
import shutil
+import time
+
import requests
-import sys
import tarfile
import zipfile
-import platform
-import functools
lasttime = time.time()
FLUSH_INTERVAL = 0.1