diff --git a/doc/online_learning.md b/doc/online_learning.md
deleted file mode 100644
index 83bab4bd466331894ec2d82c8c40aded2f90a2df..0000000000000000000000000000000000000000
--- a/doc/online_learning.md
+++ /dev/null
@@ -1,173 +0,0 @@
-# PaddleRec Streaming Training (OnlineLearning): Launch and Configuration Guide
-
-
-## Introduction to streaming training
-In streaming training, data is received and processed in order: each time a sample arrives, the model predicts on it and updates itself, then moves on to the next sample. Scenarios such as news feeds, short videos, and e-commerce add large amounts of new data every day, so the data added each day (each moment) is predicted on, and used to update the model, on top of the previous day's (previous moment's) model.
-
-Large-scale streaming training requires a deep learning framework with the corresponding capabilities, namely:
-* Large-scale distributed training: the data volume is huge, so good distributed training and scaling capabilities are needed to meet training deadlines
-* Extremely large embeddings: support for billions or even hundreds of billions of embedding entries, with efficient parameter export, so that model parameters can be dumped quickly and handed off to other online systems
-* Hash-mapped embedding feature IDs: no ID encoding is required; the table can grow automatically, control feature admission (previously unseen features can be created under suitable conditions), and periodically evict features (clean up expired features under a configurable policy)
-* Finally, a complete streaming-training trainer.py built on the framework, covering the full streaming training workflow
-
-## Training with PaddleRec's built-in online learning
-PaddleRec now implements this streaming training workflow on top of PaddlePaddle's distributed training framework, for your reference and use. An online_training version of ctr-dnn is provided under `models/online_learning` to help you understand and adapt it.
-
-**Note**
-1. Online learning requires the latest Paddle develop build, which you can obtain from https://www.paddlepaddle.org.cn/documentation/docs/zh/install/Tables.html#whl-dev. Uninstall the currently installed PaddlePaddle first, then download the package matching your Python environment.
-2. Streaming training with large-scale sparse features requires minor adjustments to the model, so you will need to modify some code.
-3. Currently only parameter-server distributed training supports streaming training with large-scale sparse features, so at runtime choose parameter-server local training or cluster training.
-
-## Launch methods
-
-### 1. Train a built-in model with its default configuration
-
-With `paddlepaddle` and `paddlerec` installed, a built-in model can be trained with its default configuration using a single command:
-
-```shell
-python -m paddlerec.run -m paddlerec.models.xxx.yyy
-```
-
-Notes:
-1. Make sure you invoke the `python` environment in which paddlerec is installed
-2. `xxx` is one of the model categories under paddlerec.models, e.g. `recall`/`rank`/`rerank`
-3. `yyy` is a model within that category, e.g. under `recall`: `gnn`/`gru4rec`/`ncf`
-
-For example, to launch the default configuration of the `word2vec` model under `recall`:
-
-```shell
-python -m paddlerec.run -m models/recall/word2vec
-```
-
-### 2. Train a built-in model with a customized configuration
-
-If we modify a model's default config.yaml, how do we run the modified model?
-
-- **Without changing the model network**
-
-  Suppose you cloned the paddlerec repository to `/home/PaddleRec` and modified `/home/PaddleRec/models/rank/dnn/config.yaml`; then launch training as follows:
-
-  ```shell
-  python -m paddlerec.run -m /home/PaddleRec/models/rank/dnn/config.yaml
-  ```
-
-  paddlerec runs the network file (model.py) from the directory where the paddlerec package is installed, while the customized `config.yaml` is taken from the specified path.
-
-- **With changes to the model network**
-
-  Suppose you cloned the paddlerec repository to `/home/PaddleRec` and modified `/home/PaddleRec/models/rank/dnn/model.py` as well as `/home/PaddleRec/models/rank/dnn/config.yaml`; first update the `workspace` setting in the `yaml`:
-
-  ```yaml
-  workspace: /home/PaddleRec/models/rank/dnn/
-  ```
-
-  Then run:
-
-  ```shell
-  python -m paddlerec.run -m /home/PaddleRec/models/rank/dnn/config.yaml
-  ```
-
-  paddlerec then runs both the network file (model.py) and the customized configuration file (config.yaml) from the absolute path
-
-
-
-
-## yaml training configuration
-
-### Training-related concepts in the yaml
-
-`config.yaml` has two important logical concepts related to the training flow, `runner` and `phase`:
-
-- **`runner`** : the runner is the training engine. It defines the execution device (cpu, gpu), the execution mode (train, infer, single-node, multi-node, etc.), and runtime hyperparameters such as the number of training epochs and the model save path.
-- **`phase`** : a phase is a stage of training, the concrete work the engine executes: which model file to run and which reader to use.
-
-Each PaddleRec run executes one or more runners, selected by name via `mode`. Each runner can execute one or more `phase`s, so PaddleRec supports launching multi-stage training with a single command.
-
-### Single-node CPU training
-
-Let's define a `runner` for single-node CPU training:
-
-```yaml
-mode: single_cpu_train # run the runner named single_cpu_train
-# mode also accepts multiple runners, e.g. mode: [single_cpu_train, single_cpu_infer]
-
-runner:
-- name: single_cpu_train # define a runner named single_cpu_train
-  class: train # single-node training
-  device: cpu # run on cpu
-  epochs: 10 # number of training epochs
-
-  save_checkpoint_interval: 2 # save a checkpoint every 2 epochs
-  save_inference_interval: 4 # save an inference model every 4 epochs
-  save_checkpoint_path: "increment" # checkpoint save path
-  save_inference_path: "inference" # inference model save path
-  save_inference_feed_varnames: [] # feed variable names of the inference model
-  save_inference_fetch_varnames: [] # fetch variable names of the inference model
-  init_model_path: "" # for a warm start from a saved model, set the initial model path here
-  print_interval: 10 # interval for printing training metrics, in batches
-  phases: [phase_train] # if phases is not specified, all phases run by default
-  # phases also accepts multiple entries, e.g. phases: [phase_train, phase_infer]
-```
-
-Then define the concrete execution content:
-
-```yaml
-phase:
-- name: phase_train # this phase is named phase_train
-  model: "{workspace}/model.py" # the model file is model.py under workspace
-  dataset_name: dataset_train # name of the reader
-
-dataset:
-- name: dataset_train
-  type: DataLoader # read data with DataLoader
-  batch_size: 2
-  data_path: "{workspace}/train_data" # data path
-  sparse_slots: "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26" # slot layout of the sparse inputs
-  dense_slots: "dense_var:13" # dimension of the dense input
-
-```
-
-### Single-node single-GPU training
-
-The execution content and reader are the same as above; only the following needs to change:
-
-```yaml
-mode: single_gpu_train # run the runner named single_gpu_train
-
-runner:
-- name: single_gpu_train # define a runner named single_gpu_train
-  class: train # single-node training
-  device: gpu # run on gpu
-  selected_gpus: "0" # train on the card with id=0 by default
-  epochs: 10 # number of training epochs
-```
-
-### Single-node multi-GPU training
-
-The execution content and reader are the same as above; only the following needs to change:
-
-```yaml
-mode: single_multi_gpu_train # run the runner named single_multi_gpu_train
-
-runner:
-- name: single_multi_gpu_train # define a runner named single_multi_gpu_train
-  class: train # single-node training
-  device: gpu # run on gpu
-  selected_gpus: "0,1,2,3" # train on multiple cards
-  epochs: 10 # number of training epochs
-```
-
-### Locally simulated parameter-server training
-The execution content and reader are the same as above; only the following needs to change:
-
-```yaml
-mode: local_cluster_cpu_train # run the runner named local_cluster_cpu_train
-
-runner:
-- name: local_cluster_cpu_train # define a runner named local_cluster_cpu_train
-  class: local_cluster_train # locally simulated distributed (parameter-server) training
-  device: cpu # run on cpu (later paddle versions will support PS-GPU)
-  worker_num: 1 # (optional) number of worker processes, default 1
-  server_num: 1 # (optional) number of server processes, default 1
-  epochs: 10 # number of training epochs
-```
diff --git a/doc/train.md b/doc/train.md
index 16fad1b23783b5fe0c2a785f5500ba88c42ae356..b275b66f3b0ec7c88424c4c18afd855182136c41 100644
--- a/doc/train.md
+++ b/doc/train.md
@@ -20,7 +20,7 @@ python -m paddlerec.run -m paddlerec.models.xxx.yyy
 For example, to launch the default configuration of the `word2vec` model under `recall`:
 
 ```shell
-python -m paddlerec.run -m models/recall/word2vec
+python -m paddlerec.run -m models/recall/word2vec/config.yaml
 ```
 
 ### 2. Train a built-in model with a customized configuration
diff --git a/models/demo/online_learning/__init__.py b/models/demo/online_learning/__init__.py
deleted file mode 100755
index abf198b97e6e818e1fbe59006f98492640bcee54..0000000000000000000000000000000000000000
--- a/models/demo/online_learning/__init__.py
+++ /dev/null
@@ -1,13 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
diff --git a/models/demo/online_learning/config.yaml b/models/demo/online_learning/config.yaml
deleted file mode 100755
index 03a92a2ecddcd44472165795031901ce3b99e2a2..0000000000000000000000000000000000000000
--- a/models/demo/online_learning/config.yaml
+++ /dev/null
@@ -1,88 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-# workspace
-workspace: "models/demo/online_learning"
-
-# list of dataset
-dataset:
-- name: dataloader_train # name of dataset to distinguish different datasets
-  batch_size: 2
-  type: DataLoader # or QueueDataset
-  data_path: "{workspace}/data/sample_data/train"
-  sparse_slots: "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26"
-  dense_slots: "dense_var:13"
-- name: dataset_train # name of dataset to distinguish different datasets
-  batch_size: 2
-  type: QueueDataset # or DataLoader
-  data_path: "{workspace}/data/sample_data/train"
-  sparse_slots: "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26"
-  dense_slots: "dense_var:13"
-- name: dataset_infer # name
-  batch_size: 2
-  type: DataLoader # or QueueDataset
-  data_path: "{workspace}/data/sample_data/train"
-  sparse_slots: "click 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26"
-  dense_slots: "dense_var:13"
-
-# hyper parameters of user-defined network
-hyper_parameters:
-  # optimizer config
-  optimizer:
-    class: Adam
-    learning_rate: 0.001
-    strategy: async
-  # user-defined pairs
-  sparse_inputs_slots: 27
-  sparse_feature_dim: 9
-  dense_input_dim: 13
-  fc_sizes: [512, 256, 128, 32]
-
-# select runner by name
-mode: [ps_cluster, single_cpu_infer]
-# config of each runner.
-# runner is a kind of paddle training class, which wraps the train/infer process.
-runner:
-- name: single_cpu_infer
-  class: infer
-  # num of epochs
-  epochs: 1
-  # device to run training or infer
-  device: cpu
-  init_model_path: "increment_dnn" # load model path
-  phases: [phase2]
-
-- name: ps_cluster
-  class: cluster_train
-  runner_class_path: "{workspace}/online_learning_runner.py"
-  epochs: 2
-  device: cpu
-  fleet_mode: ps
-  save_checkpoint_interval: 1 # save model interval of epochs
-  save_checkpoint_path: "increment_dnn" # save checkpoint path
-  init_model_path: "" # load model path
-  print_interval: 1
-  phases: [phase1]
-
-# runner will run all the phase in each epoch
-phase:
-- name: phase1
-  model: "{workspace}/model.py" # user-defined model
-  dataset_name: dataloader_train # select dataset by name
-  thread_num: 1
-
-- name: phase2
-  model: "{workspace}/model.py" # user-defined model
-  dataset_name: dataset_infer # select dataset by name
-  thread_num: 1
diff --git a/models/demo/online_learning/data/download.sh b/models/demo/online_learning/data/download.sh
deleted file mode 100644
index 56816f7d6be2227fbddafe49d3e24b1ef585a40c..0000000000000000000000000000000000000000
--- a/models/demo/online_learning/data/download.sh
+++ /dev/null
@@ -1,13 +0,0 @@
-wget --no-check-certificate https://fleet.bj.bcebos.com/ctr_data.tar.gz
-tar -zxvf ctr_data.tar.gz
-mv ./raw_data ./train_data_full
-mkdir train_data && cd train_data
-cp ../train_data_full/part-0 ../train_data_full/part-1 ./ && cd ..
-mv ./test_data ./test_data_full
-mkdir test_data && cd test_data
-cp ../test_data_full/part-220 ./ && cd ..
-echo "Complete data download."
-echo "Full Train data stored in ./train_data_full "
-echo "Full Test data stored in ./test_data_full "
-echo "Rapid Verification train data stored in ./train_data "
-echo "Rapid Verification test data stored in ./test_data "
diff --git a/models/demo/online_learning/data/get_slot_data.py b/models/demo/online_learning/data/get_slot_data.py
deleted file mode 100755
index cacdc279bd9877bfe974e6f093ba912972bee876..0000000000000000000000000000000000000000
--- a/models/demo/online_learning/data/get_slot_data.py
+++ /dev/null
@@ -1,71 +0,0 @@
-# Copyright (c) 2019 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import paddle.fluid.incubate.data_generator as dg
-
-cont_min_ = [0, -3, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
-cont_max_ = [20, 600, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-cont_diff_ = [20, 603, 100, 50, 64000, 500, 100, 50, 500, 10, 10, 10, 50]
-hash_dim_ = 1000001
-continuous_range_ = range(1, 14)
-categorical_range_ = range(14, 40)
-
-
-class CriteoDataset(dg.MultiSlotDataGenerator):
-    """
-    DacDataset: inherits MultiSlotDataGenerator, implements data reading.
-    Help document: http://wiki.baidu.com/pages/viewpage.action?pageId=728820675
-    """
-
-    def generate_sample(self, line):
-        """
-        Read the data line by line and process it as a dictionary
-        """
-
-        def reader():
-            """
-            This function needs to be implemented by the user, based on data format
-            """
-            features = line.rstrip('\n').split('\t')
-            dense_feature = []
-            sparse_feature = []
-            for idx in continuous_range_:
-                if features[idx] == "":
-                    dense_feature.append(0.0)
-                else:
-                    dense_feature.append(
-                        (float(features[idx]) - cont_min_[idx - 1]) /
-                        cont_diff_[idx - 1])
-            for idx in categorical_range_:
-                sparse_feature.append(
-                    [hash(str(idx) + features[idx]) % hash_dim_])
-            label = [int(features[0])]
-            process_line = dense_feature, sparse_feature, label
-            feature_name = ["dense_feature"]
-            for idx in categorical_range_:
-                feature_name.append("C" + str(idx - 13))
-            feature_name.append("label")
-            s = "click:" + str(label[0])
-            for i in dense_feature:
-                s += " dense_feature:" + str(i)
-            for i in range(1, 1 + len(categorical_range_)):
-                s += " " + str(i) + ":" + str(sparse_feature[i - 1][0])
-            print(s.strip())
-            yield None
-
-        return reader
-
-
-d = CriteoDataset()
-d.run_from_stdin()
diff --git a/models/demo/online_learning/data/run.sh b/models/demo/online_learning/data/run.sh
deleted file mode 100644
index f2d1fc9210d65e521cb7fa19cadab3ec95d95a31..0000000000000000000000000000000000000000
--- a/models/demo/online_learning/data/run.sh
+++ /dev/null
@@ -1,25 +0,0 @@
-sh download.sh
-
-mkdir
slot_train_data_full -for i in `ls ./train_data_full` -do - cat train_data_full/$i | python get_slot_data.py > slot_train_data_full/$i -done - -mkdir slot_test_data_full -for i in `ls ./test_data_full` -do - cat test_data_full/$i | python get_slot_data.py > slot_test_data_full/$i -done - -mkdir slot_train_data -for i in `ls ./train_data` -do - cat train_data/$i | python get_slot_data.py > slot_train_data/$i -done - -mkdir slot_test_data -for i in `ls ./test_data` -do - cat test_data/$i | python get_slot_data.py > slot_test_data/$i -done diff --git a/models/demo/online_learning/data/sample_data/train/sample_train.txt b/models/demo/online_learning/data/sample_data/train/sample_train.txt deleted file mode 100644 index b2b2e2022a8601d60b6e47ea9665dcc314bd04b6..0000000000000000000000000000000000000000 --- a/models/demo/online_learning/data/sample_data/train/sample_train.txt +++ /dev/null @@ -1,80 +0,0 @@ -click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.05 dense_feature:0.08 dense_feature:0.207421875 dense_feature:0.028 dense_feature:0.35 dense_feature:0.08 dense_feature:0.082 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.08 1:737395 2:210498 3:903564 4:286224 5:286835 6:906818 7:906116 8:67180 9:27346 10:51086 11:142177 12:95024 13:157883 14:873363 15:600281 16:812592 17:228085 18:35900 19:880474 20:984402 21:100885 22:26235 23:410878 24:798162 25:499868 26:306163 -click:1 dense_feature:0.0 dense_feature:0.932006633499 dense_feature:0.02 dense_feature:0.14 dense_feature:0.0395625 dense_feature:0.328 dense_feature:0.98 dense_feature:0.12 dense_feature:1.886 dense_feature:0.0 dense_feature:1.8 dense_feature:0.0 dense_feature:0.14 1:715353 2:761523 3:432904 4:892267 5:515218 6:948614 7:266726 8:67180 9:27346 10:266081 11:286126 12:789480 13:49621 14:255651 15:47663 16:79797 17:342789 18:616331 19:880474 20:984402 21:242209 22:26235 23:669531 24:26284 25:269955 26:187951 -click:0 dense_feature:0.0 
dense_feature:0.00829187396352 dense_feature:0.08 dense_feature:0.06 dense_feature:0.14125 dense_feature:0.076 dense_feature:0.05 dense_feature:0.22 dense_feature:0.208 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.06 1:737395 2:952384 3:511141 4:271077 5:286835 6:948614 7:903547 8:507110 9:27346 10:56047 11:612953 12:747707 13:977426 14:671506 15:158148 16:833738 17:342789 18:427155 19:880474 20:537425 21:916237 22:26235 23:468277 24:676936 25:751788 26:363967 -click:0 dense_feature:0.0 dense_feature:0.124378109453 dense_feature:0.02 dense_feature:0.04 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.08 dense_feature:0.024 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.04 1:210127 2:286436 3:183920 4:507656 5:286835 6:906818 7:199553 8:67180 9:502607 10:708281 11:809876 12:888238 13:375164 14:202774 15:459895 16:475933 17:555571 18:847163 19:26230 20:26229 21:808836 22:191474 23:410878 24:315120 25:26224 26:26223 -click:0 dense_feature:0.1 dense_feature:0.0149253731343 dense_feature:0.34 dense_feature:0.32 dense_feature:0.016421875 dense_feature:0.098 dense_feature:0.04 dense_feature:0.96 dense_feature:0.202 dense_feature:0.1 dense_feature:0.2 dense_feature:0.0 dense_feature:0.32 1:230803 2:817085 3:539110 4:388629 5:286835 6:948614 7:586040 8:67180 9:27346 10:271155 11:176640 12:827381 13:36881 14:202774 15:397299 16:411672 17:342789 18:474060 19:880474 20:984402 21:216871 22:26235 23:761351 24:787115 25:884722 26:904135 -click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.13 dense_feature:0.04 dense_feature:0.246203125 dense_feature:0.108 dense_feature:0.05 dense_feature:0.04 dense_feature:0.03 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:737395 2:64837 3:259267 4:336976 5:515218 6:154084 7:847938 8:67180 9:27346 10:708281 11:776766 12:964800 13:324323 14:873363 15:212708 16:637238 17:681378 18:895034 19:673458 20:984402 21:18600 22:26235 
23:410878 24:787115 25:884722 26:355412 -click:0 dense_feature:0.0 dense_feature:0.028192371476 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0245625 dense_feature:0.016 dense_feature:0.04 dense_feature:0.12 dense_feature:0.016 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:554760 3:661483 4:263696 5:938478 6:906818 7:786926 8:67180 9:27346 10:245862 11:668197 12:745676 13:432600 14:413795 15:751427 16:272410 17:342789 18:422136 19:26230 20:26229 21:452501 22:26235 23:51381 24:776636 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:1.95 dense_feature:0.28 dense_feature:0.092828125 dense_feature:0.57 dense_feature:0.06 dense_feature:0.4 dense_feature:0.4 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.4 1:371155 2:817085 3:773609 4:555449 5:938478 6:906818 7:166117 8:507110 9:27346 10:545822 11:316654 12:172765 13:989600 14:255651 15:792372 16:606361 17:342789 18:566554 19:880474 20:984402 21:235256 22:191474 23:700326 24:787115 25:884722 26:569095 -click:0 dense_feature:0.0 dense_feature:0.0912106135987 dense_feature:0.01 dense_feature:0.02 dense_feature:0.06625 dense_feature:0.018 dense_feature:0.05 dense_feature:0.06 dense_feature:0.098 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.04 1:230803 2:531472 3:284417 4:661677 5:938478 6:553107 7:21150 8:49466 9:27346 10:526914 11:164508 12:631773 13:882348 14:873363 15:523948 16:687081 17:342789 18:271301 19:26230 20:26229 21:647160 22:26235 23:410878 24:231695 25:26224 26:26223 -click:1 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.01 dense_feature:0.02 dense_feature:0.02153125 dense_feature:0.092 dense_feature:0.05 dense_feature:0.68 dense_feature:0.472 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.02 1:737395 2:532829 3:320762 4:887282 5:286835 6:25207 7:640357 8:67180 9:27346 10:695831 11:739268 12:835325 13:402539 14:873363 15:125813 16:168896 
17:342789 18:374414 19:26230 20:26229 21:850229 22:26235 23:410878 24:480027 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.05 dense_feature:0.04 dense_feature:0.086125 dense_feature:0.098 dense_feature:0.15 dense_feature:0.06 dense_feature:0.228 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.04 1:210127 2:999497 3:646348 4:520638 5:938478 6:906818 7:438398 8:67180 9:27346 10:975902 11:532544 12:708828 13:815045 14:255651 15:896230 16:663630 17:342789 18:820094 19:687226 20:537425 21:481536 22:26235 23:761351 24:888170 25:250729 26:381125 -click:1 dense_feature:0.1 dense_feature:0.00331674958541 dense_feature:0.02 dense_feature:0.02 dense_feature:0.00078125 dense_feature:0.002 dense_feature:0.73 dense_feature:0.08 dense_feature:0.254 dense_feature:0.1 dense_feature:1.4 dense_feature:0.0 dense_feature:0.02 1:715353 2:342833 3:551901 4:73418 5:286835 6:446063 7:219517 8:67180 9:27346 10:668726 11:40711 12:921745 13:361076 14:15048 15:214564 16:400893 17:228085 18:393370 19:26230 20:26229 21:383046 22:26235 23:700326 24:369764 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.142620232172 dense_feature:0.04 dense_feature:0.1 dense_feature:0.08853125 dense_feature:0.028 dense_feature:0.01 dense_feature:0.1 dense_feature:0.028 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.1 1:737395 2:583707 3:519411 4:19103 5:286835 6:906818 7:801403 8:67180 9:27346 10:35743 11:626052 12:142351 13:988058 14:873363 15:617333 16:850339 17:276641 18:696084 19:26230 20:26229 21:121620 22:191474 23:468277 24:18340 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00995024875622 dense_feature:0.0 dense_feature:0.22 dense_feature:0.0251875 dense_feature:0.0 dense_feature:0.0 dense_feature:0.8 dense_feature:0.182 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.84 1:737395 2:19359 3:166075 4:381832 5:286835 6:446063 7:816009 8:67180 9:27346 10:708281 11:619790 
12:524128 13:826787 14:202774 15:371495 16:392894 17:644532 18:271180 19:26230 20:26229 21:349978 22:26235 23:761351 24:517170 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.0149253731343 dense_feature:0.52 dense_feature:0.1 dense_feature:6.25153125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.3 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.1 1:230803 2:24784 3:519411 4:19103 5:843054 6:948614 7:529143 8:67180 9:502607 10:708281 11:430027 12:142351 13:529101 14:202774 15:618316 16:850339 17:644532 18:95370 19:880474 20:31181 21:121620 22:26235 23:744389 24:18340 25:269955 26:683431 -click:0 dense_feature:0.0 dense_feature:0.0480928689884 dense_feature:0.12 dense_feature:0.22 dense_feature:0.541703125 dense_feature:1.062 dense_feature:0.01 dense_feature:0.24 dense_feature:0.054 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:737395 2:378661 3:21539 4:552097 5:286835 6:553107 7:512138 8:67180 9:27346 10:708281 11:91094 12:516991 13:150114 14:873363 15:450569 16:353024 17:228085 18:539379 19:26230 20:26229 21:410733 22:26235 23:700326 24:272703 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.016583747927 dense_feature:0.06 dense_feature:0.0 dense_feature:0.209625 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.09 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:750131 3:807749 4:905739 5:286835 6:906818 7:11935 8:67180 9:27346 10:708281 11:505199 12:285350 13:724106 14:255651 15:625913 16:511836 17:644532 18:102288 19:26230 20:26229 21:726818 22:179327 23:744389 24:176417 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.05 dense_feature:0.14 dense_feature:0.226703125 dense_feature:0.12 dense_feature:0.05 dense_feature:0.14 dense_feature:0.112 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.14 1:736218 2:690313 3:757279 4:763330 5:286835 6:553107 
7:89560 8:642551 9:27346 10:128328 11:281593 12:246510 13:200341 14:255651 15:899145 16:807138 17:342789 18:659853 19:26230 20:26229 21:399608 22:26235 23:669531 24:787115 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.3 dense_feature:0.2 dense_feature:0.021296875 dense_feature:0.83 dense_feature:0.2 dense_feature:0.56 dense_feature:1.122 dense_feature:0.0 dense_feature:0.5 dense_feature:0.0 dense_feature:0.2 1:715353 2:283434 3:523722 4:590869 5:286835 6:948614 7:25472 8:67180 9:27346 10:340404 11:811342 12:679454 13:897590 14:813514 15:578769 16:962576 17:342789 18:267210 19:310188 20:537425 21:746185 22:179327 23:761351 24:416923 25:253255 26:249672 -click:1 dense_feature:0.05 dense_feature:0.0149253731343 dense_feature:0.03 dense_feature:0.24 dense_feature:0.0 dense_feature:0.008 dense_feature:0.4 dense_feature:0.62 dense_feature:0.82 dense_feature:0.1 dense_feature:1.4 dense_feature:0.0 dense_feature:0.08 1:715353 2:532829 3:716475 4:940968 5:286835 6:948614 7:38171 8:67180 9:27346 10:619455 11:515541 12:779426 13:711791 14:255651 15:881750 16:408550 17:342789 18:612540 19:26230 20:26229 21:23444 22:26235 23:410878 24:88425 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.11 dense_feature:0.08 dense_feature:0.135265625 dense_feature:0.426 dense_feature:0.06 dense_feature:0.06 dense_feature:0.42 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.08 1:737395 2:817085 3:506158 4:48876 5:286835 6:948614 7:95506 8:67180 9:27346 10:75825 11:220591 12:613471 13:159874 14:255651 15:121379 16:889290 17:681378 18:532453 19:880474 20:537425 21:717912 22:26235 23:270873 24:450199 25:884722 26:382723 -click:0 dense_feature:0.0 dense_feature:0.0829187396352 dense_feature:0.0 dense_feature:0.0 dense_feature:0.555859375 dense_feature:0.318 dense_feature:0.03 dense_feature:0.0 dense_feature:0.02 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 
1:715353 2:465222 3:974451 4:892661 5:938478 6:948614 7:651987 8:67180 9:27346 10:708281 11:229311 12:545057 13:875629 14:149134 15:393524 16:213237 17:681378 18:540092 19:26230 20:26229 21:483290 22:26235 23:700326 24:946673 25:26224 26:26223 -click:1 dense_feature:0.05 dense_feature:0.854063018242 dense_feature:0.01 dense_feature:0.04 dense_feature:0.000171875 dense_feature:0.004 dense_feature:0.01 dense_feature:0.04 dense_feature:0.004 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:737395 2:99294 3:681584 4:398205 5:914075 6:906818 7:620358 8:67180 9:27346 10:147441 11:364583 12:535262 13:516341 14:813514 15:281303 16:714384 17:276641 18:443922 19:26230 20:26229 21:948746 22:26235 23:700326 24:928903 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.45190625 dense_feature:0.048 dense_feature:0.01 dense_feature:0.16 dense_feature:0.044 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:792512 3:676584 4:995262 5:938478 6:906818 7:888723 8:67180 9:27346 10:708281 11:310529 12:951172 13:885793 14:873363 15:62698 16:672021 17:276641 18:11502 19:880474 20:984402 21:501083 22:191474 23:744389 24:398029 25:218743 26:991064 -click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.51 dense_feature:0.0 dense_feature:0.2689375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.02 dense_feature:0.006 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:230803 2:239052 3:323170 4:474182 5:140103 6:553107 7:757837 8:524745 9:27346 10:743444 11:883533 12:123023 13:621127 14:255651 15:570872 16:883618 17:924903 18:984920 19:964183 20:984402 21:260134 22:179327 23:410878 24:787860 25:269955 26:949924 -click:0 dense_feature:0.0 dense_feature:0.273631840796 dense_feature:0.0 dense_feature:0.0 dense_feature:0.066453125 dense_feature:0.052 dense_feature:0.04 dense_feature:0.06 dense_feature:0.01 
dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:531472 3:747313 4:362684 5:843054 6:553107 7:863980 8:718499 9:27346 10:881217 11:371751 12:168971 13:290788 14:202774 15:316669 16:269663 17:342789 18:136775 19:26230 20:26229 21:76865 22:26235 23:761351 24:441421 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.116086235489 dense_feature:0.43 dense_feature:0.36 dense_feature:0.000953125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.36 dense_feature:0.036 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.36 1:737395 2:24784 3:677469 4:820784 5:286835 6:553107 7:715520 8:718499 9:27346 10:708281 11:670424 12:122926 13:724619 14:873363 15:845517 16:488791 17:644532 18:183573 19:880474 20:31181 21:46761 22:26235 23:700326 24:629361 25:269955 26:862373 -click:0 dense_feature:2.55 dense_feature:0.0348258706468 dense_feature:0.01 dense_feature:0.38 dense_feature:0.001453125 dense_feature:0.046 dense_feature:1.11 dense_feature:0.44 dense_feature:2.312 dense_feature:0.2 dense_feature:1.1 dense_feature:0.0 dense_feature:0.46 1:594517 2:194636 3:496284 4:323209 5:286835 6:553107 7:259696 8:760861 9:27346 10:698046 11:478868 12:576074 13:635369 14:201966 15:926692 16:972906 17:342789 18:409802 19:26230 20:26229 21:395694 22:26235 23:410878 24:844671 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.144278606965 dense_feature:0.43 dense_feature:0.22 dense_feature:0.00309375 dense_feature:0.15 dense_feature:0.14 dense_feature:0.54 dense_feature:0.152 dense_feature:0.0 dense_feature:0.2 dense_feature:0.1 dense_feature:0.22 1:737395 2:239052 3:456744 4:736474 5:286835 6:948614 7:13277 8:67180 9:27346 10:958384 11:778183 12:497627 13:136915 14:201966 15:757961 16:747483 17:228085 18:984920 19:905920 20:537425 21:472149 22:179327 23:410878 24:709155 25:269955 26:618673 -click:0 dense_feature:0.0 dense_feature:0.0132669983416 dense_feature:0.4 dense_feature:0.3 dense_feature:0.36440625 
dense_feature:1.492 dense_feature:0.07 dense_feature:0.3 dense_feature:1.048 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.3 1:737395 2:19959 3:661391 4:748753 5:286835 6:948614 7:848540 8:67180 9:27346 10:708281 11:703964 12:72024 13:336272 14:255651 15:835686 16:703858 17:342789 18:274368 19:26230 20:26229 21:765452 22:26235 23:700326 24:815200 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.0116086235489 dense_feature:0.01 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:210127 2:662691 3:334228 4:857003 5:286835 6:25207 7:280499 8:67180 9:502607 10:708281 11:195094 12:870026 13:783566 14:873363 15:139595 16:214259 17:555571 18:208248 19:880474 20:984402 21:471770 22:26235 23:744389 24:507551 25:383787 26:797121 -click:1 dense_feature:0.0 dense_feature:0.0348258706468 dense_feature:0.03 dense_feature:0.02 dense_feature:0.066140625 dense_feature:0.006 dense_feature:0.17 dense_feature:0.02 dense_feature:0.236 dense_feature:0.0 dense_feature:0.5 dense_feature:0.0 dense_feature:0.02 1:230803 2:999497 3:25361 4:892267 5:286835 6:906818 7:356528 8:67180 9:27346 10:5856 11:157692 12:554754 13:442501 14:255651 15:896230 16:248781 17:342789 18:820094 19:905920 20:984402 21:916436 22:26235 23:669531 24:26284 25:884722 26:187951 -click:0 dense_feature:0.0 dense_feature:4.62852404643 dense_feature:0.07 dense_feature:0.0 dense_feature:0.022671875 dense_feature:0.0 dense_feature:0.01 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:624252 2:344887 3:238747 4:308366 5:286835 6:553107 7:69291 8:67180 9:27346 10:781054 11:258240 12:546906 13:772337 14:873363 15:807640 16:525695 17:276641 18:613203 19:438655 20:984402 21:415123 22:191474 23:700326 24:729290 25:218743 26:953507 -click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.06 
dense_feature:0.02 dense_feature:0.06878125 dense_feature:0.044 dense_feature:0.01 dense_feature:0.22 dense_feature:0.044 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:737395 2:7753 3:871178 4:183530 5:286835 6:906818 7:273988 8:507110 9:27346 10:708281 11:942072 12:775997 13:612590 14:873363 15:669921 16:639940 17:681378 18:421122 19:880474 20:984402 21:410471 22:26235 23:410878 24:228420 25:269955 26:616000 -click:0 dense_feature:0.0 dense_feature:0.212271973466 dense_feature:0.02 dense_feature:0.28 dense_feature:0.113421875 dense_feature:0.06 dense_feature:0.02 dense_feature:0.28 dense_feature:0.194 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.28 1:210127 2:228963 3:692240 4:389834 5:938478 6:948614 7:125690 8:507110 9:27346 10:708281 11:549232 12:308284 13:262461 14:255651 15:629185 16:280660 17:276641 18:886164 19:26230 20:26229 21:367919 22:191474 23:700326 24:520083 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:1.01658374793 dense_feature:0.01 dense_feature:0.02 dense_feature:0.11759375 dense_feature:0.08 dense_feature:0.02 dense_feature:0.02 dense_feature:0.024 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:230803 2:7753 3:194720 4:831884 5:286835 6:553107 7:620358 8:67180 9:27346 10:843010 11:424144 12:615986 13:516341 14:813514 15:782575 16:775856 17:342789 18:421122 19:880474 20:984402 21:110090 22:191474 23:700326 24:784174 25:269955 26:101161 -click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.59 dense_feature:0.06 dense_feature:0.04321875 dense_feature:0.192 dense_feature:0.02 dense_feature:0.08 dense_feature:0.014 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.06 1:230803 2:532829 3:26258 4:853241 5:938478 6:948614 7:877607 8:67180 9:27346 10:613723 11:246387 12:538673 13:377975 14:873363 15:659013 16:601478 17:681378 18:199271 19:26230 20:26229 21:300137 22:26235 23:410878 24:372458 25:26224 26:26223 -click:0 
dense_feature:0.0 dense_feature:0.06135986733 dense_feature:0.0 dense_feature:0.0 dense_feature:0.294671875 dense_feature:0.212 dense_feature:0.26 dense_feature:0.0 dense_feature:0.034 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:154478 3:982044 4:501457 5:819883 6:906818 7:445051 8:67180 9:27346 10:976970 11:783630 12:609883 13:358461 14:15048 15:409791 16:756307 17:342789 18:480228 19:26230 20:26229 21:845147 22:26235 23:669531 24:124290 25:26224 26:26223 -click:1 dense_feature:0.05 dense_feature:0.537313432836 dense_feature:0.0 dense_feature:0.02 dense_feature:0.018578125 dense_feature:0.016 dense_feature:0.16 dense_feature:0.22 dense_feature:0.192 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.02 1:737395 2:194636 3:274597 4:418981 5:286835 6:553107 7:553528 8:67180 9:27346 10:901359 11:110700 12:108037 13:915461 14:255651 15:951604 16:421384 17:342789 18:728110 19:26230 20:26229 21:772733 22:191474 23:761351 24:844671 25:26224 26:26223 -click:0 dense_feature:0.1 dense_feature:0.00663349917081 dense_feature:0.16 dense_feature:0.26 dense_feature:0.00509375 dense_feature:0.122 dense_feature:0.03 dense_feature:0.94 dense_feature:0.526 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:1.1 1:210127 2:344887 3:343793 4:917598 5:286835 6:948614 7:220413 8:67180 9:27346 10:912799 11:370606 12:722621 13:569604 14:255651 15:499545 16:159495 17:342789 18:613203 19:305384 20:984402 21:844602 22:26235 23:410878 24:695516 25:218743 26:729263 -click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.09 dense_feature:0.16 dense_feature:0.11221875 dense_feature:0.51 dense_feature:0.09 dense_feature:0.48 dense_feature:0.088 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.16 1:737395 2:532829 3:579624 4:980109 5:286835 6:948614 7:927736 8:67180 9:27346 10:970644 11:931289 12:377125 13:539272 14:873363 15:555779 16:405069 17:342789 18:701770 19:26230 20:26229 
21:201088 22:26235 23:410878 24:113994 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.182421227197 dense_feature:0.01 dense_feature:0.02 dense_feature:0.000109375 dense_feature:0.978 dense_feature:0.01 dense_feature:0.02 dense_feature:0.062 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:478318 2:158107 3:508317 4:452336 5:286835 6:948614 7:620358 8:67180 9:27346 10:147441 11:364583 12:34025 13:516341 14:873363 15:502825 16:683439 17:681378 18:889198 19:26230 20:26229 21:234451 22:26235 23:700326 24:256238 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.469320066335 dense_feature:0.2 dense_feature:0.2 dense_feature:0.0705 dense_feature:0.102 dense_feature:0.05 dense_feature:0.22 dense_feature:0.194 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.2 1:715353 2:846239 3:573061 4:508181 5:286835 6:553107 7:892443 8:718499 9:27346 10:639370 11:866496 12:791636 13:895012 14:873363 15:362079 16:16082 17:228085 18:994402 19:880474 20:984402 21:35513 22:26235 23:669531 24:520197 25:934391 26:625657 -click:0 dense_feature:0.0 dense_feature:0.0729684908789 dense_feature:0.06 dense_feature:0.04 dense_feature:5.620296875 dense_feature:0.0 dense_feature:0.0 dense_feature:0.06 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.04 1:399845 2:239052 3:334610 4:593315 5:286835 6:948614 7:751495 8:67180 9:502607 10:111048 11:244081 12:115252 13:915518 14:873363 15:817451 16:296052 17:276641 18:984920 19:774721 20:984402 21:930636 22:26235 23:700326 24:975048 25:269955 26:266439 -click:1 dense_feature:0.05 dense_feature:0.0265339966833 dense_feature:0.07 dense_feature:0.22 dense_feature:1.5625e-05 dense_feature:0.008 dense_feature:0.04 dense_feature:0.36 dense_feature:0.088 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.08 1:737395 2:64837 3:534435 4:555449 5:286835 6:25207 7:661236 8:67180 9:27346 10:708281 11:785752 12:47348 13:524553 14:117289 
15:776971 16:293528 17:681378 18:102169 19:758208 20:31181 21:27506 22:26235 23:410878 24:787115 25:884722 26:605635 -click:1 dense_feature:0.1 dense_feature:0.0464344941957 dense_feature:0.0 dense_feature:0.04 dense_feature:0.00059375 dense_feature:0.004 dense_feature:0.02 dense_feature:0.04 dense_feature:0.004 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:230803 2:7753 3:529866 4:437169 5:938478 6:948614 7:17274 8:67180 9:27346 10:461781 11:452641 12:302471 13:49621 14:873363 15:543432 16:858509 17:681378 18:402164 19:880474 20:984402 21:650184 22:191474 23:410878 24:492581 25:269955 26:217228 -click:0 dense_feature:0.55 dense_feature:0.00829187396352 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0014375 dense_feature:0.004 dense_feature:0.36 dense_feature:0.0 dense_feature:0.042 dense_feature:0.1 dense_feature:0.4 dense_feature:0.0 dense_feature:0.0 1:26973 2:817085 3:961160 4:355882 5:843054 6:906818 7:417593 8:67180 9:27346 10:708281 11:402889 12:899379 13:552051 14:202774 15:532679 16:545549 17:342789 18:562805 19:880474 20:31181 21:355920 22:26235 23:700326 24:787115 25:884722 26:115004 -click:1 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.01 dense_feature:0.02 dense_feature:0.089296875 dense_feature:0.362 dense_feature:0.23 dense_feature:0.04 dense_feature:0.338 dense_feature:0.0 dense_feature:0.4 dense_feature:0.0 dense_feature:0.02 1:230803 2:977337 3:853759 4:880273 5:515218 6:25207 7:414263 8:437731 9:27346 10:205124 11:108170 12:676869 13:388798 14:255651 15:247232 16:172895 17:228085 18:543219 19:26230 20:26229 21:860937 22:179327 23:669531 24:959959 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.0945273631841 dense_feature:0.62 dense_feature:0.24 dense_feature:0.11840625 dense_feature:0.368 dense_feature:0.07 dense_feature:0.24 dense_feature:0.144 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.48 1:737395 2:532829 3:805087 4:186661 5:286835 6:154084 7:468059 
8:718499 9:27346 10:708281 11:968875 12:8177 13:47822 14:255651 15:979316 16:956543 17:342789 18:541633 19:26230 20:26229 21:646669 22:26235 23:410878 24:184909 25:26224 26:26223 -click:0 dense_feature:0.3 dense_feature:0.00497512437811 dense_feature:0.12 dense_feature:0.12 dense_feature:0.002890625 dense_feature:0.074 dense_feature:0.06 dense_feature:0.14 dense_feature:0.074 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.74 1:737395 2:64837 3:967865 4:249418 5:938478 6:948614 7:228716 8:67180 9:27346 10:627362 11:722606 12:193782 13:348283 14:255651 15:928582 16:221557 17:342789 18:895034 19:384556 20:984402 21:475712 22:26235 23:410878 24:492875 25:884722 26:468964 -click:0 dense_feature:0.0 dense_feature:0.177446102819 dense_feature:0.01 dense_feature:0.02 dense_feature:0.041859375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.16 dense_feature:0.036 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.02 1:154064 2:834620 3:25206 4:25205 5:938478 6:948614 7:134101 8:92608 9:27346 10:708281 11:505199 12:25711 13:724106 14:671506 15:42927 16:25723 17:644532 18:1957 19:26230 20:26229 21:26236 22:26235 23:744389 24:26233 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:5.61691542289 dense_feature:0.0 dense_feature:0.1 dense_feature:0.043796875 dense_feature:0.302 dense_feature:0.13 dense_feature:0.22 dense_feature:0.3 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:154184 2:19359 3:166075 4:381832 5:286835 6:906818 7:348227 8:49466 9:27346 10:645596 11:951584 12:524128 13:277250 14:255651 15:853732 16:392894 17:342789 18:619939 19:26230 20:26229 21:349978 22:26235 23:700326 24:517170 25:26224 26:26223 -click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.093234375 dense_feature:0.022 dense_feature:0.04 dense_feature:0.02 dense_feature:0.02 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.0 1:715353 2:485136 
3:386313 4:208181 5:286835 6:25207 7:227715 8:49466 9:27346 10:437476 11:733250 12:721260 13:389832 14:255651 15:47178 16:761962 17:342789 18:813169 19:26230 20:26229 21:464938 22:26235 23:410878 24:833196 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.134328358209 dense_feature:0.0 dense_feature:0.14 dense_feature:0.00015625 dense_feature:0.0 dense_feature:0.0 dense_feature:0.14 dense_feature:0.014 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.14 1:737395 2:488655 3:221719 4:442408 5:286835 6:25207 7:898902 8:718499 9:27346 10:457066 11:290973 12:533168 13:949027 14:873363 15:270294 16:934635 17:924903 18:763017 19:880474 20:31181 21:517486 22:26235 23:410878 24:588215 25:499868 26:980179 -click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.023578125 dense_feature:0.0 dense_feature:0.04 dense_feature:0.0 dense_feature:0.046 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.0 1:737395 2:729012 3:691820 4:351286 5:938478 6:553107 7:21150 8:67180 9:27346 10:947459 11:164508 12:205079 13:882348 14:255651 15:178324 16:282716 17:342789 18:193902 19:880474 20:31181 21:604480 22:191474 23:669531 24:727223 25:499868 26:236426 -click:1 dense_feature:0.1 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.00859375 dense_feature:0.006 dense_feature:1.55 dense_feature:0.16 dense_feature:0.06 dense_feature:0.2 dense_feature:1.6 dense_feature:0.0 dense_feature:0.0 1:712372 2:235347 3:483718 4:382039 5:914075 6:906818 7:727609 8:154004 9:27346 10:116648 11:40711 12:658199 13:361076 14:15048 15:15058 16:644988 17:342789 18:544170 19:26230 20:26229 21:251535 22:26235 23:700326 24:114111 25:26224 26:26223 -click:1 dense_feature:0.25 dense_feature:0.192371475954 dense_feature:0.06 dense_feature:0.36 dense_feature:0.0 dense_feature:0.02 dense_feature:0.09 dense_feature:0.42 dense_feature:0.042 dense_feature:0.2 dense_feature:0.3 
dense_feature:0.3 dense_feature:0.0 1:737395 2:288975 3:885137 4:368487 5:515218 6:906818 7:569753 8:799133 9:27346 10:635043 11:883202 12:780104 13:492605 14:873363 15:234451 16:94894 17:796504 18:653705 19:880474 20:984402 21:400692 22:26235 23:410878 24:767424 25:934391 26:958132 -click:1 dense_feature:0.15 dense_feature:0.0398009950249 dense_feature:0.02 dense_feature:0.04 dense_feature:1.5625e-05 dense_feature:0.0 dense_feature:0.06 dense_feature:0.04 dense_feature:0.026 dense_feature:0.1 dense_feature:0.3 dense_feature:0.0 dense_feature:0.0 1:715353 2:532829 3:721632 4:377785 5:286835 6:553107 7:959856 8:718499 9:27346 10:737746 11:432444 12:706936 13:169268 14:873363 15:896219 16:461005 17:342789 18:286597 19:26230 20:26229 21:602049 22:26235 23:700326 24:510447 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.05 dense_feature:0.08 dense_feature:0.155421875 dense_feature:0.55 dense_feature:0.08 dense_feature:0.24 dense_feature:1.73 dense_feature:0.0 dense_feature:0.3 dense_feature:0.0 dense_feature:0.08 1:737395 2:288975 3:385122 4:57409 5:286835 6:25207 7:339181 8:67180 9:27346 10:284863 11:531306 12:229544 13:32168 14:117289 15:632422 16:615549 17:342789 18:240865 19:880474 20:984402 21:253725 22:26235 23:410878 24:837371 25:934391 26:948190 -click:0 dense_feature:0.0 dense_feature:0.0398009950249 dense_feature:0.06 dense_feature:0.12 dense_feature:0.11359375 dense_feature:0.55 dense_feature:0.03 dense_feature:0.12 dense_feature:0.186 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.12 1:737395 2:158107 3:738359 4:343895 5:286835 6:948614 7:513189 8:760861 9:27346 10:741641 11:214926 12:142871 13:753229 14:873363 15:502825 16:864586 17:681378 18:889198 19:26230 20:26229 21:368414 22:191474 23:410878 24:256238 25:26224 26:26223 -click:1 dense_feature:0.25 dense_feature:0.00663349917081 dense_feature:0.03 dense_feature:0.04 dense_feature:7.8125e-05 dense_feature:0.0 dense_feature:0.48 
dense_feature:0.06 dense_feature:0.004 dense_feature:0.2 dense_feature:1.3 dense_feature:0.0 dense_feature:0.0 1:737395 2:414770 3:100889 4:981572 5:286835 6:446063 7:600430 8:507110 9:27346 10:566014 11:40711 12:330691 13:361076 14:15048 15:176957 16:759140 17:342789 18:212244 19:26230 20:26229 21:688637 22:26235 23:634287 24:762432 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.04 dense_feature:0.02 dense_feature:0.109765625 dense_feature:0.202 dense_feature:0.13 dense_feature:0.02 dense_feature:0.078 dense_feature:0.0 dense_feature:0.1 dense_feature:0.1 dense_feature:0.02 1:737395 2:7753 3:871178 4:183530 5:286835 6:948614 7:358953 8:718499 9:27346 10:837400 11:432444 12:775997 13:169268 14:255651 15:250644 16:639940 17:342789 18:421122 19:880474 20:984402 21:410471 22:26235 23:410878 24:228420 25:269955 26:870795 -click:0 dense_feature:0.05 dense_feature:0.162520729685 dense_feature:0.28 dense_feature:0.16 dense_feature:0.001046875 dense_feature:0.028 dense_feature:1.03 dense_feature:0.84 dense_feature:0.534 dense_feature:0.1 dense_feature:2.3 dense_feature:0.0 dense_feature:0.28 1:737395 2:334074 3:108983 4:898979 5:286835 6:948614 7:600430 8:718499 9:27346 10:668726 11:40711 12:62821 13:361076 14:202774 15:722413 16:688170 17:342789 18:746785 19:957809 20:984402 21:96056 22:191474 23:410878 24:703372 25:129305 26:591537 -click:0 dense_feature:0.2 dense_feature:0.0945273631841 dense_feature:0.02 dense_feature:0.18 dense_feature:0.021078125 dense_feature:0.046 dense_feature:0.52 dense_feature:0.44 dense_feature:0.18 dense_feature:0.1 dense_feature:0.8 dense_feature:0.0 dense_feature:0.22 1:663372 2:532829 3:714247 4:673800 5:286835 6:906818 7:219517 8:67180 9:27346 10:161916 11:40711 12:441505 13:361076 14:255651 15:992961 16:137571 17:796504 18:395194 19:26230 20:26229 21:800938 22:179327 23:410878 24:719782 25:26224 26:26223 -click:1 dense_feature:0.15 dense_feature:0.24543946932 dense_feature:0.0 
dense_feature:0.12 dense_feature:0.0001875 dense_feature:0.004 dense_feature:0.08 dense_feature:0.12 dense_feature:0.072 dense_feature:0.1 dense_feature:0.4 dense_feature:0.0 dense_feature:0.04 1:663372 2:70321 3:202829 4:415480 5:286835 6:553107 7:32934 8:67180 9:27346 10:1873 11:699999 12:55775 13:371214 14:873363 15:685332 16:719499 17:342789 18:135819 19:26230 20:26229 21:973542 22:852086 23:410878 24:635223 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.0679933665008 dense_feature:0.02 dense_feature:0.02 dense_feature:0.20015625 dense_feature:0.016 dense_feature:0.03 dense_feature:0.02 dense_feature:0.014 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.02 1:737395 2:229199 3:956202 4:475901 5:286835 6:948614 7:614385 8:718499 9:27346 10:171202 11:670646 12:566018 13:386065 14:873363 15:936716 16:825279 17:681378 18:758631 19:26230 20:26229 21:113534 22:26235 23:410878 24:551443 25:26224 26:26223 -click:1 dense_feature:0.05 dense_feature:0.00497512437811 dense_feature:0.04 dense_feature:0.22 dense_feature:0.015921875 dense_feature:0.022 dense_feature:0.04 dense_feature:0.4 dense_feature:0.182 dense_feature:0.1 dense_feature:0.2 dense_feature:0.0 dense_feature:0.22 1:737395 2:64837 3:751736 4:291977 5:286835 6:25207 7:377931 8:718499 9:27346 10:724396 11:433484 12:517940 13:439712 14:201966 15:628624 16:780717 17:342789 18:895034 19:880474 20:31181 21:463725 22:26235 23:410878 24:787115 25:884722 26:164940 -click:1 dense_feature:0.0 dense_feature:0.00995024875622 dense_feature:0.15 dense_feature:0.48 dense_feature:0.051375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.06 dense_feature:0.556 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.5 1:737395 2:532829 3:158777 4:112926 5:286835 6:948614 7:764249 8:67180 9:27346 10:795273 11:330644 12:524443 13:78129 14:873363 15:127209 16:146094 17:342789 18:976129 19:26230 20:26229 21:901094 22:26235 23:410878 24:259263 25:26224 26:26223 -click:1 
dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:1.75 dense_feature:0.0 dense_feature:0.922828125 dense_feature:1.078 dense_feature:0.0 dense_feature:0.0 dense_feature:0.112 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:26973 2:62956 3:428206 4:935291 5:286835 6:446063 7:360307 8:437731 9:502607 10:957425 11:626052 12:641189 13:988058 14:217110 15:637914 16:293992 17:342789 18:832710 19:774721 20:537425 21:516798 22:191474 23:700326 24:204648 25:884722 26:776972 -click:1 dense_feature:1.95 dense_feature:0.00829187396352 dense_feature:0.08 dense_feature:0.1 dense_feature:0.01878125 dense_feature:0.044 dense_feature:0.42 dense_feature:0.24 dense_feature:0.358 dense_feature:0.1 dense_feature:0.2 dense_feature:0.1 dense_feature:0.26 1:737395 2:638265 3:526671 4:362576 5:938478 6:948614 7:999918 8:67180 9:27346 10:806276 11:181589 12:688684 13:367155 14:255651 15:709602 16:386859 17:228085 18:204112 19:668832 20:537425 21:541553 22:191474 23:410878 24:606704 25:49230 26:68113 -click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.38159375 dense_feature:0.022 dense_feature:0.18 dense_feature:0.0 dense_feature:0.016 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:737395 2:841163 3:284187 4:385559 5:286835 6:446063 7:311604 8:67180 9:27346 10:38910 11:76230 12:520869 13:429321 14:255651 15:296507 16:542357 17:342789 18:377250 19:880474 20:31181 21:325494 22:26235 23:410878 24:26284 25:499868 26:467348 -click:0 dense_feature:0.0 dense_feature:0.00663349917081 dense_feature:0.08 dense_feature:0.0 dense_feature:0.077125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.03 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:238813 3:821667 4:209184 5:286835 6:906818 7:261420 8:67180 9:27346 10:748867 11:277196 12:790086 13:495408 14:873363 15:572266 16:281532 17:342789 18:99340 19:880474 20:537425 
21:815896 22:26235 23:669531 24:17430 25:734238 26:251811 -click:0 dense_feature:0.0 dense_feature:0.210613598673 dense_feature:0.01 dense_feature:0.0 dense_feature:0.041375 dense_feature:0.0 dense_feature:0.0 dense_feature:0.08 dense_feature:0.026 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 1:737395 2:532829 3:559456 4:565823 5:286835 6:948614 7:48897 8:67180 9:27346 10:708281 11:214000 12:431427 13:477774 14:873363 15:637383 16:678446 17:276641 18:849284 19:26230 20:26229 21:758879 22:26235 23:410878 24:399458 25:26224 26:26223 -click:1 dense_feature:0.2 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.00440625 dense_feature:0.036 dense_feature:0.04 dense_feature:0.3 dense_feature:0.03 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:715353 2:532829 3:967094 4:707735 5:286835 6:948614 7:555710 8:154004 9:27346 10:708281 11:514992 12:158604 13:780149 14:255651 15:285282 16:149708 17:342789 18:553067 19:26230 20:26229 21:229985 22:26235 23:700326 24:777746 25:26224 26:26223 -click:1 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.23178125 dense_feature:0.222 dense_feature:0.06 dense_feature:0.0 dense_feature:0.408 dense_feature:0.0 dense_feature:0.2 dense_feature:0.0 dense_feature:0.0 1:715353 2:227084 3:456811 4:828682 5:286835 6:948614 7:406567 8:67180 9:27346 10:66123 11:598531 12:527138 13:731439 14:813514 15:35257 16:43339 17:342789 18:918487 19:26230 20:26229 21:580653 22:26235 23:410878 24:495283 25:26224 26:26223 -click:0 dense_feature:0.15 dense_feature:0.462686567164 dense_feature:0.08 dense_feature:0.22 dense_feature:0.00015625 dense_feature:0.022 dense_feature:0.03 dense_feature:0.52 dense_feature:0.022 dense_feature:0.1 dense_feature:0.1 dense_feature:0.0 dense_feature:0.22 1:576931 2:99294 3:263211 4:501662 5:938478 6:154084 7:128918 8:67180 9:27346 10:912799 11:801006 12:506258 13:378182 14:201966 15:150934 
16:240427 17:681378 18:393279 19:26230 20:26229 21:152038 22:26235 23:700326 24:551443 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00331674958541 dense_feature:0.0 dense_feature:0.0 dense_feature:0.181484375 dense_feature:0.06 dense_feature:0.01 dense_feature:0.0 dense_feature:0.056 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.0 1:230803 2:283434 3:367596 4:197992 5:938478 6:948614 7:268098 8:67180 9:27346 10:870993 11:632267 12:139817 13:718764 14:255651 15:884839 16:80117 17:276641 18:556463 19:880474 20:537425 21:271358 22:26235 23:410878 24:488077 25:253255 26:584828 -click:0 dense_feature:0.0 dense_feature:0.00497512437811 dense_feature:0.0 dense_feature:0.16 dense_feature:4.790078125 dense_feature:0.0 dense_feature:0.0 dense_feature:0.28 dense_feature:0.016 dense_feature:0.0 dense_feature:0.0 dense_feature:0.0 dense_feature:0.2 1:737395 2:532829 3:158777 4:112926 5:286835 6:948614 7:277312 8:67180 9:502607 10:708281 11:755513 12:524443 13:4029 14:873363 15:503814 16:146094 17:644532 18:121590 19:26230 20:26229 21:901094 22:191474 23:744389 24:259263 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:3.30845771144 dense_feature:0.0 dense_feature:0.04 dense_feature:0.022671875 dense_feature:0.062 dense_feature:0.01 dense_feature:0.4 dense_feature:0.062 dense_feature:0.0 dense_feature:0.1 dense_feature:0.0 dense_feature:0.04 1:663372 2:529436 3:511823 4:942782 5:286835 6:906818 7:190054 8:67180 9:27346 10:708281 11:32527 12:494263 13:652478 14:873363 15:616057 16:17325 17:342789 18:325238 19:26230 20:26229 21:256747 22:179327 23:410878 24:169709 25:26224 26:26223 -click:0 dense_feature:0.0 dense_feature:0.00829187396352 dense_feature:0.01 dense_feature:0.16 dense_feature:0.206765625 dense_feature:0.328 dense_feature:0.13 dense_feature:0.16 dense_feature:0.176 dense_feature:0.0 dense_feature:0.7 dense_feature:0.0 dense_feature:0.16 1:737395 2:552854 3:606082 4:267619 5:286835 6:948614 7:918889 8:67180 9:27346 
10:708281 11:400024 12:972010 13:66330 14:255651 15:432931 16:650209 17:506108 18:212910 19:26230 20:26229 21:107726 22:26235 23:410878 24:718419 25:26224 26:26223
diff --git a/models/demo/online_learning/model.py b/models/demo/online_learning/model.py
deleted file mode 100755
index 658afec27fb8e019b6a3fbc07cee132e1e82494b..0000000000000000000000000000000000000000
--- a/models/demo/online_learning/model.py
+++ /dev/null
@@ -1,101 +0,0 @@
-# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import math
-
-import paddle.fluid as fluid
-
-from paddlerec.core.utils import envs
-from paddlerec.core.model import ModelBase
-
-
-class Model(ModelBase):
-    def __init__(self, config):
-        ModelBase.__init__(self, config)
-
-    def _init_hyper_parameters(self):
-        self.is_distributed = True if envs.get_fleet_mode().upper(
-        ) == "PSLIB" else False
-        self.sparse_feature_number = envs.get_global_env(
-            "hyper_parameters.sparse_feature_number")
-        self.sparse_feature_dim = envs.get_global_env(
-            "hyper_parameters.sparse_feature_dim")
-        self.learning_rate = envs.get_global_env(
-            "hyper_parameters.optimizer.learning_rate")
-
-    def net(self, input, is_infer=False):
-        self.sparse_inputs = self._sparse_data_var[1:]
-        self.dense_input = self._dense_data_var[0]
-        self.label_input = self._sparse_data_var[0]
-
-        def embedding_layer(input):
-            emb = fluid.contrib.layers.sparse_embedding(
-                input=input,
-                is_test=False,
-                # for distributed sparse embedding, dim0 just fake.
-                size=[1024, self.sparse_feature_dim],
-                param_attr=fluid.ParamAttr(
-                    name="SparseFeatFactors",
-                    initializer=fluid.initializer.Uniform()), )
-            emb_sum = fluid.layers.sequence_pool(input=emb, pool_type='sum')
-            return emb_sum
-
-        sparse_embed_seq = list(map(embedding_layer, self.sparse_inputs))
-        concated = fluid.layers.concat(
-            sparse_embed_seq + [self.dense_input], axis=1)
-
-        fcs = [concated]
-        hidden_layers = envs.get_global_env("hyper_parameters.fc_sizes")
-
-        for size in hidden_layers:
-            output = fluid.layers.fc(
-                input=fcs[-1],
-                size=size,
-                act='relu',
-                param_attr=fluid.ParamAttr(
-                    initializer=fluid.initializer.Normal(
                        scale=1.0 / math.sqrt(fcs[-1].shape[1]))))
-            fcs.append(output)
-
-        predict = fluid.layers.fc(
-            input=fcs[-1],
-            size=2,
-            act="softmax",
-            param_attr=fluid.ParamAttr(initializer=fluid.initializer.Normal(
-                scale=1 / math.sqrt(fcs[-1].shape[1]))))
-
-        self.predict = predict
-
-        auc, batch_auc, _ = fluid.layers.auc(input=self.predict,
-                                             label=self.label_input,
-                                             num_thresholds=2**12,
-                                             slide_steps=20)
-        if is_infer:
-            self._infer_results["AUC"] = auc
-            self._infer_results["BATCH_AUC"] = batch_auc
-            return
-
-        self._metrics["AUC"] = auc
-        self._metrics["BATCH_AUC"] = batch_auc
-        cost = fluid.layers.cross_entropy(
-            input=self.predict, label=self.label_input)
-        avg_cost = fluid.layers.reduce_mean(cost)
-        self._cost = avg_cost
-
-    def optimizer(self):
-        optimizer = fluid.optimizer.Adam(self.learning_rate, lazy_mode=True)
-        return optimizer
-
-    def infer_net(self):
-        pass
diff --git a/models/rank/dnn/README.md b/models/rank/dnn/README.md
index 9656adc655d6e2d8931861a492af3583906f98f5..d4167777220a12fc3d59c87a01cdf8dcac7dae4d 100644
--- a/models/rank/dnn/README.md
+++ b/models/rank/dnn/README.md
@@ -259,3 +259,133 @@ auc_var, batch_auc_var, auc_states = fluid.layers.auc(
 ```
 With the network defined as above, training ultimately yields the two key metrics `avg_cost` and `auc`.
+
+
+## Streaming Training (Online Learning): Launch and Configuration Workflow
+
+### Overview of streaming training
+In streaming training, data is received and processed in order: for each incoming record, the model makes a prediction on it and is then updated before the next record is processed. In scenarios such as news feeds, short videos, and e-commerce, large volumes of new data arrive every day, and each day's (or each moment's) new data should be predicted on, and used to update the model from, the previous day (or moment).
+
+Large-scale streaming training requires matching support from the deep learning framework, namely:
+* Large-scale distributed training: the data volume is huge, so solid distributed training and scaling capabilities are needed to meet the timeliness requirements of training
+* Very large embeddings: support for embeddings at the scale of billions or even hundreds of billions of entries, plus a practical parameter-export mechanism that can dump model parameters quickly for integration with other online systems
+* Hash-mapped embedding feature IDs with no fixed ID encoding required: the table can grow automatically and control feature admission (previously unseen features can be created under suitable conditions), and can periodically evict stale features (cleaning up expired features under a configurable policy), with both admission and eviction strategies
+* Finally, a complete streaming-training trainer.py built on the framework, implementing the full streaming training workflow
+
+### Training the model with ctr-dnn online learning
+PaddleRec has now implemented this streaming training workflow on top of the PaddlePaddle distributed training framework, for your reference and use. We provide an online_training version adapted from `models/rank/ctr-dnn` to make it easier to understand and follow.
+
+**Notes**
+1. Online learning requires the latest Paddle develop build, available from https://www.paddlepaddle.org.cn/documentation/docs/zh/install/Tables.html#whl-dev ; uninstall any currently installed Paddle first, and download the wheel matching your Python environment.
+2. Online learning also requires the latest PaddleRec develop version: run git clone https://github.com/PaddlePaddle/PaddleRec.git to get the latest PaddleRec and install it yourself.
+
+### How to launch
+1. Set hyper_parameters.distributed_embedding=1 in config.yaml to enable large-scale sparse mode
+2. 
Replace `single_cpu_train` in config.yaml's mode: [single_cpu_train, single_cpu_infer] with online_learning_cluster, which selects the runner used for online learning
+3. Prepare the training data. The online learning mode used by ctr-dnn trains at day granularity, with each day further split into 24 hours, so the training data must be organized into a day/hour directory structure.
+   Taking the two days of training data from 2020-08-10 to 2020-08-11 as an example, the directory structure to prepare looks like this:
+   ```
+   train_data/
+   |-- 20200810
+   |   |-- 00
+   |   |   `-- train.txt
+   |   |-- 01
+   |   |   `-- train.txt
+   |   |-- 02
+   |   |   `-- train.txt
+   |   |-- 03
+   |   |   `-- train.txt
+   |   |-- 04
+   |   |   `-- train.txt
+   |   |-- 05
+   |   |   `-- train.txt
+   |   |-- 06
+   |   |   `-- train.txt
+   |   |-- 07
+   |   |   `-- train.txt
+   |   |-- 08
+   |   |   `-- train.txt
+   |   |-- 09
+   |   |   `-- train.txt
+   |   |-- 10
+   |   |   `-- train.txt
+   |   |-- 11
+   |   |   `-- train.txt
+   |   |-- 12
+   |   |   `-- train.txt
+   |   |-- 13
+   |   |   `-- train.txt
+   |   |-- 14
+   |   |   `-- train.txt
+   |   |-- 15
+   |   |   `-- train.txt
+   |   |-- 16
+   |   |   `-- train.txt
+   |   |-- 17
+   |   |   `-- train.txt
+   |   |-- 18
+   |   |   `-- train.txt
+   |   |-- 19
+   |   |   `-- train.txt
+   |   |-- 20
+   |   |   `-- train.txt
+   |   |-- 21
+   |   |   `-- train.txt
+   |   |-- 22
+   |   |   `-- train.txt
+   |   `-- 23
+   |       `-- train.txt
+   `-- 20200811
+       |-- 00
+       |   `-- train.txt
+       |-- 01
+       |   `-- train.txt
+       |-- 02
+       |   `-- train.txt
+       |-- 03
+       |   `-- train.txt
+       |-- 04
+       |   `-- train.txt
+       |-- 05
+       |   `-- train.txt
+       |-- 06
+       |   `-- train.txt
+       |-- 07
+       |   `-- train.txt
+       |-- 08
+       |   `-- train.txt
+       |-- 09
+       |   `-- train.txt
+       |-- 10
+       |   `-- train.txt
+       |-- 11
+       |   `-- train.txt
+       |-- 12
+       |   `-- train.txt
+       |-- 13
+       |   `-- train.txt
+       |-- 14
+       |   `-- train.txt
+       |-- 15
+       |   `-- train.txt
+       |-- 16
+       |   `-- train.txt
+       |-- 17
+       |   `-- train.txt
+       |-- 18
+       |   `-- train.txt
+       |-- 19
+       |   `-- train.txt
+       |-- 20
+       |   `-- train.txt
+       |-- 21
+       |   `-- train.txt
+       |-- 22
+       |   `-- train.txt
+       `-- 23
+           `-- train.txt
+   ```
+4. 
Once the data is ready, streaming training can be launched through the standard training workflow:
+   ```shell
+   python -m paddlerec.run -m models/rank/dnn/config.yaml
+   ```
diff --git a/models/rank/dnn/config.yaml b/models/rank/dnn/config.yaml
index aa84a5070470cba750f7832644a9ce676c1d4ddd..fdb4470b492ce1b1aece958ca20ac78903f84b46 100755
--- a/models/rank/dnn/config.yaml
+++ b/models/rank/dnn/config.yaml
@@ -49,6 +49,7 @@ hyper_parameters:
   sparse_feature_dim: 9
   dense_input_dim: 13
   fc_sizes: [512, 256, 128, 32]
+  distributed_embedding: 0
 
 # select runner by name
 mode: [single_cpu_train, single_cpu_infer]
@@ -90,6 +91,18 @@ runner:
   print_interval: 1
   phases: [phase1]
 
+- name: online_learning_cluster
+  class: cluster_train
+  runner_class_path: "{workspace}/online_learning_runner.py"
+  epochs: 2
+  device: cpu
+  fleet_mode: ps
+  save_checkpoint_interval: 1 # save model interval of epochs
+  save_checkpoint_path: "increment_dnn" # save checkpoint path
+  init_model_path: "" # load model path
+  print_interval: 1
+  phases: [phase1]
+
 - name: collective_cluster
   class: cluster_train
   epochs: 2
diff --git a/models/rank/dnn/model.py b/models/rank/dnn/model.py
index ac6f0b946ef4dfd753b79d50a6ec34d099298698..b614934bebcbec342ecd6e711d3864d6ad506faa 100755
--- a/models/rank/dnn/model.py
+++ b/models/rank/dnn/model.py
@@ -25,8 +25,16 @@ class Model(ModelBase):
         ModelBase.__init__(self, config)
 
     def _init_hyper_parameters(self):
-        self.is_distributed = True if envs.get_fleet_mode().upper(
-        ) == "PSLIB" else False
+        self.is_distributed = False
+        self.distributed_embedding = False
+
+        if envs.get_fleet_mode().upper() == "PSLIB":
+            self.is_distributed = True
+
+        if envs.get_global_env("hyper_parameters.distributed_embedding",
+                               0) == 1:
+            self.distributed_embedding = True
+
         self.sparse_feature_number = envs.get_global_env(
             "hyper_parameters.sparse_feature_number")
         self.sparse_feature_dim = envs.get_global_env(
@@ -40,14 +48,26 @@ class Model(ModelBase):
         self.label_input = self._sparse_data_var[0]
 
         def embedding_layer(input):
-            emb = fluid.layers.embedding(
-                input=input,
-                is_sparse=True,
-                is_distributed=self.is_distributed,
-                size=[self.sparse_feature_number, self.sparse_feature_dim],
-                param_attr=fluid.ParamAttr(
-                    name="SparseFeatFactors",
-                    initializer=fluid.initializer.Uniform()), )
+            if self.distributed_embedding:
+                emb = fluid.contrib.layers.sparse_embedding(
+                    input=input,
+                    size=[
+                        self.sparse_feature_number, self.sparse_feature_dim
+                    ],
+                    param_attr=fluid.ParamAttr(
+                        name="SparseFeatFactors",
+                        initializer=fluid.initializer.Uniform()))
+            else:
+                emb = fluid.layers.embedding(
+                    input=input,
+                    is_sparse=True,
+                    is_distributed=self.is_distributed,
+                    size=[
+                        self.sparse_feature_number, self.sparse_feature_dim
+                    ],
+                    param_attr=fluid.ParamAttr(
+                        name="SparseFeatFactors",
+                        initializer=fluid.initializer.Uniform()))
             emb_sum = fluid.layers.sequence_pool(input=emb, pool_type='sum')
             return emb_sum
diff --git a/models/demo/online_learning/online_learning_runner.py b/models/rank/dnn/online_learning_runner.py
similarity index 100%
rename from models/demo/online_learning/online_learning_runner.py
rename to models/rank/dnn/online_learning_runner.py
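A note on the training-data format seen in the deleted sample data above: each record is a single line carrying a `click:<label>` field, 13 `dense_feature:<float>` fields (matching `dense_input_dim: 13` in config.yaml), and 26 `<slot>:<feature_id>` sparse fields. A minimal parsing sketch follows; the `parse_line` helper is illustrative only and is not part of PaddleRec:

```python
# Illustrative parser for the criteo-style sample lines shown above:
# "click:<label>", then 13 "dense_feature:<float>" fields, then
# 26 "<slot>:<feature_id>" sparse fields per record.
def parse_line(line):
    label, dense, sparse = None, [], {}
    for token in line.split():
        key, _, value = token.partition(":")
        if key == "click":
            label = int(value)
        elif key == "dense_feature":
            dense.append(float(value))
        else:
            # slot id -> hashed feature id
            sparse[int(key)] = int(value)
    return label, dense, sparse

# A shortened sample record (real lines carry 13 dense and 26 sparse fields).
sample = ("click:1 dense_feature:0.05 dense_feature:0.537313432836 "
          "1:737395 2:194636 26:26223")
label, dense, sparse = parse_line(sample)
```

The sparse values being arbitrary hashed IDs is consistent with the hash-mapped, non-contiguous feature IDs described in the online-learning notes above.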