Now let's return to the comment-text identification problem from earlier. Without changing any other network parameters, we simply replace the Simple RNN layer with an LSTM layer and see whether performance improves:

from keras.models import Sequential  # import the Sequential model
from keras.layers import Embedding  # import the word-embedding layer
from keras.layers import Dense  # import the fully connected layer
from keras.layers import LSTM  # import the LSTM layer

embedding_vector_length = 60  # set the word-embedding vector length to 60

lstm = Sequential()  # Sequential model
lstm.add(Embedding(dictionary_size, embedding_vector_length,
                   input_length=max_comment_length))  # add the embedding layer
lstm.add(LSTM(100))  # add the LSTM layer
lstm.add(Dense(10, activation='relu'))  # add a fully connected layer
lstm.add(Dense(6, activation='softmax'))  # add the classification output layer
lstm.compile(loss='sparse_categorical_crossentropy',  # loss function
             optimizer='adam',  # optimizer
             metrics=['acc'])  # evaluation metric
history = lstm.fit(X_train, y_train,  # train the model
                   validation_split=0.3,  # hold out 30% of the data for validation
                   epochs=10,
                   batch_size=64)
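For comparison, since only the recurrent layer was swapped, the Simple RNN network from the previous section would have looked roughly like the following minimal sketch (hyperparameters assumed identical to the LSTM version above; dictionary_size and max_comment_length come from the earlier preprocessing):

from keras.models import Sequential
from keras.layers import Embedding, Dense, SimpleRNN

rnn = Sequential()  # the earlier Simple RNN baseline (sketch)
rnn.add(Embedding(dictionary_size, embedding_vector_length,
                  input_length=max_comment_length))
rnn.add(SimpleRNN(100))  # the only layer that differs from the LSTM model
rnn.add(Dense(10, activation='relu'))
rnn.add(Dense(6, activation='softmax'))
rnn.compile(loss='sparse_categorical_crossentropy',
            optimizer='adam',
            metrics=['acc'])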

The output shows that after the same ten epochs of training, the validation accuracy reaches 0.6171, an improvement over the Simple RNN.

Train on 7000 samples, validate on 3000 samples
Epoch 1/10
15848/15848 [============================] - 88s 6ms/step - loss: 1.2131 - acc: 0.5856 - val_loss: 1.0130 - val_acc: 0.6030
Epoch 2/10
15848/15848 [============================] - 87s 5ms/step - loss: 0.8891 - acc: 0.6363 - val_loss: 0.9449 - val_acc: 0.6015
… …
Epoch 10/10
15848/15848 [============================] - 88s 6ms/step - loss: 0.7999 - acc: 0.6661 - val_loss: 0.9389 - val_acc: 0.6171
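If you want to read these numbers programmatically instead of from the log, the history object returned by fit() records the per-epoch metrics. A minimal sketch for plotting the accuracy curves (assuming matplotlib is installed; because we passed metrics=['acc'], the keys are 'acc' and 'val_acc'):

import matplotlib.pyplot as plt

plt.plot(history.history['acc'], label='training accuracy')  # per-epoch training accuracy
plt.plot(history.history['val_acc'], label='validation accuracy')  # per-epoch validation accuracy
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()

print(history.history['val_acc'][-1])  # final validation accuracy (0.6171 in the run above)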