Using pre-trained word vectors in an embedding layer
Created by: qingqing01
The following issue was received by email.
Thank you for your work on Paddle. I think the design is very interesting.
I would like to use pre-trained word vectors in an embedding layer, and I want the weights to stay static because my training data is small. For clarity, here's how I would implement the desired behaviour with Keras:
```python
from keras.layers import Embedding

# `embeddings` is a (vocab_size, embedding_dim) numpy array of pre-trained vectors.
model.add(
    Embedding(
        embeddings.shape[0],               # vocabulary size
        embeddings.shape[1],               # embedding dimension
        input_length=shape['max_length'],
        trainable=False,                   # keep the pre-trained weights static
        weights=[embeddings],
        mask_zero=True
    )
)
```
Is there a way to implement this with the Paddle Python bindings? Unfortunately, I haven't been able to find anything about this in the documentation or the source yet.
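For reference, something along these lines is what I am after. This is only a rough sketch against the v2-era Python API; I am guessing at the `is_static` flag on the parameter attribute and at injecting the matrix by name via `parameters.set`, since I could not confirm either in the docs:

```python
import numpy as np
import paddle.v2 as paddle

paddle.init(use_gpu=False, trainer_count=1)

# Placeholder sizes; `embeddings` stands in for the real pre-trained matrix
# of shape (vocab_size, emb_dim).
vocab_size, emb_dim = 10000, 300
embeddings = np.random.rand(vocab_size, emb_dim).astype('float32')

word = paddle.layer.data(
    name='word',
    type=paddle.data_type.integer_value_sequence(vocab_size))

# is_static=True would (presumably) tell the trainer not to update this
# parameter, mirroring trainable=False in the Keras snippet above.
emb = paddle.layer.embedding(
    input=word,
    size=emb_dim,
    param_attr=paddle.attr.Param(name='emb', is_static=True))

# ... build the rest of the network and a cost layer here ...
# cost = ...

# After creating the parameters, the pre-trained weights would be
# injected by name before training starts:
# parameters = paddle.parameters.create(cost)
# parameters.set('emb', embeddings)
```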