Unverified · Commit 35deae77, authored by Jacob Devlin, committed by GitHub

Merge pull request #93 from artemisart/fix-test

fix tokenization_test
@@ -30,7 +30,7 @@ class TokenizationTest(tf.test.TestCase):
        "[UNK]", "[CLS]", "[SEP]", "want", "##want", "##ed", "wa", "un", "runn",
        "##ing", ","
    ]
-    with tempfile.NamedTemporaryFile(delete=False) as vocab_writer:
+    with tempfile.NamedTemporaryFile(mode='w+', delete=False) as vocab_writer:
      vocab_writer.write("".join([x + "\n" for x in vocab_tokens]))

      vocab_file = vocab_writer.name
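
For context on the one-line change above: tempfile.NamedTemporaryFile defaults to binary mode ('w+b'), so on Python 3 writing a str to it raises TypeError; passing mode='w+' opens the file in text mode, which accepts the string the test writes. A minimal sketch of the difference (the shortened vocab_tokens list is illustrative, not the test's full list):

    import tempfile

    vocab_tokens = ["[UNK]", "[CLS]", "[SEP]", "want", "##want", "##ed"]

    # Default mode is 'w+b' (binary); on Python 3, writing a str raises
    # TypeError: a bytes-like object is required, not 'str'.
    #
    # with tempfile.NamedTemporaryFile(delete=False) as vocab_writer:
    #     vocab_writer.write("".join([x + "\n" for x in vocab_tokens]))

    # Text mode ('w+') accepts str directly, matching what the test writes.
    with tempfile.NamedTemporaryFile(mode='w+', delete=False) as vocab_writer:
        vocab_writer.write("".join([x + "\n" for x in vocab_tokens]))
        vocab_file = vocab_writer.name  # path to the temporary vocab file

    print(vocab_file)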