Commit 54e2155c authored by Jacob Devlin

Updating documentation and adding requirements.txt

Parent 6d6e6917
@@ -2,7 +2,7 @@
 ## Introduction
-**BERT**, or **B**idirectional **E**mbedding **R**epresentations from
+**BERT**, or **B**idirectional **E**ncoder **R**epresentations from
 **T**ransformers, is a new method of pre-training language representations which
 obtains state-of-the-art results on a wide array of Natural Language Processing
 (NLP) tasks.
@@ -205,7 +205,8 @@ the following flags to `run_classifier.py` or `run_squad.py`:
 Please see the
 [Google Cloud TPU tutorial](https://cloud.google.com/tpu/docs/tutorials/mnist)
-for how to use Cloud TPUs.
+for how to use Cloud TPUs. Alternatively, you can use the Colab notebook
+"[BERT FineTuning with Cloud TPUs](https://colab.sandbox.google.com/github/tensorflow/tpu/blob/master/tools/colab/bert_finetuning_with_cloud_tpus.ipynb)".
 On Cloud TPUs, the pretrained model and the output directory will need to be on
 Google Cloud Storage. For example, if you have a bucket named `some_bucket`, you
......
@@ -12,7 +12,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-"""Common utility functions related to TensorFlow."""
+"""The main BERT model and related functions."""
 from __future__ import absolute_import
 from __future__ import division
......
# This may work with versions of TensorFlow back to 1.7.0, but has not been
# tested.
tensorflow >= 1.11.0 # CPU Version of TensorFlow.
# tensorflow-gpu >= 1.11.0 # GPU version of TensorFlow.
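As a minimal sketch of what the `>= 1.11.0` pin in the new `requirements.txt` implies, the check below compares an installed version string against the minimum (the `meets_requirement` helper is hypothetical, for illustration only; it is not part of the BERT repo):

```python
# Verify a TensorFlow version string against the >= 1.11.0 pin
# from requirements.txt. Versions back to 1.7.0 may work but are untested.
from distutils.version import LooseVersion

def meets_requirement(installed, minimum="1.11.0"):
    """Return True if `installed` satisfies the minimum version pin."""
    return LooseVersion(installed) >= LooseVersion(minimum)

print(meets_requirement("1.11.0"))  # True: exactly the pinned version
print(meets_requirement("1.7.0"))   # False: may work, but untested
```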