Unverified commit aa16828c authored by Hyunwoong Ko, committed by GitHub

Fix bugs in the tutorial documentation (#1157)

* Add `import os` to inference tutorials

* assign deepspeed-initialized model to hf model
Parent 71ecf7e6
@@ -71,6 +71,7 @@ DeepSpeed inference can be used in conjunction with HuggingFace `pipeline`. Below
```python
# Filename: gpt-neo-2.7b-generation.py
+import os
import deepspeed
import torch
import transformers
@@ -82,10 +83,10 @@ generator = pipeline('text-generation', model='EleutherAI/gpt-neo-2.7B', device=
-deepspeed.init_inference(generator.model,
-                         mp_size=world_size,
-                         dtype=torch.float,
-                         replace_method='auto')
+generator.model = deepspeed.init_inference(generator.model,
+                                           mp_size=world_size,
+                                           dtype=torch.float,
+                                           replace_method='auto')
string = generator("DeepSpeed is", do_sample=True, min_length=50)
if torch.distributed.get_rank() == 0:
......
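For context, the corrected tutorial example reads as a complete script roughly as follows. This is a minimal sketch only: the `local_rank`/`world_size` environment-variable handling, the `from transformers import pipeline` import, the `device=local_rank` argument, and the final `print` are assumed from the surrounding tutorial text and do not appear in the hunks above.

```python
# Filename: gpt-neo-2.7b-generation.py
# Minimal sketch of the corrected example after this commit.
import os
import deepspeed
import torch
from transformers import pipeline

# Assumed from the surrounding tutorial: rank/world size come from the
# environment variables set by the deepspeed launcher.
local_rank = int(os.getenv('LOCAL_RANK', '0'))
world_size = int(os.getenv('WORLD_SIZE', '1'))

# Build the HuggingFace text-generation pipeline on the local GPU.
generator = pipeline('text-generation',
                     model='EleutherAI/gpt-neo-2.7B',
                     device=local_rank)

# Assign the DeepSpeed-initialized engine back onto the pipeline so that
# subsequent generator(...) calls actually run the DeepSpeed inference path.
generator.model = deepspeed.init_inference(generator.model,
                                           mp_size=world_size,
                                           dtype=torch.float,
                                           replace_method='auto')

string = generator("DeepSpeed is", do_sample=True, min_length=50)
if torch.distributed.get_rank() == 0:
    print(string)
```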