Unverified commit c5ee27f7, authored by Michael Wyatt, committed by GitHub

Add MII tests (#2533)

Adding MII tests to ensure changes to DS-Inference do not break MII
Parent: 8b4318b9
name: nv-mii

on:
  push:
    branches:
      - 'master'
      - 'staging**'
    paths-ignore:
      - 'docs/**'
  pull_request:
    paths-ignore:
      - 'docs/**'

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  unit-tests:
    runs-on: [self-hosted, nvidia, cu116, v100]

    steps:
      - uses: actions/checkout@v2

      - name: environment
        run: |
          echo "JobID: $AISC_NODE_INSTANCE_ID"
          nvidia-smi
          which python
          python --version
          which nvcc
          nvcc --version
          pip install --upgrade pip
          pip uninstall --yes torch torchvision triton
          pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/cu116
          python -c "import torch; print('torch:', torch.__version__, torch)"
          python -c "import torch; print('CUDA available:', torch.cuda.is_available())"

      - name: Install MII
        run: |
          pip uninstall --yes deepspeed deepspeed-mii transformers
          pip install .[dev]
          pip install git+https://github.com/huggingface/transformers.git

      - name: Python environment
        run: |
          pip list

      - name: Unit tests
        run: |
          git clone https://github.com/microsoft/DeepSpeed-MII.git
          cd DeepSpeed-MII
          pip install .[dev]
          unset TORCH_CUDA_ARCH_LIST # only jit compile for current arch
          if [[ -d ./torch-extensions ]]; then rm -rf ./torch-extensions; fi
          cd tests
          TRANSFORMERS_CACHE=/blob/transformers_cache/ TORCH_EXTENSIONS_DIR=./torch-extensions pytest --color=yes --durations=0 --forked --verbose -m "CPU or local" ./
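The last step selects tests with the marker expression `-m "CPU or local"`. As a minimal sketch of how such pytest markers work (the test bodies and the `deploy` marker below are hypothetical, not taken from the MII suite):

```python
# Hypothetical tests illustrating pytest marker selection, as used by
# `pytest -m "CPU or local"` in the workflow above.
import pytest


@pytest.mark.local
def test_runs_locally():
    # Matched by the expression "CPU or local", so it is collected and run.
    assert sum([1, 2, 3]) == 6


@pytest.mark.deploy  # "deploy" is an assumed marker name for illustration
def test_requires_deployment():
    # Not matched by "CPU or local", so pytest deselects it in this CI job.
    assert isinstance("mii", str)
```

Unregistered markers like these only raise a warning by default; real suites typically register them in `pytest.ini` or `pyproject.toml`.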
@@ -233,7 +233,9 @@ class DeepSpeedInferenceConfig(DeepSpeedConfigModel):
     injection_policy_tuple: tuple = None
     """ TODO: Add docs """

-    config: Dict = None  # todo: really no need for this field if we can refactor
+    config: Dict = Field(
+        None,
+        alias="args")  # todo: really no need for this field if we can refactor

     max_out_tokens: int = Field(1024, alias="max_tokens")
     """
......
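The hunk above replaces a plain default with a pydantic `Field` carrying an alias, so the config can be populated under the key `"args"` while being exposed as the `config` attribute. A minimal sketch of that behavior, assuming pydantic is available (the class name here is a stand-in, not `DeepSpeedInferenceConfig` itself):

```python
# Sketch of pydantic Field aliases as used in the diff above.
# ExampleConfig is a stand-in for DeepSpeedInferenceConfig.
from typing import Dict, Optional

from pydantic import BaseModel, Field


class ExampleConfig(BaseModel):
    # Accepts input key "args", exposed as attribute `config`.
    config: Optional[Dict] = Field(None, alias="args")
    # Accepts input key "max_tokens", exposed as attribute `max_out_tokens`.
    max_out_tokens: int = Field(1024, alias="max_tokens")


# Callers pass the aliased names; attribute access uses the field names.
cfg = ExampleConfig(**{"args": {"dtype": "fp16"}, "max_tokens": 256})
assert cfg.config == {"dtype": "fp16"}
assert cfg.max_out_tokens == 256
```

Aliases like this let a config model accept legacy or user-facing key names without renaming its internal fields.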