Unverified · Commit def3b651 · Authored by: Kai Chen · Committed by: GitHub

update readme and benchmark (#4)

Parent 14a354b8
@@ -2,21 +2,26 @@
<img src="docs/imgs/mmaction2-logo.png" width="500"/>
</div>
<div align="left">
<a href='https://mmaction2.readthedocs.io/en/latest/?badge=latest'>
<img src='https://readthedocs.org/projects/mmaction2/badge/?version=latest' alt='Documentation Status' />
</a>
<a href="https://github.com/open-mmlab/mmaction2/blob/master/LICENSE">
<a href="https://codecov.io/gh/open-mmlab/mmaction2">
<img src="https://codecov.io/gh/open-mmlab/mmaction2/branch/master/graph/badge.svg" />
</a>
<a href="https://github.com/open-mmlab/mmaction2/blob/master/LICENSE">
<img src="https://img.shields.io/github/license/open-mmlab/mmaction2.svg">
</a>
</div>
Documentation: https://mmaction2.readthedocs.io/.
## Introduction
MMAction2 is an open-source toolbox for action understanding based on PyTorch.
It is a part of the [OpenMMLab](http://openmmlab.org/) project developed by [Multimedia Laboratory, CUHK](http://mmlab.ie.cuhk.edu.hk/).
The master branch works with **PyTorch 1.3+**.
<div align="center">
<img src="demo/demo.gif" width="600px"/>
@@ -41,16 +46,28 @@
- For temporal action localization, we implement BSN, BMN.
- **Well tested and documented**
We provide detailed documentation and API references, as well as unit tests.
## License
This project is released under the [Apache 2.0 license](LICENSE).
## Benchmark and Model Zoo
Benchmarks against other repos are available in [benchmark.md](docs/benchmark.md).
MMAction2 supports various models and is more efficient in training.
We compare training speed with other popular codebases; the results are shown below.
| Model | MMAction2 (s/iter) | MMAction (s/iter) | Temporal-Shift-Module (s/iter) | PySlowFast (s/iter) |
| :--- | :---------------: | :--------------------: | :----------------------------: | :-----------------: |
| [TSN](/configs/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb.py) | **0.29** | 0.36 | 0.45 | x |
| [I3D (setting1)](/configs/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb.py) | **0.45** | 0.58 | x | x |
| [I3D (setting2)](/configs/recognition/i3d/i3d_r50_8x8x1_100e_kinetics400_rgb.py) | **0.32** | x | x | 0.56 |
| [TSM](/configs/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb.py) | **0.30** | x | 0.38 | x |
| [SlowOnly](/configs/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb.py) | **0.30** | x | x | 1.03 |
| [SlowFast](/configs/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb.py) | **0.80** | x | x | 1.40 |
| [R(2+1)D](/configs/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb.py) | **0.48** | x | x | x |
Supported methods for action recognition:
- [x] [TSN](configs/recognition/tsn/README.md)
@@ -64,6 +81,9 @@
Supported methods for action localization:
- [x] [BMN](configs/localization/bmn/README.md)
- [x] [BSN](configs/localization/bsn/README.md)
Results and models are available in the *README.md* of each method's config directory.
A summary can be found in the [**model zoo**](https://mmaction2.readthedocs.io/en/latest/modelzoo.html) page.
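To try one of these models, a typical workflow is to pick a config from the tables above, download the matching checkpoint from the model zoo, and run inference on a video. The snippet below is only a sketch: it assumes the high-level `init_recognizer` / `inference_recognizer` helpers from `mmaction.apis`, and the checkpoint, video, and label-map paths are placeholders; check the demo and documentation for the exact signatures in your version.

```python
# Minimal inference sketch (paths are placeholders, not files shipped with this repo).
from mmaction.apis import init_recognizer, inference_recognizer

config_file = 'configs/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb.py'
checkpoint_file = 'checkpoints/tsn_r50_1x1x3_100e_kinetics400_rgb.pth'  # from the model zoo

# Build the recognizer and load pretrained weights (use device='cpu' without a GPU).
model = init_recognizer(config_file, checkpoint_file, device='cuda:0')

# Recognize a single video; some versions also take a label-map file argument,
# so check the docs for the return format before relying on it.
results = inference_recognizer(model, 'path/to/video.mp4')
print(results)
```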
## Installation
Please refer to [install.md](docs/install.md) for installation.
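The guide in [install.md](docs/install.md) covers the prerequisites (PyTorch, mmcv) and the package itself. As a quick sanity check after installation, a snippet like the one below, which only assumes the package is importable as `mmaction`, confirms the installed versions and whether CUDA is visible:

```python
# Post-install sanity check (assumes MMAction2 is installed and importable as `mmaction`).
import torch
import mmaction

print('mmaction version:', mmaction.__version__)
print('pytorch version :', torch.__version__)
print('CUDA available  :', torch.cuda.is_available())
```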
......
@@ -23,15 +23,13 @@
The training speed is measured in s/iter; lower is better. A sketch of how such a number can be collected follows the table below.
| Model | MMAction2 (s/iter) | MMAction (s/iter) | Temporal-Shift-Module (s/iter) | PySlowFast (s/iter) |
| :--- | :---------------: | :--------------------: | :----------------------------: | :-----------------: |
| TSN ([tsn_r50_1x1x3_100e_kinetics400_rgb](/configs/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb.py)) | **0.29** | 0.36 | 0.45 | x |
| I3D ([i3d_r50_32x2x1_100e_kinetics400_rgb](/configs/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb.py)) | **0.45** | 0.58 | x | x |
| I3D ([i3d_r50_8x8x1_100e_kinetics400_rgb](/configs/recognition/i3d/i3d_r50_8x8x1_100e_kinetics400_rgb.py)) | **0.32** | x | x | 0.56 |
| TSM ([tsm_r50_1x1x8_50e_kinetics400_rgb](/configs/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb.py)) | **0.30** | x | 0.38 | x |
| SlowOnly ([slowonly_r50_4x16x1_256e_kinetics400_rgb](/configs/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb.py)) | **0.30** | x | x | 1.03 |
| SlowOnly ([slowonly_r50_8x8x1_256e_kinetics400_rgb](/configs/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb.py)) | **0.50** | x | x | 1.29 |
| SlowFast ([slowfast_r50_4x16x1_256e_kinetics400_rgb](/configs/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb.py)) | **0.80** | x | x | 1.40 |
| SlowFast ([slowfast_r50_8x8x1_256e_kinetics400_rgb](/configs/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb.py)) | **1.05** | x | x | 1.41 |
| R(2+1)D ([r2plus1d_r34_8x8x1_180e_kinetics400_rgb](/configs/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb.py)) | **0.48** | x | x | x |
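For reference, s/iter is plain wall-clock seconds per training iteration, averaged over many iterations after a warm-up phase. The sketch below shows one way to collect such a number for a generic PyTorch training loop; the model, dataloader, optimizer, and loss are placeholders, and this is not the script the tables above were produced with.

```python
import itertools
import time

import torch


def seconds_per_iter(model, loader, optimizer, criterion,
                     warmup=10, iters=100, device='cuda'):
    """Average wall-clock seconds per training iteration (lower is better)."""
    model.to(device).train()
    batches = itertools.cycle(loader)  # avoid exhausting a short dataloader

    def step():
        clips, labels = next(batches)
        clips, labels = clips.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(clips), labels)
        loss.backward()
        optimizer.step()

    # Warm-up iterations absorb one-off costs (cuDNN autotuning, data pipeline spin-up).
    for _ in range(warmup):
        step()
    if device.startswith('cuda'):
        torch.cuda.synchronize()  # flush queued GPU work before starting the clock
    start = time.time()
    for _ in range(iters):
        step()
    if device.startswith('cuda'):
        torch.cuda.synchronize()
    return (time.time() - start) / iters
```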
## Localizers
......