Commit c7b908a7 authored by linjintao

Fix link

Parent 741db8f0
......@@ -41,33 +41,33 @@ This project is released under the [Apache 2.0 license](LICENSE).
## Benchmark and Model Zoo
-Benchmark with other repos are available on [benchmark.md](/docs/benchmark.md).
+Benchmarks with other repos are available in [benchmark.md](docs/benchmark.md).
Results and models are available in the **README.md** of each method's config directory.
Supported methods for action recognition:
-- [x] [TSN](/configs/recognition/tsn/README.md)
-- [x] [TSM](/configs/recognition/tsm/README.md)
-- [x] [R(2+1)d](/configs/recognition/r2plus1d/README.md)
-- [x] [I3D](/configs/recognition/i3d/README.md)
-- [x] [SlowOnly](/configs/recognition/slowonly/README.md)
-- [x] [SlowFast](/configs/recognition/slowfast/README.md)
+- [x] [TSN](configs/recognition/tsn/README.md)
+- [x] [TSM](configs/recognition/tsm/README.md)
+- [x] [R(2+1)d](configs/recognition/r2plus1d/README.md)
+- [x] [I3D](configs/recognition/i3d/README.md)
+- [x] [SlowOnly](configs/recognition/slowonly/README.md)
+- [x] [SlowFast](configs/recognition/slowfast/README.md)
Supported methods for action localization:
-- [x] [BMN](/configs/localization/bmn/README.md)
-- [x] [BSN](/configs/localization/bsn/README.md)
+- [x] [BMN](configs/localization/bmn/README.md)
+- [x] [BSN](configs/localization/bsn/README.md)
## Installation
-Please refer to [install.md](/docs/install.md) for installation.
+Please refer to [install.md](docs/install.md) for installation.
## Data Preparation
-Please refer to [data_preparation.md](/docs/data_preparation.md) for a general knowledge of data preparation.
+Please refer to [data_preparation.md](docs/data_preparation.md) for an overview of data preparation.
## Get Started
-Please see [getting_started.md](/docs/getting_started.md) for the basic usage of MMAction.
+Please see [getting_started.md](docs/getting_started.md) for the basic usage of MMAction.
## Contributing
......
......@@ -21,9 +21,9 @@ To ease usage, we provide tutorials of data deployment for each dataset.
- [Something-Something V2](https://20bn.com/datasets/something-something): See [preparing_sthv2.md](/tools/data/sthv2/preparing_sthv2.md)
- [Moments in Time](http://moments.csail.mit.edu/): See [preparing_mit.md](/tools/data/mit/preparing_mit.md)
- [Multi-Moments in Time](http://moments.csail.mit.edu/challenge_iccv_2019.html): See [preparing_mmit.md](/tools/data/mmit/preparing_mmit.md)
-- [ActivityNet_feature](): See[praparing_activitynet.md](/tools/data/activitynet/preparing_activitynet.md)
+- [ActivityNet_feature](): See [preparing_activitynet.md](/tools/data/activitynet/preparing_activitynet.md)
-Now, you can switch to [getting_started.md](/docs/getting_started.md) to train and test the model.
+Now, you can switch to [getting_started.md](getting_started.md) to train and test the model.
## Getting Data
......
# Getting Started
This page provides basic tutorials about the usage of MMAction.
-For installation instructions, please see [install.md](/docs/install.md).
+For installation instructions, please see [install.md](install.md).
## Datasets
......@@ -26,9 +26,9 @@ mmaction
│ │ ├── ucf101_val_list.txt
```
-For more information on data preparation, please see [data_preparation.md](/docs/data_preparation.md)
+For more information on data preparation, please see [data_preparation.md](data_preparation.md)
-For using custom datasets, please refer to [Tutorial 2: Adding New Dataset](/docs/tutorials/new_dataset.md)
+For using custom datasets, please refer to [Tutorial 2: Adding New Dataset](tutorials/new_dataset.md)
## Inference with Pre-Trained Models
......@@ -133,7 +133,7 @@ for result in results:
print(f'{result[0]}: ', result[1])
```
-A notebook demo can be found in [demo/demo.ipynb](/demo/demo.ipynb)
+A notebook demo can be found in [demo/demo.ipynb](../demo/demo.ipynb)
## Build a Model
......@@ -209,7 +209,7 @@ which defines the following abstract methods.
- `forward_train()`: forward method of the training mode.
- `forward_test()`: forward method of the testing mode.
-[Recognizer2D](/mmaction/models/recognizers/recognizer2d.py) and [Recognizer3D](/mmaction/models/recognizers/recognizer3d.py)
+[Recognizer2D](../mmaction/models/recognizers/recognizer2d.py) and [Recognizer3D](../mmaction/models/recognizers/recognizer3d.py)
are good examples which show how to do that.
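The division of labor between the two modes can be sketched with a self-contained toy stand-in; the class below is illustrative only (not the actual recognizer base class), showing how a single `forward()` entry point dispatches to the two abstract methods:

```python
# Illustrative sketch only: a toy stand-in for the recognizer base class,
# showing how forward() dispatches to the two abstract methods above.
from abc import ABC, abstractmethod


class ToyRecognizer(ABC):
    @abstractmethod
    def forward_train(self, imgs, labels):
        """Forward method of the training mode (returns losses)."""

    @abstractmethod
    def forward_test(self, imgs):
        """Forward method of the testing mode (returns predictions)."""

    def forward(self, imgs, labels=None, return_loss=True):
        # Dispatch on the mode, as the base recognizer does conceptually.
        if return_loss:
            return self.forward_train(imgs, labels)
        return self.forward_test(imgs)


class ToyRecognizer2D(ToyRecognizer):
    def forward_train(self, imgs, labels):
        return {'loss_cls': 0.0}  # placeholder loss dict

    def forward_test(self, imgs):
        return [0.5]  # placeholder prediction scores
```

A concrete subclass only needs to fill in the two mode-specific methods; callers always go through `forward()`.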
......@@ -291,10 +291,10 @@ Here is an example of using 16 GPUs to train TSN on the dev partition in a slurm
GPUS=16 ./tools/slurm_train.sh dev tsn_r50_k400 configs/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb.py work_dirs/tsn_r50_1x1x3_100e_kinetics400_rgb
```
-You can check [slurm_train.sh](/tools/slurm_train.sh) for full arguments and environment variables.
+You can check [slurm_train.sh](../tools/slurm_train.sh) for full arguments and environment variables.
If you just have multiple machines connected with ethernet, you can refer to the
PyTorch [launch utility](https://pytorch.org/docs/stable/distributed_deprecated.html#launch-utility).
Usually it is slow if you do not have high-speed networking like InfiniBand.
### Launch multiple jobs on a single machine
......@@ -336,7 +336,7 @@ We provide lots of useful tools under `tools/` directory.
You can plot loss/top-k acc curves given a training log file. Run `pip install seaborn` first to install the dependency.
-![acc_curve_image](/docs/imgs/acc_curve.png)
+![acc_curve_image](imgs/acc_curve.png)
```shell
python tools/analyze_logs.py plot_curve ${JSON_LOGS} [--keys ${KEYS}] [--title ${TITLE}] [--legend ${LEGEND}] [--backend ${BACKEND}] [--style ${STYLE}] [--out ${OUT_FILE}]
......@@ -405,7 +405,7 @@ Params: 28.04 M
You may well use the result for simple comparisons, but double check it before you adopt it in technical reports or papers.
(1) FLOPs are related to the input shape while parameters are not. The default input shape is (1, 3, 340, 256) for 2D recognizer, (1, 3, 32, 340, 256) for 3D recognizer.
-(2) Some custom operators are not counted into FLOPs. You can add support for new operators by modifying [`mmaction/utils/flops_counter.py`](/mmaction/utils/file_client.py).
+(2) Some custom operators are not counted into FLOPs. You can add support for new operators by modifying [`mmaction/utils/flops_counter.py`](../mmaction/utils/flops_counter.py).
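Point (1) can be checked by hand for a single 2D convolution. The simplified counter below (stride 1, "same" padding, bias and custom operators ignored) shows that FLOPs scale with the spatial input size while the parameter count does not:

```python
# Simplified FLOP/parameter count for one 2D convolution. Illustrative only:
# real counters also handle stride, padding, bias and many other layer types.
def conv2d_params(c_in, c_out, k):
    # Weight tensor has shape (c_out, c_in, k, k); independent of input size.
    return c_in * c_out * k * k


def conv2d_flops(c_in, c_out, k, h, w):
    # One multiply-accumulate per kernel element, per output position.
    return conv2d_params(c_in, c_out, k) * h * w


# Same layer, two input resolutions: parameters stay fixed, FLOPs change.
p = conv2d_params(3, 64, 7)
f_256 = conv2d_flops(3, 64, 7, 256, 256)
f_340 = conv2d_flops(3, 64, 7, 340, 256)  # default 2D-recognizer H x W
```

Comparing `f_256` and `f_340` against the constant `p` makes the shape dependence concrete.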
### Publish a model
......@@ -543,9 +543,9 @@ There are two ways to work with custom datasets.
- online conversion
-You can write a new Dataset class inherited from [BaseDataset](/mmaction/datasets/base.py), and overwrite two methods
+You can write a new Dataset class inherited from [BaseDataset](../mmaction/datasets/base.py), and overwrite two methods
`load_annotations(self)` and `evaluate(self, results, metrics, logger)`,
-like [RawframeDataset](/mmaction/datasets/rawframe_dataset.py), [VideoDataset](/mmaction/datasets/video_dataset.py) or [ActivityNetDataset](/mmaction/datasets/activitynet_dataset.py).
+like [RawframeDataset](../mmaction/datasets/rawframe_dataset.py), [VideoDataset](../mmaction/datasets/video_dataset.py) or [ActivityNetDataset](../mmaction/datasets/activitynet_dataset.py).
- offline conversion
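The online-conversion route above can be sketched as follows. `BaseDataset` here is a minimal self-contained stand-in (real code inherits mmaction's `BaseDataset`), and the one-line-per-video annotation format is made up for illustration:

```python
# Minimal stand-in for mmaction.datasets.base.BaseDataset, so the sketch runs
# on its own. Only the two-method contract is the point here.
class BaseDataset:
    def __init__(self, ann_file):
        self.ann_file = ann_file
        self.video_infos = self.load_annotations()

    def load_annotations(self):
        raise NotImplementedError

    def evaluate(self, results, metrics, logger=None):
        raise NotImplementedError


class MyDataset(BaseDataset):
    """Hypothetical format: each annotation line is '<filename> <label>'."""

    def load_annotations(self):
        video_infos = []
        with open(self.ann_file) as f:
            for line in f:
                filename, label = line.strip().split()
                video_infos.append(dict(filename=filename, label=int(label)))
        return video_infos

    def evaluate(self, results, metrics='top1_acc', logger=None):
        # Toy metric: fraction of predictions matching the stored labels.
        gts = [info['label'] for info in self.video_infos]
        correct = sum(int(p == g) for p, g in zip(results, gts))
        return {'top1_acc': correct / len(gts)}
```

The real classes parse richer annotation files and compute proper metrics, but follow the same two hooks.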
......@@ -554,7 +554,7 @@ There are two ways to work with custom datasets.
### Customize optimizer
-An example of customized optimizer is [CopyOfSGD](/mmaction/core/optimizer/copy_of_sgd.py).
+An example of a customized optimizer is [CopyOfSGD](../mmaction/core/optimizer/copy_of_sgd.py).
More generally, a customized optimizer can be defined as follows.
In `mmaction/core/optimizer/my_optimizer.py`:
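Such a file follows a register-then-build pattern; the sketch below is self-contained, with a toy dict registry and `build_optimizer` standing in for the real mmcv machinery, and placeholder arguments `a`, `b`, `c`:

```python
# Toy stand-ins for the registry machinery, so the pattern is runnable as-is.
OPTIMIZERS = {}


def register_module(cls):
    # Map the class name to the class, as the real registry does.
    OPTIMIZERS[cls.__name__] = cls
    return cls


@register_module
class MyOptimizer:
    # Hypothetical optimizer arguments; a real one would subclass
    # torch.optim.Optimizer and implement step().
    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c


def build_optimizer(cfg):
    # A config dict with a `type` key is turned into a registered object.
    cfg = dict(cfg)
    opt_cls = OPTIMIZERS[cfg.pop('type')]
    return opt_cls(**cfg)
```

With this in place, `dict(type='MyOptimizer', a=1, b=2, c=3)` in a config is enough to construct the optimizer, which is the design point of the registry.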
......@@ -580,7 +580,7 @@ Especially, if you want to construct an optimizer based on a specified model and
You can write a new optimizer constructor inheriting from [DefaultOptimizerConstructor](https://github.com/open-mmlab/mmcv/blob/master/mmcv/runner/optimizer/default_constructor.py)
and overwrite the `add_params(self, params, module)` method.
-An example of customized optimizer constructor is [TSMOptimizerConstructor](/mmaction/core/optimizer/tsm_optimizer_constructor.py).
+An example of a customized optimizer constructor is [TSMOptimizerConstructor](../mmaction/core/optimizer/tsm_optimizer_constructor.py).
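What overriding `add_params` buys can be sketched with a toy stand-in for `DefaultOptimizerConstructor`. Here `module` is a plain dict of parameter names rather than a real `nn.Module`, and the "10x learning rate for fc layers" rule is only a made-up example of the fine-grained tuning such constructors perform:

```python
# Illustrative stand-in: add_params decides, per parameter, which options
# (e.g. learning rate) that parameter group receives.
class DefaultOptimizerConstructor:
    def __init__(self, base_lr):
        self.base_lr = base_lr

    def add_params(self, params, module):
        for name in module['params']:
            params.append({'name': name, 'lr': self.base_lr})


class MyOptimizerConstructor(DefaultOptimizerConstructor):
    def add_params(self, params, module):
        # Hypothetical rule: parameters whose name contains 'fc' get ten
        # times the base learning rate; everything else gets the default.
        for name in module['params']:
            lr = self.base_lr * 10 if 'fc' in name else self.base_lr
            params.append({'name': name, 'lr': lr})
```

The real constructor iterates over actual module parameters, but the hook and the per-parameter decision are the same shape.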
More generally, a customized optimizer constructor can be defined as follows.
In `mmaction/core/optimizer/my_optimizer_constructor.py`:
......@@ -641,7 +641,7 @@ Here we show how to develop new components with an example of TSN.
3. Create a new file `mmaction/models/heads/tsn_head.py`.
-You can write a new classification head inherit from [BaseHead](/mmaction/models/heads/base.py),
+You can write a new classification head inheriting from [BaseHead](../mmaction/models/heads/base.py),
and overwrite `init_weights(self)` and `forward(self, x)` methods.
```python
......@@ -687,5 +687,5 @@ Here we show how to develop new components with an example of TSN.
## Tutorials
-Currently, we provide some tutorials for users to [finetune model](/docs/tutorials/finetune.md),
-[add new dataset](/docs/tutorials/new_dataset.md), [add new modules](/docs/tutorials/new_modules.md).
+Currently, we provide some tutorials for users to [finetune a model](tutorials/finetune.md),
+[add a new dataset](tutorials/new_dataset.md), and [add new modules](tutorials/new_modules.md).
......@@ -15,7 +15,7 @@ A pipeline consists of a sequence of operations. Each operation takes a dict as
We present a typical pipeline in the following figure. The blue blocks are pipeline operations.
With the pipeline going on, each operator can add new keys (marked as green) to the result dict or update the existing keys (marked as orange).
-![pipeline figure](/docs/imgs/data_pipeline.png)
+![pipeline figure](../imgs/data_pipeline.png)
The operations are categorized into data loading, pre-processing and formatting.
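The dict-in/dict-out contract can be illustrated with two toy operations chained together; the operation names and keys below are simplified stand-ins for the real pipeline classes:

```python
# Each pipeline operation is a callable: it takes a result dict and returns a
# result dict, adding new keys or updating existing ones, as described above.
class ToyLoadFrames:
    def __call__(self, results):
        # Adds a new key ('imgs') derived from an existing one.
        results['imgs'] = [f'frame_{i}' for i in range(results['total_frames'])]
        return results


class ToySampleFrames:
    def __init__(self, num):
        self.num = num

    def __call__(self, results):
        # Updates an existing key in place.
        results['imgs'] = results['imgs'][:self.num]
        return results


def compose(pipeline, results):
    # Runs the operations in sequence, threading the dict through.
    for op in pipeline:
        results = op(results)
    return results
```

For example, `compose([ToyLoadFrames(), ToySampleFrames(2)], dict(total_frames=4))` leaves a dict whose `imgs` key holds the first two loaded frames.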
......
......@@ -5,11 +5,11 @@ to finetune them on other datasets, so that better performance can be obtained.
There are two steps to finetune a model on a new dataset.
-1. Add support for the new dataset. See [Tutorial 2: Adding New Dataset](/docs/tutorials/new_dataset.md).
+1. Add support for the new dataset. See [Tutorial 2: Adding New Dataset](new_dataset.md).
1. Modify the configs. This will be discussed in this tutorial.
For example, if the user wants to finetune models pre-trained on the Kinetics-400 dataset on another dataset, say UCF101,
-then four parts in the config (see [here](/docs/config_recognition.md)) needs attention.
+then four parts in the config (see [here](../config_recognition.md)) need attention.
## Modify Head
......
......@@ -84,9 +84,9 @@ There are two ways to work with custom datasets.
- online conversion
-You can write a new Dataset class inherited from [BaseDataset](/mmaction/datasets/base.py), and overwrite three methods
+You can write a new Dataset class inherited from [BaseDataset](../../mmaction/datasets/base.py), and overwrite three methods
`load_annotations(self)`, `evaluate(self, results, metrics, logger)` and `dump_results(self, results, out)`,
-like [RawframeDataset](/mmaction/datasets/rawframe_dataset.py), [VideoDataset](/mmaction/datasets/video_dataset.py) or [ActivityNetDataset](/mmaction/datasets/activitynet_dataset.py).
+like [RawframeDataset](../../mmaction/datasets/rawframe_dataset.py), [VideoDataset](../../mmaction/datasets/video_dataset.py) or [ActivityNetDataset](../../mmaction/datasets/activitynet_dataset.py).
- offline conversion
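For this variant, the online-conversion route adds `dump_results` on top of the two methods seen for recognition datasets. The sketch below is self-contained and only illustrates the three-method contract; the class name and annotation format are hypothetical:

```python
import json


# Minimal illustration of the three-method contract for localization-style
# datasets; real code inherits mmaction's BaseDataset instead.
class ToyLocalizationDataset:
    def __init__(self, annotations):
        # `annotations` is a made-up {video_name: duration} mapping.
        self.video_infos = self.load_annotations(annotations)

    def load_annotations(self, annotations):
        return [dict(video_name=k, duration=v) for k, v in annotations.items()]

    def evaluate(self, results, metrics='AR@100', logger=None):
        # Placeholder metric; real datasets compute e.g. average recall.
        return {metrics: 0.0}

    def dump_results(self, results, out):
        # Persist the predicted proposals/results to disk.
        with open(out, 'w') as f:
            json.dump(results, f)
```

`dump_results` is what lets downstream evaluation tools consume the predictions from a file rather than from memory.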
......
......@@ -2,7 +2,7 @@
## Customize Optimizer
-An example of customized optimizer is [CopyOfSGD](/mmaction/core/optimizer/copy_of_sgd.py) is defined in `mmaction/core/optimizer/copy_of_sgd.py`.
+An example of a customized optimizer, [CopyOfSGD](../../mmaction/core/optimizer/copy_of_sgd.py), is defined in `mmaction/core/optimizer/copy_of_sgd.py`.
More generally, a customized optimizer can be defined as follows.
Assume you want to add an optimizer named as `MyOptimizer`, which has arguments `a`, `b` and `c`.
......@@ -54,7 +54,7 @@ The users can do those fine-grained parameter tuning through customizing optimiz
You can write a new optimizer constructor inheriting from [DefaultOptimizerConstructor](https://github.com/open-mmlab/mmcv/blob/master/mmcv/runner/optimizer/default_constructor.py)
and overwrite the `add_params(self, params, module)` method.
-An example of customized optimizer constructor is [TSMOptimizerConstructor](/mmaction/core/optimizer/tsm_optimizer_constructor.py).
+An example of a customized optimizer constructor is [TSMOptimizerConstructor](../../mmaction/core/optimizer/tsm_optimizer_constructor.py).
More generally, a customized optimizer constructor can be defined as follows.
In `mmaction/core/optimizer/my_optimizer_constructor.py`:
......@@ -144,7 +144,7 @@ Here we show how to develop a new head with the example of TSNHead as the follow
1. Create a new file `mmaction/models/heads/tsn_head.py`.
-You can write a new classification head inheriting from [BaseHead](/mmaction/models/heads/base.py),
+You can write a new classification head inheriting from [BaseHead](../../mmaction/models/heads/base.py),
and overwrite `init_weights(self)` and `forward(self, x)` method.
```python
......