PaddlePaddle Fast ImageNet can train on the ImageNet dataset in fewer epochs. We implemented it following the blog post
[Now anyone can train Imagenet in 18 minutes](https://www.fast.ai/2018/08/10/fastai-diu-imagenet/) published on the [fast.ai](https://www.fast.ai) website.
PaddlePaddle Fast ImageNet uses a dynamic batch size, dynamic image size, rectangular-image validation, and other tricks, so it can reach the baseline accuracy
(acc1: 75%, acc5: 93%) in 27 epochs on 8 V100 GPUs.
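
The core idea behind the dynamic settings is progressive resizing: train on small images with large batches early, then switch to larger images (and smaller per-GPU batches) as training proceeds. The sketch below illustrates such a schedule; the breakpoints, image sizes, and batch sizes are illustrative assumptions, not the exact values used in this repo (see the training script and the fast.ai blog for the real schedule).

```python
# Illustrative sketch of a progressive-resizing schedule.
# The concrete breakpoints and values below are assumptions for illustration only.
def image_size_and_batch_size(epoch, total_epochs=27):
    """Return (image_size, per_gpu_batch_size) for the given epoch."""
    if epoch < total_epochs * 0.5:
        return 128, 512   # small images -> large per-GPU batches early on
    elif epoch < total_epochs * 0.85:
        return 224, 224   # medium images in the middle of training
    else:
        return 288, 128   # larger images (rectangular at validation) at the end
```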
## Experiment
1. Prepare the training data: resize the images to 160 and 352 using `resize.py`. The prepared data folder should look like: