diff --git a/README.md b/README.md
old mode 100755
new mode 100644
index 8550c961cc0223894119d14144e85c1d3a1ba942..c3f165989da359885c6378be9568324d02bb5c32
--- a/README.md
+++ b/README.md
@@ -12,57 +12,56 @@ MegEngine is a fast, scalable and easy-to-use deep learning framework, with auto
 
 ## Installation
 
-**NOTE:** MegEngine now supports Linux-64bit/Windows-64bit/MacOS-10.14+ (CPU-Only) Platforms with Python from 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through [Windows Subsystem for Linux (WSL)](https://docs.microsoft.com/en-us/windows/wsl) or install the Windows distribution directly.
+**NOTE:** MegEngine now supports Python installation on Linux-64bit/Windows-64bit/MacOS-10.14+ (CPU only) platforms with Python from 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through [Windows Subsystem for Linux (WSL)](https://docs.microsoft.com/en-us/windows/wsl) or install the Windows distribution directly. Many other platforms are supported for inference.
 
 ### Binaries
 
-Commands to install from binaries via pip wheels are as follows:
+To install the pre-built binaries via pip wheels:
 
 ```bash
 python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
 ```
 
-## Build from Source
+## Building from Source
 
 ### Prerequisites
 
-Most of the dependencies of MegEngine are located in `third_party` directory, which can be prepared by executing:
+Most of the dependencies of MegEngine are located in the [third_party](third_party) directory, which can be prepared by executing:
 
 ```bash
 ./third_party/prepare.sh
 ./third_party/install-mkl.sh
 ```
 
-But some dependencies need to be Installed manually:
+But some dependencies need to be installed manually:
 
-* [CUDA](https://developer.nvidia.com/cuda-toolkit-archive)(>=10.1), [cuDNN](https://developer.nvidia.com/cudnn)(>=7.6)are required when building MegEngine with CUDA support.
+* [CUDA](https://developer.nvidia.com/cuda-toolkit-archive)(>=10.1) and [cuDNN](https://developer.nvidia.com/cudnn)(>=7.6) are required when building MegEngine with CUDA support.
 * [TensorRT](https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/index.html)(>=5.1.5) is required when building with TensorRT support.
 * LLVM/Clang(>=6.0) is required when building with Halide JIT support.
-* Python(>=3.5), Numpy, are required to build Python modules.
+* Python(>=3.5) and numpy are required to build the Python modules.
 
 ### Build
-
 MegEngine uses CMake as the build tool. We provide the following scripts to facilitate building.
-* [host_build.sh](scripts/cmake-build/host_build.sh) is to build MegEngine targeted to run on the same host machine.
-Please run the following command to get help information:
+* [host_build.sh](scripts/cmake-build/host_build.sh) builds MegEngine that runs on the same host machine (i.e., no cross-compiling).
+  The following command displays the usage:
 ```
 scripts/cmake-build/host_build.sh -h
 ```
 
-* [cross_build_android_arm_inference.sh](scripts/cmake-build/cross_build_android_arm_inference.sh) is to build MegEngine targeted to run at Android-ARM platforms.
-Please run the following command to get help information:
+* [cross_build_android_arm_inference.sh](scripts/cmake-build/cross_build_android_arm_inference.sh) builds MegEngine for DNN inference on Android-ARM platforms.
+  The following command displays the usage:
 ```
 scripts/cmake-build/cross_build_android_arm_inference.sh -h
 ```
 
-* [cross_build_linux_arm_inference.sh](scripts/cmake-build/cross_build_linux_arm_inference.sh) is to build MegEngine targeted to run at Linux-ARM platforms.
-Please run the following command to get help information:
+* [cross_build_linux_arm_inference.sh](scripts/cmake-build/cross_build_linux_arm_inference.sh) builds MegEngine for DNN inference on Linux-ARM platforms.
+  The following command displays the usage:
 ```
 scripts/cmake-build/cross_build_linux_arm_inference.sh -h
 ```
 
-* [cross_build_ios_arm_inference.sh](scripts/cmake-build/cross_build_ios_arm_inference.sh) is to build MegEngine targeted to run iphone/iPad platforms.
-Please run the following command to get help information:
+* [cross_build_ios_arm_inference.sh](scripts/cmake-build/cross_build_ios_arm_inference.sh) builds MegEngine for DNN inference on iOS (iPhone/iPad) platforms.
+  The following command displays the usage:
 ```
 scripts/cmake-build/cross_build_ios_arm_inference.sh
 ```
@@ -70,9 +69,9 @@ Please refer to [BUILD_README.md](scripts/cmake-build/BUILD_README.md) for more
 
 ## How to Contribute
 
-* MegEngine adopts [Contributor Covenant](https://contributor-covenant.org) to maintain our community. Please read the [Code of Conduct](CODE_OF_CONDUCT.md) to get more information.
-* Every contributor of MegEngine must sign a Contributor License Agreement (CLA) to clarify the intellectual property license granted with the contributions. For more details, please refer [Contributor License Agreement](CONTRIBUTOR_LICENSE_AGREEMENT.md)
-* You can help MegEngine better in many ways:
+* MegEngine adopts [Contributor Covenant](https://contributor-covenant.org) as a guideline for our community. Please read the [Code of Conduct](CODE_OF_CONDUCT.md).
+* Every contributor of MegEngine must sign a [Contributor License Agreement (CLA)](CONTRIBUTOR_LICENSE_AGREEMENT.md) to clarify the intellectual property license granted with the contributions.
+* You can help improve MegEngine in many ways:
   * Write code.
   * Improve [documentation](https://github.com/MegEngine/Docs).
   * Answer questions on [MegEngine Forum](https://discuss.megengine.org.cn), or Stack Overflow.
@@ -81,13 +80,13 @@ Please refer to [BUILD_README.md](scripts/cmake-build/BUILD_README.md) for more
   * Report or investigate [bugs and issues](https://github.com/MegEngine/MegEngine/issues).
   * Review [Pull Requests](https://github.com/MegEngine/MegEngine/pulls).
   * Star MegEngine repo.
-  * Reference MegEngine in your papers and articles.
+  * Cite MegEngine in your papers and articles.
   * Recommend MegEngine to your friends.
-  * ...
+  * Any other form of contribution is welcome.
 
-We believe we can build an open and friendly community and power humanity with AI.
+We strive to build an open and friendly community. We aim to power humanity with AI.
 
-## How to contact us
+## How to Contact Us
 
 * Issue: [github.com/MegEngine/MegEngine/issues](https://github.com/MegEngine/MegEngine/issues)
 * Email: [megengine-support@megvii.com](mailto:megengine-support@megvii.com)
@@ -105,4 +104,4 @@ We believe we can build an open and friendly community and power humanity with A
 
 MegEngine is Licensed under the Apache License, Version 2.0
 
-Copyright (c) 2014-2020 Megvii Inc. All rights reserved.
+Copyright (c) 2014-2021 Megvii Inc. All rights reserved.
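The new installation text above ends at the `pip install` command without showing how to confirm the result. A minimal sanity check could look like the following sketch; it assumes the wheel installed cleanly and that the package exposes the usual `megengine.__version__` attribute:

```bash
# Confirm that the installed wheel imports and report its version.
python3 -c "import megengine; print(megengine.__version__)"
```

If the import fails, re-check that the Python version is within the supported 3.5–3.8 range noted in the README.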
diff --git a/README_CN.md b/README_CN.md
old mode 100755
new mode 100644
index b8abb296fb1d6174d86a8ca638a58745a802aa0d..dca0a078f8f3d8b07a43be8c525013c09da21372
--- a/README_CN.md
+++ b/README_CN.md
@@ -13,7 +13,7 @@ MegEngine 是一个快速、可拓展、易于使用且支持自动求导的深
 
 ## 安装说明
 
-**注意:** MegEngine 现在支持 Linux-64bit/Windows-64bit/macos-10.14及其以上 (MacOS只支持cpu) 平台安装,支持Python3.5 到 Python3.8。对于 Windows 10 用户,可以通过安装 [Windows Subsystem for Linux (WSL)](https://docs.microsoft.com/en-us/windows/wsl) 进行体验,同时我们也原生支持Windows。
+**注意:** MegEngine 现在支持在 Linux-64bit/Windows-64bit/macos-10.14及其以上 (MacOS只支持cpu) 等平台上安装 Python 包,支持Python3.5 到 Python3.8。对于 Windows 10 用户,可以通过安装 [Windows Subsystem for Linux (WSL)](https://docs.microsoft.com/en-us/windows/wsl) 进行体验,同时我们也原生支持Windows。MegEngine 也支持在很多其它平台上进行推理运算。
 
 ### 通过包管理器安装
 
@@ -27,11 +27,11 @@ python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
 
 ### 环境依赖
 
-大多数编译 MegEngine 的依赖位于 `third_party` 目录,可以通过以下命令自动安装:
+大多数编译 MegEngine 的依赖位于 [third_party](third_party) 目录,可以通过以下命令自动安装:
 
 ```bash
-$ ./third_party/prepare.sh
-$ ./third_party/install-mkl.sh
+./third_party/prepare.sh
+./third_party/install-mkl.sh
 ```
 
 但是有一些依赖需要手动安装:
 
@@ -47,23 +47,23 @@ MegEngine使用CMake作为构建工具。我们提供以下脚本来帮助编译
 
 * [host_build.sh](scripts/cmake-build/host_build.sh) 用于本地编译。
 参数 -h 可用于查询脚本支持的参数:
-  
+
 ```
 scripts/cmake-build/host_build.sh -h
 ```
 * [cross_build_android_arm_inference.sh](scripts/cmake-build/cross_build_android_arm_inference.sh) 用于ARM-安卓交叉编译。
 参数 -h 可用于查询脚本支持的参数:
-  
+
 ```
 scripts/cmake-build/cross_build_android_arm_inference.sh -h
 ```
 * [cross_build_linux_arm_inference.sh](scripts/cmake-build/cross_build_linux_arm_inference.sh) 用于ARM-Linux交叉编译。
 参数 -h 可用于查询脚本支持的参数:
-  
+
 ```
 scripts/cmake-build/cross_build_linux_arm_inference.sh -h
 ```
-* [cross_build_ios_arm_inference.sh](scripts/cmake-build/cross_build_ios_arm_inference.sh) 用于IOS交叉编译。
+* [cross_build_ios_arm_inference.sh](scripts/cmake-build/cross_build_ios_arm_inference.sh) 用于iOS交叉编译。
 参数 -h 可用于查询脚本支持的参数:
 
 ```
@@ -97,7 +97,7 @@ MegEngine使用CMake作为构建工具。我们提供以下脚本来帮助编译
 * 邮箱: [megengine-support@megvii.com](mailto:megengine-support@megvii.com)
 * 论坛: [discuss.megengine.org.cn](https://discuss.megengine.org.cn)
 * QQ: 1029741705
-* OPENI: [openi.org.cn/MegEngine](https://www.openi.org.cn/html/2020/Framework_0325/18.html)
+* OPENI: [openi.org.cn/MegEngine](https://www.openi.org.cn/html/2020/Framework_0325/18.html)
 
 ## 资源
 
@@ -109,4 +109,4 @@ MegEngine使用CMake作为构建工具。我们提供以下脚本来帮助编译
 
 MegEngine 使用 Apache License, Version 2.0
 
-Copyright (c) 2014-2020 Megvii Inc. All rights reserved.
+Copyright (c) 2014-2021 Megvii Inc. All rights reserved.
diff --git a/logo.png b/logo.png
old mode 100755
new mode 100644
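Both READMEs list the manually installed prerequisites (CUDA >= 10.1, cuDNN >= 7.6, TensorRT >= 5.1.5, LLVM/Clang >= 6.0, Python >= 3.5 with numpy) but do not show how to verify them before invoking the cmake-build scripts. The sketch below is one rough pre-build check; it assumes the standard vendor tools are on `PATH` and does not cover cuDNN or TensorRT, whose versions are recorded in their library headers rather than in a command-line tool:

```bash
# Rough check of the host toolchain before running the scripts under scripts/cmake-build/.
cmake --version                                        # CMake drives all of the build scripts
clang --version                                        # LLVM/Clang >= 6.0, needed for Halide JIT support
nvcc --version                                         # CUDA toolkit >= 10.1, needed for CUDA builds
python3 --version                                      # Python >= 3.5 for the Python modules
python3 -c "import numpy; print(numpy.__version__)"    # numpy must be importable
```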