MegEngine
Documentation | 中文文档
MegEngine is a fast, scalable, and user-friendly deep learning framework with three key features.
- Unified framework for both training and inference
  - Quantization, dynamic shape/image pre-processing, and even derivation with a single model.
  - After training, put everything into your model for inference on any platform with speed and precision. Check here for a quick guide.
- The lowest hardware requirements
  - GPU memory usage can be reduced to one-third of the original when the DTR algorithm is enabled.
  - Run inference models with minimal memory usage by leveraging our Pushdown memory planner.
- Efficient inference on all platforms
  - Run inference with speed and high precision on x86, Arm, CUDA, and ROCm.
  - Supports Linux, Windows, iOS, Android, TEE, etc.
  - Optimize performance and memory usage by leveraging our advanced features.
Installation
NOTE: MegEngine now supports Python installation on Linux-64bit/Windows-64bit/macOS(CPU-Only)-10.14+/Android 7+(CPU-Only) platforms with Python 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through the Windows Subsystem for Linux (WSL) or install the Windows distribution directly. Many other platforms are supported for inference.
Binaries
To install the pre-built binaries via pip wheels:
python3 -m pip install --upgrade pip
python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
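After the wheel installs, a quick sanity check (assuming the install succeeded; the exact version string depends on the wheel you fetched) is to import the package and print its version:

```shell
# Verify that MegEngine imports cleanly and report its version
python3 -c "import megengine; print(megengine.__version__)"
```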
Building from Source
- For CMake build details, please refer to BUILD_README.md
- For Python binding build details, please refer to BUILD_PYTHON_WHL_README.md
How to Contribute
- MegEngine adopts the Contributor Covenant as a guideline for running our community. Please read the Code of Conduct.
- Every contributor of MegEngine must sign a Contributor License Agreement (CLA) to clarify the intellectual property license granted with the contributions.
- You can help to improve MegEngine in many ways:
- Write code.
- Improve documentation.
- Answer questions on MegEngine Forum, or Stack Overflow.
- Contribute new models in MegEngine Model Hub.
- Try a new idea on MegStudio.
- Report or investigate bugs and issues.
- Review Pull Requests.
- Star MegEngine repo.
- Cite MegEngine in your papers and articles.
- Recommend MegEngine to your friends.
- Any other form of contribution is welcome.
We strive to build an open and friendly community. We aim to power humanity with AI.
How to Contact Us
- Issue: github.com/MegEngine/MegEngine/issues
- Email: megengine-support@megvii.com
- Forum: discuss.megengine.org.cn
- QQ Group: 1029741705
Resources
- MegEngine
- MegStudio
- Mirror repositories:
  - OpenI: openi.org.cn/MegEngine
  - Gitee: gitee.com/MegEngine/MegEngine
License
MegEngine is licensed under the Apache License, Version 2.0.
Citation
If you use MegEngine in your publication, please cite it using the following BibTeX entry.
@Misc{MegEngine,
  institution = {megvii},
  title = {MegEngine: A fast, scalable and easy-to-use deep learning framework},
  howpublished = {\url{https://github.com/MegEngine/MegEngine}},
  year = {2020},
}
Copyright (c) 2014-2021 Megvii Inc. All rights reserved.