Commit b30ec5e9 authored by mindspore-ci-bot, committed by Gitee

!5439 add image_classification in model zoo

Merge pull request !5439 from zhangbiao31/zb0828
# MindSpore
build/
mindspore/lib
output
*.ir
mindspore/ccsrc/schema/inner/*
# Cmake files
CMakeFiles/
cmake_install.cmake
CMakeCache.txt
Makefile
cmake-build-debug
# Dynamic libraries
*.so
*.so.*
*.dylib
# Static libraries
*.la
*.lai
*.a
*.lib
# Protocol buffers
*_pb2.py
*.pb.h
*.pb.cc
# Object files
*.o
# Editor
.vscode
.idea/
# Cquery
.cquery_cached_index/
compile_commands.json
# Ctags and cscope
tags
TAGS
CTAGS
GTAGS
GRTAGS
GSYMS
GPATH
cscope.*
# Python files
*__pycache__*
.pytest_cache
# Mac files
*.DS_Store
# Test results
test_temp_summary_event_file/
*.dot
*.dat
*.svg
*.perf
*.info
*.ckpt
*.shp
*.pkl
.clangd
mindspore/version.py
mindspore/default_config.py
mindspore/.commit_id
onnx.proto
mindspore/ccsrc/onnx.proto
# Android
local.properties
.gradle
sdk/build
sdk/.cxx
app/.cxx
## MindSpore Lite On-Device Image Classification Demo (Android)
This sample application demonstrates how to use the MindSpore Lite C++ API (through Android JNI) and a MindSpore Lite image classification model to perform on-device inference, classify the content captured by the device camera, and display the most probable classification result in the app's image preview screen.
### Requirements
- Android Studio >= 3.2 (4.0 or later recommended)
- NDK 21.3
- CMake 3.10
- Android SDK >= 26
- OpenCV >= 4.0.0
### Build and Run
1. Load the sample source code into Android Studio and install the corresponding SDK (after you specify the SDK version, Android Studio installs it automatically).
![start_home](images/home.png)
After starting Android Studio, click `File->Settings->System Settings->Android SDK` and select the required SDKs. As shown in the figure below, after they are selected, click `OK` and Android Studio installs them automatically.
![start_sdk](images/sdk_management.png)
(Optional) If an NDK version issue occurs during installation, manually download the matching [NDK version](https://developer.android.com/ndk/downloads?hl=zh-cn) (this sample uses NDK 21.3) and specify the NDK location under `Android NDK location` in `Project Structure`.
![project_structure](images/project_structure.png)
2. Connect an Android device and run the image classification application.
Connect the Android device via USB for debugging and click `Run 'app'` to run the sample project on your device.
* Note: During the build, Android Studio automatically downloads MindSpore Lite, OpenCV, the model file, and other dependencies; please wait patiently for the build to complete.
![run_app](images/run_app.PNG)
For details about how to connect Android Studio to a device for debugging, see <https://developer.android.com/studio/run/device?hl=zh-cn>.
3. On the Android device, tap "Continue installation". After installation, you can view the content captured by the device camera and the inference results.
![install](images/install.jpg)
As shown below, the object recognized with the highest probability is a plant.
![result](images/app_result.jpg)
## Sample Application Details
This on-device image classification Android sample is divided into a Java layer and a JNI layer. The Java layer uses the Android Camera 2 API to capture image frames from the camera and perform the related image processing; the JNI layer completes model inference through the [Runtime](https://www.mindspore.cn/tutorial/zh-CN/master/use/lite_runtime.html).
> This document details the JNI-layer implementation of the sample. The Java layer uses the Android Camera 2 API to open the device camera and process image frames, and readers are assumed to have basic knowledge of Android development.
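For orientation, the JNI layer exposes three native methods that the Java layer binds to. The signatures below are taken from `MindSporeNetnative.cpp`, which is listed in full later on this page: `loadModel` creates the inference environment from a model buffer, `runNet` runs inference on a bitmap, and `unloadModel` releases the session.
```cpp
// JNI entry points implemented in src/main/cpp/MindSporeNetnative.cpp.
extern "C" JNIEXPORT jlong JNICALL
Java_com_huawei_himindsporedemo_gallery_classify_TrackingMobile_loadModel(
        JNIEnv *env, jobject thiz, jobject model_buffer, jint num_thread);

extern "C" JNIEXPORT jstring JNICALL
Java_com_huawei_himindsporedemo_gallery_classify_TrackingMobile_runNet(
        JNIEnv *env, jclass type, jlong netEnv, jobject srcBitmap);

extern "C" JNIEXPORT jboolean JNICALL
Java_com_huawei_himindsporedemo_gallery_classify_TrackingMobile_unloadModel(
        JNIEnv *env, jclass type, jlong netEnv);
```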
### Project Structure
```
app
|
├── libs # libraries that the demo's JNI layer depends on
│   └── arm64-v8a
│       ├── libopencv_java4.so # OpenCV
│       ├── libmlkit-label-MS.so # library built by the NDK
│       └── libmindspore-lite.so # MindSpore Lite
|
├── src/main
│   ├── assets # resource files
|   |   └── mobilenetv2.ms # model file
│   |
│   ├── cpp # classes encapsulating model loading and prediction
|   |   ├── include # header files for the MindSpore calls
|   |   |   └── ...
│   |   |
|   |   ├── MindSporeNetnative.cpp # JNI methods for the MindSpore calls
│   |   └── MindSporeNetnative.h # header file
│   |
│   ├── java # Java-layer application code
│   │   └── com.huawei.himindsporedemo
│   │       ├── gallery.classify # image processing and MindSpore JNI call implementation
│   │       │   └── ...
│   │       └── obejctdetect # camera startup and drawing implementation
│   │           └── ...
│   │
│   ├── res # Android resource files
│   └── AndroidManifest.xml # Android manifest file
├── CMakeList.txt # CMake build entry file
├── build.gradle # other Android build configuration
├── download.gradle # downloads the dependent libraries and model file from the HuaWei server during the APP build
└── ...
```
### Configuring MindSpore Lite Dependencies
When the Android JNI layer calls the MindSpore C++ API, it needs the supporting library files. You can generate the `libmindspore-lite.so` library by [building MindSpore Lite from source](https://www.mindspore.cn/lite/docs/zh-CN/master/deploy.html), or directly download the prebuilt ARM64, ARM32, x86, and other [packages](#TODO) provided by MindSpore Lite.
In Android Studio, place the compiled `libmindspore-lite.so` library files (multiple compatible architectures may be included) in the APP project's `app/libs/arm64-v8a` (ARM64) or `app/libs/armeabi-v7a` (ARM32) directory, and configure CMake build support as well as `arm64-v8a` and `armeabi-v7a` build support in the application's `build.gradle` file.
In this sample, the build process uses the download.gradle file to automatically download libmindspore-lite.so and OpenCV's libopencv_java4.so from the HuaWei server and place them in the `app/libs/arm64-v8a` directory.
* Note: If the automatic download fails, manually download the library files and place them in the corresponding locations:
    * libmindspore-lite.so [download link](https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%200.7/libmindspore-lite.so)
    * libopencv_java4.so [download link](https://download.mindspore.cn/model_zoo/official/lite/lib/opencv%204.4.0/libopencv_java4.so)
```
android{
defaultConfig{
externalNativeBuild{
cmake{
arguments "-DANDROID_STL=c++_shared"
}
}
ndk{
abiFilters 'arm64-v8a'
}
}
}
```
Create links to the `.so` libraries in the `app/CMakeLists.txt` file, as shown below.
```
# Set MindSpore Lite Dependencies.
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include/MindSpore)
add_library(mindspore-lite SHARED IMPORTED )
set_target_properties(mindspore-lite PROPERTIES
IMPORTED_LOCATION "${CMAKE_SOURCE_DIR}/libs/libmindspore-lite.so")
# Set OpenCV Dependencies.
include_directories(${CMAKE_SOURCE_DIR}/opencv/sdk/native/jni/include)
add_library(lib-opencv SHARED IMPORTED )
set_target_properties(lib-opencv PROPERTIES
IMPORTED_LOCATION "${CMAKE_SOURCE_DIR}/libs/libopencv_java4.so")
# Link target library.
target_link_libraries(
...
mindspore-lite
lib-opencv
...
)
```
### Downloading and Deploying the Model File
Download the model file from the MindSpore Model Hub. The on-device image classification model used in this sample is `mobilenetv2.ms`, which is likewise downloaded automatically by the download.gradle script during the APP build and placed in the `app/src/main/assets` project directory.
* Note: If the download fails, manually download the model file mobilenetv2.ms [download link](https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/mobilenetv2.ms)
### Writing On-Device Inference Code
Call the MindSpore Lite C++ API in the JNI layer to implement on-device inference.
The inference flow is as follows; for the complete code, see `src/cpp/MindSporeNetnative.cpp`.
1. Load the MindSpore Lite model file and build the context, session, and computational graph used for inference.
- Load the model file: create and configure the context used for model inference.
```cpp
// Buffer is the model data passed in by the Java layer
jlong bufferLen = env->GetDirectBufferCapacity(buffer);
char *modelBuffer = CreateLocalModelBuffer(env, buffer);
```
- Create the session.
```cpp
void **labelEnv = new void *;
MSNetWork *labelNet = new MSNetWork;
*labelEnv = labelNet;
// Create context.
lite::Context *context = new lite::Context;
context->thread_num_ = numThread; //Specify the number of threads to run inference
// Create the mindspore session.
labelNet->CreateSessionMS(modelBuffer, bufferLen, context);
delete(context);
```
- Load the model file and build the computational graph for inference.
```cpp
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, mindspore::lite::Context* ctx)
{
    session = mindspore::session::LiteSession::CreateSession(ctx);
    auto model = mindspore::lite::Model::Import(modelBuffer, bufferLen);
    int ret = session->CompileGraph(model);  // Compile Graph
}
```
2. Convert the input image into the tensor format required by the MindSpore model.
Convert the image data to be classified into the tensor that is fed into the MindSpore model.
```cpp
// Convert the Bitmap image passed in from the JAVA layer to Mat for OpenCV processing
BitmapToMat(env, srcBitmap, matImageSrc);
// Processing such as zooming the picture size.
matImgPreprocessed = PreProcessImageData(matImageSrc);
ImgDims inputDims;
inputDims.channel = matImgPreprocessed.channels();
inputDims.width = matImgPreprocessed.cols;
inputDims.height = matImgPreprocessed.rows;
float *dataHWC = new float[inputDims.channel * inputDims.width * inputDims.height];
// Copy the image data to be detected to the dataHWC array.
// The dataHWC[image_size] array here is the intermediate variable of the input MindSpore model tensor.
float *ptrTmp = reinterpret_cast<float *>(matImgPreprocessed.data);
for(int i = 0; i < inputDims.channel * inputDims.width * inputDims.height; i++){
dataHWC[i] = ptrTmp[i];
}
// Assign dataHWC[image_size] to the input tensor variable.
auto msInputs = mSession->GetInputs();
auto inTensor = msInputs.front();
memcpy(inTensor->MutableData(), dataHWC,
inputDims.channel * inputDims.width * inputDims.height * sizeof(float));
delete[] (dataHWC);
```
3. Run the model on the input tensor, obtain the output tensor, and post-process it.
- Run the graph for on-device inference.
```cpp
// After the model and image tensor data is loaded, run inference.
auto status = mSession->RunGraph();
```
- Obtain the output data.
```cpp
// Get the mindspore inference results.
auto msOutputs = mSession->GetOutputMapByNode();
std::string retStr = ProcessRunnetResult(msOutputs);
```
- Post-process the output data.
```cpp
std::string ProcessRunnetResult(
std::unordered_map<std::string, std::vector<mindspore::tensor::MSTensor *>> msOutputs){
// Get the branch of the model output.
// Use iterators to get map elements.
std::unordered_map<std::string, std::vector<mindspore::tensor::MSTensor *>>::iterator iter;
iter = msOutputs.begin();
// The mobilenetv2.ms model outputs just one branch.
auto outputString = iter->first;
auto outputTensor = iter->second;
float *temp_scores = static_cast<float * >(outputTensor[0]->MutableData());
float scores[RET_CATEGORY_SUM];
for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
if (temp_scores[i] > 0.5){
MS_PRINT("MindSpore scores[%d] : [%f]", i, temp_scores[i]);
}
scores[i] = temp_scores[i];
}
// Converted to text information that needs to be displayed in the APP.
std::string categoryScore = "";
for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
categoryScore += g_labels_name_map[i];
categoryScore += ":";
std::string score_str = std::to_string(scores[i]);
categoryScore += score_str;
categoryScore += ";";
}
return categoryScore;
}
```
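The returned string carries a score for every category. To display only the most probable result, as the preview screen does, the top-1 label can be selected from the score array. The helper below is an illustrative sketch rather than part of the demo sources; it assumes the `scores` array, `RET_CATEGORY_SUM`, and `g_labels_name_map` shown above.
```cpp
#include <string>

// Illustrative sketch (not part of the demo sources): return the label with
// the highest score. Assumes RET_CATEGORY_SUM and g_labels_name_map as
// declared in HMS/HMS_label_thres.h.
std::string GetTopCategory(const float *scores) {
    int maxIndex = 0;
    for (int i = 1; i < RET_CATEGORY_SUM; ++i) {
        if (scores[i] > scores[maxIndex]) {
            maxIndex = i;
        }
    }
    return g_labels_name_map[maxIndex];
}
```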
/build
# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html
# Sets the minimum version of CMake required to build the native library.
cmake_minimum_required(VERSION 3.4.1)
set(CMAKE_VERBOSE_MAKEFILE on)
set(libs ${CMAKE_SOURCE_DIR}/libs)
set(CMAKE_LIBRARY_OUTPUT_DIRECTORY ${CMAKE_SOURCE_DIR}/libs/${ANDROID_ABI})
# ============== Set MindSpore Dependencies. =============
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include/MindSpore)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include/MindSpore/flatbuffers)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include/MindSpore/ir/dtype)
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include/MindSpore/schema)
add_library(mindspore-lite SHARED IMPORTED )
set_target_properties(mindspore-lite PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/libs/${ANDROID_ABI}/libmindspore-lite.so)
# --------------- MindSpore Lite set End. --------------------
# =============== Set OpenCV Dependencies ===================
include_directories(${CMAKE_SOURCE_DIR}/opencv/sdk/native/jni/include/)
add_library(lib-opencv SHARED IMPORTED )
set_target_properties(lib-opencv PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/libs/${ANDROID_ABI}/libopencv_java4.so)
# --------------- OpenCV set End. ---------------------------
# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.
file(GLOB_RECURSE cpp_src "src/main/cpp/*.cpp" "src/main/cpp/*.h")
add_library( # Sets the name of the library.
mlkit-label-MS
# Sets the library as a shared library.
SHARED
# Provides a relative path to your source file(s).
${cpp_src})
# Searches for a specified prebuilt library and stores the path as a
# variable. Because CMake includes system libraries in the search path by
# default, you only need to specify the name of the public NDK library
# you want to add. CMake verifies that the library exists before
# completing its build.
find_library( # Sets the name of the path variable.
log-lib
# Specifies the name of the NDK library that
# you want CMake to locate.
log )
find_library( jnigraphics-lib jnigraphics )
# Specifies libraries CMake should link to your target library. You
# can link multiple libraries, such as libraries you define in this
# build script, prebuilt third-party libraries, or system libraries.
add_definitions(-DMNN_USE_LOGCAT)
target_link_libraries( # Specifies the target library.
mlkit-label-MS
# --- opencv ---
lib-opencv
# --- mindspore ---
mindspore-lite
# --- other dependencies.---
-ljnigraphics
android
# Links the target library to the log library
${log-lib}
)
apply plugin: 'com.android.application'
android {
compileSdkVersion 30
buildToolsVersion "30.0.1"
defaultConfig {
applicationId "com.huawei.himindsporedemo"
minSdkVersion 21
targetSdkVersion 30
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
externalNativeBuild {
cmake {
arguments "-DANDROID_STL=c++_shared"
cppFlags ""
}
}
ndk {
abiFilters 'arm64-v8a'
}
}
aaptOptions {
noCompress '.so', 'ms'
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
customDebugType {
debuggable true
}
}
externalNativeBuild {
cmake {
path file('CMakeLists.txt')
}
}
ndkVersion '21.3.6528147'
sourceSets{
main {
jniLibs.srcDirs = ['libs']
}
}
packagingOptions{
pickFirst 'lib/arm64-v8a/libopencv_java4.so'
pickFirst 'lib/arm64-v8a/libmindspore-lite.so'
pickFirst 'lib/arm64-v8a/libmlkit-label-MS.so'
}
}
// Before gradle build.
// To download some necessary libraries.
apply from:'download.gradle'
/*if (!file("libs/arm64-v8a/libmindspore-lite.so").exists() ||
!file("libs/arm64-v8a/libopencv_java4.so").exists()){
apply from:'download.gradle'
}*/
dependencies {
implementation fileTree(dir: "libs", include: ["*.jar"])
implementation 'androidx.appcompat:appcompat:1.1.0'
implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
// implementation project(path: ':sdk')
testImplementation 'junit:junit:4.12'
androidTestImplementation 'androidx.test.ext:junit:1.1.1'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
implementation 'com.google.android.material:material:1.0.0'
androidTestImplementation 'com.android.support.test:rules:1.0.2'
androidTestImplementation 'com.google.truth:truth:1.0.1'
}
/**
 * Downloads the necessary libraries from the HuaWei server,
 * including the mindspore-lite .so file, the OpenCV .so file, and the model file.
* The libraries can be downloaded manually.
*/
def targetModelFile = "src/main/assets/model/mobilenetv2.ms"
def openCVLibrary_arm64 = "libs/arm64-v8a/libopencv_java4.so"
def mindSporeLibrary_arm64 = "libs/arm64-v8a/libmindspore-lite.so"
def modelDownloadUrl = "https://download.mindspore.cn/model_zoo/official/lite/mobilenetv2_openimage_lite/mobilenetv2.ms"
def opencvDownloadUrl = "https://download.mindspore.cn/model_zoo/official/lite/lib/opencv%204.4.0/libopencv_java4.so"
def mindsporeLiteDownloadUrl = "https://download.mindspore.cn/model_zoo/official/lite/lib/mindspore%20version%200.7/libmindspore-lite.so"
task downloadModelFile(type: DownloadUrlTask) {
doFirst {
println "Downloading ${modelDownloadUrl}"
}
sourceUrl = "${modelDownloadUrl}"
target = file("${targetModelFile}")
}
task downloadOpenCVLibrary(type: DownloadUrlTask) {
doFirst {
println "Downloading ${opencvDownloadUrl}"
}
sourceUrl = "${opencvDownloadUrl}"
target = file("${openCVLibrary_arm64}")
}
task downloadMindSporeLibrary(type: DownloadUrlTask) {
doFirst {
println "Downloading ${mindsporeLiteDownloadUrl}"
}
sourceUrl = "${mindsporeLiteDownloadUrl}"
target = file("${mindSporeLibrary_arm64}")
}
/*
* Using preBuild to download mindspore library, opencv library and model file.
* Run before gradle build.
*/
if (file("libs/arm64-v8a/libmindspore-lite.so").exists()){
downloadMindSporeLibrary.enabled = false
}
if (file("libs/arm64-v8a/libopencv_java4.so.so").exists()){
downloadOpenCVLibrary.enabled = false
}
if (file("src/main/assets/model/mobilenetv2.ms").exists()){
downloadModelFile.enabled = false
}
preBuild.dependsOn downloadMindSporeLibrary
preBuild.dependsOn downloadOpenCVLibrary
preBuild.dependsOn downloadModelFile
class DownloadUrlTask extends DefaultTask {
@Input
String sourceUrl
@OutputFile
File target
@TaskAction
void download() {
ant.get(src: sourceUrl, dest: target)
}
}
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile
package com.huawei.himindsporedemo;
import android.content.Context;
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.*;
/**
* Instrumented test, which will execute on an Android device.
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
@RunWith(AndroidJUnit4.class)
public class ExampleInstrumentedTest {
@Test
public void useAppContext() {
// Context of the app under test.
Context appContext = InstrumentationRegistry.getInstrumentation().getTargetContext();
assertEquals("com.huawei.himindsporedemo", appContext.getPackageName());
}
}
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.huawei.himindsporedemo"
android:versionCode="1"
android:versionName="1.0">
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEM" />
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity
android:name=".widget.CameraActivity"
android:screenOrientation="portrait">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
</manifest>
/*
* Copyright (c) Huawei Technologies Co., Ltd. 2018-2019. All rights reserved.
*/
#include <android/bitmap.h>
#include <android/asset_manager_jni.h>
#include <android/log.h>
#include <jni.h>
#include <cstring>
#include <set>
#include <MindSpore/errorcode.h>
#include <MindSpore/ms_tensor.h>
#include "MindSporeNetnative.h"
#include "opencv2/core.hpp"
#include "opencv2/imgproc.hpp"
#include "MindSpore/MSNetWork.h"
#include "HMS/HMS_label_thres.h"
using namespace cv;
using namespace mindspore;
using namespace mindspore::tensor;
#define MS_PRINT(format, ...) __android_log_print(ANDROID_LOG_INFO, "MSJNI", format, ##__VA_ARGS__)
void BitmapToMat2(JNIEnv *env, jobject &bitmap, Mat &mat, jboolean needUnPremultiplyAlpha) {
AndroidBitmapInfo info;
void *pixels = nullptr;
Mat &dst = mat;
CV_Assert(AndroidBitmap_getInfo(env, bitmap, &info) >= 0);
CV_Assert(info.format == ANDROID_BITMAP_FORMAT_RGBA_8888 ||
info.format == ANDROID_BITMAP_FORMAT_RGB_565);
CV_Assert(AndroidBitmap_lockPixels(env, bitmap, &pixels) >= 0);
CV_Assert(pixels);
dst.create(info.height, info.width, CV_8UC4);
if (info.format == ANDROID_BITMAP_FORMAT_RGBA_8888) {
Mat tmp(info.height, info.width, CV_8UC4, pixels);
if (needUnPremultiplyAlpha) {
cvtColor(tmp, dst, COLOR_RGBA2BGR);
} else {
tmp.copyTo(dst);
}
} else {
Mat tmp(info.height, info.width, CV_8UC4, pixels);
cvtColor(tmp, dst, COLOR_BGR5652RGBA);
}
AndroidBitmap_unlockPixels(env, bitmap);
return;
}
void BitmapToMat(JNIEnv *env, jobject &bitmap, Mat &mat) {
BitmapToMat2(env, bitmap, mat, true);
}
/**
* Processing image with resize and normalize.
*/
cv::Mat PreProcessImageData(cv::Mat input) {
cv::Mat imgFloatTmp, imgResized256, imgResized224;
int resizeWidth = 256;
int resizeHeight = 256;
float normalizMin = 1.0;
float normalizMax = 255.0;
cv::resize(input, imgFloatTmp, cv::Size(resizeWidth, resizeHeight));
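// Convert to float32 and scale pixel values from [0, 255] to [0, 1].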
imgFloatTmp.convertTo(imgResized256, CV_32FC3, normalizMin / normalizMax);
int offsetX = 16;
int offsetY = 16;
int cropWidth = 224;
int cropHeight = 224;
// Standardization processing.
float meanR = 0.485;
float meanG = 0.456;
float meanB = 0.406;
float varR = 0.229;
float varG = 0.224;
float varB = 0.225;
cv::Rect roi;
roi.x = offsetX;
roi.y = offsetY;
roi.width = cropWidth;
roi.height = cropHeight;
// The final image size of the incoming model is 224*224.
imgResized256(roi).copyTo(imgResized224);
Scalar mean = Scalar(meanR, meanG, meanB);
Scalar var = Scalar(varR, varG, varB);
cv::Mat imgResized1;
cv::Mat imgResized2;
Mat imgMean(imgResized224.size(), CV_32FC3,
mean); // imgMean Each pixel channel is (0.485, 0.456, 0.406)
Mat imgVar(imgResized224.size(), CV_32FC3,
var); // imgVar Each pixel channel is (0.229, 0.224, 0.225)
imgResized1 = imgResized224 - imgMean;
imgResized2 = imgResized1 / imgVar;
return imgResized2;
}
char *CreateLocalModelBuffer(JNIEnv *env, jobject modelBuffer) {
jbyte *modelAddr = static_cast<jbyte *>(env->GetDirectBufferAddress(modelBuffer));
int modelLen = static_cast<int>(env->GetDirectBufferCapacity(modelBuffer));
char *buffer(new char[modelLen]);
memcpy(buffer, modelAddr, modelLen);
return buffer;
}
/**
* To process the result of mindspore inference.
* @param msOutputs
* @return
*/
std::string ProcessRunnetResult(
std::unordered_map<std::string, std::vector<mindspore::tensor::MSTensor *>> msOutputs) {
// Get the branch of the model output.
// Use iterators to get map elements.
std::unordered_map<std::string, std::vector<mindspore::tensor::MSTensor *>>::iterator iter;
iter = msOutputs.begin();
// The mobilenetv2.ms model output just one branch.
auto outputString = iter->first;
auto outputTensor = iter->second;
int tensorNum = outputTensor[0]->ElementsNum();
MS_PRINT("Number of tensor elements:%d", tensorNum);
// Get a pointer to the first score.
float *temp_scores = static_cast<float * >(outputTensor[0]->MutableData());
float scores[RET_CATEGORY_SUM];
for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
if (temp_scores[i] > 0.5) {
MS_PRINT("MindSpore scores[%d] : [%f]", i, temp_scores[i]);
}
scores[i] = temp_scores[i];
}
// Score for each category.
// Converted to text information that needs to be displayed in the APP.
std::string categoryScore = "";
for (int i = 0; i < RET_CATEGORY_SUM; ++i) {
categoryScore += g_labels_name_map[i];
categoryScore += ":";
std::string score_str = std::to_string(scores[i]);
categoryScore += score_str;
categoryScore += ";";
}
return categoryScore;
}
/**
* The Java layer reads the model into MappedByteBuffer or ByteBuffer to load the model.
*/
extern "C"
JNIEXPORT jlong JNICALL
Java_com_huawei_himindsporedemo_gallery_classify_TrackingMobile_loadModel(JNIEnv *env, jobject thiz,
jobject model_buffer,
jint num_thread) {
if (nullptr == model_buffer) {
MS_PRINT("error, buffer is nullptr!");
return (jlong) nullptr;
}
jlong bufferLen = env->GetDirectBufferCapacity(model_buffer);
if (0 == bufferLen) {
MS_PRINT("error, bufferLen is 0!");
return (jlong) nullptr;
}
char *modelBuffer = CreateLocalModelBuffer(env, model_buffer);
if (modelBuffer == nullptr) {
MS_PRINT("modelBuffer create failed!");
return (jlong) nullptr;
}
// To create a mindspore network inference environment.
void **labelEnv = new void *;
MSNetWork *labelNet = new MSNetWork;
*labelEnv = labelNet;
lite::Context *context = new lite::Context;
context->thread_num_ = num_thread;
labelNet->CreateSessionMS(modelBuffer, bufferLen, context);
delete (context);
if (labelNet->session == nullptr) {
MS_PRINT("MindSpore create session failed!.");
return (jlong) nullptr;
}
if (model_buffer != nullptr) {
env->DeleteLocalRef(model_buffer);
}
return (jlong) labelEnv;
}
/**
* After the inference environment is successfully created,
 * send a picture to the model and run inference.
*/
extern "C" JNIEXPORT jstring JNICALL
Java_com_huawei_himindsporedemo_gallery_classify_TrackingMobile_runNet(JNIEnv *env, jclass type,
jlong netEnv,
jobject srcBitmap) {
Mat matImageSrc;
BitmapToMat(env, srcBitmap, matImageSrc);
Mat matImgPreprocessed = PreProcessImageData(matImageSrc);
ImgDims inputDims;
inputDims.channel = matImgPreprocessed.channels();
inputDims.width = matImgPreprocessed.cols;
inputDims.height = matImgPreprocessed.rows;
// Get the MindSpore inference environment created in loadModel().
void **labelEnv = reinterpret_cast<void **>(netEnv);
if (labelEnv == nullptr) {
MS_PRINT("MindSpore error, labelEnv is a nullptr.");
return NULL;
}
MSNetWork *labelNet = static_cast<MSNetWork *>(*labelEnv);
auto mSession = labelNet->session;
if (mSession == nullptr) {
MS_PRINT("MindSpore error, Session is a nullptr.");
return NULL;
}
MS_PRINT("MindSpore get session.");
auto msInputs = mSession->GetInputs();
if (msInputs.size() == 0) {
MS_PRINT("MindSpore error, msInputs.size() equals 0.");
return NULL;
}
auto inTensor = msInputs.front();
// dataHWC is the tensor format.
float *dataHWC = new float[inputDims.channel * inputDims.width * inputDims.height];
float *ptrTmp = reinterpret_cast<float *>(matImgPreprocessed.data);
for (int i = 0; i < inputDims.channel * inputDims.width * inputDims.height; ++i) {
dataHWC[i] = ptrTmp[i];
}
// Copy dataHWC to the model input tensor.
memcpy(inTensor->MutableData(), dataHWC,
inputDims.channel * inputDims.width * inputDims.height * sizeof(float));
// When using 'new' to allocate memory space, we need to use 'delete' to free space.
delete[] (dataHWC);
// After the model and image tensor data is loaded, run inference.
auto status = mSession->RunGraph();
if (status != lite::RET_OK) {
MS_PRINT("MindSpore run net error.");
return NULL;
}
/**
* Get the mindspore inference results.
* Return the map of output node name and MindSpore Lite MSTensor.
*/
auto msOutputs = mSession->GetOutputMapByNode();
std::string resultStr = ProcessRunnetResult(msOutputs);
const char *resultCharData = resultStr.c_str();
return (env)->NewStringUTF(resultCharData);
}
extern "C" JNIEXPORT jboolean JNICALL
Java_com_huawei_himindsporedemo_gallery_classify_TrackingMobile_unloadModel(JNIEnv *env,
jclass type,
jlong netEnv) {
MS_PRINT("MindSpore release net.");
void **labelEnv = reinterpret_cast<void **>(netEnv);
if (labelEnv == nullptr) {
MS_PRINT("MindSpore error, labelEnv is a nullptr.");
}
MSNetWork *labelNet = static_cast<MSNetWork *>(*labelEnv);
labelNet->ReleaseNets();
return (jboolean) true;
}
/*
* Copyright (c) Huawei Technologies Co., Ltd. 2018-2019. All rights reserved.
*/
#ifndef MINDSPORE_JNI_HMS_DEBUG_MINDSPORENETNATIVE_H
#define MINDSPORE_JNI_HMS_DEBUG_MINDSPORENETNATIVE_H
#endif //MINDSPORE_JNI_HMS_DEBUG_MINDSPORENETNATIVE_H
/*
* Copyright (c) Huawei Technologies Co., Ltd. 2018-2019. All rights reserved.
*/
#ifndef MNN_JNI_HMS_HMS_LABEL_THRES_H
#define MNN_JNI_HMS_HMS_LABEL_THRES_H
#include <string.h>
#include <map>
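// Number of categories output by the mobilenetv2.ms model; must match the size of g_labels_name_map below.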
constexpr int RET_CATEGORY_SUM = 601;
static std::string g_labels_name_map[RET_CATEGORY_SUM] = {
{"Tortoise"},
{"Container"},
{"Magpie"},
{"Seaturtle"},
{"Football"},
{"Ambulance"},
{"Ladder"},
{"Toothbrush"},
{"Syringe"},
{"Sink"},
{"Toy"},
{"Organ(MusicalInstrument) "},
{"Cassettedeck"},
{"Apple"},
{"Humaneye"},
{"Cosmetics"},
{"Paddle"},
{"Snowman"},
{"Beer"},
{"Chopsticks"},
{"Humanbeard"},
{"Bird"},
{"Parkingmeter"},
{"Trafficlight"},
{"Croissant"},
{"Cucumber"},
{"Radish"},
{"Towel"},
{"Doll"},
{"Skull"},
{"Washingmachine"},
{"Glove"},
{"Tick"},
{"Belt"},
{"Sunglasses"},
{"Banjo"},
{"Cart"},
{"Ball"},
{"Backpack"},
{"Bicycle"},
{"Homeappliance"},
{"Centipede"},
{"Boat"},
{"Surfboard"},
{"Boot"},
{"Headphones"},
{"Hotdog"},
{"Shorts"},
{"Fastfood"},
{"Bus"},
{"Boy "},
{"Screwdriver"},
{"Bicyclewheel"},
{"Barge"},
{"Laptop"},
{"Miniskirt"},
{"Drill(Tool)"},
{"Dress"},
{"Bear"},
{"Waffle"},
{"Pancake"},
{"Brownbear"},
{"Woodpecker"},
{"Bluejay"},
{"Pretzel"},
{"Bagel"},
{"Tower"},
{"Teapot"},
{"Person"},
{"Bowandarrow"},
{"Swimwear"},
{"Beehive"},
{"Brassiere"},
{"Bee"},
{"Bat(Animal)"},
{"Starfish"},
{"Popcorn"},
{"Burrito"},
{"Chainsaw"},
{"Balloon"},
{"Wrench"},
{"Tent"},
{"Vehicleregistrationplate"},
{"Lantern"},
{"Toaster"},
{"Flashlight"},
{"Billboard"},
{"Tiara"},
{"Limousine"},
{"Necklace"},
{"Carnivore"},
{"Scissors"},
{"Stairs"},
{"Computerkeyboard"},
{"Printer"},
{"Trafficsign"},
{"Chair"},
{"Shirt"},
{"Poster"},
{"Cheese"},
{"Sock"},
{"Firehydrant"},
{"Landvehicle"},
{"Earrings"},
{"Tie"},
{"Watercraft"},
{"Cabinetry"},
{"Suitcase"},
{"Muffin"},
{"Bidet"},
{"Snack"},
{"Snowmobile"},
{"Clock"},
{"Medicalequipment"},
{"Cattle"},
{"Cello"},
{"Jetski"},
{"Camel"},
{"Coat"},
{"Suit"},
{"Desk"},
{"Cat"},
{"Bronzesculpture"},
{"Juice"},
{"Gondola"},
{"Beetle"},
{"Cannon"},
{"Computermouse"},
{"Cookie"},
{"Officebuilding"},
{"Fountain"},
{"Coin"},
{"Calculator"},
{"Cocktail"},
{"Computermonitor"},
{"Box"},
{"Stapler"},
{"Christmastree"},
{"Cowboyhat"},
{"Hikingequipment"},
{"Studiocouch"},
{"Drum"},
{"Dessert"},
{"Winerack"},
{"Drink"},
{"Zucchini"},
{"Ladle"},
{"Humanmouth"},
{"DairyProduct"},
{"Dice"},
{"Oven"},
{"Dinosaur"},
{"Ratchet(Device)"},
{"Couch"},
{"Cricketball"},
{"Wintermelon"},
{"Spatula"},
{"Whiteboard"},
{"Pencilsharpener"},
{"Door"},
{"Hat"},
{"Shower"},
{"Eraser"},
{"Fedora"},
{"Guacamole"},
{"Dagger"},
{"Scarf"},
{"Dolphin"},
{"Sombrero"},
{"Tincan"},
{"Mug"},
{"Tap"},
{"Harborseal"},
{"Stretcher"},
{"Canopener"},
{"Goggles"},
{"Humanbody"},
{"Rollerskates"},
{"Coffeecup"},
{"Cuttingboard"},
{"Blender"},
{"Plumbingfixture"},
{"Stopsign"},
{"Officesupplies"},
{"Volleyball(Ball)"},
{"Vase"},
{"Slowcooker"},
{"Wardrobe"},
{"Coffee"},
{"Whisk"},
{"Papertowel"},
{"Personalcare"},
{"Food"},
{"Sunhat"},
{"Treehouse"},
{"Flyingdisc"},
{"Skirt"},
{"Gasstove"},
{"Saltandpeppershakers"},
{"Mechanicalfan"},
{"Facepowder"},
{"Fax"},
{"Fruit"},
{"Frenchfries"},
{"Nightstand"},
{"Barrel"},
{"Kite"},
{"Tart"},
{"Treadmill"},
{"Fox"},
{"Flag"},
{"Frenchhorn"},
{"Windowblind"},
{"Humanfoot"},
{"Golfcart"},
{"Jacket"},
{"Egg(Food)"},
{"Streetlight"},
{"Guitar"},
{"Pillow"},
{"Humanleg"},
{"Isopod"},
{"Grape"},
{"Humanear"},
{"Powerplugsandsockets"},
{"Panda"},
{"Giraffe"},
{"Woman"},
{"Doorhandle"},
{"Rhinoceros"},
{"Bathtub"},
{"Goldfish"},
{"Houseplant"},
{"Goat"},
{"Baseballbat"},
{"Baseballglove"},
{"Mixingbowl"},
{"Marineinvertebrates"},
{"Kitchenutensil"},
{"Lightswitch"},
{"House"},
{"Horse"},
{"Stationarybicycle"},
{"Hammer"},
{"Ceilingfan"},
{"Sofabed"},
{"Adhesivetape "},
{"Harp"},
{"Sandal"},
{"Bicyclehelmet"},
{"Saucer"},
{"Harpsichord"},
{"Humanhair"},
{"Heater"},
{"Harmonica"},
{"Hamster"},
{"Curtain"},
{"Bed"},
{"Kettle"},
{"Fireplace"},
{"Scale"},
{"Drinkingstraw"},
{"Insect"},
{"Hairdryer"},
{"Kitchenware"},
{"Indoorrower"},
{"Invertebrate"},
{"Foodprocessor"},
{"Bookcase"},
{"Refrigerator"},
{"Wood-burningstove"},
{"Punchingbag"},
{"Commonfig"},
{"Cocktailshaker"},
{"Jaguar(Animal)"},
{"Golfball"},
{"Fashionaccessory"},
{"Alarmclock"},
{"Filingcabinet"},
{"Artichoke"},
{"Table"},
{"Tableware"},
{"Kangaroo"},
{"Koala"},
{"Knife"},
{"Bottle"},
{"Bottleopener"},
{"Lynx"},
{"Lavender(Plant)"},
{"Lighthouse"},
{"Dumbbell"},
{"Humanhead"},
{"Bowl"},
{"Humidifier"},
{"Porch"},
{"Lizard"},
{"Billiardtable"},
{"Mammal"},
{"Mouse"},
{"Motorcycle"},
{"Musicalinstrument"},
{"Swimcap"},
{"Fryingpan"},
{"Snowplow"},
{"Bathroomcabinet"},
{"Missile"},
{"Bust"},
{"Man"},
{"Waffleiron"},
{"Milk"},
{"Ringbinder"},
{"Plate"},
{"Mobilephone"},
{"Bakedgoods"},
{"Mushroom"},
{"Crutch"},
{"Pitcher(Container)"},
{"Mirror"},
{"Personalflotationdevice"},
{"Tabletennisracket"},
{"Pencilcase"},
{"Musicalkeyboard"},
{"Scoreboard"},
{"Briefcase"},
{"Kitchenknife"},
{"Nail(Construction)"},
{"Tennisball"},
{"Plasticbag"},
{"Oboe"},
{"Chestofdrawers"},
{"Ostrich"},
{"Piano"},
{"Girl"},
{"Plant"},
{"Potato"},
{"Hairspray"},
{"Sportsequipment"},
{"Pasta"},
{"Penguin"},
{"Pumpkin"},
{"Pear"},
{"Infantbed"},
{"Polarbear"},
{"Mixer"},
{"Cupboard"},
{"Jacuzzi"},
{"Pizza"},
{"Digitalclock"},
{"Pig"},
{"Reptile"},
{"Rifle"},
{"Lipstick"},
{"Skateboard"},
{"Raven"},
{"Highheels"},
{"Redpanda"},
{"Rose"},
{"Rabbit"},
{"Sculpture"},
{"Saxophone"},
{"Shotgun"},
{"Seafood"},
{"Submarinesandwich"},
{"Snowboard"},
{"Sword"},
{"Pictureframe"},
{"Sushi"},
{"Loveseat"},
{"Ski"},
{"Squirrel"},
{"Tripod"},
{"Stethoscope"},
{"Submarine"},
{"Scorpion"},
{"Segway"},
{"Trainingbench"},
{"Snake"},
{"Coffeetable"},
{"Skyscraper"},
{"Sheep"},
{"Television"},
{"Trombone"},
{"Tea"},
{"Tank"},
{"Taco"},
{"Telephone"},
{"Torch"},
{"Tiger"},
{"Strawberry"},
{"Trumpet"},
{"Tree"},
{"Tomato"},
{"Train"},
{"Tool"},
{"Picnicbasket"},
{"Cookingspray"},
{"Trousers"},
{"Bowlingequipment"},
{"Footballhelmet"},
{"Truck"},
{"Measuringcup"},
{"Coffeemaker"},
{"Violin"},
{"Vehicle"},
{"Handbag"},
{"Papercutter"},
{"Wine"},
{"Weapon"},
{"Wheel"},
{"Worm"},
{"Wok"},
{"Whale"},
{"Zebra"},
{"Autopart"},
{"Jug"},
{"Pizzacutter"},
{"Cream"},
{"Monkey"},
{"Lion"},
{"Bread"},
{"Platter"},
{"Chicken"},
{"Eagle"},
{"Helicopter"},
{"Owl"},
{"Duck"},
{"Turtle"},
{"Hippopotamus"},
{"Crocodile"},
{"Toilet"},
{"Toiletpaper"},
{"Squid"},
{"Clothing"},
{"Footwear"},
{"Lemon"},
{"Spider"},
{"Deer"},
{"Frog"},
{"Banana"},
{"Rocket"},
{"Wineglass"},
{"Countertop"},
{"Tabletcomputer"},
{"Wastecontainer"},
{"Swimmingpool"},
{"Dog"},
{"Book"},
{"Elephant"},
{"Shark"},
{"Candle"},
{"Leopard"},
{"Axe"},
{"Handdryer"},
{"Soapdispenser"},
{"Porcupine"},
{"Flower"},
{"Canary"},
{"Cheetah"},
{"Palmtree"},
{"Hamburger"},
{"Maple"},
{"Building"},
{"Fish"},
{"Lobster"},
{"GardenAsparagus"},
{"Furniture"},
{"Hedgehog"},
{"Airplane"},
{"Spoon"},
{"Otter"},
{"Bull"},
{"Oyster"},
{"Horizontalbar"},
{"Conveniencestore"},
{"Bomb"},
{"Bench"},
{"Icecream"},
{"Caterpillar"},
{"Butterfly"},
{"Parachute"},
{"Orange"},
{"Antelope"},
{"Beaker"},
{"Mothsandbutterflies"},
{"Window"},
{"Closet"},
{"Castle"},
{"Jellyfish"},
{"Goose"},
{"Mule"},
{"Swan"},
{"Peach"},
{"Coconut"},
{"Seatbelt"},
{"Raccoon"},
{"Chisel"},
{"Fork"},
{"Lamp"},
{"Camera"},
{"Squash(Plant)"},
{"Racket"},
{"Humanface"},
{"Humanarm"},
{"Vegetable"},
{"Diaper"},
{"Unicycle"},
{"Falcon"},
{"Chime"},
{"Snail"},
{"Shellfish"},
{"Cabbage"},
{"Carrot"},
{"Mango"},
{"Jeans"},
{"Flowerpot"},
{"Pineapple"},
{"Drawer"},
{"Stool"},
{"Envelope"},
{"Cake"},
{"Dragonfly"},
{"Commonsunflower"},
{"Microwaveoven"},
{"Honeycomb"},
{"Marinemammal"},
{"Sealion"},
{"Ladybug"},
{"Shelf"},
{"Watch"},
{"Candy"},
{"Salad"},
{"Parrot"},
{"Handgun"},
{"Sparrow"},
{"Van"},
{"Grinder"},
{"Spicerack"},
{"Lightbulb"},
{"Cordedphone"},
{"Sportsuniform"},
{"Tennisracket"},
{"Wallclock"},
{"Servingtray"},
{"Kitchen&diningroomtable"},
{"Dogbed"},
{"Cakestand"},
{"Catfurniture"},
{"Bathroomaccessory"},
{"Facialtissueholder"},
{"Pressurecooker"},
{"Kitchenappliance"},
{"Tire"},
{"Ruler"},
{"Luggageandbags"},
{"Microphone"},
{"Broccoli"},
{"Umbrella"},
{"Pastry"},
{"Grapefruit"},
{"Band-aid"},
{"Animal"},
{"Bellpepper"},
{"Turkey"},
{"Lily"},
{"Pomegranate"},
{"Doughnut"},
{"Glasses"},
{"Humannose"},
{"Pen"},
{"Ant"},
{"Car"},
{"Aircraft"},
{"Humanhand"},
{"Skunk"},
{"Teddybear"},
{"Watermelon"},
{"Cantaloupe"},
{"Dishwasher"},
{"Flute"},
{"Balancebeam"},
{"Sandwich"},
{"Shrimp"},
{"Sewingmachine"},
{"Binoculars"},
{"Raysandskates"},
{"Ipod"},
{"Accordion"},
{"Willow"},
{"Crab"},
{"Crown"},
{"Seahorse"},
{"Perfume"},
{"Alpaca"},
{"Taxi"},
{"Canoe"},
{"Remotecontrol"},
{"Wheelchair"},
{"Rugbyball"},
{"Armadillo"},
{"Maracas"},
{"Helmet"},
};
#endif // MNN_JNI_HMS_HMS_LABEL_THRES_H
#include "MSNetWork.h"
#include <iostream>
#include <android/log.h>
#include "errorcode.h"
#define MS_PRINT(format, ...) __android_log_print(ANDROID_LOG_INFO, "MSJNI", format, ##__VA_ARGS__)
MSNetWork::MSNetWork(void) : session(nullptr) {}
MSNetWork::~MSNetWork(void) {}
void MSNetWork::CreateSessionMS(char* modelBuffer, size_t bufferLen, mindspore::lite::Context* ctx)
{
session = mindspore::session::LiteSession::CreateSession(ctx);
if (session == nullptr){
MS_PRINT("Create Session failed.");
return;
}
// Compile model.
auto model = mindspore::lite::Model::Import(modelBuffer, bufferLen);
if (model == nullptr){
MS_PRINT("Import model failed.");
return;
}
int ret = session->CompileGraph(model);
if (ret != mindspore::lite::RET_OK){
MS_PRINT("CompileGraph failed.");
return;
}
}
int MSNetWork::ReleaseNets(void)
{
delete session;
// delete model;
return 0;
}
/*
 * Copyright (c) Huawei Technologies Co., Ltd. 2018-2019. All rights reserved.
 */
#ifndef MSNETWORK_H
#define MSNETWORK_H
#include <cstdio>
#include <algorithm>
#include <fstream>
#include <functional>
#include <sstream>
#include <vector>
#include <map>
#include <string>
#include <memory>
#include <utility>
#include <context.h>
#include <lite_session.h>
#include <model.h>
#include <errorcode.h>
using namespace mindspore;
struct ImgDims {
int channel = 0;
int width = 0;
int height = 0;
};
/*struct SessIterm {
std::shared_ptr<mindspore::session::LiteSession> sess = nullptr;
};*/
class MSNetWork {
public:
MSNetWork();
~MSNetWork();
void CreateSessionMS(char* modelBuffer, size_t bufferLen, mindspore::lite::Context* ctx);
int ReleaseNets(void);
mindspore::session::LiteSession *session;
mindspore::lite::Model *model;
private:
//std::map<std::string, SessIterm> sess;
};
#endif
/**
* Copyright 2020 Huawei Technologies Co., Ltd
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#ifndef MINDSPORE_LITE_INCLUDE_CONTEXT_H_
#define MINDSPORE_LITE_INCLUDE_CONTEXT_H_
#include <string>
#include <memory>
#include "ms_tensor.h"
#include "thread_pool_config.h"
namespace mindspore {
namespace lite {
/// \brief Allocator defined a memory pool for malloc memory and free memory dynamically.
///
/// \note List public class and interface for reference.
class Allocator;
/// \brief DeviceType defined for holding user's preferred backend.
typedef enum {
DT_CPU, /**< CPU device type */
DT_GPU, /**< GPU device type */
DT_NPU /**< NPU device type, not supported yet */
} DeviceType;
/// \brief DeviceContext defined for holding DeviceType.
typedef struct {
DeviceType type; /**< device type */
} DeviceContext;
/// \brief Context defined for holding environment variables during runtime.
class MS_API Context {
public:
/// \brief Constructor of MindSpore Lite Context using default value for parameters.
///
/// \return Instance of MindSpore Lite Context.
Context();
/// \brief Constructor of MindSpore Lite Context using input value for parameters.
///
/// \param[in] thread_num Define the work thread number during the runtime.
/// \param[in] allocator Define the allocator for malloc.
/// \param[in] device_ctx Define device information during the runtime.
Context(int thread_num, std::shared_ptr<Allocator> allocator, DeviceContext device_ctx);
/// \brief Destructor of MindSpore Lite Context.
virtual ~Context();
public:
bool float16_priority = false; /**< allow priority select float16 kernel */
DeviceContext device_ctx_{DT_CPU};
int thread_num_ = 2; /**< thread number config for thread pool */
std::shared_ptr<Allocator> allocator = nullptr;
CpuBindMode cpu_bind_mode_ = MID_CPU;
};
}  // namespace lite
}  // namespace mindspore
#endif // MINDSPORE_LITE_INCLUDE_CONTEXT_H_
/**
* Copyright 2020 Huawei Technologies Co., Ltd
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#ifndef MINDSPORE_LITE_INCLUDE_ERRORCODE_H_
#define MINDSPORE_LITE_INCLUDE_ERRORCODE_H_
namespace mindspore {
namespace lite {
/// \brief STATUS defined for holding error code in MindSpore Lite.
using STATUS = int;
/* Success */
constexpr int RET_OK = 0; /**< No error occurs. */
/* Common error code, range: [-1, -100]*/
constexpr int RET_ERROR = -1; /**< Common error code. */
constexpr int RET_NULL_PTR = -2; /**< NULL pointer returned.*/
constexpr int RET_PARAM_INVALID = -3; /**< Invalid parameter.*/
constexpr int RET_NO_CHANGE = -4; /**< No change. */
constexpr int RET_SUCCESS_EXIT = -5; /**< No error but exit. */
constexpr int RET_MEMORY_FAILED = -6; /**< Fail to create memory. */
/* Executor error code, range: [-101,-200] */
constexpr int RET_OUT_OF_TENSOR_RANGE = -101; /**< Failed to check range. */
constexpr int RET_INPUT_TENSOR_ERROR = -102; /**< Failed to check input tensor. */
constexpr int RET_REENTRANT_ERROR = -103; /**< Exist executor running. */
/* Graph error code, range: [-201,-300] */
constexpr int RET_GRAPH_FILE_ERR = -201; /**< Failed to verify graph file. */
/* Node error code, range: [-301,-400] */
constexpr int RET_NOT_FIND_OP = -301; /**< Failed to find operator. */
constexpr int RET_INVALID_OP_NAME = -302; /**< Invalid operator name. */
constexpr int RET_INVALID_OP_ATTR = -303; /**< Invalid operator attr. */
constexpr int RET_OP_EXECUTE_FAILURE = -304; /**< Failed to execute operator. */
/* Tensor error code, range: [-401,-500] */
constexpr int RET_FORMAT_ERR = -401; /**< Failed to check tensor format. */
/* InferShape error code, range: [-501,-600] */
constexpr int RET_INFER_ERR = -501; /**< Failed to infer shape. */
constexpr int RET_INFER_INVALID = -502; /**< Invalid infer shape before runtime. */
} // namespace lite
} // namespace mindspore
#endif // MINDSPORE_LITE_INCLUDE_ERRORCODE_H_
#ifndef FLATBUFFERS_BASE_H_
#define FLATBUFFERS_BASE_H_
// clang-format off
// If activate should be declared and included first.
#if defined(FLATBUFFERS_MEMORY_LEAK_TRACKING) && \
defined(_MSC_VER) && defined(_DEBUG)
// The _CRTDBG_MAP_ALLOC inside <crtdbg.h> will replace
// calloc/free (etc) to its debug version using #define directives.
#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>
// Replace operator new by trace-enabled version.
#define DEBUG_NEW new(_NORMAL_BLOCK, __FILE__, __LINE__)
#define new DEBUG_NEW
#endif
#if !defined(FLATBUFFERS_ASSERT)
#include <assert.h>
#define FLATBUFFERS_ASSERT assert
#elif defined(FLATBUFFERS_ASSERT_INCLUDE)
// Include file with forward declaration
#include FLATBUFFERS_ASSERT_INCLUDE
#endif
#ifndef ARDUINO
#include <cstdint>
#endif
#include <cstddef>
#include <cstdlib>
#include <cstring>
#if defined(ARDUINO) && !defined(ARDUINOSTL_M_H)
#include <utility.h>
#else
#include <utility>
#endif
#include <string>
#include <type_traits>
#include <vector>
#include <set>
#include <algorithm>
#include <iterator>
#include <memory>
#ifdef _STLPORT_VERSION
#define FLATBUFFERS_CPP98_STL
#endif
#ifndef FLATBUFFERS_CPP98_STL
#include <functional>
#endif
#include "stl_emulation.h"
// Note the __clang__ check is needed, because clang presents itself
// as an older GNUC compiler (4.2).
// Clang 3.3 and later implement all of the ISO C++ 2011 standard.
// Clang 3.4 and later implement all of the ISO C++ 2014 standard.
// http://clang.llvm.org/cxx_status.html
// Note the MSVC value '__cplusplus' may be incorrect:
// The '__cplusplus' predefined macro in the MSVC stuck at the value 199711L,
// indicating (erroneously!) that the compiler conformed to the C++98 Standard.
// This value should be correct starting from MSVC2017-15.7-Preview-3.
// The '__cplusplus' will be valid only if MSVC2017-15.7-P3 and the `/Zc:__cplusplus` switch is set.
// Workaround (for details see MSDN):
// Use the _MSC_VER and _MSVC_LANG definition instead of the __cplusplus for compatibility.
// The _MSVC_LANG macro reports the Standard version regardless of the '/Zc:__cplusplus' switch.
#if defined(__GNUC__) && !defined(__clang__)
#define FLATBUFFERS_GCC (__GNUC__ * 10000 + __GNUC_MINOR__ * 100 + __GNUC_PATCHLEVEL__)
#else
#define FLATBUFFERS_GCC 0
#endif
#if defined(__clang__)
#define FLATBUFFERS_CLANG (__clang_major__ * 10000 + __clang_minor__ * 100 + __clang_patchlevel__)
#else
#define FLATBUFFERS_CLANG 0
#endif
/// @cond FLATBUFFERS_INTERNAL
#if __cplusplus <= 199711L && \
(!defined(_MSC_VER) || _MSC_VER < 1600) && \
(!defined(__GNUC__) || \
(__GNUC__ * 10000 + __GNUC_MINOR__ * 100 + __GNUC_PATCHLEVEL__ < 40400))
#error A C++11 compatible compiler with support for the auto typing is \
required for FlatBuffers.
#error __cplusplus _MSC_VER __GNUC__ __GNUC_MINOR__ __GNUC_PATCHLEVEL__
#endif
#if !defined(__clang__) && \
defined(__GNUC__) && \
(__GNUC__ * 10000 + __GNUC_MINOR__ * 100 + __GNUC_PATCHLEVEL__ < 40600)
// Backwards compatability for g++ 4.4, and 4.5 which don't have the nullptr
// and constexpr keywords. Note the __clang__ check is needed, because clang
// presents itself as an older GNUC compiler.
#ifndef nullptr_t
const class nullptr_t {
public:
template<class T> inline operator T*() const { return 0; }
private:
void operator&() const;
} nullptr = {};
#endif
#ifndef constexpr
#define constexpr const
#endif
#endif
// The wire format uses a little endian encoding (since that's efficient for
// the common platforms).
#if defined(__s390x__)
#define FLATBUFFERS_LITTLEENDIAN 0
#endif // __s390x__
#if !defined(FLATBUFFERS_LITTLEENDIAN)
#if defined(__GNUC__) || defined(__clang__)
#if (defined(__BIG_ENDIAN__) || \
(defined(__BYTE_ORDER__) && __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__))
#define FLATBUFFERS_LITTLEENDIAN 0
#else
#define FLATBUFFERS_LITTLEENDIAN 1
#endif // __BIG_ENDIAN__
#elif defined(_MSC_VER)
#if defined(_M_PPC)
#define FLATBUFFERS_LITTLEENDIAN 0
#else
#define FLATBUFFERS_LITTLEENDIAN 1
#endif
#else
#error Unable to determine endianness, define FLATBUFFERS_LITTLEENDIAN.
#endif
#endif // !defined(FLATBUFFERS_LITTLEENDIAN)
#define FLATBUFFERS_VERSION_MAJOR 1
#define FLATBUFFERS_VERSION_MINOR 11
#define FLATBUFFERS_VERSION_REVISION 0
#define FLATBUFFERS_STRING_EXPAND(X) #X
#define FLATBUFFERS_STRING(X) FLATBUFFERS_STRING_EXPAND(X)
#if (!defined(_MSC_VER) || _MSC_VER > 1600) && \
(!defined(__GNUC__) || (__GNUC__ * 100 + __GNUC_MINOR__ >= 407)) || \
defined(__clang__)
#define FLATBUFFERS_FINAL_CLASS final
#define FLATBUFFERS_OVERRIDE override
#define FLATBUFFERS_VTABLE_UNDERLYING_TYPE : flatbuffers::voffset_t
#else
#define FLATBUFFERS_FINAL_CLASS
#define FLATBUFFERS_OVERRIDE
#define FLATBUFFERS_VTABLE_UNDERLYING_TYPE
#endif
#if (!defined(_MSC_VER) || _MSC_VER >= 1900) && \
(!defined(__GNUC__) || (__GNUC__ * 100 + __GNUC_MINOR__ >= 406)) || \
(defined(__cpp_constexpr) && __cpp_constexpr >= 200704)
#define FLATBUFFERS_CONSTEXPR constexpr
#else
#define FLATBUFFERS_CONSTEXPR const
#endif
#if (defined(__cplusplus) && __cplusplus >= 201402L) || \
(defined(__cpp_constexpr) && __cpp_constexpr >= 201304)
#define FLATBUFFERS_CONSTEXPR_CPP14 FLATBUFFERS_CONSTEXPR
#else
#define FLATBUFFERS_CONSTEXPR_CPP14
#endif
#if (defined(__GXX_EXPERIMENTAL_CXX0X__) && (__GNUC__ * 100 + __GNUC_MINOR__ >= 406)) || \
(defined(_MSC_FULL_VER) && (_MSC_FULL_VER >= 190023026)) || \
defined(__clang__)
#define FLATBUFFERS_NOEXCEPT noexcept
#else
#define FLATBUFFERS_NOEXCEPT
#endif
// NOTE: the FLATBUFFERS_DELETE_FUNC macro may change the access mode to
// private, so be sure to put it at the end or reset access mode explicitly.
#if (!defined(_MSC_VER) || _MSC_FULL_VER >= 180020827) && \
(!defined(__GNUC__) || (__GNUC__ * 100 + __GNUC_MINOR__ >= 404)) || \
defined(__clang__)
#define FLATBUFFERS_DELETE_FUNC(func) func = delete;
#else
#define FLATBUFFERS_DELETE_FUNC(func) private: func;
#endif
#ifndef FLATBUFFERS_HAS_STRING_VIEW
// Only provide flatbuffers::string_view if __has_include can be used
// to detect a header that provides an implementation
#if defined(__has_include)
// Check for std::string_view (in c++17)
#if __has_include(<string_view>) && (__cplusplus >= 201606 || _HAS_CXX17)
#include <string_view>
namespace flatbuffers {
typedef std::string_view string_view;
}
#define FLATBUFFERS_HAS_STRING_VIEW 1
// Check for std::experimental::string_view (in c++14, compiler-dependent)
#elif __has_include(<experimental/string_view>) && (__cplusplus >= 201411)
#include <experimental/string_view>
namespace flatbuffers {
typedef std::experimental::string_view string_view;
}
#define FLATBUFFERS_HAS_STRING_VIEW 1
#endif
#endif // __has_include
#endif // !FLATBUFFERS_HAS_STRING_VIEW
#ifndef FLATBUFFERS_HAS_NEW_STRTOD
// Modern (C++11) strtod and strtof functions are available for use.
// 1) nan/inf strings as argument of strtod;
// 2) hex-float as argument of strtod/strtof.
#if (defined(_MSC_VER) && _MSC_VER >= 1900) || \
(defined(__GNUC__) && (__GNUC__ * 100 + __GNUC_MINOR__ >= 409)) || \
(defined(__clang__))
#define FLATBUFFERS_HAS_NEW_STRTOD 1
#endif
#endif // !FLATBUFFERS_HAS_NEW_STRTOD
#ifndef FLATBUFFERS_LOCALE_INDEPENDENT
// Enable locale independent functions {strtof_l, strtod_l,strtoll_l, strtoull_l}.
// They are part of the POSIX-2008 but not part of the C/C++ standard.
// GCC/Clang have definition (_XOPEN_SOURCE>=700) if POSIX-2008.
#if ((defined(_MSC_VER) && _MSC_VER >= 1800) || \
(defined(_XOPEN_SOURCE) && (_XOPEN_SOURCE>=700)))
#define FLATBUFFERS_LOCALE_INDEPENDENT 1
#else
#define FLATBUFFERS_LOCALE_INDEPENDENT 0
#endif
#endif // !FLATBUFFERS_LOCALE_INDEPENDENT
// Suppress Undefined Behavior Sanitizer (recoverable only). Usage:
// - __supress_ubsan__("undefined")
// - __supress_ubsan__("signed-integer-overflow")
#if defined(__clang__)
#define __supress_ubsan__(type) __attribute__((no_sanitize(type)))
#elif defined(__GNUC__) && (__GNUC__ * 100 + __GNUC_MINOR__ >= 409)
#define __supress_ubsan__(type) __attribute__((no_sanitize_undefined))
#else
#define __supress_ubsan__(type)
#endif
// This is constexpr function used for checking compile-time constants.
// Avoid `#pragma warning(disable: 4127) // C4127: expression is constant`.
template<typename T> FLATBUFFERS_CONSTEXPR inline bool IsConstTrue(T t) {
return !!t;
}
// Enable C++ attribute [[]] if std:c++17 or higher.
#if ((__cplusplus >= 201703L) \
|| (defined(_MSVC_LANG) && (_MSVC_LANG >= 201703L)))
// All attributes unknown to an implementation are ignored without causing an error.
#define FLATBUFFERS_ATTRIBUTE(attr) [[attr]]
#define FLATBUFFERS_FALLTHROUGH() [[fallthrough]]
#else
#define FLATBUFFERS_ATTRIBUTE(attr)
#if FLATBUFFERS_CLANG >= 30800
#define FLATBUFFERS_FALLTHROUGH() [[clang::fallthrough]]
#elif FLATBUFFERS_GCC >= 70300
#define FLATBUFFERS_FALLTHROUGH() [[gnu::fallthrough]]
#else
#define FLATBUFFERS_FALLTHROUGH()
#endif
#endif
/// @endcond
/// @file
namespace flatbuffers {
/// @cond FLATBUFFERS_INTERNAL
// Our default offset / size type, 32bit on purpose on 64bit systems.
// Also, using a consistent offset type maintains compatibility of serialized
// offset values between 32bit and 64bit systems.
typedef uint32_t uoffset_t;
// Signed offsets for references that can go in both directions.
typedef int32_t soffset_t;
// Offset/index used in v-tables, can be changed to uint8_t in
// format forks to save a bit of space if desired.
typedef uint16_t voffset_t;
typedef uintmax_t largest_scalar_t;
// In 32bits, this evaluates to 2GB - 1
#define FLATBUFFERS_MAX_BUFFER_SIZE ((1ULL << (sizeof(soffset_t) * 8 - 1)) - 1)
// We support aligning the contents of buffers up to this size.
#define FLATBUFFERS_MAX_ALIGNMENT 16
#if defined(_MSC_VER)
#pragma warning(push)
#pragma warning(disable: 4127) // C4127: conditional expression is constant
#endif
template<typename T> T EndianSwap(T t) {
#if defined(_MSC_VER)
#define FLATBUFFERS_BYTESWAP16 _byteswap_ushort
#define FLATBUFFERS_BYTESWAP32 _byteswap_ulong
#define FLATBUFFERS_BYTESWAP64 _byteswap_uint64
#else
#if defined(__GNUC__) && __GNUC__ * 100 + __GNUC_MINOR__ < 408 && !defined(__clang__)
// __builtin_bswap16 was missing prior to GCC 4.8.
#define FLATBUFFERS_BYTESWAP16(x) \
static_cast<uint16_t>(__builtin_bswap32(static_cast<uint32_t>(x) << 16))
#else
#define FLATBUFFERS_BYTESWAP16 __builtin_bswap16
#endif
#define FLATBUFFERS_BYTESWAP32 __builtin_bswap32
#define FLATBUFFERS_BYTESWAP64 __builtin_bswap64
#endif
if (sizeof(T) == 1) { // Compile-time if-then's.
return t;
} else if (sizeof(T) == 2) {
union { T t; uint16_t i; } u;
u.t = t;
u.i = FLATBUFFERS_BYTESWAP16(u.i);
return u.t;
} else if (sizeof(T) == 4) {
union { T t; uint32_t i; } u;
u.t = t;
u.i = FLATBUFFERS_BYTESWAP32(u.i);
return u.t;
} else if (sizeof(T) == 8) {
union { T t; uint64_t i; } u;
u.t = t;
u.i = FLATBUFFERS_BYTESWAP64(u.i);
return u.t;
} else {
FLATBUFFERS_ASSERT(0);
}
}
#if defined(_MSC_VER)
#pragma warning(pop)
#endif
template<typename T> T EndianScalar(T t) {
#if FLATBUFFERS_LITTLEENDIAN
return t;
#else
return EndianSwap(t);
#endif
}
template<typename T>
// UBSAN: C++ aliasing type rules, see std::bit_cast<> for details.
__supress_ubsan__("alignment")
T ReadScalar(const void *p) {
return EndianScalar(*reinterpret_cast<const T *>(p));
}
template<typename T>
// UBSAN: C++ aliasing type rules, see std::bit_cast<> for details.
__supress_ubsan__("alignment")
void WriteScalar(void *p, T t) {
*reinterpret_cast<T *>(p) = EndianScalar(t);
}
template<typename T> struct Offset;
template<typename T> __supress_ubsan__("alignment") void WriteScalar(void *p, Offset<T> t) {
*reinterpret_cast<uoffset_t *>(p) = EndianScalar(t.o);
}
// Computes how many bytes you'd have to pad to be able to write a
// "scalar_size" scalar if the buffer had grown to "buf_size" (downwards in
// memory).
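// Example: with buf_size == 9 and scalar_size == 4, ((~9) + 1) & 3 == 3,
// so three padding bytes align the next 4-byte scalar.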
inline size_t PaddingBytes(size_t buf_size, size_t scalar_size) {
return ((~buf_size) + 1) & (scalar_size - 1);
}
} // namespace flatbuffers
#endif // FLATBUFFERS_BASE_H_
/*
* Copyright 2017 Google Inc. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#include <functional>
#include <limits>
#include <string>
#include "flatbuffers/flatbuffers.h"
#include "flatbuffers/idl.h"
#include "flatbuffers/util.h"
#ifndef FLATC_H_
# define FLATC_H_
namespace flatbuffers {
class FlatCompiler {
public:
// Output generator for the various programming languages and formats we
// support.
struct Generator {
typedef bool (*GenerateFn)(const flatbuffers::Parser &parser,
const std::string &path,
const std::string &file_name);
typedef std::string (*MakeRuleFn)(const flatbuffers::Parser &parser,
const std::string &path,
const std::string &file_name);
GenerateFn generate;
const char *generator_opt_short;
const char *generator_opt_long;
const char *lang_name;
bool schema_only;
GenerateFn generateGRPC;
flatbuffers::IDLOptions::Language lang;
const char *generator_help;
MakeRuleFn make_rule;
};
typedef void (*WarnFn)(const FlatCompiler *flatc, const std::string &warn,
bool show_exe_name);
typedef void (*ErrorFn)(const FlatCompiler *flatc, const std::string &err,
bool usage, bool show_exe_name);
// Parameters required to initialize the FlatCompiler.
struct InitParams {
InitParams()
: generators(nullptr),
num_generators(0),
warn_fn(nullptr),
error_fn(nullptr) {}
const Generator *generators;
size_t num_generators;
WarnFn warn_fn;
ErrorFn error_fn;
};
explicit FlatCompiler(const InitParams &params) : params_(params) {}
int Compile(int argc, const char **argv);
std::string GetUsageString(const char *program_name) const;
private:
void ParseFile(flatbuffers::Parser &parser, const std::string &filename,
const std::string &contents,
std::vector<const char *> &include_directories) const;
void LoadBinarySchema(Parser &parser, const std::string &filename,
const std::string &contents);
void Warn(const std::string &warn, bool show_exe_name = true) const;
void Error(const std::string &err, bool usage = true,
bool show_exe_name = true) const;
InitParams params_;
};
} // namespace flatbuffers
#endif // FLATC_H_
/**
* Copyright 2020 Huawei Technologies Co., Ltd
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
#ifndef MINDSPORE_LITE_INCLUDE_THREAD_POOL_CONFIG_H_
#define MINDSPORE_LITE_INCLUDE_THREAD_POOL_CONFIG_H_
/// \brief CpuBindMode defined for holding bind cpu strategy argument.
typedef enum Mode {
MID_CPU = -1, /**< bind middle cpu first */
HIGHER_CPU = 1, /**< bind higher cpu first */
NO_BIND = 0 /**< no bind */
} CpuBindMode;
/// \brief ThreadPoolId defined for specifying which thread pool to use.
typedef enum Id {
THREAD_POOL_DEFAULT = 0, /**< default thread pool id */
THREAD_POOL_SECOND = 1, /**< the second thread pool id */
THREAD_POOL_THIRD = 2, /**< the third thread pool id */
THREAD_POOL_FOURTH = 3 /**< the fourth thread pool id */
} ThreadPoolId;
#endif // MINDSPORE_LITE_INCLUDE_THREAD_POOL_CONFIG_H_
// This file is part of OpenCV project.
// It is subject to the license terms in the LICENSE file found in the top-level directory
// of this distribution and at http://opencv.org/license.html.
//
// Copyright (C) 2014, Advanced Micro Devices, Inc., all rights reserved.
#ifndef OPENCV_CORE_BUFFER_POOL_HPP
#define OPENCV_CORE_BUFFER_POOL_HPP
#ifdef _MSC_VER
#pragma warning(push)
#pragma warning(disable: 4265)
#endif
namespace cv
{
//! @addtogroup core
//! @{
class BufferPoolController
{
protected:
~BufferPoolController() { }
public:
virtual size_t getReservedSize() const = 0;
virtual size_t getMaxReservedSize() const = 0;
virtual void setMaxReservedSize(size_t size) = 0;
virtual void freeAllReservedBuffers() = 0;
};
//! @}
}
#ifdef _MSC_VER
#pragma warning(pop)
#endif
#endif // OPENCV_CORE_BUFFER_POOL_HPP