Deploying AI Models on Android


Posted by yishuifengxiao on 2021-06-25

This post briefly describes how to deploy a model on Android. The deployment involves integrating OpenCV and Baidu PaddlePaddle (Paddle Lite), as well as using JNI on Android.

The environment used in this post is as follows:

  • Operating system: Windows 10
  • IDE: Android Studio 4.1.2
  • OpenCV: 3.4.5

The integration proceeds in four steps: first, the Paddle Lite functionality is integrated on top of the official Paddle Lite sample project; then the OpenCV modules needed by this project are brought in; next, the custom model code is invoked through JNI to process the data; finally, the processed result is received and displayed in the Android UI.

1 Setting Up the Basic Environment

Create a Native C++ project in Android Studio. The steps are as follows:

Figure 1: Creating a native project

Figure 2: Configuring the project

Figure 3: Configuring the activity

Choose the configuration shown above and click Finish. After a short wait you will see the screen shown in Figure 4, which means the project has been created.

Figure 4: Initial project view

As Figure 4 shows, the IDE has already generated a complete basic JNI project skeleton. Run the project and you will see the following result in the Android emulator.

Figure 5: Run result in the emulator

As Figure 5 shows, a basic JNI development and debugging environment is now in place.
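For reference, the Java side of that skeleton is just the MainActivity generated by the Native C++ template. The sketch below shows roughly what it contains (the exact code varies slightly between Android Studio versions): a static block that loads the native-lib library built from native-lib.cpp, and a native method whose implementation lives on the C++ side.

package com.example.aisample;

import android.os.Bundle;
import android.widget.TextView;

import androidx.appcompat.app.AppCompatActivity;

public class MainActivity extends AppCompatActivity {

    // Load the shared library built from src/main/cpp/native-lib.cpp
    static {
        System.loadLibrary("native-lib");
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Display the string returned from C++ in the sample TextView
        TextView tv = findViewById(R.id.sample_text);
        tv.setText(stringFromJNI());
    }

    // Declared here, implemented in native-lib.cpp
    public native String stringFromJNI();
}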

2 Integrating Paddle Lite

To bring in Paddle Lite, first modify the build.gradle file under the project's app directory and add the Paddle Lite related dependencies.

Figure 6: The file to modify

The complete build.gradle file after the change is as follows:

import java.security.MessageDigest

plugins {
    id 'com.android.application'
}

android {
    compileSdkVersion 30
    buildToolsVersion "30.0.3"

    defaultConfig {
        applicationId "com.example.aisample"
        minSdkVersion 16
        targetSdkVersion 30
        versionCode 1
        versionName "1.0"

        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
        externalNativeBuild {
            cmake {
                cppFlags ""
            }
            ndk {
                // Select the CPU ABIs whose .so libraries should be packaged.
                abiFilters 'armeabi-v7a'
                // 'x86', 'x86_64', 'mips', 'mips64' can also be added
            }
        }
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
    externalNativeBuild {
        cmake {
            path "src/main/cpp/CMakeLists.txt"
            version "3.10.2"
        }
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
    sourceSets {
        main {
            jniLibs.srcDirs = ['src/main/jniLibs']
        }
    }
}

dependencies {
    implementation fileTree(include: ['*.jar'], dir: 'libs')
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'com.google.android.material:material:1.1.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.+'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    implementation files('libs/PaddlePredictor.jar')
}


def paddleLiteLibs = 'https://paddlelite-demo.bj.bcebos.com/libs/android/paddle_lite_libs_v2_9_0.tar.gz'
task downloadAndExtractPaddleLiteLibs(type: DefaultTask) {
    doFirst {
        println "Downloading and extracting Paddle Lite libs"
    }
    doLast {
        // Prepare cache folder for libs
        if (!file("cache").exists()) {
            mkdir "cache"
        }
        // Generate cache name for libs
        MessageDigest messageDigest = MessageDigest.getInstance('MD5')
        messageDigest.update(paddleLiteLibs.bytes)
        String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
        // Download libs
        if (!file("cache/${cacheName}.tar.gz").exists()) {
            ant.get(src: paddleLiteLibs, dest: file("cache/${cacheName}.tar.gz"))
        }
        // Unpack libs
        if (!file("cache/${cacheName}").exists()) {
            copy {
                from tarTree("cache/${cacheName}.tar.gz")
                into "cache/${cacheName}"
            }
        }

        // Copy PaddlePredictor.jar
        if (!file("libs/PaddlePredictor.jar").exists()) {
            copy {
                from "cache/${cacheName}/java/PaddlePredictor.jar"
                into "libs"
            }
        }
        // Copy libpaddle_lite_jni.so for armeabi-v7a and arm64-v8a
        if (!file("src/main/jniLibs/armeabi-v7a/libpaddle_lite_jni.so").exists()) {
            copy {
                from "cache/${cacheName}/java/libs/armeabi-v7a/"
                into "src/main/jniLibs/armeabi-v7a"
                // into "src/main/jni/armeabi-v7a"
            }
        }
        if (!file("src/main/jniLibs/arm64-v8a/libpaddle_lite_jni.so").exists()) {
            copy {
                from "cache/${cacheName}/java/libs/arm64-v8a/"
                into "src/main/jniLibs/arm64-v8a"
                // into "src/main/jni/arm64-v8a"
            }
        }
    }
}
preBuild.dependsOn downloadAndExtractPaddleLiteLibs

def paddleLiteModels = [
    [
        'src' : 'https://paddlelite-demo.bj.bcebos.com/models/deeplab_mobilenet_fp32_for_cpu_v2_9_0.tar.gz',
        'dest': 'src/main/assets/image_segmentation/models/deeplab_mobilenet_for_cpu'
    ],
]
task downloadAndExtractPaddleLiteModels(type: DefaultTask) {
    doFirst {
        println "Downloading and extracting Paddle Lite models"
    }
    doLast {
        // Prepare cache folder for models
        String cachePath = "cache"
        if (!file("${cachePath}").exists()) {
            mkdir "${cachePath}"
        }
        paddleLiteModels.eachWithIndex { model, index ->
            MessageDigest messageDigest = MessageDigest.getInstance('MD5')
            messageDigest.update(model.src.bytes)
            String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
            // Download the target model if not exists
            boolean copyFiles = !file("${model.dest}").exists()
            if (!file("${cachePath}/${cacheName}.tar.gz").exists()) {
                ant.get(src: model.src, dest: file("${cachePath}/${cacheName}.tar.gz"))
                copyFiles = true; // force to copy files from the latest archive files
            }
            // Copy model file
            if (copyFiles) {
                copy {
                    from tarTree("${cachePath}/${cacheName}.tar.gz")
                    into "${model.dest}"
                }
            }
        }
    }
}
preBuild.dependsOn downloadAndExtractPaddleLiteModels

Note that the project does not contain an assets folder by default; it has to be created manually, as shown below.

Figure 7: Creating the assets folder

After creating the folder, copy the resources required by Paddle Lite into the corresponding paths of the project.

Figure 8: Project with the resources copied in

Once the required resources are in place, copy the required Java code from the sample into the corresponding package of the project. Then run the project; the following screen appears in the Android emulator.

Figure 9: Paddle Lite demo running in the emulator
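All Paddle Lite calls in the copied demo code are wrapped inside its Predictor class. Purely as orientation, the sketch below shows how the Java API shipped in PaddlePredictor.jar is typically driven; the class and method names are those of the Paddle Lite 2.9 Java API, while the helper class, model path and shapes are placeholders rather than the demo's actual code.

import com.baidu.paddle.lite.MobileConfig;
import com.baidu.paddle.lite.PaddlePredictor;
import com.baidu.paddle.lite.PowerMode;
import com.baidu.paddle.lite.Tensor;

public class LitePredictorSketch {

    // Run one inference pass on a CPU predictor and return the raw output
    public static float[] run(String modelFilePath, float[] inputData, long[] inputShape) {
        // Configure the lightweight CPU predictor
        MobileConfig config = new MobileConfig();
        config.setModelFromFile(modelFilePath);   // path to the .nb model (the demo copies it out of assets first)
        config.setThreads(1);
        config.setPowerMode(PowerMode.LITE_POWER_HIGH);

        // Create the predictor and fill the input tensor
        PaddlePredictor predictor = PaddlePredictor.createPaddlePredictor(config);
        Tensor input = predictor.getInput(0);
        input.resize(inputShape);                 // e.g. {1, 3, H, W} for the segmentation model
        input.setData(inputData);

        // Run inference and read the output tensor
        predictor.run();
        Tensor output = predictor.getOutput(0);
        return output.getFloatData();
    }
}

In the demo itself this logic lives inside its Predictor class, whose rawImage() and outputImage() methods are what the activity code later in this post consumes.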

3 Integrating OpenCV

To bring in OpenCV, first download the corresponding Android package from the OpenCV website.

Figure 10: OpenCV download page

Download version 3.4.5 for the Android platform. Note that the OpenCV version has to match your project; it is not the case that any version, or the newest version, will do. After the download completes, extract the archive to get the following structure.

Figure 11: Directory layout after extracting OpenCV

Create a JNI directory in the project; the way to create it is shown below.

Figure 12: Creating the JNI directory

First copy the contents of the libs directory shown in Figure 11 into this JNI directory. Then copy the include folder from the jni directory shown in Figure 11 into the project's cpp directory.

After copying the resources into place, modify the CMakeLists.txt file under the cpp directory to pull OpenCV in through CMake. The complete CMakeLists.txt after the change is as follows.

# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html

# Sets the minimum version of CMake required to build the native library.

cmake_minimum_required(VERSION 3.10.2)

# Declares and names the project.

project("aisample")


# Set the path of the include folder
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include)
include_directories("include")


# Declare the prebuilt OpenCV shared library
add_library(libopencv_java3 SHARED IMPORTED)
set_target_properties(libopencv_java3 PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/../jni/${ANDROID_ABI}/libopencv_java3.so)

# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.

add_library( # Sets the name of the library.
        native-lib

        # Sets the library as a shared library.
        SHARED

        # Provides a relative path to your source file(s).
        native-lib.cpp)

# Searches for a specified prebuilt library and stores the path as a
# variable. Because CMake includes system libraries in the search path by
# default, you only need to specify the name of the public NDK library
# you want to add. CMake verifies that the library exists before
# completing its build.

find_library( # Sets the name of the path variable.
        log-lib

        # Specifies the name of the NDK library that
        # you want CMake to locate.
        log)

# Specifies libraries CMake should link to your target library. You
# can link multiple libraries, such as libraries you define in this
# build script, prebuilt third-party libraries, or system libraries.

target_link_libraries( # Specifies the target library.
        native-lib

        # Link OpenCV
        libopencv_java3
        # Links the target library to the log library
        # included in the NDK.
        ${log-lib})

Next, modify the build.gradle file under the app directory again: inside defaultConfig > externalNativeBuild > cmake, add the line arguments "-DANDROID_STL=c++_shared". In addition, add the dependency that references the OpenCV Java classes.

The file after the change reads as follows.

import java.security.MessageDigest

plugins {
    id 'com.android.application'
}

android {
    compileSdkVersion 28
    buildToolsVersion "30.0.3"

    defaultConfig {
        applicationId "com.example.aisample"
        minSdkVersion 16
        targetSdkVersion 28
        versionCode 1
        versionName "1.0"

        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
        externalNativeBuild {
            cmake {
                cppFlags ""
                arguments "-DANDROID_STL=c++_shared"
            }
            ndk {
                // Select the CPU ABIs whose .so libraries should be packaged.
                abiFilters 'armeabi-v7a'
                // 'x86', 'x86_64', 'mips', 'mips64' can also be added
            }
        }
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
    externalNativeBuild {
        cmake {
            path "src/main/cpp/CMakeLists.txt"
            version "3.10.2"
        }
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
    sourceSets {
        main {
            jniLibs.srcDirs = ['src/main/jniLibs']
        }
    }
}

dependencies {
    implementation fileTree(include: ['*.jar'], dir: 'libs')
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'com.google.android.material:material:1.1.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    implementation 'com.android.support:appcompat-v7:28.0.0'
    implementation 'com.android.support:design:28.0.0'
    testImplementation 'junit:junit:4.+'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    implementation files('libs/PaddlePredictor.jar')
    implementation 'org.opencv:openCVLibrary:3.4.0'
}


def paddleLiteLibs = 'https://paddlelite-demo.bj.bcebos.com/libs/android/paddle_lite_libs_v2_9_0.tar.gz'
task downloadAndExtractPaddleLiteLibs(type: DefaultTask) {
    doFirst {
        println "Downloading and extracting Paddle Lite libs"
    }
    doLast {
        // Prepare cache folder for libs
        if (!file("cache").exists()) {
            mkdir "cache"
        }
        // Generate cache name for libs
        MessageDigest messageDigest = MessageDigest.getInstance('MD5')
        messageDigest.update(paddleLiteLibs.bytes)
        String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
        // Download libs
        if (!file("cache/${cacheName}.tar.gz").exists()) {
            ant.get(src: paddleLiteLibs, dest: file("cache/${cacheName}.tar.gz"))
        }
        // Unpack libs
        if (!file("cache/${cacheName}").exists()) {
            copy {
                from tarTree("cache/${cacheName}.tar.gz")
                into "cache/${cacheName}"
            }
        }

        // Copy PaddlePredictor.jar
        if (!file("libs/PaddlePredictor.jar").exists()) {
            copy {
                from "cache/${cacheName}/java/PaddlePredictor.jar"
                into "libs"
            }
        }
        // Copy libpaddle_lite_jni.so for armeabi-v7a and arm64-v8a
        if (!file("src/main/jniLibs/armeabi-v7a/libpaddle_lite_jni.so").exists()) {
            copy {
                from "cache/${cacheName}/java/libs/armeabi-v7a/"
                into "src/main/jniLibs/armeabi-v7a"
                // into "src/main/jni/armeabi-v7a"
            }
        }
        if (!file("src/main/jniLibs/arm64-v8a/libpaddle_lite_jni.so").exists()) {
            copy {
                from "cache/${cacheName}/java/libs/arm64-v8a/"
                into "src/main/jniLibs/arm64-v8a"
                // into "src/main/jni/arm64-v8a"
            }
        }
    }
}
preBuild.dependsOn downloadAndExtractPaddleLiteLibs

def paddleLiteModels = [
    [
        'src' : 'https://paddlelite-demo.bj.bcebos.com/models/deeplab_mobilenet_fp32_for_cpu_v2_9_0.tar.gz',
        'dest': 'src/main/assets/image_segmentation/models/deeplab_mobilenet_for_cpu'
    ],
]
task downloadAndExtractPaddleLiteModels(type: DefaultTask) {
    doFirst {
        println "Downloading and extracting Paddle Lite models"
    }
    doLast {
        // Prepare cache folder for models
        String cachePath = "cache"
        if (!file("${cachePath}").exists()) {
            mkdir "${cachePath}"
        }
        paddleLiteModels.eachWithIndex { model, index ->
            MessageDigest messageDigest = MessageDigest.getInstance('MD5')
            messageDigest.update(model.src.bytes)
            String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
            // Download the target model if not exists
            boolean copyFiles = !file("${model.dest}").exists()
            if (!file("${cachePath}/${cacheName}.tar.gz").exists()) {
                ant.get(src: model.src, dest: file("${cachePath}/${cacheName}.tar.gz"))
                copyFiles = true; // force to copy files from the latest archive files
            }
            // Copy model file
            if (copyFiles) {
                copy {
                    from tarTree("${cachePath}/${cacheName}.tar.gz")
                    into "${model.dest}"
                }
            }
        }
    }
}
preBuild.dependsOn downloadAndExtractPaddleLiteModels

4 Putting the Project Together

Copy the project's custom C++ files into the src\main\cpp\ directory, then add them to the CMakeLists.txt file. The configuration file after the change is as follows.

# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html

# Sets the minimum version of CMake required to build the native library.

cmake_minimum_required(VERSION 3.10.2)

# Declares and names the project.

project("aisample")


# Set the path of the include folder
include_directories(${CMAKE_SOURCE_DIR}/src/main/cpp/include)
include_directories("include")


# Declare the prebuilt OpenCV shared library
add_library(libopencv_java3 SHARED IMPORTED)
set_target_properties(libopencv_java3 PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/../jni/${ANDROID_ABI}/libopencv_java3.so)

# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.

add_library( # Sets the name of the library.
        native-lib

        # Sets the library as a shared library.
        SHARED

        # Provides a relative path to your source file(s).
        native-lib.cpp

        # Every cpp file that is used must be listed here,
        # otherwise the build will break.
        demo/demo.cpp
        demo/Scanner.cpp
        demo/document_detection.cpp
)

# Searches for a specified prebuilt library and stores the path as a
# variable. Because CMake includes system libraries in the search path by
# default, you only need to specify the name of the public NDK library
# you want to add. CMake verifies that the library exists before
# completing its build.

find_library( # Sets the name of the path variable.
        log-lib

        # Specifies the name of the NDK library that
        # you want CMake to locate.
        log)

# Specifies libraries CMake should link to your target library. You
# can link multiple libraries, such as libraries you define in this
# build script, prebuilt third-party libraries, or system libraries.

target_link_libraries( # Specifies the target library.
        native-lib

        # Link OpenCV
        libopencv_java3
        # Links the target library to the log library
        # included in the NDK.
        ${log-lib})

Then modify the MainActivity file. The main changes are as follows.

After the model has run successfully, the custom processing code is invoked on the Paddle Lite output:

public void onRunModelSuccessed() {
    // obtain results and update UI
    tvInferenceTime.setText("Inference time: " + predictor.inferenceTime() + " ms");
    // image produced by Paddle Lite inference
    Bitmap outputImage = predictor.outputImage();
    // the original image
    Bitmap rawImage = predictor.rawImage();
    // the scaled (cropped) image
    Bitmap scaledImage = predictor.scaledImage();

    // hand the Paddle Lite output to the custom processing step
    new Thread(() -> {
        train(rawImage, outputImage);
    }).start();

    if (outputImage != null) {
        ivInputImage.setImageBitmap(outputImage);
    }
    tvOutputResult.setText(predictor.outputResult());
    tvOutputResult.scrollTo(0, 0);
}

The core of the custom processing is as follows.


private void train(Bitmap rawImage, Bitmap outputImage) {

    saveImg(rawImage, "original_rawImage_");
    saveImg(outputImage, "original_outputImage_");

    Bitmap whiteImg = ImageUtil.fileBackgroup(outputImage);

    saveImg(whiteImg, "original_fill_outputImage_");

    // convert the bitmaps to grayscale Mats
    Mat mat_source_gray = ImageUtil.bitMap2GrayMat(rawImage);
    Mat mat_mark_gray = ImageUtil.bitMap2GrayMat(whiteImg);

    // write the grayscale mask back out to check that it is correct
    Bitmap bitmap_mark_dist = ImageUtil.mat2Bitmap(mat_mark_gray);
    saveImg(bitmap_mark_dist, "mark_gray_");

    Mat demo = new Mat();

    test(mat_source_gray.getNativeObjAddr(), mat_mark_gray.getNativeObjAddr(), demo.getNativeObjAddr());

    Bitmap resultImg = ImageUtil.mat2Bitmap(demo);

    saveImg(resultImg, "return_image_");

    // update the UI through the activity's runOnUiThread method
    runOnUiThread(() -> ivInputImage.setImageBitmap(resultImg));
}
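ImageUtil and saveImg above are small helpers of this project and are not listed in the post. Purely as orientation, here is a minimal sketch of what the two Mat/Bitmap conversions typically look like, assuming the OpenCV Android SDK classes org.opencv.android.Utils and org.opencv.imgproc.Imgproc; only the signatures match the calls above, the bodies are assumptions, and the project-specific fileBackgroup (which fills the segmentation background) is omitted.

import android.graphics.Bitmap;

import org.opencv.android.Utils;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

public class ImageUtil {

    // Convert an Android Bitmap into a single-channel grayscale Mat
    public static Mat bitMap2GrayMat(Bitmap bitmap) {
        Mat rgba = new Mat();
        Utils.bitmapToMat(bitmap, rgba);                       // produces an RGBA Mat
        Mat gray = new Mat();
        Imgproc.cvtColor(rgba, gray, Imgproc.COLOR_RGBA2GRAY); // drop the color channels
        return gray;
    }

    // Convert a Mat back into a Bitmap for display or saving
    public static Bitmap mat2Bitmap(Mat mat) {
        Bitmap bitmap = Bitmap.createBitmap(mat.cols(), mat.rows(), Bitmap.Config.ARGB_8888);
        Utils.matToBitmap(mat, bitmap);
        return bitmap;
    }
}

saveImg(bitmap, prefix) is presumably just a convenience method in the activity that compresses the Bitmap to a PNG under the public Pictures directory, which matches the /storage/emulated/0/Pictures path where the intermediate images show up in section 5.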

The JNI method used here is declared as follows.


public native long test(long src, long dist, long returnVal);
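For this declaration to resolve at run time, the native libraries have to be loaded before the method is called. Besides native-lib, which the template's MainActivity already loads, the OpenCV native library also has to be loaded once Mat objects are created on the Java side as in train() above. A minimal sketch of the static initializer, assuming the library names used in this project (the native-lib target from CMakeLists.txt and the libopencv_java3.so copied into the JNI directory):

static {
    // Library built from src/main/cpp (the add_library target in CMakeLists.txt)
    System.loadLibrary("native-lib");
    // Provides the native implementations behind org.opencv.core.Mat and friends
    System.loadLibrary("opencv_java3");
}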

The corresponding definition on the native side is shown below; the jlong parameters carry the native addresses of the OpenCV Mat objects obtained with getNativeObjAddr() in the Java code above.

extern "C" JNIEXPORT jlong JNICALL
Java_com_example_aisample_MainActivity_test(JNIEnv *env, jobject cls, jlong addr_src_pic,
                                            jlong addr_mask_pic, jlong return_val) {
    ////////
    // the actual processing code is omitted here
    //////
}

At this point the complete project structure looks like this.

Figure 13: Final project structure

5 Results

With the processing code in place, run the project; the Android emulator shows the following.

Figure 14: Run result

As Figure 14 shows, the image has been processed and the expected result is obtained.

In addition, using the IDE's Device File Explorer, the following images can be found under /storage/emulated/0/Pictures; together they document the entire image-processing pipeline.

Figure 15: Images generated during processing