Commit 18db02a1 by Pariksheet Pinjari Committed by Tianqi Chen

[WIP] Linux/Android native deploy (#980)

parent 5999f2e2
@@ -128,6 +128,10 @@ ifeq ($(USE_OPENCL), 1)
LDFLAGS += -lOpenCL
endif
RUNTIME_DEP += $(OPENCL_OBJ)
ifdef OPENCL_PATH
CFLAGS += -I$(OPENCL_PATH)/include
LDFLAGS += -L$(OPENCL_PATH)/lib
endif
else
CFLAGS += -DTVM_OPENCL_RUNTIME=0
endif
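With this hunk in place, the OpenCL SDK location can be passed to the build as the `OPENCL_PATH` make variable. A minimal sketch (the SDK path below is illustrative only):

```bash
# build TVM with OpenCL enabled, pointing the runtime at a local OpenCL SDK
make USE_OPENCL=1 OPENCL_PATH=/usr/local/opencl-sdk
```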
*.iml
.gradle
/local.properties
/.idea/workspace.xml
/.idea/libraries
.DS_Store
/build
/captures
.externalNativeBuild
# Android TVM Demo
This folder contains an Android demo app that shows how to deploy a model using the TVM runtime API on an Android phone.
You will need [JDK](http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html), [Android SDK](https://developer.android.com/studio/index.html), [Android NDK](https://developer.android.com/ndk) and an Android device to use this.
## Build and Installation
### Build APK
We use [Gradle](https://gradle.org) to build. Please follow [the installation instructions](https://gradle.org/install) for your operating system.
Before you build the Android application, please refer to the [TVM4J Installation Guide](https://github.com/dmlc/tvm/blob/master/jvm/README.md) and install tvm4j-core to your local Maven repository. You can find the tvm4j dependency declared in `app/build.gradle`; modify it if necessary.
```
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    androidTestCompile('com.android.support.test.espresso:espresso-core:2.2.2', {
        exclude group: 'com.android.support', module: 'support-annotations'
    })
    compile 'com.android.support:appcompat-v7:26.0.1'
    compile 'com.android.support.constraint:constraint-layout:1.0.2'
    compile 'com.android.support:design:26.0.1'
    compile 'ml.dmlc.tvm:tvm4j-core:0.0.1-SNAPSHOT'
    testCompile 'junit:junit:4.12'
}
```
The application defaults to the CPU flavor of the TVM runtime; follow the instructions below to set it up.
In `app/src/main/jni/make` you will find the JNI Makefile config `config.mk`. Copy it to `app/src/main/jni` and modify it as needed.
```bash
cd apps/android_deploy/app/src/main/jni
cp make/config.mk .
```
Here is an example `config.mk`:
```makefile
APP_ABI = arm64-v8a
APP_PLATFORM = android-17
# whether enable OpenCL during compile
USE_OPENCL = 0
```
Now use Gradle to compile the JNI code, resolve Java dependencies and build the Android application together with tvm4j. Run the following script to generate the APK file.
```bash
export ANDROID_HOME=[Path to your Android SDK, e.g., ~/Android/sdk]
cd apps/android_deploy
gradle clean build
```
In `app/build/outputs/apk` you will find `app-release-unsigned.apk`. Use `dev_tools/gen_keystore.sh` to generate a keystore and `dev_tools/sign_apk.sh` to produce the signed APK `app/build/outputs/apk/tvmdemo-release.apk`.
Upload `tvmdemo-release.apk` to your Android device and install it.
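Putting the signing and installation steps together, here is a minimal sketch run from `apps/android_deploy` (assuming `adb` is on your `PATH` and a device is connected):

```bash
# generate a signing keystore (interactive; only needed once)
bash dev_tools/gen_keystore.sh
# sign app-release-unsigned.apk and produce tvmdemo-release.apk
bash dev_tools/sign_apk.sh
# install (or reinstall) the signed APK on the connected device
adb install -r app/build/outputs/apk/tvmdemo-release.apk
```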
### Build with OpenCL
The application does not link against the OpenCL library unless you configure it to. Modify the JNI Makefile config in `app/src/main/jni` with the proper OpenCL configuration for your target.
Here is an example `config.mk` with OpenCL enabled:
```makefile
APP_ABI = arm64-v8a
APP_PLATFORM = android-17
# whether enable OpenCL during compile
USE_OPENCL = 1
# the additional include headers you want to add, e.g., SDK_PATH/adrenosdk/Development/Inc
ADD_C_INCLUDES = /opt/adrenosdk-osx/Development/Inc
# the additional link libs you want to add, e.g., ANDROID_LIB_PATH/libOpenCL.so
ADD_LDLIBS = libOpenCL.so
```
Note that you should specify the correct GPU development headers for your Android device. Run `adb shell dumpsys | grep GLES` to find out which GPU your device uses. It is very likely that the library (`libOpenCL.so`) is already present on the device; for instance, I found it under `/system/vendor/lib64`. You can run `adb pull /system/vendor/lib64/libOpenCL.so ./` to copy the file to your desktop.
After you set up `config.mk`, follow the instructions in [Build APK](#build-apk) to build the Android package with the OpenCL flavor.
## Cross Compile and Run on Android Devices
### Architecture and Android Standalone Toolchain
In order to cross compile a shared library (`.so`) for your Android device, you have to know the target triple for the device (refer to [Cross-compilation using Clang](https://clang.llvm.org/docs/CrossCompilation.html) for more information). Run `adb shell cat /proc/cpuinfo` to list the device's CPU information.
Now use the NDK to generate a standalone toolchain for your device. For my test device, I used the following command.
```bash
cd /opt/android-ndk/build/tools/
./make-standalone-toolchain.sh --platform=android-24 --use-llvm --arch=arm64 --install-dir=/opt/android-toolchain-arm64
```
If everything goes well, you will find the compiler tools in `/opt/android-toolchain-arm64/bin`. For example, `bin/aarch64-linux-android-g++` can be used to compile C++ source code and create shared libraries for arm64 Android devices.
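As a quick sanity check that the toolchain works, you can compile a trivial shared library; the file names below are made up for illustration:

```bash
export TOOLCHAIN=/opt/android-toolchain-arm64
cat > hello.cc <<'EOF'
extern "C" int answer() { return 42; }
EOF
# cross compile a tiny arm64 shared library with the standalone toolchain
$TOOLCHAIN/bin/aarch64-linux-android-g++ -std=c++11 -shared -fPIC -o libhello.so hello.cc
```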
### Place compiled model on Android application assets folder
Follow the instructions [here](https://github.com/dmlc/tvm/blob/master/docs/how_to/deploy_android.md#build-model-for-android-target) to get a compiled model for the Android target.
Copy the compiled model files `deploy_lib.so`, `deploy_graph.json` and `deploy_param.params` to `apps/android_deploy/app/src/main/assets/` and set the TVM runtime flavor in [MainActivity.java](https://github.com/dmlc/tvm/blob/master/apps/android_deploy/app/src/main/java/ml/dmlc/tvm/android/demo/MainActivity.java#L81).
`CPU version flavor`
```
private static final boolean EXE_GPU = false;
```
`OpenCL version flavor`
```
private static final boolean EXE_GPU = true;
```
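The copy step above, expressed as shell commands; this is a sketch that assumes the compiled artifacts sit in your current working directory. Note that `MainActivity.java` loads the library under the name given by `MODEL_CPU_LIB_FILE` or `MODEL_CL_LIB_FILE`, so rename the `.so` file if your name differs:

```bash
# place the compiled model artifacts in the app's assets folder
cp deploy_lib.so deploy_graph.json deploy_param.params \
   apps/android_deploy/app/src/main/assets/
```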
Install the compiled Android application on your phone and enjoy the image classifier demo using the extraction model.
You can define your own TVM operators, deploy them via this demo application on your Android device, and find the most optimized TVM schedule.
// import DownloadModels task
project.ext.ASSET_DIR = projectDir.toString() + '/src/main/assets'
project.ext.TMP_DIR = project.buildDir.toString() + '/downloads'
// Download default models (compiled version of the darknet framework extraction model);
// if you wish to use your own models then place them in the "assets" directory
// and comment out this line.
apply from: "download-models.gradle"
apply plugin: 'com.android.application'
task buildJni(type: Exec, description: 'Build JNI libs') {
commandLine 'sh', 'src/main/jni/build.sh'
}
tasks.withType(JavaCompile) {
compileTask -> compileTask.dependsOn buildJni
}
android {
compileSdkVersion 26
buildToolsVersion "26.0.1"
defaultConfig {
applicationId "ml.dmlc.tvm.android.demo"
minSdkVersion 17
targetSdkVersion 26
versionCode 1
versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
sourceSets {
main {
jni.srcDirs = []
jniLibs.srcDirs = ['src/main/libs']
assets.srcDirs = [project.ext.ASSET_DIR]
}
}
}
dependencies {
compile fileTree(dir: 'libs', include: ['*.jar'])
androidTestCompile('com.android.support.test.espresso:espresso-core:2.2.2', {
exclude group: 'com.android.support', module: 'support-annotations'
})
compile 'com.android.support:appcompat-v7:26.0.1'
compile 'com.android.support.constraint:constraint-layout:1.0.2'
compile 'com.android.support:design:26.0.1'
compile 'ml.dmlc.tvm:tvm4j-core:0.0.1-SNAPSHOT'
testCompile 'junit:junit:4.12'
}
/*
* download-models.gradle
* Downloads model files from ${MODEL_URL} into application's asset folder
* Input:
* project.ext.TMP_DIR: absolute path to hold downloaded zip files
* project.ext.ASSET_DIR: absolute path to save unzipped model files
* Output:
* 3 model files will be downloaded into given folder of ext.ASSET_DIR
*/
// hard coded model files
def models = ['extraction.zip']
// Root URL for model archives
def MODEL_URL = 'https://github.com/PariksheetPinjari909/TVM_models/blob/master/extraction_model'
buildscript {
repositories {
jcenter()
}
dependencies {
classpath 'de.undercouch:gradle-download-task:3.2.0'
}
}
import de.undercouch.gradle.tasks.download.Download
task downloadFile(type: Download){
for (f in models) {
src "${MODEL_URL}/" + f + "?raw=true"
dest new File(project.ext.TMP_DIR + "/" + f)
}
overwrite true
}
task extractModels(type: Copy) {
def needDownload = false
for (f in models) {
def localFile = f.split("/")[-1]
if (!(new File(project.ext.TMP_DIR + '/' + localFile)).exists()) {
needDownload = true
}
}
if (needDownload) {
dependsOn downloadFile
}
for (f in models) {
def localFile = f.split("/")[-1]
from zipTree(project.ext.TMP_DIR + '/' + localFile)
}
into file(project.ext.ASSET_DIR)
fileMode 0644
exclude '**/LICENSE'
}
tasks.whenTaskAdded { task ->
if (task.name == 'assembleDebug') {
task.dependsOn 'extractModels'
}
if (task.name == 'assembleRelease') {
task.dependsOn 'extractModels'
}
}
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="ml.dmlc.tvm.android.demo" >
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<application
android:allowBackup="true"
android:label="@string/app_name"
android:supportsRtl="true"
android:theme="@style/AppTheme" >
<activity
android:name=".MainActivity"
android:label="@string/app_name"
android:theme="@style/AppTheme.NoActionBar"
android:screenOrientation="portrait">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<provider
android:name="android.support.v4.content.FileProvider"
android:authorities="${applicationId}.provider"
android:exported="false"
android:grantUriPermissions="true">
<meta-data
android:name="android.support.FILE_PROVIDER_PATHS"
android:resource="@xml/provider_paths"/>
</provider>
</application>
<uses-permission android:name="android.permission.INTERNET" />
</manifest>
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package ml.dmlc.tvm.android.demo;
import android.Manifest;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.res.AssetManager;
import android.app.AlertDialog;
import android.app.ProgressDialog;
import android.content.DialogInterface;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Matrix;
import android.net.Uri;
import android.os.AsyncTask;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.os.SystemClock;
import android.provider.MediaStore;
import android.support.v4.content.FileProvider;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.Toolbar;
import android.util.Log;
import android.view.View;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Vector;
import ml.dmlc.tvm.Function;
import ml.dmlc.tvm.Module;
import ml.dmlc.tvm.NDArray;
import ml.dmlc.tvm.TVMContext;
import ml.dmlc.tvm.TVMValue;
import ml.dmlc.tvm.TVMType;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int PERMISSIONS_REQUEST = 100;
private static final int PICTURE_FROM_GALLERY = 101;
private static final int PICTURE_FROM_CAMERA = 102;
private static final int IMAGE_PREVIEW_WIDTH = 960;
private static final int IMAGE_PREVIEW_HEIGHT = 720;
// TVM constants
private static final int OUTPUT_INDEX = 0;
private static final int IMG_CHANNEL = 3;
private static final String INPUT_NAME = "data";
// Configuration values for the extraction model. Note that the graph, lib and params are not
// included with TVM and must be manually placed in the assets/ directory by the user.
// Graphs and models downloaded from https://github.com/pjreddie/darknet/blob/ may be
// converted e.g. via define_and_compile_model.py.
private static final boolean EXE_GPU = false;
private static final int MODEL_INPUT_SIZE = 224;
private static final String MODEL_CL_LIB_FILE = "file:///android_asset/deploy_lib_opencl.so";
private static final String MODEL_CPU_LIB_FILE = "file:///android_asset/deploy_lib_cpu.so";
private static final String MODEL_GRAPH_FILE = "file:///android_asset/deploy_graph.json";
private static final String MODEL_PARAM_FILE = "file:///android_asset/deploy_param.params";
private static final String MODEL_LABEL_FILE = "file:///android_asset/imagenet.shortnames.list";
private Uri mCameraImageUri;
private ImageView mImageView;
private TextView mResultView;
private AssetManager assetManager;
private Module graphRuntimeModule;
private Vector<String> labels = new Vector<String>();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
Toolbar toolbar = findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
assetManager = getAssets();
mImageView = (ImageView) findViewById(R.id.imageView);
mResultView = (TextView) findViewById(R.id.resultTextView);
findViewById(R.id.btnPickImage).setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
showPictureDialog();
}
});
if (hasPermission()) {
// instantiate tvm runtime and setup environment on background after application begin
new LoadModleAsyncTask().execute();
} else {
requestPermission();
}
}
/*
Load precompiled model on TVM graph runtime and init the system.
*/
private class LoadModleAsyncTask extends AsyncTask<Void, Void, Integer> {
ProgressDialog dialog = new ProgressDialog(MainActivity.this);
@Override
protected Integer doInBackground(Void... args) {
// load synset name
String lableFilename = MODEL_LABEL_FILE.split("file:///android_asset/")[1];
Log.i(TAG, "Reading synset name from: " + lableFilename);
try {
String labelsContent = new String(getBytesFromFile(assetManager, lableFilename));
for (String line : labelsContent.split("\\r?\\n")) {
labels.add(line);
}
} catch (IOException e) {
Log.e(TAG, "Problem reading synset name file!" + e);
return -1;//failure
}
// load json graph
String modelGraph = null;
String graphFilename = MODEL_GRAPH_FILE.split("file:///android_asset/")[1];
Log.i(TAG, "Reading json graph from: " + graphFilename);
try {
modelGraph = new String(getBytesFromFile(assetManager, graphFilename));
} catch (IOException e) {
Log.e(TAG, "Problem reading json graph file!" + e);
return -1;//failure
}
// upload tvm compiled function on application cache folder
String libCacheFilePath = null;
String libFilename = EXE_GPU ? MODEL_CL_LIB_FILE.split("file:///android_asset/")[1] :
MODEL_CPU_LIB_FILE.split("file:///android_asset/")[1];
Log.i(TAG, "Uploading compiled function to cache folder");
try {
libCacheFilePath = getTempLibFilePath(libFilename);
byte[] modelLibByte = getBytesFromFile(assetManager, libFilename);
FileOutputStream fos = new FileOutputStream(libCacheFilePath);
fos.write(modelLibByte);
fos.close();
} catch (IOException e) {
Log.e(TAG, "Problem uploading compiled function!" + e);
return -1;//failure
}
// load parameters
byte[] modelParams = null;
String paramFilename = MODEL_PARAM_FILE.split("file:///android_asset/")[1];
try {
modelParams = getBytesFromFile(assetManager, paramFilename);
} catch (IOException e) {
Log.e(TAG, "Problem reading params file!" + e);
return -1;//failure
}
// create java tvm context
TVMContext tvmCtx = EXE_GPU ? TVMContext.opencl() : TVMContext.cpu();
// tvm module for compiled functions
Module modelLib = Module.load(libCacheFilePath);
// get global function module for graph runtime
Function runtimeCreFun = Function.getFunction("tvm.graph_runtime.create");
TVMValue runtimeCreFunRes = runtimeCreFun.pushArg(modelGraph)
.pushArg(modelLib)
.pushArg(tvmCtx.deviceType)
.pushArg(tvmCtx.deviceId)
.invoke();
graphRuntimeModule = runtimeCreFunRes.asModule();
// get the function from the module(load parameters)
Function loadParamFunc = graphRuntimeModule.getFunction("load_params");
loadParamFunc.pushArg(modelParams).invoke();
// release tvm local variables
modelLib.release();
loadParamFunc.release();
runtimeCreFun.release();
return 0;//success
}
@Override
protected void onPreExecute() {
dialog.setCancelable(false);
dialog.setMessage("Loading Model...");
dialog.show();
super.onPreExecute();
}
@Override
protected void onPostExecute(Integer status) {
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (status != 0) {
showDialog("Error", "Fail to initialized model, check compiled model");
}
}
}
/*
Execute prediction for processed decode input bitmap image content on TVM graph runtime.
*/
private class ModelRunAsyncTask extends AsyncTask<Bitmap, Void, Integer> {
ProgressDialog dialog = new ProgressDialog(MainActivity.this);
@Override
protected Integer doInBackground(Bitmap... bitmaps) {
if (null != graphRuntimeModule) {
int count = bitmaps.length;
for (int i = 0 ; i < count ; i++) {
long processingTimeMs = SystemClock.uptimeMillis();
Log.i(TAG, "Decode JPEG image content");
// extract the jpeg content
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmaps[i].compress(Bitmap.CompressFormat.JPEG,100,stream);
byte[] byteArray = stream.toByteArray();
Bitmap imageBitmap = BitmapFactory.decodeByteArray(byteArray, 0, byteArray.length);
// crop input image at centre to model input size
// commercial deploy note: instead of cropping the image, resize it
// to the model input size so we never lose image content
Bitmap cropImageBitmap = Bitmap.createBitmap(MODEL_INPUT_SIZE, MODEL_INPUT_SIZE, Bitmap.Config.ARGB_8888);
Matrix frameToCropTransform = getTransformationMatrix(imageBitmap.getWidth(), imageBitmap.getHeight(),
MODEL_INPUT_SIZE, MODEL_INPUT_SIZE, 0, true);
Canvas canvas = new Canvas(cropImageBitmap);
canvas.drawBitmap(imageBitmap, frameToCropTransform, null);
// image pixel int values
int[] pixelValues = new int[MODEL_INPUT_SIZE * MODEL_INPUT_SIZE];
// image RGB float values
float[] imgRgbValues = new float[MODEL_INPUT_SIZE * MODEL_INPUT_SIZE * IMG_CHANNEL];
// image RGB transpose float values
float[] imgRgbTranValues = new float[MODEL_INPUT_SIZE * MODEL_INPUT_SIZE * IMG_CHANNEL];
// pre-process the image data from 0-255 int to normalized float based on the
// provided parameters.
cropImageBitmap.getPixels(pixelValues, 0, MODEL_INPUT_SIZE, 0, 0, MODEL_INPUT_SIZE, MODEL_INPUT_SIZE);
for (int j = 0; j < pixelValues.length; ++j) {
imgRgbValues[j * 3 + 0] = ((pixelValues[j] >> 16) & 0xFF)/255.0f;
imgRgbValues[j * 3 + 1] = ((pixelValues[j] >> 8) & 0xFF)/255.0f;
imgRgbValues[j * 3 + 2] = (pixelValues[j] & 0xFF)/255.0f;
}
// pre-process the image rgb data transpose based on the provided parameters.
for (int k = 0; k < IMG_CHANNEL; ++k) {
for (int l = 0; l < MODEL_INPUT_SIZE; ++l) {
for (int m = 0; m < MODEL_INPUT_SIZE; ++m) {
int dst_index = m + MODEL_INPUT_SIZE*l + MODEL_INPUT_SIZE*MODEL_INPUT_SIZE*k;
int src_index = k + IMG_CHANNEL*m + IMG_CHANNEL*MODEL_INPUT_SIZE*l;
imgRgbTranValues[dst_index] = imgRgbValues[src_index];
}
}
}
// get the function from the module(set input data)
Log.i(TAG, "set input data");
NDArray inputNdArray = NDArray.empty(new long[]{1, IMG_CHANNEL, MODEL_INPUT_SIZE, MODEL_INPUT_SIZE}, new TVMType("float32"));
inputNdArray.copyFrom(imgRgbTranValues);
Function setInputFunc = graphRuntimeModule.getFunction("set_input");
setInputFunc.pushArg(INPUT_NAME).pushArg(inputNdArray).invoke();
// release tvm local variables
inputNdArray.release();
setInputFunc.release();
// get the function from the module(run it)
Log.i(TAG, "run function on target");
Function runFunc = graphRuntimeModule.getFunction("run");
runFunc.invoke();
// release tvm local variables
runFunc.release();
// get the function from the module(get output data)
Log.i(TAG, "get output data");
NDArray outputNdArray = NDArray.empty(new long[]{1000}, new TVMType("float32"));
Function getOutputFunc = graphRuntimeModule.getFunction("get_output");
getOutputFunc.pushArg(OUTPUT_INDEX).pushArg(outputNdArray).invoke();
float[] output = outputNdArray.asFloatArray();
// release tvm local variables
outputNdArray.release();
getOutputFunc.release();
// display the result from extracted output data
if (null != output) {
int maxPosition = -1;
float maxValue = 0;
for (int j = 0; j < output.length; ++j) {
if (output[j] > maxValue) {
maxValue = output[j];
maxPosition = j;
}
}
processingTimeMs = SystemClock.uptimeMillis() - processingTimeMs;
String label = "Prediction Result : ";
label += labels.size() > maxPosition ? labels.get(maxPosition) : "unknown";
label += "\nPrediction Time : " + processingTimeMs + "ms";
mResultView.setText(label);
}
Log.i(TAG, "prediction finished");
}
return 0;
}
return -1;
}
@Override
protected void onPreExecute() {
dialog.setCancelable(false);
dialog.setMessage("Prediction running on image...");
dialog.show();
super.onPreExecute();
}
@Override
protected void onPostExecute(Integer status) {
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (status != 0) {
showDialog("Error", "Fail to predict image, GraphRuntime exception");
}
}
}
@Override
protected void onDestroy() {
// release tvm local variables
if (null != graphRuntimeModule)
graphRuntimeModule.release();
super.onDestroy();
}
/**
* Read file from assets and return byte array.
*
* @param assets The asset manager to be used to load assets.
* @param fileName The filepath of read file.
* @return byte[] file content
* @throws IOException
*/
private byte[] getBytesFromFile(AssetManager assets, String fileName) throws IOException {
InputStream is = assets.open(fileName);
int length = is.available();
byte[] bytes = new byte[length];
// Read in the bytes
int offset = 0;
int numRead = 0;
try {
while (offset < bytes.length
&& (numRead = is.read(bytes, offset, bytes.length - offset)) >= 0) {
offset += numRead;
}
} finally {
is.close();
}
// Ensure all the bytes have been read in
if (offset < bytes.length) {
throw new IOException("Could not completely read file " + fileName);
}
return bytes;
}
/**
* Dialog show pick option for select image from Gallery or Camera.
*/
private void showPictureDialog(){
AlertDialog.Builder pictureDialog = new AlertDialog.Builder(this);
pictureDialog.setTitle("Select Action");
String[] pictureDialogItems = {
"Select photo from gallery",
"Capture photo from camera" };
pictureDialog.setItems(pictureDialogItems,
new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
switch (which) {
case 0:
choosePhotoFromGallery();
break;
case 1:
takePhotoFromCamera();
break;
}
}
});
pictureDialog.show();
}
/**
* Request to pick image from Gallery.
*/
public void choosePhotoFromGallery() {
Intent galleryIntent = new Intent(Intent.ACTION_PICK,
android.provider.MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(galleryIntent, PICTURE_FROM_GALLERY);
}
/**
* Request to capture image from Camera.
*/
private void takePhotoFromCamera() {
Intent intent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
mCameraImageUri = Uri.fromFile(createImageFile());
} else {
File file = new File(createImageFile().getPath());
mCameraImageUri = FileProvider.getUriForFile(getApplicationContext(), getApplicationContext().getPackageName() + ".provider", file);
}
intent.putExtra(MediaStore.EXTRA_OUTPUT, mCameraImageUri);
startActivityForResult(intent, PICTURE_FROM_CAMERA);
}
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == this.RESULT_CANCELED) {
return;
}
Uri contentURI = null;
if (requestCode == PICTURE_FROM_GALLERY) {
if (data != null) {
contentURI = data.getData();
}
} else if (requestCode == PICTURE_FROM_CAMERA) {
contentURI = mCameraImageUri;
}
if (null != contentURI) {
try {
Bitmap bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), contentURI);
Bitmap scaled = Bitmap.createScaledBitmap(bitmap, IMAGE_PREVIEW_HEIGHT, IMAGE_PREVIEW_WIDTH, true);
mImageView.setImageBitmap(scaled);
new ModelRunAsyncTask().execute(scaled);
} catch (IOException e) {
e.printStackTrace();
}
}
}
/**
* Get application cache path where to place compiled functions.
*
* @param fileName library file name.
* @return String application cache folder path
* @throws IOException
*/
private final String getTempLibFilePath(String fileName) throws IOException {
File tempDir = File.createTempFile("tvm4j_demo_", "");
if (!tempDir.delete() || !tempDir.mkdir()) {
throw new IOException("Couldn't create directory " + tempDir.getAbsolutePath());
}
return (tempDir + File.separator + fileName);
}
/**
* Create image file under storage where camera application save captured image.
*
* @return File image file under sdcard where camera can save image
*/
private File createImageFile() {
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String imageFileName = "JPEG_" + timeStamp + "_";
File storageDir = Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES);
try {
File image = File.createTempFile(
imageFileName, // prefix
".jpg", // suffix
storageDir // directory
);
return image;
} catch (IOException e) {
e.printStackTrace();
}
return null;
}
/**
* Show dialog to user.
*
* @param title dialog display title
* @param msg dialog display message
*/
private void showDialog(String title, String msg) {
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setTitle(title);
builder.setMessage(msg);
builder.setCancelable(true);
builder.setNeutralButton(android.R.string.ok,
new DialogInterface.OnClickListener() {
public void onClick(DialogInterface dialog, int id) {
dialog.cancel();
finish();
}
});
builder.create().show();
}
@Override
public void onRequestPermissionsResult (final int requestCode, final String[] permissions, final int[] grantResults){
if (requestCode == PERMISSIONS_REQUEST) {
if (grantResults.length > 0
&& grantResults[0] == PackageManager.PERMISSION_GRANTED
&& grantResults[1] == PackageManager.PERMISSION_GRANTED) {
// instantiate tvm runtime and setup environment on background after application begin
new LoadModleAsyncTask().execute();
} else {
requestPermission();
}
}
}
/**
* Whether application has required mandatory permissions to run.
*/
private boolean hasPermission() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
return checkSelfPermission(Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED &&
checkSelfPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE) == PackageManager.PERMISSION_GRANTED;
} else {
return true;
}
}
/**
* Request required mandatory permission for application to run.
*/
private void requestPermission() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) ||
shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE)) {
Toast.makeText(this,
"Camera AND storage permission are required for this demo", Toast.LENGTH_LONG).show();
}
requestPermissions(new String[] {Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE}, PERMISSIONS_REQUEST);
}
}
/**
* Returns a transformation matrix from one reference frame into another.
* Handles cropping (if maintaining aspect ratio is desired) and rotation.
*
* @param srcWidth Width of source frame.
* @param srcHeight Height of source frame.
* @param dstWidth Width of destination frame.
* @param dstHeight Height of destination frame.
* @param applyRotation Amount of rotation to apply from one frame to another.
* Must be a multiple of 90.
* @param maintainAspectRatio If true, will ensure that scaling in x and y remains constant,
* cropping the image if necessary.
* @return The transformation fulfilling the desired requirements.
*/
public static Matrix getTransformationMatrix(
final int srcWidth,
final int srcHeight,
final int dstWidth,
final int dstHeight,
final int applyRotation,
final boolean maintainAspectRatio) {
final Matrix matrix = new Matrix();
if (applyRotation != 0) {
if (applyRotation % 90 != 0) {
Log.w(TAG, "Rotation of %d % 90 != 0 " + applyRotation);
}
// Translate so center of image is at origin.
matrix.postTranslate(-srcWidth / 2.0f, -srcHeight / 2.0f);
// Rotate around origin.
matrix.postRotate(applyRotation);
}
// Account for the already applied rotation, if any, and then determine how
// much scaling is needed for each axis.
final boolean transpose = (Math.abs(applyRotation) + 90) % 180 == 0;
final int inWidth = transpose ? srcHeight : srcWidth;
final int inHeight = transpose ? srcWidth : srcHeight;
// Apply scaling if necessary.
if (inWidth != dstWidth || inHeight != dstHeight) {
final float scaleFactorX = dstWidth / (float) inWidth;
final float scaleFactorY = dstHeight / (float) inHeight;
if (maintainAspectRatio) {
// Scale by minimum factor so that dst is filled completely while
// maintaining the aspect ratio. Some image may fall off the edge.
final float scaleFactor = Math.max(scaleFactorX, scaleFactorY);
matrix.postScale(scaleFactor, scaleFactor);
} else {
// Scale exactly to fill dst from src.
matrix.postScale(scaleFactorX, scaleFactorY);
}
}
if (applyRotation != 0) {
// Translate back from origin centered reference to destination frame.
matrix.postTranslate(dstWidth / 2.0f, dstHeight / 2.0f);
}
return matrix;
}
}
LOCAL_PATH := $(call my-dir)
MY_PATH := $(LOCAL_PATH)
include $(CLEAR_VARS)
LOCAL_PATH := $(MY_PATH)
ROOT_PATH := $(MY_PATH)/../../../../../..
ifndef config
ifneq ("$(wildcard ./config.mk)","")
config ?= config.mk
else
config ?= make/config.mk
endif
endif
include $(config)
LOCAL_SRC_FILES := ml_dmlc_tvm_native_c_api.cc
LOCAL_LDFLAGS := -L$(SYSROOT)/usr/lib/ -llog
LOCAL_C_INCLUDES := $(ROOT_PATH)/include \
$(ROOT_PATH)/dlpack/include \
$(ROOT_PATH)/dmlc-core/include \
$(ROOT_PATH)/HalideIR/src \
$(ROOT_PATH)/topi/include
LOCAL_MODULE = tvm4j_runtime_packed
LOCAL_CPP_FEATURES += exceptions
LOCAL_LDLIBS += -latomic
LOCAL_ARM_MODE := arm
ifdef ADD_C_INCLUDES
LOCAL_C_INCLUDES += $(ADD_C_INCLUDES)
endif
ifdef ADD_LDLIBS
LOCAL_LDLIBS += $(ADD_LDLIBS)
endif
include $(BUILD_SHARED_LIBRARY)
ifndef config
ifneq ("$(wildcard ./config.mk)","")
config ?= config.mk
else
config ?= make/config.mk
endif
endif
include $(config)
APP_STL := gnustl_static
APP_CPPFLAGS += -DDMLC_LOG_STACK_TRACE=0 -DTVM4J_ANDROID=1 -std=c++11 -Oz -frtti
ifeq ($(USE_OPENCL), 1)
APP_CPPFLAGS += -DTVM_OPENCL_RUNTIME=1
endif
#!/bin/bash
PATH="$PATH:/usr/local/bin"
CURR_DIR=$(cd `dirname $0`; pwd)
ROOT_DIR="$CURR_DIR/../../../../../.."
javah -o $CURR_DIR/ml_dmlc_tvm_native_c_api.h -cp "$ROOT_DIR/jvm/core/target/*" ml.dmlc.tvm.LibInfo || exit -1
cp -f $ROOT_DIR/jvm/native/src/main/native/ml_dmlc_tvm_native_c_api.cc $CURR_DIR/ || exit -1
cp -f $ROOT_DIR/jvm/native/src/main/native/jni_helper_func.h $CURR_DIR/ || exit -1
rm -rf $CURR_DIR/../libs
ndk-build --directory=$CURR_DIR
#-------------------------------------------------------------------------------
# Template configuration for compiling
#
# If you want to change the configuration, please use the following
# steps. Assume you are on the root directory. First copy the this
# file so that any local changes will be ignored by git
#
# cp make/config.mk .
#
# Next modify the according entries, and then compile by
#
# ./build.sh
#
#-------------------------------------------------------------------------------
APP_ABI = all
APP_PLATFORM = android-17
# whether enable OpenCL during compile
USE_OPENCL = 0
# the additional include headers you want to add, e.g., SDK_PATH/adrenosdk/Development/Inc
ADD_C_INCLUDES =
# the additional link libs you want to add, e.g., ANDROID_LIB_PATH/libOpenCL.so
ADD_LDLIBS =
/*!
* Copyright (c) 2018 by Contributors
* \file tvm_runtime.h
* \brief Pack all tvm runtime source files
*/
#include <sys/stat.h>
#include <fstream>
#include "../src/runtime/c_runtime_api.cc"
#include "../src/runtime/cpu_device_api.cc"
#include "../src/runtime/workspace_pool.cc"
#include "../src/runtime/module_util.cc"
#include "../src/runtime/system_lib_module.cc"
#include "../src/runtime/module.cc"
#include "../src/runtime/registry.cc"
#include "../src/runtime/file_util.cc"
#include "../src/runtime/dso_module.cc"
#include "../src/runtime/thread_pool.cc"
#include "../src/runtime/threading_backend.cc"
#include "../src/runtime/graph/graph_runtime.cc"
#ifdef TVM_OPENCL_RUNTIME
#include "../src/runtime/opencl/opencl_device_api.cc"
#include "../src/runtime/opencl/opencl_module.cc"
#endif
<?xml version="1.0" encoding="utf-8"?>
<android.support.design.widget.CoordinatorLayout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="ml.dmlc.tvm.android.demo.MainActivity">
<android.support.design.widget.AppBarLayout
android:layout_height="wrap_content"
android:layout_width="match_parent"
android:theme="@style/AppTheme.AppBarOverlay">
<android.support.v7.widget.Toolbar
android:id="@+id/toolbar"
android:layout_width="match_parent"
android:layout_height="?attr/actionBarSize"
android:background="?attr/colorPrimary"
app:popupTheme="@style/AppTheme.PopupOverlay" />
</android.support.design.widget.AppBarLayout>
<include layout="@layout/content_main"/>
</android.support.design.widget.CoordinatorLayout>
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:orientation="vertical"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
app:layout_behavior="@string/appbar_scrolling_view_behavior"
tools:showIn="@layout/activity_main">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="horizontal">
<Button
android:id="@+id/btnPickImage"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_weight="1"
android:text="Select or Capture picture" />
</LinearLayout>
<View
android:layout_width="match_parent"
android:layout_height="4dp" />
<TextView
android:id="@+id/resultTextView"
android:layout_width="match_parent"
android:layout_height="60dp"
android:layout_weight="1"
android:textColor="@color/colorPrimary"
android:textSize="15sp" />
<View
android:layout_width="match_parent"
android:layout_height="4dp" />
<ImageView
android:id="@+id/imageView"
android:layout_width="match_parent"
android:layout_height="375dp"
android:layout_weight="1" />
<View
android:layout_width="match_parent"
android:layout_height="10dp" />
</LinearLayout>
<?xml version="1.0" encoding="utf-8"?>
<resources>
<color name="colorPrimary">#3F51B5</color>
<color name="colorPrimaryDark">#303F9F</color>
<color name="colorAccent">#06d467</color>
</resources>
<resources>
<string name="app_name">TVM Android Demo</string>
</resources>
<resources>
<!-- Base application theme. -->
<style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
<!-- Customize your theme here. -->
<item name="colorPrimary">@color/colorPrimary</item>
<item name="colorPrimaryDark">@color/colorPrimaryDark</item>
<item name="colorAccent">@color/colorAccent</item>
</style>
<style name="AppTheme.NoActionBar">
<item name="windowActionBar">false</item>
<item name="windowNoTitle">true</item>
</style>
<style name="AppTheme.AppBarOverlay" parent="ThemeOverlay.AppCompat.Dark.ActionBar" />
<style name="AppTheme.PopupOverlay" parent="ThemeOverlay.AppCompat.Light" />
</resources>
<?xml version="1.0" encoding="utf-8"?>
<paths xmlns:android="http://schemas.android.com/apk/res/android">
<external-path name="external_files" path="."/>
</paths>
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:2.3.3'
classpath 'org.apache.httpcomponents:httpclient:4.5.4'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
jcenter()
maven {
url 'https://maven.google.com'
}
mavenLocal()
mavenCentral()
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
#!/bin/bash
CURR_DIR=$(cd `dirname $0`; pwd)
keytool -genkey -keystore $CURR_DIR/tvmdemo.keystore -alias tvmdemo -keyalg RSA -validity 10000
#!/bin/bash
CURR_DIR=$(cd `dirname $0`; pwd)
APK_DIR=$CURR_DIR/../app/build/outputs/apk
UNSIGNED_APK=$APK_DIR/app-release-unsigned.apk
SIGNED_APK=$APK_DIR/tvmdemo-release.apk
jarsigner -verbose -keystore $CURR_DIR/tvmdemo.keystore -signedjar $SIGNED_APK $UNSIGNED_APK 'tvmdemo'
echo $SIGNED_APK
# How to deploy and use compiled model on Android
This tutorial explains the following aspects (unlike the android_rpc approach we already have):
* Build a model for android target
* TVM run on Android using Java API
As an example, here is a reference block diagram.
![](http://www.tvmlang.org/images/release/tvm_flexible.png)
## Build model for Android Target
NNVM compilation of a model for the Android target can follow the same approach as android_rpc.
A reference example can be found at [chainer-nnvm-example](https://github.com/tkat0/chainer-nnvm-example).
The above example runs the compiled model directly on the RPC target. The modification below at [run_mobile.py](https://github.com/tkat0/chainer-nnvm-example/blob/5b97fd4d41aa4dde4b0aceb0be311054fb5de451/run_mobile.py#L64) saves the compilation output that is required on the Android target.
```
# lib, graph and params come from nnvm.compiler.build(...) earlier in run_mobile.py
from tvm.contrib import ndk
import nnvm.compiler

lib.export_library("deploy_lib.so", ndk.create_shared)
with open("deploy_graph.json", "w") as fo:
    fo.write(graph.json())
with open("deploy_param.params", "wb") as fo:
    fo.write(nnvm.compiler.save_param_dict(params))
```
The files `deploy_lib.so`, `deploy_graph.json` and `deploy_param.params` will go to the Android target.
## TVM run on Android using Java API
### TVM Runtime for Android Target
Refer to [this guide](https://github.com/dmlc/tvm/blob/master/apps/android_deploy/README.md#build-and-installation) to build the CPU/OpenCL flavor of the TVM runtime for the Android target.
### Android Native API Reference
Refer to the [MainActivity.java](https://github.com/dmlc/tvm/blob/master/apps/android_deploy/app/src/main/java/ml/dmlc/tvm/android/demo/MainActivity.java) sample source for how to load and execute a model from the Android Java TVM API.