Commit ad01a66c by Tianqi Chen Committed by GitHub

[DOCS] Integration guideline (#1135)

parent 5f0a517d
...@@ -34,3 +34,8 @@ on how to generate the library and [cpp_deploy.cc](https://github.com/dmlc/tvm/t
- Bundle the compiled library into your project in system module mode.
Dynamic loading is more flexible and can load new modules on the fly. The system module is a more ```static``` approach that can be used in places where dynamic library loading is banned.
How to Deploy on Android
------------------------
- [How to deploy on android](deploy_android.md)
\ No newline at end of file
# How to deploy and use compiled model on Android
This tutorial explains the following aspects (unlike the android_rpc approach we already have)
* Build a model for android target
...@@ -36,5 +35,3 @@ Refer [here](https://github.com/dmlc/tvm/blob/master/apps/android_deploy/README.
### Android Native API Reference
Refer to this [java](https://github.com/dmlc/tvm/blob/master/apps/android_deploy/app/src/main/java/ml/dmlc/tvm/android/demo/MainActivity.java) sample source for how to load a model and execute it using the Android Java TVM API.
Integrate TVM into Your Project
===============================
TVM's runtime is designed to be lightweight and portable.
There are several ways you can integrate TVM into your project.
If you are looking for minimal deployment of a compiled module, take a look at the [deployment guide](deploy.md).
This article introduces possible ways to integrate TVM
as a JIT compiler to generate functions on your system.
## DLPack Support
TVM's generated functions follow the PackedFunc convention.
A PackedFunc can take positional arguments of standard types
such as float, integer, and string.
It takes array arguments as DLTensor pointers following the [dlpack](https://github.com/dmlc/dlpack) convention.
So the only thing you need to do is create a corresponding DLTensor object.
## Integrate User Defined C++ Array
The only thing we have to do in C++ is to convert your array to a DLTensor and pass its address as
```DLTensor*``` to the generated function.
## Integrate User Defined Python Array
Assume you have a python object ```MyArray```. There are three things that you need to do:
- Add a ```_tvm_tcode``` field to your array which returns ```tvm.TypeCode.ARRAY_HANDLE```
- Support a ```_tvm_handle``` property in your object, which returns the address of the DLTensor as a python integer
- Register the class via ```tvm.register_extension```
```python
# Example code
import tvm

class MyArray(object):
    _tvm_tcode = tvm.TypeCode.ARRAY_HANDLE

    @property
    def _tvm_handle(self):
        dltensor_addr = self.get_dltensor_addr()
        return dltensor_addr

# You can put the registration step in a separate file mypkg.tvm.py
# and only import it if you want the optional dependency.
tvm.register_extension(MyArray)
```
...@@ -15,6 +15,7 @@ Contents
tutorials/index
faq
how_to/deploy
how_to/integrate
how_to/contribute
api/python/index
dev/index
......