..  Licensed to the Apache Software Foundation (ASF) under one
    or more contributor license agreements.  See the NOTICE file
    distributed with this work for additional information
    regarding copyright ownership.  The ASF licenses this file
    to you under the Apache License, Version 2.0 (the
    "License"); you may not use this file except in compliance
    with the License.  You may obtain a copy of the License at

..    http://www.apache.org/licenses/LICENSE-2.0

..  Unless required by applicable law or agreed to in writing,
    software distributed under the License is distributed on an
    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
    KIND, either express or implied.  See the License for the
    specific language governing permissions and limitations
    under the License.

Relay Core Tensor Operators
===========================

This page lists the core tensor operator primitives pre-defined in tvm.relay.
The core tensor operator primitives cover typical workloads in deep learning.
They can represent workloads in front-end frameworks and provide basic building blocks for optimization.
Since deep learning is a fast-evolving field, some operators may not be covered here.


.. note::

   This document lists the function signatures of these operators
   in the Python frontend.


Overview of Operators
---------------------
**Level 1: Basic Operators**

This level enables fully connected multi-layer perceptrons.

.. autosummary::
   :nosignatures:

   tvm.relay.log
   tvm.relay.sqrt
   tvm.relay.exp
   tvm.relay.sigmoid
   tvm.relay.add
   tvm.relay.subtract
   tvm.relay.multiply
   tvm.relay.divide
   tvm.relay.mod
   tvm.relay.tanh
   tvm.relay.concatenate
   tvm.relay.expand_dims
   tvm.relay.nn.softmax
   tvm.relay.nn.log_softmax
   tvm.relay.nn.relu
   tvm.relay.nn.dropout
   tvm.relay.nn.batch_norm
   tvm.relay.nn.bias_add

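As an illustration of the Level 1 math operators, the per-row behavior of ``tvm.relay.nn.softmax`` and ``tvm.relay.nn.log_softmax`` can be sketched in plain Python. This is a semantic reference only; the Relay operators act on whole tensors and compile to optimized kernels, and the helper names below are illustrative.

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability,
    # as softmax implementations conventionally do.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def log_softmax(xs):
    # log_softmax(x) = x - max(x) - log(sum(exp(x - max(x))))
    m = max(xs)
    s = sum(math.exp(x - m) for x in xs)
    return [x - m - math.log(s) for x in xs]

probs = softmax([1.0, 2.0, 3.0])
assert abs(sum(probs) - 1.0) < 1e-9   # a softmax row sums to one
```

Computing ``log_softmax`` directly, rather than as ``log(softmax(x))``, avoids taking the logarithm of very small probabilities.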

**Level 2: Convolutions**

This level enables typical convnet models.

.. autosummary::
   :nosignatures:

   tvm.relay.nn.conv2d
   tvm.relay.nn.conv2d_transpose
   tvm.relay.nn.dense
   tvm.relay.nn.max_pool2d
   tvm.relay.nn.avg_pool2d
   tvm.relay.nn.global_max_pool2d
   tvm.relay.nn.global_avg_pool2d
   tvm.relay.nn.upsampling
   tvm.relay.nn.batch_flatten
   tvm.relay.nn.pad
   tvm.relay.nn.lrn
   tvm.relay.nn.l2_normalize
   tvm.relay.nn.contrib_conv2d_winograd_without_weight_transform
   tvm.relay.nn.contrib_conv2d_winograd_weight_transform
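
The pooling operators above reduce sliding windows of the input. A minimal sketch of 2-D max pooling on a single spatial plane, in plain Python for illustration (the helper name and list-of-lists layout are assumptions for this sketch, not the Relay API):

```python
def max_pool2d(data, pool_size, strides):
    """Sliding-window max over a 2-D list of lists: one H x W plane,
    without padding, for illustration."""
    ph, pw = pool_size
    sh, sw = strides
    h, w = len(data), len(data[0])
    out = []
    for i in range(0, h - ph + 1, sh):
        row = []
        for j in range(0, w - pw + 1, sw):
            # Take the max over the ph x pw window anchored at (i, j).
            row.append(max(data[i + di][j + dj]
                           for di in range(ph) for dj in range(pw)))
        out.append(row)
    return out

plane = [[1, 2, 3, 4],
         [5, 6, 7, 8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]
# 2x2 pooling with stride 2 keeps the max of each quadrant.
assert max_pool2d(plane, (2, 2), (2, 2)) == [[6, 8], [14, 16]]
```

``avg_pool2d`` follows the same window traversal with the mean in place of the max, and the ``global_*`` variants use a window covering the whole plane.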


**Level 3: Additional Math And Transform Operators**

This level enables additional math and transform operators.

.. autosummary::
   :nosignatures:

   tvm.relay.nn.leaky_relu
   tvm.relay.nn.prelu
   tvm.relay.reshape
   tvm.relay.reshape_like
   tvm.relay.copy
   tvm.relay.transpose
   tvm.relay.squeeze
   tvm.relay.floor
   tvm.relay.ceil
   tvm.relay.sign
   tvm.relay.trunc
   tvm.relay.clip
   tvm.relay.round
   tvm.relay.abs
   tvm.relay.negative
   tvm.relay.take
   tvm.relay.zeros
   tvm.relay.zeros_like
   tvm.relay.ones
   tvm.relay.ones_like
   tvm.relay.gather_nd
   tvm.relay.full
   tvm.relay.full_like
   tvm.relay.cast
   tvm.relay.split
   tvm.relay.arange
   tvm.relay.stack
   tvm.relay.repeat
   tvm.relay.tile
   tvm.relay.reverse
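
Several Level 3 operators have simple element-wise or gathering semantics. A plain-Python sketch of ``tvm.relay.clip`` and of ``tvm.relay.take`` without an axis (which, following the NumPy-style convention, gathers from the flattened input); the helpers are illustrative, not the Relay API:

```python
def clip(xs, a_min, a_max):
    # Element-wise clamp of each value into [a_min, a_max].
    return [min(max(x, a_min), a_max) for x in xs]

def take(data, indices):
    # Gather elements of the (flat) input at the given indices.
    return [data[i] for i in indices]

assert clip([-2, 0, 5], 0, 3) == [0, 0, 3]
assert take([10, 20, 30, 40], [3, 0, 0]) == [40, 10, 10]
```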
**Level 4: Broadcast and Reductions**

.. autosummary::
   :nosignatures:

   tvm.relay.right_shift
   tvm.relay.left_shift
   tvm.relay.equal
   tvm.relay.not_equal
   tvm.relay.greater
   tvm.relay.greater_equal
   tvm.relay.less
   tvm.relay.less_equal
   tvm.relay.logical_and
   tvm.relay.logical_or
   tvm.relay.logical_not
   tvm.relay.maximum
   tvm.relay.minimum
   tvm.relay.power
   tvm.relay.where
   tvm.relay.argmax
   tvm.relay.argmin
   tvm.relay.sum
   tvm.relay.max
   tvm.relay.min
   tvm.relay.mean
   tvm.relay.prod
   tvm.relay.strided_slice
   tvm.relay.broadcast_to
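
The binary operators in this level follow NumPy-style broadcasting. A sketch of the shape rule in plain Python (illustrative only): shapes are aligned from the right, and each pair of dimensions must either match or contain a 1.

```python
def broadcast_shape(a, b):
    """NumPy-style broadcast rule: align shapes from the right;
    dimensions must be equal, or one of them must be 1."""
    out = []
    ra, rb = list(reversed(a)), list(reversed(b))
    for i in range(max(len(ra), len(rb))):
        # Missing leading dimensions behave like size 1.
        da = ra[i] if i < len(ra) else 1
        db = rb[i] if i < len(rb) else 1
        if da != db and 1 not in (da, db):
            raise ValueError(f"incompatible dims {da} and {db}")
        out.append(max(da, db))
    return tuple(reversed(out))

assert broadcast_shape((2, 3), (3,)) == (2, 3)
assert broadcast_shape((4, 1, 5), (1, 3, 1)) == (4, 3, 5)
```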
**Level 5: Vision/Image Operators**

.. autosummary::
   :nosignatures:

   tvm.relay.image.resize
   tvm.relay.vision.multibox_prior
   tvm.relay.vision.multibox_transform_loc
   tvm.relay.vision.nms
   tvm.relay.vision.yolo_reorg
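
``tvm.relay.vision.nms`` performs non-maximum suppression over detection boxes. A greedy sketch of the underlying idea in plain Python; the box format, helper names, and the exact suppression criterion here are illustrative assumptions, not the Relay API:

```python
def iou(b1, b2):
    # Boxes given as (x1, y1, x2, y2); intersection-over-union.
    ix1, iy1 = max(b1[0], b2[0]), max(b1[1], b2[1])
    ix2, iy2 = min(b1[2], b2[2]), min(b1[3], b2[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)

    def area(b):
        return (b[2] - b[0]) * (b[3] - b[1])

    union = area(b1) + area(b2) - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression: keep the highest-scoring box,
    drop boxes overlapping any kept box beyond the threshold, repeat."""
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= iou_threshold for j in keep):
            keep.append(i)
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 10, 10), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
# The second box heavily overlaps the first and is suppressed.
assert nms(boxes, scores) == [0, 2]
```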
**Level 10: Temporary Operators**

This level supports backpropagation of broadcast operators; it is temporary.

.. autosummary::
   :nosignatures:

   tvm.relay.broadcast_to_like
   tvm.relay.collapse_sum_like
   tvm.relay.slice_like
   tvm.relay.shape_of
   tvm.relay.layout_transform
   tvm.relay.device_copy
   tvm.relay.annotation.on_device
   tvm.relay.reverse_reshape
   tvm.relay.nn.batch_matmul
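
``tvm.relay.nn.batch_matmul`` multiplies batches of matrices; by Relay's convention the second operand is treated as transposed, so ``(batch, N, K)`` times ``(batch, M, K)`` yields ``(batch, N, M)``. A plain-Python sketch of that semantics on small nested lists (illustrative only; the real operator works on tensors):

```python
def batch_matmul(a, b):
    """For each batch index, compute a[i] @ b[i].T on nested lists:
    a[i] is N x K, b[i] is M x K, result is N x M."""
    out = []
    for ai, bi in zip(a, b):
        # Each row of ai is dotted with each row of bi
        # (a row of bi acts as a column of bi transposed).
        out.append([[sum(x * y for x, y in zip(row, brow)) for brow in bi]
                    for row in ai])
    return out

a = [[[1.0, 0.0], [0.0, 1.0]]]   # one batch: the 2x2 identity
b = [[[1.0, 2.0], [3.0, 4.0]]]
assert batch_matmul(a, b) == [[[1.0, 3.0], [2.0, 4.0]]]
```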
Level 1 Definitions
-------------------
.. autofunction:: tvm.relay.log
.. autofunction:: tvm.relay.sqrt
.. autofunction:: tvm.relay.exp
.. autofunction:: tvm.relay.sigmoid
.. autofunction:: tvm.relay.add
.. autofunction:: tvm.relay.subtract
.. autofunction:: tvm.relay.multiply
.. autofunction:: tvm.relay.divide
.. autofunction:: tvm.relay.mod
.. autofunction:: tvm.relay.tanh
.. autofunction:: tvm.relay.concatenate
.. autofunction:: tvm.relay.expand_dims
.. autofunction:: tvm.relay.nn.softmax
.. autofunction:: tvm.relay.nn.log_softmax
.. autofunction:: tvm.relay.nn.relu
.. autofunction:: tvm.relay.nn.dropout
.. autofunction:: tvm.relay.nn.batch_norm
.. autofunction:: tvm.relay.nn.bias_add


Level 2 Definitions
-------------------
.. autofunction:: tvm.relay.nn.conv2d
.. autofunction:: tvm.relay.nn.conv2d_transpose
.. autofunction:: tvm.relay.nn.dense
.. autofunction:: tvm.relay.nn.max_pool2d
.. autofunction:: tvm.relay.nn.avg_pool2d
.. autofunction:: tvm.relay.nn.global_max_pool2d
.. autofunction:: tvm.relay.nn.global_avg_pool2d
.. autofunction:: tvm.relay.nn.upsampling
.. autofunction:: tvm.relay.nn.batch_flatten
.. autofunction:: tvm.relay.nn.pad
.. autofunction:: tvm.relay.nn.lrn
.. autofunction:: tvm.relay.nn.l2_normalize
.. autofunction:: tvm.relay.nn.contrib_conv2d_winograd_without_weight_transform
.. autofunction:: tvm.relay.nn.contrib_conv2d_winograd_weight_transform
Level 3 Definitions
-------------------
.. autofunction:: tvm.relay.nn.leaky_relu
.. autofunction:: tvm.relay.nn.prelu
.. autofunction:: tvm.relay.reshape
.. autofunction:: tvm.relay.reshape_like
.. autofunction:: tvm.relay.copy
.. autofunction:: tvm.relay.transpose
.. autofunction:: tvm.relay.squeeze
.. autofunction:: tvm.relay.floor
.. autofunction:: tvm.relay.ceil
.. autofunction:: tvm.relay.sign
.. autofunction:: tvm.relay.trunc
.. autofunction:: tvm.relay.clip
.. autofunction:: tvm.relay.round
.. autofunction:: tvm.relay.abs
.. autofunction:: tvm.relay.negative
.. autofunction:: tvm.relay.take
.. autofunction:: tvm.relay.zeros
.. autofunction:: tvm.relay.zeros_like
.. autofunction:: tvm.relay.ones
.. autofunction:: tvm.relay.ones_like
.. autofunction:: tvm.relay.gather_nd
.. autofunction:: tvm.relay.full
.. autofunction:: tvm.relay.full_like
.. autofunction:: tvm.relay.cast
.. autofunction:: tvm.relay.split
.. autofunction:: tvm.relay.arange
.. autofunction:: tvm.relay.stack
.. autofunction:: tvm.relay.repeat
.. autofunction:: tvm.relay.tile
.. autofunction:: tvm.relay.reverse
Level 4 Definitions
-------------------
.. autofunction:: tvm.relay.right_shift
.. autofunction:: tvm.relay.left_shift
.. autofunction:: tvm.relay.equal
.. autofunction:: tvm.relay.not_equal
.. autofunction:: tvm.relay.greater
.. autofunction:: tvm.relay.greater_equal
.. autofunction:: tvm.relay.less
.. autofunction:: tvm.relay.less_equal
.. autofunction:: tvm.relay.logical_and
.. autofunction:: tvm.relay.logical_or
.. autofunction:: tvm.relay.logical_not
.. autofunction:: tvm.relay.maximum
.. autofunction:: tvm.relay.minimum
.. autofunction:: tvm.relay.power
.. autofunction:: tvm.relay.where
.. autofunction:: tvm.relay.argmax
.. autofunction:: tvm.relay.argmin
.. autofunction:: tvm.relay.sum
.. autofunction:: tvm.relay.max
.. autofunction:: tvm.relay.min
.. autofunction:: tvm.relay.mean
.. autofunction:: tvm.relay.prod
.. autofunction:: tvm.relay.strided_slice
.. autofunction:: tvm.relay.broadcast_to
Level 5 Definitions
-------------------
.. autofunction:: tvm.relay.image.resize
.. autofunction:: tvm.relay.vision.multibox_prior
.. autofunction:: tvm.relay.vision.multibox_transform_loc
.. autofunction:: tvm.relay.vision.nms
.. autofunction:: tvm.relay.vision.yolo_reorg


Level 10 Definitions
--------------------
.. autofunction:: tvm.relay.broadcast_to_like
.. autofunction:: tvm.relay.collapse_sum_like
.. autofunction:: tvm.relay.slice_like
.. autofunction:: tvm.relay.shape_of
.. autofunction:: tvm.relay.layout_transform
.. autofunction:: tvm.relay.device_copy
.. autofunction:: tvm.relay.annotation.on_device
.. autofunction:: tvm.relay.reverse_reshape
.. autofunction:: tvm.relay.nn.batch_matmul