Commit 6d6762f8 by Logan Weber, committed by Tianqi Chen

Fix Relay docs formatting and grammar (#2471)

parent 859bda8a
...@@ -11,7 +11,7 @@ Dataflow and Control Fragments
For the purposes of comparing Relay to traditional computational graph-based IRs, it
can be useful to consider Relay expressions in terms of dataflow and control fragments.
Each portion of a Relay program containing expressions that only affect the dataflow can
be viewed as a traditional computation graph when writing and expressing transformations.
The dataflow fragment covers the set of Relay expressions that do not involve
control flow. That is, any portion of a program containing only the following
...@@ -31,8 +31,8 @@ fragment in Relay includes the following constructs:
- Recursive Calls in Functions
From the point of view of a computation graph, a function is a subgraph and a function call inlines the subgraph, substituting its arguments for the free variables in the subgraph with corresponding names.
Thus, if a function's body uses only dataflow constructs, a call to that function is
in the dataflow fragment; conversely, if the function's body contains control flow,
a call to that function is not part of the dataflow fragment.
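The inlining described above can be sketched in plain Python (a toy illustration, not TVM's implementation), representing expressions as nested tuples with `("var", name)` leaves:

```python
def inline_call(body, bindings):
    """Substitute argument expressions for named free variables.

    `body` is a nested-tuple expression: ("var", name) leaves or
    (op, arg, ...) nodes. `bindings` maps variable names to the
    expressions passed at the call site.
    """
    if isinstance(body, tuple) and body[0] == "var":
        return bindings.get(body[1], body)
    if isinstance(body, tuple):
        return (body[0],) + tuple(inline_call(a, bindings) for a in body[1:])
    return body

# f(%x) = %x + %x, called with the argument expression (%a * %b):
f_body = ("add", ("var", "x"), ("var", "x"))
arg = ("mul", ("var", "a"), ("var", "b"))
inlined = inline_call(f_body, {"x": arg})
```

Because the body uses only dataflow constructs, the inlined result is again an ordinary computation graph.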
Variables
...@@ -205,6 +205,7 @@ For example, one can define a polymorphic identity function for
any Relay type as follows:

.. code-block:: python

    fn<t : Type>(%x : t) -> t {
        %x
    }
...@@ -213,13 +214,14 @@ The below definition is also polymorphic, but restricts its
arguments to tensor types:

.. code-block:: python

    fn<s : Shape, bt : BaseType>(%x : Tensor[s, bt]) {
        %x
    }

Notice that the return type is omitted and will be inferred.
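As an analogy only (Python, not Relay), the same polymorphic identity can be written with a type variable from the `typing` module:

```python
from typing import TypeVar

# T plays the role of the type parameter t in the Relay example above.
T = TypeVar("T")

def identity(x: T) -> T:
    return x
```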
*Note: "where" syntax is not yet supported in the text format.*
A function may also be subject to one or more type relations, such as in
the following:
......
...@@ -2,9 +2,9 @@ Relay Core Tensor Operators
===========================
This page contains the list of core tensor operator primitives pre-defined in tvm.relay.
The core tensor operator primitives cover typical workloads in deep learning.
They can represent workloads in front-end frameworks and provide basic building blocks for optimization.
Since deep learning is a fast-evolving field, it is possible to have operators that are not in here.

.. note::
......
...@@ -2,8 +2,8 @@
Relay's Type System
===================
We briefly introduced types while detailing Relay's expression language,
but have not yet described its type system. Relay is
a statically typed and type-inferred language, allowing programs to
be fully typed while requiring just a few explicit type annotations.
......
...@@ -101,7 +101,7 @@ def conv2d_transpose(data,
                     kernel_layout="OIHW",
                     output_padding=(0, 0),
                     out_dtype=""):
    """Two dimensional transposed convolution operator.

    Parameters
    ----------
......
...@@ -231,7 +231,7 @@ def full(fill_value, shape=(), dtype=""):
def full_like(data, fill_value):
    """Return a scalar value array with the same shape and type as the input array.

    Parameters
    ----------
...@@ -288,7 +288,7 @@ def where(condition, x, y):
    return _make.where(condition, x, y)
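As a reference for the elementwise-select semantics, a simplified sketch over flat Python lists (the real operator works on tensors; this is an analogy, not TVM code):

```python
def where_ref(condition, x, y):
    # Pick x[i] where condition[i] is truthy, else y[i].
    # Assumes three flat sequences of equal length.
    return [xi if c else yi for c, xi, yi in zip(condition, x, y)]
```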
def broadcast_to(data, shape):
    """Return a scalar value array with the same type, broadcast to
    the provided shape.

    Parameters
...@@ -307,7 +307,7 @@ def broadcast_to(data, shape):
    return _make.broadcast_to(data, shape)
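Assuming NumPy-style broadcasting rules (trailing dimensions are aligned and a dimension of 1 stretches to match), the broadcast output shape can be sketched in plain Python:

```python
from itertools import zip_longest

def broadcast_shape(a, b):
    # Align shapes from the trailing dimension; missing leading
    # dimensions are treated as 1. Dimensions must match or one
    # side must be 1.
    out = []
    for da, db in zip_longest(reversed(a), reversed(b), fillvalue=1):
        if da != db and 1 not in (da, db):
            raise ValueError(f"shapes {a} and {b} are not broadcastable")
        out.append(max(da, db))
    return tuple(reversed(out))
```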
def broadcast_to_like(data, broadcast_type):
    """Return a scalar value array with the same shape and type as the input array.

    Parameters
    ----------
...@@ -326,7 +326,7 @@ def broadcast_to_like(data, broadcast_type):
def collapse_sum_like(data, collapse_type):
    """Return a scalar value array with the same shape and type as the input array.

    Parameters
    ----------
...@@ -377,7 +377,7 @@ def split(data, indices_or_sections, axis=0):
def strided_slice(data, begin, end, strides=None):
    """Strided slice of an array.

    Parameters
    ----------
......
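Along a single axis, the semantics of `strided_slice` match Python's own slicing; a minimal 1-D reference, assuming a plain list and list-valued `begin`/`end`/`strides` as in the signature above:

```python
def strided_slice_ref(data, begin, end, strides=None):
    # 1-D reference: equivalent to data[begin:end:stride].
    # The real operator applies this per axis of a tensor.
    stride = (strides or [1])[0]
    return data[begin[0]:end[0]:stride]
```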