Commit 6d6762f8 by Logan Weber, committed by Tianqi Chen

Fix Relay docs formatting and grammar (#2471)

parent 859bda8a
@@ -11,7 +11,7 @@ Dataflow and Control Fragments
For the purposes of comparing Relay to traditional computational graph-based IRs, it
can be useful to consider Relay expressions in terms of dataflow and control fragments.
Each portion of a Relay program containing expressions that only affect the dataflow can
-be viewed as a traditional comptuation graph when writing and expressing transformations.
+be viewed as a traditional computation graph when writing and expressing transformations.
The dataflow fragment covers the set of Relay expressions that do not involve
control flow. That is, any portion of a program containing only the following
@@ -31,8 +31,8 @@ fragment in Relay includes the following constructs:
- Recursive Calls in Functions
From the point of view of a computation graph, a function is a subgraph and a function call inlines the subgraph, substituting its arguments for the free variables in the subgraph with corresponding names.
-Thus if a function's body uses only dataflow constructs
-, a call to that function is in the dataflow fragment; conversely, if the
+Thus, if a function's body uses only dataflow constructs,
+a call to that function is in the dataflow fragment; conversely, if the
function's body contains control flow, a call to that function is not part of the dataflow fragment.
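To make the point concrete, here is a minimal sketch of a dataflow-only function and a call to it through the tvm.relay Python API; the variable names and shapes are invented for illustration.

.. code-block:: python

    from tvm import relay

    # A function whose body uses only dataflow constructs: operator
    # calls and no control flow, so it is a plain subgraph.
    x = relay.var("x", shape=(10, 10), dtype="float32")
    y = relay.var("y", shape=(10, 10), dtype="float32")
    f = relay.Function([x, y], relay.add(relay.multiply(x, y), y))

    # A call to f stays inside the dataflow fragment; conceptually it
    # inlines the subgraph, substituting %a and %b for %x and %y.
    a = relay.var("a", shape=(10, 10), dtype="float32")
    b = relay.var("b", shape=(10, 10), dtype="float32")
    call = relay.Call(f, [a, b])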
Variables
@@ -205,6 +205,7 @@ For example, one can define a polymorphic identity function for
any Relay type as follows:
.. code-block:: python
+
    fn<t : Type>(%x : t) -> t {
        %x
    }
@@ -213,13 +214,14 @@ The below definition is also polymorphic, but restricts its
arguments to tensor types:
.. code-block:: python
+
    fn<s : Shape, bt : BaseType>(%x : Tensor[s, bt]) {
        %x
    }
Notice that the return type is omitted and will be inferred.
-*Note: :code:`where` syntax is not yet supported in the text format.*
+*Note: "where" syntax is not yet supported in the text format.*
A function may also be subject to one or more type relations, such as in
the following:
......
@@ -2,9 +2,9 @@ Relay Core Tensor Operators
===========================
This page contains the list of core tensor operator primitives pre-defined in tvm.relay.
-The core tensor operator primitives covers typical workloads in deep learning.
-They can represent workloads in front-end frameworks, and provide basic building blocks for optimization.
-Since deep learning is a fast evolving field and it is that possible to have operators that are not in here.
+The core tensor operator primitives cover typical workloads in deep learning.
+They can represent workloads in front-end frameworks and provide basic building blocks for optimization.
+Since deep learning is a fast evolving field, it is possible to have operators that are not in here.
.. note::
......
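As a rough illustration of how these primitives compose (the operator choice and shapes here are illustrative, not taken from the list itself), a dense layer followed by a ReLU is the kind of workload a front-end framework might emit:

.. code-block:: python

    from tvm import relay

    # Compose two core operators into a small dataflow graph:
    # a dense (fully connected) layer followed by a ReLU.
    data = relay.var("data", shape=(1, 64), dtype="float32")
    weight = relay.var("weight", shape=(32, 64), dtype="float32")
    out = relay.nn.relu(relay.nn.dense(data, weight))
    func = relay.Function([data, weight], out)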
@@ -2,8 +2,8 @@
Relay's Type System
===================
-We briefly introduced types while detailing Relay's expression language
-, but have not yet described its type system. Relay is
+We briefly introduced types while detailing Relay's expression language,
+but have not yet described its type system. Relay is
a statically typed and type-inferred language, allowing programs to
be fully typed while requiring just a few explicit type annotations.
......
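A small sketch of that claim, assuming the relay.ir_pass.infer_type helper from TVM of this vintage:

.. code-block:: python

    from tvm import relay

    # No return type is written anywhere; inference fills it in.
    x = relay.var("x", shape=(2, 2), dtype="float32")
    f = relay.Function([x], relay.add(x, x))

    # relay.ir_pass.infer_type is the type-inference entry point in
    # this era of TVM (later moved to relay.transform.InferType).
    f = relay.ir_pass.infer_type(f)
    print(f.checked_type)  # fn (Tensor[(2, 2), float32]) -> Tensor[(2, 2), float32]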
@@ -101,7 +101,7 @@ def conv2d_transpose(data,
kernel_layout="OIHW",
output_padding=(0, 0),
out_dtype=""):
"""Two dimensional trnasposed convolution operator.
"""Two dimensional transposed convolution operator.
Parameters
----------
......
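A hedged usage sketch for the operator above, with made-up shapes under the default NCHW data and OIHW kernel layouts shown in the signature:

.. code-block:: python

    from tvm import relay

    # NCHW input with 8 channels; OIHW weight mapping 8 -> 16 channels.
    data = relay.var("data", shape=(1, 8, 4, 4), dtype="float32")
    weight = relay.var("weight", shape=(8, 16, 3, 3), dtype="float32")

    # Stride-2 transposed convolution: upsamples 4x4 to 8x8 here.
    out = relay.nn.conv2d_transpose(data, weight,
                                    channels=16,
                                    kernel_size=(3, 3),
                                    strides=(2, 2),
                                    padding=(1, 1),
                                    output_padding=(1, 1))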
@@ -231,7 +231,7 @@ def full(fill_value, shape=(), dtype=""):
def full_like(data, fill_value):
"""Return an scalar value array with the same shape and type as the input array.
"""Return a scalar value array with the same shape and type as the input array.
Parameters
----------
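An illustrative sketch of this pair of operators; the shapes and the fill value are arbitrary:

.. code-block:: python

    from tvm import relay

    x = relay.var("x", shape=(3, 4), dtype="float32")

    # full builds a new array from an explicit shape and dtype;
    # full_like copies shape and dtype from an existing tensor.
    ones = relay.full(relay.const(1.0), shape=(3, 4), dtype="float32")
    ones_like = relay.full_like(x, relay.const(1.0))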
@@ -288,7 +288,7 @@ def where(condition, x, y):
return _make.where(condition, x, y)
def broadcast_to(data, shape):
"""Return an scalar value array with the same type, broadcast to
"""Return a scalar value array with the same type, broadcast to
the provided shape.
Parameters
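A small sketch of where, using a comparison to build the condition (names and shapes are illustrative):

.. code-block:: python

    from tvm import relay

    x = relay.var("x", shape=(3,), dtype="float32")
    y = relay.var("y", shape=(3,), dtype="float32")

    # Pick from x where the condition holds, else from y; with a
    # greater-than condition this computes an elementwise maximum.
    cond = relay.greater(x, y)
    z = relay.where(cond, x, y)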
@@ -307,7 +307,7 @@ def broadcast_to(data, shape):
return _make.broadcast_to(data, shape)
def broadcast_to_like(data, broadcast_type):
"""Return an scalar value array with the same shape and type as the input array.
"""Return a scalar value array with the same shape and type as the input array.
Parameters
----------
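For illustration, a sketch of broadcast_to_like with arbitrary shapes:

.. code-block:: python

    from tvm import relay

    # Broadcast a (1, 4) tensor along its first axis to the shape
    # of a (3, 4) tensor.
    x = relay.var("x", shape=(1, 4), dtype="float32")
    like = relay.var("like", shape=(3, 4), dtype="float32")
    y = relay.broadcast_to_like(x, like)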
@@ -326,7 +326,7 @@ def broadcast_to_like(data, broadcast_type):
def collapse_sum_like(data, collapse_type):
"""Return an scalar value array with the same shape and type as the input array.
"""Return a scalar value array with the same shape and type as the input array.
Parameters
----------
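A sketch of collapse_sum_like, which acts as a rough inverse of broadcasting; the shapes are arbitrary and the gradient framing is just one common use:

.. code-block:: python

    from tvm import relay

    # Sum a (3, 4) gradient over the broadcast axis so it matches
    # the (1, 4) input it flowed from.
    grad = relay.var("grad", shape=(3, 4), dtype="float32")
    x = relay.var("x", shape=(1, 4), dtype="float32")
    dx = relay.collapse_sum_like(grad, x)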
@@ -377,7 +377,7 @@ def split(data, indices_or_sections, axis=0):
def strided_slice(data, begin, end, strides=None):
"""Strided slice of an array..
"""Strided slice of an array.
Parameters
----------
......
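Finally, an illustrative call matching the strided_slice signature above; the shapes and indices are invented:

.. code-block:: python

    from tvm import relay

    # Rows 0..1 and every second column of a (4, 6) tensor: the
    # result has shape (2, 3).
    x = relay.var("x", shape=(4, 6), dtype="float32")
    y = relay.strided_slice(x, begin=[0, 0], end=[2, 6], strides=[1, 2])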