wenyuanbo / tic / Commits / fa5c5883

Commit fa5c5883 authored Sep 24, 2016 by Wei Wu, committed by Tianqi Chen on May 29, 2018

fix typo (#56)

parent 98613dfc

Showing 2 changed files with 6 additions and 6 deletions:

  nnvm/README.md        +3 -3
  nnvm/docs/overview.md +3 -3

nnvm/README.md
@@ -15,11 +15,11 @@ We believe that the decentralized modular system is an interesting direction.
 The hope is that effective parts can be assembled together just like you assemble your own desktops.
 So the customized deep learning solution can be minimax, minimum in terms of dependencies,
-while maxiziming the users' need.
+while maximizing the users' need.
 NNVM offers one such part, it provides a generic way to do
 computation graph optimization such as memory reduction, device allocation and more
-while being agnostic to the operator interface defintion and how operators are executed.
+while being agnostic to the operator interface definition and how operators are executed.
 NNVM is inspired by LLVM, aiming to be a high level intermediate representation library
 for neural nets and computation graphs generation and optimizations.

@@ -32,7 +32,7 @@ This is essentially ***Unix philosophy*** applied to machine learning system.
 - Essential parts can be assembled in minimum way for embedding systems.
 - Developers can hack the parts they need and compose with other well defined parts.
 - Decentralized modules enable new extensions creators to own their project
-without creating a monothilic version.
+without creating a monolithic version.
 Deep learning system itself is not necessary one part, for example
 here are some relative independent parts that can be isolated
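The README text above refers to computation graph optimizations such as memory reduction and device allocation. As a hedged illustration only (not part of this commit), such optimizations are driven by NNVM's graph pass mechanism; the pass names below are assumptions about what a given build has registered:

```c++
// Illustrative sketch: run graph-level optimization passes via nnvm::ApplyPass.
// The pass names are assumptions; real passes usually expect graph attributes
// (e.g. input shapes and dtypes) to be attached to the Graph beforehand.
#include <utility>
#include <nnvm/graph.h>
#include <nnvm/pass.h>

nnvm::Graph Optimize(nnvm::Graph g) {
  // Each pass consumes a Graph and returns a transformed Graph, without
  // knowing how the individual operators are implemented or executed.
  g = nnvm::ApplyPass(std::move(g), "InferShape");  // shape inference
  g = nnvm::ApplyPass(std::move(g), "PlanMemory");  // memory reduction
  return g;
}
```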
nnvm/docs/overview.md
@@ -13,7 +13,7 @@ with the modular tools like CuDNN and CUDA, it is not hard to assemble a C++ API
 However, most users like to use python/R/scala or other languages.
 By registering the operators to NNVM, X can now get the graph composition
 language front-end on these languages quickly without coding it up for
-each type of langugage.
+each type of language.
 Y want to build a deep learning serving system on embedded devices.
 To do that, we need to cut things off, as opposed to add new parts,

@@ -97,7 +97,7 @@ Eventually the operator interface become big and have to evolve in the centraliz
 In NNVM, we decided to change the design and support arbitrary type of operator attributes,
 without need to change the operator registry. This also echos the need of minimum interface
-so that the code can be easier to share accross multiple projects
+so that the code can be easier to share across multiple projects
 User can register new attribute, such as inplace property checking function as follows.
 ```c++
 ...
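The registration code itself is elided from the hunk above. As a rough sketch only of the pattern that sentence describes, attaching an inplace-option attribute to an operator through `NNVM_REGISTER_OP` and `set_attr` might look like the following; the operator name and the returned index pairs are assumptions, not the elided snippet:

```c++
// Sketch only: registers an "FInplaceOption" attribute stating that output 0
// may reuse the memory of input 0. Not the code elided from the diff above.
#include <utility>
#include <vector>
#include <nnvm/node.h>
#include <nnvm/op.h>
#include <nnvm/op_attr_types.h>

using nnvm::NodeAttrs;
using nnvm::FInplaceOption;

NNVM_REGISTER_OP(exp)
.set_num_inputs(1)
.set_attr<FInplaceOption>(
    "FInplaceOption",
    [](const NodeAttrs& attrs) {
      return std::vector<std::pair<int, int> >{{0, 0}};
    });
```

Because `set_attr` is templated on the attribute's type, attributes like this can be added from any module without changing the central operator registry, which is the point the corrected sentence makes.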
@@ -122,7 +122,7 @@ NNVM_REGISTER_OP(exp)
...
@@ -122,7 +122,7 @@ NNVM_REGISTER_OP(exp)
```
```
These attributes can be queried at arbitrary parts of the code, like the following parts.
These attributes can be queried at arbitrary parts of the code, like the following parts.
Under the hood, each attributes are stored in a any type columar store,
Under the hood, each attributes are stored in a any type colum
n
ar store,
that can easily be retrieved and cast back to typed table and do quick lookups.
that can easily be retrieved and cast back to typed table and do quick lookups.
```
c++
```
c++
...
...
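For the query side described in the changed line, a minimal sketch (assuming NNVM's `Op::Get`/`Op::GetAttr` API and the hypothetical attribute from the sketch above) could look like:

```c++
// Sketch only: reads a registered attribute back out of NNVM's typed,
// column-like attribute store. Attribute and operator names are assumptions.
#include <nnvm/node.h>
#include <nnvm/op.h>
#include <nnvm/op_attr_types.h>

void QueryInplaceOption() {
  using nnvm::Op;
  using nnvm::FInplaceOption;

  const Op* exp_op = Op::Get("exp");
  // GetAttr returns a typed map keyed by operator, so the lookup below is a
  // cheap table access rather than a scan of the whole registry.
  static auto& finplace = Op::GetAttr<FInplaceOption>("FInplaceOption");
  if (finplace.count(exp_op)) {
    auto pairs = finplace[exp_op](nnvm::NodeAttrs());
    (void)pairs;  // (input, output) index pairs that may share memory
  }
}
```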