YuxuanGuo / SCL-my · Commits

Commit 9eed4263 authored Mar 14, 2022 by yuxguo
fix
parent ab636756
Showing 1 changed file with 3 additions and 3 deletions

main.py  +3  -3
@@ -102,7 +102,7 @@ trainer_args.add_argument('--epochs', '-e', type=int, default=200,
                           help='the number of epochs')
 trainer_args.add_argument('--obs-epochs', '-oe', type=int, default=5,
                           help='the number of sub epochs for observation stage')
-trainer_args.add_argument('--batch-size', '-bs', type=int, default=1,
+trainer_args.add_argument('--batch-size', '-bs', type=int, default=32,
                           help='input batch size for training (default: 1)')
 trainer_args.add_argument('--eval-batch-size', '-ebs', type=int, default=1,
                           help='input batch size for evaluation (default: 32)')
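For context, the hunk above raises the default training batch size from 1 to 32 (note the help string still reads "(default: 1)"). Below is a minimal, hypothetical sketch of how this argument group behaves after the change, assuming trainer_args is a standard argparse argument group; the parser setup is illustrative, not copied from main.py:

import argparse

# Hypothetical reconstruction of the argument group touched by this hunk.
parser = argparse.ArgumentParser()
trainer_args = parser.add_argument_group('trainer')
trainer_args.add_argument('--batch-size', '-bs', type=int, default=32,
                          help='input batch size for training (default: 1)')
trainer_args.add_argument('--eval-batch-size', '-ebs', type=int, default=1,
                          help='input batch size for evaluation (default: 32)')

# With no flags the new default of 32 applies; it can still be overridden
# on the command line, e.g. `python main.py -bs 1`.
args = parser.parse_args([])
print(args.batch_size, args.eval_batch_size)  # 32 1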
@@ -450,8 +450,8 @@ def main():
     model = get_model(args)
     if args.use_gpu:
-        # model = DataParallel(model).cuda()
-        model = model.cuda()
+        model = DataParallel(model).cuda()
+        # model = model.cuda()
     if args.weight_decay == 0:
         optimizer = optim.Adam(model.parameters(), lr=args.lr)
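The hunk above swaps a plain model.cuda() move for a DataParallel wrapper, which replicates the model on every visible GPU and splits each input batch along dimension 0. A minimal sketch of that pattern, assuming a PyTorch model; the toy model, batch, and learning rate below are illustrative, not taken from main.py:

import torch
from torch import nn, optim
from torch.nn import DataParallel

# Toy stand-in for get_model(args); the real model comes from the repo.
model = nn.Linear(128, 10)

if torch.cuda.is_available():
    # Replicates the module across all visible GPUs and scatters each input
    # batch along dim 0; with a single GPU it behaves like a plain .cuda().
    model = DataParallel(model).cuda()

optimizer = optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(32, 128)
if torch.cuda.is_available():
    x = x.cuda()
out = model(x)    # forward pass is scattered and gathered automatically
print(out.shape)  # torch.Size([32, 10])

This would also explain why the training batch-size default was raised in the same commit: DataParallel divides each batch across the available GPUs, so a batch of 1 cannot be split.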