test_benchmark_topi_conv2d.py · 10.1 KB
[VTA] Support for batched inference (#3661) · 6c7f0c4d
* fix in IR pass to support padding on 6-d tensors
* support for both N>1 and N==1 for padding
* batch size > 1 tuning and base config
* output formatting
* batch conv2d
* print all category results
* revert to single-batch config
* pick record best
* fix conv test
* improving reporting
* address batching bug in fast simulator
* fix
Thierry Moreau committed
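A rough sketch of what "padding on 6-d tensors" means for batched inference here: VTA-style kernels work on a packed layout where both the batch and channel dimensions are tiled, giving a 6-D tensor, and spatial padding has to be applied on that packed form. The helper names `pack_nchw` and `pad_packed`, and the `batch_pack`/`block` factors, are illustrative assumptions only; the committed test derives the packing factors from VTA's hardware environment rather than hard-coding them.

```python
# Illustrative sketch (not the committed test): pack an NCHW batch into a
# 6-D tiled layout and pad it spatially, covering both N == 1 and N > 1.
import numpy as np

def pack_nchw(data, batch_pack, block):
    """Pack an NCHW tensor into the 6-D layout
    (N // batch_pack, C // block, H, W, batch_pack, block).
    batch_pack and block are assumed tiling factors, not VTA constants."""
    n, c, h, w = data.shape
    assert n % batch_pack == 0 and c % block == 0
    packed = data.reshape(n // batch_pack, batch_pack,
                          c // block, block, h, w)
    return packed.transpose(0, 2, 4, 5, 1, 3)

def pad_packed(packed, pad_h, pad_w):
    """Apply spatial padding directly on the 6-D packed tensor,
    i.e. the case the padding fix above needs to handle."""
    return np.pad(packed,
                  ((0, 0), (0, 0),
                   (pad_h, pad_h), (pad_w, pad_w),
                   (0, 0), (0, 0)))

# Example: batch of 4, 16 channels, tiled 2x8, padded by 1 pixel per side.
x = np.random.randn(4, 16, 14, 14).astype("float32")
xp = pad_packed(pack_nchw(x, batch_pack=2, block=8), pad_h=1, pad_w=1)
print(xp.shape)  # (2, 2, 16, 16, 2, 8)
```

With a batch-pack factor of 1 the same code covers the single-batch (N == 1 per tile) case, which is why the commit exercises both configurations.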