1. 30 Jul, 2019 1 commit
    • [VTA] Support for batched inference (#3661) · 6c7f0c4d
      * fix in IR pass to support padding on 6-d tensors
      * support for both N>1 and N==1 for padding
      * batch size > 1 tuning and base config
      * output formatting
      * batch conv2d
      * print all category results
      * revert to single-batch config
      * pick record best
      * fix conv test
      * improving reporting
      * address batching bug in fast simulator
      * fix
      Thierry Moreau committed
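      The "padding on 6-d tensors" items above refer to VTA's packed activation
      layout, where an NCHW tensor is tiled by the hardware batch and channel
      factors into a 6-d NCHWnc array before padding is applied. Below is a
      minimal, hypothetical numpy sketch of that idea (BATCH, BLOCK_IN,
      pack_nchw, and pad_spatial are illustrative names, not the actual TVM/VTA
      IR pass), showing that the same spatial padding covers both N == 1 and
      N > 1:

          import numpy as np

          # Illustrative only: VTA tiles (N, C, H, W) activations into a 6-d
          # (N//BATCH, C//BLOCK_IN, H, W, BATCH, BLOCK_IN) layout, so the IR
          # pass must pad a 6-d tensor rather than a 4-d one.
          BATCH, BLOCK_IN = 1, 16   # stand-ins for the hardware tiling factors

          def pack_nchw(x, batch=BATCH, block=BLOCK_IN):
              """Reshape (N, C, H, W) into the packed 6-d layout."""
              n, c, h, w = x.shape
              return (x.reshape(n // batch, batch, c // block, block, h, w)
                       .transpose(0, 2, 4, 5, 1, 3))

          def pad_spatial(packed, pad_h=1, pad_w=1):
              """Zero-pad only the spatial axes of the packed tensor; the
              outer batch axis may be 1 or larger without changing the logic."""
              return np.pad(packed, [(0, 0), (0, 0),
                                     (pad_h, pad_h), (pad_w, pad_w),
                                     (0, 0), (0, 0)])

          for n in (1, 4):   # exercise both the N == 1 and N > 1 cases
              packed = pack_nchw(np.ones((n, 32, 14, 14), dtype="int8"))
              print(n, pad_spatial(packed).shape)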