[Relay][Quantization] KL-divergence-based per-layer calibration (#3538)
* [Relay][Quantization] Support floating-point scale
* [Relay][Quantization] KL-divergence calibration on dataset
* Fix unhandled LeftShift case in QuantizeRealize
* Fix lint
* drop QBias
* fix lint
* address comments
* address comments
* Update comments
* address comments
* lint
* kQIdentity = 0
New files:
* python/tvm/relay/quantize/kl_divergence.py
* src/relay/pass/quantize/calibrate.cc
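For context, `kl_divergence.py` implements a KL-divergence-based search for per-layer clipping thresholds over a calibration dataset. The snippet below is a minimal, self-contained sketch of that general (TensorRT-style) technique, not the code added in this PR; the function name `find_kl_threshold`, the bin counts, and the epsilon smoothing are illustrative assumptions.

```python
import numpy as np
from scipy import stats


def find_kl_threshold(samples, num_bins=2048, num_quantized_bins=255):
    """Return a clipping threshold for symmetric int8 quantization that
    minimizes KL(P || Q) between the original activation histogram (P)
    and its quantized approximation (Q). Sketch only, not TVM's API."""
    hist, edges = np.histogram(np.abs(samples), bins=num_bins)
    best_kl, best_threshold = np.inf, float(edges[-1])

    # Try every clipping point from num_quantized_bins up to the full range.
    for i in range(num_quantized_bins, num_bins + 1):
        # Reference distribution: fold the clipped tail into the last kept bin.
        p = hist[:i].astype(np.float64)
        p[i - 1] += hist[i:].sum()

        # Quantized distribution: collapse the i kept bins into
        # num_quantized_bins levels, then spread each level's mass back
        # over its originally non-empty bins for a like-for-like comparison.
        q = np.zeros(i, dtype=np.float64)
        step = i / num_quantized_bins
        for j in range(num_quantized_bins):
            start, stop = int(round(j * step)), int(round((j + 1) * step))
            chunk = hist[start:stop].astype(np.float64)
            nonzero = chunk > 0
            if nonzero.any():
                q[start:stop][nonzero] = chunk.sum() / nonzero.sum()

        # A small epsilon smooths zero bins; scipy normalizes both inputs.
        kl = stats.entropy(p + 1e-9, q + 1e-9)
        if kl < best_kl:
            best_kl, best_threshold = kl, float(edges[i])
    return best_threshold


# Example: derive a per-layer scale from stand-in calibration activations.
activations = np.random.randn(1 << 16).astype("float32")
threshold = find_kl_threshold(activations)
scale = threshold / 127.0  # symmetric int8 scale
```

The search is O(num_bins × num_quantized_bins) per layer; a production implementation would typically vectorize the inner loop and accumulate histograms incrementally across calibration batches.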