I read that TensorFlow Lite only supports the SSD models from the TF2 Model Zoo, for example SSD MobileNet V2 FPNLite 640x640. My question is: do you have a tutorial for converting such a model to a TFLite model, and can it then be run with the hardware accelerator (USB stick) from Google Coral?
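For context, from what I have pieced together so far, the TF2 route seems to be: export the trained checkpoint with the Object Detection API's `export_tflite_graph_tf2.py` script, then run the resulting SavedModel through the TFLite converter. Here is a minimal sketch of the second step (the paths and output filename are placeholders, not something from an actual tutorial):

```python
# Sketch: convert a TF2 Object Detection API model to TFLite.
# Assumes the checkpoint was already exported with export_tflite_graph_tf2.py;
# "exported_tflite_model/saved_model" is a placeholder path.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(
    "exported_tflite_model/saved_model")
tflite_model = converter.convert()

with open("ssd_mobilenet_v2_fpnlite_640.tflite", "wb") as f:
    f.write(tflite_model)
```

I assume the float model this produces would still need to be quantized before it could target the Coral USB Accelerator.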
@TannerGilbert do you have a repo for doing this on TensorFlow 1 for non-quantized models? Is it possible to quantize a TF1 model post-training?
From my understanding, you have these options:

- TF1: has quantized models, which can be trained with quantization-aware training.
- TF2: supports post-training quantization but NOT quantization-aware training.
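If I understand the TF2 route correctly, post-training full-integer quantization (which the Coral Edge TPU needs) would look roughly like this. This is only a sketch: `representative_images()` and the paths are placeholders you would replace with your own preprocessed data and exported model.

```python
# Sketch: TF2 post-training full-integer quantization of an exported SavedModel.
import numpy as np
import tensorflow as tf

def representative_images():
    # Placeholder generator: replace the random arrays with real preprocessed
    # training images shaped like the model input, e.g. (1, 640, 640, 3) float32.
    for _ in range(100):
        yield [np.random.rand(1, 640, 640, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model(
    "exported_tflite_model/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_images
# Target int8 ops; depending on the exported graph, the detection
# post-processing op may still need float fallback.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()

with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

I believe the resulting int8 .tflite would then still have to be compiled with Coral's edgetpu_compiler before it runs on the USB accelerator.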