
Supported Model / Compatibility with USB-Accelerator #34

Open
Petros626 opened this issue May 25, 2022 · 3 comments

Comments

@Petros626

Hey,

I read that TensorFlow Lite only supports the SSD models from the TF2 Model Zoo, for example SSD MobileNet V2 FPNLite 640x640. My question is: do you have a tutorial for converting one to a TFLite model, and can it then be run on the hardware accelerator (USB stick) from Google Coral?

@TannerGilbert
Owner

It is certainly possible to convert a model trained with the OD API to Tensorflow Lite. You can find an example in my Tensorflow-Lite-Object-Detection-with-the-Tensorflow-Object-Detection-API Repository. There is also an official example.

If you're not limited to the OD API, I can also recommend using the TFLite Model Maker.
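For the Coral USB Accelerator specifically, the converted TFLite model must be fully integer-quantized before it can be compiled for the Edge TPU. A minimal sketch of that conversion step, assuming a TF2 OD API model already exported as a SavedModel in a `saved_model/` directory (the path, input size, and random representative data here are placeholder assumptions, not the exact setup from the repos above):

```python
import numpy as np
import tensorflow as tf

# Load the SavedModel exported from the TF2 Object Detection API.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model")

def representative_dataset():
    # The converter calibrates quantization ranges from these samples.
    # Random data is only a placeholder; use real preprocessed images.
    for _ in range(100):
        yield [np.random.rand(1, 640, 640, 3).astype(np.float32)]

# Full-integer post-training quantization, as required by the Edge TPU.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("model_quant.tflite", "wb") as f:
    f.write(converter.convert())
```

The quantized `model_quant.tflite` then still has to be passed through Google's `edgetpu_compiler` (e.g. `edgetpu_compiler model_quant.tflite`) before it will actually run on the Coral accelerator.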

@Petros626
Author

Petros626 commented May 28, 2022

@TannerGilbert do you have a repo for doing this in TensorFlow 1 for non-quantized models? Is it possible to quantize a TF1 model post-training?

To my understanding, the options are:
TF1: has quantized models, which can be trained with quantization-aware training.
TF2: supports post-training quantization but NOT quantization-aware training.
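For reference on the TF1 side of that distinction: in the TF1 Object Detection API, quantization-aware training is enabled by adding a `graph_rewriter` block to the model's `pipeline.config`. A sketch of that fragment (the `delay` value is just an illustrative choice, tuned per training run):

```
graph_rewriter {
  quantization {
    delay: 48000        # start quantizing after this many training steps
    weight_bits: 8
    activation_bits: 8
  }
}
```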

@Petros626
Author

@TannerGilbert do you agree?
