
Converting YOLOX-s model to hef format #139

Open · thePrimeTux opened this issue Nov 8, 2024 · 0 comments

thePrimeTux commented Nov 8, 2024

What changes should I make to convert the pretrained YOLOX-s model from YOLOX to hef format? I reused hailo_model_zoo/cfg/networks/yolox_s_wide_leaky.yaml, changed the end node names, and then compiled the model to hef format with the following command:

```sh
hailomz compile --hw-arch hailo8l --yaml hailo_model_zoo/cfg/networks/yolox_s.yaml --ckpt yolox_s.onnx --calib-path hailo_model_zoo/coco_dataset --performance
```
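For reference, the network YAMLs in the model zoo follow a common layout, roughly like the sketch below. This is only a sketch of that layout: the end node names are placeholders (one conv per output branch), not the real YOLOX-s node names, and field names may differ slightly between model-zoo versions.

```yaml
base:
- base/yolox.yaml
network:
  network_name: yolox_s_leaky
paths:
  alls_script: yolox_s_leaky.alls
  network_path:
  - yolox_s.onnx
parser:
  nodes:
  - null                 # start node (null lets the parser use the graph input)
  - - Conv_reg_branch    # placeholder end nodes: the conv layers feeding the
    - Conv_obj_branch    # reg / obj / cls outputs at each detection stride
    - Conv_cls_branch
```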

But running `hailortcli run yolox_s.hef` gave the following error:

```
Running streaming inference (../yolox_s.hef):
Transform data: true
Type: auto
Quantized: true
[HailoRT] [error] CHECK failed - Failed to extract_detections, reg yolox_s_leaky/conv55_111 buffer_size should be 25600, but is 6400
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
Network yolox_s_leaky/yolox_s_leaky: 100% | 0 | FPS: 0.00 | ETA: 00:00:00
[HailoRT CLI] [error] Failed waiting for threads with status HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2) - Error failed running inference
```
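The expected reg buffer of 25600 versus the actual 6400 differs by exactly a factor of 4, which is what you would see if a 1-channel branch were wired where the 4-channel box-regression branch belongs. One way to double-check which conv layers the ONNX graph actually ends in, and their shapes, is to list each graph output and its producing node with the standard onnx package. A minimal sketch, assuming the exported model sits at yolox_s.onnx:

```python
import onnx

# Load the exported YOLOX-s model (path assumed; use your own export).
model = onnx.load("yolox_s.onnx")

# Map each tensor name to the node that produces it, so the graph
# outputs can be traced back to candidate end nodes for the parser.
producer = {t: n for n in model.graph.node for t in n.output}

for out in model.graph.output:
    dims = [d.dim_value or d.dim_param
            for d in out.type.tensor_type.shape.dim]
    node = producer.get(out.name)
    print(f"{out.name}: shape={dims}, "
          f"produced by {node.op_type if node is not None else '?'} "
          f"({node.name if node is not None else 'unknown'})")
```

The printed node names and channel counts can then be compared against the end nodes listed in the YAML to confirm that each reg / obj / cls branch is mapped to a node with the matching number of channels.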
