Inference slicer batching #1239
base: develop
Conversation
Linas Kondrackis seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. Have you already signed the CLA but the status is still pending? Let us recheck it.
I've seen multiple CLA-related updates just now. Let's sort it out when I'm back from my trip. P.S. It may also be due to me changing my GitHub email a week ago.
Issues with threading: #1632
Hey, this is great! Works out of the box. Can we merge this, please? For now I have to copy-paste it into the package.
Description
PR for the inference slicer with batching. If `batch_size` is set, collections of slices are passed to the model. Note that the user needs to define a `callback` that can accept a list of images. Threads are still used if `worker_threads > 1` (both batches and threads can be used simultaneously).

Previous PR, auto-closed during rewrite: #1108
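To illustrate the batching contract described above, here is a minimal sketch. The `batch_callback` and `chunk_slices` helpers, the placeholder detection results, and the exact `batch_size` grouping shown here are illustrative assumptions, not the PR's actual implementation:

```python
from typing import List

import numpy as np


def batch_callback(images: List[np.ndarray]) -> List[list]:
    # With batch_size set, the slicer passes a LIST of slice images to the
    # callback (instead of a single image) and expects one detections result
    # per image. Real code would run batched model inference here; we return
    # an empty placeholder result per slice.
    return [[] for _ in images]


def chunk_slices(
    slices: List[np.ndarray], batch_size: int
) -> List[List[np.ndarray]]:
    # Hypothetical grouping step: collect slices into batches of at most
    # batch_size before each callback invocation.
    return [slices[i:i + batch_size] for i in range(0, len(slices), batch_size)]


# Five 320x320 dummy slices, batched in groups of two: 2 + 2 + 1.
slices = [np.zeros((320, 320, 3), dtype=np.uint8) for _ in range(5)]
batches = chunk_slices(slices, batch_size=2)
results = [result for batch in batches for result in batch_callback(batch)]
```

A batch-aware callback like this must also handle the final, possibly smaller batch (here the last batch holds a single slice), which is why the callback takes a list of arbitrary length rather than a fixed-size array.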
How has this change been tested? Please provide a test case or an example of how you tested the change.
https://colab.research.google.com/drive/1cfqBss7n-jc6VyKdjJ1AN9TnOBjdYQwZ?usp=sharing
Any specific deployment considerations
Docs