Optimized ReID inference on GPU #118


Open: wants to merge 2 commits into base `main`
Conversation

sunildkumar

Description

The current inference implementation in ReIDModel does not fully utilize the GPU: it runs inference serially rather than in parallel, and it moves crops on and off the GPU individually rather than as a single stacked tensor, multiplying the number of host-to-device transfers. This change lets ReIDModel run batched inference when a GPU is available, while still falling back to serial inference when no GPU is present.
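The serial-vs-batched distinction can be sketched roughly as follows. This is a hypothetical illustration, not the actual ReIDModel code: `fake_backbone`, `embed_serial`, and `embed_batched` are made-up names, and NumPy stands in for torch tensors (the real implementation would stack crops into one tensor and move it to the device with a single `.to(device)` call before the forward pass):

```python
import numpy as np

# Hypothetical stand-in for a ReID backbone: any callable that maps a
# (N, C, H, W) batch of crops to (N, D) embeddings in one vectorized call.
def fake_backbone(batch: np.ndarray) -> np.ndarray:
    return batch.reshape(batch.shape[0], -1).mean(axis=1, keepdims=True)

def embed_serial(crops, model):
    # One forward pass (and, on a real GPU, one transfer) per crop.
    return np.concatenate([model(crop[None, ...]) for crop in crops])

def embed_batched(crops, model):
    # Stack once, transfer once, run a single batched forward pass.
    batch = np.stack(crops)   # shape (N, C, H, W)
    return model(batch)       # shape (N, D)

crops = [np.random.rand(3, 224, 224).astype(np.float32) for _ in range(8)]
assert np.allclose(embed_serial(crops, fake_backbone),
                   embed_batched(crops, fake_backbone))
```

Both paths produce identical embeddings; the batched path simply amortizes the per-call and per-transfer overhead across all crops, which is where the speedups below come from.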

List any dependencies that are required for this change: None

Type of change

New feature (non-breaking change which adds functionality)

How has this change been tested? Please provide a test case or an example of how you tested the change.

I've tested this change in my private codebase, where I see >2x speedup on a handful of timm models:

| Model | GPU serial (s/frame) | GPU batched (s/frame) | Speedup (serial / batched) |
| --- | --- | --- | --- |
| mobilenetv4_conv_small.e1200_r224_in1k | 0.0369 | 0.0181 | 2.04× |
| mobilenetv3_small_100.lamb_in1k | 0.0393 | 0.0167 | 2.36× |
| mobilenetv3_large_100.ra_in1k | 0.0516 | 0.0158 | 3.26× |
| resnet50.a1_in1k | 0.0439 | 0.0215 | 2.04× |

I don't see any test cases in this repo yet, so let me know if you'd like me to add some.

Any specific deployment considerations

None

Docs

I don't think any update is necessary.

CLAassistant commented Jul 22, 2025

CLA assistant check
All committers have signed the CLA.
