track_id bug with fp16 #22
I have already made a PR to ByteTrack and it has been merged to the master branch. Have you checked it yet? ifzhang/ByteTrack#184 And yes, the PR I made doesn't include the operations in yolo_head.py.
The changes in …
Thanks, I will try to modify the id_loss in yolo_head.py myself. Thanks again for open-sourcing this project; the combination of bytetrack_reid and other tracking strategies has been very helpful for my work.
Just like I mentioned in ifzhang/ByteTrack#184: "Hi, I'm thinking about separating the track_id annotations from the variable targets, setting targets to torch.float16 as in the current code, but keeping track_id as torch.float32." So the only thing you need to do is separate the track_id annotations from the variable targets, then pass them to the YOLO head (see the sketch below). I think it is just a matter of some function interfaces. If you finish it, a PR is welcome.
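For concreteness, here is a minimal sketch of that separation. The column layout of targets and the head's call signature are assumptions for illustration, not this repo's actual interface:

```python
import torch

def split_targets(targets: torch.Tensor):
    """Split track_id off the targets tensor before the fp16 cast.

    Assumes each target row is laid out as
    [class, cx, cy, w, h, track_id]; the actual column index
    in this repo may differ.
    """
    track_ids = targets[..., 5:6].float()   # keep ids in fp32
    targets = targets[..., :5]              # class + box columns
    return targets, track_ids

# In the training loop (sketch of the idea, not the repo's exact code):
# inps, targets = prefetcher.next()
# targets, track_ids = split_targets(targets)
# inps    = inps.to(torch.float16)     # images can go to fp16
# targets = targets.to(torch.float16)  # box/class targets can go to fp16
# track_ids stays fp32 (or .long()), so ids above 2048 survive
# outputs = model(inps, targets, track_ids)  # head must accept ids separately
```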
Or I can update it when I get time.
Hi, I don't think we can cat 2 tensors with different dtypes.
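That is the crux: depending on the PyTorch version, torch.cat on mixed dtypes either raises a dtype error or type-promotes, and either way the result has a single dtype, so track_id cannot stay float32 inside an fp16 targets tensor. A quick illustration (variable names are just for the demo):

```python
import torch

boxes = torch.randn(3, 5).half()                 # fp16 class/box targets
ids = torch.tensor([[2049.], [3001.], [5001.]])  # fp32 track ids

merged = torch.cat([boxes, ids], dim=1)          # one tensor -> one dtype
print(merged.dtype)
# torch.float32 on recent PyTorch (type promotion); older versions
# raise a dtype error. Either way the fp16 saving is lost, and casting
# merged back to fp16 would round the ids instead.
```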
Hi. When targets is converted to FP16, the track_id loses precision, resulting in wrong labels for ReID.
How do I separate the track_id annotations from the variable targets, setting targets to torch.float16 as in the current code but keeping track_id as torch.float32? I tried to modify it, but it didn't work.
Looking forward to your update on this bug. Thank you.
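To see the precision loss concretely, here is a minimal repro independent of this repo's code. fp16 stores only 11 significand bits, so any track id above 2048 is at risk of rounding:

```python
import torch

ids = torch.tensor([2047., 2049., 4097., 5001.])
print(ids.half())
# tensor([2047., 2048., 4096., 5000.], dtype=torch.float16)
# Integers above 2048 are no longer all exactly representable in fp16,
# so neighbouring track ids collapse together and corrupt the ReID labels.
```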