Assert --optimize not used with cuda device #8569
Merged
TorchScript throws a lot of errors complaining about mobile-optimized models not working with CUDA devices when trying to use a mobile-optimized, CUDA-export-device model, regardless of whether the model is run on the CPU or on a CUDA device. This PR prevents that from occurring by ensuring that --optimize cannot be used with a CUDA export device.
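As a rough illustration of the check this PR describes, the sketch below shows how an export script might reject the --optimize / CUDA combination up front. It is a minimal sketch, not the actual export.py code: the select_device helper, the run function, and the exact assertion message are assumptions made for the example.

```python
import argparse

import torch


def select_device(device="cpu"):
    """Hypothetical helper: resolve a --device string ('cpu', '0', '0,1') to a torch.device."""
    use_cuda = device.lower() != "cpu" and torch.cuda.is_available()
    return torch.device("cuda:0" if use_cuda else "cpu")


def run(device="cpu", optimize=False):
    device = select_device(device)
    # Guard described in this PR: mobile-optimized TorchScript models fail on CUDA,
    # so reject the combination up front instead of letting TorchScript error later.
    if optimize:
        assert device.type == "cpu", "--optimize not compatible with cuda devices, i.e. use --device cpu"
    # ... the rest of the export would continue on the selected device ...


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--device", default="cpu", help="cuda device, i.e. 0 or 0,1,2,3 or cpu")
    parser.add_argument("--optimize", action="store_true", help="TorchScript: optimize for mobile")
    opt = parser.parse_args()
    run(device=opt.device, optimize=opt.optimize)
```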
🛠️ PR Summary
Made with ❤️ by Ultralytics Actions
🌟 Summary
Improve export script by enforcing CPU device usage during optimization.
📊 Key Changes
The --optimize flag requires --device cpu.
🎯 Purpose & Impact
Improves the reliability of the --optimize option by running the model optimization on CPU only.
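For background on why the restriction exists: TorchScript's mobile optimization pass, torch.utils.mobile_optimizer.optimize_for_mobile, is applied to a scripted or traced module and targets the CPU backend by default, which is why the export stays on --device cpu when --optimize is set. The snippet below is a standalone illustration with a toy model, not code from export.py.

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# Toy stand-in model; any nn.Module traced on CPU works the same way here.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU()).eval()
example = torch.zeros(1, 3, 64, 64)

# Mobile optimization expects a TorchScript module and defaults to the CPU backend,
# so the model and example input stay on the CPU.
ts = torch.jit.trace(model, example)
mobile_model = optimize_for_mobile(ts)
mobile_model._save_for_lite_interpreter("toy_model.ptl")
```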