Move export functionality to AnomalyModule as class methods #1803
Conversation
Hi @samet-akcay, should I fix the Signed-off-by line to get reviewed, or do I just need to wait? Thanks.
@thinhngo-x, we'll try to review it as soon as we get a chance! Thanks for creating this PR and for your patience!
Signed-off-by: Duc Thinh Ngo <[email protected]>
Check out this pull request on ReviewNB: see visual diffs & provide feedback on Jupyter Notebooks. Powered by ReviewNB.
@samet-akcay could we prioritize this PR? It looks like it will change OV export once again, so I want to be sure I'm working on the latest version.
Thanks! Just a minor comment
Thank you all for your comments. I'll try to resolve them one by one very soon.
Looks good to me. Just one minor comment
src/anomalib/deploy/export.py (outdated)
    ov_args = {} if ov_args is None else ov_args
    ov_args.update({"example_input": torch.zeros((1, 3, 1, 1)).to(self.device)})
    if convert_model is not None and serialize is not None:
        model = convert_model(inference_model, **ov_args)
Does this work for every model? In the past we had some issues with torch jit which prevented us from using convert_model directly.
This is a great question. Please ensure that all previously working models still work.
Thanks for pointing that out. Indeed, the tests failed for some models. I'll investigate this further. If it turns out to be too complicated, I might leave it for another PR to solve.
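One way around the torch jit issues raised above is a fallback path: try `convert_model` on the PyTorch module directly, and if tracing fails, export to ONNX first and convert the ONNX file instead. The sketch below only illustrates that control flow; the function name is hypothetical, and `convert_model`, `serialize`, and `to_onnx` are injected as parameters (with stand-in stubs) so the example stays self-contained without `openvino` or `torch` installed.

```python
from pathlib import Path


def export_openvino_with_fallback(model, export_root, convert_model, serialize, to_onnx):
    """Hypothetical sketch: direct PyTorch -> OV conversion, with an ONNX fallback.

    In anomalib, `convert_model`/`serialize` would come from the `openvino`
    package and `to_onnx` would be the model's own export method.
    """
    ov_path = Path(export_root) / "model.xml"
    try:
        ov_model = convert_model(model)          # direct (jit-traced) path
    except RuntimeError:
        onnx_path = to_onnx(model, export_root)  # fallback: trace via ONNX first
        ov_model = convert_model(onnx_path)
    serialize(ov_model, str(ov_path))
    return ov_path


# Stand-in stubs: direct conversion "fails" for raw modules and
# succeeds when given an ONNX file path, exercising the fallback branch.
def fake_convert(src):
    if not isinstance(src, (str, Path)):
        raise RuntimeError("jit tracing failed")
    return {"source": str(src)}


def fake_serialize(ov_model, path):
    pass  # real code would write the IR files here


def fake_to_onnx(model, root):
    return Path(root) / "model.onnx"


result = export_openvino_with_fallback(object(), "weights", fake_convert, fake_serialize, fake_to_onnx)
```

The injected-callable shape is only for the sketch; the point is that the fallback keeps models exportable even when direct `convert_model` tracing breaks.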
    for key, value in metadata.items():
        if isinstance(value, torch.Tensor):
            metadata[key] = value.numpy().tolist()

class ExportMixin:
I think this class should be placed under anomalib.models.components.base
So far, most of the classes in anomalib.models.components.base are independent base modules used to build models. AnomalyModule inherits ExportMixin, and ExportMixin is itself not a module to be used elsewhere. I think it is better to leave it in anomalib.deploy.export. That's my two cents.
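For readers following along, the mixin arrangement under discussion can be sketched roughly as follows. The class names come from the PR; the method bodies here are placeholder stubs (the real methods build and write ONNX/OpenVINO/torch artifacts), and the real AnomalyModule also inherits from Lightning's module class.

```python
class ExportMixin:
    """Sketch: groups the export entry points that get mixed into AnomalyModule."""

    def to_onnx(self, export_root):
        return f"{export_root}/model.onnx"  # placeholder body

    def to_openvino(self, export_root):
        return f"{export_root}/model.xml"   # placeholder body

    def to_torch(self, export_root):
        return f"{export_root}/model.pt"    # placeholder body


class AnomalyModule(ExportMixin):
    """In anomalib this also inherits pl.LightningModule; omitted here."""


model = AnomalyModule()
onnx_path = model.to_onnx("weights")
```

The practical effect is that every anomaly model gains `to_onnx`, `to_openvino`, and `to_torch` as instance methods, instead of export living in free-standing functions that take the model as an argument.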
        raise ModuleNotFoundError
    return ov_model_path

def get_metadata(
This method is public while the other metadata-related methods are private. Is this intentional, and if so, what's the reason?
src/anomalib/deploy/export.py (outdated)
    Args:
        model (AnomalyModule): Anomaly model which contains metadata related to normalization.

def _get_model_metadata(self) -> dict[str, torch.Tensor]:
Looking at the metadata-related methods here, I'm having some trouble seeing how the responsibilities are divided between get_metadata and _get_model_metadata. Is it really necessary to have both of these methods, or could we merge them into a single method? Also, I feel _write_metadata_to_json is a utility function (like _create_export_root) and could be moved outside of the class.
Merged commit 4facc63 into openvinotoolkit:feature/refactor_export
📝 Description

- Added ExportMixin, grouping all export methods (to_onnx, to_openvino, to_torch) and helper functions in anomalib/deploy/export.py.
- AnomalyModule now inherits ExportMixin.
- Resolves [Bug]: CI failed at pytest #1820.
- Replaced flaky with pytest-rerunfailures.

✨ Changes
Select what type of change your PR is:
✅ Checklist
Before you submit your pull request, please make sure you have completed the following steps:
For more information about code review checklists, see the Code Review Checklist.