Infer dtype in SymbolBlock import from input symbol #12412
Conversation
Left a few small comments. Otherwise looks great. Thanks for the quick fix!
python/mxnet/gluon/block.py
Outdated
self.params.get(aux, grad_req='null', allow_deferred_init=True, dtype=aux_types[i])
else:
    # Use default types for params
    for i, arg in enumerate(arg_params):
Maybe no enumerate is needed since you don't use index i in this loop?
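The reviewer's point can be illustrated with a small, framework-free sketch (the parameter names here are hypothetical, not taken from the PR):

```python
arg_params = ['conv0_weight', 'conv0_bias', 'dense0_weight']  # hypothetical names

# Instead of `for i, arg in enumerate(arg_params):` when the index i
# is never used, iterate directly over the values:
names = []
for arg in arg_params:
    names.append(arg)
```
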
python/mxnet/gluon/block.py
Outdated
for i in out.list_auxiliary_states():
    if i not in input_names:
        self.params.get(i, grad_req='null', allow_deferred_init=True)
for i, aux in enumerate(aux_params):
Same here; no need for the index i.
N/A with latest refactoring.
Suggest adding a test for float16 as well as float64.
tests/python/unittest/test_gluon.py
Outdated
ctx = mx.cpu(0)

net_fp32 = mx.gluon.model_zoo.vision.resnet34_v2(pretrained=True, ctx=ctx)
net_fp32.cast('float64')
Shall we add another test that casts to float16?
The original issue reported import from float16 failing, and it might be appropriate to cover it as well.
Added a test for fp16.
@apeforest @szha @lupesko - Thanks for your time. Addressed your comments. Please have a look.
tmpfile = os.path.join(tmp, 'resnet34_fp64')
ctx = mx.cpu(0)

net_fp32 = mx.gluon.model_zoo.vision.resnet34_v2(pretrained=True, ctx=ctx, root=tmp)
Is the model name in the model zoo going to be maintained? If the name ever changes, it would break this unit test. Not sure if we want to keep this dependency.
resnet34_v2 is a public function exposed through model_zoo.vision module. I think it is ok because we are not using string based selection of the model.
@szha @apeforest - Is this good to go?
LGTM
# other than fp32 param dtype.

# 1. Load a resnet model, cast it to fp64 and export
tmp = tempfile.mkdtemp()
Should we delete the temporary directory when done with it?
Temp gets automatically cleaned up.
According to the Python docs, it seems the user should delete it: "The user of mkdtemp() is responsible for deleting the temporary directory and its contents when done with it."
Right, thanks. I meant that the temp directory gets automatically cleaned up after all the tests (at the end of the test session).
@marcoabreu - Can you please confirm whether my understanding is correct? If not, I will add code to delete the temp directory created in the tests. Also, I see similar behavior in all the other tests: they create a temp dir but assume it will be cleaned up by the system.
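For what it's worth, the cleanup can be made automatic with a context manager instead of relying on the system. A minimal sketch (the file name is hypothetical and the model export is elided):

```python
import os
import tempfile

# tempfile.TemporaryDirectory removes the directory and everything in it
# when the with-block exits, which satisfies the mkdtemp() docs'
# requirement without an explicit shutil.rmtree() call.
with tempfile.TemporaryDirectory() as tmp:
    tmpfile = os.path.join(tmp, 'resnet34_fp64')  # hypothetical export prefix
    # ... export and re-import the model here ...
    assert os.path.isdir(tmp)
# at this point the directory and its contents are gone
```
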
@zhreshold - Can you help look at this once, and merge if it looks good?
python/mxnet/gluon/block.py
Outdated
for i in out.list_auxiliary_states():
    if i not in input_names:
        self.params.get(i, grad_req='null', allow_deferred_init=True)
arg_types, aux_types = _infer_param_types(inputs[0], out, arg_params, aux_params)
This does not handle grouped symbols because you are only slicing [0] from the symbol.
Also, I think the type inference should occur in cast as well; otherwise it's buggy when a user tries to cast the dtype of a cascaded network with a SymbolBlock inside.
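The grouped-symbol concern amounts to flattening the inputs rather than only looking at inputs[0]. A framework-free sketch of that idea (the function name is made up, not part of the PR):

```python
def flatten_inputs(inputs):
    """Return a flat list whether `inputs` is a single item or a
    (possibly nested) group, so type inference can consider every
    input instead of only inputs[0]."""
    if not isinstance(inputs, (list, tuple)):
        return [inputs]
    flat = []
    for item in inputs:
        flat.extend(flatten_inputs(item))
    return flat
```
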
python/mxnet/gluon/block.py
Outdated
for i in out.list_auxiliary_states():
    if i not in input_names:
        self.params.get(i, grad_req='null', allow_deferred_init=True)
arg_types, aux_types = _infer_param_types(inputs[0], out, arg_params, aux_params)
Also, I think the type inference should occur in cast as well; otherwise it's buggy when a user tries to cast the dtype of a cascaded network with a SymbolBlock inside.
break
assert np.dtype(net_fp64.params[param_name].dtype) == np.dtype(np.float64)

# Cast the symbol block to FP32 and try to forward FP32 data.
@zhreshold - Added a test here to verify the SymbolBlock.cast functionality.
@zhreshold - Can you please take a look at this? Thanks.
* Infer dtype in SymbolBlock import from input symbol
* Fix lint issues and make existing tests pass
* Add tests for importing a fp64 model into symbol block
* Fixing failing test for test symbol block
* Set context in unit tests
* Add tests for fp16, add default dtype in infer_param_types
* Use tmp directory as root for loading from model zoo to avoid race condition
* Fixing naming and parameter selection in test case
* Fixing failing GPU tests
* Make unit test more deterministic to get param name
* Override cast in symbol block, handle grouped symbol
* Handle multiple symbolic input usecase
* Add tests to verify behavior of SymbolBlock.cast
Description
Created this PR to get early feedback. I am working on adding test cases and will update the PR soon. Update: added the tests.
@szha @hetong007 @apeforest - Can you please take a look at this?
@Roshrini - I think this is an important fix to be picked for 1.3. @szha ?
Checklist
Essentials
Please feel free to remove inapplicable items for your PR.
Changes
Comments
Below is the issue:
When you create mx.gluon.SymbolBlock(sm, input), it creates the parameters in the Block, but no dtype is passed when each parameter is created.
See https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/gluon/block.py#L1058: if there is no parameter to get, one is created with the default dtype (fp32), see https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/gluon/parameter.py#L688
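The fix boils down to: use the inferred dtype for each parameter when inference succeeds, and fall back to the fp32 default otherwise. A framework-free sketch of that logic (the function name and signature are made up for illustration, not the actual _infer_param_types):

```python
def infer_param_types_sketch(param_names, inferred_types, default_dtype='float32'):
    """Map each parameter name to its inferred dtype; when inference
    yields nothing, every parameter falls back to the default (fp32),
    which was the old, unconditional behavior described above."""
    if inferred_types is None:
        return {name: default_dtype for name in param_names}
    return dict(zip(param_names, inferred_types))
```

With inference available, an fp64 model keeps fp64 parameters instead of being silently created as fp32.
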