Get compatible with optimizer migration in TF 2.11 #2766
Conversation
@juntang-zhuang @szutenberg @CyberZHG You are owners of some files modified in this pull request.
@bhack Hi, could you help check this PR? This ensures the addons tests will not break with the TF 2.11 release, which will happen around late October to early November. Thanks!
LGTM |
You still need to protect TF 2.8.x:

```python
if KerasLegacyOptimizer == tf.keras.optimizers.legacy.Optimizer:
```

```
#23 663.8 E   AttributeError: module 'tensorflow.keras.optimizers' has no attribute 'legacy'
```
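A minimal way to protect TF 2.8.x, where the `legacy` namespace does not exist yet, is to gate the attribute access. This is only a sketch of the idea, not necessarily the guard the PR ends up using:

```python
import tensorflow as tf

# Sketch: only touch `tf.keras.optimizers.legacy` when it actually exists.
# On TF 2.8.x the attribute is missing, which is what raises the
# AttributeError shown above.
if hasattr(tf.keras.optimizers, "legacy"):
    KerasLegacyOptimizer = tf.keras.optimizers.legacy.Optimizer
else:
    KerasLegacyOptimizer = tf.keras.optimizers.Optimizer
```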
You also need to add another experimental API exception for:
@bhack Could you take another look? Thanks!
@bhack Thanks! That's an existing error. Basically, Keras does not do type annotations.
OK, but see the second line of the pasted message.
P.S. Check your previous PR: 339159f#diff-73b3a9dcf7f0d703962d3258b17d37bf0e4515906821a17e86149afaf2c28cf5R45
It's not caused by that PR, just revealed by it. Could you point me to the TODO list where the optimizer class should be added? Thanks!
I didn't mean that. Check the line in the link, not the whole PR.
```python
    ):
        super().__init__(name, **kwargs)

        if isinstance(optimizer, str):
            optimizer = tf.keras.optimizers.get(optimizer)
        if (
```
Do you think that we could call an external function with this logic instead of replicating the code in every wrapper?
Just a note to evaluate whether you could move that logic into a shared external function.
@bhack Thanks! I feel doing it inline is okay. If you have a strong opinion, please let me know and I will see what I can do. Thanks!
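For reference, the shared helper being suggested could look roughly like the following; the name `get_keras_optimizer` is made up here for illustration, and the PR may keep the logic inline instead:

```python
import tensorflow as tf


def get_keras_optimizer(optimizer):
    """Hypothetical shared helper: resolve a string identifier to a Keras
    optimizer instance, so each wrapper does not repeat this logic inline."""
    if isinstance(optimizer, str):
        optimizer = tf.keras.optimizers.get(optimizer)
    return optimizer
```

Each wrapper's `__init__` could then call `optimizer = get_keras_optimizer(optimizer)` before doing its type checks.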
The Python Op Compatibility Matrix in the README still shows versions only up to 2.10.
Optimizer code changes to get compatible with the optimizer migration in TF 2.11.
As a summary, the following changes are made:
- Changed how `KerasLegacyOptimizer` is determined: we now check whether `optimizers.experimental` and `optimizers` point to the same module.
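As a rough illustration of that check (the exact comparison below is an assumption, not a quote of the PR's code), determining `KerasLegacyOptimizer` by module identity could look like:

```python
import tensorflow as tf

# If `tf.keras.optimizers.Optimizer` and the experimental Optimizer resolve
# to the same module (the new optimizer is the default, as in TF 2.11+),
# the old base class lives under `optimizers.legacy`; otherwise keep using
# the classic `Optimizer` base class.
if (
    hasattr(tf.keras.optimizers, "experimental")
    and tf.keras.optimizers.Optimizer.__module__
    == tf.keras.optimizers.experimental.Optimizer.__module__
):
    KerasLegacyOptimizer = tf.keras.optimizers.legacy.Optimizer
else:
    KerasLegacyOptimizer = tf.keras.optimizers.Optimizer
```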