Get compatible with optimizer migration in TF 2.11 #2766

Merged: 4 commits merged into tensorflow:master on Oct 11, 2022

Conversation

@chenmoneygithub (Contributor)

Optimizer code changes to stay compatible with the optimizer migration in TF 2.11.

In summary, the following changes are made:

  1. Adjust how KerasLegacyOptimizer is determined: we now check whether optimizers.experimental and optimizers point to the same module (a sketch of this check follows the list).
  2. In unit tests, use the legacy optimizer whenever it is available.
  3. Some auto-formatting.
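A minimal sketch of what that check might look like (illustrative only; the exact logic in tensorflow_addons/optimizers/constants.py may differ):

import tensorflow as tf

# Sketch, following the PR description: if tf.keras.optimizers and
# tf.keras.optimizers.experimental expose the same Optimizer class
# (TF >= 2.11, where the new optimizer became the default), Addons
# should subclass the legacy implementation instead.
if (
    hasattr(tf.keras.optimizers, "experimental")
    and tf.keras.optimizers.Optimizer.__module__
    == tf.keras.optimizers.experimental.Optimizer.__module__
):
    KerasLegacyOptimizer = tf.keras.optimizers.legacy.Optimizer
else:
    KerasLegacyOptimizer = tf.keras.optimizers.Optimizer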

@bot-of-gabrieldemarmiesse

@juntang-zhuang @szutenberg @CyberZHG

You are the owners of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

@chenmoneygithub (Contributor, Author)

@bhack Hi, could you help review this PR? It ensures the Addons tests will not break with the TF 2.11 release, which is expected around late October to early November. Thanks!

@juntang-zhuang (Contributor)

LGTM

1 similar comment
@szutenberg (Contributor) commented Oct 4, 2022 via email

@bhack (Contributor) left a review comment:

You still need to protect TF 2.8.x:

if KerasLegacyOptimizer == tf.keras.optimizers.legacy.Optimizer:

E           AttributeError: module 'tensorflow.keras.optimizers' has no attribute 'legacy'
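One way to guard that comparison so TF 2.8.x never touches the missing namespace (an illustrative sketch; KerasLegacyOptimizer is the constant from tensorflow_addons/optimizers/constants.py, and this may not be the exact fix that landed):

import tensorflow as tf

# Only compare against the legacy class when the `legacy` namespace
# actually exists; it is missing on TF 2.8.x.
if hasattr(tf.keras.optimizers, "legacy") and (
    KerasLegacyOptimizer == tf.keras.optimizers.legacy.Optimizer
):
    ...  # handling specific to the legacy optimizer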

@bhack (Contributor) commented Oct 4, 2022

You also need to add another experimental-API exception for:

E           NameError: The usage of a TensorFlow experimental API was found in file /addons/tensorflow_addons/optimizers/constants.py at line 17:

                   hasattr(tf.keras.optimizers, "experimental")

E           Experimental APIs are ok in tests but not in user-facing code. This is because experimental APIs might have bugs and are not widely used yet.
E           Addons should show how to write TensorFlow code in a stable and forward-compatible way.
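bhack's message suggests adding the check to the tool's exception list; an alternative that avoids probing the experimental namespace at all would be to gate on the TF version (a sketch under that assumption, not necessarily the change that landed):

import tensorflow as tf

# Gate on the installed TF version instead of calling
# hasattr(tf.keras.optimizers, "experimental"), which the Addons CI
# flags as user-facing use of an experimental API.
_tf_version = tuple(int(v) for v in tf.__version__.split(".")[:2])
if _tf_version >= (2, 11):
    KerasLegacyOptimizer = tf.keras.optimizers.legacy.Optimizer
else:
    KerasLegacyOptimizer = tf.keras.optimizers.Optimizer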

@bhack requested a review from seanpmorgan on October 4, 2022, 12:14
@boring-cyborg (bot) added the test-cases (Related to Addons tests) label on Oct 4, 2022
@chenmoneygithub (Contributor, Author)

@bhack Could you take another look? Thanks!

@bhack (Contributor) commented Oct 7, 2022

E           typedapi.functions.NotTypedError: The function 'keras.optimizers.optimizer_v2.optimizer_v2.OptimizerV2.__init__' does not have complete type annotations in its signature (it's missing the type hint for 'name'). We would like these functions to be typed. If you are not familiar with adding type hints in functions, you can look at functions already typed in the codebase.
E           If you don't want to type your function, you can add it to the TODO list of functions to type (also known as
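For reference, the checker wants a fully annotated signature, roughly like the following (illustrative only; OptimizerV2.__init__ lives in Keras, not in Addons, so Addons can only exempt it rather than annotate it):

from typing import Any

class ExampleOptimizer:
    # The kind of signature typedapi accepts: every parameter,
    # including 'name', carries an explicit type hint.
    def __init__(self, name: str, **kwargs: Any) -> None:
        self._name = name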

@chenmoneygithub (Contributor, Author)

@bhack Thanks! That's an existing error; basically, Keras does not use type annotations.

@bhack (Contributor) commented Oct 7, 2022

OK, but see the second line of the pasted message.

@bhack (Contributor) commented Oct 7, 2022 [link]

@chenmoneygithub (Contributor, Author)

It's not caused by this PR, just revealed by it. Could you point me to the TODO list where I should add the optimizer class? Thanks!

@bhack (Contributor) commented Oct 7, 2022

> It's not caused by this PR, just revealed by it.

I didn't mean that. Check the line in the link, not the whole PR.

@bhack self-requested a review on October 11, 2022, 19:40
):
    super().__init__(name, **kwargs)

    if isinstance(optimizer, str):
        optimizer = tf.keras.optimizers.get(optimizer)
    if (
@bhack (Contributor) commented on this snippet:

Do you think we could call an external function with this logic instead of replicating the code in every wrapper?

@bhack (Contributor) commented Oct 11, 2022

Just a single note: please evaluate whether you could move that logic into an external shared helper function.
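A sketch of what such a shared helper might look like (the name resolve_optimizer is hypothetical; as the reply below notes, the PR kept the logic inline):

import tensorflow as tf

def resolve_optimizer(optimizer):
    # Hypothetical shared helper: accept either a string identifier or an
    # optimizer instance, so each wrapper doesn't repeat this check.
    if isinstance(optimizer, str):
        optimizer = tf.keras.optimizers.get(optimizer)
    return optimizer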

@chenmoneygithub (Contributor, Author)

@bhack Thanks! I feel doing it inline is okay. If you have a strong opinion, please let me know and I will see what I can do. Thanks!

@bhack merged commit 1f14395 into tensorflow:master on Oct 11, 2022
@apivovarov (Contributor)

The Python Op Compatibility Matrix in the README still shows versions only up to 2.10.

Labels: optimizers, test-cases (Related to Addons tests)
6 participants