Add a test for TF mixed precision #9806
Conversation
I don't think that the tests that fail are related to this PR.
LGTM! That's a good thing to tackle as we really want all our TF models to work in mixed precision!
LGTM! Nice objective!
@@ -340,14 +340,25 @@ def test_for_multiple_choice(self):

    @slow
    def test_saved_model_with_attentions_output(self):
        # longformer has special attentions which are not
        # compatible in graph mode
        # This test doesn't pass because of the error:
nice!
Can we re-run the CI tests to make sure that the failures are unrelated? They seemed to be related to network issues.
What does this PR do?
This PR adds a test to check whether our TF models are float16 compliant. It also helps me detect which ones still need to be fixed.
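For context, the sketch below shows one way such a check can work: enable Keras's "mixed_float16" policy and run a forward pass, which fails for models that are not float16 compliant. This is only an illustration, not the exact test added in this PR; the use of TFBertModel and the "bert-base-uncased" checkpoint is an assumption, and it requires TensorFlow >= 2.4.

```python
# Illustrative sketch: check that a TF model runs under mixed precision.
# Assumes TensorFlow >= 2.4 and the transformers library; the model and
# checkpoint are examples, not the ones hard-coded in the PR's test.
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel

# Under "mixed_float16", layers compute in float16 while variables stay float32.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = TFBertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Testing mixed precision.", return_tensors="tf")
outputs = model(inputs)

# A float16-compliant model completes the forward pass without dtype errors.
assert outputs.last_hidden_state is not None

# Restore the default policy so later code is unaffected.
tf.keras.mixed_precision.set_global_policy("float32")
```

Resetting the global policy at the end matters in a shared test suite, since the policy would otherwise leak into every test that runs afterwards.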