
Enable passing tests in multilabel_confusion_matrix_test.py #1223

Merged
merged 2 commits into tensorflow:master from enable_testing_matrix_tests.py on Mar 7, 2020
Conversation

@gabrieldemarmiesse (Member) commented Mar 5, 2020

This does not fully fix #1204, but at least now we're running some tests.

@bot-of-gabrieldemarmiesse

@SSaishruthi

You are the owner of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

@seanpmorgan (Member) left a comment

LGTM

@seanpmorgan seanpmorgan merged commit aa52b72 into tensorflow:master Mar 7, 2020
@gabrieldemarmiesse gabrieldemarmiesse deleted the enable_testing_matrix_tests.py branch March 8, 2020 16:30
jrruijli pushed a commit to jrruijli/addons that referenced this pull request Dec 23, 2020
Enable passing tests in multilabel_confusion_matrix_test.py (tensorflow#1223)

* Skip failing test and re-enable other.

* Use skipTest
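
The second commit's approach of using `skipTest` can be sketched as below. This is a minimal illustration of the pattern with Python's standard `unittest`; the class and test names are hypothetical and are not the actual tests in multilabel_confusion_matrix_test.py:

```python
import unittest


class MultilabelConfusionMatrixTest(unittest.TestCase):
    """Hypothetical test class illustrating the skipTest pattern."""

    def test_known_failing_case(self):
        # Skip at runtime instead of commenting the test out, so the
        # skip is reported in the test results and easy to find later.
        self.skipTest("Known failure; tracked in a separate issue.")

    def test_passing_case(self):
        # Other tests in the file keep running as usual.
        self.assertEqual(1 + 1, 2)
```

Skipped tests show up as "s" in the runner output and do not count as failures, which is why this re-enables the rest of the file while the known-bad case stays quarantined.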
4 participants