[Metrics] Confusion matrix class interface #4348
Conversation
In which PTL version will this be released?
It could be in the coming point release on Tuesday.
According to the release changelog, ConfusionMatrix should be in PL v1.0.5, but:

```python
>>> from pytorch_lightning.metrics import ConfusionMatrix
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'ConfusionMatrix' from 'pytorch_lightning.metrics' (*path*/python3.8/site-packages/pytorch_lightning/metrics/__init__.py)
```

Colab: https://colab.research.google.com/drive/1QHFCBtGEv215DuLw6Suqi9A1WHiExlZR?usp=sharing
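(For anyone else hitting this, a minimal sketch to check whether the installed release actually ships the class; `pl.__version__` is the standard version attribute:)

```python
import pytorch_lightning as pl

print(pl.__version__)  # releases that predate this PR will not include ConfusionMatrix

try:
    from pytorch_lightning.metrics import ConfusionMatrix  # noqa: F401
    print("ConfusionMatrix is available")
except ImportError:
    print("ConfusionMatrix is not available in this release")
```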
@NumesSanguis I think they cherry-picked the commits for the 1.0.5 release, so it wasn't included in it. Maybe in 1.1.
@ydcjeff Then the patch notes are wrong: you can also already find this metric in the docs.
Yep, we are in the transition between a point release and a minor release; we will edit the changelog.
I don't see an associated feature request, so I hope bug reports here are OK. @SkafteNicki I tried this ConfusionMatrix using the master branch and got:

```python
     25     threshold: float = 0.5) -> torch.Tensor:
     26     preds, target = _input_format_classification(preds, target, threshold)
---> 27     unique_mapping = (target.view(-1) * num_classes + preds.view(-1)).to(torch.long)
     28     bins = torch.bincount(unique_mapping, minlength=num_classes ** 2)
     29     confmat = bins.reshape(num_classes, num_classes)

RuntimeError: The size of tensor a (32) must match the size of tensor b (16) at non-singleton dimension 0
```

Colab: https://colab.research.google.com/drive/1QHFCBtGEv215DuLw6Suqi9A1WHiExlZR?usp=sharing
@SkafteNicki There is also an issue (if you make sure the batch-size problem above does not show up) when trying to log the confusion matrix with the v1.0.x logging approach:

```python
self.confmat(output.softmax(dim=1), labels)
self.log("confmat/val", self.confmat)
```

gives:

```python
/usr/local/lib/python3.6/dist-packages/pytorch_lightning/trainer/logging.py in metrics_to_scalars(self, metrics)
     46     for k, v in metrics.items():
     47         if isinstance(v, torch.Tensor):
---> 48             v = v.item()
     49
     50         if isinstance(v, dict):

ValueError: only one element tensors can be converted to Python scalars
```

Colab: https://colab.research.google.com/drive/1GRi0VhRTbDGdj4D1HUMo49Pe7__tWe31?usp=sharing
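One workaround sketch until the matrix itself is loggable (the tag `val/acc_from_confmat` is hypothetical; `self.confmat`, `output`, and `labels` come from the snippet above): since `self.log` reduces everything to scalars here, log a scalar derived from the matrix instead.

```python
confmat = self.confmat(output.softmax(dim=1), labels)

# Correct predictions sit on the diagonal of the confusion matrix,
# so overall accuracy is a scalar that self.log can handle.
acc = confmat.diag().sum() / confmat.sum()
self.log("val/acc_from_confmat", acc)
```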
@NumesSanguis Mind opening a separate issue?
This does not reflect a real example: the issue comes from how you are creating the artificial labels.
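For anyone hitting the same size-mismatch error, a minimal sketch of input shapes that should work (assuming the master-branch API discussed in this thread): `target` is a 1-D tensor of integer class indices, not one-hot vectors.

```python
import torch
from pytorch_lightning.metrics import ConfusionMatrix

num_classes = 2
confmat = ConfusionMatrix(num_classes=num_classes)

preds = torch.rand(16, num_classes)         # (N, C) class probabilities
target = torch.randint(num_classes, (16,))  # (N,) integer labels, NOT one-hot (N, C)

print(confmat(preds, target))  # (num_classes, num_classes) matrix of counts
```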
If there is no logging support through `self.log`, what is the recommended way to log the confusion matrix?
@Vichoko depending on what logger you are using, you can create the confusion matrix as a figure and log it. For example, using the default TensorBoard logger:

```python
import matplotlib.pyplot as plt
...

def training_step(self, batch, batch_idx):
    ...
    # confusion matrix for this batch
    step_confmat = self.metric(pred, target)
    fig = plt.figure()
    plt.imshow(step_confmat.cpu())  # move to CPU in case training runs on GPU
    self.logger.experiment.add_figure('step_confmat', fig, global_step=self.global_step)

def training_epoch_end(self, outputs):
    # confusion matrix accumulated over the whole epoch
    epoch_confmat = self.metric.compute()
    fig = plt.figure()
    plt.imshow(epoch_confmat.cpu())
    self.logger.experiment.add_figure('epoch_confmat', fig, global_step=self.global_step)
```
Sorry, somehow I had my own classification problem in mind, which led me to a wrong assumption. Thank you for the confusion matrix example; that was actually what I was searching for!
I assume the x-axis (bottom) represents the predicted label and the y-axis (left) the true label, but could this be added to the documentation?
We follow sklearn's way of representing confusion matrices (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html#sklearn.metrics.confusion_matrix), which is indeed predicted on the x-axis/columns and true on the y-axis/rows.
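Until the docs spell that out, a small sketch for making the convention explicit when plotting (this assumes the figure-logging approach from the earlier comment; the count values here are made up for illustration):

```python
import torch
import matplotlib.pyplot as plt

confmat = torch.tensor([[5, 1], [2, 8]])  # example (num_classes, num_classes) counts
fig = plt.figure()
plt.imshow(confmat)
plt.xlabel("Predicted label")  # x-axis / columns, as in sklearn
plt.ylabel("True label")       # y-axis / rows
```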
@SkafteNicki @NumesSanguis the changelog will be fixed in #4505.
Squashed commit messages:

* docs + precision + recall + f_beta + refactor
* rebase
* fixes
* added missing file
* docs
* docs
* extra import
* add confusion matrix
* add to docs
* add test
* pep8 + isort
* update tests
* move util function
* unify functional and class
* add to init
* remove old implementation
* update tests
* pep8
* add duplicate
* fix doctest
* Update pytorch_lightning/metrics/classification/confusion_matrix.py
* changelog
* bullet point args
* bullet docs
* bullet docs

Co-authored-by: ananyahjha93 <[email protected]>
Co-authored-by: Teddy Koker <[email protected]>
Co-authored-by: Justus Schock <[email protected]>
Co-authored-by: chaton <[email protected]>
Co-authored-by: Roger Shieh <[email protected]>
Co-authored-by: Rohit Gupta <[email protected]>

(cherry picked from commit e0b856c)
The changelog implied that this would be released in 1.0.6, but it wasn't :c. Should I be worried about using this feature before 1.1?