There should be a warning or an exception if using amp_backend="native" and setting an amp_level (it doesn't use it) #9739
Labels: feature
🚀 Feature
There should be a warning or an exception if using Trainer(amp_backend="native") and setting a value to the amp_level argument (of Trainer).

Motivation
It's not obvious to someone who hasn't looked into the code, or who doesn't know Apex well, that the amp_level argument has no effect when using PyTorch's native amp backend, which gives a false picture of what is actually happening. Knowing how much of the model was cast to fp16 is really important for understanding what happens during training and for making training reproducible.
With amp_backend="apex" (amp_level is used): https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/connectors/accelerator_connector.py#L586
With amp_backend="native" (amp_level is not used): https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/connectors/accelerator_connector.py#L573
The asymmetry between the two branches is paraphrased in the sketch below.
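For readers who don't want to follow the permalinks, here is a rough, hypothetical paraphrase of the two code paths. The function and return values below are illustrative placeholders, not the actual connector code:

```python
from typing import Any, Dict, Optional, Tuple


def select_precision_plugin(amp_backend: str, amp_level: Optional[str]) -> Tuple[str, Dict[str, Any]]:
    """Illustrative paraphrase of the branch in accelerator_connector.py (names are placeholders)."""
    if amp_backend == "apex":
        # apex branch: amp_level is forwarded (eventually reaching Apex's opt_level).
        return "ApexMixedPrecisionPlugin", {"amp_level": amp_level}
    # native branch: torch.cuda.amp handles the casting and amp_level is silently dropped.
    return "NativeMixedPrecisionPlugin", {}


print(select_precision_plugin("apex", "O2"))    # ('ApexMixedPrecisionPlugin', {'amp_level': 'O2'})
print(select_precision_plugin("native", "O2"))  # ('NativeMixedPrecisionPlugin', {})  <- level vanished
```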
Pitch
There should be an exception or a warning added, maybe around https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/connectors/accelerator_connector.py#L573, when Trainer(amp_level=...) is set together with Trainer(amp_backend="native"). Again, knowing how much of the model was cast to fp16 is really important for understanding what happens during training and for making training reproducible. A minimal sketch of what the check could look like follows.
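A minimal sketch of the proposed check, written as a standalone helper using Python's warnings module rather than the actual connector code. The helper name _check_amp_level is hypothetical; inside Lightning, rank_zero_warn or a MisconfigurationException could be used instead:

```python
import warnings
from typing import Optional


def _check_amp_level(amp_backend: str, amp_level: Optional[str]) -> None:
    """Flag the silent no-op: the native backend never reads amp_level."""
    if amp_backend == "native" and amp_level is not None:
        # A warning keeps existing scripts running; raising instead (e.g. a
        # MisconfigurationException) would make the mistake impossible to miss.
        warnings.warn(
            f"`amp_level={amp_level!r}` is ignored because `amp_backend='native'`; "
            "it only has an effect with `amp_backend='apex'`.",
            UserWarning,
        )


_check_amp_level("native", "O2")  # emits a UserWarning
_check_amp_level("apex", "O2")    # silent: the apex backend actually consumes the level
```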
Alternatives
Rename amp_level to apex_level to make it more obvious that it only works with the apex backend. Breaks backwards compatibility.