[DOC] fix `gradient_clip` value in tutorials to ensure reproducible outputs similar to the committed cell output (#1750)

Update `gradient_clip=0.01` to `gradient_clip=0.1`.

### Description

I've been following along with the Interpretable forecasting with N-Beats tutorial found here: https://pytorch-forecasting.readthedocs.io/en/latest/tutorials/ar.html. Copy-pasting the code exactly, I get different results from what is shown on the documentation page.

The first discrepancy is when using `lightning.pytorch.tuner.Tuner` to find the optimal learning rate: the suggested learning rate and the resulting plot differ from the tutorial's. More of a problem, however, is that my model fails to learn (the training loss does not decrease), even when increasing the number of epochs. When I run the code from the Evaluate Results section, all of the resulting plots show a flat predicted curve.

I believe I've found the problem, and testing on my side confirms it. Code cells [6] and [14] have the parameter `gradient_clip=0.01`, which is the source of the issue. The N-HiTS tutorial uses `gradient_clip=0.1`, so I tried that value, and it fixes the problem. The outputs still don't match the tutorial outputs exactly, but they are much closer than what `gradient_clip=0.01` yields.
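To illustrate why the threshold matters: gradient clipping by norm rescales the whole gradient vector whenever its L2 norm exceeds the threshold, so a value as tight as 0.01 shrinks nearly every update to a tiny fraction of its original size, which can stall training. Below is a minimal, dependency-free sketch of clipping by norm; `clip_by_norm` is a hypothetical helper for illustration, not the library's implementation, and the exact clipping algorithm PyTorch Lightning applies depends on its configuration.

```python
# Minimal sketch of gradient clipping by L2 norm (illustration only,
# not the pytorch-forecasting / Lightning implementation).
import math

def clip_by_norm(grad, max_norm):
    """Scale the gradient vector so its L2 norm is at most max_norm."""
    norm = math.sqrt(sum(g * g for g in grad))
    if norm <= max_norm:
        return grad  # small gradients pass through unchanged
    scale = max_norm / norm
    return [g * scale for g in grad]

grad = [3.0, 4.0]                 # L2 norm = 5.0
tight = clip_by_norm(grad, 0.01)  # scaled down by a factor of 500
loose = clip_by_norm(grad, 0.1)   # scaled down by a factor of 50
```

With the 0.01 threshold, the surviving update is ten times smaller than with 0.1, which is consistent with the flat training loss described above.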