ARROW-2917: [Python] Use detach() to avoid PyTorch gradient errors
`detach()` doesn't copy data unless it has to, whereas accessing a tensor's data directly raises a RuntimeError when the tensor requires gradient computation.

Author: Wes McKinney <[email protected]>
Author: Alok Singh <[email protected]>

Closes #2311 from alok/patch-1 and squashes the following commits:

e451de8 <Wes McKinney> Add unit test serializing pytorch tensor requiring gradient that fails on master
f8e298f <Alok Singh> Use detach() to avoid torch gradient errors
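The behavior this commit relies on can be illustrated with a small sketch (not the Arrow serialization code itself, just the PyTorch semantics it depends on): converting a tensor that requires grad to NumPy fails, while `detach()` returns a grad-free view sharing the same storage.

```python
import torch

# A tensor participating in autograd.
t = torch.ones(3, requires_grad=True)

# Direct conversion fails: PyTorch refuses to hand out the buffer
# of a tensor that still needs its gradient computed.
try:
    t.numpy()
except RuntimeError as e:
    print("RuntimeError:", e)

# detach() returns a view with requires_grad=False; no data is
# copied, so serializing it is cheap and safe.
arr = t.detach().numpy()
print(arr)  # [1. 1. 1.]
```

Because `detach()` shares storage with the original tensor, mutating the detached array is visible through `t`; serialization code should treat it as read-only.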