Basically, an intern at ByteDance inserted remote code execution to do all sorts of shenanigans, and so torch.load will now, by default, only load tensors. I relied on loading other callables as well, and so now have to decide whether to:
1. set `weights_only=False` for plenoptic's synthesis method loading, which allows arbitrary code execution and is thus unsafe.
2. track down the different non-tensor objects I'm trying to save and add them to pytorch's "safe globals" list.
3. come up with a work-around that doesn't save non-tensors (but saves their parameters so I can reinstantiate them on load).
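To see why option 1 is unsafe: the underlying risk is that pickle (which `torch.load` uses) can run an arbitrary callable during loading. A stdlib-only illustration of the mechanism (the `Payload` class is a hypothetical stand-in for a malicious object embedded in a checkpoint):

```python
import io
import pickle

# pickle executes arbitrary callables on load: any object whose
# __reduce__ returns (callable, args) has that callable invoked by
# pickle.load. This is what torch.load's weights_only=True guards against.
class Payload:
    def __reduce__(self):
        # On unpickling, this runs eval("6 * 7") -- a benign stand-in
        # for whatever code an attacker would run instead.
        return (eval, ("6 * 7",))

buf = io.BytesIO()
pickle.dump(Payload(), buf)
buf.seek(0)
result = pickle.load(buf)  # eval runs here, during load
print(result)  # -> 42
```

Nothing in the file needs to be "executed" by the user; merely loading it triggers the callable.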
Looking into this a bit, the issue is that I save the optimizer, loss function, and scheduler. The loss function is a python callable; the optimizer and scheduler are torch objects. Internally, I only use objects from torch.optim or from within plenoptic for these, and so could enumerate all of them and mark them as safe. However, users are able to pass their own arbitrary objects for any of those, though that is considered advanced usage. I could wrap the load error in an exception that points users to next steps if they pass something that's not on my safe-list.
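Option 2 amounts to an allow-list: in recent torch this is `torch.serialization.add_safe_globals`, but the idea can be sketched with a restricted stdlib unpickler (the allow-list contents here are illustrative; in plenoptic's case it would enumerate the torch.optim and plenoptic classes actually saved):

```python
import io
import pickle
from collections import OrderedDict

# Allow-list of (module, qualname) pairs that may be loaded.
SAFE_GLOBALS = {("collections", "OrderedDict")}

class SafeUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        if (module, name) not in SAFE_GLOBALS:
            # Wrap the failure with next steps for the user, as
            # described above for objects not on the safe-list.
            raise pickle.UnpicklingError(
                f"{module}.{name} is not on the safe-list; if you passed a "
                "custom optimizer/scheduler/loss, add its class to the list."
            )
        return super().find_class(module, name)

buf = io.BytesIO()
pickle.dump(OrderedDict(a=1), buf)
buf.seek(0)
loaded = SafeUnpickler(buf).load()  # succeeds: OrderedDict is allowed
```

Loading anything whose class isn't on the list raises with a pointer to the fix, rather than silently executing unknown code.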
For possible solution 3, there's a similar issue: if users use their own objects for any of those attributes, I don't think I'll know how to instantiate them. And if I'm instantiating them myself, I don't think that's any safer than option 1...
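A hypothetical sketch of option 3, saving only a class path plus constructor kwargs and rebuilding from a registry of known classes (all names here are illustrative, not plenoptic's API). The registry lookup fails for user-supplied classes, which is exactly the limitation described above:

```python
# Registry of classes we know how to rebuild (would hold the
# torch.optim / plenoptic classes used internally).
KNOWN_CLASSES = {}

def register(cls):
    KNOWN_CLASSES[f"{cls.__module__}.{cls.__qualname__}"] = cls
    return cls

@register
class SGDLike:  # stand-in for e.g. torch.optim.SGD
    def __init__(self, lr=0.01):
        self.lr = lr

def to_state(obj, **init_kwargs):
    # Record the class path and constructor kwargs, not the object itself,
    # so nothing non-tensor-like needs to be pickled.
    return {"class": f"{type(obj).__module__}.{type(obj).__qualname__}",
            "kwargs": init_kwargs}

def from_state(state):
    cls = KNOWN_CLASSES.get(state["class"])
    if cls is None:
        # A user-supplied class we don't know how to instantiate.
        raise ValueError(f"don't know how to rebuild {state['class']}")
    return cls(**state["kwargs"])

opt = SGDLike(lr=0.1)
state = to_state(opt, lr=opt.lr)
rebuilt = from_state(state)  # rebuilt.lr == 0.1
```

The trade-off is visible in `from_state`: anything outside the registry either fails to load or has to be instantiated blind, which is no safer than option 1.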
plenoptic's tests are now failing (with torch>=2.6) because of this change to pytorch: https://dev-discuss.pytorch.org/t/bc-breaking-change-torch-load-is-being-flipped-to-use-[…]s-only-true-by-default-in-the-nightlies-after-137602/2573.
related torch docs