@Abecid What error are you getting with bfloat16? I think it's only supported on Ampere and newer, but it appears it now also works on older T4s and on CPU. Just tested it. Maybe it's a PyTorch version thing.
If you have time and don't mind spending a few more minutes, could you let me know the error message you're getting and your PyTorch version so I can look into it further? I could then add a more explicit warning to save future users the hassle.
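For anyone hitting this, here is a minimal sketch of how one could pick the precision flag only when bfloat16 is actually supported (assuming Lightning Fabric and a recent PyTorch; the `16-mixed` fallback is just an example choice, not what the repo scripts do):

```python
# Minimal sketch: choose bf16 only when the hardware/PyTorch build supports it.
import torch
from lightning.fabric import Fabric

# bf16 generally needs Ampere (compute capability >= 8.0); newer PyTorch/CUDA
# builds may also report support on some older GPUs, which would explain the
# T4 observation above.
bf16_ok = torch.cuda.is_available() and torch.cuda.is_bf16_supported()
precision = "bf16-true" if bf16_ok else "16-mixed"  # fallback is an example choice

fabric = Fabric(devices=1, precision=precision)
fabric.launch()
print(f"Using precision: {precision}")
```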
precision = "bf16-true"
is unsupported but in the code for lora-finetuning
fabric.init_module
causes a does not exist error.
with fabric.device
results in an error in full fine-tuning script
Overall this was a very poor experience, and the documentation is lacking.
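In case it helps narrow this down: the `init_module` failure looks like a Lightning version mismatch, since `Fabric.init_module()` only exists in newer Lightning releases, while older code constructed the model under `with fabric.device:`. A hedged sketch of a version-tolerant pattern (assuming Lightning Fabric and PyTorch >= 2.0, where `torch.device` works as a context manager; the `nn.Linear` model is just a placeholder, not the repo's code):

```python
# Sketch of a version-tolerant init pattern (not the repo's actual code).
import torch.nn as nn
from lightning.fabric import Fabric

fabric = Fabric(devices=1, precision="bf16-true")
fabric.launch()

# Fabric.init_module() is only available in newer Lightning releases; on older
# ones, fall back to constructing the model under the target device context.
ctx = fabric.init_module() if hasattr(fabric, "init_module") else fabric.device

with ctx:
    # Placeholder model: init_module() should also apply the bf16 dtype here,
    # whereas the fabric.device fallback only controls the device.
    model = nn.Linear(8, 8)

model = fabric.setup(model)
```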