
This codebase has so many errors it is completely useless and unusable #438

Open · Abecid opened this issue Aug 7, 2023 · 2 comments

Abecid commented Aug 7, 2023

`precision = "bf16-true"` is unsupported, yet it is used in the LoRA fine-tuning code.

`fabric.init_module` raises a "does not exist" error.

`with fabric.device` results in an error in the full fine-tuning script.

Overall a very poor experience and poor documentation. Garbage.
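For anyone hitting the same wall, here is a minimal sketch of the three Fabric calls the report refers to. The version bounds in the comments are my assumption, not confirmed in the thread, but they would plausibly explain the "does not exist" errors on an older install:

```python
import torch
from lightning.fabric import Fabric

# "bf16-true" is a valid precision string in recent Lightning releases
# (2.0+, if I recall correctly); older releases reject it.
fabric = Fabric(precision="bf16-true")
fabric.launch()

# fabric.init_module() is a context manager added in recent Lightning
# releases; older installs raise AttributeError ("does not exist").
with fabric.init_module():
    model = torch.nn.Linear(8, 8)

# fabric.device is a torch.device; using a torch.device as a context
# manager requires PyTorch 2.0+, which may explain the error in the
# full fine-tuning script on older PyTorch.
with fabric.device:
    x = torch.randn(2, 8)
```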

rasbt (Contributor) commented Aug 8, 2023

I don't know about the `with fabric.device` issue, but let me address the other two:

1. `precision = "bf16-true"`
2. `fabric.init_module`

with more explicit warnings and suggestions via a PR shortly.
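A hedged sketch (not the actual PR) of the kind of explicit warning that could be added: probe for the two features up front and fail with an actionable message instead of a bare exception:

```python
import lightning
from lightning.fabric import Fabric

# Guard the precision string first: older Lightning releases reject
# "bf16-true" (typically surfacing as a ValueError).
try:
    fabric = Fabric(precision="bf16-true")
except ValueError as err:
    raise RuntimeError(
        f'precision="bf16-true" is not supported by lightning=={lightning.__version__}; '
        "try `pip install -U lightning`."
    ) from err

# Guard against releases that predate Fabric.init_module, instead of
# letting the user run into an AttributeError mid-script.
if not hasattr(fabric, "init_module"):
    raise RuntimeError(
        f"fabric.init_module is unavailable in lightning=={lightning.__version__}; "
        "try `pip install -U lightning`."
    )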

rasbt (Contributor) commented Aug 8, 2023

@Abecid What error are you getting with bfloat16? I think it's only supported on Ampere and newer GPUs, but it appears that it now also works on older T4s and on CPU. I just tested it. Maybe it's a PyTorch version thing.

If you have time and don't mind spending a few more minutes, could you let me know the error message you are getting and your PyTorch version so I can look into it further? I could then add a more explicit warning to save the hassle for future users.
[Screenshot: bfloat16 test, 2023-08-08]
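If it helps with reporting, here is a small probe one could run to check bfloat16 support on the current setup; `torch.cuda.is_bf16_supported()` is a real PyTorch API, and the CPU branch is an assumption that recent PyTorch handles bf16 tensors on CPU:

```python
import torch

print("torch:", torch.__version__)
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name())
    print("bf16 supported:", torch.cuda.is_bf16_supported())
else:
    # bfloat16 tensors also work on CPU in recent PyTorch releases
    print("cpu bf16 ok:", torch.ones(2, dtype=torch.bfloat16).dtype)
```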
