Hi, I recently found what might be a bug in the function convert_to_fp32().

accelerate/src/accelerate/utils.py, line 258 in 1b1463f

I think the correct call should be:

return recursively_apply(_convert_to_fp32, tensor, test_type=_is_fp16_tensor)

Thanks.
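For context, here is a minimal, self-contained sketch of what the corrected helper could look like. Only the final `return` line is the one proposed above; the `recursively_apply` stand-in and the two private helpers (`_convert_to_fp32`, `_is_fp16_tensor`) are simplified assumptions for illustration, not accelerate's actual implementation.

```python
import torch


def _convert_to_fp32(tensor):
    # Upcast a single FP16 tensor to FP32.
    return tensor.float()


def _is_fp16_tensor(tensor):
    # Assumption: "FP16 tensor" means a torch.Tensor with dtype float16.
    return isinstance(tensor, torch.Tensor) and tensor.dtype == torch.float16


def recursively_apply(func, data, test_type):
    # Simplified stand-in for accelerate's recursively_apply: walk nested
    # lists/tuples/dicts and apply `func` only to leaves that pass `test_type`.
    if isinstance(data, (list, tuple)):
        return type(data)(recursively_apply(func, d, test_type) for d in data)
    if isinstance(data, dict):
        return {k: recursively_apply(func, v, test_type) for k, v in data.items()}
    if test_type(data):
        return func(data)
    return data


def convert_to_fp32(tensor):
    # The call proposed in the issue: apply the conversion function, gated on
    # the FP16 test, so the structure comes back intact with FP16 leaves in FP32.
    return recursively_apply(_convert_to_fp32, tensor, test_type=_is_fp16_tensor)


# Quick check of the expected behavior:
out = convert_to_fp32({"logits": torch.ones(2, dtype=torch.float16), "step": 3})
print(out["logits"].dtype)  # torch.float32
print(out["step"])          # 3, non-tensor values pass through unchanged
```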
I ran into the error because of this, I think.
If fp16=True is set in Accelerator, the model returns "True".
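As a rough illustration of that symptom, a sketch of a reproduction is below. It assumes an accelerate version that still accepts `Accelerator(fp16=True)` and a CUDA-capable GPU, since FP16 mixed precision needs one; the specific model and input are made up for the example.

```python
import torch
from accelerate import Accelerator

# fp16=True is the mixed-precision flag older Accelerator versions exposed.
accelerator = Accelerator(fp16=True)
model = accelerator.prepare(torch.nn.Linear(4, 4))

out = model(torch.randn(2, 4, device=accelerator.device))
# With the buggy convert_to_fp32, `out` comes back as the boolean True
# (the result of the FP16 test) instead of an FP32 tensor.
print(out)
```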
fix bug huggingface#172 (commit 6f2e3eb)
fix bug #172 (#173) (commit d6247b7)
Thanks for the report and the hint for the fix. It's fixed by the PR mentioned above, and I will do a patch release on Monday with this fix.