Thank you very much for your work. When my server trains the model, it reports: "FlashAttention only supports Ampere GPUs or newer." Can I train the model without changing the server, for example by not using FlashAttention? Thank you very much.
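One possible workaround, assuming the project loads its model through Hugging Face transformers (which accepts an `attn_implementation` argument in `from_pretrained`), is to select the attention backend based on the GPU's compute capability instead of always requesting FlashAttention. FlashAttention 2 requires Ampere (compute capability 8.0) or newer; older GPUs can fall back to the default `"eager"` implementation. The helper name below is hypothetical:

```python
def pick_attn_implementation(compute_capability):
    """Choose an attention backend string for transformers' from_pretrained.

    compute_capability: a (major, minor) tuple, e.g. from
    torch.cuda.get_device_capability(). FlashAttention 2 needs
    Ampere (SM 8.0) or newer, so anything below major version 8
    falls back to the plain "eager" implementation.
    """
    major, _minor = compute_capability
    return "flash_attention_2" if major >= 8 else "eager"


# Example: a Turing GPU (T4, compute capability 7.5) gets "eager".
print(pick_attn_implementation((7, 5)))   # eager
print(pick_attn_implementation((8, 0)))   # flash_attention_2
```

In training code this could be wired up roughly as `AutoModelForCausalLM.from_pretrained(model_id, attn_implementation=pick_attn_implementation(torch.cuda.get_device_capability()))`; the exact hook depends on how this repository constructs its model.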