Question about the Qwen2.5-7b model and LoRA parameters #556
Thank you very much for your work. While reading the code I have a question: after LoRA training on Qwen2.5-7b, do the LoRA parameters need to be merged into the base model, and does merging affect performance?
Answered by shibing624 on Mar 12, 2025
Replies: 1 comment
After LoRA training, the adapter should be merged into the base model. The LoRA parameters are a plug-in (adapter) module, so whether you merge them or not has no impact on model performance.
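For reference, here is a minimal sketch of merging a trained LoRA adapter into its base model with the Hugging Face `peft` library. The model name and paths are illustrative placeholders, not the exact ones used in this repository:

```python
# Minimal sketch: merge a LoRA adapter into its base model with peft.
# Model name and paths below are examples, not the repo's actual settings.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "Qwen/Qwen2.5-7B"        # base model (example)
adapter_path = "outputs/lora-qwen2.5-7b"    # LoRA adapter produced by training (example)
merged_path = "outputs/qwen2.5-7b-merged"   # where to save the merged full model

# Load the base model and attach the trained LoRA adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_name, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_path)

# Fold the adapter weights into the base weights. The merged model computes
# the same outputs as base model + adapter, so merging does not change quality.
model = model.merge_and_unload()

# Save a standalone checkpoint that no longer needs peft or the adapter files.
model.save_pretrained(merged_path)
AutoTokenizer.from_pretrained(base_model_name).save_pretrained(merged_path)
```

The practical benefit of merging is deployment convenience: the merged checkpoint loads like an ordinary model and works with inference stacks that do not support loading adapters separately.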
Answer selected by cwczyj