
[BUG] Error when loading LoRA weights #168

Closed

pixelock opened this issue May 28, 2024 · 2 comments

@pixelock

Is there an existing issue / discussion for this?

  • I have searched the existing issues / discussions

Is there an existing answer for this in FAQ?

  • I have searched FAQ

Current Behavior

After LoRA SFT produces the adapter weights, loading them with the following code raises an error:

from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained(
    # path to the output directory
    'output/minicpmv2_lora',
    device_map="auto",
    trust_remote_code=True
).eval()

Error:

Traceback (most recent call last):
  File "/MiniCPM-V/minicpm_infer.py", line 17, in <module>
    model = AutoPeftModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/dist-packages/peft/auto.py", line 126, in from_pretrained
    base_model.resize_token_embeddings(len(tokenizer))
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1901, in resize_token_embeddings
    model_embeds = self._resize_token_embeddings(new_num_tokens, pad_to_multiple_of)
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1915, in _resize_token_embeddings
    old_embeddings = self.get_input_embeddings()
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 1668, in get_input_embeddings
    raise NotImplementedError
NotImplementedError

The error indicates that the get_input_embeddings method is missing. Looking at finetune.py, get_input_embeddings is attached to the model dynamically; could that be the cause?
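For illustration only (the attribute name `model.llm` and the exact mechanism in finetune.py are assumptions, not the actual repository code): if the accessor is bound to the model *instance* during training, then the fresh instance that AutoPeftModelForCausalLM.from_pretrained builds from the remote hub code never receives it, and resize_token_embeddings falls through to the NotImplementedError in the transformers base class.

```python
import types
from transformers import AutoModel

# Base model loaded from remote code (model id is an assumption for the sketch).
model = AutoModel.from_pretrained("openbmb/MiniCPM-V-2", trust_remote_code=True)

def get_input_embeddings(self):
    # Hypothetical delegate to the wrapped LLM's embedding table;
    # `self.llm` is an assumption about MiniCPM-V internals.
    return self.llm.get_input_embeddings()

# Binding to this *instance* patches only this object. A new instance built
# later by AutoPeftModelForCausalLM.from_pretrained (from the hub code) starts
# without the method, so resize_token_embeddings() raises NotImplementedError.
model.get_input_embeddings = types.MethodType(get_input_embeddings, model)
```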

Expected Behavior

The LoRA weights load successfully without raising an error.

Steps To Reproduce

  1. Fine-tune and obtain the LoRA adapter weights
  2. Load the adapter directory with AutoPeftModelForCausalLM from peft to get the full model
  3. This step raises the NotImplementedError from get_input_embeddings (a possible workaround is sketched after this list)
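Until the model code itself provides the accessor, one possible workaround (a sketch only; the base model id is an assumption, substitute the checkpoint actually used for fine-tuning) is to skip AutoPeftModelForCausalLM, which calls resize_token_embeddings internally, and attach the adapter with PeftModel directly:

```python
import torch
from transformers import AutoModel
from peft import PeftModel

# Load the base MiniCPM-V model from its remote code.
base_model = AutoModel.from_pretrained(
    "openbmb/MiniCPM-V-2",   # assumption: the base checkpoint used for training
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# PeftModel.from_pretrained only attaches the LoRA adapter; it does not call
# resize_token_embeddings, so the missing get_input_embeddings is never hit.
model = PeftModel.from_pretrained(base_model, "output/minicpmv2_lora").eval()
```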

Environment

- OS:
- Python: 3.11
- Transformers: 4.40.0
- PyTorch: 2.2.0
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`): 12.3

Anything else?

No response

@pixelock pixelock changed the title [BUG] <title> [BUG] Error when loading LoRA weights May 28, 2024
@isongxw

isongxw commented May 29, 2024

The latest code on Hugging Face updates modeling_minicpmv.py to add an implementation of get_input_embeddings. I've verified that after updating, loading the LoRA model no longer raises this error.
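For reference, the fix amounts to defining the accessors on the model class itself so that transformers' resize_token_embeddings can find them. Roughly along these lines (the class name and `self.llm` are assumptions, not the verbatim upstream code; see the updated modeling_minicpmv.py on Hugging Face for the actual implementation):

```python
from transformers import PreTrainedModel

# Rough sketch only; attribute names are assumptions about MiniCPM-V internals.
class MiniCPMV(PreTrainedModel):
    ...

    def get_input_embeddings(self):
        # Expose the inner language model's embedding table so that
        # transformers' resize_token_embeddings() can locate it.
        return self.llm.get_input_embeddings()

    def set_input_embeddings(self, value):
        self.llm.set_input_embeddings(value)
```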

@pixelock
Author

The latest code on Hugging Face updates modeling_minicpmv.py to add an implementation of get_input_embeddings. I've verified that after updating, loading the LoRA model no longer raises this error.

This has been fixed upstream; I updated and tested it, and it works now.
