When fine-tuning with LoRA, I get this error: target module Quantizedlinear() is not supported
Which version of the model are you fine-tuning?
Hi, the error above is raised when fine-tuning this model: BlueLM-7B-Chat-4bits
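This error is most likely because PEFT's LoRA only knows how to wrap layer types it recognizes (torch.nn.Linear, bitsandbytes Linear4bit/Linear8bitLt, and a few others), while the pre-quantized BlueLM-7B-Chat-4bits checkpoint replaces its linear layers with a custom QuantizedLinear class that PEFT cannot wrap. One common workaround, sketched below as an assumption rather than an official fix from the maintainers, is to load the full-precision BlueLM-7B-Chat checkpoint and quantize it with bitsandbytes at load time (QLoRA-style), so that LoRA attaches to layer types PEFT supports. The model ID "vivo-ai/BlueLM-7B-Chat" and the target module names are illustrative and may need adjusting to the actual model.

```python
# Minimal sketch, assuming transformers, peft and bitsandbytes are installed
# and the full-precision checkpoint is used instead of the pre-quantized -4bits one.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# Quantize at load time with bitsandbytes instead of using the -4bits checkpoint.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "vivo-ai/BlueLM-7B-Chat",   # assumed hub ID; full-precision checkpoint, not -4bits
    quantization_config=bnb_config,
    trust_remote_code=True,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    # Adjust to the attention/projection layer names actually used by BlueLM.
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

With the bitsandbytes-quantized layers, get_peft_model should attach the LoRA adapters without the "target module ... is not supported" error; the resulting model can then be passed to your existing fine-tuning script or a transformers Trainer.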