The BlueLM-7B-Chat-4bits model keeps emitting irrelevant, meaningless tokens and produces incoherent, nonsensical sentences after the initial prompt. For example:
User: 你好你是谁 (Hello, who are you?)
BlueLM-7B: 您好,我是vivo公司研发的AI大语言模型。 (Hello, I am an AI large language model developed by vivo.)
User: 您是 (You are)
BlueLM-7B:(os_o_{source}{u}_{*} }(*/.*)* (*)). */*./. .@./''HET*/". " H N R!". A B NO AD IN** AB T field^ BC!." d the do n no and h as me a he r e p of in not with it so at c co b el i an be doing this black like they un bed on these tr but over f out does
(The model keeps spamming characters like this until it reaches MAX_NEW_TOKEN.)
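For reference, one quick way to flag this kind of degenerate output is to measure how much of a reply is symbol noise rather than recognizable English or Chinese text. The sketch below is my own illustration (the function name and the threshold values are mine, not part of the BlueLM codebase):

```python
import re

def garbage_ratio(text: str) -> float:
    """Fraction of characters that are neither ASCII letters/digits,
    CJK characters, nor common whitespace/punctuation."""
    if not text:
        return 0.0
    ok = re.findall(r"[A-Za-z0-9\u4e00-\u9fff\s,。,.!?!?、:;:;'\"-]", text)
    return 1.0 - len(ok) / len(text)

# The coherent first reply scores zero:
normal = "您好,我是vivo公司研发的AI大语言模型。"
# An excerpt of the degenerate second reply scores far higher:
garbled = "(os_o_{source}{u}_{*} }(*/.*)* (*)). */*./. .@./''HET*/"

assert garbage_ratio(normal) < 0.1
assert garbage_ratio(garbled) > 0.3
```

A check like this could gate a retry or a lowered `max_new_tokens` instead of letting the model spam until the token limit.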
Affected device:
ASUS Republic of Gamers (ROG) Flow Z13 (2023) GZ301VV
Specs:
Processor: 13th Gen Intel(R) Core(TM) i9-13900H, 2.60 GHz
Installed RAM: 32.0 GB (31.6 GB usable)
System type: 64-bit operating system, x64-based processor
Pen and touch: Pen and touch support with 20 touch points
Edition: Windows 11 Pro
Version: 23H2
Installed on: 4/10/2024
OS build: 22631.4602
Experience: Windows Feature Experience Pack 1000.22700.1055.0
Note: The BlueLM model in this incident was run in a WSL-Ubuntu environment.
!!! IMPORTANT !!!
This incident was reported to Vivo's CN live customer service on 2 January 2025 from Hong Kong, China, via the official Vivo app. This report is filed purely for safety, so that both GitHub and customer service are informed of the incident.
Additionally, similar incidents were reported in previous versions/iterations; see Issue #20, reported by @cilly1 on 7 December 2023, for more details.
ISSUE 2
An error is encountered when installing sentencepiece via pip install -r requirements.txt. The problem persists even with --break-system-packages, in both cmd.exe (legacy and Windows Terminal) and powershell.exe (legacy and Windows Terminal).
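As a general workaround for the "externally managed environment" restriction (PEP 668) that --break-system-packages tries to bypass, installing into a virtual environment is usually cleaner. This is a generic sketch, not a confirmed fix for the sentencepiece build failure itself:

```shell
# Create an isolated environment so pip does not touch system packages
# (this avoids needing --break-system-packages at all)
python3 -m venv .venv
. .venv/bin/activate

# sentencepiece builds from source when no prebuilt wheel matches,
# so up-to-date build tooling often helps (best effort; needs network)
pip install --upgrade pip setuptools wheel || true

# Retry the project requirements inside the venv
if [ -f requirements.txt ]; then
    pip install -r requirements.txt
fi
```

If the source build still fails, the missing piece is usually a C++ toolchain (cmake and a compiler) rather than anything pip can fix on its own.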
KNOWN ISSUES
Termux (Android) cannot compile cmake and some other packages, for unclear reasons. Even with --break-system-packages, the build failures persist, so cli_demo.py cannot be used because its required dependencies never get built. Unfortunately, the model cannot even be downloaded.
Therefore, as asked in Issue #22: if you lack the tools and resources, it is unlikely, almost impossible, to run BlueLM locally on your X100, so I suggest you simply use the BlueLM Copilot (蓝心小V) in OriginOS 4/5. I tried it on my Samsung Galaxy Tab S8 in Termux and this happened.
!!! IMPORTANT REMINDER TO ALL USERS !!!
Using models other than BlueLM-7B-Chat-4bits is strongly discouraged for any user with less than 8 GB of dedicated GPU VRAM or less than 32 GB of RAM. Not only will this severely degrade your computer's performance, with 100% GPU utilisation (3D) and full dedicated GPU memory usage, it will also severely impact the overall experience. Any model that is not 4bits will produce only about 2 English words (or 2-3 Chinese characters) every 4-10 seconds, possibly longer; actual speed varies with hardware. An Nvidia GeForce RTX 4060 (65 W), even overclocked (base clock offset 50/200 MHz; memory clock offset 100/300 MHz; Dynamic Boost 15 W; TGP 85 °C/85 °C), can barely run BlueLM-7B-Chat/Base, and the 32K-context variant is out of the question. It is not sufficient for models other than BlueLM-7B-Chat-4bits; don't do that.
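As a rough sanity check on the VRAM advice above, the weight footprint of a 7B-parameter model can be estimated from bytes per parameter. This is a back-of-the-envelope sketch that ignores activations and the KV cache; the parameter count is approximate:

```python
def weights_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate memory needed for model weights alone, in GB."""
    return n_params * bits_per_param / 8 / 1e9

n = 7e9  # BlueLM-7B parameter count, approximately

print(f"fp16 : {weights_gb(n, 16):.1f} GB")  # ~14.0 GB, exceeds an 8 GB card
print(f"4-bit: {weights_gb(n, 4):.1f} GB")   # ~3.5 GB, fits in 8 GB VRAM
```

This matches the observed behaviour: the non-quantized 7B weights alone are roughly double an 8 GB card's capacity, forcing constant swapping between dedicated and shared GPU memory and producing the few-tokens-per-several-seconds throughput described above.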
For non-Vivo devices and PCs, you can also use a derivative of BlueLM (also maintained by Vivo) called 蓝心千询, either on the web (https://qianxun.vivo.com/#/explore) or by downloading the app directly (Simplified Chinese UI only: https://qianxun.vivo.com/download/index.html). Rest assured, this software runs fine on my Samsung Galaxy Tab S8, with no incompatibilities whatsoever as of 2025-01-03.