Description
The following error occurs when running app.py:
(my_lm) xt@ji-jupyter-6713700621420699648-master-0:~/txiang/LMFlow/service$ python app.py
Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Failed to use RAM optimized load. Automatically use original load instead.
Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Traceback (most recent call last):
File "/home/xt/txiang/LMFlow/service/../src/lmflow/models/hf_decoder_model.py", line 192, in init
self.backend_model = AutoModelForCausalLM.from_pretrained(
File "/home/xt/anaconda3/envs/my_lm/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 474, in from_pretrained
raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers_modules.configuration_chatglm.ChatGLMConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, CodeGenConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, GitConfig, GPT2Config, GPT2Config, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, LlamaConfig, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MvpConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, Speech2Text2Config, TransfoXLConfig, TrOCRConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.
I downloaded ChatGLM from the Hugging Face models hub and placed it in the output_models directory. When I run the service, it reports that the ChatGLMConfig configuration class is not recognized. How can this be resolved?
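For reference, ChatGLM ships its model class as custom code on the Hub, so it is not registered with AutoModelForCausalLM (the class that hf_decoder_model.py calls in the traceback above). Outside of LMFlow it is normally loaded through AutoModel with trust_remote_code=True. A minimal sketch, assuming the weights were downloaded to output_models/chatglm-6b (the path is hypothetical):

from transformers import AutoModel, AutoTokenizer

# Hypothetical local path to the downloaded ChatGLM checkpoint
model_path = "output_models/chatglm-6b"

# trust_remote_code=True lets transformers import the ChatGLM model/config
# classes shipped with the checkpoint instead of an AutoModelForCausalLM mapping
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
model = model.eval()

# ChatGLM exposes a chat() helper in its custom code
response, history = model.chat(tokenizer, "你好", history=[])
print(response)

Whether LMFlow's hf_decoder_model.py can be pointed at AutoModel instead of AutoModelForCausalLM without further changes is not confirmed here; the sketch only shows how the checkpoint itself is meant to be loaded.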