
Problem with the ChatPDF feature: running complex_ui.py fails immediately with an error. #73

Open
kevinwei1975 opened this issue Aug 9, 2024 · 0 comments

@kevinwei1975
There is a problem with the ChatPDF feature: running complex_ui.py immediately fails with the following error.
(mindspore) root@autodl-container-bff2469f3e-a4796232:~/autodl-tmp/ChatPDF# python complex_ui.py
Building prefix dict from the default dictionary ...
Loading model from cache /tmp/jieba.cache
Loading model cost 0.529 seconds.
Prefix dict has been built successfully.
The following parameters in checkpoint files are not loaded:
['embeddings.position_ids']
Loading checkpoint shards: 0%| | 0/3 [00:00<?, ?it/s]MindSpore do not support bfloat16 dtype, we will automaticlly convert to float16
Loading checkpoint shards: 100%|████████████████████████████████████████████████████| 3/3 [00:18<00:00, 6.11s/it]
Traceback (most recent call last):
File "/root/autodl-tmp/ChatPDF/complex_ui.py", line 2, in
from logic import add_text, generate_response, render_file, clear_chatbot
File "/root/autodl-tmp/ChatPDF/logic.py", line 6, in
model = ChatPDF()
File "/root/autodl-tmp/ChatPDF/chatpdf.py", line 184, in init
self.rerank_tokenizer = AutoTokenizer.from_pretrained(rerank_model_name_or_path, mirror='modelscope')
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/models/auto/tokenization_auto.py", line 775, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/tokenization_utils_base.py", line 1723, in from_pretrained
return cls._from_pretrained(
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/tokenization_utils_base.py", line 1942, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py", line 154, in init
super().init(
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/tokenization_utils_fast.py", line 106, in init
fast_tokenizer = convert_slow_tokenizer(slow_tokenizer)
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/convert_slow_tokenizer.py", line 1388, in convert_slow_tokenizer
return converter_class(transformer_tokenizer).converted()
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/convert_slow_tokenizer.py", line 533, in converted
pre_tokenizer = self.pre_tokenizer(replacement, add_prefix_space)
File "/root/miniconda3/envs/mindspore/lib/python3.9/site-packages/mindnlp/transformers/convert_slow_tokenizer.py", line 515, in pre_tokenizer
return pre_tokenizers.Metaspace(replacement=replacement, add_prefix_space=add_prefix_space)
TypeError: Metaspace.__new__() got an unexpected keyword argument 'add_prefix_space'
(mindspore) root@autodl-container-bff2469f3e-a4796232:~/autodl-tmp/ChatPDF# ^C
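
For reference, this looks like an API mismatch between mindnlp's convert_slow_tokenizer.py and the installed tokenizers package: in tokenizers 0.19+ the Metaspace pre-tokenizer no longer accepts an add_prefix_space argument (it was replaced by prepend_scheme). A minimal sketch of the suspected mismatch, assuming a recent tokenizers release is installed:

```python
# Minimal sketch of the suspected mismatch (assumes tokenizers >= 0.19).
from tokenizers import pre_tokenizers

# The call made by mindnlp's converter uses the pre-0.19 keyword and fails:
try:
    pre_tokenizers.Metaspace(replacement="▁", add_prefix_space=True)
except TypeError as err:
    # Metaspace.__new__() got an unexpected keyword argument 'add_prefix_space'
    print(err)

# The equivalent call on the newer API uses prepend_scheme instead
# ("always" corresponds to the old add_prefix_space=True behaviour):
pre_tokenizers.Metaspace(replacement="▁", prepend_scheme="always")
```

A possible workaround until the converter supports the new signature may be to pin the dependency to an older release, e.g. pip install "tokenizers<0.19"; I have not confirmed which version range mindnlp expects.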
