Update tokenization_chatglm.py

#72

https://github.com/huggingface/transformers/blob/8f2b6d5e3dcf40ab0d01f3c8117d1df09e465616/src/transformers/tokenization_utils_base.py#L43

TensorType should be imported from transformers.utils instead of torch. Otherwise, the tokenizer can't be used on its own in an environment without PyTorch installed, e.g. when we only need to count tokens.
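To illustrate why the corrected import works, here is a minimal stand-in for `TensorType` (the real one lives in `transformers.utils` and is imported via `from transformers.utils import TensorType`): it is a plain string Enum with no deep-learning-framework dependency, so loading it never requires torch. The class below is a hypothetical mirror for illustration, not the library's source.

```python
from enum import Enum

# Stand-in mirroring transformers.utils.TensorType: a plain string
# Enum naming the possible return-tensor backends. Nothing here
# touches torch, which is why `from transformers.utils import
# TensorType` succeeds in a torch-free environment.
class TensorType(str, Enum):
    PYTORCH = "pt"
    TENSORFLOW = "tf"
    NUMPY = "np"
    JAX = "jax"
```

Token counting only needs the tokenizer's vocabulary and this Enum for type hints, so `from torch import TensorType` (or any torch import) is unnecessary for that use case.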

What is tokenization_chatglm.py used for?

ZHANGYUXUAN-zR changed pull request status to merged
