Summary:

    from langchain_community.llms.ollama import Ollama
    from langchain_core.prompts import ChatPromptTemplate, PromptTemplate

    llm = Ollama(model="qwen:7b")
Summary:

    from transformers import AutoTokenizer, AutoModel

    modelPath = "/home/cmcc/server/model/chatglm3-6b"
    tokenizer = AutoTokenizer.from_pretrained(modelPath)