
Qwen2.5 Learning Notes

2025/5/25 2:43:39  Source: https://blog.csdn.net/m0_60857098/article/details/143831011

Environment Setup

pip install transformers

Remember to also upgrade typing_extensions:

pip install --upgrade typing_extensions

Install modelscope

modelscope/modelscope: ModelScope: bring the notion of Model-as-a-Service to life.

Download the code from this repository, upload it to the server, and unzip it.

Inference

Create a new file QWen2_5.py:

from modelscope import AutoModelForCausalLM, AutoTokenizer

model_name = "qwen/Qwen2.5-7B-Instruct"

# Load the model and tokenizer; device_map="auto" places the weights
# across the available GPU(s) and CPU automatically.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "Give me a short introduction to large language model."
messages = [
    {"role": "system", "content": "You are Qwen, created by Alibaba Cloud. You are a helpful assistant."},
    {"role": "user", "content": prompt}
]

# Render the chat messages into the model's prompt format.
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=512
)
# Strip the prompt tokens so only the newly generated reply is decoded.
generated_ids = [
    output_ids[len(input_ids):]
    for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
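For reference, apply_chat_template renders the messages into a ChatML-style prompt string before tokenization. The sketch below reproduces the assumed format by hand; the authoritative template ships with the tokenizer (in tokenizer_config.json), so verify against the downloaded files:

```python
# Sketch of the ChatML-style prompt that Qwen2.5's chat template is
# assumed to produce (hand-rolled here for illustration only).
def render_chatml(messages):
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages]
    parts.append("<|im_start|>assistant\n")  # mirrors add_generation_prompt=True
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are Qwen, created by Alibaba Cloud. You are a helpful assistant."},
    {"role": "user", "content": "Give me a short introduction to large language model."},
]
text = render_chatml(messages)
print(text)
```

The trailing `<|im_start|>assistant\n` is what cues the model to generate the assistant's reply rather than continuing the user turn.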

Run it:

A few more packages need to be installed first:

pip install accelerate

pip install jinja2
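Before kicking off the long model load, it can help to confirm that everything the script needs is importable. A quick sanity check (the package list below is this walkthrough's assumption):

```python
import importlib.util

# Packages the inference script depends on, per the steps above.
required = ["modelscope", "transformers", "accelerate", "jinja2", "typing_extensions"]

# importlib.util.find_spec returns None for packages that are not importable.
missing = [name for name in required if importlib.util.find_spec(name) is None]
print("missing packages:", missing or "none")
```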

If the model cannot be downloaded automatically, you can download it manually from:

https://huggingface.co/Qwen/Qwen2.5-7B-Instruct/tree/main
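After a manual download, point from_pretrained at the local directory instead of the hub name. The helper below checks that the usual files are in place; the path and the exact file names (especially the shard count) are assumptions here, so verify them against the repository's file listing:

```python
import os

# Hypothetical local path where the manually downloaded files were placed.
local_dir = "/root/models/Qwen2.5-7B-Instruct"

# Files typically listed on the model page (names assumed; check the repo):
expected_files = [
    "config.json",
    "generation_config.json",
    "tokenizer.json",
    "tokenizer_config.json",
    "model-00001-of-00004.safetensors",  # sharded weights; count may differ
]

def missing_files(directory, names=expected_files):
    """Return which expected files are absent from the local directory."""
    return [n for n in names if not os.path.exists(os.path.join(directory, n))]
```

With the files in place, `AutoModelForCausalLM.from_pretrained(local_dir, ...)` loads from disk without hitting the network.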

Then run it again:

python /root/modelscope-master/QWen2_5.py

The 2080 Ti is a bit slow; it takes about 6 minutes:
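Six minutes for a single reply is slow. A quick back-of-envelope gives the effective generation speed (assuming the wait is dominated by generation rather than model loading; a likely culprit is that the 2080 Ti's 11 GB of VRAM cannot hold the roughly 15 GB of fp16 weights for a 7B model, so device_map="auto" offloads layers to CPU):

```python
# Rough throughput estimate for the run above.
max_new_tokens = 512          # generation budget from the script
wall_seconds = 6 * 60         # ~6 minutes of wall-clock time
tokens_per_second = max_new_tokens / wall_seconds
print(f"~{tokens_per_second:.1f} tokens/s")  # ~1.4 tokens/s
```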

The output:

A large language model (LLM) is a type of artificial intelligence (AI) model designed to understand and generate human-like text based on the input it receives. These models are typically trained on vast amounts of text data from the internet, books, articles, and other sources, which allows them to learn patterns, semantics, and nuances in language.

Key characteristics of LLMs include:

1. **Scale**: They are usually very large, containing billions or even trillions of parameters, which allows them to capture complex relationships within text.
2. **Generative Capabilities**: LLMs can generate text, answer questions, translate languages, summarize texts, and perform various other natural language processing tasks.
3. **Context Understanding**: These models can maintain context over long sequences of text, allowing for more coherent and meaningful responses.
4. **Fine-Tuning**: Many LLMs can be fine-tuned on specific tasks or domains to improve their performance on particular generation or understanding tasks.

Popular examples of large language models include models like GPT-3, BERT, and T5, which have been used in various application scenarios, from customer service chatbots to creative writing assistance.
