
Coding Poineer


vLLM service inference parameters


stop: List of str. Generation stops when one of these strings is produced; the stop string itself is not included in the output.

stop_token_ids: List of int. Generation stops when one of these token ids is produced; the stop token is included in the output unless it is a special token (e.g. tokenizer.eos_token_id, such as `<|im_end|>` in Qwen-style chat models).

The final stopping decision is the union of the two conditions: generation halts as soon as either a stop string or a stop token id is hit.
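The union rule above can be sketched in plain Python. This is a minimal illustration of the stopping logic, not vLLM's actual implementation; the token id and stop string used in the usage example are hypothetical placeholders.

```python
def should_stop(generated_text: str, last_token_id: int,
                stop: list[str], stop_token_ids: list[int]) -> bool:
    """Return True when generation should halt.

    Mirrors the union semantics described above: EITHER a stop string
    appearing at the end of the generated text OR the last sampled token
    id matching a stop token id is sufficient to stop.
    """
    hits_stop_string = any(generated_text.endswith(s) for s in stop)
    hits_stop_token_id = last_token_id in stop_token_ids
    return hits_stop_string or hits_stop_token_id
```

For example, with a hypothetical id 151645 standing in for `<|im_end|>`, `should_stop("...<|im_end|>", 151645, stop=["<|im_end|>"], stop_token_ids=[151645])` returns True; either condition alone would also suffice. In vLLM these two lists are passed via `SamplingParams(stop=..., stop_token_ids=...)`.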

Reference:
https://docs.vllm.ai/en/latest/offline_inference/sampling_params.html

posted @ 2024-05-20 17:10  365/24/60