[Bug] CosyVoice2 vLLM usage #4369

@pingpingtcmsn

Description

MaxKB Version

2.0

Problem Description

CosyVoice2 can be deployed with vLLM, and in MaxKB V2.0 the model can also be loaded through the vLLM or OpenAI model providers. However, when the model is called from an application, MaxKB calls the /v1/audio/speech endpoint, and the vLLM-accelerated CosyVoice2 on the server does not seem to support it. How can this be resolved?
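
For reference, this is roughly the request that fails: a minimal sketch using the OpenAI Python client, assuming the vLLM server listens at http://localhost:8000 and serves the model under the name CosyVoice2 (both hypothetical; adjust to your deployment). Since the vLLM server here does not appear to expose /v1/audio/speech, the call is expected to fail with an HTTP error:

```python
# Minimal check of the endpoint MaxKB's TTS integration calls.
# Assumptions: vLLM server at http://localhost:8000, served model
# name "CosyVoice2", voice id "default" (all hypothetical).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

try:
    response = client.audio.speech.create(
        model="CosyVoice2",   # hypothetical served model name
        voice="default",      # hypothetical voice id
        input="Hello from MaxKB",
    )
    response.write_to_file("out.wav")
except Exception as exc:
    # Expected here: the server does not implement /v1/audio/speech.
    print(f"/v1/audio/speech not available: {exc}")
```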

Steps to Reproduce

After deploying the CosyVoice2 model with vLLM, how can an application on the MaxKB platform use this model? A possible workaround is sketched below.
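
One workaround is to run a small adapter in front of the CosyVoice2 deployment that exposes the /v1/audio/speech route MaxKB expects and translates each request into whatever interface the backend actually provides. This is only a sketch under assumptions: the backend URL, route (/inference_sft), and form fields (tts_text, spk_id) are hypothetical and modeled on the FastAPI runtime shipped with the CosyVoice repo; your deployment's request shape may differ.

```python
# Adapter sketch: expose the OpenAI-style /v1/audio/speech route that
# MaxKB calls, and forward each request to the CosyVoice2 service.
# Assumptions (adjust to your deployment): the backend listens at
# COSYVOICE_URL and accepts a POST with `tts_text` and `spk_id` fields.
import httpx
from fastapi import FastAPI
from fastapi.responses import Response
from pydantic import BaseModel

COSYVOICE_URL = "http://localhost:50000/inference_sft"  # hypothetical backend

app = FastAPI()


class SpeechRequest(BaseModel):
    # Fields MaxKB sends in the OpenAI-style speech request.
    model: str
    input: str
    voice: str = "default"
    response_format: str = "wav"


@app.post("/v1/audio/speech")
async def speech(req: SpeechRequest) -> Response:
    # Translate the OpenAI-style body into the backend's form fields.
    async with httpx.AsyncClient(timeout=120) as client:
        backend = await client.post(
            COSYVOICE_URL,
            data={"tts_text": req.input, "spk_id": req.voice},
        )
        backend.raise_for_status()
    # Return raw audio bytes so MaxKB can play them directly.
    return Response(content=backend.content, media_type="audio/wav")
```

With something like this running (e.g. `uvicorn adapter:app --port 9000`), the MaxKB TTS model's base URL would point at the adapter instead of the vLLM server.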

The expected correct result

No response

Related log output

No response

Additional Information

No response
