feat: add json_object support in response_format #4080
Merged
Motivation
The OpenAI-compatible `response_format={"type": "json_object"}` is still part of the official spec and is explicitly requested by users. lmdeploy currently rejects this value with `ValueError: unsupported format type: json_object`. We need to accept it and force the model to emit any valid JSON object without requiring the caller to supply a schema.
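For reference, this is the request shape in question, a minimal sketch following the OpenAI chat-completions payload format (the model name is illustrative):

```python
import json

# Payload a client would POST to /v1/chat/completions on an
# OpenAI-compatible server such as lmdeploy's api_server.
payload = {
    "model": "internlm2",  # illustrative model name
    "messages": [
        {"role": "user", "content": "List three colors as a JSON object."}
    ],
    # The value this PR adds support for: no schema required; the model
    # is only constrained to emit a syntactically valid JSON object.
    "response_format": {"type": "json_object"},
}

# The payload itself must serialize cleanly to JSON.
body = json.dumps(payload)
print(body)
```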
Modification
- **GuidedDecodingManager**: treat `json_object` as a built-in schema: `'{"type": "object", "additionalProperties": true}'`.
- **TurboMind backend**: add a `json_object` branch when extracting the schema from `response_format`.
- **OpenAI-API server**: accept `json_object` as a valid `response_format` type.
Tests
- Added tests covering `json_object`, `json_schema`, `regex_schema`, and the un-guided case.
- Validate `json_object` responses with the same permissive schema used in the engine.

With this patch, clients can use `response_format={"type": "json_object"}` and receive guaranteed valid JSON objects.
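Against the permissive schema above, validating a `json_object` completion reduces to checking that the text parses as JSON with an object at the top level. A stdlib-only sketch of such a check (the helper name is illustrative):

```python
import json


def is_valid_json_object(text: str) -> bool:
    """Check a completion against the json_object guarantee: the text
    must parse as JSON and the top-level value must be an object."""
    try:
        value = json.loads(text)
    except json.JSONDecodeError:
        return False
    return isinstance(value, dict)


# Example completions a guided model might emit.
print(is_valid_json_object('{"colors": ["red", "green", "blue"]}'))  # True
print(is_valid_json_object('[1, 2, 3]'))   # a JSON array, not an object
print(is_valid_json_object('not json'))    # not JSON at all
```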