
Add first-class ChatGPT subscription provider support#881

Open
Spherrrical wants to merge 3 commits into main from musa/chatgpt-subscription

Conversation

@Spherrrical
Collaborator

Adds support for routing LLM traffic through a ChatGPT Plus/Pro subscription using OpenAI's device-code OAuth flow (same as the Codex CLI). Users authenticate once via planoai chatgpt login and tokens are auto-refreshed from ~/.plano/chatgpt/auth.json. The chatgpt provider is wired end-to-end: CLI commands, config auto-injection, schema updates, Rust ProviderId/LlmProviderType variants, request normalization for the Codex backend, and custom header forwarding in the WASM gateway. A self-contained demo with config.yaml, chat.py, and a curl test script is included under demos/llm_routing/chatgpt_subscription/.
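For readers wanting to try this, a minimal sketch of what the provider block in config.yaml might look like — the key names below are assumptions based on the PR description, not the exact schema (the PR also auto-injects config, so a manual block may not even be required; see the demo under demos/llm_routing/chatgpt_subscription/ for the real file):

```yaml
# Hypothetical sketch: exact keys may differ from the PR's schema.
llm_providers:
  - name: chatgpt
    provider: chatgpt          # new LlmProviderType variant added by this PR
    # No api_key here: tokens come from ~/.plano/chatgpt/auth.json,
    # populated by `planoai chatgpt login` and auto-refreshed.
    models:
      - gpt-5.3-codex
```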

@Stoff81

Stoff81 commented Apr 12, 2026

Testing out now

@Stoff81

Stoff81 commented Apr 12, 2026

Can confirm this is working for me! Lovely stuff

Contributor

@adilhafeez left a comment


The refresh token flow is not automatic. The way it works right now is that it 1. loads the token from disk, 2. refreshes the token if needed, and then 3. injects the token. The issue is that if the token expires while plano is running there is no way to refresh it, and we'll have to restart the service to refresh the token. This doesn't seem right.
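One way to address this — a minimal Python sketch of the idea, not the PR's Rust code; the field names `access_token`, `refresh_token`, and `expires_at` are assumptions about auth.json's shape — is to check expiry on every request and refresh in-process before injecting, rather than only at startup:

```python
import time

class TokenStore:
    """Lazily refreshes an OAuth token before each use.

    Hypothetical auth.json shape; the real file's fields may differ.
    """

    def __init__(self, auth, refresh_fn, skew=60):
        self.auth = auth              # dict: access_token, refresh_token, expires_at
        self.refresh_fn = refresh_fn  # callable hitting the OAuth token endpoint
        self.skew = skew              # refresh this many seconds before expiry

    def bearer(self):
        # Refresh on every request if needed, not just at process start.
        if time.time() >= self.auth["expires_at"] - self.skew:
            self.auth = self.refresh_fn(self.auth["refresh_token"])
        return f"Bearer {self.auth['access_token']}"
```

With this shape, a long-running plano process never needs a restart: the first request after expiry triggers the refresh transparently.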

Comment on lines +100 to +114
const CHATGPT_BASE_INSTRUCTIONS: &str =
"You are Codex, based on GPT-5. You are running as a coding agent in the Codex CLI on a user's computer.";
match &req.instructions {
Some(existing) if existing.contains(CHATGPT_BASE_INSTRUCTIONS) => {}
Some(existing) => {
req.instructions =
Some(format!("{}\n\n{}", CHATGPT_BASE_INSTRUCTIONS, existing));
}
None => {
req.instructions = Some(CHATGPT_BASE_INSTRUCTIONS.to_string());
}
}
req.store = Some(false);
req.stream = Some(true);

Contributor


Why are we hard-coding the system prompt? And what if the user has set stream=false?

Collaborator Author


The backend will not work without this prompt, and stream=false does not work for this provider.
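For readers following along, the Rust normalization quoted above boils down to the following — a Python sketch of the same logic, not code from the PR:

```python
CHATGPT_BASE_INSTRUCTIONS = (
    "You are Codex, based on GPT-5. You are running as a coding agent "
    "in the Codex CLI on a user's computer."
)

def normalize(req: dict) -> dict:
    # Prepend the base prompt unless it is already present.
    existing = req.get("instructions")
    if existing is None:
        req["instructions"] = CHATGPT_BASE_INSTRUCTIONS
    elif CHATGPT_BASE_INSTRUCTIONS not in existing:
        req["instructions"] = f"{CHATGPT_BASE_INSTRUCTIONS}\n\n{existing}"
    # The Codex backend requires streaming and rejects stored requests,
    # so these are forced regardless of what the caller sent.
    req["store"] = False
    req["stream"] = True
    return req
```

Note that user-supplied instructions are preserved, just appended after the mandatory base prompt; only `store` and `stream` are overwritten outright.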

Comment on lines +35 to +37
CHATGPT_DEFAULT_ORIGINATOR = "codex_cli_rs"
CHATGPT_DEFAULT_USER_AGENT = "codex_cli_rs/0.0.0 (Unknown 0; unknown) unknown"

Contributor


Why hardcode the user agent and originator? This will overwrite the user's user_agent if it was set.

Collaborator Author


The provider will not work without these headers.

@Stoff81

Stoff81 commented Apr 14, 2026

I have found an issue with using OpenClaw and this branch, when posting with tools as OpenClaw does:

curl -sS http://localhost:12000/v1/responses \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.3-codex",
    "input": [
      {
        "type": "message",
        "role": "user",
        "content": [
          { "type": "input_text", "text": "What is the capital of France?" }
        ]
      }
    ],
    "stream": false,
    "tools": [
      {
        "type": "function",
        "name": "get_time",
        "description": "Get the current time",
        "parameters": { "type": "object", "properties": {}, "additionalProperties": false }
      }
    ],
    "tool_choice": "auto"
  }'

I am getting Failed to parse request: missing field name at line 1 column 330

I have created a PR to address this here: #884

@Stoff81

Stoff81 commented Apr 14, 2026

I found an issue with using OpenClaw and this branch, when posting:

{
  "model": "gpt-5.3-codex",
  "input": [
    {
      "role": "user",
      "content": [
        { "type": "input_text", "text": "what model are you on" }
      ]
    }
  ],
  "include": ["reasoning.encrypted_content"],
  "store": false,
  "instructions": "You are Codex, based on GPT-5. You are running as a coding agent in the Codex CLI on a user's computer.",
  "stream": true,
  "max_output_tokens": 8192
}

We were getting Unsupported Field: max_output_tokens

I have created a PR to address this here: #884

@Stoff81

Stoff81 commented Apr 14, 2026

I found an issue with using OpenClaw and this branch, when posting:

{
  "model": "gpt-5.3-codex",
  "input": [
    {
      "role": "user",
      "content": [
        { "type": "input_text", "text": "what model are you on" }
      ]
    }
  ],
  "include": ["reasoning.encrypted_content"],
  "store": false,
  "instructions": "You are Codex, based on GPT-5. You are running as a coding agent in the Codex CLI on a user's computer.",
  "stream": true,
  "max_output_tokens": 8192
}

We were getting

{
  "error": {
    "message": "Invalid value: 'input_text'. Supported values are: 'output_text' and 'refusal'.",
    "type": "invalid_request_error",
    "param": "input[12].content[0]",
    "code": "invalid_value"
  }
}

I have created a PR to address this here: #884
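Both failures reported above look like request-normalization gaps. As an illustration only — I'm not claiming this is how #884 fixes it, and the `UNSUPPORTED_FIELDS` set is an assumption — a gateway could sanitize requests before forwarding by dropping fields the Codex backend rejects and remapping content types by role, since assistant-history items must carry output_text rather than input_text:

```python
# Assumption: fields the Codex backend rejects outright.
UNSUPPORTED_FIELDS = {"max_output_tokens"}

def sanitize(req: dict) -> dict:
    # Drop top-level fields the backend refuses to accept.
    req = {k: v for k, v in req.items() if k not in UNSUPPORTED_FIELDS}
    # Assistant-history items must use "output_text", not "input_text".
    for item in req.get("input", []):
        if item.get("role") == "assistant":
            for part in item.get("content", []):
                if part.get("type") == "input_text":
                    part["type"] = "output_text"
    return req
```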



3 participants