
Conversation

@dkundel-openai (Collaborator) commented Sep 30, 2025

  1. Adds the option to use a local Jupyter kernel as the source for the python tool.
  2. Increases browser reliability by handling the case where the model accidentally emits a call to functions.browser.search instead of browser.search, as long as no conflicting function is defined.
  3. Fixes additional Responses API bugs, such as correctly assigning message IDs and not sending python calls as reasoning_text.
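The fallback in point 2 could be sketched roughly as follows. This is a minimal illustration, not the PR's actual implementation: all names (resolve_tool_name, defined_functions, the built-in tool set) are hypothetical.

```python
def resolve_tool_name(name: str, defined_functions: set[str]) -> str:
    """Reroute an accidental 'functions.browser.search' call back to
    'browser.search', unless the developer defined a conflicting function
    with that exact name. Hypothetical sketch of the PR's point 2."""
    prefix = "functions."
    if name.startswith(prefix) and name not in defined_functions:
        stripped = name[len(prefix):]
        # Only reroute when the stripped name targets a known built-in tool.
        if stripped.split(".")[0] in {"browser", "python"}:
            return stripped
    return name
```

Calls that do not match a built-in tool, or that collide with a user-defined function, pass through unchanged.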

@dkundel-openai (Collaborator, Author)

@codex review

@chatgpt-codex-connector (bot) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Comment on lines +269 to +275
self._jupyter_session: LocalJupyterSession | None = None
self._execution_lock: asyncio.Lock | None = None

if self._execution_backend == "dangerously_use_local_jupyter":
    self._execution_lock = asyncio.Lock()
    self._jupyter_session = LocalJupyterSession(
        connection_file=self._local_jupyter_connection_file,


P1: Instantiate local Jupyter lock without running event loop

When the Python tool is configured for the new dangerously_use_local_jupyter backend, the constructor immediately executes self._execution_lock = asyncio.Lock(). asyncio.Lock calls get_running_loop, so constructing PythonTool() in synchronous code (e.g. in gpt_oss/chat.py or gpt-oss-mcp-server/python_server.py) now raises RuntimeError: no running event loop whenever PYTHON_EXECUTION_BACKEND is set to this backend. That means the CLI and MCP server crash before they even start when users enable the stateful Jupyter option. The lock should be created lazily inside an async context or via a loop-safe factory so the tool can still be instantiated outside a running loop.

Useful? React with 👍 / 👎.
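The lazy-initialization fix the review asks for could look like the sketch below. This is an illustrative outline, not the merged code: the class shape, attribute names, and backend string are taken from the excerpt above, while _get_execution_lock is a hypothetical helper.

```python
import asyncio


class PythonTool:
    """Sketch: defer asyncio.Lock creation so PythonTool() can be
    constructed in synchronous code with no running event loop."""

    def __init__(self, execution_backend: str = "docker") -> None:
        self._execution_backend = execution_backend
        # Do NOT create the lock here; constructing it eagerly is what
        # the review flags as unsafe outside a running loop.
        self._execution_lock: asyncio.Lock | None = None

    def _get_execution_lock(self) -> asyncio.Lock:
        # Called from async code, so a loop is guaranteed to be running
        # by the time the lock is actually created.
        if self._execution_lock is None:
            self._execution_lock = asyncio.Lock()
        return self._execution_lock
```

With this shape, instantiating the tool in gpt_oss/chat.py or the MCP server stays synchronous-safe, and the lock only materializes on first use inside a coroutine.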

Collaborator Author


@codex can you fix it?


Note

To use Codex here, create an environment for this repo.

Collaborator Author


@codex please try again


Note

To use Codex here, create an environment for this repo.

@dkundel-openai (Collaborator, Author)

[deleted comment that was just "Yes 🚀" that might mess with the codex flow]

@dkundel-openai merged commit 0a9ec7f into main on Oct 6, 2025.