Draft: stdout streaming #5005
base: main
Conversation
All contributors have signed the CLA ✍️ ✅
💡 Codex Review
Here are some automated review suggestions for this pull request.
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review"

If Codex has suggestions, it will comment; otherwise it will react with 👍.
```diff
         self.defer_or_handle(|q| q.push_exec_begin(ev), |s| s.handle_exec_begin_now(ev2));
     }

-    fn on_exec_command_output_delta(
-        &mut self,
-        _ev: codex_core::protocol::ExecCommandOutputDeltaEvent,
-    ) {
-        // TODO: Handle streaming exec output if/when implemented
+    fn on_exec_command_output_delta(&mut self, ev: ExecCommandOutputDeltaEvent) {
+        let ExecCommandOutputDeltaEvent {
+            call_id,
+            stream,
+            chunk,
+        } = ev;
+
+        if let Some(cell) = self
+            .active_cell
+            .as_mut()
+            .and_then(|c| c.as_any_mut().downcast_mut::<ExecCell>())
+        {
+            let preview = cell.append_live_chunk(&call_id, stream, &chunk);
+            let has_live = !cell.live_lines_for_display().is_empty();
+            if let Some(text) = preview.or_else(|| cell.latest_live_preview()) {
+                self.bottom_pane.set_status_live_line(Some(text));
+            }
+            if has_live {
+                self.bottom_pane.set_live_output_toggle_enabled(true);
+            }
+            self.bottom_pane
+                .set_live_output_visible(cell.show_live_output());
+            self.request_redraw();
+            return;
+        }
+
+        if !chunk.is_empty() {
+            let fallback = String::from_utf8_lossy(&chunk).to_string();
+            if !fallback.is_empty() {
+                self.bottom_pane
+                    .set_status_live_line(Some(fallback.trim_end_matches('\n').to_string()));
```
**Queue exec output deltas to preserve ordering**

Exec output streaming is handled immediately in `on_exec_command_output_delta`, while begin and end events still use `defer_or_handle` and the interrupt queue. When the UI is already processing another stream (e.g., reasoning text) the begin event is queued, but output deltas are processed right away and the active cell has not been created yet, so the fallback branch runs and the live chunk is discarded. Users will miss the first part of command output and the toggle hint never appears. Output delta events should be queued alongside `ExecCommandBeginEvent`/`ExecCommandEndEvent` to ensure they are processed after the corresponding begin event.
I have read the CLA Document and I hereby sign the CLA.
Force-pushed from 529b271 to 87bd85b.
Fired off a rebase. Following up on the bot's review note in a bit.

Converted to a draft in GitHub, forgot GitHub doesn't respect
I have read the CLA Document and I hereby sign the CLA.
Force-pushed from 87bd85b to a58165a.
Fixes: #4751 #3675 #4550

This implements toggleable stdout/stderr streaming for live commands in `codex-tui`.

**Contribution guidelines**

I know, I just don't care. I'm posting this in case anyone else wants it. I'm happy to sign a CLA so you can reuse what I did to get this feature fully integrated faster.

I did run `cargo test && cargo clippy --tests && cargo fmt -- --config imports_granularity=Item` and amended a test so that it would pass. You may want to be able to test both a preview and a non-preview mode of this functionality, but I chose to keep it simple for now.

This is marked `Draft:` because it is not yet in a mergeable state. I'm putting this up in case it saves openai time or in case anyone else wants the feature as I did. I will be using this fork of the TUI client until you integrate this functionality.

**Screenshots:**

**Made with codex, transcript attached**

I used codex to write 100% of this. I'm attaching a transcript from my interactions with `codex-tui` below so that you can evaluate: codex-stdout-streaming-transcript.txt

**Request changes if you'd like any!**

I'm happy to make any changes you'd like in order to get this merged. I'm including all of the context from my work with Codex on this MR for the sake of transparency, including the markdown documents of the GitHub issues and the markdown plan I asked `gpt5-codex-high` to formulate for this effort. I also haven't squashed the history yet, for the sake of transparency. I can do that at your request.

**Caveats discovered so far**

The main behavioral caveat I've noticed in the current implementation is that even when the command's output is folded, updates to the folded one-line preview seemed to scroll the bottom view of the TUI to the top third of my terminal. I tested with `ghostty` on Ubuntu 25.04; I still need to test with `gnome-terminal` and with `Terminal.app` on macOS to see if it repros there. It wasn't fatally annoying, but I wanted to be careful and note the possibly aberrant behavior. One straightforward way to solve this would be to eliminate the one-line preview in the folded view, but it makes me happy so I'd rather find a way to make it work.

I haven't tested whether the one-line preview or the unfolded view does anything intelligent with whitespace-only output. I wasn't sure what behavior would make the most sense here, so I chose to leave it be; you'd know better than I what's best. It's possible that you'd want to debounce the one-line preview so it only updates after the next non-100%-whitespace line, or show a whitespace count after the last non-whitespace line.
**Skipped spooling**

I think the complexity and SSD writes aren't worth it, so I didn't bother looking into spooling the stream to disk.