Process Tools for Interactive CLI commands #2483
base: dev
Conversation
Just sharing my own experience with interacting with TUIs/interactive CLIs: I built a similar tool for my own use. My observation was that it is most productive when it can chain multiple functions, since LLMs already know how to use widespread libraries. The disadvantage is that this can get harder to review/follow. I also added a little CLI tool that can stream out logs. What does this mean for this PR? Executing Python or similar might indeed be too powerful, but an API that is close to the original expect, or similar to pexpect, while also providing a PTY abstraction to emulate a terminal, might be desirable.
@Mic92 I don't understand your question. This PR changes the bash tool's failure timeout into a timeout that leaves the process running in the background and lets the agent continue to use and interact with that process. Can you describe how your tool is relevant to or different from this PR? It looks like it tries to emulate a PTY for the agent, which is great! I think giving the agent the ability to use the terminal interactively will become more useful to users over time, even if they don't use it directly. As users shift toward more agentic development (less human-in-the-loop), this will be increasingly valuable.
Some problems noticed with this PR's approach:
I think making OC better at multiplexing could really alleviate the issues observed.
My comment was regarding the process_interact tool. It currently sends text to a process unconditionally. Usually this requires some timing, i.e. one has to wait for certain output before sending input. This is what expect was developed for: https://linux.die.net/man/1/expect. Of course this might be out of scope for this PR, but it could be a future direction to enable opencode to drive debuggers and other TUIs.
Right now the bash tool returns output after the timeout. Then the agent can send more input and call another tool to read the output. But having process_interact send output back (with a timeout) would make sense.
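As a rough illustration of the expect-style flow discussed above (wait for specific output, then send input), here is a minimal Node/TypeScript sketch; the `waitFor` helper is hypothetical and not part of this PR or of opencode:

```typescript
import { spawn } from "node:child_process";
import type { Readable } from "node:stream";

// Resolve once the stream's accumulated output matches `pattern`,
// or reject after `timeoutMs` -- the core idea behind expect/pexpect.
function waitFor(stream: Readable, pattern: RegExp, timeoutMs: number): Promise<string> {
  return new Promise((resolve, reject) => {
    let buffer = "";
    const onData = (chunk: Buffer) => {
      buffer += chunk.toString();
      if (pattern.test(buffer)) {
        clearTimeout(timer);
        stream.off("data", onData);
        resolve(buffer);
      }
    };
    const timer = setTimeout(() => {
      stream.off("data", onData);
      reject(new Error(`timed out waiting for ${pattern}`));
    }, timeoutMs);
    stream.on("data", onData);
  });
}

// Example: drive a Python REPL by waiting for its prompt before sending input.
const repl = spawn("python3", ["-i"]);
await waitFor(repl.stderr, />>> /, 5_000); // the interactive prompt is printed to stderr
repl.stdin.write("1 + 1\n");
console.log((await waitFor(repl.stdout, /\d+/, 5_000)).trim());
repl.stdin.write("exit()\n");
```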
Summary
This PR introduces background process handling capabilities to the bash tool, allowing OpenCode to manage interactive and long-running processes without blocking the conversation flow. Processes that produce no output within a configurable timeout are automatically backgrounded and can be interacted with using a new suite of process management tools.
Problem Solved
Previously, running interactive commands (like `python` or `node` REPLs) or long-running processes (like `npm run dev`, development servers) would block the bash tool until a timeout occurred. This made it impossible for the AI to effectively work with development workflows that involve servers, watchers, or interactive tools.

Implementation Details
Enhanced Bash Tool
- `outputTimeout` parameter (default: 3 seconds) to detect non-responsive processes (see the sketch below)
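A minimal sketch of how this kind of output-silence detection could be wired up, assuming Node's `child_process`; the `BackgroundProcess` shape, `backgroundProcesses` map, and return values are illustrative assumptions, not the PR's actual code:

```typescript
import { spawn, type ChildProcessWithoutNullStreams } from "node:child_process";
import { randomUUID } from "node:crypto";

interface BackgroundProcess {
  child: ChildProcessWithoutNullStreams;
  output: string[]; // captured stdout/stderr chunks (keeps growing after backgrounding)
  startedAt: number;
}

// Illustrative registry of backgrounded processes, keyed by a generated id.
const backgroundProcesses = new Map<string, BackgroundProcess>();

// Run a command; if it neither exits nor produces output for `outputTimeoutMs`,
// leave it running in the background and return whatever was captured so far.
export async function runWithOutputTimeout(command: string, outputTimeoutMs = 3_000) {
  const child = spawn("bash", ["-c", command]);
  const proc: BackgroundProcess = { child, output: [], startedAt: Date.now() };

  let timer: ReturnType<typeof setTimeout> | undefined;
  const silence = new Promise<"silent">((resolve) => {
    const arm = () => {
      if (timer) clearTimeout(timer);
      timer = setTimeout(() => resolve("silent"), outputTimeoutMs);
    };
    const onData = (chunk: Buffer) => {
      proc.output.push(chunk.toString());
      arm(); // any output resets the silence timer
    };
    arm();
    child.stdout.on("data", onData);
    child.stderr.on("data", onData);
  });
  const exited = new Promise<number | null>((resolve) => child.once("exit", resolve));

  const result = await Promise.race([exited, silence]);
  if (timer) clearTimeout(timer);

  if (result === "silent") {
    const id = randomUUID();
    backgroundProcesses.set(id, proc);
    return { backgrounded: true, id, output: proc.output.join("") };
  }
  return { backgrounded: false, exitCode: result, output: proc.output.join("") };
}
```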
New Process Management Tools
- `process_list` - View all background processes with runtime and memory stats
- `process_stream` - Read output with stream-like semantics (read/peek/tail/reset)
- `process_interact` - Send input or signals to background processes
- `process_trim` - Manage memory by trimming output buffers

Memory Management
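As an illustration of the output-buffer trimming idea behind `process_trim`, a small sketch that caps captured output at a byte budget (the function name and the 256 KB limit are assumptions for the example, not values from this PR):

```typescript
// Keep at most `maxBytes` of captured output for one process, dropping the
// oldest chunks first so the most recent output stays available to the agent.
function trimOutput(output: string[], maxBytes = 256 * 1024): number {
  let total = output.reduce((sum, chunk) => sum + Buffer.byteLength(chunk), 0);
  while (total > maxBytes && output.length > 1) {
    total -= Buffer.byteLength(output.shift()!);
  }
  return total; // bytes still buffered after trimming
}
```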
Technical Approach
- `Promise.race()` for non-blocking timeout detection
- A `Map` tracks all running processes (registry sketched below)
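Continuing the illustrative `backgroundProcesses` registry from the earlier sketch, listing backgrounded processes might look roughly like this; reporting the buffered output size as the "memory" stat is an assumption for the example:

```typescript
// Report id, pid, command, runtime, buffered output size, and liveness
// for every process in the illustrative `backgroundProcesses` map.
function listProcesses() {
  return Array.from(backgroundProcesses.entries()).map(([id, proc]) => ({
    id,
    pid: proc.child.pid,
    command: proc.child.spawnargs.join(" "),
    runtimeMs: Date.now() - proc.startedAt,
    bufferedBytes: proc.output.reduce((sum, chunk) => sum + Buffer.byteLength(chunk), 0),
    running: proc.child.exitCode === null && proc.child.signalCode === null,
  }));
}
```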
Use Cases
- Development servers: `npm run dev`, `python -m http.server`
- Interactive REPLs: `python`, `node`, `psql`
- Build watchers: `webpack --watch`, `tsc --watch`
Testing
Includes a test file demonstrating process lifecycle management and output capture.
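A hedged sketch of what such a lifecycle test could look like, assuming Bun's built-in test runner and the illustrative `runWithOutputTimeout` helper from the sketch above (the module path and helper are hypothetical, not the PR's actual test file):

```typescript
import { describe, expect, test } from "bun:test";
// Hypothetical module exporting the helper sketched under "Enhanced Bash Tool".
import { runWithOutputTimeout } from "./background-process";

describe("background process lifecycle", () => {
  test("a quiet long-running command is backgrounded after the output timeout", async () => {
    const result = await runWithOutputTimeout("sleep 10", 500); // no output for 500 ms
    expect(result.backgrounded).toBe(true);
  });

  test("a fast command returns its output directly", async () => {
    const result = await runWithOutputTimeout("echo hello", 500);
    expect(result.backgrounded).toBe(false);
    expect(result.output).toContain("hello");
  });
});
```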
Breaking Changes
None. The feature is fully backward compatible - existing bash commands work exactly as before, with the added benefit of automatic backgrounding when appropriate.