chore(main): release 4.0.0 #1191
Pull Request Overview
This is an automated release PR generated by Release Please to bump the version from 3.12.2 to 4.0.0. The release includes significant breaking changes from a major refactor of the context API, which switches from callback-based input handling to schema-based definitions and function calling.
Key changes:
- Major version bump to 4.0.0 indicating breaking changes
- Addition of new changelog entry documenting breaking changes, features, and bug fixes
- Context API refactor with multiple breaking changes including config restructuring and removal of deprecated features
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| version.txt | Updates version number from 3.12.2 to 4.0.0 |
| CHANGELOG.md | Adds comprehensive changelog entry for v4.0.0 with breaking changes, features, and bug fixes |
Function Calling Support and Context Rework ("Agent" Mode)

We've reworked contexts into functions. Each function can have a schema and a URI, and can be provided to the LLM as a tool. In summary:
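The "contexts become functions" rework above can be sketched as configuration. This is a minimal, illustrative sketch only: the `functions` table name and the `schema`/`uri`/`resolve` fields are assumptions inferred from this PR's summary, not a verified API.

```lua
-- Hypothetical sketch of a schema-based function definition
-- (table and field names are assumptions, not the documented API).
require("CopilotChat").setup({
  functions = {
    file = {
      description = "Reads a file from the workspace.",
      uri = "file://{path}", -- URI template identifying the resource
      schema = { -- JSON-schema-style input definition exposed to the LLM
        type = "object",
        required = { "path" },
        properties = {
          path = { type = "string", description = "Path to the file" },
        },
      },
      resolve = function(input)
        -- Return the resource requested by the confirmed tool call.
        local f = assert(io.open(input.path, "r"))
        local data = f:read("*a")
        f:close()
        return {
          { uri = "file://" .. input.path, data = data, mimetype = "text/plain" },
        }
      end,
    },
  },
})
```

Under this sketch, the schema is what lets the LLM fill in arguments itself instead of the old callback-based input handling.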
Philosophy: Transparency and Control

A key goal of this redesign is to maximize transparency and user control. All tool and resource sharing is explicit: nothing is sent or shared in the background without your knowledge. You always see what is being provided to the LLM and can choose what to include or remove at every step. This approach may be more manual, but it ensures you remain in control of your data and workflow.

Workflow Example

Suppose you open chat and ask a question. The LLM selects the appropriate tool, fills in its arguments, and asks you to confirm the call. The tool call is added to the prompt; you can remove it if you don't want to send it. After confirmation, the response is a URI, and the resource sticks around in the prompt.

You can continue asking about other things, like the README. The same process applies: tool call, confirmation, response, and sticky resource.

mcphub.nvim Integration

Read the integration post by @ravitemer here. The plugin interface is now closer to MCP, making integration with projects like mcphub.nvim seamless (with some deviations for simplicity and security). Thanks to mcphub.nvim, integrating MCP servers (e.g., the GitHub MCP server) is easy.

Other Notable Changes
🤖 Created releases: 🌻
🤖 I have created a release beep boop
4.0.0 (2025-08-02)
⚠ BREAKING CHANGES
`providers.github_models.disabled = false`
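The breaking-changes entry above suggests the `github_models` provider must now be opted into explicitly. A minimal sketch of re-enabling it, assuming the standard `setup()` entry point (the option path comes from the entry above; the default-disabled behavior is an assumption):

```lua
-- Illustrative: opt back into the github_models provider after the 4.0.0 bump.
require("CopilotChat").setup({
  providers = {
    github_models = {
      disabled = false, -- assumed to default to true as of 4.0.0
    },
  },
})
```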
Features
Bug Fixes
This PR was generated with Release Please. See documentation.