
Commit f5af2ef

Fix: Correct max_output location in YAML profile templates
The YAML profile templates incorrectly show max_output under llm (llm.max_output), but max_output belongs on Interpreter, not on the Llm class.

History:
- 20b8230: max_output was added as top-level in config.yaml
- 2400eab: Template changed to llm.max_output and parsing code removed (bug introduced)
- 861341c: Changed to nested YAML format but still under llm: (still wrong)

Evidence:
- max_output was originally top-level in 20b8230
- Llm class has no max_output attribute (unlike context_window, max_tokens)
- Code uses interpreter.max_output (core.py:420)
- Migration mapping doesn't include max_output (unlike other LLM settings)

The templates have been wrong since 2400eab, causing profiles with llm.max_output to be silently ignored. Users saw the default (2800) instead of their configured value.

Changes:
- Move max_output from the llm: section to top-level in all YAML templates (default.yaml, fast.yaml, snowpark.yml)
- Fix the template comment in profiles.py used for new profile generation

This restores the original design where max_output is a top-level Interpreter setting, matching the code structure.
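As a quick illustration of the corrected placement (a sketch: the model line is a placeholder for context, only the max_output placement is taken from this commit), a profile should set max_output at the top level rather than under llm:

```yaml
# Wrong (pre-fix templates): nested under llm, silently ignored
llm:
  model: gpt-4o-mini   # placeholder model name, not part of this commit
  # max_output: 2500   # ignored: the Llm class has no max_output attribute

# Correct (this commit): top-level Interpreter setting
max_output: 2500       # The maximum characters of code output visible to the LLM
```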
1 parent 0d19f3a commit f5af2ef

File tree

4 files changed: +13 −9 lines changed


interpreter/terminal_interface/profiles/defaults/default.yaml

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,6 @@ llm:
   # api_key: ... # Your API key, if the API requires it
   # api_base: ... # The URL where an OpenAI-compatible server is running to handle LLM API requests
   # api_version: ... # The version of the API (this is primarily for Azure)
-  # max_output: 2500 # The maximum characters of code output visible to the LLM
 
 # Computer Settings
 computer:
@@ -23,6 +22,7 @@ computer:
 # safe_mode: "off" # The safety mode for the LLM — one of "off", "ask", "auto"
 # offline: False # If True, will disable some online features like checking for updates
 # verbose: False # If True, will print detailed logs
+# max_output: 2500 # The maximum characters of code output visible to the LLM
 
 # To use a separate model for the `wtf` command:
 # wtf:

interpreter/terminal_interface/profiles/defaults/fast.yaml

Lines changed: 3 additions & 1 deletion
@@ -8,18 +8,20 @@ llm:
   # api_key: ... # Your API key, if the API requires it
   # api_base: ... # The URL where an OpenAI-compatible server is running to handle LLM API requests
   # api_version: ... # The version of the API (this is primarily for Azure)
-  # max_output: 2500 # The maximum characters of code output visible to the LLM
 
 # Computer Settings
 computer:
   import_computer_api: True # Gives OI a helpful Computer API designed for code interpreting language models
 
 custom_instructions: "The user has set you to FAST mode. **No talk, just code.** Be as brief as possible. No comments, no unnecessary messages. Assume as much as possible, rarely ask the user for clarification. Once the task has been completed, say 'The task is done.'" # This will be appended to the system message
+
+# General Configuration
 # auto_run: False # If True, code will run without asking for confirmation
 # safe_mode: "off" # The safety mode for the LLM — one of "off", "ask", "auto"
 # offline: False # If True, will disable some online features like checking for updates
 # verbose: False # If True, will print detailed logs
 # multi_line: False # If True, you can input multiple lines starting and ending with ```
+# max_output: 2500 # The maximum characters of code output visible to the LLM
 
 # All options: https://docs.openinterpreter.com/settings

interpreter/terminal_interface/profiles/defaults/snowpark.yml

Lines changed: 6 additions & 6 deletions
@@ -9,7 +9,6 @@ llm:
   # api_key: ... # Your API key, if the API requires it
   # api_base: ... # The URL where an OpenAI-compatible server is running to handle LLM API requests
   # api_version: ... # The version of the API (this is primarily for Azure)
-  # max_output: 2500 # The maximum characters of code output visible to the LLM
 
 # Computer Settings
 computer:
@@ -34,7 +33,7 @@ If this doesnt work, you may need to run the following commands to install snowp
 !pip install "snowflake-connector-python[pandas]"
 ```
 
-Then, you can create a dictionary with the necessary connection parameters and create a session. You will access these values from the 
+Then, you can create a dictionary with the necessary connection parameters and create a session. You will access these values from the
 environment variables:
 ```python
 # Retrieve environment variables
@@ -43,7 +42,7 @@ snowflake_user = os.getenv("SNOWFLAKE_USER")
 snowflake_password = os.getenv("SNOWFLAKE_PASSWORD")
 snowflake_role = os.getenv("SNOWFLAKE_ROLE")
 snowflake_warehouse = os.getenv("SNOWFLAKE_WAREHOUSE")
-snowflake_database = os.getenv("SNOWFLAKE_DATABASE") 
+snowflake_database = os.getenv("SNOWFLAKE_DATABASE")
 snowflake_schema = os.getenv("SNOWFLAKE_SCHEMA")
 
 # Create connection parameters dictionary
@@ -52,15 +51,15 @@ connection_parameters = {
     "user": snowflake_user,
     "password": snowflake_password,
     "role": snowflake_role,
-    "warehouse": snowflake_warehouse, 
-    "database": snowflake_database, 
+    "warehouse": snowflake_warehouse,
+    "database": snowflake_database,
     "schema": snowflake_schema,
 }
 
 # Create a session
 session = Session.builder.configs(connection_parameters).create()
 ```
-You should assume that the environment variables have already been set. 
+You should assume that the environment variables have already been set.
 You can run a query against the snowflake data by using the session.sql() method. Then, you can turn the snowpark dataframe
 that is created into a pandas dataframe for use in other processes. Here is an example of how you can run a query:
 ```python
@@ -82,6 +81,7 @@ You can now use this dataframe to do whatever you need to do with the data.
 # offline: False # If True, will disable some online features like checking for updates
 # verbose: False # If True, will print detailed logs
 # multi_line: False # If True, you can input multiple lines starting and ending with ```
+# max_output: 2500 # The maximum characters of code output visible to the LLM
 
 # Documentation
 # All options: https://docs.openinterpreter.com/settings

interpreter/terminal_interface/profiles/profiles.py

Lines changed: 3 additions & 1 deletion
@@ -511,10 +511,13 @@ def normalize_text(message):
 # Be sure to remove the "#" before the following settings to use them.
 
 # custom_instructions: "" # This will be appended to the system message
+
+# General Configuration
 # auto_run: False # If True, code will run without asking for confirmation
 # safe_mode: "off" # The safety mode (see https://docs.openinterpreter.com/usage/safe-mode)
 # offline: False # If True, will disable some online features like checking for updates
 # verbose: False # If True, will print detailed logs
+# max_output: 2800 # The maximum characters of code output visible to the LLM
 
 # computer
 # languages: ["javascript", "shell"] # Restrict to certain languages
@@ -523,7 +526,6 @@ def normalize_text(message):
 # api_key: ... # Your API key, if the API requires it
 # api_base: ... # The URL where an OpenAI-compatible server is running
 # api_version: ... # The version of the API (this is primarily for Azure)
-# max_output: 2800 # The maximum characters of code output visible to the LLM
 
 # All options: https://docs.openinterpreter.com/settings
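The silent-ignore behavior described in the commit message can be sketched as follows. This is an illustrative toy (the Llm, Interpreter, and apply_profile names here are hypothetical stand-ins, not the actual open-interpreter source): a profile key nested under llm: is dropped when the attribute actually lives on the top-level object.

```python
class Llm:
    """Stand-in for the LLM settings object; note there is no max_output."""
    def __init__(self):
        self.context_window = None
        self.max_tokens = None

class Interpreter:
    """Stand-in for the top-level object, which owns max_output."""
    def __init__(self):
        self.llm = Llm()
        self.max_output = 2800  # default users saw when the profile key was ignored

def apply_profile(interpreter, profile):
    """Copy profile keys onto the matching objects, skipping unknown keys."""
    for key, value in profile.items():
        if key == "llm" and isinstance(value, dict):
            for llm_key, llm_value in value.items():
                # Unknown nested keys (e.g. max_output) are silently ignored
                if hasattr(interpreter.llm, llm_key):
                    setattr(interpreter.llm, llm_key, llm_value)
        elif hasattr(interpreter, key):
            setattr(interpreter, key, value)

interp = Interpreter()
apply_profile(interp, {"llm": {"max_output": 2500}})  # wrong placement: ignored
print(interp.max_output)  # 2800 (default)

apply_profile(interp, {"max_output": 2500})           # top-level: applied
print(interp.max_output)  # 2500
```

The sketch mirrors the evidence listed in the commit message: since the Llm stand-in has no max_output attribute, only the top-level placement takes effect.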
