
Commit c97fd08

Goal Completion Detection Issue (#316)
* Goal Completion Detection Issue

  Problem: When the AI called the mark_goal_complete function, the completion signal wasn't being propagated through the system. The iterative strategy kept running indefinitely even after the AI indicated the goal was complete.

  Fix: Modified the LLM manager to detect when mark_goal_complete returns a `FunctionExecutionResult` with `completed=True` and pass this signal through in the `LLMManagerResponse`.

  Signed-off-by: Luke Hinds <lukehinds@gmail.com>

* Release v0.7.2

  Signed-off-by: Luke Hinds <lukehinds@gmail.com>

---------

Signed-off-by: Luke Hinds <lukehinds@gmail.com>
1 parent e0f8a77 commit c97fd08

File tree

4 files changed: +17 lines, -4 lines


pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 [project]
 name = "agentup"
-version = "0.7.1"
+version = "0.7.2"
 description = "Create AI agents with all the trappings, out of the box."
 readme = "README.md"
 requires-python = ">=3.11"

src/agent/services/builtin_capabilities.py

Lines changed: 2 additions & 2 deletions
@@ -205,8 +205,8 @@ async def capabilities_executor(task: Task) -> str:

 # Echo capability
 async def echo_executor(task: Task) -> str:
-    if hasattr(task, "params") and task.params and "message" in task.params:
-        return f"Echo: {task.params['message']}"
+    if hasattr(task, "metadata") and task.metadata and "message" in task.metadata:
+        return f"Echo: {task.metadata['message']}"
     return "Echo: No message provided"

 # Register core capabilities
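The corrected echo capability now reads the message from `task.metadata` rather than `task.params`. A self-contained sketch, assuming a minimal `Task` stand-in (the real `Task` type in the project is more elaborate):

```python
# Minimal reproduction of the fixed echo capability. The Task dataclass here
# is a hypothetical stand-in carrying only the metadata dict the executor reads.
import asyncio
from dataclasses import dataclass, field


@dataclass
class Task:
    metadata: dict = field(default_factory=dict)


async def echo_executor(task: Task) -> str:
    # After the fix, the message is looked up in task.metadata, not task.params.
    if hasattr(task, "metadata") and task.metadata and "message" in task.metadata:
        return f"Echo: {task.metadata['message']}"
    return "Echo: No message provided"


if __name__ == "__main__":
    print(asyncio.run(echo_executor(Task(metadata={"message": "hello"}))))
```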

src/agent/services/llm/manager.py

Lines changed: 13 additions & 0 deletions
@@ -168,10 +168,23 @@ async def llm_with_functions(
         logger.debug(
             f"Final LLM response after function execution: {final_response.content[:200]}{'...' if len(str(final_response.content)) > 200 else ''}"
         )
+        # If a completion was detected, pass it through in the response
+        if completion_detected and completion_result:
+            return LLMManagerResponse(
+                content=final_response.content,
+                completed=True,
+                completion_data=completion_result.completion_data,
+            )
         return LLMManagerResponse(content=final_response.content)
     except Exception as e:
         logger.error(f"Failed to get final response from LLM after function execution: {e}")
         # Return function results directly as fallback
+        if completion_detected and completion_result:
+            return LLMManagerResponse(
+                content=f"Function executed successfully: {'; '.join(str(result) for result in function_results)}",
+                completed=True,
+                completion_data=completion_result.completion_data,
+            )
         return LLMManagerResponse(
             content=f"Function executed successfully: {'; '.join(str(result) for result in function_results)}"
         )
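The point of the `completed=True` flag added above is that the iterative strategy can now observe it and stop. A hedged sketch of that consumer side, assuming illustrative names (`LLMManagerResponse`, `run_iterative_strategy`) rather than the project's actual loop implementation:

```python
# Illustrative consumer of the completion flag: an iteration loop that
# previously ran to its cap now breaks as soon as a response signals
# completion (i.e. mark_goal_complete propagated through the manager).
from dataclasses import dataclass


@dataclass
class LLMManagerResponse:
    content: str
    completed: bool = False


def run_iterative_strategy(responses, max_iterations: int = 10) -> list[str]:
    """Consume manager responses, stopping on completion or the iteration cap."""
    history: list[str] = []
    for i, response in enumerate(responses):
        if i >= max_iterations:
            break
        history.append(response.content)
        if response.completed:
            break  # completion signal reached the loop -> stop iterating
    return history
```

Without the manager-side fix, `response.completed` was always false here, so the loop only ever exited via the iteration cap.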

uv.lock

Lines changed: 1 addition & 1 deletion
