
Conversation

@sicoyle sicoyle commented Oct 30, 2025

issue: #65

This PR continues to simplify agent initialization by creating a new, separate prompt class. I am not sure we want to expose this class publicly, since most folks probably won't use it directly, but I'm open to thoughts here, as this class is the prompt system that handles how:

  1. agent prompts (system messages and full chat context) are built, formatted, and sent to the LLM. Right now it takes name, role, goal, and instructions (the agent's identity and behavior) and turns these into a templated prompt string or chat sequence.
  2. prompt template logic supports both default and user-provided templates
  3. chat history integration packages the chat history into the message list or formatted prompt in a form suitable for the LLM
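To make the three responsibilities above concrete, here is a minimal sketch of what such a prompt class could look like. All names (`AgentPrompt`, `construct_system_prompt`, `format_messages`) and the default template text are illustrative assumptions, not the actual API in this PR.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class AgentPrompt:
    """Hypothetical class owning system-prompt construction for an agent."""
    name: str
    role: str
    goal: str
    instructions: list = field(default_factory=list)
    template: Optional[str] = None  # user-provided template overrides the default

    def construct_system_prompt(self) -> str:
        # Fall back to a default f-string template when none is provided.
        tpl = self.template or (
            "# Today's date is: {today}\n"
            "## Name\n{name}\n## Role\n{role}\n## Goal\n{goal}\n"
            "## Instructions\n{instructions}"
        )
        return tpl.format(
            today=date.today().strftime("%B %d, %Y"),
            name=self.name,
            role=self.role,
            goal=self.goal,
            instructions="\n".join(self.instructions),
        )

    def format_messages(self, chat_history: list) -> list:
        # System prompt first, then the chat history, packaged for the LLM.
        return [{"role": "system", "content": self.construct_system_prompt()},
                *chat_history]
```

An agent would call `format_messages` at request time and hand the result straight to the LLM client.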

This is likely not used by 99% of folks, but maybe more useful to:

  • advanced agent users
  • platform admins who want to override the entire system prompt logic to inject a prompt or prompt template

This PR mostly moves the prompt logic out of the agent into its own class, renames things, and makes minor tweaks along the way.

Behavior change: previously, the agent's prompt template was pushed/synced to the LLM's prompt template. With this PR, the agent owns its prompt and no longer mutates the LLM's internal prompt template. If the LLM already has a template set, the agent ignores it and still uses its own prompt template. In other words, the LLM client no longer owns the prompt: the agent's prompt builds the system prompt, injects context/variables, includes chat history, and formats the messages. At call time, the agent passes these formatted messages to the LLM, which simply executes them. I did leave the prompt template field on the LLM, though, because someone using just our LLM clients may still want that functionality; with an agent, we use the agent's prompt template instead.
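The ownership model described above can be sketched as follows. The class and field names here are assumptions for illustration, not the real dapr-agents types; the point is that the client's `prompt_template` is kept for standalone use but never mutated by the agent.

```python
class LLMClient:
    """Stand-in for an LLM client; still carries its own optional template."""
    def __init__(self, prompt_template=None):
        # Still supported for standalone client use, but ignored by agents.
        self.prompt_template = prompt_template

    def generate(self, messages):
        # The client just executes pre-formatted messages.
        return {"echo": messages}

class Agent:
    """Stand-in agent that owns its prompt end to end."""
    def __init__(self, prompt, llm):
        self.prompt = prompt  # agent-owned; llm.prompt_template is left untouched
        self.llm = llm

    def run(self, chat_history):
        # The agent formats the full message list, then hands it to the LLM.
        messages = [{"role": "system", "content": self.prompt}, *chat_history]
        return self.llm.generate(messages)
```

Running an agent this way leaves any template configured on the client unchanged, which is the behavior change this PR introduces.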

Signed-off-by: Samantha Coyle <[email protected]>
# TODO: we should have a system_template, prompt_template, and response_template, or better separation here.
# If we have something like a customer service agent, we want different templates for different types of interactions.
# In future, we could also have a way to dynamically change the template based on the context of the interaction.
template_format: Literal["f-string", "jinja2"] = Field(
Contributor

You could create constants for these, or an enum class; that way we reduce typos and hardcoded strings.

Constants + literal type example:

from typing import Literal

# Template format constants
TPL_FORMAT_F_STRING = "f-string"
TPL_FORMAT_JINJA = "jinja2"
# TPL_FORMAT_T_STRING = "t-string"  # for a future template-string format

# Type alias for template formats
TemplateFormatType = Literal["f-string", "jinja2"]

in your class:

template_format: TemplateFormatType = Field(default=TPL_FORMAT_JINJA, description=...)
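The "enum class" alternative mentioned above could look like this. It is a sketch, not project code; the name `TemplateFormat` is an assumption.

```python
from enum import Enum

class TemplateFormat(str, Enum):
    """Allowed prompt template formats as a string-backed enum."""
    F_STRING = "f-string"
    JINJA2 = "jinja2"
```

Because the enum subclasses `str`, its members compare equal to the raw strings, so existing callers passing `"jinja2"` keep working while new code gets autocomplete and typo protection.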

str: The formatted system prompt string.
"""
# Default f-string template; placeholders will be swapped to Jinja if needed.
default_sys_prompt = """
Contributor

These template strings could probably live in their own file, or as variables without indentation somewhere; there are extra prefix spaces.

Contributor
@filintod filintod left a comment

lgtm, just some nits

Returns:
(valid, unused): Tuple of dict of valid attrs and list of unused attr names.
"""
attrs = ["name", "role", "goal", "instructions"]
Contributor

I don't get where this list is coming from, or whether it should be more prominent, like template attributes defined somewhere.

def construct_system_prompt(self) -> str:
"""
Build the system prompt for the agent using a single template string.
- Fills in the current date.
Contributor

I was wondering what the date is used for? It seems strange to need it.

Collaborator

The date provides essential temporal context that allows the LLM to understand and reason about time-relative concepts like 'tomorrow,' calculate durations, handle scheduling tasks, and maintain consistency when discussing events or deadlines throughout the conversation.

Contributor
@filintod filintod Oct 31, 2025

Understood, but temporal actions are a specific use case, and we embed this every time even when it isn't used. That could instead live in the instructions, which are dynamic and which people can customize. We could add an option like "add date", but again, users can do that themselves.
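The opt-in idea floated above could be as small as a flag on prompt construction. This is a sketch of the suggestion, not code from the PR; the function and flag names are hypothetical.

```python
from datetime import date

def build_system_prompt(base: str, include_date: bool = False) -> str:
    """Prepend today's date to the system prompt only when requested."""
    if include_date:
        return f"# Today's date is: {date.today():%B %d, %Y}\n{base}"
    return base
```

Defaulting `include_date` to `False` keeps the temporal context out of prompts that don't need it, while scheduling-style agents can turn it on explicitly.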
