Conversation

khalilacheche
This PR fixes an issue where specifying a model that does not support the temperature parameter in LangchainLLMWrapper resulted in an API error and prevented responses from being generated.

Changes Made:
- Modified the attribute check for temperature in LangchainLLMWrapper.
- The check now ensures that temperature is both present and not None before applying it.
Fixes #1945
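A minimal sketch of the guarded assignment described above. The helper function, `FakeLLM` class, and `apply_temperature` name are stand-ins for illustration only; the real logic lives inside `LangchainLLMWrapper.generate_text`.

```python
class FakeLLM:
    """Stand-in for a Langchain LLM that exposes a temperature attribute."""
    temperature = 0.7


def apply_temperature(llm, temperature):
    # The old check was `hasattr(llm, "temperature")` alone, which would
    # also assign None; the fixed check additionally requires a concrete
    # value before touching the model's configuration.
    if hasattr(llm, "temperature") and temperature is not None:
        llm.temperature = temperature
    return llm


llm = FakeLLM()
apply_temperature(llm, None)   # no value supplied: default left untouched
assert llm.temperature == 0.7
apply_temperature(llm, 0.2)    # concrete value supplied: applied normally
assert llm.temperature == 0.2
```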

@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Mar 5, 2025
@anistark anistark (Contributor) left a comment


The current check only verifies that the attribute exists, not whether setting temperature will succeed, which might still fail due to model restrictions.

```diff
@@ -204,7 +204,7 @@ def generate_text(
         old_temperature: float | None = None
         if temperature is None:
             temperature = self.get_temperature(n=n)
-        if hasattr(self.langchain_llm, "temperature"):
+        if hasattr(self.langchain_llm, "temperature") and temperature is not None:
             self.langchain_llm.temperature = temperature  # type: ignore
```

Consider adding a try/except block here.
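The reviewer's suggestion could look roughly like the sketch below. The helper name, the `RejectingLLM` class, and the caught exception types are all assumptions for illustration; a model wrapper might reject the parameter through a property setter or validation logic, so the exact exception is not guaranteed.

```python
def set_temperature_safely(langchain_llm, temperature):
    # Guarded assignment plus the suggested try/except: if the model
    # rejects the parameter, keep its existing configuration instead
    # of letting the error propagate.
    if hasattr(langchain_llm, "temperature") and temperature is not None:
        try:
            langchain_llm.temperature = temperature
        except (AttributeError, ValueError):
            pass  # model rejects temperature; leave it unset


class RejectingLLM:
    """Stand-in for a model (e.g. o3-mini) that refuses the parameter."""
    @property
    def temperature(self):
        return None

    @temperature.setter
    def temperature(self, value):
        raise ValueError("temperature is not supported by this model")


llm = RejectingLLM()
set_temperature_safely(llm, 0.5)  # error swallowed, generation can proceed
assert llm.temperature is None
```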

@anistark anistark added the blocked Blocked on further action label Aug 28, 2025
Successfully merging this pull request may close these issues.

LangchainLLMWrapper sets a default temperature even with models that don't support it (ex o3-mini)