
Conversation

rliu6915 (Contributor) commented Jul 14, 2025

  • Add a think field to the ChatOllamaInput interface (see the sketch after this list)
  • Add a think field to the ChatOllama class (required by the interface it implements)
  • Initialize the think field through the constructor
  • Add the think property to the OllamaChatRequest object returned by invocationParams()
  • In _streamResponseChunks, when building each token, try the thinking payload first with the ?? operator when thinking is enabled
  • In convertOllamaMessagesToLangChain, when setting content, try message.thinking first with the ?? operator
  • Add a test suite for thinking models
  • Test the response of a thinking model (deepseek-r1:32b)
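The sketch below illustrates the shape of these changes; it is a simplified illustration, not the merged diff. The OllamaChatRequest and OllamaMessage interfaces are stand-ins for the ollama package's types, and pickToken is a hypothetical helper that only captures the ?? precedence used in _streamResponseChunks and convertOllamaMessagesToLangChain.

```typescript
// Simplified stand-in for the request type sent to Ollama.
interface OllamaChatRequest {
  model: string;
  think?: boolean;
}

// Simplified stand-in for a streamed Ollama message.
interface OllamaMessage {
  content?: string;
  thinking?: string;
}

interface ChatOllamaInput {
  model: string;
  /** Ask Ollama (>= 0.5.16) to return the thought process separately. */
  think?: boolean;
}

class ChatOllama implements ChatOllamaInput {
  model: string;

  think?: boolean;

  constructor(fields: ChatOllamaInput) {
    this.model = fields.model;
    this.think = fields.think;
  }

  // The request body sent to Ollama now carries the think flag.
  invocationParams(): OllamaChatRequest {
    return { model: this.model, think: this.think };
  }
}

// When the model emits a separate thinking payload, prefer it; otherwise
// fall back to the regular content (the ?? precedence described above).
function pickToken(message: OllamaMessage): string {
  return message.thinking ?? message.content ?? "";
}

const model = new ChatOllama({ model: "deepseek-r1:32b", think: false });
console.log(model.invocationParams()); // { model: "deepseek-r1:32b", think: false }
console.log(pickToken({ content: "4", thinking: "2 + 2 = 4" })); // "2 + 2 = 4"
```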

This change resolves the issue where users of thinking models such as deepseek-r1:32b could not hide the thought process when using Ollama through LangChain.js's Ollama integration. Previously, even with thinking mode set to false, the output would still be a mix of thinking and content.

The fix includes:

  • Adding a think field to the ChatOllama class so that thinking models such as deepseek can be used with thinking mode on or off (a minimal usage sketch follows this list)
  • Test coverage for thinking models (deepseek) with and without thinking mode
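A minimal usage sketch, assuming an installed @langchain/ollama build that includes the think option from this PR and a local Ollama server at or above 0.5.16 with the model pulled:

```typescript
import { ChatOllama } from "@langchain/ollama";

// Assumes the think option is available and the model has been pulled,
// e.g. `ollama pull deepseek-r1:32b`.
const model = new ChatOllama({
  model: "deepseek-r1:32b",
  think: false, // hide the thought process; set to true to receive it
});

const response = await model.invoke("What is 17 * 23?");
console.log(response.content); // final answer only, no thinking text mixed in
```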

Fixes #8481


rliu6915 (Contributor, Author) commented:

@hntrl @jacoblee93 Could you please take a moment to review the six workflows that are currently awaiting your approval?

christian-bromann (Member) left a comment:

Thanks @rliu6915 for raising the PR 🎉 I have some minor suggestions, wdyt?

nth-zik commented Aug 7, 2025

Waiting on this PR.

rliu6915 (Contributor, Author) commented Aug 16, 2025

Hi, for ollama-js, thinking mode was introduced in v0.5.16, so I updated the ollama dependency to 0.5.17 (the latest).

@hntrl @jacoblee93 @christian-bromann Could you please review the change that is awaiting your approval?

rliu6915 (Contributor, Author) commented Aug 22, 2025

Hi, sorry, it failed this time because of the yarn format check. I ran yarn format, updated the code, ran yarn format:check, and the output now matches, as shown below:

(screenshot: yarn format:check output)

@hntrl @jacoblee93 @christian-bromann Could you please review the change that is awaiting your approval?

rliu6915 (Contributor, Author) commented:

@hntrl

@hntrl changed the title from "Add think Field to ChatOllama for Controlling Thought Process in Responses" to "feat(ollama): Add think Field to ChatOllama for Controlling Thought Process in Responses" on Sep 1, 2025
hntrl (Member) commented Sep 1, 2025

@rliu6915 apologies 1000x for taking this long to get this landed. Thank you for your persistence and for the PR! 💪

@hntrl merged commit 9e843eb into langchain-ai:main on Sep 1, 2025 (25 of 26 checks passed).