**Expected Behavior**

The Ollama team has added thinking support to the CLI and the API. See: https://github.com/ollama/ollama/pull/10584/

This parameter allows controlling whether the "thinking" output of a reasoning model is produced and returned.

**Current Behavior**

At the moment, there is no way to manage the thinking parameter through the Spring AI Ollama API.

**Context**

Adding this would give us the ability to use thinking models and get just the final output, without the thinking phase of the model.
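A rough sketch of what this could look like on the Spring AI side (the option name mirrors the new `think` field on Ollama's `/api/chat` request from the PR above; the `think(...)` builder method shown here is hypothetical and does not exist in Spring AI yet):

```java
// Hypothetical API sketch — think(...) is an illustration of the requested
// option, not an existing Spring AI method.
OllamaOptions options = OllamaOptions.builder()
        .model("qwen3")
        .think(false) // hypothetical: suppress the model's thinking phase
        .build();

ChatResponse response = chatModel.call(
        new Prompt("Why is the sky blue?", options));

// With think=false, the answer should contain no thinking content.
System.out.println(response.getResult().getOutput().getText());
```

Under the hood this would presumably map onto the boolean `think` field in the Ollama chat request, with any reasoning text surfaced separately (the Ollama API returns it in a dedicated `thinking` field on the response message) rather than mixed into the answer.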