Commit 837fa7a

Fix small bug in doc (#340)
## What this does

Fix bug in documentation that tripped me up for a few minutes today.

## Type of change

- [x] Bug fix
- [x] Documentation

## Scope check

- [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
- [x] This aligns with RubyLLM's focus on **LLM communication**
- [x] This isn't application-specific logic that belongs in user code
- [x] This benefits most users, not just my specific use case

## Quality check

- [x] I updated documentation if needed
- [x] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

## API changes

- [x] No API changes
1 parent b52caf2 commit 837fa7a

File tree: 1 file changed, +1 −1 lines changed


docs/_advanced/models.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -260,7 +260,7 @@ puts response.content

 # You can also use it in .with_model
 chat.with_model(
-  model: 'gpt-5-alpha',
+  'gpt-5-alpha',
   provider: :openai, # MUST specify provider
   assume_exists: true
 )
```
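The corrected docs pass the model id as a positional argument rather than a `model:` keyword. A minimal sketch of why that distinction matters in Ruby, using a hypothetical `ChatStub` stand-in (not RubyLLM's real chat class) whose `with_model` has the same positional-plus-keywords shape as the documented call:

```ruby
# Hypothetical stand-in for the chat object: the model id is positional,
# while provider: and assume_exists: are keyword arguments.
class ChatStub
  attr_reader :model, :provider, :assume_exists

  def with_model(model_id, provider: nil, assume_exists: false)
    @model = model_id
    @provider = provider
    @assume_exists = assume_exists
    self # allow chaining, as in the documented fluent style
  end
end

chat = ChatStub.new
# Corrected form from the diff: model id passed positionally.
chat.with_model('gpt-5-alpha', provider: :openai, assume_exists: true)

# The old documented form would raise, because :model is not a
# declared keyword and the required positional argument is missing:
begin
  ChatStub.new.with_model(model: 'gpt-5-alpha', provider: :openai)
rescue ArgumentError => e
  puts "raised: #{e.class}"
end
```

Against a signature like this, `model: 'gpt-5-alpha'` fails with an `ArgumentError`, which is presumably the confusion the commit message describes.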
