4 files changed: +33 −3 lines changed

File 1 of 4:

 {
-    ".": "0.0.1-alpha.0"
+    ".": "0.1.0-alpha.1"
 }
File 2 of 4:

+# Changelog
+
+## 0.1.0-alpha.1 (2025-06-17)
+
+Full Changelog: [v0.0.1-alpha.0...v0.1.0-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.0.1-alpha.0...v0.1.0-alpha.1)
+
+### Features
+
+* **client:** add follow_redirects request option ([a77a9ee](https://github.com/llamastack/llama-stack-client-python/commit/a77a9eed9038782ba6b93ce0d3147ee4a6b8a3b7))
+
+
+### Bug Fixes
+
+* **client:** correctly parse binary response | stream ([85d6bbd](https://github.com/llamastack/llama-stack-client-python/commit/85d6bbd97efac7509cbff0bb2d461a80d09b5e61))
+* **package:** support direct resource imports ([a862d55](https://github.com/llamastack/llama-stack-client-python/commit/a862d551553aac41573306ce39480e1eb16ea3d3))
+
+
+### Chores
+
+* **ci:** fix installation instructions ([40d9854](https://github.com/llamastack/llama-stack-client-python/commit/40d9854bd2630a471f1ca93d249e4d44b73fa864))
+* **ci:** upload sdks to package manager ([2d2282b](https://github.com/llamastack/llama-stack-client-python/commit/2d2282bb49d58daef1f32fa0f1e5a356abf8df0d))
+* **docs:** grammar improvements ([6f57b13](https://github.com/llamastack/llama-stack-client-python/commit/6f57b1363367de7ed5035fd1d6ba1a071eee67ba))
+* **docs:** remove reference to rye shell ([bcf315a](https://github.com/llamastack/llama-stack-client-python/commit/bcf315ae00c458f89dfa3684bcc7abdb732b6c5f))
+* **docs:** remove unnecessary param examples ([60ec829](https://github.com/llamastack/llama-stack-client-python/commit/60ec829e809156217cf2f911b3cac6b23a06baad))
+* **internal:** avoid errors for isinstance checks on proxies ([758a188](https://github.com/llamastack/llama-stack-client-python/commit/758a188dbfaa284a13b70816689c99917a05d16c))
+* **internal:** codegen related update ([ab9f05c](https://github.com/llamastack/llama-stack-client-python/commit/ab9f05cc1da5b21afceacdf9c8eb54b6e59eed01))
+* **internal:** update conftest.py ([218e172](https://github.com/llamastack/llama-stack-client-python/commit/218e172c16014dad41a7c189c5620077955d6bdf))
+* **tests:** add tests for httpx client instantiation & proxies ([b27b11b](https://github.com/llamastack/llama-stack-client-python/commit/b27b11bbe0a9c5778b757733c11828d9603307ea))
+* **tests:** run tests in parallel ([1287a3c](https://github.com/llamastack/llama-stack-client-python/commit/1287a3c11f668d916c8c7af534a48523e2e69140))
+* update SDK settings ([e54ba91](https://github.com/llamastack/llama-stack-client-python/commit/e54ba9163792ab80362a189acb825bcd00e5384b))
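Two of the entries above touch how the SDK drives httpx: the `follow_redirects` request option and the new tests for httpx client instantiation & proxies. The sketch below only illustrates that area; it assumes the client constructor accepts `base_url` and `http_client` arguments, as Stainless-generated Python SDKs typically do, and none of the argument values are taken from this changeset.

```python
import httpx

from llama_stack_client import LlamaStackClient

# Hedged sketch: hand the SDK a pre-configured httpx.Client so that redirect
# and proxy behavior are controlled explicitly by the caller.
http_client = httpx.Client(
    follow_redirects=True,
    proxy="http://localhost:3128",  # hypothetical local proxy
)

client = LlamaStackClient(
    base_url="http://localhost:8321",  # assumed local Llama Stack server address
    http_client=http_client,
)
```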
File 3 of 4:

 [project]
 name = "llama_stack_client"
-version = "0.0.1-alpha.0"
+version = "0.1.0-alpha.1"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "Apache-2.0"
File 4 of 4:

 # File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

 __title__ = "llama_stack_client"
-__version__ = "0.0.1-alpha.0"  # x-release-please-version
+__version__ = "0.1.0-alpha.1"  # x-release-please-version