This repository was archived by the owner on Aug 14, 2025. It is now read-only.

Commit 028a8dd

release: 0.2.17 (#9)
Automated Release PR

---

## 0.2.17 (2025-08-06)

Full Changelog: [v0.2.15...v0.2.17](llamastack/llama-stack-client-python@v0.2.15...v0.2.17)

### Features

* **api:** update via SDK Studio ([9c69353](llamastack/llama-stack-client-python@9c69353))
* **api:** update via SDK Studio ([5f90b04](llamastack/llama-stack-client-python@5f90b04))
* **api:** update via SDK Studio ([6e26309](llamastack/llama-stack-client-python@6e26309))
* **api:** update via SDK Studio ([54ff3c4](llamastack/llama-stack-client-python@54ff3c4))
* **api:** update via SDK Studio ([a34c823](llamastack/llama-stack-client-python@a34c823))
* **api:** update via SDK Studio ([f6b80ca](llamastack/llama-stack-client-python@f6b80ca))
* **api:** update via SDK Studio ([2a4296d](llamastack/llama-stack-client-python@2a4296d))
* **api:** update via SDK Studio ([07691ac](llamastack/llama-stack-client-python@07691ac))
* **api:** update via SDK Studio ([585f9ce](llamastack/llama-stack-client-python@585f9ce))
* **api:** update via SDK Studio ([6d609e3](llamastack/llama-stack-client-python@6d609e3))
* **api:** update via SDK Studio ([3dbf2a4](llamastack/llama-stack-client-python@3dbf2a4))
* **api:** update via SDK Studio ([dd0ae96](llamastack/llama-stack-client-python@dd0ae96))
* **api:** update via SDK Studio ([80a2969](llamastack/llama-stack-client-python@80a2969))
* **api:** update via SDK Studio ([748e6db](llamastack/llama-stack-client-python@748e6db))
* **api:** update via SDK Studio ([b6fa2b1](llamastack/llama-stack-client-python@b6fa2b1))
* **api:** update via SDK Studio ([e97f870](llamastack/llama-stack-client-python@e97f870))
* **api:** update via SDK Studio ([489b54d](llamastack/llama-stack-client-python@489b54d))
* **api:** update via SDK Studio ([13cfa4a](llamastack/llama-stack-client-python@13cfa4a))
* **api:** update via SDK Studio ([25c1e49](llamastack/llama-stack-client-python@25c1e49))
* **api:** update via SDK Studio ([4a54d61](llamastack/llama-stack-client-python@4a54d61))
* **api:** update via SDK Studio ([ac4614a](llamastack/llama-stack-client-python@ac4614a))
* **api:** update via SDK Studio ([a201e22](llamastack/llama-stack-client-python@a201e22))
* **client:** support file upload requests ([e84459f](llamastack/llama-stack-client-python@e84459f))
* **client:** support file upload requests ([6c73da7](llamastack/llama-stack-client-python@6c73da7))

### Bug Fixes

* **ci:** correct conditional ([d7c2ab8](llamastack/llama-stack-client-python@d7c2ab8))
* **ci:** correct conditional ([4368fbd](llamastack/llama-stack-client-python@4368fbd))
* **client:** don't send Content-Type header on GET requests ([d6a80a5](llamastack/llama-stack-client-python@d6a80a5))
* **client:** don't send Content-Type header on GET requests ([c6e0026](llamastack/llama-stack-client-python@c6e0026))
* helptext for 'inspect version' and 'providers inspect' ([#8](llamastack/llama-stack-client-python#8)) ([d79345e](llamastack/llama-stack-client-python@d79345e))
* kill requirements.txt ([a6bd44c](llamastack/llama-stack-client-python@a6bd44c))
* model register missing model-type and not accepting metadata ([#11](llamastack/llama-stack-client-python#11)) ([f3f4515](llamastack/llama-stack-client-python@f3f4515))
* **parsing:** correctly handle nested discriminated unions ([9f95130](llamastack/llama-stack-client-python@9f95130))
* **parsing:** correctly handle nested discriminated unions ([8b7e9ba](llamastack/llama-stack-client-python@8b7e9ba))
* **parsing:** ignore empty metadata ([a8a398f](llamastack/llama-stack-client-python@a8a398f))
* **parsing:** ignore empty metadata ([264f24c](llamastack/llama-stack-client-python@264f24c))
* **parsing:** parse extra field types ([f981bdc](llamastack/llama-stack-client-python@f981bdc))
* **parsing:** parse extra field types ([d54c5db](llamastack/llama-stack-client-python@d54c5db))
* pre-commit formatting ([a83b1c3](llamastack/llama-stack-client-python@a83b1c3))
* update agent event logger ([#10](llamastack/llama-stack-client-python#10)) ([0a10b70](llamastack/llama-stack-client-python@0a10b70))

### Chores

* **ci:** change upload type ([7827103](llamastack/llama-stack-client-python@7827103))
* **ci:** change upload type ([5febc13](llamastack/llama-stack-client-python@5febc13))
* **ci:** only run for pushes and fork pull requests ([03a7636](llamastack/llama-stack-client-python@03a7636))
* **ci:** only run for pushes and fork pull requests ([c05df66](llamastack/llama-stack-client-python@c05df66))
* **ci:** only run for pushes and fork pull requests ([87c9d01](llamastack/llama-stack-client-python@87c9d01))
* **ci:** only run for pushes and fork pull requests ([9d04993](llamastack/llama-stack-client-python@9d04993))
* **ci:** only run for pushes and fork pull requests ([4da7f49](llamastack/llama-stack-client-python@4da7f49))
* **ci:** only run for pushes and fork pull requests ([8b37cd3](llamastack/llama-stack-client-python@8b37cd3))
* **ci:** only run for pushes and fork pull requests ([3f0a4b9](llamastack/llama-stack-client-python@3f0a4b9))
* **ci:** only run for pushes and fork pull requests ([8a1efad](llamastack/llama-stack-client-python@8a1efad))
* delete unused scripts based on rye ([dae6506](llamastack/llama-stack-client-python@dae6506))
* **internal:** bump pinned h11 dep ([4a7073f](llamastack/llama-stack-client-python@4a7073f))
* **internal:** bump pinned h11 dep ([0568d6d](llamastack/llama-stack-client-python@0568d6d))
* **internal:** codegen related update ([4d4afec](llamastack/llama-stack-client-python@4d4afec))
* **internal:** codegen related update ([7cd543f](llamastack/llama-stack-client-python@7cd543f))
* **internal:** codegen related update ([3165cad](llamastack/llama-stack-client-python@3165cad))
* **internal:** codegen related update ([c27a701](llamastack/llama-stack-client-python@c27a701))
* **internal:** codegen related update ([aa45ba3](llamastack/llama-stack-client-python@aa45ba3))
* **internal:** codegen related update ([5d6ccb5](llamastack/llama-stack-client-python@5d6ccb5))
* **internal:** fix ruff target version ([c50a0e0](llamastack/llama-stack-client-python@c50a0e0))
* **internal:** version bump ([5af7869](llamastack/llama-stack-client-python@5af7869))
* **internal:** version bump ([148be8d](llamastack/llama-stack-client-python@148be8d))
* **internal:** version bump ([86a0766](llamastack/llama-stack-client-python@86a0766))
* **internal:** version bump ([5d6cc6b](llamastack/llama-stack-client-python@5d6cc6b))
* **internal:** version bump ([cc7a519](llamastack/llama-stack-client-python@cc7a519))
* **internal:** version bump ([8f15ef0](llamastack/llama-stack-client-python@8f15ef0))
* **internal:** version bump ([f52cb89](llamastack/llama-stack-client-python@f52cb89))
* **internal:** version bump ([2e1a629](llamastack/llama-stack-client-python@2e1a629))
* **internal:** version bump ([da26ed0](llamastack/llama-stack-client-python@da26ed0))
* **internal:** version bump ([3727fa5](llamastack/llama-stack-client-python@3727fa5))
* **internal:** version bump ([443ce02](llamastack/llama-stack-client-python@443ce02))
* **internal:** version bump ([b2875ec](llamastack/llama-stack-client-python@b2875ec))
* **internal:** version bump ([9a4320d](llamastack/llama-stack-client-python@9a4320d))
* **internal:** version bump ([39155e5](llamastack/llama-stack-client-python@39155e5))
* **internal:** version bump ([607c7be](llamastack/llama-stack-client-python@607c7be))
* **internal:** version bump ([62901e7](llamastack/llama-stack-client-python@62901e7))
* **internal:** version bump ([4132af9](llamastack/llama-stack-client-python@4132af9))
* **internal:** version bump ([e6ae920](llamastack/llama-stack-client-python@e6ae920))
* **internal:** version bump ([96768dc](llamastack/llama-stack-client-python@96768dc))
* **internal:** version bump ([74f7eda](llamastack/llama-stack-client-python@74f7eda))
* **internal:** version bump ([d59862a](llamastack/llama-stack-client-python@d59862a))
* **internal:** version bump ([ce98414](llamastack/llama-stack-client-python@ce98414))
* **internal:** version bump ([9746774](llamastack/llama-stack-client-python@9746774))
* **internal:** version bump ([6114dbf](llamastack/llama-stack-client-python@6114dbf))
* **internal:** version bump ([02c9953](llamastack/llama-stack-client-python@02c9953))
* **internal:** version bump ([16f2953](llamastack/llama-stack-client-python@16f2953))
* **internal:** version bump ([c32029b](llamastack/llama-stack-client-python@c32029b))
* **internal:** version bump ([aef5dee](llamastack/llama-stack-client-python@aef5dee))
* **internal:** version bump ([590de6d](llamastack/llama-stack-client-python@590de6d))
* **internal:** version bump ([072269f](llamastack/llama-stack-client-python@072269f))
* **internal:** version bump ([eee6f0b](llamastack/llama-stack-client-python@eee6f0b))
* **internal:** version bump ([e6a964e](llamastack/llama-stack-client-python@e6a964e))
* **package:** mark python 3.13 as supported ([2afc17b](llamastack/llama-stack-client-python@2afc17b))
* **package:** mark python 3.13 as supported ([d1a4e40](llamastack/llama-stack-client-python@d1a4e40))
* **project:** add settings file for vscode ([405febd](llamastack/llama-stack-client-python@405febd))
* **project:** add settings file for vscode ([1dd3e53](llamastack/llama-stack-client-python@1dd3e53))
* **readme:** fix version rendering on pypi ([ca89c7f](llamastack/llama-stack-client-python@ca89c7f))
* **readme:** fix version rendering on pypi ([193fb64](llamastack/llama-stack-client-python@193fb64))
* update SDK settings ([2d422f9](llamastack/llama-stack-client-python@2d422f9))
* update SDK settings ([59b933c](llamastack/llama-stack-client-python@59b933c))
* update version ([10ef53e](llamastack/llama-stack-client-python@10ef53e))

### Build System

* Bump version to 0.2.14 ([745a94e](llamastack/llama-stack-client-python@745a94e))
* Bump version to 0.2.15 ([8700dc6](llamastack/llama-stack-client-python@8700dc6))
* Bump version to 0.2.15 ([4692024](llamastack/llama-stack-client-python@4692024))
* Bump version to 0.2.16 ([6ce9b84](llamastack/llama-stack-client-python@6ce9b84))
* Bump version to 0.2.17 ([69f67ef](llamastack/llama-stack-client-python@69f67ef))

---

This pull request is managed by Stainless's [GitHub App](https://github.com/apps/stainless-app). The [semver version number](https://semver.org/#semantic-versioning-specification-semver) is based on included [commit messages](https://www.conventionalcommits.org/en/v1.0.0/). Alternatively, you can manually set the version number in the title of this pull request.

For a better experience, it is recommended to use either rebase-merge or squash-merge when merging this pull request.

🔗 Stainless [website](https://www.stainlessapi.com)
📚 Read the [docs](https://app.stainlessapi.com/docs)
🙋 [Reach out](mailto:[email protected]) for help or questions

---

Co-authored-by: stainless-app[bot] <142633134+stainless-app[bot]@users.noreply.github.com>
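The `parsing: correctly handle nested discriminated unions` fixes above concern union types whose variants are selected by a tag field, where one union member itself contains another union. The sketch below is NOT the SDK's implementation; it is a minimal stdlib illustration of the dispatch pattern, with invented tag names and shapes:

```python
# Illustrative sketch of nested discriminated-union parsing.
# All type names and the "type" discriminator field are hypothetical.
from dataclasses import dataclass
from typing import Union

@dataclass
class TextContent:
    text: str

@dataclass
class ImageContent:
    url: str

@dataclass
class ToolCall:
    name: str
    content: Union[TextContent, ImageContent]  # inner (nested) union

def parse_content(data: dict) -> Union[TextContent, ImageContent]:
    # Dispatch on the discriminator to pick the concrete variant.
    if data["type"] == "text":
        return TextContent(text=data["text"])
    if data["type"] == "image":
        return ImageContent(url=data["url"])
    raise ValueError(f"unknown discriminator: {data['type']!r}")

def parse_event(data: dict) -> Union[ToolCall, TextContent, ImageContent]:
    # Outer union: one variant carries a payload that is itself a union,
    # so the discriminator must be consulted again at the inner level.
    if data["type"] == "tool_call":
        return ToolCall(name=data["name"], content=parse_content(data["content"]))
    return parse_content(data)

event = parse_event(
    {"type": "tool_call", "name": "search", "content": {"type": "text", "text": "hi"}}
)
print(type(event.content).__name__)  # TextContent
```

The bug class being fixed is typically that the inner discriminator is ignored once the outer variant is chosen; handling it correctly means re-dispatching at every nesting level, as `parse_event` does by delegating to `parse_content`.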
1 parent dae6506 commit 028a8dd

File tree

144 files changed: +2732 / −1397 lines


.github/workflows/ci.yml

Lines changed: 96 additions & 0 deletions
```yaml
name: CI
on:
  push:
    branches-ignore:
      - 'generated'
      - 'codegen/**'
      - 'integrated/**'
      - 'stl-preview-head/**'
      - 'stl-preview-base/**'
  pull_request:
    branches-ignore:
      - 'stl-preview-head/**'
      - 'stl-preview-base/**'

jobs:
  lint:
    timeout-minutes: 10
    name: lint
    runs-on: ${{ github.repository == 'stainless-sdks/llama-stack-client-python' && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
    if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
    steps:
      - uses: actions/checkout@v4

      - name: Install Rye
        run: |
          curl -sSf https://rye.astral.sh/get | bash
          echo "$HOME/.rye/shims" >> $GITHUB_PATH
        env:
          RYE_VERSION: '0.44.0'
          RYE_INSTALL_OPTION: '--yes'

      - name: Install dependencies
        run: rye sync --all-features

      - name: Run lints
        run: ./scripts/lint

  build:
    if: github.repository == 'stainless-sdks/llama-stack-client-python' && (github.event_name == 'push' || github.event.pull_request.head.repo.fork)
    timeout-minutes: 10
    name: build
    permissions:
      contents: read
      id-token: write
    runs-on: depot-ubuntu-24.04
    steps:
      - uses: actions/checkout@v4

      - name: Install Rye
        run: |
          curl -sSf https://rye.astral.sh/get | bash
          echo "$HOME/.rye/shims" >> $GITHUB_PATH
        env:
          RYE_VERSION: '0.44.0'
          RYE_INSTALL_OPTION: '--yes'

      - name: Install dependencies
        run: rye sync --all-features

      - name: Run build
        run: rye build

      - name: Get GitHub OIDC Token
        id: github-oidc
        uses: actions/github-script@v6
        with:
          script: core.setOutput('github_token', await core.getIDToken());

      - name: Upload tarball
        env:
          URL: https://pkg.stainless.com/s
          AUTH: ${{ steps.github-oidc.outputs.github_token }}
          SHA: ${{ github.sha }}
        run: ./scripts/utils/upload-artifact.sh

  test:
    timeout-minutes: 10
    name: test
    runs-on: ${{ github.repository == 'stainless-sdks/llama-stack-client-python' && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
    if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
    steps:
      - uses: actions/checkout@v4

      - name: Install Rye
        run: |
          curl -sSf https://rye.astral.sh/get | bash
          echo "$HOME/.rye/shims" >> $GITHUB_PATH
        env:
          RYE_VERSION: '0.44.0'
          RYE_INSTALL_OPTION: '--yes'

      - name: Bootstrap
        run: ./scripts/bootstrap

      - name: Run tests
        run: ./scripts/test
```
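The `runs-on` lines in this workflow use GitHub's expression syntax, where `cond && a || b` behaves like a ternary as long as `a` is truthy. A Python sketch of the same runner-selection logic (the function name is ours, not part of the workflow):

```python
def pick_runner(repository: str) -> str:
    # Mirrors the workflow expression:
    # ${{ github.repository == 'stainless-sdks/llama-stack-client-python'
    #     && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
    # i.e. the Stainless staging repo gets a Depot runner; forks and the
    # public repo fall back to the default GitHub-hosted runner.
    return (
        "depot-ubuntu-24.04"
        if repository == "stainless-sdks/llama-stack-client-python"
        else "ubuntu-latest"
    )

print(pick_runner("stainless-sdks/llama-stack-client-python"))  # depot-ubuntu-24.04
print(pick_runner("someone/fork"))  # ubuntu-latest
```

Note the `&& ||` idiom only emulates a ternary because `'depot-ubuntu-24.04'` is a truthy string; an empty-string middle operand would fall through to the `||` branch.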

.gitignore

Lines changed: 0 additions & 1 deletion
```diff
@@ -1,5 +1,4 @@
 .prism.log
-.vscode
 _dev
 
 __pycache__
```

.release-please-manifest.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,3 +1,3 @@
 {
-  ".": "0.1.0-alpha.4"
+  ".": "0.2.17"
 }
```

.stats.yml

Lines changed: 4 additions & 4 deletions
```diff
@@ -1,4 +1,4 @@
-configured_endpoints: 105
-openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/llamastack%2Fllama-stack-client-df7a19394e9124c18ec4e888e2856d22b5ebfd6fe6fe6e929ff6cfadb2ae7e2a.yml
-openapi_spec_hash: 9428682672fdd7e2afee7af9ef849dc9
-config_hash: e1d37a77a6e8ca86fb6bccb4b0f172c9
+configured_endpoints: 106
+openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/llamastack%2Fllama-stack-client-f59f1c7d33001d60b5190f68aa49eacec90f05dbe694620b8916152c3922051d.yml
+openapi_spec_hash: 804edd2e834493906dc430145402be3b
+config_hash: de16e52db65de71ac35adcdb665a74f5
```

.vscode/settings.json

Lines changed: 3 additions & 0 deletions
```diff
@@ -0,0 +1,3 @@
+{
+  "python.analysis.importFormat": "relative",
+}
```

CHANGELOG.md

Lines changed: 124 additions & 0 deletions
Large diffs are not rendered by default.

README.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -1,5 +1,6 @@
 # Llama Stack Client Python API library
 
+
 [![PyPI version](https://img.shields.io/pypi/v/llama_stack_client.svg)](https://pypi.org/project/llama_stack_client/) [![PyPI - Downloads](https://img.shields.io/pypi/dm/llama-stack-client)](https://pypi.org/project/llama-stack-client/)
 [![Discord](https://img.shields.io/discord/1257833999603335178)](https://discord.gg/llama-stack)
 
```

api.md

Lines changed: 12 additions & 0 deletions
````diff
@@ -453,6 +453,18 @@ Methods:
 
 - <code title="get /v1/inspect/routes">client.routes.<a href="./src/llama_stack_client/resources/routes.py">list</a>() -> <a href="./src/llama_stack_client/types/route_list_response.py">RouteListResponse</a></code>
 
+# Moderations
+
+Types:
+
+```python
+from llama_stack_client.types import CreateResponse
+```
+
+Methods:
+
+- <code title="post /v1/openai/v1/moderations">client.moderations.<a href="./src/llama_stack_client/resources/moderations.py">create</a>(\*\*<a href="src/llama_stack_client/types/moderation_create_params.py">params</a>) -> <a href="./src/llama_stack_client/types/create_response.py">CreateResponse</a></code>
+
 # Safety
 
 Types:
````
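The new route documented above is `post /v1/openai/v1/moderations`. The sketch below only builds the HTTP request with the standard library and makes no network call; the base URL and the payload fields (`model`, `input`) are assumptions modeled on the OpenAI-compatible moderations shape, not taken from this diff — confirm them against `moderation_create_params` before use:

```python
import json
import urllib.request

# Hypothetical payload; field names are assumptions, not from the diff.
payload = {"input": "some text to classify", "model": "moderation-model-id"}

req = urllib.request.Request(
    "http://localhost:8321/v1/openai/v1/moderations",  # assumed local server URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.get_method(), req.full_url)
```

In practice you would call `client.moderations.create(**params)` from the SDK instead, as the api.md entry shows; the raw request is only meant to make the route and verb concrete.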

pyproject.toml

Lines changed: 3 additions & 4 deletions
```diff
@@ -3,7 +3,7 @@ name = "llama_stack_client"
 version = "0.2.17"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
-license = "Apache-2.0"
+license = "MIT"
 authors = [{ name = "Meta Llama", email = "[email protected]" }]
 dependencies = [
     "httpx>=0.23.0, <1",
@@ -27,13 +27,14 @@ classifiers = [
     "Typing :: Typed",
     "Intended Audience :: Developers",
     "Programming Language :: Python :: 3.12",
+    "Programming Language :: Python :: 3.13",
     "Operating System :: OS Independent",
     "Operating System :: POSIX",
     "Operating System :: MacOS",
     "Operating System :: POSIX :: Linux",
     "Operating System :: Microsoft :: Windows",
     "Topic :: Software Development :: Libraries :: Python Modules",
-    "License :: OSI Approved :: Apache Software License"
+    "License :: OSI Approved :: MIT License"
 ]
 
 [dependency-groups]
@@ -52,8 +53,6 @@ dev = [
 Homepage = "https://github.com/llamastack/llama-stack-client-python"
 Repository = "https://github.com/llamastack/llama-stack-client-python"
 
-
-
 [build-system]
 requires = ["hatchling==1.26.3", "hatch-fancy-pypi-readme"]
 build-backend = "hatchling.build"
```

requirements-dev.lock

Lines changed: 3 additions & 3 deletions
```diff
@@ -48,15 +48,15 @@ filelock==3.12.4
 frozenlist==1.6.2
     # via aiohttp
     # via aiosignal
-h11==0.14.0
+h11==0.16.0
     # via httpcore
-httpcore==1.0.2
+httpcore==1.0.9
     # via httpx
 httpx==0.28.1
     # via httpx-aiohttp
     # via llama-stack-client
     # via respx
-httpx-aiohttp==0.1.6
+httpx-aiohttp==0.1.8
     # via llama-stack-client
 idna==3.4
     # via anyio
```
