
Commit e89d303

Update READMEs to 0.1.7

1 parent 931f17e commit e89d303

File tree

3 files changed: +15 -13 lines changed


README.md

Lines changed: 9 additions & 7 deletions
@@ -4,17 +4,19 @@
 
 llama-stack-client-swift brings the inference and agents APIs of [Llama Stack](https://github.com/meta-llama/llama-stack) to iOS.
 
-**Update: March 14, 2025** The llama-stack-client-swift SDK version has been updated to 0.1.4.1, working with Llama Stack 0.1.4 ([release note](https://github.com/meta-llama/llama-stack/releases/tag/v0.1.4)) and ExecuTorch 0.5.0 ([release note](https://github.com/pytorch/executorch/releases/tag/v0.5.0)). Also the demo apps have been moved to this repository under `examples`.
+Compatibility with:
+- [Llama Stack 0.1.7](https://github.com/meta-llama/llama-stack/releases/tag/v0.1.7)
+- [ExecuTorch 0.5.0](https://github.com/pytorch/executorch/releases/tag/v0.5.0)
 
 ## Features
 
 - **Inference & Agents:** Leverage remote Llama Stack distributions for inference, code execution, and safety.
 - **Custom Tool Calling:** Provide Swift tools that Llama agents can understand and use.
 
 ## iOS Demos
-See [here](https://github.com/meta-llama/llama-stack-apps/tree/main/examples/ios_quick_demo) for a quick iOS demo ([video](https://drive.google.com/file/d/1HnME3VmsYlyeFgsIOMlxZy5c8S2xP4r4/view?usp=sharing)) using a remote Llama Stack server for inferencing.
-
-For a more advanced demo using the Llama Stack Agent API and custom tool calling feature, see the [iOS Calendar Assistant demo](https://github.com/meta-llama/llama-stack-apps/tree/main/examples/ios_calendar_assistant).
+We have several demo apps to help provide reference for how to use the SDK:
+- [iOS Quick Demo](https://github.com/meta-llama/llama-stack-client-swift/tree/latest-release/examples/ios_quick_demo): Uses remote Llama Stack server for inferencing ([video](https://drive.google.com/file/d/1HnME3VmsYlyeFgsIOMlxZy5c8S2xP4r4/view?usp=sharing)).
+- [iOS Calendar Assistant Demo](https://github.com/meta-llama/llama-stack-client-swift/tree/latest-release/examples/ios_calendar_assistant): Advanced uses of Llama Stack Agent API and custom tool calling feature. There are separate projects for remote and local inferencing.
 
 
 ## Installation
@@ -34,18 +36,18 @@ For a more advanced demo using the Llama Stack Agent API and custom tool calling
 ```
 conda create -n llama-stack python=3.10
 conda activate llama-stack
-pip install --no-cache llama-stack==0.1.4 llama-models==0.1.4 llama-stack-client==0.1.4
+pip install --no-cache llama-stack==0.1.7 llama-models==0.1.7 llama-stack-client==0.1.7
 ```
 
 Then, either:
 ```
-PYPI_VERSION=0.1.4 llama stack build --template fireworks --image-type conda
+PYPI_VERSION=0.1.7 llama stack build --template fireworks --image-type conda
 export FIREWORKS_API_KEY="<your_fireworks_api_key>"
 llama stack run fireworks
 ```
 or
 ```
-PYPI_VERSION=0.1.4 llama stack build --template together --image-type conda
+PYPI_VERSION=0.1.7 llama stack build --template together --image-type conda
 export TOGETHER_API_KEY="<your_together_api_key>"
 llama stack run together
 ```
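Taken together, the updated commands in this diff amount to the following end-to-end setup at 0.1.7. This is a sketch of the fireworks variant only; factoring the pinned version into one shell variable is a convention of this sketch, not of the README, and the API key is a placeholder.

```shell
# End-to-end Llama Stack server setup at 0.1.7, consolidating the README steps.
# Assumes conda is installed and initialized for this shell.
LLAMA_STACK_VERSION=0.1.7

conda create -n llama-stack python=3.10
conda activate llama-stack

# Pin all three packages to the same release, as the README does.
pip install --no-cache \
  "llama-stack==${LLAMA_STACK_VERSION}" \
  "llama-models==${LLAMA_STACK_VERSION}" \
  "llama-stack-client==${LLAMA_STACK_VERSION}"

# Build and run the fireworks distribution; the together template is analogous,
# swapping the template name and TOGETHER_API_KEY.
PYPI_VERSION="${LLAMA_STACK_VERSION}" llama stack build --template fireworks --image-type conda
export FIREWORKS_API_KEY="<your_fireworks_api_key>"
llama stack run fireworks
```

Keeping the pip pins and PYPI_VERSION in lockstep avoids the client/server version skew this commit is correcting.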

examples/ios_calendar_assistant/README.md

Lines changed: 3 additions & 3 deletions
@@ -19,18 +19,18 @@ You need to set up a remote Llama Stack distributions to run this demo. Assuming
 ```
 conda create -n llama-stack python=3.10
 conda activate llama-stack
-pip install --no-cache llama-stack==0.1.4 llama-models==0.1.4 llama-stack-client==0.1.4
+pip install --no-cache llama-stack==0.1.7 llama-models==0.1.7 llama-stack-client==0.1.7
 ```
 
 Then, either:
 ```
-PYPI_VERSION=0.1.4 llama stack build --template together --image-type conda
+PYPI_VERSION=0.1.7 llama stack build --template together --image-type conda
 export TOGETHER_API_KEY="<your_together_api_key>"
 llama stack run together
 ```
 or
 ```
-PYPI_VERSION=0.1.4 llama stack build --template fireworks --image-type conda
+PYPI_VERSION=0.1.7 llama stack build --template fireworks --image-type conda
 export FIREWORKS_API_KEY="<your_fireworks_api_key>"
 llama stack run fireworks
 ```

examples/ios_quick_demo/README.md

Lines changed: 3 additions & 3 deletions
@@ -16,18 +16,18 @@ You need to set up a remote Llama Stack distributions to run this demo. Assuming
 ```
 conda create -n llama-stack python=3.10
 conda activate llama-stack
-pip install --no-cache llama-stack==0.1.4 llama-models==0.1.4 llama-stack-client==0.1.4
+pip install --no-cache llama-stack==0.1.7 llama-models==0.1.7 llama-stack-client==0.1.7
 ```
 
 Then, either:
 ```
-PYPI_VERSION=0.1.4 llama stack build --template together --image-type conda
+PYPI_VERSION=0.1.7 llama stack build --template together --image-type conda
 export TOGETHER_API_KEY="<your_together_api_key>"
 llama stack run together
 ```
 or
 ```
-PYPI_VERSION=0.1.4 llama stack build --template fireworks --image-type conda
+PYPI_VERSION=0.1.7 llama stack build --template fireworks --image-type conda
 export FIREWORKS_API_KEY="<your_fireworks_api_key>"
 llama stack run fireworks
 ```

0 commit comments
