
Commit 3716c7d

Upgrade SDK 0.2.14 (#38)
* docs: update README to support the latest llama-stack
* docs: update README to support the latest llama-stack
* docs: formatting
1 parent ce6005c commit 3716c7d

3 files changed: +16 −11 lines changed


README.md

Lines changed: 11 additions & 6 deletions
@@ -5,20 +5,22 @@
 llama-stack-client-swift brings the inference and agents APIs of [Llama Stack](https://github.com/meta-llama/llama-stack) to iOS.
 
 Compatible with:
-- [Llama Stack 0.2.2](https://github.com/meta-llama/llama-stack/releases/tag/v0.2.2)
+
+- [Llama Stack 0.2.14](https://github.com/meta-llama/llama-stack/releases/tag/v0.2.14)
 - [ExecuTorch 0.5.0](https://github.com/pytorch/executorch/releases/tag/v0.5.0)
 
 ## Features
 
 - **Inference & Agents:** Leverage remote Llama Stack distributions for inference, code execution, and safety.
-- **Custom Tool Calling:** Provide Swift tools that Llama agents can understand and use.
+- **Custom Tool Calling:** Provide Swift tools that Llama agents can understand and use.
 
 ## iOS Demos
+
 We have several demo apps to help provide reference for how to use the SDK:
+
 - [iOS Quick Demo](https://github.com/meta-llama/llama-stack-client-swift/tree/latest-release/examples/ios_quick_demo): Uses a remote Llama Stack server for inferencing ([video](https://drive.google.com/file/d/1HnME3VmsYlyeFgsIOMlxZy5c8S2xP4r4/view?usp=sharing)).
 - [iOS Calendar Assistant Demo](https://github.com/meta-llama/llama-stack-client-swift/tree/latest-release/examples/ios_calendar_assistant): Advanced uses of the Llama Stack Agent API and the custom tool calling feature. There are separate projects for remote and local inferencing.
 
-
 ## Installation
 
 1. Click "Xcode > File > Add Package Dependencies...".
@@ -30,23 +32,26 @@ We have several demo apps to help provide reference for how to use the SDK:
 4. On the first build: Enable & Trust the OpenAPIGenerator extension when prompted.
 
 5. The quickest way to try out the demo for remote inference is using Together.ai's Llama Stack distro at https://llama-stack.together.ai - you can skip Step 6 unless you want to build your own distro.
-*Note that Llama 4 is currently only supported by building your own distro from the Llama Stack PIP package or main.*
+_Note that Llama 4 is currently only supported by building your own distro from the Llama Stack PIP package or main._
 
 6. (Optional) Set up a remote Llama Stack distribution, assuming you have a [Fireworks](https://fireworks.ai/account/api-keys) or [Together](https://api.together.ai/) API key, which you can get easily by clicking the link:
 
 ```
-conda create -n llama-stack python=3.10
+conda create -n llama-stack python=3.12
 conda activate llama-stack
-pip install --no-cache llama-stack==0.2.2 llama-models==0.2.0 llama-stack-client==0.2.2
+pip install --no-cache llama-stack==0.2.14 llama-models==0.2.0 llama-stack-client==0.2.14
 ```
 
 Then, either:
+
 ```
 llama stack build --template fireworks --image-type conda
 export FIREWORKS_API_KEY="<your_fireworks_api_key>"
 llama stack run fireworks
 ```
+
 or
+
 ```
 llama stack build --template together --image-type conda
 export TOGETHER_API_KEY="<your_together_api_key>"
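
For teams that manage dependencies in a `Package.swift` manifest rather than through Xcode's "Add Package Dependencies..." dialog, the equivalent declaration looks roughly like the sketch below. The repository URL is this repo; the `LlamaStackClient` product name, the `0.2.14` version pin, and the iOS platform floor are assumptions inferred from this commit, so verify them against the package manifest and release tags.

```swift
// swift-tools-version: 5.9
// Minimal sketch of adding the SDK via SwiftPM instead of the Xcode UI.
// Assumed: product/module name "LlamaStackClient", version 0.2.14, iOS 16 floor.
import PackageDescription

let package = Package(
    name: "MyLlamaStackApp",
    platforms: [.iOS(.v16)],
    dependencies: [
        // Pin to the release that matches the server-side Llama Stack version.
        .package(url: "https://github.com/meta-llama/llama-stack-client-swift", from: "0.2.14"),
    ],
    targets: [
        .target(
            name: "MyLlamaStackApp",
            dependencies: [
                .product(name: "LlamaStackClient", package: "llama-stack-client-swift"),
            ]
        ),
    ]
)
```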

examples/ios_calendar_assistant/README.md

Lines changed: 3 additions & 3 deletions
@@ -17,9 +17,9 @@ The quickest way to try out the demo for remote inference is using Together.ai's
 You need to set up a remote Llama Stack distribution to run this demo. Assuming you have a [Fireworks](https://fireworks.ai/account/api-keys) or [Together](https://api.together.ai/) API key, which you can get easily by clicking the link above:
 
 ```
-conda create -n llama-stack python=3.10
+conda create -n llama-stack python=3.12
 conda activate llama-stack
-pip install --no-cache llama-stack==0.2.1 llama-models==0.2.0 llama-stack-client==0.2.1
+pip install --no-cache llama-stack==0.2.14 llama-models==0.2.0 llama-stack-client==0.2.14
 ```
 
 Then, either:
@@ -57,7 +57,7 @@ let agent = RemoteAgents(url: URL(string: "https://localhost:5000")!)
 
 Also, to allow the app to add events to the Calendar app, the `Info.plist` needs to have an entry `Privacy - Calendars Usage Description`, and when running the app for the first time, you need to accept the Calendar access request.
 
-4. Build and run the app on an iOS simulator or your device.
+4. Build and run the app on an iOS simulator or your device.
 
 Note: For your first-time build, you may need to Enable and Trust the OpenAPI Generator plugin. A link to enable it will be available in the logs. You may need to do a clean build, or close Xcode and restart it, to avoid cache issues. Otherwise, you may see "Bad Access to URLSession" errors during inference.
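
The Calendar permission step above is standard EventKit behavior rather than anything specific to this SDK. As a rough sketch of what the `Privacy - Calendars Usage Description` entry unlocks (illustrative only, not taken from the demo's source), the access request and event creation look like this:

```swift
import EventKit

// The access prompt shown on first run is backed by the
// "Privacy - Calendars Usage Description" Info.plist entry.
let store = EKEventStore()
store.requestAccess(to: .event) { granted, _ in
    guard granted else { return }

    // Once access is granted, the assistant can write events to the Calendar app.
    let event = EKEvent(eventStore: store)
    event.title = "Team sync"
    event.startDate = Date()
    event.endDate = Date().addingTimeInterval(30 * 60) // 30 minutes
    event.calendar = store.defaultCalendarForNewEvents
    try? store.save(event, span: .thisEvent)
}
```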

examples/ios_quick_demo/README.md

Lines changed: 2 additions & 2 deletions
@@ -24,9 +24,9 @@ llama stack run --image-type conda ~/.llama/distributions/together/together-run.
 To use PIP packages, you need to set up a remote Llama Stack distribution to run this demo. Assuming you have a [Fireworks](https://fireworks.ai/account/api-keys) or [Together](https://api.together.ai/) API key, which you can get easily by clicking the link above:
 
 ```
-conda create -n llama-stack python=3.10
+conda create -n llama-stack python=3.12
 conda activate llama-stack
-pip install --no-cache llama-stack==0.2.2 llama-models==0.2.0 llama-stack-client==0.2.2
+pip install --no-cache llama-stack==0.2.14 llama-models==0.2.0 llama-stack-client==0.2.14
 ```
 
 Then, either:
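
Once a distribution is reachable, whether the hosted Together.ai distro or one built locally as above, the iOS side only needs the server URL. A minimal sketch follows, using the `RemoteAgents(url:)` initializer quoted from the calendar assistant README earlier; the `LlamaStackClient` module name is an assumption, and the actual inference and agent-turn calls live in the demo app sources rather than in this diff:

```swift
import Foundation
import LlamaStackClient // assumed module name; check the package's products

// Hosted Together.ai distribution (lets you skip the conda/pip setup above).
let agents = RemoteAgents(url: URL(string: "https://llama-stack.together.ai")!)

// Self-hosted distribution started with `llama stack run ...`, as in the README:
// let localAgents = RemoteAgents(url: URL(string: "https://localhost:5000")!)

// From here, follow the iOS Quick Demo and Calendar Assistant projects for the
// chat completion and agent turn calls; those are not part of this diff.
```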
