llama-stack-client-swift brings the inference and agents APIs of [Llama Stack](https://github.com/meta-llama/llama-stack) to iOS.
**Update: February 18, 2025** The llama-stack-client-swift SDK version has been updated to 0.1.3, working with Llama Stack 0.1.3 ([release note](https://github.com/meta-llama/llama-stack/releases/tag/v0.1.3)).
## Features
2. Add this repo URL at the top right: `https://github.com/meta-llama/llama-stack-client-swift` and 0.1.3 in the Dependency Rule, then click Add Package.
3. Select and add `llama-stack-client-swift` to your app target.
4. On the first build: Enable & Trust the OpenAPIGenerator extension when prompted.
5. The quickest way to try out the demo with remote inference is Together.ai's hosted Llama Stack distro at https://llama-stack.together.ai. You can skip Step 6 unless you want to build your own distro.
6. (Optional) Set up a remote Llama Stack distribution, assuming you have a [Fireworks](https://fireworks.ai/account/api-keys) or [Together](https://api.together.ai/) API key, which you can get easily by clicking the corresponding link:
```
PYPI_VERSION=0.1.3 llama stack build --template together --image-type conda
export TOGETHER_API_KEY="<your_together_api_key>"
llama stack run together
```
The default port for `llama stack run` is 5000; you can specify a different port by appending `--port <your_port>` to `llama stack run fireworks|together`.
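For example, overriding the default port might look like this (the port value 5001 here is just an illustration):

```shell
# Run the Together distro on port 5001 instead of the default 5000
llama stack run together --port 5001
```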
Replace the `RemoteInference` URL string below with the host IP and port of the remote Llama Stack distro from Step 6:
```swift
import LlamaStackClient
let inference = RemoteInference(url: URL(string: "https://llama-stack.together.ai")!)
```
7. Build and Run the iOS demo.
Below is an example code snippet to use the Llama Stack inference API. See the iOS Demos above for complete code.
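As a sketch of what such a snippet can look like, the streaming chat completion call below assumes the SDK's generated OpenAPI schema types (`Components.Schemas.ChatCompletionRequest`, `UserMessage`, and the delta cases); these names are produced by OpenAPIGenerator and may differ between SDK versions, so treat them as illustrative and defer to the demo apps for the exact API:

```swift
import Foundation
import LlamaStackClient

let inference = RemoteInference(url: URL(string: "https://llama-stack.together.ai")!)

Task {
  // Stream a chat completion from the remote distro. The request/response
  // type names below are assumptions based on the generated OpenAPI
  // schemas -- check the iOS demo for the exact names in your SDK version.
  for try await chunk in try await inference.chatCompletion(
    request: Components.Schemas.ChatCompletionRequest(
      messages: [
        .user(Components.Schemas.UserMessage(
          content: .case1("Hello, Llama!"),
          role: .user))
      ],
      model_id: "meta-llama/Llama-3.1-8B-Instruct",
      stream: true)
  ) {
    // Each streamed chunk carries a delta; print the text pieces as they arrive.
    switch chunk.event.delta {
    case .text(let t):
      print(t.text, terminator: "")
    default:
      break
    }
  }
}
```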