release: 0.1.3 #20

Merged
47 changes: 36 additions & 11 deletions .github/workflows/ci.yml
@@ -1,18 +1,23 @@
name: CI
on:
push:
branches:
- main
branches-ignore:
- 'generated'
- 'codegen/**'
- 'integrated/**'
- 'stl-preview-head/**'
- 'stl-preview-base/**'
pull_request:
branches:
- main
- next
branches-ignore:
- 'stl-preview-head/**'
- 'stl-preview-base/**'

jobs:
lint:
timeout-minutes: 10
name: lint
runs-on: ubuntu-latest

runs-on: ${{ github.repository == 'stainless-sdks/llama-api-typescript' && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
steps:
- uses: actions/checkout@v4

@@ -28,9 +33,13 @@ jobs:
run: ./scripts/lint

build:
timeout-minutes: 5
name: build
runs-on: ubuntu-latest

runs-on: ${{ github.repository == 'stainless-sdks/llama-api-typescript' && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
permissions:
contents: read
id-token: write
steps:
- uses: actions/checkout@v4

@@ -44,10 +53,26 @@

- name: Check build
run: ./scripts/build

- name: Get GitHub OIDC Token
if: github.repository == 'stainless-sdks/llama-api-typescript'
id: github-oidc
uses: actions/github-script@v6
with:
script: core.setOutput('github_token', await core.getIDToken());

- name: Upload tarball
if: github.repository == 'stainless-sdks/llama-api-typescript'
env:
URL: https://pkg.stainless.com/s
AUTH: ${{ steps.github-oidc.outputs.github_token }}
SHA: ${{ github.sha }}
run: ./scripts/utils/upload-artifact.sh
test:
timeout-minutes: 10
name: test
runs-on: ubuntu-latest

runs-on: ${{ github.repository == 'stainless-sdks/llama-api-typescript' && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
if: github.event_name == 'push' || github.event.pull_request.head.repo.fork
steps:
- uses: actions/checkout@v4

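The `runs-on` lines in this workflow use GitHub Actions' `&&`/`||` short-circuit expression as a ternary: pushes in the `stainless-sdks` mirror get a Depot runner, everything else falls back to `ubuntu-latest`. A minimal bash sketch of that selection logic (repository names are taken from the diff; the helper name is ours):

```bash
#!/usr/bin/env bash
# Mirrors the workflow expression:
#   ${{ github.repository == 'stainless-sdks/llama-api-typescript'
#       && 'depot-ubuntu-24.04' || 'ubuntu-latest' }}
select_runner() {
  local repository="$1"
  if [[ "$repository" == "stainless-sdks/llama-api-typescript" ]]; then
    echo "depot-ubuntu-24.04"
  else
    echo "ubuntu-latest"
  fi
}

select_runner "stainless-sdks/llama-api-typescript"  # depot-ubuntu-24.04
select_runner "meta-llama/llama-api-typescript"      # ubuntu-latest
```

The same short-circuit idiom gates the jobs themselves: `if: github.event_name == 'push' || github.event.pull_request.head.repo.fork` skips same-repo pull requests, whose commits are already built by the push trigger.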
20 changes: 20 additions & 0 deletions .github/workflows/release-doctor.yml
@@ -0,0 +1,20 @@
name: Release Doctor
on:
pull_request:
branches:
- main
workflow_dispatch:

jobs:
release_doctor:
name: release doctor
runs-on: ubuntu-latest
if: github.repository == 'meta-llama/llama-api-typescript' && (github.event_name == 'push' || github.event_name == 'workflow_dispatch' || startsWith(github.head_ref, 'release-please') || github.head_ref == 'next')

steps:
- uses: actions/checkout@v4

- name: Check release environment
run: |
bash ./bin/check-release-environment
env:
1 change: 0 additions & 1 deletion .gitignore
@@ -1,4 +1,3 @@
*.watchman-cookie*
.prism.log
node_modules
yarn-error.log
3 changes: 3 additions & 0 deletions .release-please-manifest.json
@@ -0,0 +1,3 @@
{
".": "0.1.3"
}
8 changes: 4 additions & 4 deletions .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 3
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/meta%2Fllama-api-7ff4fcc96829051ab3c627d2a6eb15eae95e2faeceb6af5d69e91c5d01dc8781.yml
openapi_spec_hash: 10455d7d46f2f4c88d427cc7b0a4ebee
config_hash: bc2644b02f8a40dc1cfbb206af1f623d
configured_endpoints: 4
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/meta%2Fllama-api-bfa0267b010dcc4b39e62dfbd698ac6f9421f3212c44b3408b9b154bd6c67a8b.yml
openapi_spec_hash: 7f424537bc7ea7638e3934ef721b8d71
config_hash: 06885267c7cf939a875a5d64fc5103c2
29 changes: 29 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,29 @@
# Changelog

## 0.1.3 (2025-08-12)

Full Changelog: [v0.1.2...v0.1.3](https://github.com/meta-llama/llama-api-typescript/compare/v0.1.2...v0.1.3)

### Bug Fixes

* **api:** remove chat completion request model ([0f430fd](https://github.com/meta-llama/llama-api-typescript/commit/0f430fdbe2d9c6fb632c04b00426a18be3eb896b))


### Chores

* add examples ([14264be](https://github.com/meta-llama/llama-api-typescript/commit/14264be48f53f82bff1f20c915f8e271995d8da3))
* **internal:** codegen related update ([d214c13](https://github.com/meta-llama/llama-api-typescript/commit/d214c13dfeaf5e828529dac2cd71fcd7a00abc5f))
* **internal:** move publish config ([89f5ba7](https://github.com/meta-llama/llama-api-typescript/commit/89f5ba7ea74b934de457ffb78f609803ecaae576))
* **internal:** remove redundant imports config ([dcc16cf](https://github.com/meta-llama/llama-api-typescript/commit/dcc16cfbe81c0b00ec68f2f236aa90ac9d115e25))
* **internal:** update comment in script ([4b6a798](https://github.com/meta-llama/llama-api-typescript/commit/4b6a798a07511dd7d2d505a95b16a7b50b943ede))
* make some internal functions async ([03b22bf](https://github.com/meta-llama/llama-api-typescript/commit/03b22bf016daaf76d74b81ff9886be9053844476))
* sync repo ([fbc5e39](https://github.com/meta-llama/llama-api-typescript/commit/fbc5e39ababdd87df5179d2c1c3124aa6ce2ac9c))
* **ts:** reorder package.json imports ([9ab7974](https://github.com/meta-llama/llama-api-typescript/commit/9ab797439fbd8847ab24fa8669e47dc1a704de51))
* update @stainless-api/prism-cli to v5.15.0 ([a0b275e](https://github.com/meta-llama/llama-api-typescript/commit/a0b275e339427ff8e9fb7b2cce03dfcafaf73286))
* update SDK settings ([742ce20](https://github.com/meta-llama/llama-api-typescript/commit/742ce2097615cb43e8e15733a0622c0582122ddd))


### Documentation

* code of conduct ([051994b](https://github.com/meta-llama/llama-api-typescript/commit/051994bbedf0f92ba127c93a7adc2015b193f824))
* license ([822b89e](https://github.com/meta-llama/llama-api-typescript/commit/822b89efc5e796b9f9031d345d5104247baca864))
79 changes: 34 additions & 45 deletions README.md
@@ -1,10 +1,12 @@
# Llama API Client TypeScript API Library

[![NPM version](https://img.shields.io/npm/v/llama-api-client.svg)](https://npmjs.org/package/llama-api-client) ![npm bundle size](https://img.shields.io/bundlephobia/minzip/llama-api-client)
[![NPM version](<https://img.shields.io/npm/v/llama-api-client.svg?label=npm%20(stable)>)](https://npmjs.org/package/llama-api-client) ![npm bundle size](https://img.shields.io/bundlephobia/minzip/llama-api-client)

This library provides convenient access to the Llama API Client REST API from server-side TypeScript or JavaScript.

The REST API documentation can be found on [https://llama.developer.meta.com/docs](https://llama.developer.meta.com/docs). The full API of this library can be found in [api.md](api.md).
The REST API documentation can be found on [llama.developer.meta.com](https://llama.developer.meta.com/docs). The full API of this library can be found in [api.md](api.md).

It is generated with [Stainless](https://www.stainless.com/).

## Installation

@@ -24,16 +26,12 @@ const client = new LlamaAPIClient({
apiKey: process.env['LLAMA_API_KEY'], // This is the default and can be omitted
});

async function main() {
const createChatCompletionResponse = await client.chat.completions.create({
messages: [{ content: 'string', role: 'user' }],
model: 'model',
});

console.log(createChatCompletionResponse.completion_message);
}
const createChatCompletionResponse = await client.chat.completions.create({
messages: [{ content: 'string', role: 'user' }],
model: 'model',
});

main();
console.log(createChatCompletionResponse.completion_message);
```

## Streaming responses
@@ -50,8 +48,8 @@ const stream = await client.chat.completions.create({
model: 'model',
stream: true,
});
for await (const chunk of stream) {
console.log(chunk);
for await (const createChatCompletionResponseStreamChunk of stream) {
console.log(createChatCompletionResponseStreamChunk);
}
```

@@ -70,16 +68,12 @@ const client = new LlamaAPIClient({
apiKey: process.env['LLAMA_API_KEY'], // This is the default and can be omitted
});

async function main() {
const params: LlamaAPIClient.Chat.CompletionCreateParams = {
messages: [{ content: 'string', role: 'user' }],
model: 'model',
};
const createChatCompletionResponse: LlamaAPIClient.CreateChatCompletionResponse =
await client.chat.completions.create(params);
}

main();
const params: LlamaAPIClient.Chat.CompletionCreateParams = {
messages: [{ content: 'string', role: 'user' }],
model: 'model',
};
const createChatCompletionResponse: LlamaAPIClient.CreateChatCompletionResponse =
await client.chat.completions.create(params);
```

Documentation for each method, request param, and response field is available in docstrings and will appear on hover in most modern editors.
@@ -92,24 +86,20 @@ a subclass of `APIError` will be thrown:

<!-- prettier-ignore -->
```ts
async function main() {
const createChatCompletionResponse = await client.chat.completions
.create({ messages: [{ content: 'string', role: 'user' }], model: 'model' })
.catch(async (err) => {
if (err instanceof LlamaAPIClient.APIError) {
console.log(err.status); // 400
console.log(err.name); // BadRequestError
console.log(err.headers); // {server: 'nginx', ...}
} else {
throw err;
}
});
}

main();
const createChatCompletionResponse = await client.chat.completions
.create({ messages: [{ content: 'string', role: 'user' }], model: 'model' })
.catch(async (err) => {
if (err instanceof LlamaAPIClient.APIError) {
console.log(err.status); // 400
console.log(err.name); // BadRequestError
console.log(err.headers); // {server: 'nginx', ...}
} else {
throw err;
}
});
```

Error codes are as followed:
Error codes are as follows:

| Status Code | Error Type |
| ----------- | -------------------------- |
@@ -188,7 +178,7 @@ const { data: createChatCompletionResponse, response: raw } = await client.chat.
.create({ messages: [{ content: 'string', role: 'user' }], model: 'model' })
.withResponse();
console.log(raw.headers.get('X-My-Header'));
console.log(createChatCompletionResponse.completion_message);
console.log(createChatCompletionResponse.id);
```

### Logging
@@ -268,9 +258,8 @@ parameter. This library doesn't validate at runtime that the request matches the
send will be sent as-is.

```ts
client.foo.create({
foo: 'my_param',
bar: 12,
client.chat.completions.create({
// ...
// @ts-expect-error baz is not yet public
baz: 'undocumented option',
});
@@ -388,7 +377,7 @@ TypeScript >= 4.9 is supported.
The following runtimes are supported:

- Web browsers (Up-to-date Chrome, Firefox, Safari, Edge, and more)
- Node.js 18 LTS or later ([non-EOL](https://endoflife.date/nodejs)) versions.
- Node.js 20 LTS or later ([non-EOL](https://endoflife.date/nodejs)) versions.
- Deno v1.28.0 or higher.
- Bun 1.0 or later.
- Cloudflare Workers.
@@ -406,4 +395,4 @@ See [the contributing documentation](./CONTRIBUTING.md).

## License

Llama API Typescript SDK is MIT licensed, as found in the LICENSE file.
Llama API Typescript SDK is MIT licensed, as found in the LICENSE file.
1 change: 0 additions & 1 deletion api.md
@@ -3,7 +3,6 @@
Types:

- <code><a href="./src/resources/chat/chat.ts">CompletionMessage</a></code>
- <code><a href="./src/resources/chat/chat.ts">CreateChatCompletionRequest</a></code>
- <code><a href="./src/resources/chat/chat.ts">CreateChatCompletionResponse</a></code>
- <code><a href="./src/resources/chat/chat.ts">CreateChatCompletionResponseStreamChunk</a></code>
- <code><a href="./src/resources/chat/chat.ts">Message</a></code>
18 changes: 18 additions & 0 deletions bin/check-release-environment
@@ -0,0 +1,18 @@
#!/usr/bin/env bash

errors=()

lenErrors=${#errors[@]}

if [[ lenErrors -gt 0 ]]; then
echo -e "Found the following errors in the release environment:\n"

for error in "${errors[@]}"; do
echo -e "- $error\n"
done

exit 1
fi

echo "The environment is ready to push releases!"
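The script above establishes the standard release-doctor shape: accumulate human-readable problems in an `errors` array, print them all at once, and exit nonzero. The array ships empty in this PR; the sketch below adds one hypothetical check (the `NPM_TOKEN` secret name is our assumption, not part of the diff) to show how checks slot in:

```bash
#!/usr/bin/env bash
# Same pattern as bin/check-release-environment, with one example check added.
check_release_environment() {
  local errors=()

  # Hypothetical check -- the script in this PR ships with an empty array.
  if [[ -z "${NPM_TOKEN:-}" ]]; then
    errors+=("The NPM_TOKEN secret has not been set.")
  fi

  if (( ${#errors[@]} > 0 )); then
    echo -e "Found the following errors in the release environment:\n"
    for error in "${errors[@]}"; do
      echo -e "- $error\n"
    done
    return 1
  fi

  echo "The environment is ready to push releases!"
}
```

Collecting every failure before reporting lets a release manager fix all missing secrets in one pass instead of rerunning the doctor after each fix.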

54 changes: 45 additions & 9 deletions bin/publish-npm
@@ -4,22 +4,58 @@ set -eux

npm config set '//registry.npmjs.org/:_authToken' "$NPM_TOKEN"

# Build the project
yarn build

# Navigate to the dist directory
cd dist

# Get the version from package.json
VERSION="$(node -p "require('./package.json').version")"
# Get package name and version from package.json
PACKAGE_NAME="$(jq -r -e '.name' ./package.json)"
VERSION="$(jq -r -e '.version' ./package.json)"

# Get latest version from npm
#
# If the package doesn't exist, npm will return:
# {
# "error": {
# "code": "E404",
# "summary": "Unpublished on 2025-06-05T09:54:53.528Z",
# "detail": "'the_package' is not in this registry..."
# }
# }
NPM_INFO="$(npm view "$PACKAGE_NAME" version --json 2>/dev/null || true)"

# Check if we got an E404 error
if echo "$NPM_INFO" | jq -e '.error.code == "E404"' > /dev/null 2>&1; then
# Package doesn't exist yet, no last version
LAST_VERSION=""
elif echo "$NPM_INFO" | jq -e '.error' > /dev/null 2>&1; then
# Report other errors
echo "ERROR: npm returned unexpected data:"
echo "$NPM_INFO"
exit 1
else
# Success - get the version
LAST_VERSION=$(echo "$NPM_INFO" | jq -r '.') # strip quotes
fi

# Extract the pre-release tag if it exists
# Check if current version is pre-release (e.g. alpha / beta / rc)
CURRENT_IS_PRERELEASE=false
if [[ "$VERSION" =~ -([a-zA-Z]+) ]]; then
# Extract the part before any dot in the pre-release identifier
TAG="${BASH_REMATCH[1]}"
CURRENT_IS_PRERELEASE=true
CURRENT_TAG="${BASH_REMATCH[1]}"
fi

# Check if last version is a stable release
LAST_IS_STABLE_RELEASE=true
if [[ -z "$LAST_VERSION" || "$LAST_VERSION" =~ -([a-zA-Z]+) ]]; then
LAST_IS_STABLE_RELEASE=false
fi

# Use a corresponding alpha/beta tag if there already is a stable release and we're publishing a prerelease.
if $CURRENT_IS_PRERELEASE && $LAST_IS_STABLE_RELEASE; then
TAG="$CURRENT_TAG"
else
TAG="latest"
fi

# Publish with the appropriate tag
yarn publish --access public --tag "$TAG"
yarn publish --tag "$TAG"
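The new tag logic boils down to one decision: publish a pre-release (anything with an alphabetic suffix such as `-alpha`/`-beta`/`-rc`) under its own dist-tag only when a stable version is already on npm; otherwise — including a first-ever publish — use `latest`. A self-contained bash sketch of that decision (the function name is ours; the regexes match the script above):

```bash
#!/usr/bin/env bash
# Distills the dist-tag selection from bin/publish-npm.
#   $1: version being published, $2: latest version on npm ("" if none)
select_npm_tag() {
  local version="$1" last_version="$2"
  local current_is_prerelease=false current_tag=""

  # Pre-release if the version carries an alphabetic suffix, e.g. 1.2.3-beta.1
  if [[ "$version" =~ -([a-zA-Z]+) ]]; then
    current_is_prerelease=true
    current_tag="${BASH_REMATCH[1]}"
  fi

  # The last published version is stable if it exists and has no such suffix.
  local last_is_stable_release=true
  if [[ -z "$last_version" || "$last_version" =~ -([a-zA-Z]+) ]]; then
    last_is_stable_release=false
  fi

  if $current_is_prerelease && $last_is_stable_release; then
    echo "$current_tag"
  else
    echo "latest"
  fi
}

select_npm_tag "1.0.0-beta.2" "0.9.0"  # beta
select_npm_tag "1.0.0" "0.9.0"         # latest
select_npm_tag "0.1.0-alpha.1" ""      # latest (nothing published yet)
```

This keeps a first `0.1.0-alpha.1` publish installable via plain `npm install`, while later pre-releases never clobber the `latest` tag that stable users resolve.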