This repository was archived by the owner on Aug 14, 2025. It is now read-only.

Commit 41646f3

chore(internal): codegen related update
1 parent: ad42de8

File tree

3 files changed: 5 additions, 2 deletions

.release-please-manifest.json

Lines changed: 1 addition & 1 deletion
@@ -1,3 +1,3 @@
 {
-  ".": "0.1.0-alpha.4"
+  ".": "0.2.13"
 }
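
For context: release-please reads this manifest to track the current released version for each package path in the repository, with `"."` denoting the repository root. A sketch of what a manifest for a hypothetical multi-package repo could look like (the extra path and version are illustrative, not from this repo):

```json
{
  ".": "0.2.13",
  "packages/extra-sdk": "1.4.0"
}
```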

README.md

Lines changed: 3 additions & 0 deletions
@@ -23,6 +23,9 @@ You can find more example apps with client SDKs to talk with the Llama Stack ser
 pip install llama-stack-client
 ```
 
+> [!NOTE]
+> Once this package is [published to PyPI](https://www.stainless.com/docs/guides/publish), this will become: `pip install llama_stack_client`
+
 ## Usage
 
 The full API of this library can be found in [api.md](api.md). You may find basic client examples in our [llama-stack-apps](https://github.com/meta-llama/llama-stack-apps/tree/main) repo.

scripts/utils/upload-artifact.sh

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ UPLOAD_RESPONSE=$(tar -cz . | curl -v -X PUT \
 
 if echo "$UPLOAD_RESPONSE" | grep -q "HTTP/[0-9.]* 200"; then
   echo -e "\033[32mUploaded build to Stainless storage.\033[0m"
-  echo -e "\033[32mInstallation: pip install --pre 'https://pkg.stainless.com/s/llama-stack-client-python/$SHA'\033[0m"
+  echo -e "\033[32mInstallation: pip install 'https://pkg.stainless.com/s/llama-stack-client-python/$SHA'\033[0m"
 else
   echo -e "\033[31mFailed to upload artifact.\033[0m"
   exit 1
0 commit comments
