9 changes: 9 additions & 0 deletions .changeset/two-bikes-kneel.md
@@ -0,0 +1,9 @@
---
'@tanstack/ai-anthropic': minor
'@tanstack/ai-gemini': minor
'@tanstack/ai-ollama': minor
'@tanstack/ai-openai': minor
'@tanstack/ai': minor
---

Split up adapters into separate functionalities for better tree shaking
26 changes: 26 additions & 0 deletions README.md
@@ -38,14 +38,40 @@
A powerful, type-safe AI SDK for building AI-powered applications.

- Provider-agnostic adapters (OpenAI, Anthropic, Gemini, Ollama, etc.)
- **Tree-shakeable adapters** - Import only what you need for smaller bundles
- **Multimodal content support** - Send images, audio, video, and documents
- **Image generation** - Generate images with OpenAI DALL-E/GPT-Image and Gemini Imagen
- Chat completion, streaming, and agent loop strategies
- Headless chat state management with adapters (SSE, HTTP stream, custom)
- Isomorphic type-safe tools with server/client execution
- **Enhanced integration with TanStack Start** - Share implementations between AI tools and server functions

### <a href="https://tanstack.com/ai">Read the docs →</a>

## Tree-Shakeable Adapters

Import only the functionality you need for smaller bundle sizes:

```typescript
// Only chat functionality - no embedding or summarization code bundled
import { openaiText } from '@tanstack/ai-openai/adapters'
import { generate } from '@tanstack/ai'

const textAdapter = openaiText()

const result = generate({
  adapter: textAdapter,
  model: 'gpt-4o',
  messages: [{ role: 'user', content: [{ type: 'text', content: 'Hello!' }] }],
})

for await (const chunk of result) {
  console.log(chunk)
}
```

Available adapters: `openaiText`, `openaiEmbed`, `openaiSummarize`, `anthropicText`, `geminiText`, `ollamaText`, and more.
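
The same pattern works for any provider. A minimal sketch, assuming the Anthropic package exposes a matching `/adapters` entry point (the exact import path may differ; check the adapter docs):

```typescript
// Only Anthropic chat functionality - embedding and summarization code stays out of the bundle
import { anthropicText } from '@tanstack/ai-anthropic/adapters' // import path assumed for illustration
import { generate } from '@tanstack/ai'

const adapter = anthropicText()

const result = generate({
  adapter,
  model: 'claude-sonnet-4-5-20250929',
  messages: [{ role: 'user', content: [{ type: 'text', content: 'Hello!' }] }],
})

for await (const chunk of result) {
  console.log(chunk)
}
```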

## Bonus: TanStack Start Integration

TanStack AI works with **any** framework (Next.js, Express, Remix, etc.).
182 changes: 122 additions & 60 deletions docs/adapters/anthropic.md
@@ -1,9 +1,9 @@
---
title: Anthropic Adapter
slug: /adapters/anthropic
id: anthropic-adapter
---

The Anthropic adapter provides access to Claude models, including Claude 3.5 Sonnet, Claude 3 Opus, and more.
The Anthropic adapter provides access to Claude models, including Claude Sonnet 4.5, Claude Opus 4.5, and more.

## Installation

@@ -14,63 +14,72 @@ npm install @tanstack/ai-anthropic
## Basic Usage

```typescript
import { chat } from "@tanstack/ai";
import { anthropic } from "@tanstack/ai-anthropic";
import { ai } from "@tanstack/ai";
import { anthropicText } from "@tanstack/ai-anthropic";

const adapter = anthropic();
const adapter = anthropicText();

const stream = chat({
const stream = ai({
  adapter,
  messages: [{ role: "user", content: "Hello!" }],
  model: "claude-3-5-sonnet-20241022",
  model: "claude-sonnet-4-5-20250929",
});
```

## Basic Usage - Custom API Key

```typescript
import { chat } from "@tanstack/ai";
import { createAnthropic } from "@tanstack/ai-anthropic";
import { ai } from "@tanstack/ai";
import { createAnthropicText } from "@tanstack/ai-anthropic";

const adapter = createAnthropic(process.env.ANTHROPIC_API_KEY, {
const adapter = createAnthropicText(process.env.ANTHROPIC_API_KEY!, {
  // ... your config options
});
});

const stream = chat({
const stream = ai({
  adapter,
  messages: [{ role: "user", content: "Hello!" }],
  model: "claude-3-5-sonnet-20241022",
  model: "claude-sonnet-4-5-20250929",
});
```

## Configuration

```typescript
import { anthropic, type AnthropicConfig } from "@tanstack/ai-anthropic";
import { createAnthropicText, type AnthropicTextConfig } from "@tanstack/ai-anthropic";

const config: AnthropicConfig = {
  // ... your config options
const config: AnthropicTextConfig = {
  baseURL: "https://api.anthropic.com", // Optional, for custom endpoints
};

const adapter = anthropic(config);
const adapter = createAnthropicText(process.env.ANTHROPIC_API_KEY!, config);
```


## Available Models

### Chat Models

- `claude-sonnet-4-5-20250929` - Claude Sonnet 4.5 (balanced)
- `claude-opus-4-5-20251101` - Claude Opus 4.5 (most capable)
- `claude-haiku-4-0-20250514` - Claude Haiku 4.0 (fastest)
- `claude-3-5-sonnet-20241022` - Claude 3.5 Sonnet
- `claude-3-opus-20240229` - Claude 3 Opus

## Example: Chat Completion

```typescript
import { chat, toStreamResponse } from "@tanstack/ai";
import { anthropic } from "@tanstack/ai-anthropic";
import { ai, toStreamResponse } from "@tanstack/ai";
import { anthropicText } from "@tanstack/ai-anthropic";

const adapter = anthropic();
const adapter = anthropicText();

export async function POST(request: Request) {
  const { messages } = await request.json();

  const stream = chat({
  const stream = ai({
    adapter,
    messages,
    model: "claude-3-5-sonnet-20241022",
    model: "claude-sonnet-4-5-20250929",
  });

  return toStreamResponse(stream);
@@ -80,11 +89,11 @@ export async function POST(request: Request) {
## Example: With Tools

```typescript
import { chat, toolDefinition } from "@tanstack/ai";
import { anthropic } from "@tanstack/ai-anthropic";
import { ai, toolDefinition } from "@tanstack/ai";
import { anthropicText } from "@tanstack/ai-anthropic";
import { z } from "zod";

const adapter = anthropic();
const adapter = anthropicText();

const searchDatabaseDef = toolDefinition({
name: "search_database",
@@ -96,43 +105,39 @@ const searchDatabaseDef = toolDefinition({

const searchDatabase = searchDatabaseDef.server(async ({ query }) => {
  // Search database
  return { results: [...] };
  return { results: [] };
});

const stream = chat({
const stream = ai({
  adapter,
  messages,
  model: "claude-3-5-sonnet-20241022",
  model: "claude-sonnet-4-5-20250929",
  tools: [searchDatabase],
});
```
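
Because the tool definition is created separately from its `.server()` implementation, the definition can be shared with the client for type-safe calls, while the database access itself runs only on the server (this is the isomorphic tools feature mentioned in the README).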

## Provider Options

Anthropic supports provider-specific options:
Anthropic supports various provider-specific options:

```typescript
const stream = chat({
  adapter: anthropic(),
const stream = ai({
  adapter: anthropicText(),
  messages,
  model: "claude-3-5-sonnet-20241022",
  model: "claude-sonnet-4-5-20250929",
  providerOptions: {
    thinking: {
      type: "enabled",
      budgetTokens: 1000,
    },
    cacheControl: {
      type: "ephemeral",
      ttl: "5m",
    },
    sendReasoning: true,
    max_tokens: 4096,
    temperature: 0.7,
    top_p: 0.9,
    top_k: 40,
    stop_sequences: ["END"],
  },
});
```

### Thinking (Extended Thinking)

Enable extended thinking with a token budget. This allows Claude to show its reasoning process, which is streamed as `thinking` chunks and displayed as `ThinkingPart` in messages:
Enable extended thinking with a token budget. This allows Claude to show its reasoning process, which is streamed as `thinking` chunks:

```typescript
providerOptions: {
  thinking: {
    type: "enabled",
    budgetTokens: 1000,
  },
},
```

@@ -154,23 +159,51 @@

When thinking is enabled, the model's reasoning process is streamed separately from the response content.

### Prompt Caching

Cache prompts for better performance:
Cache prompts for better performance and reduced costs:

```typescript
messages: [
  { role: "user", content: [{
    type: "text",
    content: "What is the capital of France?",
    metadata: {
      cache_control: {
        type: "ephemeral",
        ttl: "5m",
      }
    }
  }]}
]
const stream = ai({
  adapter: anthropicText(),
  messages: [
    {
      role: "user",
      content: [
        {
          type: "text",
          content: "What is the capital of France?",
          metadata: {
            cache_control: {
              type: "ephemeral",
            },
          },
        },
      ],
    },
  ],
  model: "claude-sonnet-4-5-20250929",
});
```

## Summarization

Anthropic supports text summarization:

```typescript
import { ai } from "@tanstack/ai";
import { anthropicSummarize } from "@tanstack/ai-anthropic";

const adapter = anthropicSummarize();

const result = await ai({
  adapter,
  model: "claude-sonnet-4-5-20250929",
  text: "Your long text to summarize...",
  maxLength: 100,
  style: "concise", // "concise" | "bullet-points" | "paragraph"
});

console.log(result.summary);
```

## Environment Variables

@@ -182,15 +215,44 @@ ANTHROPIC_API_KEY=sk-ant-...

## API Reference

### `anthropic(config)`
### `anthropicText(config?)`

Creates an Anthropic text/chat adapter using environment variables.

**Returns:** An Anthropic text adapter instance.

### `createAnthropicText(apiKey, config?)`

Creates an Anthropic text/chat adapter with an explicit API key.

**Parameters:**

- `apiKey` - Your Anthropic API key
- `config.baseURL?` - Custom base URL (optional)

**Returns:** An Anthropic text adapter instance.

### `anthropicSummarize(config?)`

Creates an Anthropic summarization adapter using environment variables.

Creates an Anthropic adapter instance.
**Returns:** An Anthropic summarize adapter instance.

### `createAnthropicSummarize(apiKey, config?)`

Creates an Anthropic summarization adapter with an explicit API key.

**Parameters:**

- `config.apiKey` - Anthropic API key (required)
- `apiKey` - Your Anthropic API key
- `config.baseURL?` - Custom base URL (optional)

**Returns:** An Anthropic summarize adapter instance.
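
A short sketch based on the signatures above; both factories take an explicit API key plus an optional config object:

```typescript
import {
  createAnthropicText,
  createAnthropicSummarize,
} from "@tanstack/ai-anthropic";

// Text/chat adapter with an explicit key and an optional custom endpoint
const textAdapter = createAnthropicText(process.env.ANTHROPIC_API_KEY!, {
  baseURL: "https://api.anthropic.com",
});

// Summarization adapter reusing the same key
const summarizeAdapter = createAnthropicSummarize(process.env.ANTHROPIC_API_KEY!);
```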

## Limitations

**Returns:** An Anthropic adapter instance.
- **Embeddings**: Anthropic does not support embeddings natively. Use OpenAI or Gemini for embedding needs.
- **Image Generation**: Anthropic does not support image generation. Use OpenAI or Gemini for image generation.

## Next Steps
