@andychuong
Overview

This PR migrates Chartsmith from direct Anthropic SDK usage to Vercel AI SDK for all LLM operations, enabling multi-provider support (Anthropic, OpenAI, Google, OpenRouter) while maintaining backward compatibility.

Key Changes

1. Vercel AI SDK Migration

Frontend Changes

  • ChatContainer.tsx: Migrated to useChat hook from @ai-sdk/react for conversational chat, replacing custom fetch-based streaming
  • ModelSelector.tsx: New component for selecting LLM providers/models with auto-sizing dropdown
  • PromptInput.tsx: Uses prompt-type API to determine plan vs chat routing

New API Routes (Next.js)

| Route | Description |
| --- | --- |
| `/api/chat` | Conversational chat with tool calling support (subchart versions, K8s versions) |
| `/api/llm/plan` | Plan generation for chart modifications (streaming) |
| `/api/llm/expand` | Prompt expansion with context |
| `/api/llm/summarize` | Content summarization |
| `/api/llm/cleanup-values` | Values.yaml cleanup and validation |
| `/api/llm/execute-action` | File action execution with text_editor tool |
| `/api/llm/prompt-type` | Determines whether a user prompt is "plan" or "chat" type |
| `/api/models` | Available models and providers endpoint (fetches from OpenRouter if available) |
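The plan-vs-chat routing exposed by `/api/llm/prompt-type` boils down to a small contract: take a user prompt, return `"plan"` or `"chat"`. A minimal sketch of that contract follows; the keyword heuristic and the `classifyPrompt` name are illustrative assumptions, not the actual route logic (which may consult an LLM):

```typescript
// Hypothetical sketch of the contract behind /api/llm/prompt-type:
// prompt in, "plan" | "chat" out. The keyword list below is purely
// illustrative and NOT the production classification logic.
type PromptType = "plan" | "chat";

function classifyPrompt(prompt: string): PromptType {
  // Prompts asking to modify the chart route to plan generation;
  // informational questions stay in conversational chat.
  const planKeywords = ["add", "remove", "change", "update", "rename", "refactor"];
  const lower = prompt.toLowerCase();
  return planKeywords.some((k) => lower.includes(k)) ? "plan" : "chat";
}
```

`PromptInput.tsx` then only needs the returned string to decide whether to call `/api/llm/plan` or `/api/chat`.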

Go Backend Changes

  • nextjs_client.go: New HTTP client for calling Next.js API routes with streaming support
  • plan.go, expand.go, summarize.go: Migrated to use Next.js APIs
  • conversational.go: Updated to use Next.js client via /api/chat
  • execute-action.go: Updated to use Next.js client for tool calling

Go Files Converted to API Routes

The following Go files had their LLM logic migrated to Next.js API routes. The Go files now act as clients that call the Next.js APIs:

| Go File | API Route | Description |
| --- | --- | --- |
| `pkg/llm/plan.go` | `/api/llm/plan` | Plan generation for chart modifications |
| `pkg/llm/expand.go` | `/api/llm/expand` | Prompt expansion with context |
| `pkg/llm/summarize.go` | `/api/llm/summarize` | Content summarization (with caching) |
| `pkg/llm/conversational.go` | `/api/chat` | Conversational chat with tools |
| `pkg/llm/execute-action.go` | `/api/llm/execute-action` | File action execution with tool calling |
| `pkg/llm/cleanup-converted-values.go` | `/api/llm/cleanup-values` | Values.yaml cleanup and validation |

Note: The Go files still contain business logic (e.g., caching, file operations, workflow management) but now delegate LLM calls to the Next.js API routes via nextjs_client.go.

Infrastructure

  • lib/auth/api-guard.ts: Unified authentication for API routes (internal API key + session cookie)
  • lib/llm/registry.ts: Model registry with multi-provider support and automatic fallback logic
  • lib/llm/config.ts: Simplified LLM configuration with provider priority detection
  • lib/llm/prompt-type.ts: Prompt classification logic (plan vs chat)
  • middleware.ts: Updated to handle internal API paths with X-Internal-API-Key header
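The "provider priority detection" described for `lib/llm/config.ts` can be sketched as a first-match scan over the provider API keys. The function name, signature, and exact priority order below are assumptions for illustration, not the actual implementation:

```typescript
// Hedged sketch of provider priority detection: return the first
// provider whose API key is present in the environment. Ordering
// here is an assumption, not necessarily Chartsmith's.
type Provider = "anthropic" | "openai" | "google" | "openrouter";

function detectProvider(env: Record<string, string | undefined>): Provider | null {
  const priority: Array<[string, Provider]> = [
    ["ANTHROPIC_API_KEY", "anthropic"],
    ["OPENAI_API_KEY", "openai"],
    ["GOOGLE_GENERATIVE_AI_API_KEY", "google"],
    ["OPENROUTER_API_KEY", "openrouter"],
  ];
  for (const [key, provider] of priority) {
    if (env[key]) return provider;
  }
  return null; // no provider configured
}
```

A first-match scan like this keeps "automatic fallback" trivial: if the preferred key is absent, the next configured provider wins without extra configuration.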

Architecture

Before (Main Branch)

User → Go Backend → Anthropic SDK → Anthropic API

After (This PR)

User → Next.js API → Vercel AI SDK → Any Provider (Anthropic/OpenAI/Google/OpenRouter)
Go Worker → Next.js API → Vercel AI SDK → Any Provider

New Features

Multi-Provider Support

  • Automatic provider detection based on available API keys
  • Model selection UI in chat interface (auto-sizing dropdown)

Internal API Authentication

  • Internal API key (X-Internal-API-Key header) for Go worker → Next.js communication
  • Session cookie auth (token) for browser requests
  • Unified checkApiAuth() function for all protected API routes
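The dual-path auth described above can be sketched as a single predicate: accept the request if either the internal key header matches or the session cookie validates. The signature, `AuthInput` shape, and injected `isValidSession` callback are assumptions for illustration; the real `checkApiAuth()` in `lib/auth/api-guard.ts` will differ:

```typescript
// Minimal sketch of unified API auth (shapes/names are hypothetical):
// Path 1 - Go worker presents the shared X-Internal-API-Key value.
// Path 2 - browser presents a valid session "token" cookie.
interface AuthInput {
  internalApiKeyHeader?: string; // value of X-Internal-API-Key, if sent
  sessionToken?: string;         // value of the session cookie, if sent
}

function checkApiAuth(
  input: AuthInput,
  expectedInternalKey: string,
  isValidSession: (token: string) => boolean
): boolean {
  if (input.internalApiKeyHeader === expectedInternalKey) return true;
  if (input.sessionToken && isValidSession(input.sessionToken)) return true;
  return false;
}
```

Keeping both paths behind one function means every protected route gets identical behavior for worker and browser traffic.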

Dependencies

Added

```json
{
  "ai": "^5.0.104",
  "@ai-sdk/anthropic": "^2.0.50",
  "@ai-sdk/openai": "^2.0.74",
  "@ai-sdk/google": "^2.0.44",
  "@ai-sdk/react": "^2.0.104",
  "@openrouter/ai-sdk-provider": "^1.2.8"
}
```

Environment Variables

Required (at least one provider)

| Variable | Description |
| --- | --- |
| `ANTHROPIC_API_KEY` | For Anthropic Claude models |
| `OPENAI_API_KEY` | For OpenAI GPT models |
| `GOOGLE_GENERATIVE_AI_API_KEY` | For Google Gemini models |
| `OPENROUTER_API_KEY` | For OpenRouter (access to all providers) |

Internal Communication

| Variable | Description | Default |
| --- | --- | --- |
| `INTERNAL_API_KEY` | Go worker ↔ Next.js authentication | `dev-internal-key` (dev only) |
| `NEXTJS_API_URL` | Next.js API base URL | `http://localhost:3000` |

Optional

| Variable | Description |
| --- | --- |
| `CHARTSMITH_LLM_MODEL` | Override default model selection |
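How an override like `CHARTSMITH_LLM_MODEL` interacts with the registry default can be sketched as a one-line resolution rule (the `resolveModel` name and signature are hypothetical, not the registry's real API):

```typescript
// Hypothetical sketch: prefer a non-empty environment override,
// otherwise fall back to the registry's default model id.
function resolveModel(
  override: string | undefined,
  registryDefault: string
): string {
  return override && override.trim() !== "" ? override : registryDefault;
}
```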

Migration Notes

For Developers

  1. Set at least one provider API key in .env.local
  2. Set INTERNAL_API_KEY (or use default dev-internal-key for dev)
  3. Run npm install to get new dependencies
  4. Restart Next.js dev server and Go worker

For Deployment

  1. Set provider API keys as environment variables
  2. Generate a secure INTERNAL_API_KEY, e.g. `openssl rand -hex 32`
  3. Set same INTERNAL_API_KEY in both Next.js and worker environments
  4. Set NEXTJS_API_URL if Next.js is not at default location

Summary

This PR:

  1. Migrates all LLM operations to Vercel AI SDK (including execute-action and conversational with tools)
  2. Enables multi-provider support (Anthropic, OpenAI, Google, OpenRouter)
  3. Maintains backward compatibility with existing Go worker
  4. Adds model selection UI for users
