Problem
There is no documentation explaining how to develop and test a Strands TS SDK app locally without real API credentials (Bedrock, OpenAI, Anthropic). The `MockMessageModel` exists in test fixtures but is undocumented for end-users.
Proposed Content
A new guide covering:
- How to implement a `MockModel` using the Custom Model Provider interface
- Drop-in env-var swap pattern (`MOCK_MODE=true`) for zero-code-change local dev
- How to simulate: text streaming, tool calls, structured output, errors, multi-turn conversations
- How to use Ollama/llama.cpp as a local LLM backend
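To make the proposal concrete, here is a minimal sketch of the first two bullets combined: a mock model that streams canned, per-turn responses, plus a `MOCK_MODE=true` factory swap. The `ModelProvider` interface, `MockModel` class, and `makeModel` factory are hypothetical stand-ins for illustration, not the SDK's actual API.

```typescript
// Hypothetical minimal contract standing in for the SDK's custom model
// provider interface (the real interface will differ).
interface ModelProvider {
  stream(prompt: string): AsyncGenerator<string>;
}

// Mock provider: returns one canned response per turn, streamed token by token.
class MockModel implements ModelProvider {
  private turn = 0;
  constructor(private responses: string[]) {}

  async *stream(_prompt: string): AsyncGenerator<string> {
    // Advance through canned replies to simulate multi-turn behavior;
    // clamp to the last reply once the script runs out.
    const text =
      this.responses[Math.min(this.turn++, this.responses.length - 1)];
    // Simulate streaming by yielding whitespace-delimited tokens.
    for (const token of text.split(/(?<=\s)/)) {
      yield token;
    }
  }
}

// Env-var swap: zero code changes in the app itself -- set MOCK_MODE=true
// and the factory hands back the mock instead of a real provider.
function makeModel(): ModelProvider {
  if (process.env.MOCK_MODE === "true") {
    return new MockModel(["Hello from the mock!", "Second turn reply."]);
  }
  // Real provider construction (Bedrock/OpenAI/Anthropic) omitted in this sketch.
  throw new Error("Real provider not configured in this sketch");
}
```

Usage: `MOCK_MODE=true node app.js` runs the whole agent loop against the mock; unsetting the variable falls through to the real provider path, so local dev and production share identical call sites.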
Location
Suggested: `docs/user-guide/concepts/local-development.md` or similar