A Nushell scriptable MCP client with editable context threads stored in cross.stream

- Consistent API Across Models: Connect to Gemini + Search and Anthropic + Search through a single, simple interface. (Add providers easily.)
- Persistent, Editable Conversations: Conversation threads are saved across sessions. Review, edit, and control your own context window — no black-box history.
- Flexible Tool Integration: Connect to MCP servers to extend functionality. gpt2099 already rivals Claude Code for local file editing, but with full provider independence and deeper flexibility.
- Document Support: Upload and reference documents (PDFs, images, text files) directly in conversations with automatic content-type detection and optional caching.
Built on cross.stream for event-driven processing, gpt2099 brings modern AI directly into your Nushell workflow — fully scriptable, fully inspectable, all in the terminal.
Demo: gpt-quick-edit.mp4 ("lady on the track" provided by mobygratis)
First, install and configure cross.stream. Once set up, you'll have the full cross.stream ecosystem of tools for editing and working with your context windows.

After this step you should be able to run:

```nushell
"as easy as" | .append abc123
.head abc123 | .cas
```

It really is easy from here.
```nushell
overlay use -pr ./gpt
```
Initialize the cross.stream command that performs the actual LLM call. This appends the command to your event stream so later gpt invocations can use it:

```nushell
gpt init
```
Enable your preferred provider. This stores the API key for later use:

```nushell
gpt provider enable
```
Set up a milli alias for a lightweight model (try OpenAI's gpt-4.1-mini or Anthropic's claude-3-5-haiku-20241022):

```nushell
gpt provider ptr milli --set
```
Give it a spin:

```nushell
"hola" | gpt -p milli
```
- Commands Reference - Complete command syntax and options
- How-To Guides - Task-oriented workflows:
  - Configure Providers - Set up AI providers and model aliases
  - Work with Documents - Register and use documents in conversations
  - Manage Conversations - Threading, bookmarking, and continuation
  - Use MCP Servers - Extend functionality with external tools
  - Generate Code Context - Create structured context from Git repositories
- Provider API - Technical specification for implementing providers
- Schemas - Complete data structure reference for all gpt2099 schemas
- Why does the name include 2099? What else would you call the future?
This is how the project looked 4 hours into its inception: