Status: Open
Labels: enhancement (New feature or request)
Create LLM-Ready User Documentation
Problem
Game developers using KhoraEngine want to leverage LLMs (Claude, Copilot, GPT-4, Cursor, etc.) to accelerate their game development. However, LLMs lack Khora-specific knowledge since it's a custom engine with unique concepts (SAA, CRPECS, Agents, GORNA). Users currently have to paste large amounts of documentation or code examples to get useful assistance, which is inefficient and error-prone.
Proposed Solution
Create an `LLM_CONTEXT.md` file at the repository root that users can paste into their LLM conversation or add to their project's `.cursorrules`/`CLAUDE.md`. This file provides just enough context about Khora's public SDK API to be helpful.
Key Requirements
| Requirement | Target |
|---|---|
| Token footprint | ~2,000-2,500 tokens |
| Audience | Game developers (not engine contributors) |
| Focus | Public SDK API, not internals |
| Format | Code-first (snippets > prose) |
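Since the ~2,000-2,500 token target is a concrete requirement, a small check could guard the budget as the file evolves. A minimal sketch, assuming the common ~4 characters-per-token heuristic for English/Markdown (the exact count depends on the model's tokenizer; a production check would use a real tokenizer instead):

```python
# Rough token-budget check for an LLM_CONTEXT.md draft.
# Assumption: ~4 characters per token, a common heuristic for
# English/Markdown text; real counts vary by tokenizer.

def estimate_tokens(text: str) -> int:
    """Estimate token count with the ~4 chars/token rule of thumb."""
    return max(1, len(text) // 4)

def within_budget(text: str, low: int = 2000, high: int = 2500) -> bool:
    """Return True if the estimated token count falls in the target range."""
    return low <= estimate_tokens(text) <= high
```

Such a check could run in CI against the drafted file and fail the build when the document drifts outside the target range.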
What's Included
- Quick Start: Minimal working example
- Architecture Diagram: User's mental model of how their code fits into Khora
- Core Types: Transform, Camera, Materials, Lights
- Entity Patterns: Spawn bundles, Vessel builder, cameras, mesh+material
- Query Patterns: Immutable, mutable, with EntityId, with filters
- Input Events: Event matching examples
- Math Reference: `Vec3`, `Mat4`, `Quaternion` essentials
- Common Gotchas: syncing `GlobalTransform`, `EntityId` recycling, SDK-only usage
- API Index: "I want to..." lookup table
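Taken together, the sections above suggest a file skeleton. A hypothetical outline (section names come from this list; all placeholder text is illustrative, not final content):

```markdown
# Khora — LLM Context (public SDK)

## Quick Start
(minimal working example)

## Architecture
(diagram: where user code fits into Khora)

## Core Types
(Transform, Camera, Materials, Lights)

## Entity Patterns
(spawn bundles, Vessel builder, cameras, mesh + material)

## Query Patterns
(immutable, mutable, with EntityId, with filters)

## Input Events
(event matching examples)

## Math Reference
(Vec3, Mat4, Quaternion essentials)

## Common Gotchas
(GlobalTransform sync, EntityId recycling, SDK-only usage)

## API Index
| I want to... | Use |
|---|---|
```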
What's NOT Included (by design)
- Internal architecture (CLAD, DCC, GORNA details)
- Agent implementation
- Rendering pipeline internals
- Memory allocators
- Performance profiling APIs
These are engine concerns, not game developer concerns.
Alternative Approaches
| Approach | Pros | Cons |
|---|---|---|
| Single `LLM_CONTEXT.md` (proposed) | Simple; one file to paste | Needs manual updates as the SDK evolves |
| Layered docs (MINIMAL + FULL) | Choose based on task complexity | Maintenance overhead |
| Auto-generate from rustdoc | Always in sync | Hard to curate examples; less readable |
| Enhanced `.cursorrules` only | Already exists | Invisible to non-Cursor LLMs |
Open Questions
- Should this live in `/docs/for-users/` or at the repo root?
- Should we version it alongside releases (e.g., `LLM_CONTEXT_v0.5.md`)?
- How do we keep it in sync as the SDK evolves?
Next Steps
If approved:
- Create `LLM_CONTEXT.md` at the repo root
- Add a link to it in README.md under "Getting Started"
- Update `.cursorrules` to reference it
- (Optional) Create a minimal version (~800 tokens) for quick questions
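For the `.cursorrules` update, a one-line pointer may suffice (illustrative wording only, not final copy):

```
# Khora context for AI assistants: see LLM_CONTEXT.md at the repo root
# for the public SDK API, entity/query patterns, and common gotchas.
```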