Simple Bun static site + API for asking an LLM questions over indexed /uses pages.

## Features

- Bun server (`Bun.serve`)
- SQLite (`bun:sqlite`) with FTS5
- OpenAI embeddings + chat completion
- Hybrid retrieval (FTS + cosine vector similarity + RRF)
- LLM-driven context screening + answer verification
- Tailwind CSS static UI
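The RRF (Reciprocal Rank Fusion) step merges the FTS5 keyword ranking with the cosine-similarity ranking. A minimal sketch of the fusion formula, `score(d) = Σ 1 / (k + rank(d))` with the conventional `k = 60` (the function name and list shapes here are illustrative, not the repo's actual API):

```typescript
// Reciprocal Rank Fusion: merge several ranked lists of document ids.
// A document's fused score is the sum of 1 / (k + rank) over every
// list it appears in; k = 60 is the conventional damping constant.
function rrf(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((id, i) => {
      // ranks are 1-based: the top hit contributes 1 / (k + 1)
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}

// Example: "b" ranks well in both lists, so it wins the fusion.
const fused = rrf([
  ["a", "b", "c"], // FTS5 keyword ranking
  ["b", "d", "a"], // cosine-similarity ranking
]);
console.log(fused[0]); // "b"
```

RRF is a good fit here because it needs no score normalization: FTS5 BM25 scores and cosine similarities live on incompatible scales, but ranks are directly comparable.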
## Setup

```shell
bun install
cp .env.example .env
```

Set `OPENAI_API_KEY` in `.env`. Optional: set `OPENAI_VERIFY_MODEL` / `OPENAI_SCREEN_MODEL` to tune the verification and screening stages.
## Build and seed

```shell
bun run build:css
bun run seed
# or smoke test first
bun run seed:smoke
```

Useful seed flags:

```shell
SEED_LIMIT=50 bun run seed
SEED_CONCURRENCY=3 FETCH_TIMEOUT_MS=20000 bun run seed
```

## Run

```shell
bun run dev
```

Open http://localhost:3000.
## API

- `GET /api/health`
- `GET /api/stats`
- `POST /api/ask` with body `{ "question": "..." }`
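A minimal client sketch for the ask endpoint. Only the `{ "question": "..." }` request shape comes from this README; the response shape and error handling below are assumptions:

```typescript
// Hedged sketch: POST a question to /api/ask and return the parsed
// JSON response. The response body's shape is an assumption.
async function ask(
  question: string,
  base = "http://localhost:3000",
): Promise<any> {
  const res = await fetch(`${base}/api/ask`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question }),
  });
  if (!res.ok) throw new Error(`ask failed: HTTP ${res.status}`);
  return res.json();
}

// Usage (with the dev server running):
//   const answer = await ask("Who uses Neovim?");
```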
## Notes

- Seed source is https://uses.tech/, parsed from the embedded Remix loader data.
- If a page cannot be fetched, it is stored with `fetch_status=error` in `pages`.
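The failed-fetch behavior above can be sketched as follows. The `fetch_status` values and the `FETCH_TIMEOUT_MS` flag come from this README; the row shape and function name are illustrative assumptions, not the repo's actual code:

```typescript
// Hedged sketch of the seed fetch step: a page that cannot be fetched
// within the timeout is kept with fetch_status = "error" rather than
// being dropped, so failures remain visible in the `pages` table.
type PageRow = {
  url: string;
  content: string | null;
  fetch_status: "ok" | "error"; // mirrors the README's fetch_status column
};

async function fetchPage(url: string, timeoutMs = 20000): Promise<PageRow> {
  try {
    // AbortSignal.timeout cancels the request after timeoutMs.
    const res = await fetch(url, { signal: AbortSignal.timeout(timeoutMs) });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return { url, content: await res.text(), fetch_status: "ok" };
  } catch {
    // Network error, timeout, or bad status: record the failure.
    return { url, content: null, fetch_status: "error" };
  }
}
```

Keeping error rows (instead of skipping them) lets a later re-seed retry only the pages marked `fetch_status=error`.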