statechny/local-gpt

Repository files navigation

This is a Next.js project bootstrapped with create-next-app.

Getting Started

First, set up and run a local Ollama instance:

  • Download and install Ollama
  • Fetch the Mistral model via ollama pull mistral
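Once Ollama is running, it serves a REST API on port 11434 by default. A minimal TypeScript sketch of a non-streaming request to its /api/generate endpoint (the function names here are illustrative, not taken from this repo):

```typescript
// Illustrative helpers for talking to the local Ollama instance.
// buildGeneratePayload and askMistral are sketch names, not part of this repo.

// Build the request body for a non-streaming generation call.
export function buildGeneratePayload(prompt: string) {
  return { model: "mistral", prompt, stream: false };
}

// POST the prompt to the local Ollama server (default port 11434)
// and return the generated text.
export async function askMistral(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGeneratePayload(prompt)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // with stream: false, the full text arrives in one field
}
```

With stream set to true instead, Ollama returns newline-delimited JSON chunks, which is what a chat UI would typically consume.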

Then, run the development server:

npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev

Open http://localhost:3000 with your browser to see the result.
