This project is an AI-powered chatbot that uses the langchain4j library to interact with the local Ollama API. The chatbot is designed to provide intelligent responses and maintain conversation history using in-memory storage.
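The in-memory conversation history can be pictured as a bounded message window. The project relies on langchain4j's own chat memory for this, but a minimal stand-alone sketch of the idea (class name and window size are illustrative, not taken from the code base) looks like:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Minimal sketch of window-limited in-memory chat history.
// Illustrative only; the project itself uses langchain4j's chat memory.
public class ChatHistory {
    private final Deque<String> messages = new ArrayDeque<>();
    private final int maxMessages;

    public ChatHistory(int maxMessages) {
        this.maxMessages = maxMessages;
    }

    // Append a message, evicting the oldest one once the window is full.
    public void add(String message) {
        if (messages.size() == maxMessages) {
            messages.removeFirst();
        }
        messages.addLast(message);
    }

    public List<String> messages() {
        return List.copyOf(messages);
    }
}
```

Keeping only the most recent messages bounds the prompt size sent to the model while preserving recent context.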
Prerequisites:

- Docker
- Java 21
- Maven
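For reference, the Compose file used in the next step typically defines both services. A minimal sketch (image tags, ports, and volume names are assumptions, not taken from the project — only the `ollama` container name is implied by the commands below):

```yaml
# Sketch of a docker-compose.yml for the two services.
# Image tags, ports, and volumes are assumptions.
services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"
      - "6334:6334"
    volumes:
      - qdrant-data:/qdrant/storage
volumes:
  ollama-data:
  qdrant-data:
```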
To start the Ollama and Qdrant services using Docker Compose, navigate to the project directory and run the following command:

```shell
docker compose up -d
```

To pull the mistral and codellama models, use the following commands:
```shell
docker exec -it ollama ollama pull mistral
docker exec -it ollama ollama pull codellama
```

To set up Qdrant, use either the QdrantStorageMarkdownInitializer (for indexing a Markdown file) or the QdrantStorageJavaInitializer (for indexing a Java code base).
In both cases, edit the class to set the fully qualified path to the respective folder, then run its main method.
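Conceptually, both initializers follow the same indexing pipeline: load the files, split them into chunks, embed each chunk, and store the vectors in Qdrant. A stand-alone sketch of the splitting step (chunk size and overlap are arbitrary here; the actual initializers use langchain4j's document splitters):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of fixed-size chunking with overlap, as used when indexing
// documents for a vector store. Parameters are illustrative.
public class Chunker {
    public static List<String> chunk(String text, int chunkSize, int overlap) {
        if (chunkSize <= overlap) {
            throw new IllegalArgumentException("chunkSize must exceed overlap");
        }
        List<String> chunks = new ArrayList<>();
        int step = chunkSize - overlap;
        for (int start = 0; start < text.length(); start += step) {
            int end = Math.min(start + chunkSize, text.length());
            chunks.add(text.substring(start, end));
            if (end == text.length()) {
                break; // last chunk reached
            }
        }
        return chunks;
    }
}
```

The overlap ensures that a sentence cut at a chunk boundary still appears intact in the neighboring chunk, which tends to improve retrieval quality.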
To launch the chatbot, follow these steps:

- Navigate to the project directory.
- Build the project using Maven:

  ```shell
  mvn clean install
  ```

- Run the chatbot application:

  ```shell
  java -jar target/chatBot-1.0-SNAPSHOT.jar
  ```
This will start the chatbot, and you can interact with it via the console. Type 'exit' to quit the application.
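The console interaction described above presumably boils down to a read-eval loop that stops on 'exit'. A minimal sketch (the class name is hypothetical, and the responder function stands in for the actual model call):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintStream;
import java.util.function.UnaryOperator;

// Sketch of a console chat loop that quits on "exit".
// The responder stands in for the real chat-model call.
public class ConsoleLoop {
    public static void run(BufferedReader in, PrintStream out,
                           UnaryOperator<String> responder) throws IOException {
        String line;
        while ((line = in.readLine()) != null) {
            if ("exit".equalsIgnoreCase(line.trim())) {
                break; // user asked to quit
            }
            out.println(responder.apply(line));
        }
    }
}
```

Taking the reader, writer, and responder as parameters keeps the loop testable without a terminal or a running model.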
To run the chatbot in IntelliJ IDEA, follow these steps:
- Open IntelliJ IDEA.
- Open the project by selecting `File > Open` and navigating to the project directory.
- Wait for IntelliJ IDEA to index the project and download dependencies.
- Navigate to the `*ChatBot` file.
- Run its main method.
This will start the chatbot, and you can interact with it via the console in the IDE. Type 'exit' to quit the application.