The setup assumes that `rdmo-app` is already configured. First, install the plugin:

```bash
# directly from GitHub
pip install git+https://github.com/rdmorganiser/rdmo-plugins-llm-views

# alternatively, from a local copy
git clone git@github.com:rdmorganiser/rdmo-plugins-llm-views
pip install -e rdmo-plugins-llm-views[openai]
pip install -e rdmo-plugins-llm-views[ollama]  # alternatively
```

Add the following settings to your `config/settings/local.py` (and adjust them as required):
```python
INSTALLED_APPS = ['rdmo_llm_views', *INSTALLED_APPS]
```

For OpenAI, use:

```python
LLM_VIEWS_ADAPTER = 'rdmo_llm_views.adapter.OpenAILangChainAdapter'
LLM_VIEWS_LLM_ARGS = {
    "openai_api_key": OPENAI_API_KEY,
    "model": 'gpt-4o-mini'
}
```

For Ollama, use:
```python
LLM_VIEWS_ADAPTER = 'rdmo_llm_views.adapter.OllamaLangChainAdapter'
LLM_VIEWS_LLM_ARGS = {
    "model": "gemma3:1b"
}
```

In order to use django-q to perform the call to the LLM asynchronously, the following settings need to be added:
```python
Q_CLUSTER = {
    'name': 'DjangORM',
    'workers': 4,
    'timeout': 90,
    'retry': 120,
    'queue_limit': 50,
    'bulk': 10,
    'orm': 'default'
}
```

Additionally, the following settings can be used:
```python
LLM_VIEWS_SELECT_MODEL = True  # enable model selection in the view
LLM_VIEWS_TIMEOUT = 4000       # timeout for polling
```

The django-q worker needs to be started in parallel to the usual `runserver` command:

```bash
python manage.py qcluster
```

Create a systemd service file in `/etc/systemd/system/rdmo-qcluster.service`:
```ini
[Unit]
Description=RDMO qcluster runner
After=network.target

[Service]
User=rdmo
Group=rdmo
LogsDirectory=django-q
WorkingDirectory=/srv/rdmo/rdmo-app/
ExecStart=/srv/rdmo/rdmo-app/env/bin/python manage.py qcluster
StandardOutput=append:/var/log/django-q/stdout.log
StandardError=append:/var/log/django-q/stderr.log

[Install]
WantedBy=multi-user.target
```
Reload systemd and enable the service:

```bash
systemctl daemon-reload
systemctl enable --now rdmo-qcluster
```

The `{% llm %}` tag can be used in two ways.
```django
{% load view_tags %}
{% load llm_tags %}
{% llm %}
## 1. Data Summary
### Purpose of data collection
...
{% endllm %}
```

An additional prompt can be provided, e.g.:
```django
{% load view_tags %}
{% load llm_tags %}
{% llm prompt="Write in the style of The Lord of the Rings. Use only h2 and h3." %}
## 1. Data Summary
### Purpose of data collection
...
{% endllm %}
```

For more fine-grained control, specific attributes can be selected. The model is then provided only with the questions and answers for these attributes.
```django
{% load view_tags %}
{% load llm_tags %}
{% llm attributes='project/research_question/title,project/research_question/keywords' %}
The title of the project is ... Keywords are ...
{% endllm %}
```
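Conceptually, the `attributes` option can be thought of as selecting only the matching attribute paths from the project's answers before they are handed to the model. The following standalone sketch illustrates the idea with hypothetical data; it is not the plugin's actual implementation:

```python
# Hypothetical question/answer data, keyed by attribute path
answers = {
    'project/research_question/title': 'Effects of drought on soil microbes',
    'project/research_question/keywords': 'drought, soil, microbiome',
    'project/dataset/size': '2 TB',
}

def filter_answers(answers, attributes):
    """Keep only the answers whose attribute path appears in the
    comma-separated `attributes` string."""
    selected = {path.strip() for path in attributes.split(',')}
    return {path: value for path, value in answers.items() if path in selected}

# Mirrors the attributes='...' value from the template example above
context = filter_answers(
    answers,
    'project/research_question/title,project/research_question/keywords',
)
# `context` now contains only the two research_question answers;
# 'project/dataset/size' is not sent to the model
```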
...