77 changes: 77 additions & 0 deletions ai_oca_native_llm/README.rst
@@ -0,0 +1,77 @@
.. image:: https://odoo-community.org/readme-banner-image
   :target: https://odoo-community.org/get-involved?utm_source=readme
   :alt: Odoo Community Association

==================================
Native AI LLM Integration (Ollama)
==================================

..
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   !! This file is generated by oca-gen-addon-readme !!
   !! changes will be overwritten.                   !!
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   !! source digest: sha256:c6d57c5b0f7a15f580843211db4c2ea0935e0b89601a4ba5af44a7dba8c489eb
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

.. |badge1| image:: https://img.shields.io/badge/maturity-Beta-yellow.png
    :target: https://odoo-community.org/page/development-status
    :alt: Beta
.. |badge2| image:: https://img.shields.io/badge/license-LGPL--3-blue.png
    :target: http://www.gnu.org/licenses/lgpl-3.0-standalone.html
    :alt: License: LGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Fai-lightgray.png?logo=github
    :target: https://github.com/OCA/ai/tree/19.0/ai_oca_native_llm
    :alt: OCA/ai
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
    :target: https://translation.odoo-community.org/projects/ai-19-0/ai-19-0-ai_oca_native_llm
    :alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
    :target: https://runboat.odoo-community.org/builds?repo=OCA/ai&target_branch=19.0
    :alt: Try me on Runboat

|badge1| |badge2| |badge3| |badge4| |badge5|

Provides a basic Python client wrapper to communicate with a
local/remote Ollama instance.
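
A minimal usage sketch (e.g. from an Odoo shell, assuming the module is
installed and an Ollama server is reachable at the configured URL; the
prompt is purely illustrative):

.. code-block:: python

   answer = env["ai.llm.client"].chat(
       [{"role": "user", "content": "Say hello in one sentence."}]
   )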

**Table of contents**

.. contents::
   :local:

Known issues / Roadmap
======================

- Turn this module into a base module that provides an abstraction for
  chatting with any LLM provider

Bug Tracker
===========

Bugs are tracked on `GitHub Issues <https://github.com/OCA/ai/issues>`_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
`feedback <https://github.com/OCA/ai/issues/new?body=module:%20ai_oca_native_llm%0Aversion:%2019.0%0A%0A**Steps%20to%20reproduce**%0A-%20...%0A%0A**Current%20behavior**%0A%0A**Expected%20behavior**>`_.

Do not contact contributors directly about support or help with technical issues.

Credits
=======

Maintainers
-----------

This module is maintained by the OCA.

.. image:: https://odoo-community.org/logo.png
   :alt: Odoo Community Association
   :target: https://odoo-community.org

OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.

This module is part of the `OCA/ai <https://github.com/OCA/ai/tree/19.0/ai_oca_native_llm>`_ project on GitHub.

You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
1 change: 1 addition & 0 deletions ai_oca_native_llm/__init__.py
@@ -0,0 +1 @@
from . import models
19 changes: 19 additions & 0 deletions ai_oca_native_llm/__manifest__.py
@@ -0,0 +1,19 @@
# Copyright 2025 Pierre Verkest
# License LGPL-3.0 or later (http://www.gnu.org/licenses/lgpl.html)
{
    "name": "Native AI LLM Integration (Ollama)",
    "version": "19.0.1.0.0",
    "category": "AI",
    "summary": "Core LLM wrapper for Ollama",
    "author": "Odoo Community Association (OCA)",
    "website": "https://github.com/OCA/ai",
    "license": "LGPL-3",
    "depends": ["base"],
    "external_dependencies": {
        "python": ["ollama"],
    },
    "data": [
        "views/res_config_settings_views.xml",
    ],
    "installable": True,
}
2 changes: 2 additions & 0 deletions ai_oca_native_llm/models/__init__.py
@@ -0,0 +1,2 @@
from . import res_config_settings
from . import ai_llm_client
49 changes: 49 additions & 0 deletions ai_oca_native_llm/models/ai_llm_client.py
@@ -0,0 +1,49 @@
# Copyright 2025 Pierre Verkest
# License LGPL-3.0 or later (http://www.gnu.org/licenses/lgpl.html)
import logging

from ollama import Client

from odoo import api, models

_logger = logging.getLogger(__name__)


class AiLlmClient(models.AbstractModel):
    """
    Abstract model providing a simple Python client for Ollama.
    It resolves the configuration dynamically and performs the HTTP calls.
    """

    _name = "ai.llm.client"
    _description = "AI LLM Client Wrapper"

    @api.model
    def _get_client(self):
        url = (
            self.env["ir.config_parameter"]
            .sudo()
            .get_param("ai_llm.ollama_url", "http://localhost:11434")
        )
        return Client(host=url)

    @api.model
    def chat(self, messages, model=None, options=None):
        """
        Sends a chat request to Ollama.
        :param messages: list of dicts [{'role': 'user', 'content': 'hello'}, ...]
        :param model: optional model name; defaults to the ``ai_llm.ollama_model``
            system parameter (``llama3`` if unset)
        :param options: dict of optional parameters (e.g. temperature)
        :return: str, content of the assistant message returned by Ollama
        """
        client = self._get_client()
        if not model:
            model = (
                self.env["ir.config_parameter"]
                .sudo()
                .get_param("ai_llm.ollama_model", "llama3")
            )

        response = client.chat(
            model=model, messages=messages, options=options, stream=False
        )
        return response.message.content
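
A small usage sketch for this wrapper (hypothetical caller code, assuming the
`ollama` Python package is installed and an Ollama server is running at the
configured URL; the model name, prompt and temperature are illustrative):

```python
# From any other model or a server action: the abstract model resolves the
# server URL and default model from ir.config_parameter, so only the
# messages are strictly required.
answer = self.env["ai.llm.client"].chat(
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this note in one sentence."},
    ],
    model="mistral",  # optional override of the ai_llm.ollama_model parameter
    options={"temperature": 0.2},  # passed through to Ollama as-is
)
# `answer` is the plain text content of the assistant message.
```
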
20 changes: 20 additions & 0 deletions ai_oca_native_llm/models/res_config_settings.py
@@ -0,0 +1,20 @@
# Copyright 2025 Pierre Verkest
# License LGPL-3.0 or later (http://www.gnu.org/licenses/lgpl.html)
from odoo import fields, models


class ResConfigSettings(models.TransientModel):
    _inherit = "res.config.settings"

    ai_llm_ollama_url = fields.Char(
        string="Ollama URL",
        config_parameter="ai_llm.ollama_url",
        default="http://localhost:11434",
        help="The URL of the Ollama server.",
    )
    ai_llm_ollama_model = fields.Char(
        string="Ollama Model",
        config_parameter="ai_llm.ollama_model",
        default="llama3",
        help="The model to use for the AI features (e.g., llama3, mistral).",
    )
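
A quick sketch of how these settings map to system parameters (assuming an
Odoo shell session with `env` available; the host name and model below are
illustrative):

```python
# Each field's config_parameter attribute stores the value as an
# ir.config_parameter, which is exactly what ai.llm.client reads at call time.
icp = env["ir.config_parameter"].sudo()
icp.set_param("ai_llm.ollama_url", "http://ollama.example.internal:11434")
icp.set_param("ai_llm.ollama_model", "mistral")
print(icp.get_param("ai_llm.ollama_model", "llama3"))  # -> "mistral"
```
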
3 changes: 3 additions & 0 deletions ai_oca_native_llm/pyproject.toml
@@ -0,0 +1,3 @@
[build-system]
requires = ["whool"]
build-backend = "whool.buildapi"
2 changes: 2 additions & 0 deletions ai_oca_native_llm/readme/DESCRIPTION.md
@@ -0,0 +1,2 @@
Provides a basic Python client wrapper to communicate with
a local/remote Ollama instance.
2 changes: 2 additions & 0 deletions ai_oca_native_llm/readme/ROADMAP.md
@@ -0,0 +1,2 @@
* Turn this module into a base module that provides an abstraction
  for chatting with any LLM provider