
[19.0][WIP] AI Agent Integration Framework for Odoo (Native LLM/Ollama)#61

Draft
petrus-v wants to merge 2 commits into OCA:19.0 from petrus-v:19.0-incubation

Conversation


@petrus-v petrus-v commented Feb 26, 2026

Goal & Description

This PR introduces the foundational framework for a fully integrated, multi-agent AI system within Odoo. The aim is to provide an infrastructure that leverages Odoo's native capabilities (chatter, queue jobs, bus, ACLs) to empower users with proactive and reactive AI assistants directly inside their ERP environment.

Key Desired Features

  • Native Contextual AI: A dedicated "AI Thread" tab within the standard Odoo chatter, allowing context-aware conversations on any Odoo record.
  • Asynchronous & Real-Time: Utilizing queue_job for heavy LLM inference and the Odoo Bus for real-time WebSocket UI updates.
  • Dynamic Tool Routing: A vector-driven Orchestrator engine to map natural language to specific Odoo Tools (Python methods), scaling functionalities without overloading the LLM context window.
  • Secure & Autonomous Agents: AI Agents modeled around Odoo's res.users, meaning they strictly respect native Access Control Lists (ACLs), can post in standard mail.thread, and assign mail.activity to human users.

Roadmap & Progress

To ensure a smooth review process and clear milestones, the project is structured in 5 phases.

Phase 1: Core Communication & UI (The "Thread") — ✅ COMPLETED

  • ai_oca_native_llm: Implementation of a basic, extensible Python client to communicate directly with an Ollama provider.
  • ai_oca_native_thread:
    • Data models (ai.thread, ai.message) to store discussion history, isolated per user and per Odoo record.
    • OWL Component: Seamless integration of the new "AI Thread" tab in the chatter UI alongside "Send message" and "Log note".
    • Basic synchronous execution (Input -> LLM generation contextually aware of the record -> Output).
  (Demo video attached: Capture.video.du.2026-02-26.08-41-54.mp4)
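Not from the PR code itself, but a minimal sketch of what the Phase 1 synchronous flow could look like. The `/api/chat` endpoint and the `{"model", "messages", "stream"}` payload shape are Ollama's documented REST API; the function names and the record-context system message are hypothetical illustrations of how `ai_oca_native_thread` history might feed the client.

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_chat_payload(model, record_context, history, user_input):
    """Assemble an Ollama /api/chat payload, prepending the Odoo record
    context as a system message so the reply is record-aware."""
    messages = [{"role": "system",
                 "content": f"You are assisting on this Odoo record:\n{record_context}"}]
    messages += history  # prior ai.message rows, as {"role", "content"} dicts
    messages.append({"role": "user", "content": user_input})
    return {"model": model, "messages": messages, "stream": False}

def chat(payload, url=OLLAMA_URL):
    """Blocking call, mirroring Phase 1's basic synchronous execution."""
    req = request.Request(url, data=json.dumps(payload).encode(),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        # Ollama returns the assistant turn under the "message" key
        return json.load(resp)["message"]["content"]
```

With `stream=False`, the UI simply waits for the full completion, which is exactly the blocking behavior Phase 2 sets out to remove.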

Phase 2: Asynchronous Execution & Streaming (UX Polish) — ⏳ PENDING

  • Prevent UI blocking by delegating LLM calls to Odoo queue_job workers.
  • Add real-time feedback via Odoo Bus (WebSockets).
  • UX improvements: Stream LLM tokens to the UI (or stream job logs) for immediate reactivity.
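As a sketch of the streaming UX point above: in Odoo the job side would be queued via queue_job's `with_delay()` and each chunk pushed over the bus (e.g. `bus.bus._sendone`), but the batching logic itself is framework-independent. The helper below is a hypothetical illustration of grouping streamed tokens so the UI gets a few bus notifications per second rather than one per token.

```python
def batch_tokens(token_stream, batch_size=8):
    """Group streamed LLM tokens into chunks; in Odoo, each yielded
    chunk would become one bus notification to the AI Thread widget."""
    buf = []
    for tok in token_stream:
        buf.append(tok)
        if len(buf) >= batch_size:
            yield "".join(buf)
            buf = []
    if buf:  # flush the trailing partial batch
        yield "".join(buf)
```

Tuning `batch_size` trades perceived reactivity against bus traffic; it is an assumed knob, not something the PR specifies.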

Phase 3: Tool Definitions & Vector Infrastructure — ⏳ PENDING

  • ai_oca_native_tool: Definition of the ai.tool registry.
  • Setup vector-based storage (e.g., PGVector) for semantic descriptions of available actions.
  • Automatic schema generation engine (Pydantic / Odoo native) to dynamically expose standard Odoo Python methods to the AI.
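The schema-generation idea above can be sketched with plain introspection: derive a JSON-schema-style tool definition from a Python method's signature and docstring. The `tool_schema` helper and the sample sale-order method are hypothetical; the real engine would presumably walk Odoo model methods and/or use Pydantic, which this sketch does not attempt.

```python
import inspect

# Simplified mapping from Python annotations to JSON-schema types
_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def tool_schema(func):
    """Build a JSON-schema-style tool definition from a function signature.
    Parameters without a default value are marked as required."""
    sig = inspect.signature(func)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": _TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def set_quotation_discount(order_id: int, discount: float = 0.0):
    """Apply a global discount (percent) to a sale order."""

schema = tool_schema(set_quotation_discount)
```

This output format deliberately resembles the tool/function-calling schemas most LLM runtimes (Ollama included) accept, so entries in the `ai.tool` registry could be serialized straight into a prompt.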

Phase 4: The Orchestrator Engine — ⏳ PENDING

  • ai_oca_native_orchestrator: Master routing module.
  • Perform Vector Search to fetch only the top relevant Odoo tools for a user's prompt.
  • Build constrained LLM prompts with these tool schemas, enabling the LLM to intelligently trigger the correct Odoo methods.
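The routing step above boils down to a nearest-neighbor search over tool-description embeddings. In production that would live in PGVector; the pure-Python cosine ranking below is only a sketch of the selection logic, with made-up tool names and toy two-dimensional vectors standing in for real embeddings.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def route_tools(prompt_vec, tool_index, top_k=3):
    """Return the names of the top_k tools whose description embeddings
    sit closest to the prompt embedding; only these schemas are then
    injected into the constrained LLM prompt."""
    ranked = sorted(tool_index.items(),
                    key=lambda item: cosine(prompt_vec, item[1]),
                    reverse=True)
    return [name for name, _vec in ranked[:top_k]]
```

Keeping `top_k` small is the point of the whole phase: the LLM only ever sees a handful of schemas, so the tool catalog can grow without inflating the context window.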

Phase 5: Agent Roles (res.users identity) & Multi-Agent — ⏳ PENDING

  • ai_oca_native_agent: Definition of autonomous personas (the ai.agent model) intrinsically linked to res.users for strict security/ACL inheritance.
  • Empower Agents to execute tasks autonomously, participate in discussions (mail.thread), and delegate or request validation from humans via mail.activity.
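The delegation behavior above can be simulated outside Odoo: an agent runs an action under its own permissions and, on an ACL denial, creates an activity for a human instead of escalating privileges. Everything here (the `AccessError` stand-in, the dict-based permission table, the activity list) is a toy model of the intended control flow, not the PR's implementation.

```python
class AccessError(Exception):
    """Stand-in for odoo.exceptions.AccessError."""

def run_agent_task(agent_user, action, allowed_actions, activities):
    """Execute an action under the agent's own res.users permissions.
    If the ACL check fails, record a mail.activity-style request for a
    human reviewer rather than bypassing security."""
    if action not in allowed_actions.get(agent_user, set()):
        activities.append({
            "user": "human_reviewer",  # hypothetical assignee
            "summary": f"Validate '{action}' requested by {agent_user}",
        })
        return "delegated"
    return "done"
```

The design choice this illustrates is the PR's central security claim: because each `ai.agent` is backed by a real `res.users`, denial-then-delegation falls out of the existing ACL machinery instead of requiring a parallel permission system.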

Feedback is welcome :)
