[GenAI] Implement FastAPI MCP for LLMs so Frigate can provide context to answer questions #20231

@NickM-27

Description

Describe what you are trying to accomplish and why in non-technical terms

Frigate could implement an MCP server so that GenAI tools and LLMs can answer more in-depth questions about one's Frigate instance. Some examples could be:

  • When was the last package delivered?
  • How many people walked through the front yard today?
  • Turn off my outside cameras.

Describe the solution you'd like

Frigate could use something like https://gofastmcp.com/integrations/fastapi to implement MCP, which would allow Frigate to provide context to LLMs, or allow LLMs to query Frigate directly.
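As a rough sketch, the example questions above map naturally onto MCP "tools" that an LLM can call with structured arguments. Everything below is a hypothetical illustration, not Frigate's actual schema or API: the `Event` fields, the `EVENTS` store, and the function names are all assumptions. With the linked fastmcp integration, functions like these would presumably be registered on a server built from Frigate's existing FastAPI app (e.g. via `FastMCP.from_fastapi(app)` and a tool decorator), rather than kept as plain functions.

```python
# Hypothetical sketch of MCP tools Frigate might expose.
# The Event fields and in-memory EVENTS list are illustrative only;
# in practice these would be backed by Frigate's event database.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Event:
    label: str            # detected object, e.g. "person" or "package"
    camera: str           # camera name, e.g. "front_yard"
    start_time: datetime  # when the detection started

EVENTS: list[Event] = []

def last_event(label: str) -> Optional[datetime]:
    """Answer questions like 'When was the last package delivered?'"""
    matches = [e.start_time for e in EVENTS if e.label == label]
    return max(matches) if matches else None

def count_today(label: str, camera: str, now: datetime) -> int:
    """Answer questions like 'How many people walked through the front yard today?'"""
    start_of_day = now.replace(hour=0, minute=0, second=0, microsecond=0)
    return sum(
        1
        for e in EVENTS
        if e.label == label and e.camera == camera and e.start_time >= start_of_day
    )
```

Actions such as "turn off my outside cameras" would be a third kind of tool that mutates state instead of reading it, which is where the automation value beyond the existing UI comes from.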

Describe alternatives you've considered

This doesn't answer anything the UI can't already, but it does provide new ways to automate things and new ways to retrieve information and combine it with other sources.
