
Support private servers with private certificate authorities for OpenAI and Ollama #20004

@hawkeye217

Description

Discussed in #20003

When trying to connect to an Ollama or OpenAI-compatible API that is hosted with a private certificate authority, TLS negotiation fails because the self-signed root certificate is not a public root.
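
For context, the failure can be reproduced outside Frigate with httpx directly. A minimal sketch, assuming the private CA bundle is available locally as `ca.pem` (the path is illustrative):

```python
import httpx

url = "https://llm.mynetwork.local/api"

# With default verification only public roots are trusted, so the request fails
# with "certificate verify failed: self-signed certificate in certificate chain".
try:
    httpx.get(url)
except httpx.ConnectError as exc:
    print(f"TLS verification failed: {exc}")

# Pointing verification at the private CA bundle makes the same request succeed.
print(httpx.get(url, verify="ca.pem").status_code)
```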

I am using the same intermediate and root CA that are in the fullchain.pem used for hosting the Frigate UI.

I checked the OpenAI and Ollama client implementations and they both use httpx, so you should be able to use a similar code path for both clients to create the SSL verification context with the custom CA certificates.
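
A rough sketch of what that shared code path could look like. The `GENAI_SSL_CAFILE` env var is a hypothetical placeholder (not an existing Frigate option), and the Ollama part assumes its client forwards extra keyword arguments to httpx:

```python
import os
import ssl

import httpx
from ollama import Client as OllamaClient
from openai import OpenAI

# Hypothetical env var pointing at the private CA bundle (PEM); just a
# placeholder for wherever Frigate would source the path from.
cafile = os.environ.get("GENAI_SSL_CAFILE")

# create_default_context() trusts only the given bundle when cafile is set,
# and falls back to the system trust store when it is None.
ctx = ssl.create_default_context(cafile=cafile)

# The OpenAI SDK accepts a custom httpx.Client via http_client, and
# httpx.Client accepts an ssl.SSLContext through verify=.
openai_client = OpenAI(
    base_url="https://llm.mynetwork.local/api",
    api_key=os.environ.get("OPENAI_API_KEY", "unused"),
    http_client=httpx.Client(verify=ctx),
)

# The Ollama client passes extra keyword arguments through to its underlying
# httpx.Client, so the same context should work there as well.
ollama_client = OllamaClient(host="https://llm.mynetwork.local", verify=ctx)
```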

Environment variable from docker-compose.yaml:
`OPENAI_BASE_URL: https://llm.mynetwork.local/api`

Relevant Frigate log output

OpenAI:

2025-09-09 16:50:49.004495426  [2025-09-09 16:50:49] openai._base_client            INFO    : Retrying request to /chat/completions in 0.453879 seconds
2025-09-09 16:50:49.515989294  [2025-09-09 16:50:49] openai._base_client            INFO    : Retrying request to /chat/completions in 0.950329 seconds
2025-09-09 16:50:50.540518458  [2025-09-09 16:50:50] frigate.genai.openai           WARNING : OpenAI returned an error: Connection error.

Ollama:

2025-09-09 16:22:24.541917284  [2025-09-09 16:22:24] frigate.genai.ollama           WARNING : Error initializing Ollama: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:992)
