feat: Support Azure OpenAI as LLM provider #1362
Conversation
georgeh0 left a comment
Thanks for creating this PR!
Just wondering, is it possible to reuse crate::llm::openai::Client and just wrap it with separate initialization logic? (We're doing this for a bunch of other clients, like LiteLLM, OpenRouter, vLLM, etc.)
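For illustration, a minimal sketch of that reuse pattern, assuming the from_parts signature quoted later in this thread (the new_litellm helper name is hypothetical):

use async_openai::{config::OpenAIConfig, Client as OpenAIClient};

// Hypothetical stand-in for crate::llm::openai::Client.
pub struct Client {
    client: OpenAIClient<OpenAIConfig>,
}

impl Client {
    pub(crate) fn from_parts(client: OpenAIClient<OpenAIConfig>) -> Self {
        Self { client }
    }

    // LiteLLM-style initialization: same client type, only the base URL differs.
    pub fn new_litellm(address: String) -> Self {
        Self::from_parts(OpenAIClient::with_config(
            OpenAIConfig::new().with_api_base(address),
        ))
    }
}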
Force-pushed from 863ab8d to b91c846
Apologies, I must have messed up when squashing a few commits; it's now all squashed into the initial commit instead of showing up separately as a commit addressing the comments.
Ok(Self {
    client: OpenAIClient::with_config(azure_config),
})
Would it be possible to directly use super::openai::Client::from_parts(OpenAIClient::with_config(azure_config)) here? Then we wouldn't need to define another Client struct and implement LlmGenerationClient and LlmEmbeddingClient below.
Similar to LiteLLM.
Yeah, I actually attempted this approach, but hit a type mismatch error.
The openai::Client::from_parts() function expects:
pub(crate) fn from_parts(client: async_openai::Client<OpenAIConfig>) -> Self
But we're trying to pass:
async_openai::Client<AzureConfig>
async_openai::Client is generic over the config type C; however, AzureConfig and OpenAIConfig are different types.
LiteLLM works because it uses OpenAIConfig with a different base URL, but Azure OpenAI needs its own Client struct with AzureConfig (see the reproduction sketch after the compiler output below).
https://docs.rs/async-openai/latest/async_openai/config/struct.AzureConfig.html
https://docs.rs/async-openai/latest/async_openai/config/struct.OpenAIConfig.html
error[E0308]: mismatched types
  --> rust/cocoindex/src/llm/azureopenai.rs:33:57
|
33 | Ok(Client::from_parts(OpenAIClient::with_config(azure_config)))
| ------------------------- ^^^^^^^^^^^^ expected `OpenAIConfig`, found `AzureConfig`
| |
| arguments to this function are incorrect
|
note: associated function defined here
--> /Users/stewang/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/async-openai-0.30.1/src/client.rs:51:12
|
51 | pub fn with_config(config: C) -> Self {
| ^^^^^^^^^^^
For more information about this error, try `rustc --explain E0308`.
error: could not compile `cocoindex` (lib) due to 1 previous error
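To make the mismatch concrete outside the codebase, here is a minimal standalone reproduction (the free-standing from_parts is a hypothetical stand-in mirroring the signature quoted above):

use async_openai::config::{AzureConfig, OpenAIConfig};
use async_openai::Client;

// Stand-in for openai::Client::from_parts, monomorphic over OpenAIConfig.
fn from_parts(_client: Client<OpenAIConfig>) {}

fn main() {
    // Compiles: Client<OpenAIConfig> is exactly what from_parts expects.
    from_parts(Client::with_config(OpenAIConfig::new()));

    // Uncommenting the next line reproduces E0308: Client<AzureConfig> is a
    // different concrete type from Client<OpenAIConfig>.
    // from_parts(Client::with_config(AzureConfig::new()));
    let _azure: Client<AzureConfig> = Client::with_config(AzureConfig::new());
}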
I see. They have two different type parameters. Thanks for trying it out!
We can merge this PR now; I can do some refactoring for openai.rs later.
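One possible shape for that refactor, purely as a sketch (not the actual plan): make the wrapper generic over async_openai's Config trait, so both concrete clients flow through one from_parts.

use async_openai::config::{AzureConfig, Config, OpenAIConfig};
use async_openai::Client as OpenAIClient;

pub struct Client<C: Config> {
    client: OpenAIClient<C>,
}

impl<C: Config> Client<C> {
    pub(crate) fn from_parts(client: OpenAIClient<C>) -> Self {
        Self { client }
    }
}

fn demo() {
    // Both config types now share the same wrapper:
    let _openai = Client::from_parts(OpenAIClient::with_config(OpenAIConfig::new()));
    let _azure = Client::from_parts(OpenAIClient::with_config(AzureConfig::new()));
}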
No worries!
Great PR, thanks a lot @1fanwang for the contribution!
Updated the PR with 28b04d8; end-to-end testing results are in the PR description.
Great feature, thank you so much @1fanwang Stephan! Welcome to the community and we are excited to learn from you down the road!
closes #1282
Testing Done
2. End-to-End Testing with Azure OpenAI
Test Setup
Endpoint: https://cocoindex.openai.azure.com
Deployment: gpt-4o
API version: 2024-08-01-preview
Flow Execution
$ cocoindex update main
AzureOpenAIE2ETest.questions (batch update): 2/2 source rows: (+) 2 added
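For reference, the test setup above maps onto async_openai's AzureConfig roughly as follows (a sketch; the env-var name is an assumption, not cocoindex's actual config key):

use async_openai::{config::AzureConfig, Client};

fn azure_client_from_test_setup() -> Client<AzureConfig> {
    // Values taken from the Test Setup section above.
    let config = AzureConfig::new()
        .with_api_base("https://cocoindex.openai.azure.com")
        .with_deployment_id("gpt-4o")
        .with_api_version("2024-08-01-preview")
        .with_api_key(std::env::var("AZURE_OPENAI_API_KEY").expect("missing API key"));
    Client::with_config(config)
}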
Verification - Structured Data Extracted by Azure OpenAI
SELECT filename, question, capital, country
FROM azure_openai_test_answers
ORDER BY filename;
Results:
Azure OpenAI successfully extracted the structured fields (capital, country) for both source rows via the ExtractByLlm function.