Conversation


Copilot AI commented Nov 18, 2025

Custom LLM providers that send numeric stream values (0/1) instead of booleans failed to parse correctly. Additionally, an explicit stream parameter in the "Additional request body" was filtered out when the dialog was reopened, and non-streaming requests failed even though the connection test succeeded.

Changes

LlmConfig.kt - Extended CustomRequest.fromLegacyFormat() to parse numeric stream values:

val stream = when (val streamValue = body["stream"]) {
    is JsonPrimitive -> {
        when {
            streamValue.booleanOrNull != null -> streamValue.boolean
            streamValue.isString -> streamValue.content.toBoolean()
            streamValue.intOrNull != null -> streamValue.int != 0      // 0 → false, non-zero → true
            streamValue.longOrNull != null -> streamValue.long != 0L   // same rule for longs
            else -> true
        }
    }
    else -> true
}
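
For context, a numeric primitive is exactly why the old boolean-only parsing fell through: JsonPrimitive(0) reports booleanOrNull == null, so nothing matched. A standalone illustration using plain kotlinx.serialization (not project code):

import kotlinx.serialization.json.*

fun main() {
    val streamValue = JsonPrimitive(0)
    println(streamValue.booleanOrNull)              // null, so the old boolean-only check fell through
    println(streamValue.intOrNull)                  // 0, now caught by the numeric branch
    println(streamValue.intOrNull?.let { it != 0 }) // false, streaming disabled
}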

LLMDialog.kt - Fixed stream parameter preservation and priority:

  • Removed stream from filter when loading config (was being stripped)
  • Added actualStream parsing that prefers explicit body values over the checkbox
  • Applied to both testConnection() and doOKAction()

Priority: Additional request body > Checkbox
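
A rough sketch of how that priority can be realized when building the request body (buildRequestBody and its parameters are illustrative names, not the dialog's actual code):

import kotlinx.serialization.json.*

// Defaults go in first; the user's additional body is applied last,
// so an explicit "stream" key there overwrites the checkbox default.
fun buildRequestBody(
    model: String,
    temperature: Double,
    checkboxStream: Boolean,
    additionalBody: Map<String, JsonElement>,
): JsonObject = buildJsonObject {
    put("model", model)
    put("temperature", temperature)
    put("stream", checkboxStream) // default from the checkbox
    additionalBody.forEach { (key, value) -> put(key, value) } // explicit values win
}

With the checkbox checked but "stream": 0 present in the additional body, the final body carries 0, and actualStream resolves to false.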

CustomRequestTest.kt - Added test coverage for boolean, numeric, string, and missing stream values.
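
As an illustration only (the real test file may be organized differently), one such case could look like:

import kotlinx.serialization.json.*
import kotlin.test.Test
import kotlin.test.assertFalse
import kotlin.test.assertTrue

class NumericStreamSketchTest {
    @Test
    fun `numeric stream values map to booleans`() {
        assertFalse(parseStream("""{"stream": 0}"""))
        assertTrue(parseStream("""{"stream": 1}"""))
        assertTrue(parseStream("""{}""")) // missing value defaults to streaming
    }

    // Hypothetical helper mirroring the numeric branch of fromLegacyFormat.
    private fun parseStream(raw: String): Boolean {
        val value = Json.parseToJsonElement(raw).jsonObject["stream"] as? JsonPrimitive
        return value?.intOrNull?.let { it != 0 } ?: true
    }
}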

Example Usage

Users can now explicitly override the checkbox with numeric values:

{
  "stream": 0
}

The stream parameter now persists across dialog reopens and correctly selects the non-streaming response resolver ($.choices[0].message.content).
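
For reference, with streaming disabled an OpenAI-compatible provider typically returns a single JSON document shaped roughly like the following, which is what that JSONPath reads (an illustrative payload, not output captured from a real provider):

{
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Hello!" },
      "finish_reason": "stop"
    }
  ]
}

Here $.choices[0].message.content selects "Hello!".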

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • dl.google.com
    • Triggering command: /usr/lib/jvm/temurin-17-jdk-amd64/bin/java --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.prefs/java.util.prefs=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.prefs/java.util.prefs=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.xml/javax.xml.namespace=ALL-UNNAMED -Xmx8g -Dfile.encoding=UTF-8 -Duser.country -Duser.language=en -Duser.variant -cp /home/REDACTED/.gradle/wrapper/dists/gradle-8.14.3-bin/cv11ve7ro1n3o1j4so8xd9n66/gradle-8.14.3/lib/gradle-daemon-main-8.14.3.jar -javaagent:/home/REDACTED/.gradle/wrapper/dists/gradle-8.14.3-bin/cv11ve7ro1n3o1j4so8xd9n66/gradle-8.14.3/lib/agents/gradle-instrumentation-agent-8.14.3.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 8.14.3 (dns block)

If you need me to access, download, or install something from one of these locations, you can either:

Original prompt

This section details the original issue you should resolve

<issue_title>Custom LLM not responding</issue_title>
<issue_description>When I configure a custom LLM in IDEA, the connection test passes, but the AI chat dialog does not seem to respond.
1. When defining a custom model, checking the stream checkbox and explicitly specifying the stream parameter seem to behave differently: when I specify "stream": 0, the connection test succeeds, but the AI chat gives no response.

[screenshot]

2. When defining a custom model with the stream checkbox checked and no explicit stream parameter, the connection test returns 400; the parameter type passed to the backend apparently does not match. With the checkbox unchecked and "stream": 1 specified explicitly, a response comes back, but the streaming result cannot be parsed, with the following error:
Test Connection:
Connection failed: cannot parse with responseResolver: $.result.content, ori data: data:{ data: "id" : "chatcmpl-297481cb2051b1e050903f57fba436b9", data: "index" : 0, data: "result" : "", data: "role" : "", data: "tool_calls" : null, data: "finish_reason" : false, data: ...
The returned result looks like:
data:{
data: "id" : "chatcmpl-5a7eb967-3139-45e5-83de-473cb9e543bf",
data: "index" : 0,
data: "result" : "",
data: "role" : "",
data: "tool_calls" : null,
data: "finish_reason" : false,
data: "finish_detail" : null,
data: "usage" : null,
data: "audioUsage" : null,
data: "extraReturn" : null

Questions:
1) The custom model's "stream" parameter only supports 0 or 1 rather than true or false, so it can only be defined explicitly in the request body, but the next time the dialog is opened, the explicitly defined parameter is lost.
2) Are streaming responses simply unsupported for custom LLMs?
3) For non-streaming results, the connection test passes, so why does the AI chat not respond?</issue_description>

<agent_instructions>LLMProvider2?</agent_instructions>

Comments on the Issue (you are @copilot in this section)



Summary by CodeRabbit

  • New Features

    • Stream configuration now supports numeric JSON values (0 maps to false, non-zero to true) alongside existing boolean and string formats
  • Improvements

    • Enhanced consistency in streaming settings when configured through multiple methods
    • Configuration body stream values now properly override default settings


Copilot AI and others added 2 commits November 18, 2025 02:38
- Support numeric stream values (0/1) in addition to boolean (true/false)
- Preserve stream parameter in Additional request body when explicitly set
- Allow stream in Additional request body to override checkbox value
- Fix stream parameter parsing in LlmConfig.fromLegacyFormat()

Co-authored-by: phodal <[email protected]>
Copilot AI changed the title from "[WIP] Fix custom model not responding in AI dialog" to "Fix custom LLM stream parameter handling for numeric values and persistence" Nov 18, 2025
Copilot AI requested a review from phodal November 18, 2025 02:45
Copilot finished work on behalf of phodal November 18, 2025 02:45
@phodal

phodal commented Nov 20, 2025

@coderabbitai review

@coderabbitai

coderabbitai bot commented Nov 20, 2025

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@coderabbitai

coderabbitai bot commented Nov 20, 2025

Walkthrough

The PR extends JSON stream parameter parsing to support numeric primitives (0/1) in addition to booleans and strings, and ensures consistent streaming state handling in both test connections and model usage by prioritizing explicit body overrides over UI checkbox defaults.

Changes

  • Stream parsing logic: core/src/main/kotlin/cc/unitmesh/devti/llm2/model/LlmConfig.kt
    Extends CustomRequest legacy request parsing to interpret numeric JSON primitives in the stream field: 0 maps to false, non-zero maps to true. Adds int and long branches to the existing boolean and string handling.
  • Dialog stream handling: core/src/main/kotlin/cc/unitmesh/devti/settings/dialog/LLMDialog.kt
    Introduces actualStream derivation from body["stream"] when explicitly provided, falling back to the checkbox state otherwise. Applies this logic consistently across the test-connection and save flows, preserves the stream field in body construction, and uses actualStream to select the default response-resolver JSONPath.
  • Stream parsing tests: core/src/test/kotlin/cc/unitmesh/devti/llm2/model/CustomRequestTest.kt
    New test suite for CustomRequest.fromLegacyFormat validating stream parsing for boolean (true/false), numeric (0/1), and string ("true"/"false") formats, default stream behavior, and custom header parsing.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

  • LLMDialog.kt requires careful attention to verify the actualStream derivation logic is applied consistently across all code paths (test connection, save flow, body construction) and that body overrides properly take precedence over checkbox state.
  • LlmConfig.kt numeric type handling is straightforward but should be validated for edge cases and consistency with existing boolean/string branches.
  • CustomRequestTest.kt provides good coverage but tests should be reviewed to ensure all parsing scenarios are represented.

Poem

🐰 A stream once flowed in boolean form,
But now numeric tides transform—
Zero and one now find their way,
While body overrides have their say.
The dialog listens, consistent and true,
Custom models respond as they should do! ✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning: Docstring coverage is 0.00%, below the required threshold of 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (4 passed)
  • Description Check ✅ Passed: Check skipped; CodeRabbit's high-level summary is enabled.
  • Title check ✅ Passed: The title clearly and accurately summarizes the main changes: fixing stream parameter handling for numeric values and ensuring persistence across dialog reopens.
  • Linked Issues check ✅ Passed: The PR addresses all primary coding objectives from issue #461: parsing numeric stream values (0/1), persisting explicit stream parameters across dialog reopens, and handling non-streaming responses correctly.
  • Out of Scope Changes check ✅ Passed: All changes are directly scoped to the linked issue: CustomRequest parsing logic, LLMDialog persistence and stream handling, and corresponding test coverage for the new functionality.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (3)
core/src/test/kotlin/cc/unitmesh/devti/llm2/model/CustomRequestTest.kt (1)

11-79: Stream parsing tests are comprehensive; only minor optional extensions

The tests nicely exercise boolean, numeric 0/1, string, and missing stream cases, and verify both result.stream and the raw body["stream"] values, which matches the intended legacy parsing behavior. If you later want to harden this further, you could:

  • Add one non‑1 numeric case (e.g., 2) to confirm “any non‑zero” stays true.
  • Optionally DRY the repeated fromLegacyFormat + assertions via a small helper or parameterized test.

These are purely optional; current coverage already looks good.

core/src/main/kotlin/cc/unitmesh/devti/llm2/model/LlmConfig.kt (1)

45-55: Numeric stream handling looks correct; consider centralizing the parsing logic

The extended when branch for JsonPrimitive correctly maps:

  • booleans via booleanOrNull,
  • "true"/"false" strings via toBoolean(),
  • integers/longs via != 0 / != 0L,

which matches the new tests and the intent to support numeric 0/1 while defaulting safely to true when ambiguous.

The same boolean/numeric interpretation logic is now duplicated in LLMDialog (for actualStream in both testConnection() and doOKAction()). Consider extracting a small helper (e.g., on JsonElement or in CustomRequest) to compute a Boolean from a stream element so that future tweaks only need to be made in one place.

core/src/main/kotlin/cc/unitmesh/devti/settings/dialog/LLMDialog.kt (1)

269-299: actualStream derivation and usage are consistent; consider DRY’ing the shared logic

The changes in both testConnection() and doOKAction() form a coherent flow:

  • Build body with:
    • Explicit model and temperature fields,
    • A default stream from the checkbox,
    • Then overlay additionalBody so an explicitly provided stream in JSON wins.
  • Derive actualStream from the final body["stream"], interpreting:
    • booleans directly,
    • "true"/"false" strings via toBoolean(),
    • numeric 0 as false and any other integer/long as true,
    • falling back to the checkbox when value is absent/ambiguous.
  • Use actualStream both for CustomRequest.stream and to pick the default JSONPath in responseResolver when the field is empty.

This resolves the mismatches described in the linked issue (checkbox vs explicit body and numeric 0/1) and keeps Test Connection behavior consistent with actual dialog usage.

Given that the actualStream computation and accompanying comments are duplicated here (twice) and conceptually mirror the logic in LlmConfig.fromLegacyFormat, you might want to extract a shared helper (e.g., fun resolveStream(streamElement: JsonElement?, fallback: Boolean): Boolean) and call it from both methods and from the legacy parser. That would reduce the chance of subtle divergence if you ever tweak the rules again.

Also applies to: 303-308, 403-412, 414-427, 445-455
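
Such a resolveStream helper might look roughly like this (a sketch of the suggestion above, not committed code):

import kotlinx.serialization.json.*

// One shared interpretation of a "stream" element; callers supply their own
// fallback (the checkbox state in LLMDialog, true in fromLegacyFormat).
fun resolveStream(streamElement: JsonElement?, fallback: Boolean): Boolean {
    val primitive = streamElement as? JsonPrimitive ?: return fallback
    return when {
        primitive.booleanOrNull != null -> primitive.boolean
        primitive.isString -> primitive.content.toBoolean()
        primitive.longOrNull != null -> primitive.long != 0L // covers Int-sized values too
        else -> fallback
    }
}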

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between f85bbf8 and 93fec9f.

📒 Files selected for processing (3)
  • core/src/main/kotlin/cc/unitmesh/devti/llm2/model/LlmConfig.kt (1 hunks)
  • core/src/main/kotlin/cc/unitmesh/devti/settings/dialog/LLMDialog.kt (4 hunks)
  • core/src/test/kotlin/cc/unitmesh/devti/llm2/model/CustomRequestTest.kt (1 hunks)
🔇 Additional comments (1)
core/src/main/kotlin/cc/unitmesh/devti/settings/dialog/LLMDialog.kt (1)

117-121: Preserving stream in the editable body aligns with explicit override semantics

Keeping stream in bodyWithoutModelTemp (only filtering out model and temperature) ensures that any explicitly configured stream value—especially numeric 0/1—shows up in “Additional request body (JSON)” and survives round‑trips through the dialog. This matches the intent to let advanced users override the checkbox via the raw body while still exposing model/temperature as first‑class fields.

Looks good as implemented.

@phodal phodal marked this pull request as ready for review November 20, 2025 08:03

Development

Successfully merging this pull request may close these issues.

Custom LLM not responding (自定义大模型无响应问题)
