# Fix reasoning token streaming for `gpt-5-mini` and `gpt-5-nano` models (AG-UI) #115
Fixes #92 (applied to the `ag-ui` branch)

This PR applies the reasoning streaming fix from #94 to the `ag-ui` branch, adapting it to use the AG-UI event format instead of legacy chunk types.

## 🎯 Changes
**Type definitions:**

- Added `OpenAIReasoningOptions` to the `gpt-5-mini` and `gpt-5-nano` model type definitions in `model-meta.ts`
- Fixed the `summary` option placement in `OpenAIReasoningOptions` - moved inside the `reasoning` object to match the OpenAI SDK structure
- Added an `OpenAIReasoningOptionsWithConcise` interface for the `computer-use-preview` model (supports the `concise` summary option); see the sketch after this list
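For reviewers, here is a minimal sketch of the option shapes described above. The interface names `OpenAIReasoningOptions` and `OpenAIReasoningOptionsWithConcise` come from this PR; the exact fields and literal values are assumptions, and the real definitions in `model-meta.ts` may differ.

```ts
// Illustrative only: the real definitions live in model-meta.ts and may differ.
// `summary` now sits inside the `reasoning` object, matching the OpenAI SDK.
interface OpenAIReasoningOptions {
  reasoning?: {
    effort?: 'low' | 'medium' | 'high';
    summary?: 'auto' | 'detailed';
  };
}

// computer-use-preview additionally accepts the `concise` summary level.
interface OpenAIReasoningOptionsWithConcise {
  reasoning?: {
    effort?: 'low' | 'medium' | 'high';
    summary?: 'auto' | 'concise' | 'detailed';
  };
}

// gpt-5-mini / gpt-5-nano entries in model-meta.ts would then reference
// OpenAIReasoningOptions (intersection with other chat options omitted).
type Gpt5MiniOptions = OpenAIReasoningOptions;
```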
**Runtime streaming:**

- Handle `response.reasoning_summary_text.delta` events in `openai-adapter.ts` to stream reasoning summaries
- Emit `STEP_STARTED` and `STEP_FINISHED` events instead of legacy `thinking` chunks
- Added a `hasEmittedStepStarted` flag and `stepId` tracking for consistent lifecycle management (see the sketch after this list)
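A rough sketch of that lifecycle is below. Only `STEP_STARTED`/`STEP_FINISHED`, `hasEmittedStepStarted`, `stepId`, and the `response.reasoning_summary_text.delta` event name come from this PR; the `emit()` helper, payload fields, and the event used to close the step are assumptions for illustration.

```ts
// Illustrative sketch, not the adapter's actual code.
let hasEmittedStepStarted = false;
let stepId: string | undefined;

function onResponsesStreamEvent(
  event: { type: string; delta?: string },
  emit: (e: Record<string, unknown>) => void,
): void {
  if (event.type === 'response.reasoning_summary_text.delta') {
    if (!hasEmittedStepStarted) {
      // Open a reasoning step exactly once per summary and remember its id.
      stepId = crypto.randomUUID();
      emit({ type: 'STEP_STARTED', stepName: stepId });
      hasEmittedStepStarted = true;
    }
    // event.delta (the reasoning summary text) is streamed to the client here;
    // the exact AG-UI content event used for it is not spelled out in this PR.
  } else if (
    event.type === 'response.reasoning_summary_text.done' &&
    hasEmittedStepStarted
  ) {
    // Close the step so the AG-UI client can finalize the reasoning block.
    // (The actual adapter may close on a different terminal event.)
    emit({ type: 'STEP_FINISHED', stepName: stepId });
    hasEmittedStepStarted = false;
  }
}
```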
**Tool result serialization (⚠️ Additional change - requires careful review):**

- Updated `handleToolCallEndEvent` (`chat.ts:313-331`) to ensure tool results are consistently serialized to strings before emitting events
- Updated `ai-chat.test.ts:2416` to expect the serialized string format instead of a raw object
- This matches the legacy `tool_result` chunk behavior. Please review carefully as it affects tool result event emission. A sketch of the serialization follows this list.
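The behavior can be summarized by a helper like the one below. The helper name and placement are hypothetical; the actual change lives in `handleToolCallEndEvent` at `chat.ts:313-331`.

```ts
// Hypothetical helper: non-string tool results are stringified before the
// event is emitted, matching legacy tool_result chunk behavior.
function serializeToolResult(result: unknown): string {
  if (typeof result === 'string') return result;
  try {
    // JSON.stringify can return undefined (e.g. for undefined input),
    // so fall back to String() in that case as well.
    return JSON.stringify(result) ?? String(result);
  } catch {
    // Fallback for circular or otherwise non-serializable values.
    return String(result);
  }
}
```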
**Tests:**

- Updated `model-meta.test.ts` to expect reasoning support for `gpt-5-mini`, `gpt-5-nano`, and `computer-use-preview`
**Docs:**

- Updated `docs/adapters/openai.md` to include `gpt-5-mini`, `gpt-5-nano`, and `computer-use-preview` in the supported reasoning models
- Documented the `summary` option inside the `reasoning` object; an example follows below
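As documented, `summary` now nests inside `reasoning`. A hypothetical options object is shown below; the literal values and surrounding call are illustrative, not taken from the docs.

```ts
// Hypothetical request options showing the documented nesting.
const options = {
  model: 'gpt-5-mini',
  reasoning: {
    effort: 'medium',
    summary: 'auto', // `concise` is only accepted by computer-use-preview
  },
} as const;
```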
## 🔄 Differences from Original PR (#94)

- Uses `STEP_STARTED`/`STEP_FINISHED` events instead of legacy `thinking` chunks
- Adds `OpenAIReasoningOptionsWithConcise` for `computer-use-preview`
## ✅ Checklist

- `pnpm run test:pr`
- `pnpm run test:lib`

## 🚀 Release Impact
Please pay special attention to:
- **Tool result serialization (`chat.ts:313-331`):** This change serializes tool results to strings before emitting events. While this matches legacy behavior and fixes test failures, it is outside the original issue scope and should be reviewed carefully for any edge cases or breaking changes.
- **AG-UI event format:** All reasoning events now use the `STEP_STARTED`/`STEP_FINISHED` format, ensuring compatibility with the AG-UI protocol while maintaining backward compatibility through legacy chunk type support.