@JRMeyer commented Nov 28, 2025

The SDK already accepts `top_logprobs` in ModelSettings and passes it to the
API, but the logprobs returned in the response were discarded during
conversion. This change:

1. Adds an optional `logprobs` field to the `ModelResponse` dataclass
2. Extracts logprobs from `choice.logprobs.content` in the chat completions
   model and includes them in the `ModelResponse`

This enables use cases like RLHF training, confidence scoring, and
uncertainty estimation that require access to token-level log probabilities.
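The change described above can be sketched as follows. This is a hedged illustration, not the SDK's actual source: the `ModelResponse` field layout, the `extract_logprobs` helper, and the stand-in `choice` object are assumptions modeled on the OpenAI chat completions response shape (`choice.logprobs.content`).

```python
from dataclasses import dataclass, field
from types import SimpleNamespace
from typing import Any, Optional

# Hypothetical sketch of the PR's change; names here follow the
# description above, not the actual SDK source.

@dataclass
class ModelResponse:
    output: list = field(default_factory=list)
    # New optional field: token-level logprobs, None unless requested.
    logprobs: Optional[list] = None

def extract_logprobs(choice: Any) -> Optional[list]:
    """Return choice.logprobs.content if present, else None."""
    lp = getattr(choice, "logprobs", None)
    return lp.content if lp is not None else None

# Minimal stand-in for a chat completions choice object.
choice = SimpleNamespace(
    logprobs=SimpleNamespace(content=[{"token": "Hi", "logprob": -0.01}])
)
response = ModelResponse(logprobs=extract_logprobs(choice))
```

With `top_logprobs` unset, `choice.logprobs` is `None` and the helper returns `None`, so existing callers that never request logprobs see no behavior change.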