
Conversation


youest commented Nov 21, 2025

fixes #2467

Problem

The _MessageStreamManager.__enter__() wrapper was calling __api_request(), which returns a low-level anthropic.Stream object rather than the high-level anthropic.lib.streaming.MessageStream object.

This caused AttributeError: 'Stream' object has no attribute 'get_final_message' when users tried to call helper methods like get_final_message(), get_final_text(), or until_done() on instrumented streams.
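
A minimal reproduction of the failing pattern with instrumentation enabled (the client setup, prompt, and model name below are illustrative, not taken from the issue):

import anthropic

client = anthropic.Anthropic()

with client.messages.stream(
    model="claude-3-5-sonnet-latest",  # illustrative model name
    max_tokens=64,
    messages=[{"role": "user", "content": "Hello"}],
) as stream:
    for _ in stream:              # consuming the stream still worked
        pass
    stream.get_final_message()    # AttributeError before this fix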

Solution

Changed __enter__() to delegate to the wrapped MessageStreamManager.__enter__() method, which returns the proper MessageStream object with all helper methods intact.

Before:

def __enter__(self) -> Iterator[str]:
    raw = self.__api_request()  # Returns low-level Stream
    return _MessagesStream(raw, self._self_with_span)

After:

def __enter__(self) -> "MessageStream":
    # Delegate to the wrapped MessageStreamManager's __enter__
    message_stream = self.__wrapped__.__enter__()
    return _MessagesStream(message_stream, self._self_with_span)

Impact

This fix allows users to:

  • Use stream.get_final_message() after consuming the stream
  • Access other helper methods like get_final_text() and until_done()
  • Use CrewAI framework's Anthropic integration without errors
  • Follow standard Anthropic streaming patterns with instrumentation enabled

Testing

The existing streaming tests in test_instrumentor.py verify that basic streaming functionality works. This fix maintains backward compatibility while restoring access to the missing helper methods.
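
A sketch of the kind of assertion the updated tests add (the fixture and model names are illustrative; instrumentor setup and VCR cassette wiring follow the existing tests in test_instrumentor.py):

def test_stream_helper_methods(anthropic_client) -> None:  # illustrative fixture
    with anthropic_client.messages.stream(
        model="claude-3-5-sonnet-latest",  # illustrative model name
        max_tokens=64,
        messages=[{"role": "user", "content": "Hello"}],
    ) as stream:
        for _ in stream.text_stream:
            pass
        final_message = stream.get_final_message()  # raised AttributeError before the fix
    assert final_message.content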


Note

Return the real MessageStream from MessageStreamManager.__enter__ and pass through __exit__, enabling helper methods and proper cleanup; add tests and cassette to verify.

  • Instrumentation (Anthropic):
    • Stream manager enter/exit:
      • Change _MessageStreamManager.__enter__() to call wrapped MessageStreamManager.__enter__() and return _MessagesStream wrapping the real MessageStream.
      • Add _MessagesStream.__exit__() to delegate to the underlying manager's __exit__ for proper cleanup (see the sketch after this list).
    • TYPE_CHECKING: include MessageStream type.
  • Tests:
    • Update test_anthropic_instrumentation_stream_message to assert stream.get_final_message() works.
    • Add test_anthropic_instrumentation_stream_context_manager_exit to verify manager __exit__ is called and get_final_message() works.
    • Add VCR cassette tests/.../test_anthropic_instrumentation_stream_context_manager_exit.yaml.
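
A minimal sketch of the __exit__ delegation described above (the wrapt.ObjectProxy base and the attribute names are assumptions; only the _manager parameter and the delegation itself come from the change description):

import wrapt

class _MessagesStream(wrapt.ObjectProxy):
    def __init__(self, message_stream, with_span, _manager):
        super().__init__(message_stream)
        self._self_with_span = with_span
        self._self_manager = _manager  # the wrapped MessageStreamManager

    def __exit__(self, exc_type, exc_value, traceback):
        # Delegate cleanup to the underlying manager so stream.close() runs on exit.
        return self._self_manager.__exit__(exc_type, exc_value, traceback)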

Written by Cursor Bugbot for commit 397c19c. This will update automatically on new commits.

Resolves Arize-ai#2467

The wrapper was calling __api_request() which returns a low-level
Stream object instead of the high-level MessageStream object with
helper methods like get_final_message().

This fix delegates to the wrapped MessageStreamManager's __enter__
method to get the proper MessageStream object.
youest requested a review from a team as a code owner November 21, 2025 10:47
dosubot added the size:S This PR changes 10-29 lines, ignoring generated files. label Nov 21, 2025

github-actions bot commented Nov 21, 2025

CLA Assistant Lite bot: All contributors have signed the CLA ✍️ ✅

Add test case to verify that get_final_message() can be called
on instrumented MessageStream objects after consuming the stream.
This validates the fix for issue Arize-ai#2467.

youest commented Nov 21, 2025

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Nov 21, 2025
…anup

Fixes context manager protocol violation where __exit__() was not called
on the wrapped MessageStreamManager, causing stream.close() to never execute.

- Added _manager parameter to _MessagesStream.__init__()
- Implemented __exit__() that delegates to manager's __exit__()
- Added test to verify __exit__() is properly called
- All 22 tests pass

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
dosubot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:S This PR changes 10-29 lines, ignoring generated files. labels Nov 21, 2025
Development

Successfully merging this pull request may close these issues.

[BUG] Anthropic streaming wrapper returns wrong object, breaking get_final_message()
