
Conversation


@shellmayr shellmayr commented Oct 3, 2025

  • Add truncation for long message histories to prevent spans from being rejected for exceeding size limits
  • Messages are removed, oldest first, if the span attribute exceeds 20 kB, and a _meta entry is added when messages are removed (see the sketch below)
  • Huggingface Hub currently doesn't seem to send spans that are OTEL-compatible - the attribute is currently just a string, as evidenced by this test
  • Annotations
    • Removing entire messages will cause a _meta tag to be set via AnnotatedValue

Truncation for single messages that exceed the limit will be added separately.

Closes TET-1208
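
A minimal sketch of the truncation described above, based only on the helper names visible in the stack traces below (`truncate_messages_by_size` and `truncate_and_annotate_messages` in `sentry_sdk/ai/utils.py`); the 20 kB constant, the function signatures, and the metadata shape are assumptions for illustration, not the exact implementation in this PR:

```python
import json

from sentry_sdk.utils import AnnotatedValue

# Illustrative limit; the PR description mentions 20 kB.
MAX_GEN_AI_MESSAGES_BYTES = 20 * 1024


def _serialized_size(messages):
    # Measure the compact JSON representation, i.e. roughly what would
    # end up on the span attribute.
    return len(json.dumps(messages, separators=(",", ":")).encode("utf-8"))


def truncate_messages_by_size(messages, max_bytes=MAX_GEN_AI_MESSAGES_BYTES):
    # Drop whole messages, oldest first, until the payload fits.
    truncated = list(messages)
    while truncated and _serialized_size(truncated) > max_bytes:
        truncated.pop(0)
    return truncated


def truncate_and_annotate_messages(messages, max_bytes=MAX_GEN_AI_MESSAGES_BYTES):
    truncated = truncate_messages_by_size(messages, max_bytes)
    if len(truncated) == len(messages):
        return messages
    # Wrapping the value in AnnotatedValue is what lets the SDK emit a
    # _meta entry noting that entire messages were removed; the metadata
    # dict here is a placeholder shape, not necessarily the SDK's real one.
    return AnnotatedValue(truncated, {"len": len(messages)})
```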


linear bot commented Oct 3, 2025

@shellmayr shellmayr force-pushed the shellmayr/fix/truncate-long-gen_ai-messages branch from 92329fb to f66a772 on October 7, 2025 14:20
@shellmayr shellmayr force-pushed the shellmayr/fix/truncate-long-gen_ai-messages branch from c6716c6 to 294c66c on October 9, 2025 09:14

codecov bot commented Oct 15, 2025

❌ 4 Tests Failed:

Tests completed: 23770 | Failed: 4 | Passed: 23766 | Skipped: 1716
View the top 3 failed test(s) by shortest run time
tests.integrations.openai_agents.test_openai_agents::test_tool_execution_span
Stack Traces | 0.258s run time
.../integrations/openai_agents/test_openai_agents.py:432: in test_tool_execution_span
    await agents.Runner.run(
.tox/py3.11-openai_agents-v0.0.19/lib/python3.11............/site-packages/agents/run.py:199: in run
    return await runner.run(
.../openai_agents/patches/runner.py:43: in wrapper
    raise exc from None
.../openai_agents/patches/runner.py:33: in wrapper
    result = await original_func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.11-openai_agents-v0.0.19/lib/python3.11............/site-packages/agents/run.py:417: in run
    turn_result = await self._run_single_turn(
.../openai_agents/patches/agent_run.py:78: in patched_run_single_turn
    result = await original_run_single_turn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.11-openai_agents-v0.0.19/lib/python3.11............/site-packages/agents/run.py:905: in _run_single_turn
    new_response = await cls._get_new_response(
.tox/py3.11-openai_agents-v0.0.19/lib/python3.11............/site-packages/agents/run.py:1066: in _get_new_response
    new_response = await model.get_response(
.../openai_agents/patches/models.py:42: in wrapped_get_response
    update_ai_client_span(span, agent, kwargs, result)
.../openai_agents/spans/ai_client.py:40: in update_ai_client_span
    _set_input_data(span, get_response_kwargs)
.../integrations/openai_agents/utils.py:141: in _set_input_data
    messages_data = truncate_and_annotate_messages(
sentry_sdk/ai/utils.py:171: in truncate_and_annotate_messages
    truncated_messages = truncate_messages_by_size(messages, max_bytes)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/ai/utils.py:118: in truncate_messages_by_size
    serialized_json = json.dumps(truncated_messages, separators=(",", ":"))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.11.13............/x64/lib/python3.11/json/__init__.py:238: in dumps
    **kw).encode(obj)
          ^^^^^^^^^^^
.../hostedtoolcache/Python/3.11.13............/x64/lib/python3.11/json/encoder.py:200: in encode
    chunks = self.iterencode(o, _one_shot=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.11.13............/x64/lib/python3.11/json/encoder.py:258: in iterencode
    return _iterencode(o, 0)
           ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.11.13............/x64/lib/python3.11/json/encoder.py:180: in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
E   TypeError: Object of type SerializationIterator is not JSON serializable
tests.integrations.openai_agents.test_openai_agents::test_tool_execution_span
Stack Traces | 0.412s run time
.../integrations/openai_agents/test_openai_agents.py:432: in test_tool_execution_span
    await agents.Runner.run(
.tox/py3.10-openai_agents-v0.0.19/lib/python3.10............/site-packages/agents/run.py:199: in run
    return await runner.run(
.../openai_agents/patches/runner.py:43: in wrapper
    raise exc from None
.../openai_agents/patches/runner.py:33: in wrapper
    result = await original_func(*args, **kwargs)
.tox/py3.10-openai_agents-v0.0.19/lib/python3.10............/site-packages/agents/run.py:417: in run
    turn_result = await self._run_single_turn(
.../openai_agents/patches/agent_run.py:78: in patched_run_single_turn
    result = await original_run_single_turn(*args, **kwargs)
.tox/py3.10-openai_agents-v0.0.19/lib/python3.10............/site-packages/agents/run.py:905: in _run_single_turn
    new_response = await cls._get_new_response(
.tox/py3.10-openai_agents-v0.0.19/lib/python3.10............/site-packages/agents/run.py:1066: in _get_new_response
    new_response = await model.get_response(
.../openai_agents/patches/models.py:42: in wrapped_get_response
    update_ai_client_span(span, agent, kwargs, result)
.../openai_agents/spans/ai_client.py:40: in update_ai_client_span
    _set_input_data(span, get_response_kwargs)
.../integrations/openai_agents/utils.py:141: in _set_input_data
    messages_data = truncate_and_annotate_messages(
sentry_sdk/ai/utils.py:171: in truncate_and_annotate_messages
    truncated_messages = truncate_messages_by_size(messages, max_bytes)
sentry_sdk/ai/utils.py:118: in truncate_messages_by_size
    serialized_json = json.dumps(truncated_messages, separators=(",", ":"))
.../hostedtoolcache/Python/3.10.18............/x64/lib/python3.10/json/__init__.py:238: in dumps
    **kw).encode(obj)
.../hostedtoolcache/Python/3.10.18............/x64/lib/python3.10/json/encoder.py:199: in encode
    chunks = self.iterencode(o, _one_shot=True)
.../hostedtoolcache/Python/3.10.18............/x64/lib/python3.10/json/encoder.py:257: in iterencode
    return _iterencode(o, 0)
.../hostedtoolcache/Python/3.10.18............/x64/lib/python3.10/json/encoder.py:179: in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
E   TypeError: Object of type SerializationIterator is not JSON serializable
tests.integrations.openai_agents.test_openai_agents::test_tool_execution_span
Stack Traces | 0.486s run time
.../integrations/openai_agents/test_openai_agents.py:432: in test_tool_execution_span
    await agents.Runner.run(
.tox/py3.12-openai_agents-v0.2.11/lib/python3.12............/site-packages/agents/run.py:267: in run
    return await runner.run(
.../openai_agents/patches/runner.py:43: in wrapper
    raise exc from None
.../openai_agents/patches/runner.py:33: in wrapper
    result = await original_func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-openai_agents-v0.2.11/lib/python3.12............/site-packages/agents/run.py:504: in run
    turn_result = await self._run_single_turn(
.../openai_agents/patches/agent_run.py:78: in patched_run_single_turn
    result = await original_run_single_turn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-openai_agents-v0.2.11/lib/python3.12............/site-packages/agents/run.py:1159: in _run_single_turn
    new_response = await cls._get_new_response(
.tox/py3.12-openai_agents-v0.2.11/lib/python3.12............/site-packages/agents/run.py:1398: in _get_new_response
    new_response = await model.get_response(
.../openai_agents/patches/models.py:42: in wrapped_get_response
    update_ai_client_span(span, agent, kwargs, result)
.../openai_agents/spans/ai_client.py:40: in update_ai_client_span
    _set_input_data(span, get_response_kwargs)
.../integrations/openai_agents/utils.py:141: in _set_input_data
    messages_data = truncate_and_annotate_messages(
sentry_sdk/ai/utils.py:171: in truncate_and_annotate_messages
    truncated_messages = truncate_messages_by_size(messages, max_bytes)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/ai/utils.py:118: in truncate_messages_by_size
    serialized_json = json.dumps(truncated_messages, separators=(",", ":"))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11............/x64/lib/python3.12/json/__init__.py:238: in dumps
    **kw).encode(obj)
          ^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11............/x64/lib/python3.12/json/encoder.py:200: in encode
    chunks = self.iterencode(o, _one_shot=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11............/x64/lib/python3.12/json/encoder.py:258: in iterencode
    return _iterencode(o, 0)
           ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11............/x64/lib/python3.12/json/encoder.py:180: in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
E   TypeError: Object of type SerializationIterator is not JSON serializable
tests.integrations.openai_agents.test_openai_agents::test_tool_execution_span
Stack Traces | 0.501s run time
.../integrations/openai_agents/test_openai_agents.py:432: in test_tool_execution_span
    await agents.Runner.run(
.tox/py3.13-openai_agents-v0.2.11/lib/python3.13............/site-packages/agents/run.py:267: in run
    return await runner.run(
.../openai_agents/patches/runner.py:43: in wrapper
    raise exc from None
.../openai_agents/patches/runner.py:33: in wrapper
    result = await original_func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-openai_agents-v0.2.11/lib/python3.13............/site-packages/agents/run.py:504: in run
    turn_result = await self._run_single_turn(
.../openai_agents/patches/agent_run.py:78: in patched_run_single_turn
    result = await original_run_single_turn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-openai_agents-v0.2.11/lib/python3.13............/site-packages/agents/run.py:1159: in _run_single_turn
    new_response = await cls._get_new_response(
.tox/py3.13-openai_agents-v0.2.11/lib/python3.13............/site-packages/agents/run.py:1398: in _get_new_response
    new_response = await model.get_response(
.../openai_agents/patches/models.py:42: in wrapped_get_response
    update_ai_client_span(span, agent, kwargs, result)
.../openai_agents/spans/ai_client.py:40: in update_ai_client_span
    _set_input_data(span, get_response_kwargs)
.../integrations/openai_agents/utils.py:141: in _set_input_data
    messages_data = truncate_and_annotate_messages(
sentry_sdk/ai/utils.py:171: in truncate_and_annotate_messages
    truncated_messages = truncate_messages_by_size(messages, max_bytes)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/ai/utils.py:118: in truncate_messages_by_size
    serialized_json = json.dumps(truncated_messages, separators=(",", ":"))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.13.7............/x64/lib/python3.13/json/__init__.py:238: in dumps
    **kw).encode(obj)
          ^^^^^^^^^^^
.../hostedtoolcache/Python/3.13.7............/x64/lib/python3.13/json/encoder.py:200: in encode
    chunks = self.iterencode(o, _one_shot=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.13.7............/x64/lib/python3.13/json/encoder.py:261: in iterencode
    return _iterencode(o, 0)
           ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.13.7............/x64/lib/python3.13/json/encoder.py:180: in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
E   TypeError: Object of type SerializationIterator is not JSON serializable
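
All four failures share the same root cause: the messages passed to `truncate_messages_by_size` contain an object (a `SerializationIterator`) that the standard `json` encoder cannot serialize. A hedged sketch of one possible mitigation, using the `default` hook of `json.dumps` to fall back to a string representation for unknown types; this is illustrative only and not necessarily the fix adopted in this PR:

```python
import json


def _serialize_messages(messages):
    # json.dumps raises TypeError for objects it does not recognize,
    # such as the SerializationIterator in the traces above. Supplying
    # default=str converts those objects to strings instead of failing,
    # so the size measurement can still succeed.
    return json.dumps(messages, separators=(",", ":"), default=str)
```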

To view more test analytics, go to the Test Analytics Dashboard
