Conversation


@Aymendje Aymendje commented Oct 17, 2025

Purpose

Fix #26491

When using the flag --otlp-traces-endpoint, the server crashes on startup with the following error:

(APIServer pid=1) ModuleNotFoundError: No module named 'opentelemetry'

This PR adds the missing OpenTelemetry packages to the base Docker image, following the existing OpenTelemetry documentation.
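
For reference, a minimal invocation that exercises this code path looks like the following; the model name and collector endpoint are placeholders, not taken from this PR:

vllm serve facebook/opt-125m \
          --otlp-traces-endpoint grpc://localhost:4317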

Test Plan

Rebuild the Docker image with the extra packages installed (a Dockerfile sketch follows the command below):

pip install \
          'opentelemetry-sdk>=1.26.0,<1.27.0' \
          'opentelemetry-api>=1.26.0,<1.27.0' \
          'opentelemetry-exporter-otlp>=1.26.0,<1.27.0' \
          'opentelemetry-semantic-conventions-ai>=0.4.1,<0.5.0'
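
For example, the packages can be layered on top of the published nightly image from the linked issue in a throwaway Dockerfile; the base tag and layering shown here are illustrative, not part of this PR:

FROM vllm/vllm-openai:nightly
RUN pip install \
          'opentelemetry-sdk>=1.26.0,<1.27.0' \
          'opentelemetry-api>=1.26.0,<1.27.0' \
          'opentelemetry-exporter-otlp>=1.26.0,<1.27.0' \
          'opentelemetry-semantic-conventions-ai>=0.4.1,<0.5.0'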

Test Result

No more crashes on startup when --otlp-traces-endpoint is set.


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test command.
  • The test results, such as pasting the results comparison before and after, or e2e results
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

@Aymendje Aymendje changed the title Patch 2 Add missing opentelemetry dependency to base docker image Oct 17, 2025
@mergify mergify bot added the ci/build label Oct 17, 2025
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request addresses a critical startup crash caused by missing opentelemetry modules when the --otlp-traces-endpoint flag is used. The fix involves adding the necessary opentelemetry packages to the requirements/common.txt file. I have added a review comment to highlight the importance of keeping dependencies up-to-date and suggest a way to automate this process.


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.


Added OpenTelemetry dependencies for tracing support.

Signed-off-by: Aymen Djellal <[email protected]>
@bbartels
Contributor

I think the version conflict with Ray has been resolved; it might be worth just adding it to the default set of dependencies in requirements/common.txt.

@Aymendje
Author

I think the version conflict with Ray has been resolved; it might be worth just adding it to the default set of dependencies in requirements/common.txt.

You are right; I tried removing the upper limit and it seems to work fine for me.
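
For illustration, dropping the upper bound would leave entries in requirements/common.txt along these lines (a sketch based on the versions in the test plan, not the exact final diff):

opentelemetry-sdk>=1.26.0
opentelemetry-api>=1.26.0
opentelemetry-exporter-otlp>=1.26.0
opentelemetry-semantic-conventions-ai>=0.4.1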



Development

Successfully merging this pull request may close these issues.

[Bug]: vllm/vllm-openai:nightly container crashes with --otlp-traces-endpoint due to missing opentelemetry package
