
Remove vLLM reranker #13

Open

CesarPetrescu wants to merge 1 commit into main from 2axv53-codex/implement-rerank-forwarding-to-vllm-in-vllm.py


Conversation

@CesarPetrescu
Owner

Summary

  • drop vLLM launcher and requirement
  • launch the Transformers-based reranker from loadmodel.py
  • update README with new reranker instructions
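
A minimal sketch of what "launch the Transformers-based reranker from loadmodel.py" could look like, assuming `reranker.py` accepts a `--port` flag; the function names and the flag are illustrative, not taken from the actual diff:

```python
import subprocess
import sys


def build_reranker_command(script="reranker.py", port=8081):
    """Build the argv for the Transformers-based reranker process.

    `script` and `--port` are assumptions about reranker.py's CLI.
    """
    return [sys.executable, script, "--port", str(port)]


def launch_reranker(**kwargs):
    """Spawn the reranker as a child process and return its handle,
    replacing the dropped vLLM launcher."""
    return subprocess.Popen(build_reranker_command(**kwargs))
```

Spawning a child process with `sys.executable` keeps the reranker in the same Python environment as `loadmodel.py`, so the Transformers dependencies only need to be installed once.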

Testing

  • python -m py_compile reranker.py loadmodel.py autodevops.py
  • python reranker.py --help (fails: ModuleNotFoundError: No module named 'torch')
  • python loadmodel.py --help (fails: ModuleNotFoundError: No module named 'dotenv')

https://chatgpt.com/codex/tasks/task_e_685d85379abc8330b688f7a6b0b08cbf
