
@brando90
Created August 14, 2024 19:01
vllm_python_potential_way_to_resolve_issues.txt
# # 'torch==2.1.2', # torch 2.2 not supported due to vllm; see: https://github.com/vllm-project/vllm/issues/2747
...
# 'transformers>=4.40',
# 'accelerate==0.29.2',
...
# 'datasets==2.14.7',
# 'evaluate==0.4.1',
# 'bitsandbytes==0.43.0',
# 'einops',
# 'flash-attn>=2.5.8',
# 'vllm==0.4.1', # my past projects used 0.4.0.post1; ref: https://github.com/vllm-project/vllm/issues/2747
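
For context, a minimal sketch of how these pins could be applied in a setup.py to work around the torch/vllm version conflict. The package name, version, and file layout below are assumptions for illustration, not from this gist:

from setuptools import setup, find_packages

setup(
    name='my_project',              # hypothetical package name
    version='0.1.0',
    packages=find_packages(),
    install_requires=[
        # torch 2.2 not supported due to vllm; see https://github.com/vllm-project/vllm/issues/2747
        'torch==2.1.2',
        'transformers>=4.40',
        'accelerate==0.29.2',
        'datasets==2.14.7',
        'evaluate==0.4.1',
        'bitsandbytes==0.43.0',
        'einops',
        'flash-attn>=2.5.8',
        # earlier projects used vllm 0.4.0.post1; see the issue above
        'vllm==0.4.1',
    ],
)

With this in place, pip install -e . would resolve the pinned set in one step.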
@brando90
Author

GitHub issue in the vllm repo: vllm-project/vllm#2747
