llama.cpp/examples/ray/requirements.txt

# Ray with the Serve extra, for serving the model over HTTP
ray[serve]
# Pre-built CPU-only wheels for llama-cpp-python
--extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
llama-cpp-python
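
A minimal sketch of how these dependencies might be wired together: a Ray Serve deployment that loads a GGUF model through llama-cpp-python and answers completion requests over HTTP. The module name (serve_llama.py), model path, and generation parameters below are illustrative assumptions, not part of the upstream example.

# serve_llama.py -- illustrative sketch, not the upstream example code.
from ray import serve
from starlette.requests import Request
from llama_cpp import Llama


@serve.deployment
class LlamaCompletion:
    def __init__(self, model_path: str):
        # llama-cpp-python loads a GGUF model from disk; the CPU-only wheels
        # come from the extra index URL listed above.
        self.llm = Llama(model_path=model_path)

    async def __call__(self, request: Request) -> dict:
        body = await request.json()
        # Single completion call; max_tokens=128 is an arbitrary example value.
        result = self.llm(body["prompt"], max_tokens=128)
        return {"text": result["choices"][0]["text"]}


# "models/llama-2-7b.Q4_K_M.gguf" is a placeholder path; point it at any GGUF file.
app = LlamaCompletion.bind(model_path="models/llama-2-7b.Q4_K_M.gguf")

# Start it with the Serve CLI (keeps the process alive):
#   serve run serve_llama:app
# then query the default HTTP proxy:
#   curl http://localhost:8000/ -d '{"prompt": "Hello"}'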