Bump llama-cpp-python to use LlamaCache
parent ac189011cb
commit d2ea925fa5
2 changed files with 4 additions and 3 deletions
@@ -14,5 +14,5 @@ tqdm
 git+https://github.com/huggingface/peft
 transformers==4.28.0
 bitsandbytes==0.38.1; platform_system != "Windows"
-llama-cpp-python==0.1.33; platform_system != "Windows"
-https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.33/llama_cpp_python-0.1.33-cp310-cp310-win_amd64.whl; platform_system == "Windows"
+llama-cpp-python==0.1.34; platform_system != "Windows"
+https://github.com/abetlen/llama-cpp-python/releases/download/v0.1.34/llama_cpp_python-0.1.34-cp310-cp310-win_amd64.whl; platform_system == "Windows"