baalajimaestro / llama.cpp
llama.cpp / llama_cpp

Latest commit 7e55244540 by Andrei Betlen: "Fix top_k value. Closes #220" (2023-05-17 01:41:42 -04:00)
Name             Last commit                                                          Date
server/          Add model_alias option to override model_path in completions.        2023-05-16 17:22:00 -04:00
                 Closes #39
__init__.py      Black formatting                                                     2023-03-24 14:59:29 -04:00
llama.py         Fix top_k value. Closes #220                                         2023-05-17 01:41:42 -04:00
llama_cpp.py     Add winmode arg only on windows if python version supports it        2023-05-15 09:15:01 -04:00
llama_types.py   Revert "llama_cpp server: delete some ignored / unused parameters"   2023-05-07 02:02:34 -04:00