Update bug_report.md

- Added section on how to repro using llama.cpp in ./vendor/llama.cpp
- Added a few more example environment commands to aid in debugging.
This commit is contained in:
parent 828f9ec015
commit 9dd8cf3472
1 changed file with 21 additions and 5 deletions
26 .github/ISSUE_TEMPLATE/bug_report.md (vendored)
@@ -57,7 +57,17 @@ Please provide detailed steps for reproducing the issue. We are not sitting in f
3. step 3
4. etc.

-**Note: Many issues seem to be regarding performance issues / differences with `llama.cpp`. In these cases we need to confirm that you're comparing against the version of `llama.cpp` that was built with your python package, and which parameters you're passing to the context.**
+**Note: Many issues seem to be regarding functional or performance issues / differences with `llama.cpp`. In these cases we need to confirm that you're comparing against the version of `llama.cpp` that was built with your python package, and which parameters you're passing to the context.**
+
+Try the following:
+
+1. `git clone https://github.com/abetlen/llama-cpp-python`
+2. `cd llama-cpp-python`
+3. `rm -rf _skbuild/` # delete any old builds
+4. `python setup.py develop`
+5. `cd ./vendor/llama.cpp`
+6. Follow [llama.cpp's instructions](https://github.com/ggerganov/llama.cpp#build) to `cmake` llama.cpp
+7. Run llama.cpp's `./main` with the same arguments you previously passed to llama-cpp-python and see if you can reproduce the issue. If you can, [log an issue with llama.cpp](https://github.com/ggerganov/llama.cpp/issues)
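Taken together, steps 1–7 above can be collected into a single script. The sketch below is a dry run: a `run` helper only echoes each command so you can review the sequence before executing it for real. The `cmake` invocations are one plausible reading of llama.cpp's build docs at the time, not commands from this template, and the final `./main` arguments are deliberately left as a comment since they depend on what you passed to llama-cpp-python.

```shell
#!/usr/bin/env bash
# Dry-run sketch of the reproduction steps above. Each command is only
# echoed, not executed; remove the "echo" in run() to perform the steps.
set -euo pipefail

run() {
  echo "+ $*"   # print the command instead of running it
}

run git clone https://github.com/abetlen/llama-cpp-python
run cd llama-cpp-python
run rm -rf _skbuild/                 # delete any old builds
run python setup.py develop
run cd ./vendor/llama.cpp
run cmake .                          # see llama.cpp's build docs for options
run cmake --build . --config Release
# Finally, run ./main with the same arguments you previously passed
# to llama-cpp-python and compare the results.
```

Keeping the helper as an echo makes it easy to diff the exact commands you ran against what you report in the issue.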
+

# Failure Logs

@@ -73,8 +83,14 @@ commit 47b0aa6e957b93dbe2c29d53af16fbae2dd628f2
llama-cpp-python$ python3 --version
Python 3.10.10

-llama-cpp-python$ pip list | egrep "uvicorn|fastapi|sse-starlette"
-fastapi 0.95.0
-sse-starlette 1.3.3
-uvicorn 0.21.1
+llama-cpp-python$ pip list | egrep "uvicorn|fastapi|sse-starlette|numpy"
+fastapi 0.95.0
+numpy 1.24.3
+sse-starlette 1.3.3
+uvicorn 0.21.1

llama-cpp-python/vendor/llama.cpp$ git log | head -3
commit 66874d4fbcc7866377246efbcee938e8cc9c7d76
Author: Kerfuffle <44031344+KerfuffleV2@users.noreply.github.com>
Date: Thu May 25 20:18:01 2023 -0600
```