.github/ISSUE_TEMPLATE/bug_report.md
13 additions, 13 deletions
@@ -12,17 +12,17 @@ assignees: ''
 Please answer the following questions for yourself before submitting an issue.

 - [ ] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
-- [ ] I carefully followed the [README.md](https://github.com/abetlen/llama-cpp-python/blob/main/README.md).
+- [ ] I carefully followed the [README.md](https://github.com/sirajperson/falcon-cpp-python/blob/main/README.md).
 - [ ] I [searched using keywords relevant to my issue](https://docs.github.com/en/issues/tracking-your-work-with-issues/filtering-and-searching-issues-and-pull-requests) to make sure that I am creating a new issue that is not already open (or closed).
-- [ ] I reviewed the [Discussions](https://github.com/abetlen/llama-cpp-python/discussions), and have a new bug or useful enhancement to share.
+- [ ] I reviewed the [Discussions](https://github.com/sirajperson/falcon-cpp-python/discussions), and have a new bug or useful enhancement to share.

 # Expected Behavior

-Please provide a detailed written description of what you were trying to do, and what you expected `llama-cpp-python` to do.
+Please provide a detailed written description of what you were trying to do, and what you expected `falcon-cpp-python` to do.

 # Current Behavior

-Please provide a detailed written description of what `llama-cpp-python` did, instead.
+Please provide a detailed written description of what `falcon-cpp-python` did, instead.

 # Environment and Context

@@ -61,13 +61,13 @@ Please provide detailed steps for reproducing the issue. We are not sitting in f
-7. Run llama.cpp's `./main` with the same arguments you previously passed to llama-cpp-python and see if you can reproduce the issue. If you can, [log an issue with llama.cpp](https://github.com/ggerganov/llama.cpp/issues)
+5. `cd ./vendor/ggllm.cpp`
+6. Follow [ggllm.cpp's instructions](https://github.com/cmp-nct/ggllm.cpp) section on how to compile with `cmake`
+7. Run ggllm.cpp's `./falcon_main` with the same arguments you previously passed to falcon-cpp-python and see if you can reproduce the issue. If you can, [log an issue with ggllm.cpp](https://github.com/cmp-nct/ggllm.cpp/issues)

 # Failure Logs

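The new steps 5–7 in the hunk above amount to building the vendored ggllm.cpp and re-running the failing workload against it directly. A minimal shell sketch, assuming a cmake build inside `./vendor/ggllm.cpp`; the model path and prompt are placeholders, and the exact build options and `falcon_main` flags should be taken from the ggllm.cpp README and from whatever you originally passed to falcon-cpp-python:

```
# from the falcon-cpp-python repository root
cd ./vendor/ggllm.cpp

# configure and compile; add GPU options per the ggllm.cpp README if you used them
cmake -B build
cmake --build build --config Release

# re-run the same workload directly against ggllm.cpp
# (binary location can vary by build method; the template refers to it as ./falcon_main)
./build/bin/falcon_main -m /path/to/falcon-model.bin -p "the prompt that reproduced the issue"
```

If the crash or misbehavior also appears here, it belongs in the ggllm.cpp issue tracker rather than in falcon-cpp-python.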
@@ -77,10 +77,10 @@ Also, please try to **avoid using screenshots** if at all possible. Instead, cop

 Example environment info:
 ```
-llama-cpp-python$ git log | head -1
+falcon-cpp-python$ git log | head -1
 commit 47b0aa6e957b93dbe2c29d53af16fbae2dd628f2

-llama-cpp-python$ python3 --version
+falcon-cpp-python$ python3 --version
 Python 3.10.10

 llama-cpp-python$ pip list | egrep "uvicorn|fastapi|sse-starlette|numpy"
@@ -89,8 +89,8 @@ numpy 1.24.3
 sse-starlette 1.3.3
 uvicorn 0.21.1

-llama-cpp-python/vendor/llama.cpp$ git log | head -3
+falcon-cpp-python/vendor/llama.cpp$ git log | head -3
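Taken together, the environment section of the updated template comes down to a few commands run from the repository root. A consolidated sketch for reference; the vendored path `./vendor/ggllm.cpp` is an assumption carried over from step 5 above, whereas the template's last example still reads `vendor/llama.cpp`:

```
falcon-cpp-python$ git log | head -1
falcon-cpp-python$ python3 --version
falcon-cpp-python$ pip list | egrep "uvicorn|fastapi|sse-starlette|numpy"
falcon-cpp-python$ git -C ./vendor/ggllm.cpp log | head -3
```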