Commit 2138561

fix(server): Propagate flash_attn to model load. (abetlen#1424)

Parent: 2117122
File tree

- llama_cpp/server

1 file changed: 1 addition (+1), 0 deletions (-0)
Diff: one line inserted at line 245 of the changed file; surrounding context (original lines 242-247) is unchanged. The body of the added line was not captured in this extract.
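Per the commit message, the server's flash_attn setting was not being forwarded when the model was loaded. The sketch below is a minimal illustration of what such a fix typically looks like in llama-cpp-python's server code: passing the setting through to the Llama constructor. The helper name load_model and the exact set of forwarded fields are assumptions for illustration, not the literal one-line diff.

```python
from llama_cpp import Llama


def load_model(settings) -> Llama:
    """Illustrative sketch (assumed wiring, not the literal diff):
    construct the Llama instance from the server's model settings."""
    return Llama(
        model_path=settings.model,
        n_gpu_layers=settings.n_gpu_layers,
        # The fix: forward flash_attn so flash attention is actually
        # enabled when the server loads the model.
        flash_attn=settings.flash_attn,
    )
```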