Commit 929c82f

FAQ: Update question 12 and 13 (#249)
see wiki for details
1 parent 302517f commit 929c82f

2 files changed, +4 −0 lines changed
README.md: 2 additions & 0 deletions
@@ -304,6 +304,8 @@
 问题9:如何解读第三方公开榜单的结果?
 问题10:会出34B或者70B级别的模型吗?
 问题11:为什么长上下文版模型是16K,不是32K或者100K?
+问题12:为什么Alpaca模型会回复说自己是ChatGPT?
+问题13:为什么pt_lora_model或者sft_lora_model下的adapter_model.bin只有几百k?
 ```
README_EN.md: 2 additions & 0 deletions
@@ -287,6 +287,8 @@ Question 8: Can the 16K long-context version model replace the standard version
 Question 9: How to interpret the results of third-party benchmarks?
 Question 10: Will you release 34B or 70B models?
 Question 11: Why is the long-context model 16K context, not 32K or 100K?
+Question 12: Why does the Alpaca model reply that it is ChatGPT?
+Question 13: Why is the adapter_model.bin in the pt_lora_model or sft_lora_model folder only a few hundred KB?
 ```
For specific questions and answers, please refer to the project >>> [📚 GitHub Wiki](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/wiki/faq_en)

0 commit comments
