Integrate functionary v1.4 and v2 models + add HF AutoTokenizer as optional parameter in llama.create_completion #1078

Merged
abetlen merged 33 commits into abetlen:main from jeffrey-fong:integrate-functionary
Feb 8, 2024
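The PR title describes wiring a Hugging Face tokenizer into llama-cpp-python as an optional parameter and adding the Functionary v1.4/v2 chat formats. Below is a minimal sketch of how that option is typically used, based on llama-cpp-python's documented Functionary usage rather than the diff in this PR; the `LlamaHFTokenizer` class, the `tokenizer=` and `chat_format=` arguments, the meetkai repo/file names, and the tool definition are assumptions for illustration.

```python
# Sketch only: pairing a Functionary GGUF model with a Hugging Face tokenizer
# in llama-cpp-python. Model paths and the example tool are hypothetical.
from llama_cpp import Llama
from llama_cpp.llama_tokenizer import LlamaHFTokenizer  # assumed wrapper around HF AutoTokenizer

llm = Llama(
    model_path="functionary-small-v2.2.q4_0.gguf",       # assumed local GGUF file
    chat_format="functionary-v2",                         # Functionary v2 chat handler
    tokenizer=LlamaHFTokenizer.from_pretrained(           # optional HF tokenizer
        "meetkai/functionary-small-v2.2-GGUF"             # assumed tokenizer repo
    ),
    n_ctx=4096,
)

# Chat completion requests then tokenize prompts with the HF tokenizer.
output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What is the weather in Hanoi?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",                 # hypothetical tool
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }],
)
print(output["choices"][0]["message"])
```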

Commits

Commits on Jan 30, 2024

Commits on Feb 1, 2024

Commits on Feb 2, 2024

Commits on Feb 8, 2024
