Generic Function Calling #957

Merged
merged 20 commits into main on Feb 12, 2024

Conversation

abetlen (Owner) commented Nov 30, 2023

Reference

Tested this method in a notebook and it works quite well.

TODO:

  • Adjust the prompt to support multi-turn conversations
  • Use grammar-constrained sampling
  • Add custom ChatCompletionHandler
  • Support legacy function calling API
  • Support multi-tool auto calls (hard)
  • Support for streaming multi-tool auto calls
  • Test on Function Calling notebook and instructor cookbook
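As a rough illustration of the grammar-constrained-sampling item above, here is one way to build a GBNF grammar that forces the model to emit a JSON tool call for a single, specific tool. The helper name and the `get_weather` example are illustrative assumptions, not the grammar this PR ships:

```python
# Sketch: generate GBNF text that only accepts a JSON tool call of the form
# {"name": "<tool_name>", "arguments": {...}} with fixed string parameters.
# build_tool_call_grammar and get_weather are illustrative names.

def build_tool_call_grammar(tool_name: str, params: list) -> str:
    def key(k):
        # GBNF terminal for a quoted JSON key, e.g. "\"location\":"
        return '"\\"' + k + '\\":"'
    args = ' "," ws '.join(key(p) + " ws string" for p in params)
    rules = [
        'root ::= "{" ws ' + key("name") + ' ws "\\"' + tool_name + '\\""'
        + ' "," ws ' + key("arguments") + ' ws "{" ws ' + args + ' ws "}" ws "}"',
        'string ::= "\\"" [^"]* "\\""',
        'ws ::= [ \\t\\n]*',
    ]
    return "\n".join(rules)

print(build_tool_call_grammar("get_weather", ["location", "unit"]))
```

With llama-cpp-python, the resulting text can be loaded with `LlamaGrammar.from_string(...)` and passed via the `grammar=` keyword at generation time, so sampling can only emit strings the grammar accepts.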

Maximilian-Winter (Contributor) commented Dec 29, 2023

I have implemented a small framework for structured output, chat, and function calling. It can generate function-calling grammars and structured-output grammars:
https://github.com/Maximilian-Winter/llama-cpp-agent
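The general idea behind such a grammar generator is to derive a tool schema (and from it a grammar) directly from a plain Python function. A minimal sketch of the schema-extraction half, using only `inspect` — this is illustrative of the approach, not llama-cpp-agent's actual implementation:

```python
import inspect

# Sketch: derive an OpenAI-style tool schema from a Python function's
# signature and docstring. Illustrative only.

_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def function_to_tool_schema(fn):
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": _JSON_TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default value -> required parameter
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": props,
                "required": required,
            },
        },
    }

def get_weather(location: str, unit: str = "celsius"):
    """Get the current weather for a location."""

print(function_to_tool_schema(get_weather))
```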

Maximilian-Winter (Contributor) commented
@abetlen Could you tell me if this is useful for you? I can also integrate just the grammar generator into llama-cpp-python and make a pull request.

teleprint-me (Contributor) commented
I am very interested in this. Will be reviewing when I have some time.

vriesdemichael commented
Be careful how you promote this chat handler. With this prompt it is only suitable for function calling, which might be confusing if you expect the handler to provide an OpenAI-compatible experience that makes tool calls only when needed.

I am also not entirely sure whether OpenHermes Mistral was trained on these specific instructions, or whether it would work with other models as well.

If it is just good prompt engineering to coerce a tool call, then LangChain already has some tools to do so using ReAct:
https://github.com/williamcotton/empirical-philosophy/blob/main/articles/how-react-prompting-works.md
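For context, ReAct-style prompting coerces tool use purely through text: the model interleaves Thought/Action/Action Input lines and the harness parses them back out. A minimal parser sketch (the line format follows the linked article; the function name is illustrative):

```python
import re

# Sketch: parse one ReAct-style completion into a tool invocation, or None
# if the step contains no Action (e.g. it is a final answer). Illustrative.

def parse_react_step(text: str):
    action = re.search(r"^Action:\s*(.+)$", text, re.MULTILINE)
    action_input = re.search(r"^Action Input:\s*(.+)$", text, re.MULTILINE)
    if action and action_input:
        return {"tool": action.group(1).strip(),
                "input": action_input.group(1).strip()}
    return None

step = parse_react_step(
    "Thought: I need the weather.\n"
    "Action: get_weather\n"
    "Action Input: Paris\n"
)
```

The trade-off versus grammar-constrained sampling is that nothing prevents the model from emitting malformed Action lines; the parser must tolerate failure.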

@abetlen abetlen changed the title Add OpenHermes Function Calling Generic Function Calling Jan 31, 2024
abetlen (Owner, Author) commented Jan 31, 2024

@vriesdemichael I get the concern, but I've also seen very promising results for function calling with instruction-tuned models. Additionally, the goal here is to have a skeleton for function calling that implements all of the nuances of the OpenAI API (multi-tool calling, the legacy functions API, auto tool choice, streaming) so that we can adapt it to properly fine-tuned models.
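One of those nuances is streamed multi-tool calls: in the OpenAI streaming format, each tool call arrives as a sequence of deltas carrying an `index`, an optional `id`/`name`, and `arguments` fragments that must be concatenated in order. A sketch of the accumulation (the delta dicts mirror the `choices[0].delta.tool_calls[i]` shape; the helper itself is my own illustration, not this PR's code):

```python
# Sketch: fold OpenAI-style streamed tool_calls deltas into complete calls.

def accumulate_tool_calls(deltas):
    calls = {}  # index -> partially assembled call
    for d in deltas:
        call = calls.setdefault(d["index"], {"id": "", "name": "", "arguments": ""})
        if d.get("id"):
            call["id"] = d["id"]
        fn = d.get("function", {})
        if fn.get("name"):
            call["name"] = fn["name"]
        call["arguments"] += fn.get("arguments", "")  # fragments concatenate
    return [calls[i] for i in sorted(calls)]

deltas = [
    {"index": 0, "id": "call_1", "function": {"name": "get_weather", "arguments": ""}},
    {"index": 0, "function": {"arguments": '{"location":'}},
    {"index": 0, "function": {"arguments": ' "Paris"}'}},
    {"index": 1, "id": "call_2", "function": {"name": "get_time", "arguments": "{}"}},
]
calls = accumulate_tool_calls(deltas)
```

Only after the stream ends (or the call's final chunk arrives) is each `arguments` string guaranteed to be complete, parseable JSON.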

abetlen (Owner, Author) commented Feb 2, 2024

Just missing automatic multi-tool use now.

@abetlen abetlen marked this pull request as ready for review February 12, 2024 20:52
@abetlen abetlen merged commit 153a004 into main Feb 12, 2024
@abetlen abetlen deleted the openhermes-function-calling branch February 26, 2024 19:42