
Has anyone gotten the latest version of the web project (0.21.0) to work yet? I downloaded the master repository in Visual Studio 2022, and various files were downloaded when I rebuilt it, including some models in the LLama.Unittest\Models folder.

The web project starts for me, but when I click on Begin Session, nothing happens; only the loading spinner is visible. At first glance, there are no errors in the console output.

I tried a slightly older version a few months ago and it worked straight away. Is it possible that some of the settings in appsettings.json are no longer up to date? For example, the preconfigured "llama-2-7b-chat.Q4_0.gguf" was not downloaded at all. These are the only models in my folder after the project build:

[screenshots of the downloaded model files]

Edit: I'm using a Windows 10 machine with an NVIDIA GeForce GTX 1070 and a 4 GHz i7-6700K CPU.

...

Edit:

Oh, there seems to be a problem in a SignalR function.
[screenshot of the error]

This error happens in wwwroot\js\sessionConnectionChat.js when the following line is called:
[screenshot of the offending line]

I suppose that normally the following Task in Hubs\SessionConnectionHub.cs should be called, but that is not happening for some reason. I can't tell why this SignalR call is not working.
[screenshot of the hub method in SessionConnectionHub.cs]

Are the parameters in connection.invoke('LoadModel', sessionParams, sessionParams) correct? It looks strange because the same sessionParams object is passed twice.
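
For context, here is a minimal sketch of how that call is presumably wired up on the client, assuming the standard @microsoft/signalr browser client; only the invoke line itself is quoted from the code above, while the hub URL and the surrounding setup are assumptions.

```js
// Sketch only, not the project's actual file: set up a SignalR connection
// the way the standard @microsoft/signalr browser client does it.
const connection = new signalR.HubConnectionBuilder()
    .withUrl("/SessionConnectionHub")   // assumed hub route
    .build();

await connection.start();

// The suspicious call: the same sessionParams object is sent for both hub
// parameters, even though the hub method apparently expects a SessionConfig
// and a separate InferenceOptions argument. That would match the observation
// that the Task in SessionConnectionHub.cs is never reached.
await connection.invoke('LoadModel', sessionParams, sessionParams);
```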

Edit:

I think someone forgot to set the second parameter, "inferenceConfig", correctly. That would also explain the TODO comment. If you compare the SessionConfig and InferenceOptions data types, some of the properties are similar. The developer probably wanted to fill an InferenceOptions object at this point but forgot to do so.
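
If that reading is correct, the client-side fix would be to build a separate inference-options payload instead of reusing sessionParams. A rough, untested sketch; the property names on inferenceConfig are placeholders and would have to be checked against the actual InferenceOptions type in the C# project:

```js
// Hypothetical fix sketch: map only the inference-related fields of
// sessionParams into a dedicated object for the hub's second parameter.
// These property names are placeholders, not the real InferenceOptions members.
const inferenceConfig = {
    maxTokens:   sessionParams.maxTokens,    // placeholder mapping
    temperature: sessionParams.temperature,  // placeholder mapping
    antiPrompts: sessionParams.antiPrompts   // placeholder mapping
};

await connection.invoke('LoadModel', sessionParams, inferenceConfig);
```

The same mapping could just as well be done on the C# side, by constructing the InferenceOptions from the received values inside the hub method.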

Does anyone know if this is still being developed? Unfortunately, the current state of the implementation prevents you from using the web application.


Replies: 2 comments · 1 reply


Unfortunately I don't think anyone is working on the Web projects at the moment. I've been making minimal updates so they at least compile after breaking changes are made, but that's all. Would you be interested in making some PRs to fix it?

1 reply
@hswlab

Unfortunately, I don't have enough experience with LLamaSharp and the web project at the moment to fix the problem properly. :)

I have opened issue #1080 about this problem; maybe someone else has more know-how to fix it properly ^^

For myself, I found a workaround so that at least a single answer comes from the chatbot. But there is really only that one answer; after that the chatbot returns nothing. I will post my code adjustments in issue #1080, maybe that will help someone :)


I have added pull request #1158, which should fix this.

0 replies
Category: Q&A · Labels: none yet · 3 participants