Documentation is available via [https://llama-cpp-python.readthedocs.io/](https://llama-cpp-python.readthedocs.io/).
If you find any issues with the documentation, please open an issue or submit a PR.
## UIs
### SeKernel_for_LLM_UI
This is the repository for the UI for the SeKernel_for_LLM module.
### How to:
- Clone the repo
- Ensure that you have llama-cpp-python installed and running
- Add your model to the `kernel.py` script
- Launch the UI by running `python sekernel_ui.py`
- Please note: only internet-connected chat is supported. If you have the skills, you can check out the `plugins.py` module to add more functionality to your UI.
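
The "add your model to `kernel.py`" step above could look roughly like the sketch below, which wires a local GGUF model into a chat turn using llama-cpp-python. Note that `MODEL_PATH`, `build_messages`, `load_model`, and `chat` are illustrative names assumed here, not the actual SeKernel_for_LLM API; only the `Llama` constructor and `create_chat_completion` come from llama-cpp-python itself.

```python
# Hypothetical sketch of the kind of wiring kernel.py needs; names below
# (MODEL_PATH, build_messages, load_model, chat) are illustrative only.

MODEL_PATH = "./models/your-model.gguf"  # point this at your own GGUF file


def build_messages(user_message: str) -> list[dict]:
    """Wrap a single user turn in the chat-completion message format."""
    return [{"role": "user", "content": user_message}]


def load_model(model_path: str = MODEL_PATH):
    """Load a local GGUF model with llama-cpp-python (must be installed)."""
    from llama_cpp import Llama  # deferred so the sketch imports without it

    return Llama(model_path=model_path, n_ctx=4096)


def chat(llm, user_message: str) -> str:
    """Send one chat turn and return the assistant's reply text."""
    result = llm.create_chat_completion(messages=build_messages(user_message))
    return result["choices"][0]["message"]["content"]
```

With a model configured this way, the UI script would only need to call `load_model()` once at startup and pass each user input through `chat()`.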