hass-openai-custom-conversation

Conversation support for Home Assistant using a local LLM, for example Vicuna or another model.

How to set up your own local LLM for Home Assistant:

  • Install local-ai (a Docker-based sketch follows this list)
  • Set up a model in local-ai
  • Install hass-openai-custom-conversation
  • Add the custom component to your Home Assistant installation
  • Set the first field (the API key) to any string, and set the second field to the address of your local-ai installation
  • Configure Home Assistant Assist to use the custom OpenAI conversation as its conversation agent, and set the options to contain instructions specific to your setup and the model name
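
The sketch below walks through those steps for a Docker-based local-ai install; the image tag, the custom_components layout inside this repository, and the UI wording are assumptions, so adjust them to match your environment.

```bash
# Sketch only: image tag, paths, and names below are assumptions.

# 1. Run local-ai, an OpenAI-compatible API server, on port 8080
docker run -d --name local-ai -p 8080:8080 localai/localai:latest-cpu

# 2. Set up a model (for example a Vicuna weights file) in local-ai's models
#    directory, then check that the OpenAI-compatible API lists it
curl http://localhost:8080/v1/models

# 3. Install the custom component by copying it into your Home Assistant
#    config directory (the folder layout inside this repo is assumed)
cp -r hass-openai-custom-conversation/custom_components/* /config/custom_components/

# 4. Restart Home Assistant, add the integration from the UI, set the API key
#    field to any string and the base URL to http://<local-ai-host>:8080, then
#    select it as the conversation agent in the Assist settings.
```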

Discussion on the Home Assistant community forum: https://community.home-assistant.io/t/integration-with-localai/575238/13
