Mention ollama

This commit is contained in:
Paulus Schoutsen 2024-06-06 01:43:44 +00:00
parent 0efe887ca5
commit 17f868a8a4


@@ -172,6 +172,8 @@ Google is 14x cheaper than OpenAI, but OpenAI is better at answering non-smart
home questions. We are preparing a blog post with a deep dive into the research
that went into this feature, coming soon™!

Local LLMs have been supported via [the Ollama integration](/integrations/ollama/) since Home Assistant 2024.4. Ollama and the major open source models are not tuned for tool calling, so support for it has to be built from scratch, and that work was not finished in time for this release. We're collaborating with NVIDIA to get this working; [they showed a prototype last week](https://youtu.be/aq7QS9AtwE8?si=yZilHo4uDUCAQiqN).
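
To make the "from scratch" part concrete, here is a minimal, hypothetical sketch of one way tool calling can be layered on top of a local Ollama server: the model is prompted to answer with JSON naming a tool and its arguments, which the caller then parses. The `/api/chat` endpoint and default port are standard Ollama; the `llama3` model name, the prompt wording, and the `light_turn_on` tool are assumptions for illustration, not the Home Assistant implementation.

```python
# Hypothetical sketch: prompt-based tool calling against a local Ollama server.
# Not the Home Assistant implementation; model name, prompt, and tool are assumed.
import json
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # default Ollama port

SYSTEM_PROMPT = (
    "You control a smart home. Reply ONLY with JSON of the form "
    '{"tool": "<tool name>", "args": {...}}. Available tool: '
    '"light_turn_on" with args {"entity_id": "<light entity>"}.'
)


def ask_for_tool_call(user_text: str) -> dict:
    """Send one chat turn and parse the model's JSON reply as a tool call."""
    response = requests.post(
        OLLAMA_URL,
        json={
            "model": "llama3",  # assumed local model
            "stream": False,
            "messages": [
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": user_text},
            ],
        },
        timeout=60,
    )
    response.raise_for_status()
    content = response.json()["message"]["content"]
    # Models not tuned for tool calling may not return clean JSON; this is the
    # fragile part that real tool-calling support has to harden.
    return json.loads(content)


if __name__ == "__main__":
    print(ask_for_tool_call("Turn on the kitchen light"))
```
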
Thanks to everyone who contributed to this feature! [@shulyaka], [@tronikos],
[@allenporter], [@synesthesiam], [@jlpouffier], and [@balloob]!