Yes, devs work on Linux. And since the Steam Deck, people (about 2%) are starting to game on Linux. The next battlefields are the desktop and the living room. The latter could be solved by a Deck 2 (and a new dock) with eGPU support. The desktop is trickier: a lot of things need to happen there, and it's much more complex.
Local LLMs have been supported via the Ollama integration since Home Assistant 2024.4. Ollama and the major open-source LLM models are not tuned for tool calling, so this has to be built from scratch, and it wasn't done in time for this release. We're collaborating with NVIDIA to get this working; they showed a prototype last week.
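For context, a tool-calling request to a local Ollama server might look something like the sketch below, using Ollama's REST API (`POST /api/chat`). This is a hedged illustration, not Home Assistant's actual implementation: the model name, the `turn_on_light` function, and its schema are all hypothetical, and whether the `tools` field is honored depends on the Ollama version and on the model actually being tuned for tool use, which is exactly the gap described above.

```python
# Sketch: building a tool-calling request body for Ollama's /api/chat
# endpoint. Assumes Ollama is listening on its default port 11434.
import json

def build_chat_request(model, prompt, tools=None):
    """Build the JSON body for Ollama's /api/chat endpoint."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    if tools:
        # "tools" follows an OpenAI-style function schema; support
        # depends on both the Ollama version and the chosen model.
        body["tools"] = tools
    return json.dumps(body)

# Hypothetical Home Assistant-style action exposed as a tool.
light_tool = {
    "type": "function",
    "function": {
        "name": "turn_on_light",
        "description": "Turn on a light in a given area",
        "parameters": {
            "type": "object",
            "properties": {"area": {"type": "string"}},
            "required": ["area"],
        },
    },
}

payload = build_chat_request("llama3", "Turn on the kitchen light", [light_tool])
# To actually send it (requires a running Ollama server):
# urllib.request.urlopen("http://localhost:11434/api/chat", payload.encode())
```

The hard part isn't the request shape; it's that an untuned model will often answer in prose instead of emitting a structured tool call, which is why this needed dedicated work.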
Are all Ollama-supported models mediocre at this? Which ones would be better?
That statement works for every other freedom you lose; it also serves to detect malicious intent from another person. There's always a middle ground where nothing is perfect but things are balanced. There's always a compromise; there's no perfect scenario. If you want a perfect society, you have to take away all freedom. If you grant all freedoms, you get anarchy.
Where did you buy it? I want to buy it in the US to avoid the high markup it sells for at in my country.