Join Nostr
2025-12-31 12:51:39 UTC

Ivan on Nostr: Good morning, Nostr. Who's running local LLMs? What are the best models that can run ...

Good morning, Nostr. Who's running local LLMs? What are the best models that can run at home for coding on a beefy PC? In 2026, I want to dig into local LLMs more and rely less on Claude and Gemini. I know I can use Maple for more private AI, but I prefer running my own model. I also like the fact that there are no restrictions on models run locally. I know hardware is the bottleneck here; hopefully, these things become more efficient.