Anyone Can Enjoy the Benefits of a Local LLM With These 5 Apps
Summary
Several apps allow users to run local large language models (LLMs) on their devices, giving them offline access and stronger data privacy
Among the best are Ollama, Msty, AnythingLLM, Jan, and LM Studio
Ollama stands out for its simplicity and accessibility, supporting a wide range of models and running on macOS, Windows, and Linux
To launch a model, users enter the command “ollama run” followed by the name of a supported LLM
Msty is easy to use and lets users avoid the command line, offering a library of prompts and workspaces to keep chats and tasks organised
AnythingLLM allows users to download models from third-party providers, including Ollama, LM Studio, and LocalAI, giving them access to thousands of LLMs available online
Jan and LM Studio provide easy, accessible ways to load popular models such as Llama, Mistral, Gemma, DeepSeek, Phi, and Qwen from Hugging Face
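The Ollama workflow described above can be sketched as a short terminal session. This is a minimal illustration, assuming Ollama is installed and the “llama3.2” model tag is available in Ollama's public library:

```shell
# Download (if needed) and start an interactive chat with a model
ollama run llama3.2

# List the models already downloaded to this machine
ollama list

# Remove a model to free up disk space
ollama rm llama3.2
```

The first run downloads the model weights, so it can take a while; subsequent runs start the chat immediately from the local copy.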