An offline AI chatbot is hosted and run entirely on the user's own computer, rather than sitting on a third-party server and being accessed over the internet.
Tech giants such as Meta, Google, and Microsoft, as well as Mistral AI, have developed AI models that can be run locally.
These include Llama, Gemma, Phi, Mistral 7B, and Codestral.
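To make the idea concrete, here is a minimal sketch of running one of these models directly with the llama-cpp-python bindings; the GGUF file name is a placeholder for whichever quantised model has been downloaded, and this library is just one of several ways to run a model locally.

```python
# pip install llama-cpp-python : one common way to run GGUF model files locally
from llama_cpp import Llama

# Placeholder path: point this at any downloaded, quantised model file,
# e.g. a Llama, Gemma, Phi, or Mistral 7B build in GGUF format.
llm = Llama(model_path="./mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain what running a model offline means."}],
    max_tokens=200,
)
print(reply["choices"][0]["message"]["content"])
```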
Factors to consider when choosing a local AI model include the tasks it will be asked to perform, whether it is a specialist or a generalist, and its size; larger models generally offer better responses but require more computing power.
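As a rough illustration of that size trade-off, the back-of-the-envelope sketch below estimates memory needs from parameter count and quantisation level; the figures are approximations, not published hardware requirements.

```python
def estimate_ram_gb(params_billion: float, bits_per_weight: int, overhead_gb: float = 1.5) -> float:
    """Rough rule of thumb: weights take params x bytes-per-weight, plus runtime overhead."""
    weight_gb = params_billion * bits_per_weight / 8  # 1e9 params is ~1 GB per byte-per-weight
    return weight_gb + overhead_gb

# Approximate sizes only; real usage also depends on context length and the runtime used.
for name, params in [("Phi-3 mini (3.8B)", 3.8), ("Mistral 7B", 7.0), ("Llama 70B", 70.0)]:
    print(f"{name}: ~{estimate_ram_gb(params, 4):.1f} GB RAM at 4-bit quantisation")
```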
To get started, users need to select an AI model and a platform to interact with it.
Recommended platforms include Jan.ai, which is open source, and LM Studio, which is less transparent but often supports newer models.
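Both platforms can also expose downloaded models through a local, OpenAI-compatible HTTP server, which makes them easy to script against. The sketch below assumes LM Studio's usual default port of 1234 (Jan typically uses 1337); the exact address appears in each platform's local-server settings.

```python
from openai import OpenAI

# Talk to the platform's local server; no real API key is needed because
# nothing leaves the machine. Adjust base_url to match your platform's settings.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

# List whichever models have been downloaded into the platform.
for model in client.models.list().data:
    print(model.id)
```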
Once the platform is installed, users can start a conversation with their AI chatbot, tweaking settings such as response creativity and context length to optimise the experience.
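"Creativity" is usually exposed as the temperature parameter, and reply length can be capped per request; the sketch below sends a request to the same assumed local server, with the model id standing in for whatever the platform reports.

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="local-model",  # placeholder: use the id your platform lists
    messages=[{"role": "user", "content": "Draft a short outline for a blog post on offline chatbots."}],
    temperature=0.7,   # lower = more predictable, higher = more varied ("creative")
    max_tokens=300,    # caps the length of this reply
)
print(response.choices[0].message.content)
```

The context window itself, meaning how much of the conversation the model can remember, is normally fixed when the model is loaded, for example in the platform's model settings or via the n_ctx option shown earlier.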
Offline AI chatbots can be used for a range of tasks, including writing, coding, and problem-solving.