Running Large Language Models (LLMs) like Llama-3 or Phi-3 typically requires cloud resources and a complicated setup. LM Studio changes this by providing a desktop app that lets you run these models directly on your local computer.
It is compatible with Windows, macOS, and Linux, and its friendly GUI makes it easier to run LLMs, even for people who aren’t familiar with technical setups. It’s also a great option for privacy because all queries, chats, and data inputs are processed locally without any data being sent to the cloud.
Let’s see how it works.
System Requirements
To run LLMs smoothly on your device, make sure your setup meets these requirements:
PC (Windows/Linux): A processor that supports AVX2 (standard on most recent CPUs). A dedicated NVIDIA or AMD GPU is recommended for faster responses but isn’t strictly required; you can verify your hardware with the quick check after this list.
macOS: Requires Apple Silicon (M1/M2/M3). Intel-based Macs are not supported.
Memory: 16 GB of RAM or more is ideal, though 8 GB may work with smaller models and shorter context sizes.
Internet: A stable connection is recommended for downloading models.
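If you’re on Linux, a quick way to confirm AVX2 support and total RAM is the short standard-library Python sketch below; on macOS or Windows, check your processor model in the system settings instead. This is just a convenience check, not part of LM Studio itself.

```python
# Quick hardware check (Linux only, standard library):
# does the CPU report the AVX2 flag, and how much RAM is installed?
import os

with open("/proc/cpuinfo") as f:
    has_avx2 = "avx2" in f.read()

total_ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / (1024 ** 3)

print(f"AVX2 supported: {has_avx2}")
print(f"Total RAM: {total_ram_gb:.1f} GB")
```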
Installation
To get started, download LM Studio for your platform.
After downloading, follow the installation steps and launch the app. You’ll see a familiar chat interface with a text box, similar to most AI chat applications.
Before you can start using it, you need to download and load a model.
What is a Model?
A model in this context is a large pre-trained neural network that can perform a variety of natural language processing tasks. It is trained on a huge corpus of text and learns to predict the next word in a sequence, which is what lets it generate coherent, relevant text based on your input.
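To make that concrete, here’s a toy sketch of next-word prediction using simple bigram counts. Real LLMs use large neural networks over tokens rather than word counts, but the core idea of scoring likely continuations of the preceding context is the same.

```python
# Toy illustration of next-word prediction using bigram counts.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each preceding word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return bigrams[word].most_common(1)[0][0] if word in bigrams else None

print(predict_next("the"))  # prints "cat", the most common word after "the" in this corpus
```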
There are many different models available, each with specific strengths. Some models are better at generating creative text, while others excel at factual information or shorter responses.
For example, general-purpose models like Llama-3 and Phi-3 generate creative and engaging text, while Yi Coder is trained on code and is better at generating code snippets.
Load a Model
LM Studio supports a wide range of open-weight models, including Llama-3, Phi-3, Gemma, Mistral, and more. You can easily download models from the “Discover” section in the sidebar, where you will see a list of available models, their parameter sizes, and their specializations.
Select a model based on your needs. For example, if you want to generate creative text, download a model like Llama-3. If you need code snippets, try Yi Coder. Larger models require more resources, so choose a smaller model if your computer has limited power.
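A common rule of thumb for sizing: a model’s memory footprint is roughly its parameter count times the bytes stored per weight, plus some overhead for the context. The quick calculation below sketches this for an 8B-parameter model; actual usage varies with the quantization format and context length.

```python
# Rough rule-of-thumb estimate of a model's memory footprint (excludes context overhead).
def approx_model_size_gb(params_billions, bits_per_weight):
    return params_billions * 1e9 * (bits_per_weight / 8) / (1024 ** 3)

print(f"{approx_model_size_gb(8, 4):.1f} GB")   # ~3.7 GB for an 8B model at 4-bit quantization
print(f"{approx_model_size_gb(8, 16):.1f} GB")  # ~14.9 GB for the same model at 16-bit
```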
In this example, I’ll download Llama-3 with 8B parameters. Once you click the download button, the model will begin downloading.
After downloading, load the model by clicking on the “Load Model” button in the “Chat” section and selecting the model you downloaded.
Once the model is loaded, start using it to generate text. Simply type your input in the text box and press Enter. The model can answer general-knowledge questions and is useful for creative writing, brainstorming, or generating ideas.
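If you’d rather script against the model than use the chat window, LM Studio can also serve whatever model you’ve loaded through a local, OpenAI-compatible HTTP server (started from the app’s Developer tab, listening on port 1234 by default). Here’s a minimal sketch using the official openai Python client; the model identifier below is only an example, so substitute whichever model you actually loaded, and the API key can be any placeholder string since everything stays on your machine.

```python
# Minimal sketch: querying a model loaded in LM Studio through its local
# OpenAI-compatible server (default: http://localhost:1234/v1).
# Assumes the local server is running and the `openai` package is installed;
# the model name is an example, use whichever model you loaded.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="meta-llama-3-8b-instruct",
    messages=[{"role": "user", "content": "Give me three ideas for a short story."}],
    temperature=0.7,
)

print(response.choices[0].message.content)
```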
Chat with Documents
Since version 0.3, LM Studio offers a Chat with Documents feature that lets you attach a document to the conversation. This is useful for generating text based on a specific document or for giving the model extra context.
For example, I’ll attach the Project Gutenberg text of Romeo and Juliet and ask a couple of questions:
Who are the main characters in the story?
What is the main conflict in the story?
LM Studio will gather information from the document and provide answers to your questions.
Currently, this feature is experimental, meaning it may not always work perfectly. Providing as much context in your query as possible—specific terms, ideas, and expected content—will increase the chances of accurate responses. Experimentation will help you find what works best.
Overall, I’m happy with the results so far. It can answer questions accurately.
Wrapping Up
LM Studio is a valuable tool for running LLMs locally on your computer, and we’ve explored features like using it as a chat assistant and chatting with documents. These features can boost productivity and creativity. If you’re a developer, LM Studio can also run models specifically tuned for generating code.