What is it?
LM Studio is a desktop application for Mac, Windows, and Linux that lets you download and run open-source AI language models entirely on your own computer. It puts a friendly graphical interface on top of what would otherwise require command-line tools and technical setup. You browse models, click download, and chat — no coding required.
Who is it for?
- Privacy-focused users who want zero data leaving their machine
- Developers who want a local OpenAI-compatible API for testing and building
- Curious beginners who want to experiment with open-source models without the command line
- Anyone working offline — on a plane, in a restricted corporate environment, or in areas with limited connectivity
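For the developer use case above: LM Studio can run a local server that speaks the OpenAI chat-completions format, by default at http://localhost:1234/v1. A minimal sketch of calling it from Python with only the standard library — the `"local-model"` name is a placeholder, since LM Studio answers with whichever model you have loaded:

```python
import json
import urllib.request

# LM Studio's local server defaults to port 1234 and mimics the
# OpenAI chat-completions API, so no API key is needed.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completions payload.

    'local-model' is a placeholder; LM Studio uses the loaded model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def chat(prompt):
    """POST the payload to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Requires LM Studio's server to be running with a model loaded:
# reply = chat("Explain quantization in one sentence.")
```

Because the endpoint is OpenAI-compatible, existing OpenAI client libraries generally work too — you point them at the local base URL instead of the hosted API.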
The magic moment
Download a model like Mistral 7B or Llama 3. Switch to the Chat tab. Ask it something. Watch a capable AI respond entirely from your own hardware, with no internet connection, no API key, and no usage cost. Every token is generated locally. For many people, this feels like a genuinely different relationship with AI.
Step-by-step setup
- Go to lmstudio.ai and download the app for your operating system
- Install and open LM Studio
- Click the Search tab (magnifying glass icon on the left sidebar)
- Search for a model — good starter picks: Mistral 7B Instruct or Llama 3.2 3B Instruct
- Look for files labelled Q4_K_M — a mid-range quantization (compression) level that balances answer quality against file size
- Click Download — this will take a few minutes depending on your connection (files are 2–5 GB)
- Once downloaded, go to the Chat tab and select your model from the dropdown at the top
- Start chatting
Hardware note: Plan on at least 8 GB of RAM for small models (around 3B parameters), 16 GB for mid-size models (7B), and 32 GB or more for larger ones. Apple Silicon Macs handle this especially well thanks to their unified memory.
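Those RAM figures come from simple arithmetic: a model's weights take roughly (parameter count × bits per weight) / 8 bytes, plus working overhead for caches and buffers. A rough back-of-the-envelope sketch — the 1.2× overhead factor and the ~4.5 bits-per-weight figure for Q4_K_M are approximations, not measured values:

```python
def estimated_ram_gb(params_billions, quant_bits=4.5, overhead=1.2):
    """Rough RAM estimate for a quantized model.

    params_billions: parameter count in billions (e.g. 7 for Mistral 7B)
    quant_bits:      bits per weight after quantization (Q4_K_M is ~4.5)
    overhead:        assumed multiplier for runtime caches and buffers
    """
    # 1 billion parameters at 8 bits each is about 1 GB of weights.
    weight_gb = params_billions * quant_bits / 8
    return weight_gb * overhead

# A 7B model at ~4.5 bits per weight lands around 4-5 GB,
# which is why 16 GB of system RAM is comfortable for it.
print(round(estimated_ram_gb(7), 1))
```

The same arithmetic explains the download sizes mentioned above: 2–5 GB is simply what 3B–7B parameters weigh once compressed to roughly 4–5 bits each.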
Compare with similar tools
- Ollama — a command-line tool that technical users often prefer for scripting and automation, but it assumes comfort with the terminal; LM Studio is the better fit for beginners who want a GUI
- DeepSeek — not a competing app but a family of high-quality open models; DeepSeek models are a great choice to download and run inside LM Studio (or via Ollama)
