How to run AI models locally as a beginner?
Most people think running AI models locally is complicated. It’s not. Anyone can run powerful AI models like DeepSeek, Llama, and Mistral on their own computer. This guide will show you how, even if you’ve never written a line of code.
Quick steps:
1. Download Jan
Download Jan from jan.ai - it’s free and open source.
2. Choose a model that fits your hardware
Jan helps you pick the right AI model for your computer.
3. Start using AI locally
That’s all it takes to run your first AI model locally!
Jan’s easy-to-use chat interface after installation.
Keep reading to learn the key terms of local AI and what you should know before running AI models locally.
How Local AI Works
Before diving into the details, let’s understand how AI runs on your computer. Jan uses llama.cpp, an open-source engine that loads a model file from your disk into memory and generates responses entirely on your own hardware - no internet connection or cloud service required.
llama.cpp helps millions of people run AI locally on their computers.
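If you’re curious what that looks like in practice, here’s a minimal sketch using the llama-cpp-python bindings for llama.cpp. You never need to write this yourself - Jan does the equivalent behind the scenes - and the model file name below is just a placeholder:

```python
# Minimal sketch of what a local AI app does behind the scenes,
# using the llama-cpp-python bindings for llama.cpp.
from llama_cpp import Llama

# The file name is a placeholder - point this at any GGUF model you have.
llm = Llama(
    model_path="mistral-7b-instruct.Q4_K_M.gguf",
    n_ctx=2048,  # context window: how much text the model can "see" at once
)

# Ask a question; generation happens entirely on your own machine.
output = llm("Q: What does it mean to run an AI model locally? A:", max_tokens=64)
print(output["choices"][0]["text"])
```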
Understanding AI Models
Think of AI models like apps on your computer - some are light and quick to use, while others are bigger but can do more things. When you’re choosing an AI model to run on your computer, you’ll see names like “Llama-3-8B” or “Mistral-7B”. Let’s break down what this means in simple terms: the first part is the model’s name, and the number before the “B” is its size in billions of parameters. Bigger models are generally more capable, but they need more memory and run more slowly.
Jan Hub makes it easy to understand different model sizes and versions
Good news: Jan helps you pick the right model size for your computer automatically! You don’t need to worry about the technical details - just choose a model that matches what Jan recommends for your computer.
What You Can Do with Local AI
Chatting, writing, coding help, and analysis - the same kinds of tasks you’d use a cloud chatbot for, except everything stays on your machine and keeps working without an internet connection once a model is downloaded.
Hardware Requirements
Before downloading an AI model, consider checking if your computer can run it. Here’s a basic guide:
The basics your computer needs (there’s an optional script after this list that can check these for you):
- A decent processor (CPU) - most computers from the last 5 years will work fine
- At least 8GB of RAM - 16GB or more is better
- Some free storage space - at least 5GB recommended
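If you’d rather not dig through system menus, the short script below reports these numbers. It’s entirely optional and uses the third-party psutil package, which you would need to install first:

```python
# Optional: report RAM, CPU cores, and free disk space.
# Requires the third-party psutil package: pip install psutil
import psutil
import shutil

ram_gb = psutil.virtual_memory().total / 1024**3
cores = psutil.cpu_count(logical=False)
free_gb = shutil.disk_usage("/").free / 1024**3  # use "C:\\" on Windows

print(f"RAM:       {ram_gb:.1f} GB (8 GB minimum, 16 GB+ is better)")
print(f"CPU cores: {cores}")
print(f"Free disk: {free_gb:.1f} GB (at least 5 GB recommended)")
```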
What Models Can Your Computer Run?
| Your Computer | Models It Can Run | What to Expect |
|---|---|---|
| Regular laptop | 3B-7B models | Good for chatting and writing. Like having a helpful assistant |
| Gaming laptop | 7B-13B models | More capable. Better at complex tasks like coding and analysis |
| Powerful desktop | 13B+ models | Better performance. Great for professional work and advanced tasks |
Getting Started with Models
Model Versions
When browsing models in Jan, you’ll see terms like “Q4”, “Q6”, or “Q8”. Here’s what that means in simple terms: the “Q” stands for quantization, a way of compressing a model so it takes up less space and memory. Lower numbers like Q4 give you the smallest, fastest files with only a minor drop in quality, while higher numbers like Q8 stay closer to the original model but need roughly twice the space.
Pro tip: Start with Q4 versions - they work great for most people and run smoothly on regular computers!
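As a rough rule of thumb (a simplification - real file sizes vary a little by format), a model’s download size is about its parameter count multiplied by the bits used per parameter, divided by 8. A quick back-of-the-envelope calculation for a 7B model:

```python
# Rough size estimate: parameters * bits per parameter / 8 bytes.
# Real GGUF files vary slightly, but this gives the right ballpark.
params = 7_000_000_000  # a "7B" model

for name, bits in [("Q4", 4), ("Q8", 8)]:
    size_gb = params * bits / 8 / 1024**3
    print(f"{name}: about {size_gb:.1f} GB")

# Q4: about 3.3 GB
# Q8: about 6.5 GB
```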
Getting Models from Hugging Face
You’ll often see links to “Hugging Face” when downloading AI models. Think of Hugging Face as the “GitHub for AI” - it’s where the AI community shares their models. Jan makes it super easy to use:
- Jan has a built-in connection to Hugging Face
- You can download models right from Jan’s interface
- No need to visit the Hugging Face website unless you want to explore more options (there’s also an optional command-line route, sketched below)
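For readers who are comfortable with a little code, the same download can be done directly with Hugging Face’s huggingface_hub library. This is a sketch, not something you need: the repository and file names below are only examples, and Jan’s built-in downloader does all of this for you:

```python
# Optional: download a GGUF model file directly from Hugging Face.
# Requires: pip install huggingface_hub
from huggingface_hub import hf_hub_download

# The repo_id and filename are examples - pick any GGUF model you like.
path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    filename="mistral-7b-instruct-v0.2.Q4_K_M.gguf",
)
print("Saved to:", path)
```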
Setting up your local AI
1. Get Started
Download Jan from jan.ai - it sets everything up for you.
2. Get an AI Model
You can get models two ways:
1. Use Jan Hub (Recommended):
- Click “Download Model” in Jan
- Pick a recommended model
- Choose one that fits your computer
Use Jan Hub to download AI models
2. Use Hugging Face:
Step 1: Get the model link
Find and copy a GGUF model link from Hugging Face
Look for models with “GGUF” in their name
Step 2: Open Jan
Launch Jan and go to the Models tab
Navigate to the Models section in Jan
Step 3: Add the model
Paste your Hugging Face link into Jan
Paste your GGUF model link here
Step 4: Download
Select your quantization and start the download
Choose your preferred model size and download
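Once a model is downloaded, everything happens inside Jan’s chat window - no code required. If you ever want another program to talk to your local model, Jan can also expose a local, OpenAI-compatible API server. The sketch below assumes that server is enabled in Jan’s settings; the port (1337) and the model id are assumptions, so check them against what your own Jan installation shows:

```python
# Optional: send a chat request to a model running in Jan.
# Assumes Jan's local API server is enabled in its settings.
# The port (1337) and model id are assumptions - check your Jan settings.
import json
import urllib.request

payload = {
    "model": "mistral-7b-instruct",  # use the model id shown in Jan
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}

req = urllib.request.Request(
    "http://localhost:1337/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())
    print(reply["choices"][0]["message"]["content"])
```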
Common Questions
“My computer doesn’t have a graphics card - can I still use AI?”
Yes! It will run slower but still work. Start with 7B models.
“Which model should I start with?”
Try a 7B model first - it’s the best balance of smart and fast.
“Will it slow down my computer?”
Only while you’re using the AI. Close other big programs for better speed.