CanRunIt

Hardware Database

Known GPUs and machine presets used to infer RAM/VRAM and recommendation confidence.

NVIDIA RTX 4090 (nvidia)
VRAM: 24GB | Class: 100/100
Excellent for 7B-70B quantized models.

NVIDIA RTX 4080 (nvidia)
VRAM: 16GB | Class: 88/100
Great comfort zone for 7B-32B models.

NVIDIA RTX 4070 / 4070 Ti (nvidia)
VRAM: 12GB | Class: 76/100
Very strong for 7B-14B, selective 32B.

NVIDIA RTX 4060 / 4060 Ti (nvidia)
VRAM: 8GB | Class: 62/100
Good mainstream GPU for local AI.

NVIDIA RTX 3060 12GB (nvidia)
VRAM: 12GB | Class: 66/100
Excellent value thanks to its 12GB of VRAM.

NVIDIA RTX 2060 6GB (nvidia)
VRAM: 6GB | Class: 42/100
Good for small models, limited for 14B+.

AMD RX 7900 XT/XTX (amd)
VRAM: 20GB | Class: 82/100
Strong VRAM; backend support varies by OS.

AMD RX 6800 / 6900 (amd)
VRAM: 16GB | Class: 70/100
Good VRAM headroom with ROCm/llama.cpp setups.

Apple M3 Max (apple)
VRAM: 48GB | Class: 86/100
Unified memory; prefer MLX where available.

Apple M2 Pro / M3 Pro (apple)
VRAM: 24GB | Class: 68/100
Comfortable for 7B-14B MLX/GGUF.

Apple M1/M2 Air (apple)
VRAM: 10GB | Class: 46/100
Usable for small local models with unified memory.

Intel integrated GPU (intel)
VRAM: 0GB (shared system memory) | Class: 18/100
Usually CPU-first for LLMs.
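The entries above can be thought of as a lookup table keyed by GPU name, feeding a simple VRAM-fit check. A minimal sketch in Python; the field names (`vram_gb`, `class_score`) and the Q4 sizing rule of thumb are assumptions for illustration, not CanRunIt's actual schema:

```python
# Hypothetical representation of a few entries from the database above.
GPUS = {
    "NVIDIA RTX 4090":        {"vendor": "nvidia", "vram_gb": 24, "class_score": 100},
    "NVIDIA RTX 4060 / 4060 Ti": {"vendor": "nvidia", "vram_gb": 8, "class_score": 62},
    "Apple M3 Max":           {"vendor": "apple", "vram_gb": 48, "class_score": 86},
}

def fits_in_vram(params_billion: float, bits_per_weight: int, vram_gb: float,
                 overhead: float = 1.2) -> bool:
    """Rough check: weight bytes plus ~20% headroom for KV cache and activations."""
    needed_gb = params_billion * bits_per_weight / 8 * overhead
    return needed_gb <= vram_gb

# A 7B model at Q4 needs roughly 7 * 4/8 * 1.2 = 4.2 GB, so it fits an 8GB card;
# a fully resident 70B at Q4 (~42 GB) does not fit a 24GB card without offloading.
```

The 1.2 overhead factor is a conservative guess; real headroom depends on context length and backend.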

Machine presets

RTX 4060 laptop
Laptop | 16GB RAM | 8GB VRAM
Smooth for 7B, usable for many 14B Q4 models.

RTX 4070 desktop
Desktop | 32GB RAM | 12GB VRAM
Strong all-rounder for chat, code, and agents.

MacBook Pro Apple Silicon 64GB
Mac | 64GB RAM | 42GB VRAM
Excellent MLX/unified-memory profile.

CPU-only laptop 16GB
Laptop | 16GB RAM | 0GB VRAM
Best with 3B-8B quantized models.
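A preset's usable memory budget follows the same idea: use VRAM when a GPU is present, otherwise fall back to a share of system RAM. A minimal sketch; the `PRESETS` structure, the `usable_memory_gb` helper, and the 50% RAM fraction are all illustrative assumptions:

```python
# Hypothetical encoding of the machine presets above.
PRESETS = {
    "RTX 4060 laptop":      {"kind": "laptop",  "ram_gb": 16, "vram_gb": 8},
    "RTX 4070 desktop":     {"kind": "desktop", "ram_gb": 32, "vram_gb": 12},
    "CPU-only laptop 16GB": {"kind": "laptop",  "ram_gb": 16, "vram_gb": 0},
}

def usable_memory_gb(preset: dict, ram_fraction: float = 0.5) -> float:
    """VRAM when a GPU is present; otherwise a conservative share of system RAM,
    leaving room for the OS and other processes (fraction is an assumption)."""
    if preset["vram_gb"] > 0:
        return preset["vram_gb"]
    return preset["ram_gb"] * ram_fraction

# usable_memory_gb(PRESETS["CPU-only laptop 16GB"]) -> 8.0
```

On Apple Silicon the split is different: the GPU shares unified memory with the CPU, which is why the MacBook preset lists a VRAM figure below total RAM.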