
How to Install Ollama (Yes, It's Actually That Easy)


POV: You're tired of paying $20/month for ChatGPT and just found out you can run AI models locally on your own machine. For free. Welcome to the Ollama era.

Ollama is basically a tool that lets you download and run powerful large language models (LLMs) right on your computer — no cloud, no subscriptions, no data leaking to some server somewhere. Privacy-focused and free? That's the vibe.


Before You Start (The Reality Check)

Make sure your machine can actually handle this. The minimum specs you need:

  • 16 GB RAM
  • 12 GB of storage
  • 4 CPU cores

Also, update your GPU drivers if you have a dedicated GPU — this unlocks hardware acceleration and makes everything run way faster.
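Not sure what your machine has? A quick terminal check covers the first two specs (the Linux commands read standard system files; the macOS equivalent uses `sysctl`):

```shell
# Linux: total RAM and number of CPU cores
grep MemTotal /proc/meminfo
nproc

# macOS equivalent (run this instead):
# sysctl -n hw.memsize hw.ncpu
```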


Windows: The No-Brainer Install

If you're on Windows, this is genuinely painless:

  1. Go to ollama.com and hit the Download button
  2. Download OllamaSetup.exe
  3. Run the installer — no admin rights needed
  4. A PowerShell window will pop up when it's done
  5. Pin that PowerShell icon to your taskbar, you'll be using it a lot

That's it. Seriously. It installs into your account without touching system files.
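Want proof it worked? Ask the binary for its version (this same check works on any OS once the install finishes; the exact version string on your machine will differ):

```shell
# Prints the installed version if ollama is on your PATH
if command -v ollama >/dev/null 2>&1; then
  ollama --version
else
  echo "ollama not found - try reopening PowerShell"
fi
```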


macOS: Also Embarrassingly Simple

  1. Download the .zip file from ollama.com
  2. Unzip it, move the Ollama app to your Applications folder, then open it
  3. Hit "Yes" when it asks to install the command line tool
  4. The Ollama icon will chill in your menu bar

Linux: One Command to Rule Them All

Open your terminal and run this single command:

curl -fsSL https://ollama.com/install.sh | sh

Done. Ollama will automatically set up as a background service that starts on boot.


Actually Running a Model

Once Ollama is installed, open your terminal (or PowerShell on Windows) and run a model. Start with something lightweight like Llama 3:

ollama run llama3

It downloads the model on the first run. After that, it launches straight from your local copy, so startup is nearly instant. You can also explicitly pull a model before running it:

ollama pull llama3.1:latest

Want to see all your downloaded models?

ollama list
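Two more things worth knowing: `ollama rm` frees up disk space when you're done with a model, and the background service exposes a local REST API on port 11434 that you can hit directly for scripting (a sketch, assuming the server is running and llama3 is already pulled; the prompt is just an example):

```shell
# Remove a model you no longer need:
# ollama rm llama3

# Query the local API directly; setting "stream" to false
# returns one JSON response instead of a token stream
curl -s http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Explain RAM in one sentence.",
  "stream": false
}' || echo "Is the Ollama server running?"
```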

The TLDR (For the Skimmers)

OS        What You Do
-------   -----------------------------------------------------------
Windows   Download & run OllamaSetup.exe from ollama.com
macOS     Download the .zip, move the app to Applications, install the CLI tool
Linux     Run one curl command in the terminal

Running AI locally is genuinely one of those things that feels illegal once you realize how easy it is. If you get Ollama set up, drop a comment about which model you ran first — the Llama loyalists vs. the Gemma fans debate is very much alive.

© 2026 Ghazi Fadil. All rights reserved.