Run ollama with an AMD GPU on Arch

⚠️
Since this commit, there has been official support for AMD GPUs.
Edit: Since March 2024, the official releases also ship with AMD GPU support.

Officially, the setup script for ollama doesn't yet support AMD GPUs. You can still get ollama up and running on your machine by building it yourself.

Build

+ Clone the repository:

git clone --recursive https://github.com/jmorganca/ollama
cd ollama

+ Install the required dependencies:

sudo pacman -S rocm-hip-sdk rocm-opencl-sdk clblast go

+ Run the code generation step with the rocm build tag:

ROCM_PATH=/opt/rocm CLBlast_DIR=/usr/lib/cmake/CLBlast go generate -tags rocm ./...

+ After the generate step completes, build ollama:

go build -tags rocm

Now, you should have a functional version of ollama that utilizes your AMD GPU for computation.
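To confirm that the build is actually using the GPU, you can watch utilization and VRAM usage while a model is loaded, for example with rocm-smi (on Arch it's provided by the rocm-smi-lib package, which the ROCm SDK typically pulls in):

watch -n 1 rocm-smi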

Starting ollama and Creating a systemd Service

To start ollama in serve mode and run a supported model, follow these steps:

+ Start ollama in serve mode:
  Open a terminal and run the following command:

./ollama serve

+ Run a model:
  In another terminal window, execute the following command:

./ollama run model-name
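For example, to download and chat with Llama 2 from the ollama model library (the first run pulls the weights; later runs start immediately):

./ollama run llama2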

💡
Troubleshooting:
If you encounter difficulties launching Ollama, it may be due to an enabled iGPU. To resolve this, disable it in the BIOS settings. Alternatively, you can limit the visible devices for Ollama using the command:

ROCR_VISIBLE_DEVICES=1 ./ollama serve
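The right index depends on how ROCm enumerates the devices on your machine. One way to check is rocminfo, which ships with the ROCm packages and lists every agent, so you can see which position your discrete GPU occupies:

rocminfo | grep -E 'Agent|Marketing Name'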

Special thanks to 'User' and 'aposhtol' for bringing attention to this issue in the comments.

Create a systemd service for Ollama:

If you prefer, you can set up a systemd service so ollama starts automatically. Here's an example service file:

ollama.service:

[Unit]
Description=Ollama

[Service]
ExecStart=/path/to/ollama serve
User=your_username
Group=your_group_name
WorkingDirectory=/path/to/ollama

[Install]
WantedBy=default.target

Replace /path/to/ollama, your_username, and your_group_name with the appropriate paths and user/group information.
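Optionally, you can make the service more robust with two standard systemd directives (not specific to ollama): Restart=on-failure in the [Service] section brings ollama back up if it crashes, and After=network-online.target in the [Unit] section delays startup until the network is ready.

[Unit]
After=network-online.target

[Service]
Restart=on-failure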

Save the file, reload systemd, and start the service:

sudo systemctl daemon-reload
sudo systemctl enable ollama.service
sudo systemctl start ollama.service
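To verify that the service started cleanly and to follow its logs:

systemctl status ollama.service
journalctl -u ollama.service -f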

Now you can manage ollama like any other systemd service.

Check out my other blog post on how you can automate your own blog with AI.
