# Ollama on Linux
## Install
Install Ollama by running this one-liner:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
## AMD Radeon GPU support
While AMD has contributed the `amdgpu` driver upstream to the official Linux kernel source, that version is older and may not support all ROCm features. We recommend you install the latest driver from https://www.amd.com/en/support/linux-drivers for the best support of your Radeon GPU.
## Manual install
### Download the `ollama` binary
Ollama is distributed as a self-contained binary. Download it to a directory in your PATH:
```bash
sudo curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
### Adding Ollama as a startup service (recommended)
Create a user for Ollama:
```bash
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama
```
Create a service file in `/etc/systemd/system/ollama.service`:
```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
```
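
The unit can also carry environment variables for the server. For example, to make Ollama listen on all interfaces rather than only loopback, add an `Environment` line under `[Service]` (a sketch; `OLLAMA_HOST` is the variable the server reads for its bind address):

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```
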
Then start the service:
```bash
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
### Install CUDA drivers (optional – for Nvidia GPUs)
[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
```bash
nvidia-smi
```
### Install ROCm (optional – for Radeon GPUs)
[Download and install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html) ROCm.

Make sure to install ROCm v6.
### Start Ollama
Start Ollama using `systemd`:
```bash
sudo systemctl start ollama
```
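
Once started, the server answers HTTP on `127.0.0.1:11434` by default, so a simple request confirms it is up:

```bash
# The root endpoint replies with a short status message when the server is running
curl http://127.0.0.1:11434
```
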
## Update
Update Ollama by running the install script again:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```
Or by re-downloading the `ollama` binary:
```bash
sudo curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
```
## Viewing logs
To view logs of Ollama running as a startup service, run:
```bash
journalctl -u ollama
```
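
To follow the log live while reproducing a problem, `journalctl` can tail the unit:

```bash
# Stream new log lines for the ollama unit as they arrive (Ctrl-C to stop)
journalctl -u ollama -f
```
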
## Uninstall
Remove the ollama service:
```bash
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```
Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):
```bash
sudo rm $(which ollama)
```
Remove the downloaded models and Ollama service user and group:
```bash
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
```