# Linux
## Install
To install Ollama, run the following command:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
## Manual install
Download and extract the package:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
Start Ollama:
```shell
ollama serve
```
In another terminal, verify that Ollama is running:
```shell
ollama -v
```
### AMD GPU install
If you have an AMD GPU, also download and extract the additional ROCm package:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
```
### ARM64 install
Download and extract the ARM64-specific package:
```shell
curl -L https://ollama.com/download/ollama-linux-arm64.tgz -o ollama-linux-arm64.tgz
sudo tar -C /usr -xzf ollama-linux-arm64.tgz
```
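Not sure which package matches your machine? A small sketch like the following picks the tarball name from `uname -m` (illustrative only, not part of the official install script):

```shell
# Map this machine's CPU architecture to the matching Ollama tarball.
ARCH=$(uname -m)
case "$ARCH" in
  x86_64)  PKG=ollama-linux-amd64.tgz ;;
  aarch64) PKG=ollama-linux-arm64.tgz ;;
  *) echo "unsupported architecture: $ARCH" >&2; exit 1 ;;
esac

# Print the download URL for the detected architecture.
echo "https://ollama.com/download/$PKG"
```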
### Adding Ollama as a startup service (recommended)
Create a user and group for Ollama:
```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
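To double-check that the service account was created, a small helper like this can be used (a sketch; `getent` is standard on glibc-based systems):

```shell
# Returns success when the named user and group both exist.
account_ready() {
  getent passwd "$1" >/dev/null && getent group "$1" >/dev/null
}

# After the useradd/usermod commands above:
if account_ready ollama; then
  echo "ollama account ready"
else
  echo "ollama account missing"
fi
```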
Create a service file in `/etc/systemd/system/ollama.service`:
```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=default.target
```
Then reload systemd and enable the Ollama service so it starts at boot:
```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
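If you later need to change service settings (for example, where Ollama listens), a systemd drop-in keeps the main unit file untouched. Run `sudo systemctl edit ollama` and add a fragment such as the following (the `OLLAMA_HOST` value here is just an example, exposing the server on all interfaces):

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

After saving, run `sudo systemctl daemon-reload` and `sudo systemctl restart ollama` for the change to take effect.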
### Install CUDA drivers (optional)
[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
```shell
nvidia-smi
```
### Install AMD ROCm drivers (optional)
[Download and install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html) ROCm v6.

> [!NOTE]
> While AMD has contributed the `amdgpu` driver upstream to the official Linux
> kernel source, that version is older and may not support all ROCm features. We
> recommend installing the latest driver from
> https://www.amd.com/en/support/linux-drivers for the best support of your Radeon
> GPU.

### Start Ollama
Start Ollama and verify it is running:
```shell
sudo systemctl start ollama
sudo systemctl status ollama
```
## Updating
Update Ollama by running the install script again:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
Or by re-downloading and extracting the latest release tarball:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
If Ollama is running as a systemd service, restart it afterwards with `sudo systemctl restart ollama` so the new binary takes effect.
## Installing specific versions
Use the `OLLAMA_VERSION` environment variable with the install script to install a specific version of Ollama, including pre-releases. You can find version numbers on the [releases page](https://github.com/ollama/ollama/releases).
For example:
```shell
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.3.9 sh
```
## Viewing logs
To view the logs of Ollama running as a startup service, run:
```shell
journalctl -e -u ollama
```
Use `-f` instead of `-e` to follow the log live as new entries arrive.
## Uninstall
Stop and remove the ollama service:
```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```
Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):
```shell
sudo rm $(which ollama)
```
Remove the downloaded models and Ollama service user and group:
```shell
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
```
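As a final check, a short sketch like this reports anything the steps above may have left behind (illustrative; adjust the paths if you installed elsewhere):

```shell
# Report any Ollama leftovers still present on the system.
LEFTOVERS=""
command -v ollama >/dev/null 2>&1 && LEFTOVERS="$LEFTOVERS binary"
[ -d /usr/share/ollama ] && LEFTOVERS="$LEFTOVERS models"
id ollama >/dev/null 2>&1 && LEFTOVERS="$LEFTOVERS user"

if [ -n "$LEFTOVERS" ]; then
  echo "still present:$LEFTOVERS"
else
  echo "ollama fully removed"
fi
```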