docs: improve linux install documentation (#6683)

Includes small improvements to document layout and code blocks
Jeffrey Morgan 2024-09-06 22:05:37 -07:00 committed by GitHub
parent da915345d1
commit 108fb6c1d1


@@ -1,43 +1,57 @@

# Linux

## Install

To install Ollama, run the following command:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```
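Once the script finishes, an optional sanity check is to confirm the `ollama` CLI is on your PATH and reports a version (the same `ollama -v` check used in the manual install steps below):

```shell
# Optional: confirm the install script put the binary on your PATH
ollama -v
```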
## Manual install

Download and extract the package:

```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```

Start Ollama:

```shell
ollama serve
```

In another terminal, verify that Ollama is running:

```shell
ollama -v
```
### AMD GPU install

If you have an AMD GPU, also download and extract the additional ROCm package:

```shell
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
```
### ARM64 install

Download and extract the ARM64-specific package:

```shell
curl -L https://ollama.com/download/ollama-linux-arm64.tgz -o ollama-linux-arm64.tgz
sudo tar -C /usr -xzf ollama-linux-arm64.tgz
```
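If you are unsure which package matches your machine, checking the CPU architecture first is a quick way to decide (this uses the standard `uname` utility, nothing Ollama-specific):

```shell
# Prints x86_64 for the amd64 package, aarch64 for the arm64 package
uname -m
```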
### Adding Ollama as a startup service (recommended)

Create a user and group for Ollama:

```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
@@ -63,47 +77,54 @@ WantedBy=default.target
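The service unit itself is elided from this diff hunk (its context line shows it ends with `WantedBy=default.target`). For orientation only, here is a minimal sketch of what an `/etc/systemd/system/ollama.service` unit along these lines typically contains; the field values are assumptions, not the exact file from the repository docs:

```shell
# Sketch only: a minimal systemd unit for running Ollama as the ollama user.
# Check the repository docs for the authoritative version of this file.
sudo tee /etc/systemd/system/ollama.service >/dev/null <<'EOF'
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
EOF
```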
Then reload systemd and enable the service so it starts at boot:

```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
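As a side note, systemd can enable and start a unit in one step if you prefer (standard `systemctl` behaviour rather than anything specific to this document):

```shell
# Enable the service at boot and start it immediately
sudo systemctl enable --now ollama
```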
### Install CUDA drivers (optional)

[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.

Verify that the drivers are installed by running the following command, which should print details about your GPU:

```shell
nvidia-smi
```
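If you only want a compact confirmation of the driver version rather than the full table, `nvidia-smi` can also print selected fields (standard `nvidia-smi` options, shown here purely as an illustration):

```shell
# Print just the GPU name and driver version in CSV form
nvidia-smi --query-gpu=name,driver_version --format=csv
```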
### Install AMD ROCm drivers (optional)

[Download and install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html) ROCm v6.
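After installing ROCm, one way to confirm the runtime can see your GPU is to list the available agents with `rocminfo`, a tool shipped with ROCm; the install path below is the usual default and is an assumption here:

```shell
# List ROCm agents; your Radeon GPU should appear as a gfx* device
/opt/rocm/bin/rocminfo | grep -i "gfx"
```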
### Start Ollama

Start Ollama and verify it is running:

```shell
sudo systemctl start ollama
sudo systemctl status ollama
```

> [!NOTE]
> While AMD has contributed the `amdgpu` driver upstream to the official Linux
> kernel source, the version is older and may not support all ROCm features. We
> recommend you install the latest driver from
> https://www.amd.com/en/support/linux-drivers for best support of your Radeon
> GPU.
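With the service running, a quick end-to-end check is to pull and run a small model. The model name below is only an example; any model from the Ollama library works:

```shell
# Downloads the model on first use, then opens an interactive prompt
ollama run llama3.1
```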
## Updating

Update Ollama by running the install script again:

```shell
curl -fsSL https://ollama.com/install.sh | sh
```

Or by re-downloading and extracting the package:

```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
## Installing specific versions

@@ -112,15 +133,15 @@ Use `OLLAMA_VERSION` environment variable with the install script to install a s

For example:

```shell
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.3.9 sh
```
## Viewing logs

To view logs of Ollama running as a startup service, run:

```shell
journalctl -e -u ollama
```
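To watch the logs live while you send requests, you can follow the journal instead (standard `journalctl` behaviour, not specific to Ollama):

```shell
# Follow new log lines as they arrive; Ctrl+C to stop
journalctl -u ollama -f
```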
@@ -128,7 +149,7 @@ journalctl -e -u ollama

Remove the ollama service:

```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```

@@ -136,13 +157,13 @@ sudo rm /etc/systemd/system/ollama.service

Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):

```shell
sudo rm $(which ollama)
```

Remove the downloaded models and Ollama service user and group:

```shell
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
```
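As an optional final check, not part of the original steps, you can confirm that the binary and the service user are gone:

```shell
# Both commands should report that nothing is found after a clean uninstall
command -v ollama || echo "ollama binary removed"
id ollama 2>/dev/null || echo "ollama user removed"
```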