From 5b7a27281dc95b94c2549cec440622a62e1cad13 Mon Sep 17 00:00:00 2001
From: Jeffrey Morgan
Date: Sun, 24 Sep 2023 21:38:23 -0700
Subject: [PATCH] improvements to `docs/linux.md`

---
 docs/linux.md | 19 +++++++++++++++++--
 1 file changed, 17 insertions(+), 2 deletions(-)

diff --git a/docs/linux.md b/docs/linux.md
index 3e70c4c2..8697137d 100644
--- a/docs/linux.md
+++ b/docs/linux.md
@@ -12,9 +12,24 @@ Ollama is distributed as a self-contained binary. Download it to a directory in
 
 ```
 sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
+sudo chmod +x /usr/bin/ollama
 ```
 
-## Install CUDA drivers (optional for Nvidia GPUs)
+## Start Ollama
+
+Start Ollama by running `ollama serve`:
+
+```
+ollama serve
+```
+
+Once Ollama is running, run a model:
+
+```
+ollama run llama2
+```
+
+## Install CUDA drivers (optional – for Nvidia GPUs)
 
 [Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.
 
@@ -24,7 +39,7 @@ Verify that the drivers are installed by running the following command, which sh
 nvidia-smi
 ```
 
-## Adding Ollama as a startup service
+## Adding Ollama as a startup service (optional)
 
 Create a user for Ollama:
 