fix relative links in README.md
parent 788637918a
commit 92578798bb

1 changed file with 7 additions and 7 deletions

README.md (14 changed lines)
@@ -13,7 +13,7 @@ Get up and running with large language models locally.

### macOS

[Download](https://ollama.ai/download/Ollama-darwin.zip)

### Linux & WSL2
@@ -23,7 +23,7 @@ curl https://ollama.ai/install.sh | sh

[Manual install instructions](https://github.com/jmorganca/ollama/blob/main/docs/linux.md)

### Windows

coming soon
@@ -37,7 +37,7 @@ ollama run llama2

## Model library

-Ollama supports a list of open-source models available on [ollama.ai/library](https://ollama.ai/library "ollama model library")
+Ollama supports a list of open-source models available on [ollama.ai/library](https://ollama.ai/library 'ollama model library')

Here are some example open-source models that can be downloaded:
@@ -66,13 +66,13 @@ Ollama supports importing GGUF and GGML file formats in the Modelfile. This mean
FROM ./vicuna-33b.Q4_0.gguf
```

-3. Create the model in Ollama
+2. Create the model in Ollama

```
ollama create name -f path_to_modelfile
```

-5. Run the model
+3. Run the model

```
ollama run name
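Taken together, the steps being renumbered in this hunk amount to a short shell sequence. A minimal sketch, assuming the vicuna-33b.Q4_0.gguf weights sit in the current directory and using an illustrative model name of your own choosing:

```
# Step 1 (outside this hunk): a Modelfile pointing at the local GGUF weights
echo "FROM ./vicuna-33b.Q4_0.gguf" > Modelfile

# Step 2: create the model in Ollama from that Modelfile
ollama create vicuna-33b -f Modelfile   # "vicuna-33b" is an illustrative name

# Step 3: run it
ollama run vicuna-33b
```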
@@ -109,7 +109,7 @@ ollama run mario
Hello! It's your friend Mario.
```

-For more examples, see the [examples](./examples) directory. For more information on working with a Modelfile, see the [Modelfile](./docs/modelfile.md) documentation.
+For more examples, see the [examples](examples) directory. For more information on working with a Modelfile, see the [Modelfile](docs/modelfile.md) documentation.

## CLI Reference
@@ -195,7 +195,7 @@ Finally, in a separate shell, run a model:

## REST API

-> See the [API documentation](./docs/api.md) for all endpoints.
+> See the [API documentation](docs/api.md) for all endpoints.

Ollama has an API for running and managing models. For example to generate text from a model:
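The generate example itself falls just past the end of this hunk, but the shape of the request is worth noting: Ollama listens on port 11434 by default, and text generation goes through the /api/generate endpoint. A minimal sketch, assuming a default local install with the llama2 model already pulled (the prompt is only illustrative):

```
# POST a model name and prompt to the local Ollama server
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```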