Update faq.md
This commit is contained in:
parent 0b6c6c9092
commit df56f1ee5e
1 changed files with 22 additions and 0 deletions

docs/faq.md
@@ -14,6 +14,28 @@ curl -fsSL https://ollama.com/install.sh | sh

Review the [Troubleshooting](./troubleshooting.md) docs for more about using logs.

## How can I specify the context window size?

By default, Ollama uses a context window size of 2048 tokens.

To change this when using `ollama run`, use `/set parameter`:

```
/set parameter num_ctx 4096
```

When using the API, specify the `num_ctx` parameter:

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "options": {
    "num_ctx": 4096
  }
}'
```
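The same request body can be built programmatically. Below is a minimal, hedged sketch in Python using only the standard library; the endpoint URL and model name simply mirror the curl example above, and the actual HTTP call is shown commented out because it assumes a locally running Ollama server.

```python
def build_generate_request(model, prompt, num_ctx):
    """Build the JSON body for /api/generate with a custom context window."""
    return {
        "model": model,
        "prompt": prompt,
        "options": {"num_ctx": num_ctx},
    }

body = build_generate_request("llama2", "Why is the sky blue?", 4096)

# Sending it (illustration only; requires a running Ollama server):
# import json, urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=json.dumps(body).encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     for line in resp:
#         print(json.loads(line))
```

The nested `options` object is where per-request parameters such as `num_ctx` go, matching the curl payload above.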

## How do I configure Ollama server?

Ollama server can be configured with environment variables.
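As one illustrative example, `OLLAMA_HOST` controls the address the server binds to; the specific value below is an assumption chosen to match the default port used in the API example above.

```shell
# Illustrative: bind the server to all interfaces on the default port.
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```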