FAQ: answer a few faq questions (#1128)
* faq: does ollama share my prompts
* faq: ollama and openai
* faq: vscode plugins
* faq: send a doc to Ollama
* extra spacing
* Update faq.md
* Update faq.md

Signed-off-by: Matt Williams <m@technovangelist.com>
Co-authored-by: Michael <mchiang0610@users.noreply.github.com>
parent 686f85d6ca
commit f61f340279

1 changed file with 13 additions and 2 deletions
docs/faq.md
@@ -1,5 +1,10 @@
# FAQ
- [How can I view the logs?](#how-can-i-view-the-logs)
- [How can I expose Ollama on my network?](#how-can-i-expose-ollama-on-my-network)
- [How can I allow additional web origins to access Ollama?](#how-can-i-allow-additional-web-origins-to-access-ollama)
- [Where are models stored?](#where-are-models-stored)
## How can I view the logs?
On macOS:
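The hunk cuts off before the command itself; as a rough sketch, viewing the server log on macOS would look something like this (the log path is an assumption, not shown in this diff):

```shell
# Assumed default location of the Ollama server log on macOS.
cat ~/.ollama/logs/server.log
```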
@@ -74,8 +79,6 @@ systemctl restart ollama
- macOS: Raw model data is stored under `~/.ollama/models`.
- Linux: Raw model data is stored under `/usr/share/ollama/.ollama/models`.
Below the models directory you will find a structure similar to the following:
```shell
…
```
@@ -96,3 +99,11 @@ The manifest lists all the layers used in this model. You will see a `media type
### How can I change where Ollama stores models?
To modify where models are stored, set the `OLLAMA_MODELS` environment variable. Note that on Linux this means defining `OLLAMA_MODELS` in a drop-in file under `/etc/systemd/system/ollama.service.d`, reloading systemd, and restarting the ollama service.
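As a minimal sketch of those Linux steps (the drop-in file name and models path below are illustrative, not part of this diff):

```shell
# Create the drop-in directory for the ollama service if it doesn't exist.
sudo mkdir -p /etc/systemd/system/ollama.service.d

# Illustrative drop-in file pointing OLLAMA_MODELS at a hypothetical path.
cat <<'EOF' | sudo tee /etc/systemd/system/ollama.service.d/environment.conf
[Service]
Environment="OLLAMA_MODELS=/data/ollama/models"
EOF

# Reload systemd and restart the service so the change takes effect.
sudo systemctl daemon-reload
sudo systemctl restart ollama
```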
## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?
No. Anything you do with Ollama, such as generating a response from the model, stays with you. We don't collect any data about how you use the model. You are always in control of your own data.
## How can I use Ollama in VSCode to help me code?
There is already a large collection of plugins for VSCode, as well as other editors, that leverage Ollama. You can find the list of plugins at the bottom of the main repository README.