# Desktop

The Ollama desktop experience. This is an experimental, easy-to-use app for running models with [`ollama`](https://github.com/jmorganca/ollama).

## Download

- [macOS](https://ollama.ai/download/darwin_arm64) (Apple Silicon)
- macOS (Intel – Coming soon)
- Windows (Coming soon)
- Linux (Coming soon)

## Running

In the background, run the `ollama.py` server:

```
python ../ollama.py serve --port 7734
```
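
Before launching the app, it can help to confirm the server is actually listening. A minimal sketch in Python, assuming the defaults above (the server reachable on localhost at port 7734):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP server is accepting connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# 7734 is the port passed to `ollama.py serve` above.
if port_open("127.0.0.1", 7734):
    print("ollama server is up")
else:
    print("ollama server is not reachable")
```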

Then install dependencies and start the desktop app:

```
npm install
npm start
```

## Coming soon

- Browse the latest available models on Hugging Face and other sources
- Keep track of previous conversations with models
- Switch between models
- Connect to remote Ollama servers to run models