# Ollama App

## Linux

TODO

## MacOS

TODO

## Windows
If you want to build the installer, you'll need to install [Inno Setup](https://jrsoftware.org/isinfo.php).

In the top directory of this repo, run the following PowerShell script to build the ollama CLI, ollama app, and ollama installer.
```
powershell -ExecutionPolicy Bypass -File .\scripts\build_windows.ps1
```