
Ollama App

Linux

TODO

macOS

TODO

Windows

If you want to build the installer, you'll need to install Inno Setup, which compiles the ollama.iss script in this directory.

From the top directory of this repo, run the following PowerShell script to build the ollama CLI, ollama app, and ollama installer:

powershell -ExecutionPolicy Bypass -File .\scripts\build_windows.ps1