
# Desktop

> Note: the Ollama desktop app is a work in progress and is not yet ready for general use.

This app builds upon Ollama to provide a desktop experience for running models.

## Developing

First, build the ollama binary:

```shell
make -C ..
```

Then install the app's dependencies and run it with `npm start`:

```shell
npm install
npm start
```
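Under the hood, `npm start` is expected to launch the app through Electron Forge (the directory contains `forge.config.ts` and the webpack configs it uses). A minimal sketch of the corresponding `package.json` scripts, assuming the standard Electron Forge CLI commands — the repository's actual scripts may differ:

```json
{
  "scripts": {
    "start": "electron-forge start",
    "package": "electron-forge package",
    "make": "electron-forge make"
  }
}
```

With scripts like these, `npm run make` would produce distributable builds in addition to the development workflow shown above.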

## Coming soon

- Browse the latest available models on Hugging Face and other sources
- Keep track of previous conversations with models
- Switch quickly between models
- Connect to remote Ollama servers to run models