# ollama

🙊

## Running

Install dependencies:

```
pip install -r requirements.txt
```

Put your model in the `models/` directory and run:

```
python3 ollama.py serve
```

To run the desktop app:

```
cd desktop
npm install
npm start
```

## Building

If you are using Apple silicon, you need a Python version that supports arm64:

```
wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-MacOSX-arm64.sh
bash Miniforge3-MacOSX-arm64.sh
```

Get the dependencies:

```
pip install -r requirements.txt
```

Then build a binary for your current platform:

```
python3 build.py
```

### Building the app

```
cd desktop
npm run package
```

## API

### `GET /models`

Returns a list of available models.
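
For example, here is a minimal sketch of listing models from Python, assuming the server started by `python3 ollama.py serve` is reachable at `http://127.0.0.1:5000` (a placeholder address; use whatever the server prints on startup):

```python
# Minimal sketch: list the available models from a running server.
# The base URL is an assumption; replace it with the address that
# `python3 ollama.py serve` reports when it starts.
import requests

BASE_URL = "http://127.0.0.1:5000"  # assumed address, adjust as needed

response = requests.get(f"{BASE_URL}/models")
response.raise_for_status()
print(response.json())  # expected: a JSON list of model names
```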

### `POST /generate`

Generates completions as a series of JSON objects.

Parameters:

- `model: string` - The name of the model to use in the `models` folder.
- `prompt: string` - The prompt to use.
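
For example, here is a minimal sketch of streaming completions from Python. The base URL, the example model filename, and the exact fields of each streamed object are assumptions; check the running server for the actual address and response shape.

```python
# Minimal sketch: stream completions from POST /generate.
# The base URL, model filename, and response fields below are
# assumptions, not part of the documented API.
import json
import requests

BASE_URL = "http://127.0.0.1:5000"  # assumed address, adjust as needed

payload = {
    "model": "your-model.bin",   # a model file inside the models/ folder
    "prompt": "Why is the sky blue?",
}

with requests.post(f"{BASE_URL}/generate", json=payload, stream=True) as r:
    r.raise_for_status()
    # The endpoint returns a series of JSON objects; print each one as it arrives.
    for line in r.iter_lines():
        if line:
            print(json.loads(line))
```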