# Ollama

- Run models, fast
- Download, manage and import models
## Install

```
pip install ollama
```
## Quickstart

```python
import ollama

# Download the model (hosted GGML weights)
model_name = "huggingface.co/thebloke/llama-7b-ggml"
model = ollama.pull(model_name)

# Load the model, then generate a completion
ollama.load(model)
ollama.generate(model_name, "hi")
```
## Reference

### `ollama.load`

Load a model from a path or a Docker image.

```python
ollama.load("model name")
```
### `ollama.generate`

Generate a completion.

```python
ollama.generate(model, "hi")
```
### `ollama.models`

List available models.

```python
models = ollama.models()
```
### `ollama.serve`

Start the Ollama HTTP server.
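
Unlike the other entries, `serve` has no example above. As a minimal sketch, assuming `serve` takes no required arguments and blocks while the server runs:

```python
import ollama

# Start the HTTP server so other clients can connect
# (assumed to block until interrupted)
ollama.serve()
```

This requires the Ollama runtime to be installed, so it is not runnable standalone.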
## Coming Soon

### `ollama.pull`

Download a model from a registry.

```python
ollama.pull("huggingface.co/thebloke/llama-7b-ggml")
```
### `ollama.import`

Import an existing model into the model store.

```python
ollama.import("./path/to/model")
```
### `ollama.search`

Search for compatible models that Ollama can run.

```python
ollama.search("llama-7b")
```
## Future CLI

```
ollama run huggingface.co/thebloke/llama-7b-ggml
```