update as per Mike's comments

Signed-off-by: Matt Williams <m@technovangelist.com>

parent ac1b04f271
commit 2544b8afa1

2 changed files with 22 additions and 19 deletions
@@ -1,6 +1,5 @@
 # Documentation

 - [Modelfile](./modelfile.md)
-- [How we store Models](./modelstorage.md)
 - [How to develop Ollama](./development.md)
 - [API](./api.md)
docs/api.md (22 changed lines)

@@ -24,10 +24,15 @@ The **Generate** endpoint takes a JSON object with the following fields:

 ```JSON
 {
   "model": "modelname",
-  "prompt": "prompt",
+  "prompt": "You are a software engineer working on building docs for Ollama.",
+  "options": {
+    "temperature": 0.7
+  }
 }
 ```

+**Options** can include any of the parameters listed in the [Modelfile](./modelfile.md#valid-parameters-and-values) documentation. The only required parameter is **model**. If no **prompt** is provided, the model will generate a response to an empty prompt. If no **options** are provided, the model will use the default options from the Modelfile of the parent model.
+
 ### Response

 The response is a stream of JSON objects with the following fields:
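As a quick sanity check of the request shape documented in the hunk above, the body can be built and serialized in a few lines of Python. This is a sketch, not part of the commit; the server address in the comment is an assumption based on Ollama's default local setup and is not confirmed by this diff.

```python
import json

# Build the Generate request body shown in the diff above.
payload = {
    "model": "modelname",
    "prompt": "You are a software engineer working on building docs for Ollama.",
    "options": {
        "temperature": 0.7,
    },
}

body = json.dumps(payload, indent=2)
print(body)

# To actually send it you would POST this body to the Generate endpoint
# (assumed to be http://localhost:11434/api/generate on a default install):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/generate",
#       data=body.encode(), method="POST")
```

Note that only **model** is required; **prompt** and **options** could be dropped from the dict and the defaults described above would apply.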
@@ -60,19 +65,19 @@ The final response in the stream also includes the context and what is usually s

 ```

 | field | description |
-| -------------------- | ------------------------------------------ |
+| -------------------- | ------------------------------------------------------- |
 | model | the name of the model |
 | created_at | the time the response was generated |
 | response | the current token |
 | done | whether the response is complete |
-| total_duration | total time spent generating the response |
+| total_duration | total time in nanoseconds spent generating the response |
-| load_duration | time spent loading the model |
+| load_duration | time spent in nanoseconds loading the model |
 | sample_count | number of samples generated |
 | sample_duration | time spent generating samples |
 | prompt_eval_count | number of times the prompt was evaluated |
-| prompt_eval_duration | time spent evaluating the prompt |
+| prompt_eval_duration | time spent in nanoseconds evaluating the prompt |
 | eval_count | number of times the response was evaluated |
-| eval_duration | time spent evaluating the response |
+| eval_duration | time in nanoseconds spent evaluating the response |

 ### Example

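Since the duration fields are now documented as nanoseconds, a client turning the final stream object into human-readable stats needs a single conversion constant. A short sketch; the field names come from the table above, but the values are invented purely for illustration:

```python
# Hypothetical final stream object: field names from the table in the
# diff above, values made up for this example.
final = {
    "model": "modelname",
    "done": True,
    "total_duration": 5_000_000_000,  # nanoseconds
    "eval_count": 100,
    "eval_duration": 4_000_000_000,   # nanoseconds
}

NS_PER_SEC = 1_000_000_000

total_s = final["total_duration"] / NS_PER_SEC
tokens_per_s = final["eval_count"] / (final["eval_duration"] / NS_PER_SEC)
print(f"total: {total_s:.2f}s, speed: {tokens_per_s:.1f} tokens/s")
# → total: 5.00s, speed: 25.0 tokens/s
```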
@@ -117,7 +122,7 @@ The **Create** endpoint takes a JSON object with the following fields:

 ```JSON
 {
   "name": "modelname",
-  "path": "path to Modelfile"
+  "path": "absolute path to Modelfile"
 }
 ```

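Since **path** must now be an absolute path, a caller can normalize whatever path it has before building the Create payload. A sketch under that assumption; the payload shape follows the hunk above, and `create_payload` is a hypothetical helper, not part of the API:

```python
import json
import os

def create_payload(name: str, modelfile_path: str) -> str:
    # The Create endpoint expects an absolute path to the Modelfile,
    # so resolve whatever the caller hands us before serializing.
    return json.dumps({
        "name": name,
        "path": os.path.abspath(modelfile_path),
    })

body = create_payload("modelname", "./Modelfile")
print(body)
```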
@@ -321,8 +326,7 @@ The **Pull** endpoint takes a JSON object with the following fields:

 ```JSON
 {
-  "name": "modelname",
-  "registry": "registryname"
+  "name": "modelname"
 }
 ```

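Note that the hunk above had to change the `"name"` line as well as delete the `"registry"` line: with the last field removed, the comma after `"modelname"` would have left a trailing comma, which is not valid JSON. A small check demonstrating that (Python's standard `json` module rejects trailing commas):

```python
import json

# With the registry field gone, the old trailing comma would have made
# the example invalid JSON; a strict parser rejects it.
try:
    json.loads('{"name": "modelname",}')
    dangling_ok = True
except json.JSONDecodeError:
    dangling_ok = False

print(dangling_ok)                            # trailing comma does not parse
print(json.loads('{"name": "modelname"}'))    # the corrected body parses fine
```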