clean up

Signed-off-by: Matt Williams <m@technovangelist.com>
Parent: d93e2f9210 · Commit: a101fe51a7
1 changed file with 1 addition and 6 deletions
@@ -14,12 +14,9 @@ The **Generate** endpoint takes a JSON object with the following fields:
 {
 Model: "modelname",
 Prompt: "prompt",
-Context: "context",
 }
 ```
 
-Context is optional, but is used to provide additional context, such as memory of earlier prompts.
-
 The response is a stream of JSON objects with the following fields:
 
 ```
@@ -38,7 +35,6 @@ The final response in the stream also includes the context and what is usually s
 "model":"orca",
 "created_at":"2023-08-04T19:22:45.499127Z",
 "done":true,
-"context":[1,31822,1,13,8458,31922 ... 382,871,550,389,266,7661,31844,382,820,541,4842,1954,661,645,590,3465,31843,2],
 "total_duration":5589157167,
 "load_duration":3013701500,
 "sample_count":114,
@@ -56,7 +52,6 @@ The final response in the stream also includes the context and what is usually s
 | created_at | the time the response was generated |
 | response | the current token |
 | done | whether the response is complete |
-| context | vectorize context that can be supplied in the next request to continue the conversation |
 | total_duration | total time spent generating the response |
 | load_duration | time spent loading the model |
 | sample_count | number of samples generated |
@@ -93,7 +88,7 @@ curl --location --request POST 'http://localhost:11434/api/generate' \
 .
 .
 {"model":"orca","created_at":"2023-08-04T19:22:45.487113Z","response":".","done":false}
-{"model":"orca","created_at":"2023-08-04T19:22:45.499127Z","done":true,"context":[1,31822,1,13,8458,31922,3244,31871,13,3838,397,363,7421,8825,342,5243,10389,5164,828,31843,9530,362,988,362,365,473,31843,13,13,8458,31922,9779,31871,13,23712,322,266,7661,4842,13,13,8458,31922,13166,31871,13,347,7661,4725,4842,906,287,260,12329,1676,6697,27554,27289,31843,4025,2990,322,985,550,287,260,9949,287,8286,31844,10990,427,2729,289,399,20036,31843,1408,21062,16858,266,4556,31876,31829,7965,31844,357,19322,16450,287,1900,859,362,22329,291,11944,31843,1872,16450,397,988,5497,661,266,23893,287,266,1954,31844,560,526,640,3304,266,1954,288,484,11468,31843,1813,31844,4842,1954,470,260,13830,23893,661,590,8286,31844,560,357,322,18752,541,4083,31843,672,1901,342,662,382,871,550,389,266,7661,31844,382,820,541,4842,1954,661,645,590,3465,31843,2],"total_duration":5589157167,"load_duration":3013701500,"sample_count":114,"sample_duration":81442000,"prompt_eval_count":46,"prompt_eval_duration":1160282000,"eval_count":113,"eval_duration":1325948000}
+{"model":"orca","created_at":"2023-08-04T19:22:45.499127Z","done":true,"total_duration":5589157167,"load_duration":3013701500,"sample_count":114,"sample_duration":81442000,"prompt_eval_count":46,"prompt_eval_duration":1160282000,"eval_count":113,"eval_duration":1325948000}
 ```
 
 ## Create a Model
 
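As background for the doc being edited: the `/api/generate` response shown in the diff is a stream of newline-delimited JSON objects, where each intermediate object carries one token in `response` and the final object (`"done": true`) carries the timing statistics. A minimal sketch of assembling that stream on the client side (Python, stdlib only; the sample lines below are shortened from the diff's example output, not taken from a live server):

```python
import json

# Hypothetical sample of the newline-delimited JSON stream, abridged from
# the example output in the diff above.
stream = [
    '{"model":"orca","created_at":"2023-08-04T19:22:45.487113Z","response":".","done":false}',
    '{"model":"orca","created_at":"2023-08-04T19:22:45.499127Z","done":true,'
    '"total_duration":5589157167,"load_duration":3013701500,"sample_count":114}',
]

def collect(lines):
    """Concatenate per-token "response" fields; keep the final stats object."""
    tokens, final = [], None
    for line in lines:
        obj = json.loads(line)
        if obj.get("done"):
            final = obj  # the "done": true object carries the timing fields
        else:
            tokens.append(obj.get("response", ""))
    return "".join(tokens), final

answer, stats = collect(stream)
print(answer)                   # "."
print(stats["total_duration"])  # 5589157167
```

In a real client the lines would come from reading the HTTP response body of a POST to `http://localhost:11434/api/generate` line by line; the parsing logic is the same.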