Commit graph

197 commits

Author SHA1 Message Date
Patrick Devine
910e9401d0
Multimodal support (#1216)
---------

Co-authored-by: Matt Apperson <mattapperson@Matts-MacBook-Pro.local>
2023-12-11 13:56:22 -08:00
Michael Yang
16c7548460 fix redundant newline 2023-12-07 13:44:45 -08:00
Michael Yang
4b77fcb2b9 comments 2023-12-05 09:43:50 -08:00
Michael Yang
cde13bcdea cmd: only print server version when different 2023-12-05 09:36:01 -08:00
Michael Yang
0f0cd265a7 cmd: add server version 2023-12-05 09:36:01 -08:00
Michael Yang
5c59455b59 cmd: use existing cmd context 2023-12-05 09:36:01 -08:00
Patrick Devine
bf704423c5
revert cli to use /api/generate (#1383) 2023-12-04 16:35:29 -08:00
Bruce MacDonald
7a0899d62d
chat api (#991)
- update chat docs
- add messages chat endpoint
- remove deprecated context and template generate parameters from docs
- context and template are still supported for the time being and will continue to work as expected
- add partial response to chat history
2023-12-04 18:01:06 -05:00
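The chat API commit above adds a messages-based /api/chat endpoint. As a rough illustration only (not code from this repository), the sketch below posts a single user message with Go's standard library; the default server address 127.0.0.1:11434, the model name, and the exact response shape are assumptions.

```go
// Minimal sketch of calling the chat endpoint described in #991.
// Assumes the default server address (127.0.0.1:11434) and that a model
// named "llama2" has already been pulled; adjust both as needed.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

func main() {
	body, _ := json.Marshal(map[string]any{
		"model":    "llama2",
		"messages": []message{{Role: "user", Content: "Why is the sky blue?"}},
		"stream":   false, // request a single JSON response instead of a stream
	})

	resp, err := http.Post("http://127.0.0.1:11434/api/chat", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out)) // assistant reply is expected under a "message" field
}
```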
Patrick Devine
2113c9d31a
make linewrap still work when the terminal width has changed (#1350) 2023-12-04 14:14:56 -08:00
Patrick Devine
6681d37861
allow setting the system and template for prompts in the repl (#1335) 2023-12-01 09:28:35 -08:00
Jeffrey Morgan
5687f1a0cf fix unexpected end of response errors when cancelling in ollama run 2023-11-30 00:30:21 -05:00
Patrick Devine
cde31cb220
Allow setting parameters in the REPL (#1294) 2023-11-29 09:56:42 -08:00
Jeffrey Morgan
9fb5e8399c Fix issues with inputting and formatting multi line strings in ollama run
Co-authored-by: Wen Sun <iwendellsun@gmail.com>
2023-11-26 12:54:29 -05:00
Jeffrey Morgan
df07e4a097
remove redundant filename parameter (#1213) 2023-11-20 17:05:36 -05:00
Bruce MacDonald
31ab453d37
resolve FROM path before sending modelfile (#1211) 2023-11-20 16:43:48 -05:00
Jeffrey Morgan
6066c70edd restore progress messages for older endpoints 2023-11-20 11:37:17 -05:00
Jeffrey Morgan
6bbd6e26fb fix temporary newline created and removed with spinner in ollama run 2023-11-20 00:49:08 -05:00
Jeffrey Morgan
c06b9b7304 update progress rendering to be closer to v0.1.10 2023-11-19 13:43:21 -05:00
Jeffrey Morgan
984714f131 update status text when transferring blob on ollama create 2023-11-18 09:40:10 -05:00
Michael Yang
976068369b stop all spinners on progress stop 2023-11-17 10:06:19 -08:00
Michael Yang
4dcf7a59b1 generate progress 2023-11-17 10:06:19 -08:00
Michael Yang
1c0e092ead progress cmd 2023-11-17 10:06:19 -08:00
Michael Yang
f91bb2f7f0 remove progressbar 2023-11-17 10:06:19 -08:00
Michael Yang
1901044b07 use checksum reference 2023-11-15 15:16:23 -08:00
Michael Yang
d660eebf22 fix create from model tag 2023-11-15 15:16:23 -08:00
Michael Yang
1552cee59f client create modelfile 2023-11-15 15:16:23 -08:00
Michael Yang
01ea6002c4 replace go-humanize with format.HumanBytes 2023-11-14 14:57:41 -08:00
Jeffrey Morgan
423862042a
treat ollama run model < file as entire prompt, not prompt-per-line (#1126)
Previously, `ollama run` treated a non-terminal stdin (such as `ollama run model < file`) as containing one prompt per line. To run inference on a multi-line prompt, the only non-API workaround was to run `ollama run` interactively and wrap the prompt in `"""..."""`.

Now, `ollama run` treats a non-terminal stdin as containing a single prompt. For example, if `myprompt.txt` is a multi-line file, then `ollama run model < myprompt.txt` would treat `myprompt.txt`'s entire contents as the prompt.

Co-authored-by: Quinn Slack <quinn@slack.org>
2023-11-14 16:42:21 -05:00
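The stdin behavior described in the commit above can be illustrated with a short Go sketch: when stdin is not a terminal, read it in full and treat the whole thing as one prompt. This is a hypothetical illustration of the pattern, not the actual cmd package code.

```go
// Sketch of the behavior in #1126: a non-terminal stdin is read whole and
// used as a single prompt. Illustrative only, not the repository's code.
package main

import (
	"fmt"
	"io"
	"os"

	"golang.org/x/term"
)

func main() {
	if !term.IsTerminal(int(os.Stdin.Fd())) {
		// e.g. `ollama run model < myprompt.txt` or `echo hi | ollama run model`
		data, err := io.ReadAll(os.Stdin)
		if err != nil {
			fmt.Fprintln(os.Stderr, "error reading stdin:", err)
			os.Exit(1)
		}
		prompt := string(data) // the entire input is one prompt, not one per line
		fmt.Printf("would send a single prompt of %d bytes\n", len(prompt))
		return
	}
	fmt.Println("stdin is a terminal; start the interactive REPL instead")
}
```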
Jeffrey Morgan
4e612a2e92
use stdout fd for terminal size (#1125) 2023-11-14 16:09:09 -05:00
Jeffrey Morgan
6e0f686afa --format json should work in interactive mode 2023-11-14 10:22:03 -05:00
Jeffrey Morgan
c1844bbee2
add json mode to cli (#1095) 2023-11-13 21:54:02 -05:00
Michael Yang
bf6786bb39 fix tautology 2023-10-31 20:49:48 -07:00
Bruce MacDonald
f9a4281124
clean up: remove server functions from client (#937) 2023-10-30 11:10:18 -04:00
Jeffrey Morgan
9ec16f0f03 fix formatting when exiting ollama run 2023-10-27 21:26:23 -07:00
Jeffrey Morgan
2d75a4537c close input channel when receiving io.EOF 2023-10-27 20:26:04 -07:00
Patrick Devine
a79f030e75
add bracketed paste mode (#922) 2023-10-26 15:57:00 -07:00
Patrick Devine
deeac961bb
new readline library (#847) 2023-10-25 16:41:18 -07:00
Michael Yang
36c88cb9db cmd: set ExactArgs 2023-10-18 14:40:48 -07:00
Bruce MacDonald
68d7255bd3
show request to server rather than local check (#778) 2023-10-16 17:27:25 -04:00
Bruce MacDonald
a0c3e989de
deprecate modelfile embed command (#759) 2023-10-16 11:07:37 -04:00
Bruce MacDonald
56497663c8
relay model runner error message to client (#720)
* give direction to user when runner fails
* also relay errors from timeout
* increase timeout to 3 minutes
2023-10-12 11:16:37 -04:00
Michael Yang
2cfffea02e handle client proxy 2023-10-09 12:33:47 -07:00
Patrick Devine
61ff1946e6
revise help text (#706) 2023-10-05 11:36:07 -07:00
Alexander F. Rødseth
d104b7e997
Fix go test ./... issue: fmt.Println arg list ends with redundant newline (#705) 2023-10-05 11:11:04 -04:00
Patrick Devine
1852755154
show a default message when license/parameters/system prompt/template aren't specified (#681) 2023-10-02 14:34:52 -07:00
Patrick Devine
99d5161e8a
don't wordwrap when stdout is redirected or piped (#662) 2023-10-02 11:50:55 -07:00
Michael Yang
9333b0cc82
Merge pull request #612 from jmorganca/mxyng/prune-empty-directories
prune empty directories
2023-09-29 11:23:39 -07:00
Patrick Devine
76db4a49cf
allow the user to cancel generating with ctrl-C (#641) 2023-09-28 17:13:01 -07:00
Luc Stepniewski
4aa0976a2e
Added a missing return, preventing a SIGSEGV due to a missing resp (#621)
Co-authored-by: Luc Stepniewski <luc@eclipse-fr.com>
2023-09-28 14:25:22 -07:00
Patrick Devine
92c20fdae6
fix error messages for unknown commands in the repl (#611) 2023-09-28 14:19:45 -07:00
Michael Yang
f40b3de758 use int64 consistently 2023-09-28 11:07:24 -07:00
Michael Yang
8608eb4760 prune empty directories 2023-09-27 10:58:09 -07:00
Michael Yang
0625e805f0 fix model name not matching 2023-09-26 19:50:04 -07:00
Michael Yang
93d887e4bc add painter message for exit 2023-09-25 16:30:22 -07:00
Patrick Devine
b5614f3ebc
fix end-of-line issue with the new prompt (#582) 2023-09-23 17:20:30 -07:00
Jeffrey Morgan
01c44d687e add multi line strings to final prompt 2023-09-23 00:27:24 -04:00
Jeffrey Morgan
e20362e0d5 fix multi line input in ollama run 2023-09-22 23:49:35 -04:00
Patrick Devine
c928ceb927
add word wrapping for lines which are longer than the terminal width (#553) 2023-09-22 13:36:08 -07:00
Patrick Devine
87d9efb364
switch to forked readline lib which doesn't wreck the repl prompt (#578) 2023-09-22 12:17:45 -07:00
Michael Yang
88897a90e4 fix ipv6 parse ip 2023-09-22 10:41:32 -07:00
Michael Yang
6137b12799 validate existence and pull model using api 2023-09-21 09:55:34 -07:00
Michael Yang
9297ff8330 fix OLLAMA_HOST parsing for ip6 2023-09-20 18:52:57 -07:00
Michael Yang
58ffa03d8b fix impossible condition 2023-09-20 11:27:44 -07:00
Michael Yang
a5520bfb42 fix build 2023-09-19 10:42:24 -07:00
Michael Yang
b58d5d16b0 fix mkdir on windows 2023-09-19 09:41:13 -07:00
Patrick Devine
80dd44e80a
Cmd changes (#541) 2023-09-18 12:26:56 -07:00
Patrick Devine
e7e91cd71c
add autoprune to remove unused layers (#491) 2023-09-11 11:46:35 -07:00
Patrick Devine
1adfa67589
tighten up the error string for ollama show flags (#476) 2023-09-06 13:38:49 -07:00
Patrick Devine
790d24eb7b
add show command (#474) 2023-09-06 11:04:17 -07:00
Patrick Devine
8bbff2df98
add model IDs (#439) 2023-08-28 20:50:24 -07:00
Quinn Slack
2ecc3a33c3
delete all models (not just 1st) in ollama rm (#415)
Previously, `ollama rm model1 model2 modelN` would only delete `model1`. The other model command-line arguments would be silently ignored. Now, all models mentioned are deleted.
2023-08-26 00:47:56 -07:00
Michael Yang
9ec7e37534
Merge pull request #392 from jmorganca/mxyng/version
add version
2023-08-22 09:50:25 -07:00
Michael Yang
2c7f956b38 add version 2023-08-22 09:40:58 -07:00
Jeffrey Morgan
a9f6c56652 fix FROM instruction erroring when referring to a file 2023-08-22 09:39:42 -07:00
Ryan Baker
0a892419ad
Strip protocol from model path (#377) 2023-08-21 21:56:56 -07:00
Michael Yang
0ebec07569
Merge pull request #345 from jmorganca/exit-non-zero
set non-zero error code on error
2023-08-16 09:20:28 -07:00
Blake Mizerany
67e593e355
cmd: support OLLAMA_CLIENT_HOST environment variable (#262)
* cmd: support OLLAMA_HOST environment variable

This commit adds support for the OLLAMA_HOST environment
variable. This variable can be used to specify the host to which
the client should connect. This is useful when the client is
running somewhere other than the host where the server is running.

The new api.FromEnv function is used to configure clients from the
environment. Clients that want to stay consistent with how the Ollama CLI
reads this environment variable can use this new function.

* Update api/client.go

Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>

* Update api/client.go

Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>

---------

Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
2023-08-16 11:03:48 -04:00
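As a rough sketch of the pattern this commit describes (not the actual api.FromEnv implementation), a client could derive its base URL from OLLAMA_HOST and fall back to the usual local default; the fallback address and scheme handling below are assumptions.

```go
// Illustration of honoring OLLAMA_HOST when building a client base URL.
// Not the api package's real implementation.
package main

import (
	"fmt"
	"net/url"
	"os"
	"strings"
)

func baseURLFromEnv() (*url.URL, error) {
	host := os.Getenv("OLLAMA_HOST")
	if host == "" {
		host = "127.0.0.1:11434" // Ollama's default listen address
	}
	if !strings.HasPrefix(host, "http://") && !strings.HasPrefix(host, "https://") {
		host = "http://" + host
	}
	return url.Parse(host)
}

func main() {
	u, err := baseURLFromEnv()
	if err != nil {
		fmt.Fprintln(os.Stderr, "invalid OLLAMA_HOST:", err)
		os.Exit(1)
	}
	fmt.Println("client would connect to", u)
}
```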
Michael Yang
76b85bc0e9 set non-zero error code on error 2023-08-14 14:09:58 -07:00
Jeffrey Morgan
1556162c90 create .ollama directory if it doesn't exist 2023-08-11 15:35:55 -07:00
Patrick Devine
9770e3b325
Generate private/public keypair for use w/ auth (#324) 2023-08-11 10:58:23 -07:00
Jeffrey Morgan
7e26a8df31 cmd: use environment variables for server options 2023-08-10 14:17:53 -07:00
Soroush Javadi
bea683e3bf
cmd: check GetBlobsPath error (#317)
The error returned by `server.GetBlobsPath` in `showLayer` was never
checked. Check the error and return if not nil. Also, make newlines at
the end of error messages consistent and fix a typo.
2023-08-10 09:57:49 -07:00
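The fix described above amounts to a standard Go error check on the blobs-path lookup. The sketch below is a hypothetical stand-in (getBlobsPath and showLayer here are not the repository's real server.GetBlobsPath and showLayer), showing the pattern of returning when the lookup fails instead of ignoring the error.

```go
// Stand-in illustration of checking the previously ignored error.
package main

import (
	"errors"
	"fmt"
	"os"
)

func getBlobsPath(digest string) (string, error) {
	if digest == "" {
		return "", errors.New("empty digest")
	}
	return "/tmp/blobs/" + digest, nil
}

func showLayer(digest string) error {
	path, err := getBlobsPath(digest)
	if err != nil { // previously this error was silently dropped
		return fmt.Errorf("couldn't resolve blobs path: %w", err)
	}
	fmt.Println("layer stored at", path)
	return nil
}

func main() {
	if err := showLayer(""); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```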
Jeffrey Morgan
f65169b13e clean up cli flags 2023-08-10 09:28:56 -07:00
Jeffrey Morgan
040a5b9750 clean up cli flags 2023-08-10 09:27:03 -07:00
Bruce MacDonald
a6f6d18f83 embed text document in modelfile 2023-08-08 11:27:17 -04:00
cmiller01
93492f1e18 correct precedence of serve params (args over env over default) 2023-08-07 19:55:20 +00:00
cmiller01
fb593b7bfc pass flags to serve to allow setting allowed-origins + host and port
* resolves: https://github.com/jmorganca/ollama/issues/300 and
https://github.com/jmorganca/ollama/issues/282

* example usage:
```
ollama serve --port 9999 --allowed-origins "http://foo.example.com,http://192.0.0.1"
```
2023-08-07 03:34:37 +00:00
Bruce MacDonald
bd6d741d87
tell users to check the server error logs 2023-08-02 17:08:11 -04:00
Bruce MacDonald
8f8b6288ac
check server is running before running command 2023-08-02 10:51:23 -04:00
Bruce MacDonald
50e87c6691 read from os executable 2023-08-01 16:01:55 -04:00
Bruce MacDonald
40a25bf8c3 pr comments 2023-08-01 13:48:48 -04:00
Jeffrey Morgan
528bafa585 cache loaded model 2023-08-01 11:24:18 -04:00
Bruce MacDonald
36d6081ed1 find symlink of mac app 2023-07-31 17:38:10 -04:00
Bruce MacDonald
e72fe7945f check server is running before running command 2023-07-31 16:25:57 -04:00
Bruce MacDonald
d1c098b038 tell users to check the server error logs 2023-07-31 11:49:33 -04:00
Patrick Devine
39bb25d5f6
allow multiline text using three double-quotes (#239) 2023-07-29 13:35:23 -07:00
Patrick Devine
01d155c969
show system/template/license layers from cmd prompt (#223) 2023-07-27 16:58:40 -07:00
Michael Yang
35af37a2cb session id 2023-07-27 09:31:44 -07:00
Bruce MacDonald
4c1caa3733 download models when creating from modelfile 2023-07-25 14:25:13 -04:00
Bruce MacDonald
12ab8f8f5f Revert "pull model on make if not present locally"
This reverts commit 360a10ace391a674de60aa7b9b8cb65e8074027c.
2023-07-25 14:18:46 -04:00