Bruce MacDonald
7e8f7c8358
remove ggml automatic re-pull (#1856)
2024-01-08 14:41:01 -05:00
Daniel Hiltgen
e0d05b0f1e
Accept Windows paths for image processing
...
This enhances our regex to support Windows-style paths. The regex can
match invalid path specifications, but we still validate file
existence and filter out mismatches.
2024-01-06 10:50:27 -08:00
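A minimal sketch of this approach, assuming a deliberately permissive pattern plus an existence check; the regex, file extensions, and function name below are illustrative and not ollama's actual implementation:

```go
package main

import (
	"fmt"
	"os"
	"regexp"
)

// imagePathPattern is intentionally permissive: it matches Unix paths
// (./dog.jpg, /home/u/cat.png) as well as Windows drive paths
// (C:\images\cat.png). Anything it over-matches is filtered out below.
var imagePathPattern = regexp.MustCompile(`(?:[a-zA-Z]:)?[\\/]?(?:[\w.\-]+[\\/])*[\w.\-]+\.(?:png|jpe?g)`)

// extractImagePaths keeps only the candidates that point at real files,
// mirroring the "match loosely, then validate existence" idea above.
func extractImagePaths(prompt string) []string {
	var paths []string
	for _, candidate := range imagePathPattern.FindAllString(prompt, -1) {
		if info, err := os.Stat(candidate); err == nil && !info.IsDir() {
			paths = append(paths, candidate)
		}
	}
	return paths
}

func main() {
	// Only paths that exist on this machine survive the filter, so this
	// prints an empty slice unless the files are actually present.
	fmt.Println(extractImagePaths(`describe C:\images\cat.png and ./dog.jpg`))
}
```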
Bruce MacDonald
3a9f447141
only pull gguf model if already exists (#1817)
2024-01-05 18:50:00 -05:00
Patrick Devine
9c2941e61b
switch api for ShowRequest to use the name field (#1816)
2024-01-05 15:06:43 -08:00
Bruce MacDonald
4f4980b66b
simplify ggml update logic (#1814)
...
- additional information is now available in the show response; use this to pull gguf before running
- make gguf updates cancellable
2024-01-05 15:22:32 -05:00
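A rough sketch of how a client might use the richer show response mentioned above: query /api/show with the name field (per #1816) and inspect the reported model format before deciding whether a gguf pull is needed. The details.format field and the payload shape are assumptions based on these commit notes, not a verified API contract:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// showResponse models only the part of the /api/show reply this sketch
// cares about; "details.format" is assumed to report "gguf" or "ggml".
type showResponse struct {
	Details struct {
		Format string `json:"format"`
	} `json:"details"`
}

func modelFormat(name string) (string, error) {
	// Per #1816, the show request identifies the model via the "name" field.
	body, _ := json.Marshal(map[string]string{"name": name})
	resp, err := http.Post("http://localhost:11434/api/show", "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var show showResponse
	if err := json.NewDecoder(resp.Body).Decode(&show); err != nil {
		return "", err
	}
	return show.Details.Format, nil
}

func main() {
	format, err := modelFormat("llama2")
	if err != nil {
		fmt.Println("show failed:", err)
		return
	}
	// A client could use this to decide whether to pull a gguf build
	// before running, as described in #1814.
	fmt.Println("model format:", format)
}
```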
Patrick Devine
22e93efa41
add show info command and fix the modelfile
2024-01-05 12:20:05 -08:00
Patrick Devine
2909dce894
split up interactive generation
2024-01-05 12:20:05 -08:00
Patrick Devine
d0409f772f
keyboard shortcut help (#1764)
2024-01-02 18:04:12 -08:00
Daniel Hiltgen
96fb441abd
Merge pull request #1146 from dhiltgen/ext_server_cgo
...
Add cgo implementation for llama.cpp
2023-12-22 08:16:31 -08:00
Bruce MacDonald
fabf2f3467
allow for starting llava queries with filepath (#1549)
2023-12-21 13:20:59 -05:00
Bruce MacDonald
811b1f03c8
deprecate ggml
...
- remove ggml runner
- automatically pull gguf models when ggml detected
- tell users to update to gguf in case the automatic pull fails
Co-Authored-By: Jeffrey Morgan <jmorganca@gmail.com>
2023-12-19 09:05:46 -08:00
Bruce MacDonald
1b417a7836
use exp slices for go 1.20 compatibility (#1544)
2023-12-15 14:15:56 -05:00
Patrick Devine
630518f0d9
Add unit test of API routes (#1528)
2023-12-14 16:47:40 -08:00
Jeffrey Morgan
4a1abfe4fa
fix tests
2023-12-13 14:42:30 -05:00
Jeffrey Morgan
0a9d348023
Fix issues with /set template and /set system (#1486)
2023-12-12 14:43:19 -05:00
Patrick Devine
910e9401d0
Multimodal support (#1216)
...
---------
Co-authored-by: Matt Apperson <mattapperson@Matts-MacBook-Pro.local>
2023-12-11 13:56:22 -08:00
Michael Yang
16c7548460
fix redundant newline
2023-12-07 13:44:45 -08:00
Michael Yang
4b77fcb2b9
comments
2023-12-05 09:43:50 -08:00
Michael Yang
cde13bcdea
cmd: only print server version when different
2023-12-05 09:36:01 -08:00
Michael Yang
0f0cd265a7
cmd: add server version
2023-12-05 09:36:01 -08:00
Michael Yang
5c59455b59
cmd: use existing cmd context
2023-12-05 09:36:01 -08:00
Patrick Devine
bf704423c5
revert cli to use /api/generate (#1383)
2023-12-04 16:35:29 -08:00
Bruce MacDonald
7a0899d62d
chat api (#991)
...
- update chat docs
- add messages chat endpoint
- remove deprecated context and template generate parameters from docs
- context and template are still supported for the time being and will continue to work as expected
- add partial response to chat history
2023-12-04 18:01:06 -05:00
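A minimal sketch of calling the messages chat endpoint added in #991, assuming the documented /api/chat shape (a model plus a list of role/content messages, streamed back as one JSON object per line); the host and model name are placeholders:

```go
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatChunk struct {
	Message chatMessage `json:"message"`
	Done    bool        `json:"done"`
}

func main() {
	// Assumes an ollama server on the default port with "llama2" pulled.
	reqBody, _ := json.Marshal(map[string]any{
		"model":    "llama2",
		"messages": []chatMessage{{Role: "user", Content: "Why is the sky blue?"}},
	})
	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The endpoint streams one JSON object per line; collect the partial
	// assistant message pieces into the full response.
	scanner := bufio.NewScanner(resp.Body)
	var full string
	for scanner.Scan() {
		var chunk chatChunk
		if err := json.Unmarshal(scanner.Bytes(), &chunk); err != nil {
			continue
		}
		full += chunk.Message.Content
		if chunk.Done {
			break
		}
	}
	fmt.Println(full)
}
```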
Patrick Devine
2113c9d31a
make linewrap still work when the terminal width has changed (#1350)
2023-12-04 14:14:56 -08:00
Patrick Devine
6681d37861
allow setting the system and template for prompts in the repl (#1335)
2023-12-01 09:28:35 -08:00
Jeffrey Morgan
5687f1a0cf
fix unexpected end of response errors when cancelling in ollama run
2023-11-30 00:30:21 -05:00
Patrick Devine
cde31cb220
Allow setting parameters in the REPL (#1294)
2023-11-29 09:56:42 -08:00
Jeffrey Morgan
9fb5e8399c
Fix issues with inputting and formatting multi-line strings in ollama run
...
Co-authored-by: Wen Sun <iwendellsun@gmail.com>
2023-11-26 12:54:29 -05:00
Jeffrey Morgan
df07e4a097
remove redundant filename parameter (#1213)
2023-11-20 17:05:36 -05:00
Bruce MacDonald
31ab453d37
resolve FROM path before sending modelfile (#1211)
2023-11-20 16:43:48 -05:00
Jeffrey Morgan
6066c70edd
restore progress messages for older endpoints
2023-11-20 11:37:17 -05:00
Jeffrey Morgan
6bbd6e26fb
fix temporary newline created and removed with spinner in ollama run
2023-11-20 00:49:08 -05:00
Jeffrey Morgan
c06b9b7304
update progress rendering to be closer to v0.1.10
2023-11-19 13:43:21 -05:00
Jeffrey Morgan
984714f131
update status text when transferring blob on ollama create
2023-11-18 09:40:10 -05:00
Michael Yang
976068369b
stop all spinners on progress stop
2023-11-17 10:06:19 -08:00
Michael Yang
4dcf7a59b1
generate progress
2023-11-17 10:06:19 -08:00
Michael Yang
1c0e092ead
progress cmd
2023-11-17 10:06:19 -08:00
Michael Yang
f91bb2f7f0
remove progressbar
2023-11-17 10:06:19 -08:00
Michael Yang
1901044b07
use checksum reference
2023-11-15 15:16:23 -08:00
Michael Yang
d660eebf22
fix create from model tag
2023-11-15 15:16:23 -08:00
Michael Yang
1552cee59f
client create modelfile
2023-11-15 15:16:23 -08:00
Michael Yang
01ea6002c4
replace go-humanize with format.HumanBytes
2023-11-14 14:57:41 -08:00
Jeffrey Morgan
423862042a
treat ollama run model < file as entire prompt, not prompt-per-line (#1126)
...
Previously, `ollama run` treated a non-terminal stdin (such as `ollama run model < file`) as containing one prompt per line. To run inference on a multi-line prompt, the only non-API workaround was to run `ollama run` interactively and wrap the prompt in `"""..."""`.
Now, `ollama run` treats a non-terminal stdin as containing a single prompt. For example, if `myprompt.txt` is a multi-line file, then `ollama run model < myprompt.txt` would treat `myprompt.txt`'s entire contents as the prompt.
Co-authored-by: Quinn Slack <quinn@slack.org>
2023-11-14 16:42:21 -05:00
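A small sketch of the behaviour described above, assuming golang.org/x/term for the terminal check: when stdin is not a TTY, the whole input is read as one prompt instead of one prompt per line.

```go
package main

import (
	"fmt"
	"io"
	"os"

	"golang.org/x/term"
)

func main() {
	// When stdin is not a terminal (e.g. `myprog < myprompt.txt`), read
	// everything as a single prompt rather than treating each line separately.
	if !term.IsTerminal(int(os.Stdin.Fd())) {
		data, err := io.ReadAll(os.Stdin)
		if err != nil {
			fmt.Fprintln(os.Stderr, "reading stdin:", err)
			os.Exit(1)
		}
		prompt := string(data)
		fmt.Printf("would send a single %d-byte prompt\n", len(prompt))
		return
	}
	fmt.Println(`interactive mode: prompt per line (or """...""" blocks)`)
}
```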
Jeffrey Morgan
4e612a2e92
use stdout fd for terminal size (#1125)
2023-11-14 16:09:09 -05:00
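A sketch of the idea behind #1125, assuming golang.org/x/term: querying the size from stdout keeps line wrapping correct even when stdin is redirected from a file.

```go
package main

import (
	"fmt"
	"os"

	"golang.org/x/term"
)

func main() {
	// Query the terminal size from stdout rather than stdin, so the width
	// is still correct when stdin is redirected (e.g. `prog < file`).
	width, height, err := term.GetSize(int(os.Stdout.Fd()))
	if err != nil {
		// Not a terminal at all; fall back to a conventional default.
		width, height = 80, 24
	}
	fmt.Printf("terminal is %d columns x %d rows\n", width, height)
}
```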
Jeffrey Morgan
6e0f686afa
--format json should work in interactive mode
2023-11-14 10:22:03 -05:00
Jeffrey Morgan
c1844bbee2
add json mode to cli (#1095)
2023-11-13 21:54:02 -05:00
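A minimal sketch of what JSON mode looks like against /api/generate, assuming the documented "format" and "stream" request fields; the host, model name, and prompt are placeholders:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type generateResponse struct {
	Response string `json:"response"`
}

func main() {
	// "format": "json" constrains the model output to valid JSON, which is
	// what the --format json CLI flag exposes.
	reqBody, _ := json.Marshal(map[string]any{
		"model":  "llama2",
		"prompt": "List three primary colors as a JSON array named colors.",
		"format": "json",
		"stream": false,
	})
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out generateResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Response) // the raw JSON string produced by the model
}
```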
Michael Yang
bf6786bb39
fix tautology
2023-10-31 20:49:48 -07:00
Bruce MacDonald
f9a4281124
clean up: remove server functions from client (#937)
2023-10-30 11:10:18 -04:00
Jeffrey Morgan
9ec16f0f03
fix formatting when exiting ollama run
2023-10-27 21:26:23 -07:00
Jeffrey Morgan
2d75a4537c
close input channel when receiving io.EOF
2023-10-27 20:26:04 -07:00