Pavel Frankov
e6b8a139ff
Update README.md ( #2138 )
2024-02-22 10:52:36 -05:00
Jeffrey Morgan
bdc0ea1ba5
Update import.md
2024-02-22 02:08:03 -05:00
Jeffrey Morgan
7fab7918cc
Update import.md
2024-02-22 02:06:24 -05:00
Michael Yang
74c1bdba0d
Merge pull request #2657 from joshyan1/patch-1
Update install.sh success message
2024-02-21 15:55:20 -08:00
Josh
f983ef7f5f
Update install.sh success message
2024-02-21 18:30:01 -05:00
Jeffrey Morgan
1ae1c33651
Windows build + installer adjustments ( #2656 )
* remove `-w -s` linker flags on windows
* use `zip` for windows installer compression
2024-02-21 18:21:26 -05:00
Michael Yang
084d846621
refactor
2024-02-21 13:42:48 -08:00
Michael Yang
6a4b994433
lint
2024-02-21 13:42:48 -08:00
Michael Yang
bea007deb7
use LimitGroup for uploads
2024-02-21 13:42:48 -08:00
Michael Yang
074934be03
adjust group limit based on download speed
2024-02-21 13:42:48 -08:00
Michael Yang
0de12368a0
add new LimitGroup for dynamic concurrency
2024-02-21 13:42:48 -08:00
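The three LimitGroup commits above describe a concurrency limiter for uploads and downloads whose capacity is adjusted at runtime based on observed download speed. A minimal, hypothetical sketch of that idea follows; the LimitGroup name comes from the commit titles, but the fields and methods here are illustrative assumptions, not the actual Ollama implementation.

```go
// Package limitgroup is a hypothetical sketch of a resizable concurrency
// limiter: a semaphore whose capacity can be adjusted while in use, e.g.
// raised when throughput is high and lowered when downloads slow down.
package limitgroup

import "sync"

type LimitGroup struct {
	mu    sync.Mutex
	cond  *sync.Cond
	limit int // current maximum number of concurrent holders
	inUse int // slots currently held
}

func NewLimitGroup(limit int) *LimitGroup {
	lg := &LimitGroup{limit: limit}
	lg.cond = sync.NewCond(&lg.mu)
	return lg
}

// Acquire blocks until a slot is free under the current limit.
func (lg *LimitGroup) Acquire() {
	lg.mu.Lock()
	defer lg.mu.Unlock()
	for lg.inUse >= lg.limit {
		lg.cond.Wait()
	}
	lg.inUse++
}

// Release frees a slot and wakes any waiters.
func (lg *LimitGroup) Release() {
	lg.mu.Lock()
	defer lg.mu.Unlock()
	lg.inUse--
	lg.cond.Broadcast()
}

// SetLimit changes the concurrency limit at runtime.
func (lg *LimitGroup) SetLimit(n int) {
	lg.mu.Lock()
	defer lg.mu.Unlock()
	lg.limit = n
	lg.cond.Broadcast()
}
```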
Michael Yang
917bd61084
refactor download run
2024-02-21 13:42:46 -08:00
Jeffrey Morgan
efe040f8c0
reset with init_vars ahead of each cpu build in gen_windows.ps1 ( #2654 )
2024-02-21 16:35:34 -05:00
Jeffrey Morgan
2a7553ce09
update llama.cpp submodule to c14f72d
2024-02-21 09:03:14 -05:00
Sun Bo
10af6070a9
Update big-AGI config file link ( #2626 )
Co-authored-by: bo.sun <bo.sun@cotticoffee.com>
2024-02-21 01:24:48 -05:00
Jeffrey Morgan
92423b0600
add dist directory in build_windows.ps1
2024-02-21 00:05:05 -05:00
Jeffrey Morgan
b3eac61cac
update llama.cpp submodule to f0d1fafc029a056cd765bdae58dcaa12312e9879
2024-02-20 22:56:51 -05:00
Jeffrey Morgan
287ba11500
better error message when calling /api/generate or /api/chat with embedding models
2024-02-20 21:53:45 -05:00
Jeffrey Morgan
63861f58cc
Support for bert and nomic-bert embedding models
2024-02-20 21:37:29 -05:00
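For context, embedding models such as the bert and nomic-bert architectures mentioned above are typically exercised through Ollama's /api/embeddings endpoint. A small illustrative request against a local server; the model name and prompt are placeholders for whatever embedding model is pulled locally.

```go
// Illustrative request to Ollama's /api/embeddings endpoint; the model name
// and prompt below are placeholders.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	body, _ := json.Marshal(map[string]any{
		"model":  "nomic-embed-text",
		"prompt": "The sky is blue because of Rayleigh scattering",
	})
	resp, err := http.Post("http://localhost:11434/api/embeddings", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out)) // response contains an "embedding" array
}
```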
Jeffrey Morgan
f0425d3de9
Update faq.md
2024-02-20 20:44:45 -05:00
Michael Yang
210b65268e
replace strings buffer with hasher ( #2437 )
the buffered value is going into the hasher eventually, so write directly to the hasher instead
2024-02-20 19:07:50 -05:00
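A minimal sketch of the pattern described in the commit above: hash.Hash implements io.Writer, so each piece can be written straight into the hasher instead of being collected in a strings.Builder first. The strings hashed here are made-up placeholders.

```go
// Write directly into the hasher rather than buffering the pieces and
// hashing the concatenated string afterwards.
package main

import (
	"crypto/sha256"
	"fmt"
)

func main() {
	h := sha256.New()
	// Before: a strings.Builder collected these pieces and the final string
	// was hashed. After: each piece goes directly to the hasher.
	for _, part := range []string{"general.architecture", "llama", "context_length"} {
		h.Write([]byte(part))
	}
	fmt.Printf("%x\n", h.Sum(nil))
}
```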
Michael Yang
949d7b1c48
add gguf file types ( #2532 )
2024-02-20 19:06:29 -05:00
Michael Yang
897b213468
use http.DefaultClient ( #2530 )
the default client already handles proxies
2024-02-20 18:34:47 -05:00
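On the rationale noted above: http.DefaultClient goes through http.DefaultTransport, whose Proxy field is http.ProxyFromEnvironment, so HTTP_PROXY, HTTPS_PROXY, and NO_PROXY are honored without constructing a custom client. A small illustrative example; the URL is a placeholder.

```go
// http.DefaultClient picks up proxy settings from the environment via
// http.DefaultTransport, so no custom transport is needed just for proxies.
package main

import (
	"fmt"
	"net/http"
)

func main() {
	resp, err := http.DefaultClient.Get("https://example.com/")
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```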
Jeffrey Morgan
4613a080e7
update llama.cpp submodule to 66c1968f7 ( #2618 )
2024-02-20 17:42:31 -05:00
Muhammed Nazeem
ace2cdf1c6
Add Page Assist to the community integrations ( #2447 )
2024-02-20 14:03:58 -05:00
Nikesh Parajuli
eed92bc19a
docs: add Msty app in readme ( #1775 )
* docs: add Msty app in readme
* docs: update msty url
2024-02-20 14:03:33 -05:00
Michael Edoror
e0a2f46466
Update README.md to include Elixir LangChain Library ( #2180 )
The Elixir LangChain Library now supports Ollama Chat with this [PR](https://github.com/brainlid/langchain/pull/70)
2024-02-20 14:03:02 -05:00
Taras Tsugrii
01ff2e14db
[nit] Remove unused msg local var. ( #2511 )
2024-02-20 14:02:34 -05:00
BADR
199e79ec0c
docs: add tenere to terminal clients ( #2329 )
2024-02-19 23:13:03 -05:00
Jeffrey Morgan
8125ce4cb6
Update import.md
Add instructions to get the public key on Windows
2024-02-19 22:48:24 -05:00
Daniel
636d6eea99
Add ShellOracle to community terminal integrations ( #1767 )
2024-02-19 22:18:05 -05:00
Jeffrey Morgan
df56f1ee5e
Update faq.md
2024-02-19 22:16:42 -05:00
Jean-Baptiste Detroyes
0b6c6c9092
feat: add Helm Chart link to Package managers list ( #1673 )
2024-02-19 22:05:14 -05:00
Jakob Hoeg Mørk
cb60389de7
NextJS web interface for Ollama ( #2466 )
2024-02-19 21:57:36 -05:00
lulz
ce0c95d097
[fix] /bye and /exit are now treated as prefixes ( #2381 )
* [fix] /bye and /exit are now treated as prefixes instead of entire lines, since whole-line matching didn't align with how the rest of the commands are treated
* Update cmd/interactive.go: fix whitespace
---------
Co-authored-by: Jeffrey Morgan <jmorganca@gmail.com>
2024-02-19 21:56:49 -05:00
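A hypothetical sketch of the prefix-based matching described in the commit above; this is not the actual cmd/interactive.go code, just an illustration of treating /bye and /exit as prefixes rather than whole-line matches.

```go
// Treat /bye and /exit as prefixes, consistent with how other slash
// commands are dispatched.
package main

import (
	"fmt"
	"strings"
)

func isExitCommand(line string) bool {
	line = strings.TrimSpace(line)
	return strings.HasPrefix(line, "/bye") || strings.HasPrefix(line, "/exit")
}

func main() {
	for _, input := range []string{"/bye", "/exit please", "say /bye later"} {
		fmt.Printf("%-16q exit=%v\n", input, isExitCommand(input))
	}
}
```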
Eddú Meléndez Gonzales
a9bc1e1c37
Add LangChain4J ( #2164 )
2024-02-19 21:17:32 -05:00
Branislav Gerazov
62c71f4cb1
add ollama-chat.nvim ( #2188 )
2024-02-19 21:14:29 -05:00
Jeffrey Morgan
41aca5c2d0
Update faq.md
2024-02-19 21:11:01 -05:00
Jeffrey Morgan
753724d867
Update api.md to include examples for reproducible outputs
2024-02-19 20:36:16 -05:00
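The reproducible-output examples referenced above come down to pinning a seed and setting the temperature to 0 in the request options so repeated requests yield the same completion. An illustrative client call against a local server; the model name and prompt are placeholders.

```go
// Illustrative /api/generate request with a fixed seed and zero temperature
// for reproducible outputs. Model name and prompt are placeholders.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	body, _ := json.Marshal(map[string]any{
		"model":  "mistral",
		"prompt": "Why is the sky blue?",
		"stream": false,
		"options": map[string]any{
			"seed":        42,
			"temperature": 0,
		},
	})
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```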
Jeffrey Morgan
e4576c2ee1
Update README.md
2024-02-19 20:15:24 -05:00
Patrick Devine
9a7a4b9533
add faqs for memory pre-loading and the keep_alive setting ( #2601 )
2024-02-19 14:45:25 -08:00
Daniel Hiltgen
2653191222
Merge pull request #2600 from dhiltgen/refined_win_docs
Document setting server vars for windows
2024-02-19 13:46:37 -08:00
Daniel Hiltgen
b338c0635f
Document setting server vars for windows
2024-02-19 13:30:46 -08:00
Daniel Hiltgen
4fcbf1cde6
Merge pull request #2599 from dhiltgen/fix_avx
Explicitly disable AVX2 on GPU builds
2024-02-19 13:13:05 -08:00
Daniel Hiltgen
9220b4fa91
Merge pull request #2585 from dhiltgen/cuda_leaks
Fix cuda leaks
2024-02-19 12:48:00 -08:00
Daniel Hiltgen
fc39a6cd7a
Fix cuda leaks
This should resolve the problem where we don't fully unload from the GPU
when we go idle.
2024-02-18 18:37:20 -08:00
Justin Hayes
1e23e82324
Update Web UI link to new project name ( #2563 )
Ollama WebUI is now known as Open WebUI.
2024-02-17 20:05:20 -08:00
Daniel Hiltgen
f9fd08040b
Merge pull request #2552 from dhiltgen/dup_update_menus
Fix duplicate menus on update and exit on signals
2024-02-16 17:23:37 -08:00
Daniel Hiltgen
4318e35ee3
Merge pull request #2553 from dhiltgen/amdgpu_version
Harden AMD driver lookup logic
2024-02-16 17:23:12 -08:00
Daniel Hiltgen
9754c6d9d8
Harden AMD driver lookup logic
It looks like the version file doesn't exist on older(?) drivers
2024-02-16 16:20:16 -08:00