Andrei Betlen | c0a5c0171f | Add embed back into documentation | 2023-04-03 18:53:00 -04:00
Andrei Betlen | adf656d542 | Bump version | 2023-04-03 18:46:49 -04:00
Andrei Betlen | ae004eb69e | Fix #16 | 2023-04-03 18:46:19 -04:00
Andrei Betlen | 7d1977e8f0 | Bump version | 2023-04-03 14:49:36 -04:00
Andrei Betlen | 4530197629 | Update llama.cpp | 2023-04-03 14:49:07 -04:00
Andrei | 1d9a988644 | Merge pull request #10 from MillionthOdin16/patch-1: Improve Shared Library Loading Mechanism | 2023-04-03 14:47:11 -04:00
MillionthOdin16 | a0758f0077 | Update llama_cpp.py with PR requests: lib_base_name and load_shared_library to _lib_base_name and _load_shared_library | 2023-04-03 13:06:50 -04:00
MillionthOdin16 | a40476e299 | Update llama_cpp.py: Make shared library code more robust with some platform specific functionality and more descriptive errors when failures occur | 2023-04-02 21:50:13 -04:00
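
The shared-library work merged in PR #10 above describes platform-specific loading with clearer error messages, using the renamed `_load_shared_library` helper. The following is only a minimal sketch of that pattern; the search path, suffix list, and error text are illustrative assumptions, not the project's exact implementation.

```python
import ctypes
import pathlib
import sys

def _load_shared_library(lib_base_name: str) -> ctypes.CDLL:
    # Pick the platform-specific shared-library suffixes to try.
    if sys.platform.startswith("linux"):
        suffixes = [".so"]
    elif sys.platform == "darwin":
        suffixes = [".so", ".dylib"]
    elif sys.platform == "win32":
        suffixes = [".dll"]
    else:
        raise RuntimeError("Unsupported platform")

    # Look for the library next to this module (illustrative search path).
    base_path = pathlib.Path(__file__).parent
    for suffix in suffixes:
        lib_path = base_path / f"lib{lib_base_name}{suffix}"
        if lib_path.exists():
            try:
                return ctypes.CDLL(str(lib_path))
            except Exception as exc:
                raise RuntimeError(
                    f"Failed to load shared library '{lib_path}': {exc}"
                )

    raise FileNotFoundError(
        f"Shared library with base name '{lib_base_name}' not found"
    )

# Hypothetical usage: _lib = _load_shared_library("llama")
```
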
Andrei Betlen | b9a4513363 | Update README | 2023-04-02 21:03:39 -04:00
Andrei Betlen | 7284adcaa8 | Bump version | 2023-04-02 13:36:07 -04:00
Andrei Betlen | 1ed8cd023d | Update llama_cpp and add kv_cache api support | 2023-04-02 13:33:49 -04:00
Andrei Betlen | 74061b209d | Bump version | 2023-04-02 03:59:47 -04:00
Andrei Betlen | 4f509b963e | Bugfix: Stop sequences and missing max_tokens check | 2023-04-02 03:59:19 -04:00
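
The stop-sequence and max_tokens bugfix in 4f509b963e concerns the high-level completion call. A hedged usage sketch of those two parameters, assuming the `Llama` high-level API with a placeholder model path:

```python
from llama_cpp import Llama

# Model path is a placeholder; point it at a local ggml model file.
llm = Llama(model_path="./models/ggml-model.bin")

# max_tokens caps the number of generated tokens; stop sequences end
# generation as soon as one of them appears in the output.
output = llm(
    "Q: Name the planets in the solar system? A: ",
    max_tokens=32,
    stop=["Q:", "\n"],
    echo=True,
)
print(output["choices"][0]["text"])
```
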
Andrei Betlen | 42dd11c2b4 | Bump version | 2023-04-02 00:10:46 -04:00
Andrei Betlen | 2bc184dc63 | Add new methods to docs | 2023-04-02 00:09:51 -04:00
Andrei Betlen | 353e18a781 | Move workaround to new sample method | 2023-04-02 00:06:34 -04:00
Andrei Betlen | a4a1bbeaa9 | Update api to allow for easier interactive mode | 2023-04-02 00:02:47 -04:00
Andrei Betlen | eef627c09c | Fix example documentation | 2023-04-01 17:39:35 -04:00
Andrei Betlen | a836639822 | Bump version | 2023-04-01 17:37:05 -04:00
Andrei Betlen | 1e4346307c | Add documentation for generate method | 2023-04-01 17:36:30 -04:00
Andrei Betlen | 33f1529c50 | Bump version | 2023-04-01 17:30:47 -04:00
Andrei Betlen | f14a31c936 | Document generate method | 2023-04-01 17:29:43 -04:00
Andrei Betlen | 67c70cc8eb | Add static methods for beginning and end of sequence tokens. | 2023-04-01 17:29:30 -04:00
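
The commits documenting the `generate` method (1e4346307c, f14a31c936) and adding static beginning/end-of-sequence token methods (67c70cc8eb) suggest a lower-level loop like the sketch below. The sampling keyword names and defaults are assumptions based on the API of that period, and the model path is a placeholder.

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path

# Tokenize a prompt, then stream tokens from generate() until the
# end-of-sequence token (static method added in 67c70cc8eb) appears.
tokens = llm.tokenize(b"One plus one equals")
for token in llm.generate(tokens, top_k=40, top_p=0.95, temp=0.8, repeat_penalty=1.1):
    if token == Llama.token_eos():
        break
    print(llm.detokenize([token]).decode("utf-8", errors="ignore"), end="", flush=True)
print()
```
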
Andrei Betlen | caff127836 | Remove commented out code | 2023-04-01 15:13:01 -04:00
Andrei Betlen | f28bf3f13d | Bugfix: enable embeddings for fastapi server | 2023-04-01 15:12:25 -04:00
Andrei Betlen | c25b7dfc86 | Bump version | 2023-04-01 13:06:05 -04:00
Andrei Betlen | ed6f2a049e | Add streaming and embedding endpoints to fastapi example | 2023-04-01 13:05:20 -04:00
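
Commit ed6f2a049e adds streaming and embedding endpoints to the fastapi example. A client-side sketch of calling such a server follows; the host/port and the `/v1/completions` and `/v1/embeddings` paths are assumptions about how the example is typically run, not values documented in this log.

```python
import requests  # third-party: pip install requests

BASE = "http://localhost:8000"  # assumed default host/port for the example server

# Embedding request (assumed OpenAI-style endpoint path).
emb = requests.post(f"{BASE}/v1/embeddings", json={"input": "Hello, world"}).json()
print(len(emb["data"][0]["embedding"]), "dimensions")

# Streaming completion: with stream=True the server sends incremental chunks.
with requests.post(
    f"{BASE}/v1/completions",
    json={"prompt": "The capital of France is", "max_tokens": 16, "stream": True},
    stream=True,
) as resp:
    for line in resp.iter_lines():
        if line:
            print(line.decode("utf-8"))
```
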
Andrei Betlen | 0503e7f9b4 | Update api | 2023-04-01 13:04:12 -04:00
Andrei Betlen | 9f975ac44c | Add development section | 2023-04-01 13:03:56 -04:00
Andrei Betlen | 9fac0334b2 | Update embedding example to new api | 2023-04-01 13:02:51 -04:00
Andrei Betlen | 5e011145c5 | Update low level api example | 2023-04-01 13:02:10 -04:00
Andrei Betlen | 5f2e822b59 | Rename inference example | 2023-04-01 13:01:45 -04:00
Andrei Betlen | 318eae237e | Update high-level api | 2023-04-01 13:01:27 -04:00
Andrei Betlen | 3af274cbd4 | Update llama.cpp | 2023-04-01 13:00:09 -04:00
Andrei Betlen | 69e7d9f60e | Add type definitions | 2023-04-01 12:59:58 -04:00
Andrei Betlen | 49c8df369a | Fix type signature of token_to_str | 2023-03-31 03:25:12 -04:00
Andrei Betlen | 670d390001 | Fix ctypes typing issue for Arrays | 2023-03-31 03:20:15 -04:00
Andrei Betlen | 1545b22727 | Fix array type signatures | 2023-03-31 02:08:20 -04:00
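
The three typing fixes above (49c8df369a, 670d390001, 1545b22727) concern how array and pointer parameters are declared for the ctypes bindings. The sketch below shows the generic pattern only; the commented-out function signature is illustrative and not the actual llama.cpp declaration.

```python
import ctypes

# A llama token is a plain C int in these bindings.
llama_token = ctypes.c_int
llama_token_p = ctypes.POINTER(llama_token)

# Declaring argtypes with POINTER(...) lets callers pass either a ctypes
# array or a pointer, which is the usual fix for Array typing issues, e.g.:
#   _lib.llama_tokenize.argtypes = [
#       ctypes.c_void_p, ctypes.c_char_p, llama_token_p,
#       ctypes.c_int, ctypes.c_bool,
#   ]
#   _lib.llama_tokenize.restype = ctypes.c_int

# Building a fixed-size token buffer to pass to such a function:
n_max_tokens = 64
tokens = (llama_token * n_max_tokens)()
print(ctypes.sizeof(tokens), "bytes reserved for", n_max_tokens, "tokens")
```
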
Andrei Betlen | 4b9eb5c19e | Add search to mkdocs | 2023-03-31 00:01:53 -04:00
Andrei Betlen | f5e03805f7 | Update llama.cpp | 2023-03-31 00:00:43 -04:00
Andrei Betlen | c928e0afc8 | Formatting | 2023-03-31 00:00:27 -04:00
Andrei Betlen | 8d9560ed66 | Add typing-extensions dependency | 2023-03-30 06:43:31 -04:00
Andrei Betlen | a596362c44 | Add minimum python version, typing-extensions dependency, and long description for PyPI | 2023-03-30 06:42:54 -04:00
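
The packaging commits above (8d9560ed66, a596362c44) add a minimum Python version, the typing-extensions dependency, and a long description for PyPI. A minimal setup.py sketch of those three fields; the name, version, and version bounds are placeholders rather than the project's actual values.

```python
from setuptools import setup

# Placeholder metadata; only python_requires, install_requires, and the
# long_description fields correspond to the commits above.
setup(
    name="llama_cpp_python",
    version="0.1.0",
    description="Python bindings for llama.cpp",
    long_description=open("README.md", encoding="utf-8").read(),
    long_description_content_type="text/markdown",
    python_requires=">=3.7",
    install_requires=["typing-extensions>=4.0.0"],
)
```
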
Andrei Betlen | 51a92b5146 | Bump version | 2023-03-28 21:10:49 -04:00
Andrei Betlen | 8908f4614c | Update llama.cpp | 2023-03-28 21:10:23 -04:00
Andrei Betlen | ea41474e04 | Add new Llama methods to docs | 2023-03-28 05:04:15 -04:00
Andrei Betlen | f11e9ae939 | Bump version | 2023-03-28 05:00:31 -04:00
Andrei Betlen | 70b8a1ef75 | Add support to get embeddings from high-level api. Closes #4 | 2023-03-28 04:59:54 -04:00
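
Commit 70b8a1ef75 adds embeddings to the high-level API (issue #4). A short usage sketch: the model path is a placeholder, and the `embedding=True` constructor flag and `create_embedding` method name follow later versions of the library, so treat them as assumptions here.

```python
from llama_cpp import Llama

# Placeholder model path; embedding=True enables embedding output.
llm = Llama(model_path="./models/ggml-model.bin", embedding=True)

# OpenAI-style response: the vector lives under data[0].embedding.
result = llm.create_embedding("Hello, world!")
vector = result["data"][0]["embedding"]
print(len(vector), "dimensions")
```
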
Andrei Betlen | 9ba5c3c3b7 | Bump version | 2023-03-28 04:04:35 -04:00
Andrei Betlen | 3dbb3fd3f6 | Add support for stream parameter. Closes #1 | 2023-03-28 04:03:57 -04:00
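
The final entry, 3dbb3fd3f6, adds the stream parameter (issue #1). A hedged sketch of iterating over streamed chunks with the high-level call, again using a placeholder model path:

```python
from llama_cpp import Llama

llm = Llama(model_path="./models/ggml-model.bin")  # placeholder path

# With stream=True the call returns an iterator of partial completions
# instead of a single response object.
for chunk in llm("Once upon a time", max_tokens=48, stream=True):
    print(chunk["choices"][0]["text"], end="", flush=True)
print()
```
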