Commit graph

211 commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Andrei Betlen | 670d390001 | Fix ctypes typing issue for Arrays | 2023-03-31 03:20:15 -04:00 |
| Andrei Betlen | 1545b22727 | Fix array type signatures | 2023-03-31 02:08:20 -04:00 |
| Andrei Betlen | 4b9eb5c19e | Add search to mkdocs | 2023-03-31 00:01:53 -04:00 |
| Andrei Betlen | f5e03805f7 | Update llama.cpp | 2023-03-31 00:00:43 -04:00 |
| Andrei Betlen | c928e0afc8 | Formatting | 2023-03-31 00:00:27 -04:00 |
| Andrei Betlen | 8d9560ed66 | Add typing-extensions dependency | 2023-03-30 06:43:31 -04:00 |
| Andrei Betlen | a596362c44 | Add minimum python version, typing-extensions dependency, and long description for PyPI | 2023-03-30 06:42:54 -04:00 |
| Andrei Betlen | 51a92b5146 | Bump version | 2023-03-28 21:10:49 -04:00 |
| Andrei Betlen | 8908f4614c | Update llama.cpp | 2023-03-28 21:10:23 -04:00 |
| Andrei Betlen | ea41474e04 | Add new Llama methods to docs | 2023-03-28 05:04:15 -04:00 |
| Andrei Betlen | f11e9ae939 | Bump version | 2023-03-28 05:00:31 -04:00 |
| Andrei Betlen | 70b8a1ef75 | Add support to get embeddings from high-level api. Closes #4 | 2023-03-28 04:59:54 -04:00 |
| Andrei Betlen | 9ba5c3c3b7 | Bump version | 2023-03-28 04:04:35 -04:00 |
| Andrei Betlen | 3dbb3fd3f6 | Add support for stream parameter. Closes #1 | 2023-03-28 04:03:57 -04:00 |
| Andrei Betlen | 30fc0f3866 | Extract generate method | 2023-03-28 02:42:22 -04:00 |
| Andrei Betlen | 1c823f6d0f | Refactor Llama class and add tokenize / detokenize methods Closes #3 | 2023-03-28 01:45:37 -04:00 |
| Andrei Betlen | 6dbff7679c | Add docs link | 2023-03-27 18:30:12 -04:00 |
| Andrei Betlen | c210635c9b | Update llama.cpp | 2023-03-27 01:35:51 -04:00 |
| Andrei Betlen | 0ea84df91c | Update llama.cpp | 2023-03-26 14:00:37 -04:00 |
| Andrei Betlen | 4250380c0a | Bump version | 2023-03-25 16:26:35 -04:00 |
| Andrei Betlen | 8ae3beda9c | Update Llama to add params | 2023-03-25 16:26:23 -04:00 |
| Andrei Betlen | 4525236214 | Update llama.cpp | 2023-03-25 16:26:03 -04:00 |
| Andrei Betlen | b121b7c05b | Update docstring | 2023-03-25 12:33:18 -04:00 |
| Andrei Betlen | 206efa39df | Bump version | 2023-03-25 12:12:39 -04:00 |
| Andrei Betlen | fa92740a10 | Update llama.cpp | 2023-03-25 12:12:09 -04:00 |
| Andrei Betlen | dfe8608096 | Update examples | 2023-03-24 19:10:31 -04:00 |
| Andrei Betlen | 5533ed7aa8 | Update docs | 2023-03-24 19:02:36 -04:00 |
| Andrei Betlen | cbf8a62b64 | Add repo url | 2023-03-24 18:59:02 -04:00 |
| Andrei Betlen | df15caa877 | Add mkdocs | 2023-03-24 18:57:59 -04:00 |
| Andrei Betlen | a61fd3b509 | Add example based on stripped down version of main.cpp from llama.cpp | 2023-03-24 18:57:25 -04:00 |
| Andrei Betlen | da9b71cfe5 | Bump version | 2023-03-24 18:44:04 -04:00 |
| Andrei Betlen | 4da5faa28b | Bugfix: cross-platform method to find shared lib | 2023-03-24 18:43:29 -04:00 |
| Andrei Betlen | b93675608a | Handle errors returned by llama.cpp | 2023-03-24 15:47:17 -04:00 |
| Andrei Betlen | bcde1f19b7 | Bump version | 2023-03-24 15:00:10 -04:00 |
| Andrei Betlen | 7786edb0f9 | Black formatting | 2023-03-24 14:59:29 -04:00 |
| Andrei Betlen | c784d83131 | Update llama.cpp and re-organize low-level api | 2023-03-24 14:58:42 -04:00 |
| Andrei Betlen | b9c53b88a1 | Use n_ctx provided from actual context not params | 2023-03-24 14:58:10 -04:00 |
| Andrei Betlen | 2cc499512c | Black formatting | 2023-03-24 14:35:41 -04:00 |
| Andrei Betlen | d29b05bb67 | Update example to match alpaca training prompt | 2023-03-24 14:34:15 -04:00 |
| Andrei Betlen | e24c581b5a | Implement prompt batch processing as in main.cpp | 2023-03-24 14:33:38 -04:00 |
| Andrei Betlen | a28cb92d8f | Remove model_name param | 2023-03-24 04:04:29 -04:00 |
| Andrei Betlen | 15e3dc7897 | Add fastapi example | 2023-03-24 01:41:24 -04:00 |
| Andrei Betlen | 281b5cfa6b | Update README.md | 2023-03-24 00:06:24 -04:00 |
| Andrei Betlen | 27a5649cde | Update README.md | 2023-03-23 23:55:42 -04:00 |
| Andrei Betlen | 9af16b63fd | Added low-level api inference example | 2023-03-23 23:45:59 -04:00 |
| Andrei Betlen | 327365149e | Bump Version | 2023-03-23 23:13:08 -04:00 |
| Andrei Betlen | 8680332203 | Update examples | 2023-03-23 23:12:42 -04:00 |
| Andrei Betlen | 2c25257c62 | Update llama.cpp | 2023-03-23 23:00:56 -04:00 |
| Andrei Betlen | 02760f8fa7 | Update llama.cpp and shared library build process | 2023-03-23 17:01:06 -04:00 |
| Andrei Betlen | 90c78723de | Add basic langchain demo | 2023-03-23 16:25:24 -04:00 |