
llama.cpp Python Bindings

Simple Python bindings for @ggerganov's llama.cpp library. This package provides:

  • Low-level access to the C API via a ctypes interface
  • A high-level Python API for text completion, inspired by OpenAI's API
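The low-level bindings use Python's standard ctypes module to call exported C functions directly. As a hedged illustration of the mechanism (not the package's actual binding code, which lives in the llama_cpp module), the same pattern can be demonstrated against the C runtime, which is already loaded in any POSIX Python process:

```python
import ctypes

# Illustrative sketch of how ctypes bindings work: declare the C
# function's argument and return types, then call it from Python.
# The real llama.cpp bindings do the same against libllama.
libc = ctypes.CDLL(None)  # handle to symbols in the current process (POSIX)
libc.strlen.restype = ctypes.c_size_t
libc.strlen.argtypes = [ctypes.c_char_p]

print(libc.strlen(b"llama"))  # -> 5
```

The package declares `argtypes`/`restype` for each llama.cpp function in the same way, so calls are type-checked at the Python/C boundary.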

Install

pip install llama-cpp-python

Usage

>>> from llama_cpp import Llama
>>> llm = Llama(model_path="models/7B/...")
>>> output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
>>> print(output)
{
  "id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "object": "text_completion",
  "created": 1679561337,
  "model": "models/7B/...",
  "choices": [
    {
      "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
      "index": 0,
      "logprobs": None,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 14,
    "completion_tokens": 28,
    "total_tokens": 42
  }
}
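Because the response mirrors OpenAI's completion format, a chunk-consuming helper is straightforward. A minimal sketch, assuming the high-level API can also yield chunks incrementally (e.g. via a `stream=True` parameter); the stream below is a stand-in generator so the chunk shape is visible without loading a model:

```python
def consume_stream(stream):
    """Concatenate the text field of each streamed completion chunk."""
    return "".join(chunk["choices"][0]["text"] for chunk in stream)

# Stand-in for what llm(prompt, stream=True) would yield, chunk by chunk:
fake_stream = iter([
    {"choices": [{"text": "Mercury, "}]},
    {"choices": [{"text": "Venus"}]},
])
print(consume_stream(fake_stream))  # -> Mercury, Venus
```

Each chunk carries the same `choices[i].text` structure as the full response above, so the same accessor works for both streamed and non-streamed output.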