# llama.cpp Python Bindings

Simple Python bindings for @ggerganov's llama.cpp library.
This package provides:

- Low-level access to the C API via a ctypes interface (see the sketch below)
- A high-level Python API for text completion, inspired by OpenAI's API
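
As a rough illustration of the first point, the `llama_cpp` module exposes llama.cpp's C functions directly through ctypes. The sketch below assumes the upstream function names at the time of writing (`llama_context_default_params`, `llama_init_from_file`, `llama_tokenize`, `llama_free`); the bindings track llama.cpp's header, so the exact names and signatures may differ in your installed version.

```python
import ctypes
import llama_cpp

# Minimal sketch of the low-level ctypes interface. The function names mirror
# llama.cpp's C API and are assumptions here; they change as upstream evolves.
params = llama_cpp.llama_context_default_params()
ctx = llama_cpp.llama_init_from_file(b"models/7B/...", params)  # char* args are passed as bytes

# Array arguments are passed as ctypes arrays.
max_tokens = 64
tokens = (llama_cpp.llama_token * max_tokens)()
n_tokens = llama_cpp.llama_tokenize(
    ctx,
    b"Q: Name the planets in the solar system? A: ",
    tokens,
    max_tokens,
    ctypes.c_bool(True),  # add the BOS token
)
print(list(tokens[:n_tokens]))

llama_cpp.llama_free(ctx)
```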
## Install

```bash
pip install llama-cpp-python
```
Usage
>>> from llama_cpp import Llama
>>> llm = Llama(model_path="models/7B/...")
>>> output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
>>> print(output)
{
"id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"object": "text_completion",
"created": 1679561337,
"model": "models/7B/...",
"choices": [
{
"text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
"index": 0,
"logprobs": None,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 14,
"completion_tokens": 28,
"total_tokens": 42
}
}
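
Depending on the installed version, the high-level API can also stream results: passing `stream=True` to the same call is assumed here to return an iterator of partial completion chunks rather than a single dictionary. A minimal sketch:

```python
from llama_cpp import Llama

llm = Llama(model_path="models/7B/...")

# stream=True (assumed supported by your version) yields chunks as they are generated
for chunk in llm(
    "Q: Name the planets in the solar system? A: ",
    max_tokens=32,
    stop=["Q:", "\n"],
    stream=True,
):
    # each chunk mirrors the shape of the non-streaming response, one text fragment at a time
    print(chunk["choices"][0]["text"], end="", flush=True)
```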