# 🦙 Python Bindings for `llama.cpp`
[![PyPI](https://img.shields.io/pypi/v/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
[![PyPI - License](https://img.shields.io/pypi/l/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)

Simple Python bindings for **@ggerganov's** [`llama.cpp`](https://github.com/ggerganov/llama.cpp) library.
This package provides:
- Low-level access to the C API via the `ctypes` interface (see the sketch below)
- High-level Python API for text completion
- OpenAI-like API
- LangChain compatibility
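
As a rough illustration of that `ctypes` layer, the sketch below loads and frees a context through `llama_cpp.llama_cpp`. The function names mirror the `llama.cpp` C API and have changed across versions, so treat this as illustrative and check the module itself for the exact signatures in your installed version.

```python
from llama_cpp import llama_cpp

# The low-level bindings mirror the llama.cpp C API via ctypes; names and
# signatures track upstream llama.cpp and can change between releases.
params = llama_cpp.llama_context_default_params()

# "models/7B/..." is a placeholder; the C API expects a bytes path.
ctx = llama_cpp.llama_init_from_file(b"models/7B/...", params)

# ... llama_tokenize / llama_eval / sampling calls would go here ...

llama_cpp.llama_free(ctx)
```
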
## Installation
Install from PyPI:
```bash
pip install llama-cpp-python
```
## Usage
```python
>>> from llama_cpp import Llama
>>> llm = Llama(model_path="models/7B/...")
>>> output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
>>> print(output)
{
  "id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "object": "text_completion",
  "created": 1679561337,
  "model": "models/7B/...",
  "choices": [
    {
      "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
      "index": 0,
      "logprobs": None,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 14,
    "completion_tokens": 28,
    "total_tokens": 42
  }
}
```
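
The returned completion follows the OpenAI response shape, so the generated text itself lives under `choices`. A minimal follow-on sketch, reusing the placeholder model path from above:

```python
from llama_cpp import Llama

# "models/7B/..." is a placeholder; point model_path at your local model file.
llm = Llama(model_path="models/7B/...")

output = llm(
    "Q: Name the planets in the solar system? A: ",
    max_tokens=32,
    stop=["Q:", "\n"],
)

# Without echo=True the result contains only the generated continuation.
print(output["choices"][0]["text"])
```
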
## API Reference
::: llama_cpp.Llama
    options:
        members:
            - __init__
            - __call__
        show_root_heading: true

::: llama_cpp.llama_cpp
    options:
        show_if_no_docstring: true
## License
This project is licensed under the terms of the MIT license.