diff --git a/docs/index.md b/docs/index.md
index 1b85661..3d0454f 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -1,5 +1,10 @@
 # 🦙 Python Bindings for `llama.cpp`
 
+[![PyPI](https://img.shields.io/pypi/v/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
+[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
+[![PyPI - License](https://img.shields.io/pypi/l/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
+[![PyPI - Downloads](https://img.shields.io/pypi/dm/llama-cpp-python)](https://pypi.org/project/llama-cpp-python/)
+
 Simple Python bindings for **@ggerganov's** [`llama.cpp`](https://github.com/ggerganov/llama.cpp) library.
 This package provides:
 
@@ -8,6 +13,42 @@ This package provides:
 - OpenAI-like API
 - LangChain compatibility
 
+## Installation
+
+Install from PyPI:
+
+```bash
+pip install llama-cpp-python
+```
+
+## Usage
+
+```python
+>>> from llama_cpp import Llama
+>>> llm = Llama(model_path="models/7B/...")
+>>> output = llm("Q: Name the planets in the solar system? A: ", max_tokens=32, stop=["Q:", "\n"], echo=True)
+>>> print(output)
+{
+  "id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
+  "object": "text_completion",
+  "created": 1679561337,
+  "model": "models/7B/...",
+  "choices": [
+    {
+      "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
+      "index": 0,
+      "logprobs": None,
+      "finish_reason": "stop"
+    }
+  ],
+  "usage": {
+    "prompt_tokens": 14,
+    "completion_tokens": 28,
+    "total_tokens": 42
+  }
+}
+```
+
 ## API Reference
 
@@ -20,4 +61,8 @@ This package provides:
 ::: llama_cpp.llama_cpp
     options:
-        show_if_no_docstring: true
\ No newline at end of file
+        show_if_no_docstring: true
+
+## License
+
+This project is licensed under the terms of the MIT license.
\ No newline at end of file
diff --git a/mkdocs.yml b/mkdocs.yml
index 6ed8246..fab10f5 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -8,4 +8,13 @@ plugins:
   - mkdocstrings
 
 watch:
-  - llama_cpp
\ No newline at end of file
+  - llama_cpp
+
+markdown_extensions:
+  - pymdownx.highlight:
+      anchor_linenums: true
+      line_spans: __span
+      pygments_lang_class: true
+  - pymdownx.inlinehilite
+  - pymdownx.snippets
+  - pymdownx.superfences
\ No newline at end of file
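The completion object added in the usage section above follows the OpenAI text-completion response shape, so downstream code can read the generated text the same way it would with the OpenAI client. A minimal sketch of consuming such a response; the dict below is the sample from the docs (with a placeholder model path), not real model output:

```python
# A response in the shape returned by llm(...) in the usage example above --
# an OpenAI-style "text_completion" object. Values mirror the docs sample.
response = {
    "id": "cmpl-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    "object": "text_completion",
    "created": 1679561337,
    "model": "models/7B/...",
    "choices": [
        {
            "text": "Q: Name the planets in the solar system? A: Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune and Pluto.",
            "index": 0,
            "logprobs": None,
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 14, "completion_tokens": 28, "total_tokens": 42},
}

# The generated text lives under choices[0]["text"]; finish_reason says whether
# generation hit a stop sequence ("stop") or the max_tokens limit ("length").
text = response["choices"][0]["text"]

# Because the example passes echo=True, the prompt is included in the text;
# splitting on the "A: " marker recovers just the completion.
completion_only = text.split("A: ", 1)[1]
print(completion_only)
```

Note that `echo=True` is what makes the prompt-stripping step necessary; with the default `echo=False`, `choices[0]["text"]` contains only the completion.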