# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/), and this project adheres to [Semantic Versioning](https://semver.org/).

## [Unreleased]

### Added

- Added the first version of the changelog.
- Use numpy for internal buffers to reduce memory usage and improve performance.

### Fixed

- Performance bug in the stop sequence check that slowed down streaming.