Changelog

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

[Unreleased]

[v0.1.58]

  • Added: Metal support for Apple Silicon

[v0.1.57]

  • Added: OpenLLaMA 3B support

[v0.1.56]

Added

  • First version of the changelog
  • Server: Use async routes
  • Use numpy for internal buffers to reduce memory usage and improve performance

Fixed

  • Performance bug in the stop sequence check that slowed down streaming