ollama/llm/patches
| Patch file | Last commit message | Last commit date |
| --- | --- | --- |
| 01-load-progress.diff | Wire up load progress | 2024-05-23 13:36:48 -07:00 |
| 02-clip-log.diff | Fix clip log import | 2024-04-26 09:43:46 -07:00 |
| 03-load_exception.diff | bump (#4597) | 2024-05-23 14:16:26 -07:00 |
| 04-metal.diff | use matrix multiplication kernels in more cases | 2024-04-25 13:58:54 -07:00 |
| 05-default-pretokenizer.diff | Update llama.cpp submodule to 5921b8f0 (#4731) | 2024-05-30 16:20:22 -07:00 |
| 06-qwen2.diff | llm: patch to fix qwen 2 temporarily on nvidia (#4897) | 2024-06-06 23:14:33 -07:00 |