ollama/.gitignore

.DS_Store
.vscode
.env
.venv
.swp
dist
ollama
ggml-metal.metal
.cache
*.exe
.idea
test_data