ollama/docs

Latest commit: 0a74cb31d5 "Safeguard for noexec" by Daniel Hiltgen, 2024-04-01 16:48:33 -07:00
We may have users that run into problems with our current payload model, so this gives us an escape valve.
| Name | Last commit message | Last commit date |
|---|---|---|
| tutorials | Update langchain python tutorial (#2737) | 2024-02-25 00:31:36 -05:00 |
| api.md | Update api.md | 2024-03-07 23:27:51 -08:00 |
| development.md | remove need for $VSINSTALLDIR since build will fail if ninja cannot be found (#3350) | 2024-03-26 16:23:16 -04:00 |
| faq.md | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| gpu.md | Add docs for GPU selection and nvidia uvm workaround | 2024-03-21 11:52:54 +01:00 |
| import.md | Update import.md | 2024-02-22 02:08:03 -05:00 |
| linux.md | Finish unwinding idempotent payload logic | 2024-03-09 08:34:39 -08:00 |
| modelfile.md | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| openai.md | change github.com/jmorganca/ollama to github.com/ollama/ollama (#3347) | 2024-03-26 13:04:17 -07:00 |
| README.md | Update README.md | 2024-03-13 21:12:17 -07:00 |
| troubleshooting.md | Safeguard for noexec | 2024-04-01 16:48:33 -07:00 |
| tutorials.md | Created tutorial for running Ollama on NVIDIA Jetson devices (#1098) | 2023-11-15 12:32:37 -05:00 |
| windows.md | Revamp ROCm support | 2024-03-07 10:36:50 -08:00 |