Jeffrey Morgan | d835368eb8 | 2024-07-22 16:16:22 -04:00
    convert: capture head_dim for mistral (#5818)

Michael Yang | 34d5ef29b3 | 2024-05-21 11:28:22 -07:00
    fix conversion for f16 or f32 inputs

Michael Yang | bbbd9f20f3 | 2024-05-20 16:13:57 -07:00
    cleanup

Michael Yang | 9685c34509 | 2024-05-06 15:24:01 -07:00
    quantize any fp16/fp32 model
    - FROM /path/to/{safetensors,pytorch}
    - FROM /path/to/fp{16,32}.bin
    - FROM model:fp{16,32}

Patrick Devine | 9f8691c6c8 | 2024-04-15 11:26:42 -07:00
    Add llama2 / torch models for ollama create (#3607)

Michael Yang | be517e491c | 2024-04-05 18:05:27 -07:00
    no rope parameters

Patrick Devine | 3b6a9154dd | 2024-04-01 16:14:53 -07:00
    Simplify model conversion (#3422)