* include seed in params for llama.cpp server and remove empty filter for temp
* relay default predict options to llama.cpp
  - reorganize options to match predict request for readability
* omit empty stop

---------

Co-authored-by: hallh <hallh@users.noreply.github.com>
| Name |
|---|
| llama.cpp |
| falcon.go |
| ggml.go |
| gguf.go |
| llama.go |
| llm.go |
| utils.go |