ollama/llm
Daniel Hiltgen d9cd3d9667 Revive windows build
The Windows native setup still needs some more work, but this gets it building
again, and if you set the PATH properly, you can run the resulting exe on a CUDA system.
2023-12-20 17:21:54 -08:00
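The commit note says the rebuilt exe runs on a CUDA system once PATH is set correctly. As a minimal, hypothetical sketch (not taken from this repo), a small Go launcher could prepend an assumed CUDA bin directory to PATH before starting an assumed `ollama.exe` build output; the directory and exe locations below are placeholders:

```go
// Hypothetical illustration only: prepend a CUDA bin directory to PATH so the
// built exe can resolve its CUDA DLLs at load time, then launch it.
// cudaBin and exePath are assumptions, not paths taken from this repository.
package main

import (
	"log"
	"os"
	"os/exec"
)

func main() {
	cudaBin := `C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.3\bin` // assumed CUDA install
	exePath := `.\ollama.exe`                                                 // assumed build output

	// Prepend the CUDA directory so DLL lookup finds the runtime libraries.
	if err := os.Setenv("PATH", cudaBin+string(os.PathListSeparator)+os.Getenv("PATH")); err != nil {
		log.Fatal(err)
	}

	cmd := exec.Command(exePath, "serve")
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	if err := cmd.Run(); err != nil {
		log.Fatal(err)
	}
}
```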
llama.cpp Revive windows build 2023-12-20 17:21:54 -08:00
dynamic_shim.c Revamp the dynamic library shim 2023-12-20 14:45:57 -08:00
dynamic_shim.h Revamp the dynamic library shim 2023-12-20 14:45:57 -08:00
ext_server.go Revive windows build 2023-12-20 17:21:54 -08:00
ggml.go deprecate ggml 2023-12-19 09:05:46 -08:00
gguf.go remove per-model types 2023-12-11 09:40:21 -08:00
llama.go Revamp the dynamic library shim 2023-12-20 14:45:57 -08:00
llm.go Revamp the dynamic library shim 2023-12-20 14:45:57 -08:00
shim_darwin.go Revamp the dynamic library shim 2023-12-20 14:45:57 -08:00
shim_ext_server.go Revamp the dynamic library shim 2023-12-20 14:45:57 -08:00
utils.go partial decode ggml bin for more info 2023-08-10 09:23:10 -07:00