Releases: Adriankhl/godot-llm
v1.0
I think this plugin now has the basic features needed in an LLM framework, so it is time to bump it to v1.0
- Breaking change: replace Should Output Bos and Should Output Eos by Should Output Special
- Add an LLM_LEGAL.md file to document legal issues around using LLM in games
- macOS build: support is on a best-effort basis since I don't have a Mac myself
- Add a store_text_finished signal for the LlmDB node
- Add a create function and a LlmDBMetaDataType enum for the LlmDB node (see the sketch after this list)
- Add Main GPU and Split Mode to control which GPU (if you have multiple) should be used for computation; this is also needed as a workaround if you have multiple drivers installed for a single GPU (my fix is still under review on the upstream llama.cpp repo)
- Bug fixes
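To make the LlmDB additions concrete, here is a minimal GDScript sketch. Only store_text_finished, create, and LlmDBMetaDataType are taken from the notes above; the argument shapes, the metadata keys, and the enum member in the commented-out line are assumptions made for illustration, so check the plugin documentation for the exact spellings.

```gdscript
# Illustrative sketch of the v1.0 LlmDB additions; assumes a child node named LlmDB.
extends Node

@onready var db: LlmDB = $LlmDB

func _ready() -> void:
	# React once LlmDB has finished storing text; the handler is assumed to take
	# no arguments -- adjust it if the signal passes any.
	db.store_text_finished.connect(_on_store_text_finished)

	# The new create function and LlmDBMetaDataType enum describe metadata columns;
	# this call shape and the TEXT member are guesses based purely on the names above.
	#db.create([{"name": "source", "type": LlmDB.LlmDBMetaDataType.TEXT}])

func _on_store_text_finished() -> void:
	print("LlmDB finished storing text")
```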
v0.4
v0.3
- Breaking change: rename generate_text(prompt: String) -> String to generate_text_simple(prompt: String) -> String; you may also pass 2 empty string arguments (generate_text(prompt, "", "")) to achieve a similar result
- Unfortunately, embedding does not work with the Vulkan backend at the moment, so this release switches to the CPU build until the upstream fix lands
- Add GDEmbedding node for sentence embedding computation
- Add GDLlava node for multimodal models
- Introduce multiple run_* methods for GDLlama, GDEmbedding, and GDLlava to run computation in the background (see the sketch below)
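To illustrate the rename, here is a minimal GDScript sketch. Only generate_text_simple and generate_text(prompt, "", "") are named above; the background method and signal in the commented-out lines are assumptions about how the run_* family is spelled, so verify them against the plugin documentation.

```gdscript
# Sketch of the renamed v0.3 text-generation calls; assumes a child node named GDLlama.
extends Node

@onready var llama: GDLlama = $GDLlama

func _ready() -> void:
	# Renamed blocking call (was generate_text(prompt) before v0.3).
	var reply: String = llama.generate_text_simple("Name three planets.")
	print(reply)

	# Similar result through the full signature, passing two empty strings.
	var reply2: String = llama.generate_text("Name three planets.", "", "")
	print(reply2)

	# A run_* variant would start generation in the background instead of blocking;
	# the method and signal names below are assumptions, not confirmed API.
	#llama.generate_text_finished.connect(func(text: String) -> void: print(text))
	#llama.run_generate_text("Name three planets.", "", "")
```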
v0.2.2, v0.2.3
- Disable LLAMA_NATIVE to ensure that the binary is compatible with most PCs.
v0.2, v0.2.1
- Add instruct and interactive mode, and implement the relevant API
- Add JSON schema API (see the sketch after this list)
- Statically link the MSVC runtime, which hopefully fixes #2
- Expose more llama.cpp parameters
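As an example of what the JSON schema API enables, the sketch below constrains generation to a small schema. The notes above only say a JSON schema API was added; the generate_text_json method name is an assumption made for illustration, so check the plugin's README for the actual entry point.

```gdscript
# Hypothetical JSON-schema-constrained generation; the method name is an assumption.
extends Node

@onready var llama: GDLlama = $GDLlama

func _ready() -> void:
	# Schema describing the structured output we want from the model.
	var schema: String = JSON.stringify({
		"type": "object",
		"properties": {
			"name": {"type": "string"},
			"hp": {"type": "integer"}
		},
		"required": ["name", "hp"]
	})
	# Assumed entry point for schema-constrained generation.
	var result: String = llama.generate_text_json("Invent an RPG monster.", schema)
	print(JSON.parse_string(result))
```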
v0.2.1
- Fix Linux build
v0.1
- First release