Name and Version
chad@chaddev:~/code/github.com/chadvoegele/llama.cpp>> ./build/bin/llama-server --version
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 5090, compute capability 12.0, VMM: yes
version: 7188 (7c6980ae)
built with cc (GCC) 15.2.1 20251112 for x86_64-pc-linux-gnu
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-server
Command line
@chaddev:~/code/github.com/chadvoegele/llama.cpp>> ./build/bin/llama-server -v -hf ggml-org/gpt-oss-120b-GGUF --host 0.0.0.0 --port 8080 --ctx-size 0 --jinja -ub 2048 -b 2048 -ncmoe 22 --jinja --api-key-file <(pass llama-server-api-key)
Problem description & steps to reproduce
I'm running LibreChat + llama-server with gpt-oss-120b for a fully local agent. Everything works great, but when I add the Filesystem MCP tool in LibreChat, requests start failing.
I looked into the logs and found that it's because LibreChat + Filesystem MCP sends a `not: {}` in the schema, which is unsupported by `llama-server`. Further, since `llama-server` returns a 500, LibreChat just retries and then fails with an unknown error.
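For context, in JSON Schema the empty schema `{}` accepts every instance, so `{"not": {}}` accepts nothing at all. A minimal sketch (a simplified toy validator, not the real `llama-server` conversion code) shows why the `anyOf` branch containing it is dead weight:

```python
# Toy subset of JSON Schema semantics: empty schema, "not", and "type".
# Illustrative only -- not how llama-server or LibreChat validate schemas.
def matches(schema, value):
    if schema == {}:
        return True  # the empty schema accepts everything
    if "not" in schema:
        return not matches(schema["not"], value)  # "not" inverts the subschema
    if "type" in schema:
        py_types = {"number": (int, float), "string": str, "null": type(None)}
        return isinstance(value, py_types[schema["type"]])
    return True

# {"not": {}} inverts "accept everything", so it accepts nothing:
assert not matches({"not": {}}, 42)
assert not matches({"not": {}}, "anything")

# So the Filesystem MCP branch {"anyOf": [{"not": {}}, {"type": "number"}]}
# behaves identically to just {"type": "number"}:
assert matches({"type": "number"}, 5) == any(
    matches(s, 5) for s in [{"not": {}}, {"type": "number"}]
)
```

Since the `{"not": {}}` branch can never match, dropping it does not change which values the schema accepts.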
Since `not` in the schema is unlikely to be implemented, I'm proposing to fix this by:
- Changing the invalid-schema response to a 400 in `llama-server` in PR 17572
- Removing the superfluous `not: {}` from LibreChat + Filesystem MCP
Although not a fix, this hack in `common/json-schema-to-grammar.cpp` makes everything hum:

```diff
+ } else if (schema.dump() == "{\"not\":{}}") {
+     return "";
```
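Until the LibreChat / Filesystem MCP side is fixed, another workaround is to prune the dead branches from the tool schema before it reaches `llama-server`. A hedged sketch (hypothetical helper, not LibreChat's actual code) of that pruning:

```python
# Hypothetical client-side workaround: recursively drop {"not": {}} branches
# from "anyOf" lists in a tool's JSON schema before sending it to llama-server.
def strip_not_empty(schema):
    if isinstance(schema, dict):
        pruned = {k: strip_not_empty(v) for k, v in schema.items()}
        if "anyOf" in pruned:
            branches = [b for b in pruned["anyOf"] if b != {"not": {}}]
            if branches:  # keep the key only if something survives
                pruned["anyOf"] = branches
        return pruned
    if isinstance(schema, list):
        return [strip_not_empty(v) for v in schema]
    return schema

# The "tail" parameter from the failing request, reduced to its live branch:
tail = {"anyOf": [{"not": {}}, {"type": "number"}], "description": "last N lines"}
print(strip_not_empty(tail))
# {'anyOf': [{'type': 'number'}], 'description': 'last N lines'}
```

Because `{"not": {}}` matches nothing, removing it from an `anyOf` list cannot change the set of accepted values, so this is a safe transformation.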
First Bad Commit
No response
Relevant log output
srv log_server_r: request: {"model":"gpt-oss","user":"689567a9ec7a4187cb32d064","stream":true,"tools":[{"type":"function","function":{"name":"read_file_mcp_filesystem","description":"Read the complete contents of a file as text. DEPRECATED: Use read_text_file instead.","parameters":{"type":"object","properties":{"path":{"type":"string"},"tail":{"anyOf":[{"anyOf":[{"not":{}},{"type":"number","description":"If provided, returns only the last N lines of the file"}],"description":"If provided, returns only the last N lines of the file"},{"type":"null"}],"description":"If provided, returns only the last N lines of the file"},"head":{"anyOf":[{"anyOf":[{"not":{}},{"type":"number","description":"If provided, returns only the first N lines of the file"}],"d ... ],"messages":[{"role":"system","content":"[object Promise]"},{"role":"user","content":"hi show me the files"}]}
srv log_server_r: response: {"error":{"code":500,"message":"JSON schema conversion failed:\nUnrecognized schema: {\"not\":{}}