Commit fd757c3

Fix ResponseFormat="json" for Ollama

1 parent bd61b0f commit fd757c3

2 files changed (+7, -1 lines)

Diff for: +llms/+internal/callOllamaChatAPI.m (+1, -1)

@@ -100,7 +100,7 @@
     options = struct;

     if strcmp(nvp.ResponseFormat,"json")
-        parameters.format = struct('type','json_object');
+        parameters.format = "json";
     elseif isstruct(nvp.ResponseFormat)
         parameters.format = llms.internal.jsonSchemaFromPrototype(nvp.ResponseFormat);
     elseif startsWith(string(nvp.ResponseFormat), asManyOfPattern(whitespacePattern)+"{")
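The one-line change swaps an OpenAI-style `response_format` object for the plain string `"json"`, which is the shape Ollama's `/api/chat` endpoint expects in its top-level `format` field (a JSON schema object is also accepted there for structured output, which the `isstruct` branch handles). A minimal sketch of the two payload shapes, in Python rather than MATLAB and with hypothetical helper names (not from this repo):

```python
import json

def openai_style_payload(messages):
    """Old behavior: an OpenAI-style response_format object."""
    return {"messages": messages,
            "response_format": {"type": "json_object"}}

def ollama_payload(messages):
    """Fixed behavior: Ollama's /api/chat takes format as the string "json"."""
    return {"messages": messages, "format": "json"}

msgs = [{"role": "user", "content": "create some address, return json"}]
old = openai_style_payload(msgs)
new = ollama_payload(msgs)

# The fix: "format" is a bare string, not a nested object.
assert new["format"] == "json"
assert isinstance(old["response_format"], dict)
print(json.dumps(new))
```

This is only an illustration of the request shapes, assuming the Ollama REST API's documented `format` parameter; the actual request construction lives in `callOllamaChatAPI.m`.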

Diff for: tests/tollamaChat.m (+6, -0)

@@ -61,6 +61,12 @@ function generateOverridesProperties(testCase)
         testCase.verifyThat(text, EndsWithSubstring("3, "));
     end

+    function generateJSON(testCase)
+        testCase.verifyClass( ...
+            generate(testCase.defaultModel,"create some address, return json",ResponseFormat="json"), ...
+            "string");
+    end
+
     function generateWithToolsAndStreamFunc(testCase)
         % The test point in htoolCalls expects a format that is
         % different from what we get from Ollama. Having that

0 commit comments
