
Commit a9a90a1

Merge pull request #163 from janhq/162-feat-enable-caching-prompt-for-the-server

enable cache prompt

2 parents: 7c0be9f + 2602b4f

File tree

1 file changed: +1 −1


controllers/llamaCPP.cc (+1 −1)

@@ -164,7 +164,7 @@ void llamaCPP::chatCompletion(
   data["frequency_penalty"] =
       (*jsonBody).get("frequency_penalty", 0).asFloat();
   data["presence_penalty"] = (*jsonBody).get("presence_penalty", 0).asFloat();
-
+  data["cache_prompt"] = true;
   const Json::Value &messages = (*jsonBody)["messages"];
   for (const auto &message : messages) {
     std::string input_role = message["role"].asString();
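The change unconditionally sets `cache_prompt` in the request payload that the controller builds from the client's JSON body, next to the penalty fields read with defaults; in llama.cpp's server this option lets a request reuse the already-evaluated prompt prefix from a previous request instead of re-processing it. A minimal sketch of that build-with-defaults pattern, using a plain `std::map` as a hypothetical stand-in for the JsonCpp `Json::Value` the actual controller uses:

```cpp
#include <map>
#include <string>

// Hypothetical stand-in for the controller's request-building step:
// client-supplied sampling options fall back to defaults when absent,
// and cache_prompt is forced on (the effect of this commit).
std::map<std::string, double> buildRequest(
    const std::map<std::string, double>& jsonBody) {
  // Mimics Json::Value::get(key, fallback): use the client's value
  // if present, otherwise the default.
  auto get = [&](const std::string& key, double fallback) {
    auto it = jsonBody.find(key);
    return it != jsonBody.end() ? it->second : fallback;
  };

  std::map<std::string, double> data;
  data["frequency_penalty"] = get("frequency_penalty", 0);
  data["presence_penalty"]  = get("presence_penalty", 0);
  data["cache_prompt"]      = 1;  // always enabled by this commit (1 == true)
  return data;
}
```

For example, a client body that only supplies `presence_penalty` still yields a payload with `frequency_penalty` defaulted to 0 and `cache_prompt` switched on; clients cannot opt out after this change.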

0 commit comments