Commit 15dea7b

opt : remove print [no ci]
1 parent cee751c commit 15dea7b

1 file changed (+0, -2 lines)


src/llama-context.cpp (-2 lines)
@@ -1957,8 +1957,6 @@ void llama_context::opt_epoch_iter(
 
         n_outputs = ubatch.n_tokens;
 
-        printf("ubatch.n_tokens = %d\n", ubatch.n_tokens);
-
         // TODO: not sure if this is needed
         if (!kv_self->find_slot(ubatch)) {
             LLAMA_LOG_WARN("%s: failed to find KV cache slot for ubatch of size %d\n", __func__, ubatch.n_tokens);
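
The removed statement was an unconditional printf inside llama_context::opt_epoch_iter, so it printed the ubatch size on every micro-batch processed during a training epoch. Below is a minimal, self-contained sketch (not llama.cpp code) of the usual alternative when such output is still occasionally wanted: gate the diagnostic behind a verbosity check, in the spirit of the LLAMA_LOG_* macros visible in the diff. SKETCH_LOG_DEBUG and g_verbosity are hypothetical names introduced only for this illustration.

// sketch.cpp - hedged illustration, not part of llama.cpp
#include <cstdio>

// Hypothetical runtime verbosity knob; 0 means quiet.
static int g_verbosity = 0;

// Debug print that reduces to a cheap runtime check instead of always printing.
#define SKETCH_LOG_DEBUG(...) \
    do { if (g_verbosity > 0) { fprintf(stderr, __VA_ARGS__); } } while (0)

int main() {
    const int n_tokens = 512; // stand-in for ubatch.n_tokens

    // The removed line printed unconditionally on every iteration:
    // printf("ubatch.n_tokens = %d\n", n_tokens);

    // Gated variant: silent unless verbosity is raised, e.g. via a CLI flag.
    SKETCH_LOG_DEBUG("ubatch.n_tokens = %d\n", n_tokens);

    return 0;
}

Dropping the print outright, as this commit does, is the simpler choice once the value is no longer needed for debugging; the [no ci] tag in the message signals that continuous-integration runs can be skipped for such a trivial change.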
