Chat with batch #11374
Unanswered
MartinPerry asked this question in Q&A
Can I use llama_chat_apply_template in batched processing? That is, can I pass multiple prompts (each of them different) and receive multiple results, one for each prompt? Currently, based on the batch example, I can get multiple responses for a single prompt (however, the order of words comes out slightly scrambled compared to running only one batch).

My current main code to process data from llama_chat_apply_template looks like this:
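For reference, below is a minimal sketch of the approach being asked about: run llama_chat_apply_template and llama_tokenize once per prompt, put all prompts into one llama_batch under distinct sequence ids, then decode and sample each sequence from its own logits row. This is a sketch rather than the code from the post: it assumes the 2024-era llama.cpp C API (llama_chat_apply_template still taking the model pointer, and the llama_sampler_chain sampling API), the model path and prompt strings are placeholders, and error handling is trimmed for brevity.

```cpp
#include "llama.h"

#include <cstdio>
#include <string>
#include <vector>

// Append one token to a batch (same thing common_batch_add in common.h does).
static void batch_add(llama_batch & batch, llama_token tok, llama_pos pos,
                      llama_seq_id seq, bool want_logits) {
    batch.token   [batch.n_tokens]    = tok;
    batch.pos     [batch.n_tokens]    = pos;
    batch.n_seq_id[batch.n_tokens]    = 1;
    batch.seq_id  [batch.n_tokens][0] = seq;
    batch.logits  [batch.n_tokens]    = want_logits;
    batch.n_tokens++;
}

int main() {
    // Hypothetical inputs: each prompt is a different, independent request.
    const std::vector<std::string> prompts = {
        "Explain batching in one sentence.",
        "Name three colors.",
    };
    const int n_seq = (int) prompts.size();

    llama_backend_init();
    llama_model * model = llama_load_model_from_file("model.gguf", llama_model_default_params());
    llama_context_params cparams = llama_context_default_params();
    cparams.n_ctx   = 2048;
    cparams.n_batch = 2048;
    llama_context * ctx = llama_new_context_with_model(model, cparams);

    // 1. Apply the chat template and tokenize each prompt independently.
    std::vector<std::vector<llama_token>> toks(n_seq);
    for (int s = 0; s < n_seq; s++) {
        llama_chat_message msg = { "user", prompts[s].c_str() };
        std::vector<char> text(prompts[s].size() + 1024); // sized generously; no retry logic
        // tmpl == nullptr -> use the chat template embedded in the model
        const int32_t len = llama_chat_apply_template(model, nullptr, &msg, 1,
                                                      /*add_ass=*/true, text.data(), (int32_t) text.size());
        toks[s].resize(len + 16);
        const int32_t n_tok = llama_tokenize(model, text.data(), len, toks[s].data(),
                                             (int32_t) toks[s].size(), /*add_special=*/true, /*parse_special=*/true);
        toks[s].resize(n_tok);
    }

    // 2. One batch, one sequence id per prompt; request logits only for the
    //    last token of each prompt.
    llama_batch batch = llama_batch_init(2048, 0, n_seq);
    std::vector<int32_t> logits_row(n_seq); // batch row holding each sequence's logits
    for (int s = 0; s < n_seq; s++) {
        for (size_t p = 0; p < toks[s].size(); p++) {
            batch_add(batch, toks[s][p], (llama_pos) p, s, p + 1 == toks[s].size());
        }
        logits_row[s] = batch.n_tokens - 1;
    }
    if (llama_decode(ctx, batch) != 0) { fprintf(stderr, "decode failed\n"); return 1; }

    // 3. Generate one token per still-active sequence per decode call.
    llama_sampler * smpl = llama_sampler_chain_init(llama_sampler_chain_default_params());
    llama_sampler_chain_add(smpl, llama_sampler_init_greedy());

    std::vector<std::string> out(n_seq);
    std::vector<llama_pos>   pos(n_seq);
    std::vector<bool>        done(n_seq, false);
    for (int s = 0; s < n_seq; s++) pos[s] = (llama_pos) toks[s].size();

    for (int step = 0; step < 64; step++) {
        batch.n_tokens = 0;       // reuse the batch for the generation step
        std::vector<int> row_seq; // which sequence each new batch row belongs to
        for (int s = 0; s < n_seq; s++) {
            if (done[s]) continue;
            const llama_token tok = llama_sampler_sample(smpl, ctx, logits_row[s]);
            if (llama_token_is_eog(model, tok)) { done[s] = true; continue; }
            char piece[64];
            const int n = llama_token_to_piece(model, tok, piece, sizeof(piece), 0, true);
            out[s].append(piece, n);
            batch_add(batch, tok, pos[s]++, s, true);
            row_seq.push_back(s);
        }
        if (batch.n_tokens == 0) break; // every sequence hit end-of-generation
        if (llama_decode(ctx, batch) != 0) { fprintf(stderr, "decode failed\n"); return 1; }
        for (int i = 0; i < batch.n_tokens; i++) logits_row[row_seq[i]] = i;
    }

    for (int s = 0; s < n_seq; s++) printf("[%d] %s\n", s, out[s].c_str());

    llama_batch_free(batch);
    llama_sampler_free(smpl);
    llama_free(ctx);
    llama_free_model(model);
    llama_backend_free();
    return 0;
}
```

The part that keeps the responses from interleaving (the "messed up order of words" mentioned above) is that every prompt keeps its own sequence id and position counter, and each next token is sampled only from that sequence's own logits row.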