Commit b09005c

Enhance Chain of Thought processing by ensuring responses address the original prompt and include prompt context
1 parent 4b6b431 commit b09005c

File tree

1 file changed (+3, −1 lines)


chatbot/server.py

Lines changed: 3 additions & 1 deletion
@@ -225,6 +225,7 @@ def debug(text):
     Include relevant scientific and factual details to support the answer.
     If there is an equation, make sure you define the variables and units. Do not include an equation section if not needed.
     If source code provided, include the code block and describe what it does. Do not include a code section otherwise.
+    Make sure the answer addresses the original prompt: {prompt}
     """
     # Log ONE_SHOT mode
     if ONESHOT:
@@ -893,7 +894,8 @@ async def send_update(session_id):
     answer = await ask_context(temp_context)
     await sio.emit('update', {'update': '\n\n', 'voice': 'ai'},room=session_id)
     # Load request for CoT conclusion into conversational thread
-    cot_prompt = expand_prompt(prompts["chain_of_thought_summary"], {"context_str": answer})
+    cot_prompt = expand_prompt(prompts["chain_of_thought_summary"], {"context_str": answer,
+                                                                     "prompt": client[session_id]["cot_prompt"]})
     client[session_id]["prompt"] = cot_prompt
     try:
         # Ask LLM for answers
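The effect of the change can be sketched as follows. This is a minimal, self-contained sketch assuming `expand_prompt` performs a `str.format`-style substitution over named placeholders (the real helper in `chatbot/server.py` may differ), and the template text below is a hypothetical stand-in for the project's actual `chain_of_thought_summary` prompt. Before the commit, only the reasoning text (`context_str`) was substituted; the commit also passes the user's original prompt so the summary step can be checked against it:

```python
# Assumed behavior of expand_prompt: substitute {name} placeholders
# in a prompt template from a dict of values.
def expand_prompt(template: str, values: dict) -> str:
    return template.format(**values)

# Hypothetical stand-in for the project's chain_of_thought_summary template.
prompts = {
    "chain_of_thought_summary": (
        "Summarize the reasoning below into a final answer.\n"
        "Reasoning: {context_str}\n"
        "Make sure the answer addresses the original prompt: {prompt}"
    )
}

# After the commit, both the CoT reasoning and the original prompt
# are substituted into the summary request.
cot_prompt = expand_prompt(
    prompts["chain_of_thought_summary"],
    {
        "context_str": "step-by-step analysis of Rayleigh scattering...",
        "prompt": "Why is the sky blue?",
    },
)
print(cot_prompt)
```

With only `context_str` supplied (the pre-commit call), the `{prompt}` placeholder would raise a `KeyError` under this sketch's `str.format` assumption, which is why the commit adds `"prompt": client[session_id]["cot_prompt"]` to the substitution dict alongside the new template line.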
