
Commit

FIX: LLM sometimes returns a string but the code handled it as a list
nishio authored May 30, 2024
1 parent 661a209 commit d842bef
Showing 1 changed file with 7 additions and 2 deletions.
9 changes: 7 additions & 2 deletions scatter/pipeline/steps/extraction.py
```diff
@@ -47,8 +47,13 @@ def extract_arguments(input, prompt, model, retries=3):
     llm = ChatOpenAI(model_name=model, temperature=0.0)
     response = llm(messages=messages(prompt, input)).content.strip()
     try:
-        parsed = [a.strip() for a in json.loads(response)]
-        return parsed
+        obj = json.loads(response)
+        # LLM sometimes returns a single valid JSON string instead of a list
+        if isinstance(obj, str):
+            obj = [obj]
+        items = [a.strip() for a in obj]
+        items = list(filter(None, items))  # omit empty strings
+        return items
     except json.decoder.JSONDecodeError as e:
         print("JSON error:", e)
         print("Input was:", input)
```
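The fix boils down to normalizing the parsed JSON before treating it as a list: a model can legitimately emit `"hello"` (a bare JSON string) where `["hello"]` was expected. A minimal standalone sketch of that normalization, using a hypothetical helper name (`normalize_arguments` is not part of the repository):

```python
import json


def normalize_arguments(response: str) -> list[str]:
    """Parse an LLM JSON response that may be either a list of
    strings or a single bare JSON string, returning a clean list.

    Hypothetical helper illustrating the commit's fix; not the
    actual function from scatter/pipeline/steps/extraction.py.
    """
    obj = json.loads(response)
    # The model occasionally returns one quoted string instead of a list.
    if isinstance(obj, str):
        obj = [obj]
    items = [a.strip() for a in obj]
    # Omit empty strings, mirroring the filter(None, ...) in the diff.
    return [a for a in items if a]
```

With this shape, both response styles collapse to the same return type: `normalize_arguments('"hello"')` yields `["hello"]`, and `normalize_arguments('["a", "  ", "b "]')` yields `["a", "b"]`. Note the diff wraps `filter(...)` in `list(...)`; returning the lazy filter object directly would break callers that expect `len()` or indexing on the result.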
