This repository has been archived by the owner on Jun 23, 2024. It is now read-only.

Commit

changed models as davinci was deprecated. Flask works now but main.py run does not
JustinGOSSES committed Jan 28, 2024
1 parent 5190954 · commit 049b81f
Showing 3 changed files with 3 additions and 3 deletions.
src/agentE.py (1 addition, 1 deletion)
@@ -39,7 +39,7 @@
 # llm = OpenAI(model_name="text-davinci-003",temperature=0.2,max_tokens=4096) ### does not work as too short!
 #llm = OpenAI(model_name="gpt-3.5-turbo",temperature=0.2) ### can only do chat? not text?
 # llm = OpenAI(model_name="gpt-4",temperature=0.2, max_tokens=4096)
-llm = OpenAI(model_name="text-davinci-003",temperature=0.0)
+llm = OpenAI(model_name="gpt-3.5-turbo",temperature=0.0)

 llm_math_chain = LLMMathChain(llm=llm, verbose=True)

src/agent_website_explore.py (1 addition, 1 deletion)
@@ -39,7 +39,7 @@
 # llm = OpenAI(model_name="text-davinci-003",temperature=0.2,max_tokens=4096) ### does not work as too short!
 #llm = OpenAI(model_name="gpt-3.5-turbo",temperature=0.2) ### can only do chat? not text?
 # llm = OpenAI(model_name="gpt-4",temperature=0.2, max_tokens=4096)
-llm = OpenAI(model_name="text-davinci-003",temperature=0.0)
+llm = OpenAI(model_name="gpt-3.5-turbo",temperature=0.0)

 llm_math_chain = LLMMathChain(llm=llm, verbose=True)

src/main.py (1 addition, 1 deletion)
@@ -111,7 +111,7 @@


 # llm = OpenAI(model_name="text-davinci-003",temperature=0.2)
-llm = OpenAI(model_name="text-davinci-003",temperature=0.2, max_tokens=256)
+llm = OpenAI(model_name="gpt-4",temperature=0.2, max_tokens=256)



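Note on the swap: gpt-3.5-turbo and gpt-4 are chat-completion models, and the old comment in the diffs ("### can only do chat? not text?") plus the commit message ("main.py run does not" work) suggest the completion-style OpenAI wrapper may not drive them cleanly. Below is a minimal sketch, not part of this commit, of how the same setup might look using LangChain's ChatOpenAI wrapper instead; it assumes a 2023/2024-era LangChain release where ChatOpenAI and LLMMathChain.from_llm are available.

# Minimal sketch (not from this commit): drive the math chain with a chat model
# via ChatOpenAI rather than the completion-style OpenAI wrapper.
# Assumes a 2023/2024-era LangChain release providing these imports.
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMMathChain

# gpt-3.5-turbo (or gpt-4) is a chat model, so instantiate the chat wrapper.
llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.0)

# Same math chain as in the diffs above, built with the from_llm classmethod.
llm_math_chain = LLMMathChain.from_llm(llm, verbose=True)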
