Cannot use ChatOllama instead of ChatOpenAI #7


Open
educatedpolarbear opened this issue Jun 21, 2024 · 0 comments

Comments


Hi, my issue looks like this:

```
D:\work\AI service\venv\Lib\site-packages\langchain\chat_models\__init__.py:32: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatOpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
D:\work\AI service\venv\Lib\site-packages\langchain\chat_models\__init__.py:32: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatOllama`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(

Adding: 'The month is October.'
No chunks, creating a new one
Created new chunk (7cb66): Date & Times

Adding: 'The year is 2023.'
D:\work\AI service\venv\Lib\site-packages\langchain_core\_api\deprecation.py:139: LangChainDeprecationWarning: LangChain has introduced a method called `with_structured_output` that is available on ChatModels capable of tool calling. You can read more about the method here: https://python.langchain.com/docs/modules/model_io/chat/structured_output/ Please follow our extraction use case documentation for more guidelines on how to do information extraction with LLMs. https://python.langchain.com/docs/use_cases/extraction/. If you notice other issues, please provide feedback here: https://github.com/langchain-ai/langchain/discussions/18154
  warn_deprecated(
Based on the summary, I think this proposition should be joined with the chunk "Date & Times". Therefore, I return the chunk ID: 7cb66.
Traceback (most recent call last):
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 31, in parse_result
    func_call = copy.deepcopy(message.additional_kwargs["function_call"])
                              ~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
KeyError: 'function_call'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\work\AI service\test.py", line 29, in <module>
    ac.add_propositions(propositions)
  File "D:\work\AI service\utils\agentic_chunker.py", line 42, in add_propositions
    self.add_proposition(proposition)
  File "D:\work\AI service\utils\agentic_chunker.py", line 55, in add_proposition
    chunk_id = self._find_relevant_chunk(proposition)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\utils\agentic_chunker.py", line 308, in _find_relevant_chunk
    print(extraction_chain.invoke(chunk_found))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\base.py", line 166, in invoke
    raise e
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\llm.py", line 127, in _call
    return self.create_outputs(response)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\llm.py", line 281, in create_outputs
    result = [
             ^
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\llm.py", line 284, in <listcomp>
    self.output_key: self.output_parser.parse_result(generation),
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 219, in parse_result
    result = super().parse_result(result)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 202, in parse_result
    _result = super().parse_result(result)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 33, in parse_result
    raise OutputParserException(f"Could not parse function call: {exc}")
langchain_core.exceptions.OutputParserException: Could not parse function call: 'function_call'
```
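For context, the first traceback pins down the mismatch: the OpenAI-functions output parser reads `message.additional_kwargs["function_call"]`, which `ChatOpenAI` populates when functions are bound, but a model that replies in plain text (as llama3 via `ChatOllama` does here) leaves `additional_kwargs` empty. A minimal sketch of that failing access, using hypothetical message dicts rather than real LangChain objects:

```python
# Hypothetical stand-ins for the two kinds of chat-model replies.
openai_style = {
    "content": "",
    "additional_kwargs": {
        "function_call": {"name": "extract", "arguments": '{"chunk_id": "7cb66"}'}
    },
}
ollama_style = {
    "content": 'Therefore, I return the chunk ID: 7cb66.',
    "additional_kwargs": {},  # no function_call entry -> KeyError in the parser
}

def parse_function_call(message: dict) -> dict:
    # Mirrors the failing line in openai_functions.py:
    # copy.deepcopy(message.additional_kwargs["function_call"])
    return message["additional_kwargs"]["function_call"]

parse_function_call(openai_style)      # succeeds: structured call is present
try:
    parse_function_call(ollama_style)
except KeyError as exc:
    print(f"KeyError: {exc}")          # same error as in the traceback above
```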

I changed the `__init__` function to this:

```python
from langchain_community.chat_models import ChatOllama

class AgenticChunker:
    def __init__(self):
        self.chunks = {}
        self.id_truncate_limit = 5

        # Whether or not to update/refine summaries and titles as you get new information
        self.generate_new_metadata_ind = True
        self.print_logging = True

        self.llm = ChatOllama(model='llama3', base_url='http://localhost:9090')

    def add_propositions(self, propositions):
        for proposition in propositions:
            self.add_proposition(proposition)

    def add_proposition(self, proposition):
        if self.print_logging:
            print(f"\nAdding: '{proposition}'")
...
```

How can I fix this? Is there any difference between ChatOllama and ChatOpenAI that would cause it?
