
Cannot use ChatOllama instead of ChatOpenAI #7

Open
@educatedpolarbear

Description


Hi, my issue looks like this:

D:\work\AI service\venv\Lib\site-packages\langchain\chat_models\__init__.py:32: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatOpenAI`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(
D:\work\AI service\venv\Lib\site-packages\langchain\chat_models\__init__.py:32: LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatOllama`.

To install langchain-community run `pip install -U langchain-community`.
  warnings.warn(

Adding: 'The month is October.'
No chunks, creating a new one
Created new chunk (7cb66): Date & Times

Adding: 'The year is 2023.'
D:\work\AI service\venv\Lib\site-packages\langchain_core\_api\deprecation.py:139: LangChainDeprecationWarning: LangChain has introduced a method called `with_structured_output` that is available on ChatModels capable of tool calling. You can read more about the method here: https://python.langchain.com/docs/modules/model_io/chat/structured_output/. Please follow our extraction use case documentation for more guidelines on how to do information extraction with LLMs: https://python.langchain.com/docs/use_cases/extraction/. If you notice other issues, please provide feedback here: https://github.com/langchain-ai/langchain/discussions/18154
  warn_deprecated(
Based on the summary, I think this proposition should be joined with the chunk "Date & Times". Therefore, I return the chunk ID: 7cb66.
Traceback (most recent call last):
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 31, in parse_result
    func_call = copy.deepcopy(message.additional_kwargs["function_call"])
                              ~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
KeyError: 'function_call'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\work\AI service\test.py", line 29, in <module>
    ac.add_propositions(propositions)
  File "D:\work\AI service\utils\agentic_chunker.py", line 42, in add_propositions
    self.add_proposition(proposition)
  File "D:\work\AI service\utils\agentic_chunker.py", line 55, in add_proposition
    chunk_id = self._find_relevant_chunk(proposition)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\utils\agentic_chunker.py", line 308, in _find_relevant_chunk
    print(extraction_chain.invoke(chunk_found))
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\base.py", line 166, in invoke
    raise e
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\llm.py", line 127, in _call
    return self.create_outputs(response)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\llm.py", line 281, in create_outputs
    result = [
             ^
  File "D:\work\AI service\venv\Lib\site-packages\langchain\chains\llm.py", line 284, in <listcomp>
    self.output_key: self.output_parser.parse_result(generation),
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 219, in parse_result
    result = super().parse_result(result)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 202, in parse_result
    _result = super().parse_result(result)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\AI service\venv\Lib\site-packages\langchain_core\output_parsers\openai_functions.py", line 33, in parse_result
    raise OutputParserException(f"Could not parse function call: {exc}")
langchain_core.exceptions.OutputParserException: Could not parse function call: 'function_call'
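
If I read the traceback right, the parser dies on `message.additional_kwargs["function_call"]`, which only OpenAI-style function-calling responses populate. A minimal check against my local setup (llama3 on port 9090, same as in my `__init__` below) seems to confirm the key is never there:

from langchain_community.chat_models import ChatOllama

llm = ChatOllama(model='llama3', base_url='http://localhost:9090')
msg = llm.invoke("The year is 2023.")

# ChatOpenAI would put the function/tool call into additional_kwargs;
# ChatOllama leaves it empty, so the OpenAI-functions output parser
# raises KeyError: 'function_call'.
print(msg.additional_kwargs)  # -> {}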

I changed the `__init__` function to this:

from langchain_community.chat_models import ChatOllama

class AgenticChunker:
    def __init__(self):
        self.chunks = {}
        self.id_truncate_limit = 5

        # Whether or not to update/refine summaries and titles as you get new information
        self.generate_new_metadata_ind = True
        self.print_logging = True

        self.llm = ChatOllama(model='llama3', base_url='http://localhost:9090')

    def add_propositions(self, propositions):
        for proposition in propositions:
            self.add_proposition(proposition)

    def add_proposition(self, proposition):
        if self.print_logging:
            print(f"\nAdding: '{proposition}'")
...

How do I fix this? Is there any difference between ChatOllama and ChatOpenAI that could cause it?
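
In case it matters, the only workaround I could come up with is to bypass the OpenAI-functions chain entirely and ask Ollama for JSON (untested sketch; the prompt wording and the sample input are mine, not from the repo):

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import JsonOutputParser
from langchain_community.chat_models import ChatOllama

# format='json' puts Ollama into JSON mode; JsonOutputParser then stands in
# for the OpenAI-functions parser that raised the KeyError above.
llm = ChatOllama(model='llama3', base_url='http://localhost:9090', format='json')

prompt = ChatPromptTemplate.from_messages([
    ("system", 'Return only a JSON object of the form {{"chunk_id": "<id>"}}.'),
    ("user", "{input}"),
])
chain = prompt | llm | JsonOutputParser()

# chunk_found here stands in for the value _find_relevant_chunk passes to
# extraction_chain.invoke(...) in the traceback above.
chunk_found = "Date & Times"
print(chain.invoke({"input": chunk_found}))

Would that be the intended way, or is there something built in that I'm missing?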
