
agents: fix ChainCallOption silent failure#1420

Merged
tmc merged 2 commits into main from fix/chainoption-propagation
Oct 20, 2025

Conversation

@tmc
Owner

@tmc tmc commented Oct 15, 2025

Fixes #1416

Problem

ChainCallOption parameters were being silently ignored in agent.Plan() calls, preventing LLM options like temperature, max_tokens, and streaming functions from being applied.

Solution

  • Updated the Agent interface to accept variadic ChainCallOption parameters
  • Propagated options through Executor.Call() → Agent.Plan() → chains.Predict() → LLM.GenerateContent()
  • Exported GetLLMCallOptions() for converting chain options to LLM options
  • Updated all agent implementations (MRKL, Conversational, OpenAI Functions)
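The interface change above follows Go's functional-options pattern. A minimal, self-contained sketch of the shape of the fix (simplified stand-in types; the real langchaingo signatures differ in detail):

```go
package main

import "fmt"

// chainCallOption is a simplified stand-in for chains.ChainCallOption.
type chainCallOption func(*callOptions)

type callOptions struct {
	Temperature float64
	MaxTokens   int
}

// WithTemperature mirrors the option constructors users pass in.
func WithTemperature(t float64) chainCallOption {
	return func(o *callOptions) { o.Temperature = t }
}

// Agent mirrors the updated interface: Plan now takes variadic options
// instead of silently dropping them.
type Agent interface {
	Plan(inputs map[string]string, options ...chainCallOption) (string, error)
}

type mrklAgent struct{}

func (a *mrklAgent) Plan(inputs map[string]string, options ...chainCallOption) (string, error) {
	opts := callOptions{Temperature: 0.7} // defaults
	for _, opt := range options {
		opt(&opts) // options are now applied, not ignored
	}
	return fmt.Sprintf("plan with temperature=%.1f", opts.Temperature), nil
}

func main() {
	var a Agent = &mrklAgent{}
	out, _ := a.Plan(map[string]string{"input": "hi"}, WithTemperature(0.2))
	fmt.Println(out)
}
```

Because Plan is variadic, existing call sites that pass no options keep compiling unchanged, which is why the interface change is source-compatible for callers (though not for other Agent implementations).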

Testing

  • All existing tests pass
  • Updated executor_test.go and all agent implementations to match new signature

tmc added 2 commits October 15, 2025 15:24
Fix security issue where context deadline errors could expose API keys and
sensitive request details in error messages. Added sanitizeHTTPError function
to detect context timeouts and network errors, then return generic error
messages without exposing sensitive information.

Changes:
- Added sanitizeHTTPError() function to sanitize HTTP client errors
- Updated chat.go to use sanitizeHTTPError() for http.Do() errors
- Updated embeddings.go to use sanitizeHTTPError() for http.Do() errors
- Added comprehensive test cases to prevent regression
Fix issue where ChainCallOption parameters were silently ignored by Executor.Call() and Agent implementations.

Changes:
- Updated Agent.Plan() interface signature to accept variadic ChainCallOption parameters
- Updated Executor.Call() to accept and propagate options to Agent.Plan()
- Updated Executor.doIteration() to propagate options through the chain
- Updated OneShotZeroAgent.Plan() to accept and pass options to chains.Predict()
- Updated ConversationalAgent.Plan() to accept and pass options to chains.Predict()
- Updated OpenAIFunctionsAgent.Plan() to accept and pass options to LLM.GenerateContent()
- Exported GetLLMCallOptions() function for option conversion (was getLLMCallOptions)
- Updated test mock to match new Agent interface signature

Now users can pass LLM configuration options (temperature, max tokens, etc.) through executors to agents.
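The propagation path in the commit message can be sketched end to end. This is a self-contained model with simplified stand-in types (the real Executor, Agent, and GetLLMCallOptions in langchaingo have richer signatures), showing how Executor.Call now forwards options down to where LLM options are built:

```go
package main

import "fmt"

// ChainCallOption and its constructors are simplified stand-ins.
type chainOptions struct {
	Temperature float64
	MaxTokens   int
}

type ChainCallOption func(*chainOptions)

func WithTemperature(t float64) ChainCallOption {
	return func(o *chainOptions) { o.Temperature = t }
}

func WithMaxTokens(n int) ChainCallOption {
	return func(o *chainOptions) { o.MaxTokens = n }
}

type llmOptions struct {
	Temperature float64
	MaxTokens   int
}

// GetLLMCallOptions models the newly exported helper: it folds the
// chain-level options and converts them to LLM-level options.
func GetLLMCallOptions(options ...ChainCallOption) llmOptions {
	co := chainOptions{}
	for _, opt := range options {
		opt(&co)
	}
	return llmOptions{Temperature: co.Temperature, MaxTokens: co.MaxTokens}
}

type Executor struct{}

// Call forwards options to the planning step; before the fix they
// were accepted here and then dropped.
func (e *Executor) Call(input string, options ...ChainCallOption) string {
	return e.plan(input, options...)
}

func (e *Executor) plan(input string, options ...ChainCallOption) string {
	llmOpts := GetLLMCallOptions(options...)
	return fmt.Sprintf("generate(%q, temp=%.1f, max_tokens=%d)",
		input, llmOpts.Temperature, llmOpts.MaxTokens)
}

func main() {
	e := &Executor{}
	fmt.Println(e.Call("hello", WithTemperature(0.2), WithMaxTokens(256)))
}
```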
@mikejw

mikejw commented Oct 17, 2025

I need this!

@tmc tmc merged commit db2a947 into main Oct 20, 2025
161 checks passed
@tmc tmc deleted the fix/chainoption-propagation branch October 20, 2025 00:03