feat: Add Ollama integration with opt-in support #278
## Summary
Implemented Ollama integration to enable local AI inference using the
qwen2.5-coder:0.5b model for the inline explain action. This provides faster,
offline code explanation without relying on the Claude API.
## Changes
### build.sh - Ollama Setup Automation
- Added automatic Ollama installation (macOS: Homebrew, Linux: official script)
- Auto-download qwen2.5-coder:0.5b model (~400MB)
- Auto-start Ollama service (brew services / systemd)
- Added VIBING_SKIP_OLLAMA=1 option to skip setup
- Color-coded output for better visibility
### Core Implementation
- `lua/vibing/infrastructure/ollama/http_client.lua`
- HTTP client using curl for Ollama API communication
- Streaming support with JSON Lines parsing
- Connection health check
- `lua/vibing/infrastructure/adapter/ollama.lua`
- Ollama adapter implementing Vibing.Adapter interface
- Supports streaming responses
- No tool execution support (explanation only)
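The streaming parse described for `http_client.lua` can be sketched as follows. This is a minimal illustration, not the actual plugin code, under the assumption (confirmed by a later fix commit in this PR) that `jobstart`'s `on_stdout` delivers pre-split lines, each holding one complete JSON object; the helper name is hypothetical:

```lua
-- Hypothetical sketch of per-line JSON parsing for Ollama's streaming
-- /api/generate endpoint. Assumption: each element of `lines` is one
-- complete JSON object (jobstart pre-splits stdout on newlines).
local function handle_stdout_lines(lines, on_chunk, on_done)
  for _, line in ipairs(lines) do
    if line ~= "" then
      local ok, decoded = pcall(vim.json.decode, line)
      if ok and decoded then
        if decoded.response and on_chunk then
          -- Deliver the text fragment on Neovim's main loop
          vim.schedule(function()
            on_chunk(decoded.response)
          end)
        end
        if decoded.done and on_done then
          vim.schedule(function()
            on_done(true)
          end)
        end
      end
    end
  end
end
```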
### Configuration
- Added `ollama` config section in `config.lua`:
- `enabled`: Enable/disable Ollama (default: false)
- `url`: Ollama server URL (default: http://localhost:11434)
- `model`: Model name (default: qwen2.5-coder:0.5b)
- `timeout`: Request timeout in ms (default: 30000)
- `stream`: Enable streaming (default: true)
### Adapter Selection
- Modified `lua/vibing/application/inline/use_case.lua`:
- `explain` action uses Ollama when enabled
- Other actions (fix, feat, refactor, test) use agent_sdk
- Automatic fallback to agent_sdk if Ollama unavailable
### Plugin Initialization
- Modified `lua/vibing/init.lua`:
- Initialize both agent_sdk and Ollama adapters
- Added `get_ollama_adapter()` function
- Cleanup Ollama adapter on exit
## Usage
```lua
require("vibing").setup({
  ollama = {
    enabled = true, -- Enable Ollama for explain action
  },
})
```
Then use `:VibingInline explain` on a visual selection.
## Related
- Issue #262: Implement inline chat and completion using Ollama qwen2.5-coder:0.5B
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Fix HTTP client JSON parsing for streaming responses
  - on_stdout callback receives pre-split lines, not lines with newlines
  - Changed from searching for newlines to direct JSON parsing per line
- Fix output buffer name duplication error
  - Reuse existing buffer if name already exists
  - Prevents "E95: Buffer with this name already exists" error
- Add Japanese language support for Ollama
  - Add system prompt to enforce Japanese-only responses
  - Reduces Chinese language mixing in qwen2.5-coder output
- Add debug notifications for adapter selection
  - Notify when Ollama adapter is selected for explain action
- Automatically use Ollama adapter when enabled for title generation
- Falls back to Agent SDK if Ollama is not available
- Set language option to English for filename-friendly titles
- Limit conversation context to first 2 and last 2 messages
- Truncate each message to 300 characters to prevent context overflow
- Improve prompt with specific examples and technical focus
- Explicitly forbid Chinese characters in title
- Better suited for small models like qwen2.5-coder:0.5b
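The context-limiting step above can be sketched roughly as below. This is a simplified illustration, not the actual `title_generator.lua` code; the function name is hypothetical, and the 300-character cut shown is byte-based, matching the commit as described (a later review comment flags the multi-byte caveat):

```lua
-- Illustrative sketch: keep only the first 2 and last 2 messages,
-- truncating each to ~300 characters, to fit small models like
-- qwen2.5-coder:0.5b.
local function limit_context(messages)
  local selected = {}
  local n = #messages
  for i, msg in ipairs(messages) do
    if i <= 2 or i > n - 2 then
      local content = msg.content
      if #content > 300 then
        content = content:sub(1, 300) .. "..."
      end
      table.insert(selected, content)
    end
  end
  return table.concat(selected, "\n")
end
```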
- Fix logic for existing file case: rename file first, then update buffer name
- Add write! to force save for new files
- Remove redundant save operation for existing files

Fixes "E13: File exists (add ! to override)" error
- Increase sanitize() max length from 32 to 50 characters
- Adjust title generation prompt from 15-30 to 20-50 characters
- Add longer examples in prompt for better guidance
- Prevents title truncation like "vibingnvim_ollama_integration_fi"
- Add explicit exclusion of dates, timestamps, and prefixes in prompt
- Strip timestamp patterns (chat_YYYYMMDD_, YYYYMMDD_) from sanitized text
- Remove .vibing extension if included in generated title
- Prevents titles like "chat_20260108_vibingnvim_ollama_integration"
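A rough sketch of this stripping step (the Lua patterns mirror the `chat_YYYYMMDD_` / `YYYYMMDD_` / `.vibing` rules listed above; the function name and surrounding structure are illustrative, not the actual `filename.lua` code):

```lua
-- Illustrative sketch: strip timestamp prefixes and the .vibing
-- extension from a generated title before using it as a filename.
local function strip_title_noise(title)
  title = title:gsub("chat_?%d%d%d%d%d%d%d%d_?", "") -- chat_YYYYMMDD_ variants
  title = title:gsub("^%d%d%d%d%d%d%d%d_?", "")      -- bare YYYYMMDD_ prefix
  title = title:gsub("%.vibing$", "")                -- trailing extension
  return title
end
-- e.g. "chat_20260108_ollama_integration" -> "ollama_integration"
```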
- Remove Ollama from inline explain action (Claude provides better quality)
- Add use_for_title config option for Ollama (default: false)
- Title generation uses Claude by default; Ollama is opt-in
- Change default model from 0.5b to 1.5b for better quality
- Update type annotations with model recommendations

Rationale:
- Explain action needs high accuracy → Claude only
- Title generation can use a lightweight model → Ollama opt-in
- Reduces Ollama integration scope to a minimal, specific use case
- Add new 'doc' inline action for generating documentation comments
- Support JSDoc/TSDoc/EmmyLua comment styles with language detection
- Add use_for_doc config option for Ollama opt-in (default: false)
- Use Ollama adapter when enabled for lightweight doc generation
- Update VibingInline command completion to include 'doc' action
- Create AdapterManager to centralize adapter selection logic
- Remove duplicate adapter selection code from use_case.lua
- Add get_adapter_for(use_case) API for use-case-based selection
- Maintain backward compatibility with get_adapter() and get_ollama_adapter()
- Simplify init.lua by delegating to AdapterManager
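The centralized selection API from this refactor might look roughly like the following. This is an illustrative sketch, not the actual `manager.lua`; everything except `get_adapter_for`, `use_for_title`, and `use_for_doc` (which appear in the PR) is an assumption:

```lua
-- Illustrative AdapterManager sketch: use-case based selection with
-- opt-in Ollama for "title" and "doc", Agent SDK otherwise.
local AdapterManager = {}
AdapterManager.__index = AdapterManager

function AdapterManager.new(config, agent_sdk_adapter, ollama_adapter)
  return setmetatable({
    config = config,
    agent_sdk = agent_sdk_adapter,
    ollama = ollama_adapter,
  }, AdapterManager)
end

function AdapterManager:get_adapter_for(use_case)
  local cfg = self.config.ollama or {}
  local opt_in = { title = cfg.use_for_title, doc = cfg.use_for_doc }
  -- Ollama only when enabled, opted in for this use case, and available
  if use_case and opt_in[use_case] and cfg.enabled and self.ollama then
    return self.ollama
  end
  return self.agent_sdk
end

return AdapterManager
```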
📝 Walkthrough

This pull request introduces Ollama integration to the Vibing plugin, enabling local LLM inference alongside the existing Claude Agent SDK. It adds HTTP-based Ollama adapter components, configuration options, an adapter manager for intelligent backend selection, and build-system updates for Ollama setup.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User as User / Plugin
    participant Init as vibing.init
    participant Manager as AdapterManager
    participant Claude as Claude Adapter
    participant Ollama as Ollama Adapter
    User->>Init: setup(config)
    Init->>Init: Create agent_sdk_adapter
    alt ollama.enabled == true
        Init->>Ollama: new(config)
        Ollama->>Ollama: Initialize URL, model, timeout
    end
    Init->>Manager: new(config, agent_sdk_adapter, ollama_adapter)
    Init->>Init: Store adapter_manager
    User->>Init: get_adapter_for(use_case)
    Init->>Manager: get_adapter_for(use_case)
    alt use_case == "doc" or "title"
        Manager->>Manager: Check ollama.enabled && config.use_for_*
        alt Ollama enabled for use_case
            Manager-->>Init: Return ollama_adapter
        else Ollama unavailable
            Manager-->>Init: Return agent_sdk_adapter
        end
    else default
        Manager-->>Init: Return agent_sdk_adapter
    end
    Init-->>User: Return selected adapter
```

```mermaid
sequenceDiagram
    participant Neovim as Neovim Client
    participant TitleGen as title_generator
    participant Adapter as OllamaAdapter
    participant HTTP as http_client
    participant Ollama as Ollama Server
    participant FS as Filesystem
    Neovim->>TitleGen: generate_from_conversation(conversation)
    TitleGen->>TitleGen: vibing.get_adapter_for("title")
    TitleGen->>Adapter: Select Ollama or Claude
    TitleGen->>TitleGen: Select first/last 2 messages, truncate
    TitleGen->>Adapter: stream(prompt, opts, on_chunk, on_done)
    Adapter->>Adapter: build_prompt(full_prompt, context)
    Adapter->>HTTP: check_connection(url)
    HTTP->>Ollama: GET /api/tags (curl)
    Ollama-->>HTTP: Response OK
    HTTP-->>Adapter: Connection valid
    Adapter->>HTTP: post_stream(url, body, on_chunk, on_done)
    HTTP->>Ollama: POST /api/generate (streaming)
    loop Stream chunks
        Ollama-->>HTTP: JSON response chunk
        HTTP->>HTTP: Parse & buffer JSON
        HTTP->>Adapter: on_chunk(decoded.response)
        Adapter->>TitleGen: Accumulate title text
    end
    Ollama-->>HTTP: done=true
    HTTP->>Adapter: on_done(true, final_response)
    Adapter->>TitleGen: Callback with accumulated title
    TitleGen->>TitleGen: Sanitize via filename_util
    TitleGen->>FS: Callback to update file/buffer
    FS-->>Neovim: Title applied
```

Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 5 passed
PR #278 Review: Adding Ollama Integration

Overall: A very well-structured, high-quality PR. It carefully implements an important feature, local AI (Ollama) integration, as opt-in. The architecture design is excellent and the code quality is high, but there are a few improvement suggestions.

Strengths:
1. Architecture design
2. Error handling
3. Documentation

Improvement suggestions

Security concerns:
- 🔴 Critical: piped curl execution in build.sh (line 87). Recommendation: prompting the user to install manually is safer.
- 🟡 Medium: no connection timeout set (http_client.lua:100). Problem: may wait indefinitely under network latency. Recommendation: add the `--max-time 5` option to curl.
- 🟡 Medium: missing job_id cleanup (ollama.lua:80). Problem: no `cancel()` method implementation found. Recommendation: implement `OllamaAdapter:cancel()`.

Test coverage:
- Tests for the new features are missing. Recommendation: add corresponding test files.

Recommended actions:
- Required (before merge):
- Recommended (next PR):

Summary: This PR implements the complex Ollama integration feature with excellent design and code quality. Once the security concerns are fixed, it is ready to merge.

Review result: Approved (changes requested)

Great work! Excellent implementation 👏
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
lua/vibing/ui/output_buffer.lua (1)

189-192: Wrap timer callback in `vim.schedule()` for safe API access.

The timer callback calls `_flush_chunks()`, which uses `nvim_buf_get_lines` and `nvim_buf_set_lines`. Timer callbacks run outside Neovim's main event loop, so these API calls should be wrapped in `vim.schedule()` to prevent potential issues.

Proposed fix:

```diff
 self._chunk_timer = vim.fn.timer_start(50, function()
-  self:_flush_chunks()
-  self._chunk_timer = nil
+  vim.schedule(function()
+    self:_flush_chunks()
+    self._chunk_timer = nil
+  end)
 end)
```

As per coding guidelines, `vim.schedule()` should be used when accessing Neovim APIs from async/concurrent operations.

lua/vibing/init.lua (1)

169-173: Bug: `M.adapter` is undefined; the `VibingCancel` command will silently fail.

`M.adapter` was replaced with `M.adapter_manager`, but this command still references the old field. The condition `if M.adapter then` will always be false, making the cancel command non-functional.

🐛 Fix to use adapter_manager:

```diff
 vim.api.nvim_create_user_command("VibingCancel", function()
-  if M.adapter then
-    M.adapter:cancel()
+  if M.adapter_manager then
+    M.adapter_manager:get_default_adapter():cancel()
   end
 end, { desc = "Cancel current Vibing request" })
```
🤖 Fix all issues with AI agents
In @build.sh:
- Around line 121-127: The build script sets MODEL="qwen2.5-coder:0.5b" which
conflicts with the default in lua/vibing/config.lua (qwen2.5-coder:1.5b); update
the build logic so the model string matches the config default (replace MODEL
value with qwen2.5-coder:1.5b) or, better, read the default from
lua/vibing/config.lua at runtime and use that value for MODEL; ensure the echoed
messages and the ollama pull/check logic referencing MODEL reflect the same
model string.
In @lua/vibing/application/inline/use_case.lua:
- Around line 34-36: The predefined-action branch sets use_case and calls
vibing.get_adapter_for(use_case) but never checks for a nil adapter; add the
same nil-check behavior used in M.custom() to validate the adapter before
calling the Execution functions: after computing adapter from
vibing.get_adapter_for(use_case) (symbols: use_case, adapter,
vibing.get_adapter_for) verify adapter is non-nil and handle the nil case
(log/return an error or early-return) so downstream calls into the Execution
functions do not receive a nil adapter.
In @lua/vibing/core/utils/title_generator.lua:
- Around line 43-46: The code truncates using byte length (#content) which can
split multi-byte UTF-8 characters; change the truncation to count characters and
cut at a safe byte boundary using Neovim's helper: compute the byte index for
the 300th character with vim.str_byteindex(msg.content, 300) (or fallback to
utf8 handling), then if that byte index is less than #msg.content set content =
msg.content:sub(1, byte_index) .. "..." so you never slice mid-character; update
references to content/msg.content in this block accordingly.
In @lua/vibing/infrastructure/adapter/ollama.lua:
- Around line 133-144: supports("cancel") returns true but OllamaAdapter lacks a
cancel() implementation; add a cancel(self) method on OllamaAdapter that follows
the adapter interface (matching execute() / stream() patterns) to gracefully
stop an in-flight request (e.g., signal/flag or abort request handle), ensure
cancel() is exported on the module/class and that supports() remains true only
if cancel is implemented, and update any internal request tracking (the same
request ID/handle used by stream()/execute()) so cancel() can locate and abort
the active operation.
🧹 Nitpick comments (4)
build.sh (2)
177-185: Unused loop variable.

The loop variable `i` is unused. Use `_` to indicate an intentionally unused variable.

Proposed fix:

```diff
 # Wait for server to start (max 15 seconds)
 echo "[vibing.nvim] Waiting for Ollama server to be ready..."
-for i in {1..15}; do
+for _ in {1..15}; do
   if curl -s http://localhost:11434/api/tags &> /dev/null; then
```

164-174: Consider handling non-sudo environments gracefully.

The Linux systemctl path uses `sudo`, which may fail in non-interactive or rootless contexts. The fallback to `nohup ollama serve` handles this, but the error from `sudo` may be confusing.

Suppress sudo errors and fall through to nohup:

```diff
 Linux*)
   echo "[vibing.nvim] Starting Ollama service (systemd)..."
   if command -v systemctl &>/dev/null; then
-    if sudo systemctl enable ollama 2>/dev/null && sudo systemctl start ollama 2>/dev/null; then
+    if sudo -n systemctl enable ollama 2>/dev/null && sudo -n systemctl start ollama 2>/dev/null; then
       echo "[vibing.nvim] Started via systemd"
+    else
+      # Fallback to direct start if sudo not available
+      nohup ollama serve &>/dev/null &
     fi
   else
     # Fallback to direct start
     nohup ollama serve &>/dev/null &
   fi
   ;;
```

Using `sudo -n` (non-interactive) prevents password prompts and allows clean fallback.

lua/vibing/infrastructure/ollama/http_client.lua (1)

44-48: `on_chunk` callback should use `vim.schedule()` for safe Neovim API access.

Per the coding guidelines, when accessing Neovim APIs from async/concurrent operations, `vim.schedule()` should be used. The `on_chunk` callback is invoked from a jobstart stdout handler, which runs asynchronously. While the caller in ollama.lua wraps it, defensive scheduling here ensures safety regardless of caller behavior.

♻️ Wrap on_chunk in vim.schedule:

```diff
 if decoded.response then
   if on_chunk then
-    on_chunk(decoded.response)
+    vim.schedule(function()
+      on_chunk(decoded.response)
+    end)
   end
 end
```

lua/vibing/init.lua (1)

60-74: Consider canceling both adapters on exit for completeness.

Currently only the AgentSDK adapter is canceled on VimLeavePre. If Ollama has an active streaming job, it will continue briefly after exit. This is a minor concern since the process terminates anyway.

♻️ Optional: Cancel both adapters:

```diff
 vim.api.nvim_create_autocmd("VimLeavePre", {
   callback = function()
     -- Cancel all Agent SDK processes
     if M.adapter_manager then
       M.adapter_manager:get_default_adapter():cancel()
+      local ollama = M.adapter_manager:get_ollama_adapter()
+      if ollama and ollama.cancel then
+        ollama:cancel()
+      end
     end
```
📜 Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (13)
- CLAUDE.md
- build.sh
- lua/vibing/application/chat/handlers/set_file_title.lua
- lua/vibing/application/inline/modules/action_config.lua
- lua/vibing/application/inline/use_case.lua
- lua/vibing/config.lua
- lua/vibing/core/utils/filename.lua
- lua/vibing/core/utils/title_generator.lua
- lua/vibing/infrastructure/adapter/manager.lua
- lua/vibing/infrastructure/adapter/ollama.lua
- lua/vibing/infrastructure/ollama/http_client.lua
- lua/vibing/init.lua
- lua/vibing/ui/output_buffer.lua
🧰 Additional context used
📓 Path-based instructions (6)
build.sh
📄 CodeRabbit inference engine (CLAUDE.md)
Build script must automatically register the vibing-nvim MCP server in ~/.claude.json without requiring manual configuration
Files:
build.sh
**/*.{js,mjs,ts,tsx,lua}
📄 CodeRabbit inference engine (CLAUDE.md)
Use camelCase for variable and function names in JavaScript, TypeScript, and Lua code
Files:
- lua/vibing/config.lua
- lua/vibing/infrastructure/ollama/http_client.lua
- lua/vibing/application/inline/modules/action_config.lua
- lua/vibing/ui/output_buffer.lua
- lua/vibing/infrastructure/adapter/manager.lua
- lua/vibing/core/utils/title_generator.lua
- lua/vibing/application/chat/handlers/set_file_title.lua
- lua/vibing/application/inline/use_case.lua
- lua/vibing/core/utils/filename.lua
- lua/vibing/infrastructure/adapter/ollama.lua
- lua/vibing/init.lua
**/*.lua
📄 CodeRabbit inference engine (CLAUDE.md)
Use vim.schedule() to ensure safe API calls run on Neovim's main event loop when accessing Neovim APIs from async/concurrent operations
Files:
- lua/vibing/config.lua
- lua/vibing/infrastructure/ollama/http_client.lua
- lua/vibing/application/inline/modules/action_config.lua
- lua/vibing/ui/output_buffer.lua
- lua/vibing/infrastructure/adapter/manager.lua
- lua/vibing/core/utils/title_generator.lua
- lua/vibing/application/chat/handlers/set_file_title.lua
- lua/vibing/application/inline/use_case.lua
- lua/vibing/core/utils/filename.lua
- lua/vibing/infrastructure/adapter/ollama.lua
- lua/vibing/init.lua
lua/vibing/**/*.lua
📄 CodeRabbit inference engine (CLAUDE.md)
lua/vibing/**/*.lua: All Lua modules should implement or conform to the Adapter pattern interface with methods: execute(), stream(), cancel(), and supports()
When handling concurrent chat sessions or inline actions, generate unique handle IDs using 'hrtime + random' pattern to avoid conflicts
Implement permission rule evaluation in order: deny rules first (immediate block), then allow rules (grant if matched), then default deny if no rules match
Files:
- lua/vibing/config.lua
- lua/vibing/infrastructure/ollama/http_client.lua
- lua/vibing/application/inline/modules/action_config.lua
- lua/vibing/ui/output_buffer.lua
- lua/vibing/infrastructure/adapter/manager.lua
- lua/vibing/core/utils/title_generator.lua
- lua/vibing/application/chat/handlers/set_file_title.lua
- lua/vibing/application/inline/use_case.lua
- lua/vibing/core/utils/filename.lua
- lua/vibing/infrastructure/adapter/ollama.lua
- lua/vibing/init.lua
lua/vibing/config.lua
📄 CodeRabbit inference engine (CLAUDE.md)
Configuration module must include type annotations for all configuration options and parameters
Files:
lua/vibing/config.lua
lua/vibing/ui/**/*.lua
📄 CodeRabbit inference engine (CLAUDE.md)
Chat messages should include timestamps in headers using format '## YYYY-MM-DD HH:MM:SS Role' for automatic chronology tracking and searchability
Files:
lua/vibing/ui/output_buffer.lua
🧠 Learnings (10)
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to build.sh : Build script must automatically register the vibing-nvim MCP server in ~/.claude.json without requiring manual configuration
Applied to files:
build.sh
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to mcp-server/**/*.{js,ts,tsx} : MCP server implementation must handle tool prefixes with 'mcp__vibing-nvim__' and support all listed Neovim API tools without blocking the main event loop
Applied to files:
build.sh
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/config.lua : Configuration module must include type annotations for all configuration options and parameters
Applied to files:
lua/vibing/config.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/actions/inline.lua : Inline actions must be queued and executed serially to prevent file modification conflicts, with errors in one task not blocking subsequent tasks
Applied to files:
- lua/vibing/application/inline/modules/action_config.lua
- lua/vibing/application/inline/use_case.lua
- lua/vibing/init.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/ui/chat_buffer.lua : Chat buffer must support session persistence by storing and resuming from SDK session_id, and provide Markdown rendering with integrated diff viewer
Applied to files:
- lua/vibing/ui/output_buffer.lua
- lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/**/*.lua : All Lua modules should implement or conform to the Adapter pattern interface with methods: execute(), stream(), cancel(), and supports()
Applied to files:
- lua/vibing/infrastructure/adapter/manager.lua
- lua/vibing/application/inline/use_case.lua
- lua/vibing/infrastructure/adapter/ollama.lua
- CLAUDE.md
- lua/vibing/init.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/**/*.lua : When handling concurrent chat sessions or inline actions, generate unique handle IDs using 'hrtime + random' pattern to avoid conflicts
Applied to files:
lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to **/*.vibing : Chat files must be saved as Markdown with YAML frontmatter including: vibing.nvim flag, session_id, created_at, mode, model, permissions_mode, permissions_allow/deny, and optional language field
Applied to files:
lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/ui/**/*.lua : Chat messages should include timestamps in headers using format '## YYYY-MM-DD HH:MM:SS Role' for automatic chronology tracking and searchability
Applied to files:
lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/ui/inline_progress.lua : Inline progress window must display queue notifications (e.g., 'Executing task (2 more in queue)...') for multiple queued inline actions
Applied to files:
lua/vibing/application/inline/use_case.lua
🧬 Code graph analysis (1)
lua/vibing/init.lua (2)
lua/vibing/infrastructure/adapter/agent_sdk.lua (1)
AgentSDK:new(19-31)lua/vibing/infrastructure/adapter/ollama.lua (1)
OllamaAdapter:new(15-22)
🪛 GitHub Actions: CI
CLAUDE.md
[error] 402-402: MD013/line-length Line length [Expected: 120; Actual: 133]
[error] 427-427: MD013/line-length Line length [Expected: 120; Actual: 222]
[error] 444-444: MD029/ol-prefix Ordered list item prefix [Expected: 1; Actual: 3; Style: 1/2/3]
[error] 445-445: MD029/ol-prefix Ordered list item prefix [Expected: 2; Actual: 4; Style: 1/2/3]
🪛 Shellcheck (0.11.0)
build.sh
[warning] 179-179: i appears unused. Verify use (or export if used externally).
(SC2034)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: claude-review
🔇 Additional comments (18)
lua/vibing/ui/output_buffer.lua (1)
57-74: LGTM! Buffer reuse logic is well-implemented.

The buffer reuse approach correctly identifies existing buffers by name and reinitializes their content, avoiding buffer proliferation when reopening the same output view.
lua/vibing/application/chat/handlers/set_file_title.lua (1)
93-118: LGTM! The rename-and-update flow is well-structured.

The distinction between existing-file (rename first, then update buffer) and new-file (set buffer name, then write) workflows is correctly implemented. Using `vim.fn.rename` for existing files is safer than buffer manipulation.

lua/vibing/config.lua (2)

20-29: LGTM! Type annotations are comprehensive.

All Ollama configuration fields are properly annotated with types and Japanese descriptions, adhering to the coding guidelines for this configuration module.

179-187: Defaults are sensible with opt-in approach.

The default configuration keeps Ollama disabled and makes `use_for_title` and `use_for_doc` opt-in (false by default), ensuring Claude remains the default as stated in the PR objectives.

Consider adding validation for the `ollama.url` format in `M.setup()`, similar to other config validations. This would catch misconfigured URLs early.

lua/vibing/core/utils/filename.lua (2)

14-18: Timestamp stripping patterns look correct.

The pattern `chat_?%d%d%d%d%d%d%d%d_?` correctly handles both `chat_YYYYMMDD_` and `chatYYYYMMDD` variants. The second pattern, `^%d%d%d%d%d%d%d%d_?`, anchored at the start, handles standalone date prefixes. Good ordering: timestamp removal before extension removal.

27-30: Increased length limit supports more descriptive titles.

Extending from 32 to 50 characters accommodates the longer, more descriptive titles that may be generated by the title generation flow.
build.sh (1)
86-92: Security note: piped install script.

The `curl | sh` pattern is standard for Ollama's official installer but executes remote code. This is acceptable since it's the official installation method, but users should be aware.

lua/vibing/application/inline/modules/action_config.lua (1)

38-42: LGTM! Well-defined doc action.

The `doc` action is appropriately configured with a clear prompt that handles multiple documentation styles (JSDoc/TSDoc/EmmyLua). Using the `Edit` tool and `use_output_buffer = false` ensures documentation is added directly to the source file, matching user expectations for this action type.

lua/vibing/core/utils/title_generator.lua (1)

50-56: LGTM!

The prompt is well-structured with clear constraints for filename-safe title generation. The explicit exclusion list and character length guidance should help produce consistent results.
lua/vibing/infrastructure/ollama/http_client.lua (1)
97-112: LGTM!

The connection check is correctly implemented with appropriate curl flags (`-s -f`) and proper `vim.schedule()` usage in the callback.

CLAUDE.md (2)
618-688: Ollama integration documentation is clear and comprehensive.

The setup, configuration, usage, and architecture sections provide excellent guidance for users adopting Ollama integration.

330-334: LGTM!

The adapter module structure documentation correctly reflects the new Ollama adapter and HTTP client paths.
lua/vibing/infrastructure/adapter/ollama.lua (2)
80-110: LGTM!

The streaming implementation correctly uses `vim.schedule()` for the `on_chunk` callback, ensuring safe Neovim API access from async operations. The error handling and response accumulation are well-structured.

15-22: LGTM!

Constructor properly initializes adapter properties with sensible defaults, following the established pattern from BaseAdapter.
lua/vibing/infrastructure/adapter/manager.lua (2)
24-40: LGTM!

The adapter selection logic is clean and correctly implements the opt-in strategy for Ollama integration. The guard conditions properly handle nil values and config flags.

12-18: LGTM!

The constructor properly sets up the metatable for method inheritance and stores all necessary references.
lua/vibing/init.lua (2)
45-58: LGTM!

The adapter initialization correctly creates the AgentSDK adapter unconditionally and the Ollama adapter conditionally based on configuration. The AdapterManager is properly constructed with both.

190-216: LGTM!

The public API methods provide clean access patterns with appropriate error handling. `get_adapter_for()` correctly throws on uninitialized state, while `get_adapter()` and `get_ollama_adapter()` gracefully return nil for backward compatibility.
```sh
MODEL="qwen2.5-coder:0.5b"
echo "[vibing.nvim] Checking model: $MODEL"
if ollama list 2>/dev/null | grep -q "$MODEL"; then
    echo -e "${GREEN}[vibing.nvim] ✓ Model $MODEL already downloaded${NC}"
else
    echo "[vibing.nvim] Downloading model $MODEL (~400MB, this may take a few minutes)..."
    if ollama pull "$MODEL"; then
```
Model version mismatch with config default.
The build script downloads qwen2.5-coder:0.5b but lua/vibing/config.lua defaults to qwen2.5-coder:1.5b. Users enabling Ollama may expect the configured default model to be available.
Suggested fix: Align with config default
```diff
 # 4. Pull the model if not already downloaded
-MODEL="qwen2.5-coder:0.5b"
+MODEL="qwen2.5-coder:1.5b"
 echo "[vibing.nvim] Checking model: $MODEL"
```
```lua
-- アダプター選択: アクション名に基づいて適切なアダプターを取得
local use_case = (action and action_or_prompt == "doc") and "doc" or nil
local adapter = vibing.get_adapter_for(use_case)
```
Missing nil-check for adapter in predefined action path.
Unlike M.custom() (lines 83-88) which checks if adapter is nil, the predefined action path passes adapter directly to Execution functions without validation. If get_adapter_for() returns nil, this could cause errors downstream.
Proposed fix: Add nil-check for consistency
```diff
 -- アダプター選択: アクション名に基づいて適切なアダプターを取得
 local use_case = (action and action_or_prompt == "doc") and "doc" or nil
 local adapter = vibing.get_adapter_for(use_case)
+
+if not adapter then
+  notify.error("No adapter configured", "Inline")
+  return
+end

 -- If not a predefined action, treat as custom natural language instruction
 if not action then
```
In @lua/vibing/application/inline/use_case.lua around lines 34 - 36, The
predefined-action branch sets use_case and calls
vibing.get_adapter_for(use_case) but never checks for a nil adapter; add the
same nil-check behavior used in M.custom() to validate the adapter before
calling the Execution functions: after computing adapter from
vibing.get_adapter_for(use_case) (symbols: use_case, adapter,
vibing.get_adapter_for) verify adapter is non-nil and handle the nil case
(log/return an error or early-return) so downstream calls into the Execution
functions do not receive a nil adapter.
```lua
local content = msg.content
if #content > 300 then
  content = content:sub(1, 300) .. "..."
end
```
Byte-length truncation may break multi-byte characters.
Using #content counts bytes, not characters. For UTF-8 content (Japanese, Chinese, etc.), truncating at byte 300 may split a multi-byte character, producing invalid UTF-8 or corrupted text.
🔧 Suggested fix using vim.fn.strcharpart for safe truncation
```diff
-if #content > 300 then
-  content = content:sub(1, 300) .. "..."
+-- Use vim.fn.strcharlen for character count, vim.fn.strcharpart for safe slicing
+if vim.fn.strcharlen(content) > 300 then
+  content = vim.fn.strcharpart(content, 0, 300) .. "..."
 end
```
🤖 Prompt for AI Agents
In @lua/vibing/core/utils/title_generator.lua around lines 43-46: the code
truncates using byte length (#content), which can split multi-byte UTF-8
characters. Change the truncation to count characters and cut at a safe byte
boundary using Neovim's helper: compute the byte index for the 300th character
with vim.str_byteindex(msg.content, 300) (or fall back to utf8 handling), then,
if that byte index is less than #msg.content, set content = msg.content:sub(1,
byte_index) .. "..." so you never slice mid-character; update references to
content/msg.content in this block accordingly.
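Because vim.fn.strcharpart and vim.str_byteindex only exist inside Neovim, the character-versus-byte distinction is easiest to demonstrate in a standalone sketch (Python shown here; the function name is ours, and the 300-character limit mirrors the snippet above):

```python
def truncate_chars(text: str, limit: int = 300) -> str:
    """Truncate to at most `limit` characters (not bytes), appending an ellipsis."""
    # Python strings are sequences of code points, so slicing never splits a
    # multi-byte UTF-8 character -- the same guarantee vim.fn.strcharpart gives.
    if len(text) > limit:
        return text[:limit] + "..."
    return text

# "こんにちは" is 5 characters but 15 bytes in UTF-8; truncating at a fixed
# byte offset could cut inside a character, character-based slicing cannot.
s = "こんにちは" * 100               # 500 characters, 1500 bytes
out = truncate_chars(s)
assert len(out) == 303               # 300 characters plus "..."
out.encode("utf-8")                  # still valid UTF-8
```

The Lua equivalent with strcharlen/strcharpart carries the same invariant: the cut point is always a character boundary.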
```lua
function OllamaAdapter:supports(feature)
  if feature == "streaming" then
    return true
  elseif feature == "cancel" then
    return true
  elseif feature == "tools" then
    return false -- Ollama does not support tool execution
  elseif feature == "session" then
    return false -- no session management
  end
  return false
end
```
supports("cancel") returns true but cancel() method is not implemented.
The adapter claims to support cancellation but lacks the cancel() method required by the adapter pattern interface. This inconsistency will cause runtime errors if cancel() is called on the Ollama adapter.
🔧 Add cancel() implementation
```diff
+---Cancel the in-flight request
+function OllamaAdapter:cancel()
+  if self.job_id then
+    vim.fn.jobstop(self.job_id)
+    self.job_id = nil
+  end
+end
+
+---Feature support check
+---@param feature string feature name
+---@return boolean
 function OllamaAdapter:supports(feature)
```
Based on learnings, all Lua modules should implement the Adapter pattern interface with methods: execute(), stream(), cancel(), and supports().
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```lua
---Cancel the in-flight request
function OllamaAdapter:cancel()
  if self.job_id then
    vim.fn.jobstop(self.job_id)
    self.job_id = nil
  end
end

function OllamaAdapter:supports(feature)
  if feature == "streaming" then
    return true
  elseif feature == "cancel" then
    return true
  elseif feature == "tools" then
    return false -- Ollama does not support tool execution
  elseif feature == "session" then
    return false -- no session management
  end
  return false
end
```
🤖 Prompt for AI Agents
In @lua/vibing/infrastructure/adapter/ollama.lua around lines 133-144:
supports("cancel") returns true but OllamaAdapter lacks a cancel()
implementation. Add a cancel(self) method on OllamaAdapter that follows the
adapter interface (matching the execute()/stream() patterns) to gracefully stop
an in-flight request (e.g., via a flag or an abort handle), ensure cancel() is
exported on the module/class and that supports() returns true only when cancel
is implemented, and update the internal request tracking (the same request
ID/handle used by stream()/execute()) so cancel() can locate and abort the
active operation.
```lua
on_exit = function(_, exit_code, _)
  if exit_code ~= 0 then
    if on_done then
      vim.schedule(function()
        on_done(false, {
          error = "HTTP request failed",
          details = stderr_buffer,
          exit_code = exit_code,
        })
      end)
    end
  end
end,
```
Potential missing on_done callback when stream ends without done flag.
If curl exits successfully (exit_code=0) but the response never contains done: true (e.g., malformed response, server crash mid-stream), on_done will never be called, potentially leaving the caller in a hanging state.
🔧 Consider calling on_done on successful exit as fallback
```diff
 on_exit = function(_, exit_code, _)
   if exit_code ~= 0 then
     if on_done then
       vim.schedule(function()
         on_done(false, {
           error = "HTTP request failed",
           details = stderr_buffer,
           exit_code = exit_code,
         })
       end)
     end
+  else
+    -- Fallback: if done was never received but curl succeeded,
+    -- call on_done with what we have (may indicate incomplete response)
+    -- Note: This is defensive - normally decoded.done triggers on_done
   end
 end,
```
Alternatively, track whether on_done was already called and invoke it on exit if not.
Committable suggestion skipped: line range outside the PR's diff.
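The tracking alternative mentioned above — remember whether on_done has fired and invoke it on exit otherwise — can be sketched as follows. This is a Python illustration with hypothetical names; the plugin's client does the equivalent in Lua over curl's streamed output:

```python
import json

def consume_stream(lines, on_chunk, on_done):
    """Parse an Ollama-style JSON Lines stream with a done-flag fallback."""
    done_called = False
    for line in lines:
        line = line.strip()
        if not line:
            continue
        try:
            msg = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed lines rather than abort the stream
        if msg.get("response"):
            on_chunk(msg["response"])
        if msg.get("done"):
            on_done(True, None)
            done_called = True
    if not done_called:
        # Fallback: the stream ended cleanly but no {"done": true} ever arrived
        on_done(False, {"error": "stream ended without done flag"})

chunks, results = [], []
consume_stream(
    ['{"response": "Hello", "done": false}', '{"response": "!", "done": true}'],
    chunks.append,
    lambda ok, err: results.append((ok, err)),
)
assert chunks == ["Hello", "!"]
assert results == [(True, None)]
```

With the guard in place, a server crash mid-stream degrades to an explicit error callback instead of leaving the caller hanging.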
Summary
Added Ollama integration to support running lightweight tasks with a local AI.
Main changes
1. Ollama adapter implementation
2. Opt-in design
   - `use_for_title` (default: false)
   - `use_for_doc` (default: false)
3. New feature: `doc` action
   - Run via `:VibingInline doc`
4. Adapter management refactoring
   - Selection logic centralized in the `AdapterManager` module

Configuration example
```lua
require("vibing").setup({
  ollama = {
    enabled = true,
    model = "qwen2.5-coder:1.5b",
    use_for_title = true, -- use Ollama for title generation
    use_for_doc = true,   -- use Ollama for the doc action
  },
})
```
Test results
Related issue
Closes #262
🤖 Generated with Claude Code