
feat: Add Ollama integration with opt-in support#278

Draft
shabaraba wants to merge 11 commits into main from feat/ollama-integration-262

Conversation

@shabaraba
Owner

@shabaraba shabaraba commented Jan 9, 2026

Overview

Adds Ollama integration to support running lightweight tasks on a local AI.

Key changes

1. Ollama adapter implementation

  • HTTP communication with the local Ollama API
  • Streaming response support
  • Support for the qwen2.5-coder model

2. Opt-in design

  • Title generation: use_for_title (default: false)
  • Documentation generation: use_for_doc (default: false)
  • Claude is used unless Ollama is explicitly enabled

3. New feature: doc action

  • Automatic generation of JSDoc/TSDoc/EmmyLua comments
  • Fast and lightweight when Ollama is used
  • Run via :VibingInline doc

4. Adapter management refactoring

  • AdapterManager module centralizes the selection logic
  • Use-case-specific adapter retrieval API
  • Less duplicated code
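
The opt-in selection described in items 2 and 4 can be sketched as follows. This is a minimal illustration, not the actual module: the method and flag names (get_adapter_for, use_for_title, use_for_doc) come from this PR description, while the internal structure is assumed.

```lua
-- Minimal sketch of the use-case-based adapter selection; names follow the
-- PR text (get_adapter_for, use_for_title, use_for_doc), internals assumed.
local AdapterManager = {}
AdapterManager.__index = AdapterManager

function AdapterManager.new(config, claude_adapter, ollama_adapter)
  return setmetatable({
    config = config,
    claude = claude_adapter,
    ollama = ollama_adapter,
  }, AdapterManager)
end

-- Each opt-in use case maps to its config flag; everything else uses Claude.
local OPT_IN = { title = "use_for_title", doc = "use_for_doc" }

function AdapterManager:get_adapter_for(use_case)
  local flag = OPT_IN[use_case]
  local o = self.config.ollama or {}
  if flag and o.enabled and o[flag] and self.ollama then
    return self.ollama
  end
  return self.claude -- Claude remains the default unless explicitly opted in
end
```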

Configuration example

```lua
require("vibing").setup({
  ollama = {
    enabled = true,
    model = "qwen2.5-coder:1.5b",
    use_for_title = true, -- use Ollama for title generation
    use_for_doc = true, -- use Ollama for the doc action
  },
})
```

Test results

  • ✅ Streaming verified
  • ✅ Japanese responses supported
  • ✅ Title generation quality (good with the 1.5b model)
  • ✅ JSDoc/TSDoc generation quality
  • ✅ Format checks pass
  • ✅ Agent SDK wrapper verified

Related issue

Closes #262

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • Integrated Ollama for local AI-powered code explanations, fixes, and documentation generation (disabled by default)
    • Added new "doc" action to generate code documentation inline
    • Enhanced title generation with improved algorithms and Ollama support
    • Automated Ollama setup in build script with installation and model management
  • Improvements

    • Output buffers now reuse existing instances for efficiency
    • Better file title handling for both new and existing files
    • Improved filename sanitization for chat files


shabaraba and others added 11 commits January 8, 2026 17:05
## Summary
Implemented Ollama integration to enable local AI using the qwen2.5-coder:0.5b
model for the inline explain action. This provides faster, offline code
explanation without relying on the Claude API.

## Changes

### build.sh - Ollama Setup Automation
- Added automatic Ollama installation (macOS: Homebrew, Linux: official script)
- Auto-download qwen2.5-coder:0.5b model (~400MB)
- Auto-start Ollama service (brew services / systemd)
- Added VIBING_SKIP_OLLAMA=1 option to skip setup
- Color-coded output for better visibility

### Core Implementation
- `lua/vibing/infrastructure/ollama/http_client.lua`
  - HTTP client using curl for Ollama API communication
  - Streaming support with JSON Lines parsing
  - Connection health check
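
The streaming flow above (curl plus JSON Lines parsing) can be sketched roughly as follows. Inside Neovim the decoder would be vim.json.decode and the lines would arrive from jobstart's on_stdout callback; here the decoder is injected so the parsing logic is a self-contained sketch rather than the plugin's actual code.

```lua
-- Hedged sketch of per-line JSON Lines handling for an Ollama /api/generate
-- stream. `decode` is injected (vim.json.decode in the real plugin) so the
-- control flow can run outside Neovim.
local function handle_stdout_lines(lines, decode, on_chunk, on_done)
  for _, line in ipairs(lines) do
    if line ~= "" then
      -- Each non-empty line is one complete JSON object from the stream.
      local ok, obj = pcall(decode, line)
      if ok and type(obj) == "table" then
        if obj.response and on_chunk then
          on_chunk(obj.response) -- forward partial text to the caller
        end
        if obj.done and on_done then
          on_done(true) -- the final object carries done=true
        end
      end
      -- Malformed lines are skipped rather than aborting the stream.
    end
  end
end
```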

- `lua/vibing/infrastructure/adapter/ollama.lua`
  - Ollama adapter implementing Vibing.Adapter interface
  - Supports streaming responses
  - No tool execution support (explanation only)

### Configuration
- Added `ollama` config section in `config.lua`:
  - `enabled`: Enable/disable Ollama (default: false)
  - `url`: Ollama server URL (default: http://localhost:11434)
  - `model`: Model name (default: qwen2.5-coder:0.5b)
  - `timeout`: Request timeout in ms (default: 30000)
  - `stream`: Enable streaming (default: true)

### Adapter Selection
- Modified `lua/vibing/application/inline/use_case.lua`:
  - `explain` action uses Ollama when enabled
  - Other actions (fix, feat, refactor, test) use agent_sdk
  - Automatic fallback to agent_sdk if Ollama unavailable

### Plugin Initialization
- Modified `lua/vibing/init.lua`:
  - Initialize both agent_sdk and Ollama adapters
  - Added `get_ollama_adapter()` function
  - Cleanup Ollama adapter on exit

## Usage
```lua
require("vibing").setup({
  ollama = {
    enabled = true,  -- Enable Ollama for explain action
  },
})
```

Then use `:VibingInline explain` on visual selection.

## Related
- Issue #262: Implement inline chat and completion using Ollama qwen2.5-coder:0.5B

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Fix HTTP client JSON parsing for streaming responses
  - on_stdout callback receives pre-split lines, not lines with newlines
  - Changed from searching for newlines to direct JSON parsing per line
- Fix output buffer name duplication error
  - Reuse existing buffer if name already exists
  - Prevents "E95: Buffer with this name already exists" error
- Add Japanese language support for Ollama
  - Add system prompt to enforce Japanese-only responses
  - Reduces Chinese language mixing in qwen2.5-coder output
- Add debug notifications for adapter selection
  - Notify when Ollama adapter is selected for explain action

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Automatically use Ollama adapter when enabled for title generation
- Falls back to Agent SDK if Ollama is not available
- Set language option to English for filename-friendly titles

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Limit conversation context to first 2 and last 2 messages
- Truncate each message to 300 characters to prevent context overflow
- Improve prompt with specific examples and technical focus
- Explicitly forbid Chinese characters in title
- Better suited for small models like qwen2.5-coder:0.5b
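
The context-limiting strategy in this commit (first two plus last two messages, 300-character truncation) can be sketched as below. Function names are hypothetical; the real code lives in title_generator.lua.

```lua
-- Illustrative sketch of the context limiting above: keep only the first two
-- and last two messages, truncating each one to 300 characters.
local MAX_LEN = 300

local function truncate(text)
  if #text > MAX_LEN then
    -- Naive byte-based cut; multi-byte UTF-8 text needs a boundary-aware slice.
    return text:sub(1, MAX_LEN) .. "..."
  end
  return text
end

local function select_context(messages)
  local picked
  if #messages <= 4 then
    picked = messages
  else
    -- The first two and last two messages bracket the conversation.
    picked = { messages[1], messages[2], messages[#messages - 1], messages[#messages] }
  end
  local out = {}
  for _, m in ipairs(picked) do
    out[#out + 1] = { role = m.role, content = truncate(m.content) }
  end
  return out
end
```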

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Fix logic for existing file case: rename file first, then update buffer name
- Add write! to force save for new files
- Remove redundant save operation for existing files

Fixes "E13: File exists (add ! to override)" error

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Increase sanitize() max length from 32 to 50 characters
- Adjust title generation prompt from 15-30 to 20-50 characters
- Add longer examples in prompt for better guidance
- Prevents title truncation like "vibingnvim_ollama_integration_fi"

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Add explicit exclusion of dates, timestamps, and prefixes in prompt
- Strip timestamp patterns (chat_YYYYMMDD_, YYYYMMDD_) from sanitized text
- Remove .vibing extension if included in generated title
- Prevents titles like "chat_20260108_vibingnvim_ollama_integration"
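
Taken together with the earlier length changes, the sanitization rules might look roughly like this. The patterns and the 50-character cap follow the commit messages; the actual implementation in lua/vibing/core/utils/filename.lua may differ in detail.

```lua
-- Hedged sketch of the filename sanitization described in the commits above.
local MAX_LEN = 50

local function sanitize(title)
  local s = title
  s = s:gsub("chat_?%d%d%d%d%d%d%d%d_?", "") -- chat_YYYYMMDD_ style prefixes
  s = s:gsub("^%d%d%d%d%d%d%d%d_?", "")      -- bare YYYYMMDD_ date prefixes
  s = s:gsub("%.vibing$", "")                -- model sometimes echoes the extension
  s = s:gsub("[^%w_]+", "_")                 -- filename-unfriendly chars become _
  s = s:gsub("^_+", ""):gsub("_+$", "")      -- trim leading/trailing underscores
  return s:sub(1, MAX_LEN)                   -- cap at 50 characters
end
```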

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Remove Ollama from inline explain action (Claude provides better quality)
- Add use_for_title config option for Ollama (default: false)
- Title generation uses Claude by default, Ollama is opt-in
- Change default model from 0.5b to 1.5b for better quality
- Update type annotations with model recommendations

Rationale:
- Explain action needs high accuracy → Claude only
- Title generation can use lightweight model → Ollama opt-in
- Reduces Ollama integration scope to minimal, specific use case

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Add new 'doc' inline action for generating documentation comments
- Support JSDoc/TSDoc/EmmyLua comment styles with language detection
- Add use_for_doc config option for Ollama opt-in (default: false)
- Use Ollama adapter when enabled for lightweight doc generation
- Update VibingInline command completion to include 'doc' action

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- Create AdapterManager to centralize adapter selection logic
- Remove duplicate adapter selection code from use_case.lua
- Add get_adapter_for(use_case) API for use-case-based selection
- Maintain backward compatibility with get_adapter() and get_ollama_adapter()
- Simplify init.lua by delegating to AdapterManager

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
@coderabbitai
Contributor

coderabbitai bot commented Jan 9, 2026

📝 Walkthrough

Walkthrough

This pull request introduces Ollama integration to the Vibing plugin, enabling local LLM inference alongside the existing Claude Agent SDK. It adds HTTP-based Ollama adapter components, configuration options, an adapter manager for intelligent backend selection, and updates the build system for Ollama setup.

Changes

  • Documentation & Configuration (CLAUDE.md, lua/vibing/config.lua): Added Ollama integration documentation, setup instructions, and a new OllamaConfig class with fields enabled, url, model, timeout, stream, use_for_title, and use_for_doc. The default config exposes ollama settings with Ollama disabled by default.
  • Ollama Infrastructure (lua/vibing/infrastructure/adapter/ollama.lua, lua/vibing/infrastructure/ollama/http_client.lua): Introduced an OllamaAdapter class implementing HTTP streaming via curl, with a stream() method handling JSON response parsing, connection checks, and language-aware prompts. A new HTTP client provides post_stream() for streaming requests and check_connection() for availability checks.
  • Adapter Management (lua/vibing/infrastructure/adapter/manager.lua, lua/vibing/init.lua): Created an AdapterManager to select between the Ollama and Claude adapters based on use_case ("doc", "title") and config flags. Replaced the single adapter field with a manager instance; added public API methods get_adapter_for(use_case), get_adapter(), get_ollama_adapter(), and get_config().
  • Build & Setup (build.sh): Added colorized output and Ollama setup functions: start_ollama_service() for service management and setup_ollama() for end-to-end installation (OS-specific paths), model download, and readiness polling. Includes a VIBING_SKIP_OLLAMA bypass option.
  • Inline Actions (lua/vibing/application/inline/modules/action_config.lua, lua/vibing/application/inline/use_case.lua): Added a new "doc" action type to the predefined actions. Refactored adapter retrieval to use vibing.get_adapter_for(use_case) with lazy evaluation, computing use_case as "doc" for matching actions.
  • Title & Filename Handling (lua/vibing/core/utils/title_generator.lua, lua/vibing/core/utils/filename.lua, lua/vibing/application/chat/handlers/set_file_title.lua): Enhanced title generation to use the "title" use_case adapter, added an Ollama language setting, message truncation (first/last two when long), and an improved prompt for 20-50 char filenames. Updated filename sanitization to strip timestamps and increase the max length to 50 chars. Refactored the file rename logic for existing files.
  • Buffer Management (lua/vibing/ui/output_buffer.lua): Added buffer reuse logic in _create_buffer() to reuse existing vibing://<title> buffers, improving performance for repeated buffer access.

Sequence Diagram(s)

sequenceDiagram
    participant User as User / Plugin
    participant Init as vibing.init
    participant Manager as AdapterManager
    participant Claude as Claude Adapter
    participant Ollama as Ollama Adapter

    User->>Init: setup(config)
    Init->>Init: Create agent_sdk_adapter
    alt ollama.enabled == true
        Init->>Ollama: new(config)
        Ollama->>Ollama: Initialize URL, model, timeout
    end
    Init->>Manager: new(config, agent_sdk_adapter, ollama_adapter)
    Init->>Init: Store adapter_manager

    User->>Init: get_adapter_for(use_case)
    Init->>Manager: get_adapter_for(use_case)
    alt use_case == "doc" or "title"
        Manager->>Manager: Check ollama.enabled && config.use_for_*
        alt Ollama enabled for use_case
            Manager-->>Init: Return ollama_adapter
        else Ollama unavailable
            Manager-->>Init: Return agent_sdk_adapter
        end
    else default
        Manager-->>Init: Return agent_sdk_adapter
    end
    Init-->>User: Return selected adapter
sequenceDiagram
    participant Neovim as Neovim Client
    participant TitleGen as title_generator
    participant Adapter as OllamaAdapter
    participant HTTP as http_client
    participant Ollama as Ollama Server
    participant FS as Filesystem

    Neovim->>TitleGen: generate_from_conversation(conversation)
    TitleGen->>TitleGen: vibing.get_adapter_for("title")
    TitleGen->>Adapter: Select Ollama or Claude
    TitleGen->>TitleGen: Select first/last 2 messages, truncate
    TitleGen->>Adapter: stream(prompt, opts, on_chunk, on_done)
    Adapter->>Adapter: build_prompt(full_prompt, context)
    Adapter->>HTTP: check_connection(url)
    HTTP->>Ollama: GET /api/tags (curl)
    Ollama-->>HTTP: Response OK
    HTTP-->>Adapter: Connection valid
    Adapter->>HTTP: post_stream(url, body, on_chunk, on_done)
    HTTP->>Ollama: POST /api/generate (streaming)
    loop Stream chunks
        Ollama-->>HTTP: JSON response chunk
        HTTP->>HTTP: Parse & buffer JSON
        HTTP->>Adapter: on_chunk(decoded.response)
        Adapter->>TitleGen: Accumulate title text
    end
    Ollama-->>HTTP: done=true
    HTTP->>Adapter: on_done(true, final_response)
    Adapter->>TitleGen: Callback with accumulated title
    TitleGen->>TitleGen: Sanitize via filename_util
    TitleGen->>FS: Callback to update file/buffer
    FS-->>Neovim: Title applied

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Poem

🐰 Hops of joy for Ollama's way,
Local models save the day!
Adapters dance, so smart and neat,
Off we go to make things sweet!

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title check: ✅ Passed. The title accurately describes the main change: adding Ollama integration with opt-in configuration support, which is the primary focus of the changeset.
  • Linked Issues check: ✅ Passed. The PR implements all coding-related requirements from issue #262: an Ollama adapter with an HTTP client and streaming support, opt-in configuration with use_for_title/use_for_doc flags, a new doc action for inline documentation generation, and an AdapterManager for use-case-specific adapter selection, with fallback to Claude maintaining backward compatibility.
  • Out of Scope Changes check: ✅ Passed. All changes align with the PR objectives and linked issue #262 requirements; filename sanitization updates support the title generation improvements, and chat file handling changes support the refactored setup flow, with no unrelated modifications detected.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 95.65%, which is sufficient; the required threshold is 80.00%.



@claude
Contributor

claude bot commented Jan 9, 2026

PR #278 review: adding Ollama integration

Overall assessment

This is a well-structured, high-quality PR. It carefully implements an important feature, local AI (Ollama) integration, using an opt-in approach. The architecture is excellent and the code quality is high, but there are a few suggestions for improvement.


Strengths

1. Architecture

  • AdapterManager pattern: centralizing use-case-specific adapter selection is an excellent design
  • Opt-in approach: the use_for_title and use_for_doc flags keep Claude in use unless the user explicitly enables Ollama
  • Loose coupling: separating the HTTP client from the adapter improves testability and maintainability

2. Error handling

  • Connection checks run before execution, with clear error messages
  • Buffering on JSON parse failures during streaming is handled appropriately

3. Documentation

  • CLAUDE.md includes detailed usage instructions, architecture diagrams, and configuration examples
  • Thorough LuaDoc comments with complete type annotations

Suggestions for improvement

Security concerns

🔴 Critical: curl-pipe execution in build.sh (line 87)

Problem:

  • Executing an install script directly without verification is a security risk
  • A MITM attack, or a compromise of ollama.com, could lead to malicious code execution

Recommendation: it is safer to prompt the user to install manually

🟡 Medium: no connection timeout set (http_client.lua:100)

Problem: the request may wait indefinitely under network latency

Recommendation: add the --max-time 5 option to curl

🟡 Medium: missing job_id cleanup (ollama.lua:80)

Problem: no cancel() method implementation can be found

Recommendation: implement OllamaAdapter:cancel()


Test coverage

Missing tests for the new features:

  • lua/vibing/infrastructure/ollama/http_client.lua
  • lua/vibing/infrastructure/adapter/ollama.lua
  • lua/vibing/infrastructure/adapter/manager.lua

Recommendation: add the corresponding test files


Recommended actions

Required (before merge):

  1. Replace the curl-pipe execution in build.sh with a safer approach
  2. Add a timeout to check_connection
  3. Implement OllamaAdapter:cancel()

Recommended (follow-up PR):

  1. Add unit tests for the new modules
  2. Improve the sampling strategy for title generation

Summary

This PR implements the complex Ollama integration with excellent design and code quality:

  • ✅ Safe rollout via an opt-in approach
  • ✅ Loosely coupled architecture
  • ✅ Thorough documentation

Once the security concerns are addressed, it will be ready to merge.

Review verdict: approved (changes requested)

Great work, and an excellent implementation 👏

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 5

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
lua/vibing/ui/output_buffer.lua (1)

189-192: Wrap timer callback in vim.schedule() for safe API access.

The timer callback calls _flush_chunks() which uses nvim_buf_get_lines and nvim_buf_set_lines. Timer callbacks run outside Neovim's main event loop, so these API calls should be wrapped in vim.schedule() to prevent potential issues.

Proposed fix
   self._chunk_timer = vim.fn.timer_start(50, function()
-    self:_flush_chunks()
-    self._chunk_timer = nil
+    vim.schedule(function()
+      self:_flush_chunks()
+      self._chunk_timer = nil
+    end)
   end)

As per coding guidelines, vim.schedule() should be used when accessing Neovim APIs from async/concurrent operations.

lua/vibing/init.lua (1)

169-173: Bug: M.adapter is undefined; VibingCancel command will silently fail.

M.adapter was replaced with M.adapter_manager, but this command still references the old field. The condition if M.adapter then will always be false, making the cancel command non-functional.

🐛 Fix to use adapter_manager
   vim.api.nvim_create_user_command("VibingCancel", function()
-    if M.adapter then
-      M.adapter:cancel()
+    if M.adapter_manager then
+      M.adapter_manager:get_default_adapter():cancel()
     end
   end, { desc = "Cancel current Vibing request" })
🤖 Fix all issues with AI agents
In @build.sh:
- Around line 121-127: The build script sets MODEL="qwen2.5-coder:0.5b" which
conflicts with the default in lua/vibing/config.lua (qwen2.5-coder:1.5b); update
the build logic so the model string matches the config default (replace MODEL
value with qwen2.5-coder:1.5b) or, better, read the default from
lua/vibing/config.lua at runtime and use that value for MODEL; ensure the echoed
messages and the ollama pull/check logic referencing MODEL reflect the same
model string.

In @lua/vibing/application/inline/use_case.lua:
- Around line 34-36: The predefined-action branch sets use_case and calls
vibing.get_adapter_for(use_case) but never checks for a nil adapter; add the
same nil-check behavior used in M.custom() to validate the adapter before
calling the Execution functions: after computing adapter from
vibing.get_adapter_for(use_case) (symbols: use_case, adapter,
vibing.get_adapter_for) verify adapter is non-nil and handle the nil case
(log/return an error or early-return) so downstream calls into the Execution
functions do not receive a nil adapter.

In @lua/vibing/core/utils/title_generator.lua:
- Around line 43-46: The code truncates using byte length (#content) which can
split multi-byte UTF-8 characters; change the truncation to count characters and
cut at a safe byte boundary using Neovim's helper: compute the byte index for
the 300th character with vim.str_byteindex(msg.content, 300) (or fallback to
utf8 handling), then if that byte index is less than #msg.content set content =
msg.content:sub(1, byte_index) .. "..." so you never slice mid-character; update
references to content/msg.content in this block accordingly.
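
Outside Neovim, the same character-boundary truncation suggested above can be sketched with standard Lua 5.3's utf8.offset instead of vim.str_byteindex. This is an illustrative alternative, not the fix the review proposes verbatim:

```lua
-- Character-aware truncation: never slice mid-UTF-8-sequence.
local function truncate_chars(s, max_chars)
  -- utf8.offset returns the byte position where character n starts;
  -- position max_chars + 1 marks the first byte past the allowed span.
  local byte_after = utf8.offset(s, max_chars + 1)
  if byte_after == nil or byte_after > #s then
    return s -- string has max_chars characters or fewer: keep it whole
  end
  return s:sub(1, byte_after - 1) .. "..."
end
```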

In @lua/vibing/infrastructure/adapter/ollama.lua:
- Around line 133-144: supports("cancel") returns true but OllamaAdapter lacks a
cancel() implementation; add a cancel(self) method on OllamaAdapter that follows
the adapter interface (matching execute() / stream() patterns) to gracefully
stop an in-flight request (e.g., signal/flag or abort request handle), ensure
cancel() is exported on the module/class and that supports() remains true only
if cancel is implemented, and update any internal request tracking (the same
request ID/handle used by stream()/execute()) so cancel() can locate and abort
the active operation.
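
A minimal shape for the requested cancel() might look like the following. In the plugin the stop function would be vim.fn.jobstop and _job_id would be recorded by the real stream(); both are simplified and injected here so the control flow is self-contained, and the actual implementation may differ.

```lua
-- Hedged sketch of the missing OllamaAdapter:cancel() flagged above.
local OllamaAdapter = {}
OllamaAdapter.__index = OllamaAdapter

function OllamaAdapter.new(stop_fn)
  -- stop_fn stands in for vim.fn.jobstop in this sketch.
  return setmetatable({ _stop = stop_fn, _job_id = nil }, OllamaAdapter)
end

function OllamaAdapter:stream(job_id)
  -- The real stream() would start curl via jobstart and remember the handle.
  self._job_id = job_id
end

function OllamaAdapter:cancel()
  if self._job_id then
    self._stop(self._job_id) -- abort the in-flight request
    self._job_id = nil       -- forget the handle so cancel() is idempotent
  end
end

function OllamaAdapter:supports(capability)
  -- supports("cancel") is now honest: the method actually exists.
  return capability == "cancel" or capability == "stream"
end
```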
🧹 Nitpick comments (4)
build.sh (2)

177-185: Unused loop variable.

The loop variable i is unused. Use _ to indicate an intentionally unused variable.

Proposed fix
     # Wait for server to start (max 15 seconds)
     echo "[vibing.nvim] Waiting for Ollama server to be ready..."
-    for i in {1..15}; do
+    for _ in {1..15}; do
         if curl -s http://localhost:11434/api/tags &> /dev/null; then

164-174: Consider handling non-sudo environments gracefully.

The Linux systemctl path uses sudo which may fail in non-interactive or rootless contexts. The fallback to nohup ollama serve handles this, but the error from sudo may be confusing.

Suppress sudo errors and fall through to nohup
         Linux*)
             echo "[vibing.nvim] Starting Ollama service (systemd)..."
             if command -v systemctl &>/dev/null; then
-                if sudo systemctl enable ollama 2>/dev/null && sudo systemctl start ollama 2>/dev/null; then
+                if sudo -n systemctl enable ollama 2>/dev/null && sudo -n systemctl start ollama 2>/dev/null; then
                     echo "[vibing.nvim] Started via systemd"
+                else
+                    # Fallback to direct start if sudo not available
+                    nohup ollama serve &>/dev/null &
                 fi
             else
                 # Fallback to direct start
                 nohup ollama serve &>/dev/null &
             fi
             ;;

Using sudo -n (non-interactive) prevents password prompts and allows clean fallback.

lua/vibing/infrastructure/ollama/http_client.lua (1)

44-48: on_chunk callback should use vim.schedule() for safe Neovim API access.

Per the coding guidelines, when accessing Neovim APIs from async/concurrent operations, vim.schedule() should be used. The on_chunk callback is invoked from a jobstart stdout handler, which runs asynchronously. While the caller in ollama.lua wraps it, defensive scheduling here ensures safety regardless of caller behavior.

♻️ Wrap on_chunk in vim.schedule
             if decoded.response then
               if on_chunk then
-                on_chunk(decoded.response)
+                vim.schedule(function()
+                  on_chunk(decoded.response)
+                end)
               end
             end
lua/vibing/init.lua (1)

60-74: Consider canceling both adapters on exit for completeness.

Currently only the AgentSDK adapter is canceled on VimLeavePre. If Ollama has an active streaming job, it will continue briefly after exit. This is a minor concern since the process terminates anyway.

♻️ Optional: Cancel both adapters
   vim.api.nvim_create_autocmd("VimLeavePre", {
     callback = function()
       -- Agent SDKプロセスを全てキャンセル
       if M.adapter_manager then
         M.adapter_manager:get_default_adapter():cancel()
+        local ollama = M.adapter_manager:get_ollama_adapter()
+        if ollama and ollama.cancel then
+          ollama:cancel()
+        end
       end
📜 Review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 73e3c5d and 78334b1.

📒 Files selected for processing (13)
  • CLAUDE.md
  • build.sh
  • lua/vibing/application/chat/handlers/set_file_title.lua
  • lua/vibing/application/inline/modules/action_config.lua
  • lua/vibing/application/inline/use_case.lua
  • lua/vibing/config.lua
  • lua/vibing/core/utils/filename.lua
  • lua/vibing/core/utils/title_generator.lua
  • lua/vibing/infrastructure/adapter/manager.lua
  • lua/vibing/infrastructure/adapter/ollama.lua
  • lua/vibing/infrastructure/ollama/http_client.lua
  • lua/vibing/init.lua
  • lua/vibing/ui/output_buffer.lua
🧰 Additional context used
📓 Path-based instructions (6)
build.sh

📄 CodeRabbit inference engine (CLAUDE.md)

Build script must automatically register the vibing-nvim MCP server in ~/.claude.json without requiring manual configuration

Files:

  • build.sh
**/*.{js,mjs,ts,tsx,lua}

📄 CodeRabbit inference engine (CLAUDE.md)

Use camelCase for variable and function names in JavaScript, TypeScript, and Lua code

Files:

  • lua/vibing/config.lua
  • lua/vibing/infrastructure/ollama/http_client.lua
  • lua/vibing/application/inline/modules/action_config.lua
  • lua/vibing/ui/output_buffer.lua
  • lua/vibing/infrastructure/adapter/manager.lua
  • lua/vibing/core/utils/title_generator.lua
  • lua/vibing/application/chat/handlers/set_file_title.lua
  • lua/vibing/application/inline/use_case.lua
  • lua/vibing/core/utils/filename.lua
  • lua/vibing/infrastructure/adapter/ollama.lua
  • lua/vibing/init.lua
**/*.lua

📄 CodeRabbit inference engine (CLAUDE.md)

Use vim.schedule() to ensure safe API calls run on Neovim's main event loop when accessing Neovim APIs from async/concurrent operations

Files:

  • lua/vibing/config.lua
  • lua/vibing/infrastructure/ollama/http_client.lua
  • lua/vibing/application/inline/modules/action_config.lua
  • lua/vibing/ui/output_buffer.lua
  • lua/vibing/infrastructure/adapter/manager.lua
  • lua/vibing/core/utils/title_generator.lua
  • lua/vibing/application/chat/handlers/set_file_title.lua
  • lua/vibing/application/inline/use_case.lua
  • lua/vibing/core/utils/filename.lua
  • lua/vibing/infrastructure/adapter/ollama.lua
  • lua/vibing/init.lua
lua/vibing/**/*.lua

📄 CodeRabbit inference engine (CLAUDE.md)

lua/vibing/**/*.lua: All Lua modules should implement or conform to the Adapter pattern interface with methods: execute(), stream(), cancel(), and supports()
When handling concurrent chat sessions or inline actions, generate unique handle IDs using 'hrtime + random' pattern to avoid conflicts
Implement permission rule evaluation in order: deny rules first (immediate block), then allow rules (grant if matched), then default deny if no rules match

Files:

  • lua/vibing/config.lua
  • lua/vibing/infrastructure/ollama/http_client.lua
  • lua/vibing/application/inline/modules/action_config.lua
  • lua/vibing/ui/output_buffer.lua
  • lua/vibing/infrastructure/adapter/manager.lua
  • lua/vibing/core/utils/title_generator.lua
  • lua/vibing/application/chat/handlers/set_file_title.lua
  • lua/vibing/application/inline/use_case.lua
  • lua/vibing/core/utils/filename.lua
  • lua/vibing/infrastructure/adapter/ollama.lua
  • lua/vibing/init.lua
lua/vibing/config.lua

📄 CodeRabbit inference engine (CLAUDE.md)

Configuration module must include type annotations for all configuration options and parameters

Files:

  • lua/vibing/config.lua
lua/vibing/ui/**/*.lua

📄 CodeRabbit inference engine (CLAUDE.md)

Chat messages should include timestamps in headers using format '## YYYY-MM-DD HH:MM:SS Role' for automatic chronology tracking and searchability

Files:

  • lua/vibing/ui/output_buffer.lua
🧠 Learnings (10)
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to build.sh : Build script must automatically register the vibing-nvim MCP server in ~/.claude.json without requiring manual configuration

Applied to files:

  • build.sh
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to mcp-server/**/*.{js,ts,tsx} : MCP server implementation must handle tool prefixes with 'mcp__vibing-nvim__' and support all listed Neovim API tools without blocking the main event loop

Applied to files:

  • build.sh
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/config.lua : Configuration module must include type annotations for all configuration options and parameters

Applied to files:

  • lua/vibing/config.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/actions/inline.lua : Inline actions must be queued and executed serially to prevent file modification conflicts, with errors in one task not blocking subsequent tasks

Applied to files:

  • lua/vibing/application/inline/modules/action_config.lua
  • lua/vibing/application/inline/use_case.lua
  • lua/vibing/init.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/ui/chat_buffer.lua : Chat buffer must support session persistence by storing and resuming from SDK session_id, and provide Markdown rendering with integrated diff viewer

Applied to files:

  • lua/vibing/ui/output_buffer.lua
  • lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/**/*.lua : All Lua modules should implement or conform to the Adapter pattern interface with methods: execute(), stream(), cancel(), and supports()

Applied to files:

  • lua/vibing/infrastructure/adapter/manager.lua
  • lua/vibing/application/inline/use_case.lua
  • lua/vibing/infrastructure/adapter/ollama.lua
  • CLAUDE.md
  • lua/vibing/init.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/**/*.lua : When handling concurrent chat sessions or inline actions, generate unique handle IDs using 'hrtime + random' pattern to avoid conflicts

Applied to files:

  • lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to **/*.vibing : Chat files must be saved as Markdown with YAML frontmatter including: vibing.nvim flag, session_id, created_at, mode, model, permissions_mode, permissions_allow/deny, and optional language field

Applied to files:

  • lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/ui/**/*.lua : Chat messages should include timestamps in headers using format '## YYYY-MM-DD HH:MM:SS Role' for automatic chronology tracking and searchability

Applied to files:

  • lua/vibing/application/chat/handlers/set_file_title.lua
📚 Learning: 2026-01-07T06:31:23.322Z
Learnt from: CR
Repo: shabaraba/vibing.nvim PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-01-07T06:31:23.322Z
Learning: Applies to lua/vibing/ui/inline_progress.lua : Inline progress window must display queue notifications (e.g., 'Executing task (2 more in queue)...') for multiple queued inline actions

Applied to files:

  • lua/vibing/application/inline/use_case.lua
🧬 Code graph analysis (1)
lua/vibing/init.lua (2)
lua/vibing/infrastructure/adapter/agent_sdk.lua (1)
  • AgentSDK:new (19-31)
lua/vibing/infrastructure/adapter/ollama.lua (1)
  • OllamaAdapter:new (15-22)
🪛 GitHub Actions: CI
CLAUDE.md

[error] 402-402: MD013/line-length Line length [Expected: 120; Actual: 133]


[error] 427-427: MD013/line-length Line length [Expected: 120; Actual: 222]


[error] 444-444: MD029/ol-prefix Ordered list item prefix [Expected: 1; Actual: 3; Style: 1/2/3]


[error] 445-445: MD029/ol-prefix Ordered list item prefix [Expected: 2; Actual: 4; Style: 1/2/3]

🪛 Shellcheck (0.11.0)
build.sh

[warning] 179-179: i appears unused. Verify use (or export if used externally).

(SC2034)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: claude-review
🔇 Additional comments (18)
lua/vibing/ui/output_buffer.lua (1)

57-74: LGTM! Buffer reuse logic is well-implemented.

The buffer reuse approach correctly identifies existing buffers by name and reinitializes their content, avoiding buffer proliferation when reopening the same output view.

lua/vibing/application/chat/handlers/set_file_title.lua (1)

93-118: LGTM! The rename-and-update flow is well-structured.

The distinction between existing-file (rename first, then update buffer) and new-file (set buffer name, then write) workflows is correctly implemented. Using vim.fn.rename for existing files is safer than buffer manipulation.

lua/vibing/config.lua (2)

20-29: LGTM! Type annotations are comprehensive.

All Ollama configuration fields are properly annotated with types and Japanese descriptions, adhering to the coding guidelines for this configuration module.


179-187: Defaults are sensible with opt-in approach.

The default configuration keeps Ollama disabled and makes use_for_title and use_for_doc opt-in (false by default), ensuring Claude remains the default as stated in the PR objectives.

Consider adding validation for the ollama.url format in M.setup() similar to other config validations. This would catch misconfigured URLs early.
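A minimal sketch of such a check (the function name and pattern are illustrative assumptions, not existing code in the plugin):

```lua
-- Hypothetical validation for ollama.url; accepts http(s)://host[:port].
-- The pattern is deliberately strict (no trailing path) as a sketch only.
local function valid_ollama_url(url)
  return type(url) == "string"
    and url:match("^https?://[%w%.%-]+:?%d*$") ~= nil
end
```

Called from M.setup(), this would let misconfigured URLs fail fast with a clear message instead of surfacing later as opaque curl errors.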

lua/vibing/core/utils/filename.lua (2)

14-18: Timestamp stripping patterns look correct.

The pattern chat_?%d%d%d%d%d%d%d%d_? correctly handles both chat_YYYYMMDD_ and chatYYYYMMDD variants. The second pattern ^%d%d%d%d%d%d%d%d_? anchored at start handles standalone date prefixes. Good ordering: timestamp removal before extension removal.
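Illustratively, the two patterns compose like this (a sketch; the function name is assumed and the real module may differ):

```lua
-- Sketch of the timestamp-stripping order described above:
-- first the chat-prefixed form, then a bare leading date.
local function strip_timestamp(name)
  name = name:gsub("chat_?%d%d%d%d%d%d%d%d_?", "")
  name = name:gsub("^%d%d%d%d%d%d%d%d_?", "")
  return name
end

-- strip_timestamp("chat_20260109_my_topic") → "my_topic"
```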


27-30: Increased length limit supports more descriptive titles.

Extending from 32 to 50 characters accommodates the longer, more descriptive titles that may be generated by the title generation flow.

build.sh (1)

86-92: Security note: Piped install script.

The curl | sh pattern is standard for Ollama's official installer but executes remote code. This is acceptable since it's the official installation method, but users should be aware.

lua/vibing/application/inline/modules/action_config.lua (1)

38-42: LGTM! Well-defined doc action.

The doc action is appropriately configured with a clear prompt that handles multiple documentation styles (JSDoc/TSDoc/EmmyLua). Using Edit tool and use_output_buffer = false ensures documentation is added directly to the source file, matching user expectations for this action type.

lua/vibing/core/utils/title_generator.lua (1)

50-56: LGTM!

The prompt is well-structured with clear constraints for filename-safe title generation. The explicit exclusion list and character length guidance should help produce consistent results.

lua/vibing/infrastructure/ollama/http_client.lua (1)

97-112: LGTM!

The connection check is correctly implemented with appropriate curl flags (-s -f) and proper vim.schedule() usage in the callback.

CLAUDE.md (2)

618-688: Ollama integration documentation is clear and comprehensive.

The setup, configuration, usage, and architecture sections provide excellent guidance for users adopting Ollama integration.


330-334: LGTM!

The adapter module structure documentation correctly reflects the new Ollama adapter and HTTP client paths.

lua/vibing/infrastructure/adapter/ollama.lua (2)

80-110: LGTM!

The streaming implementation correctly uses vim.schedule() for the on_chunk callback, ensuring safe Neovim API access from async operations. The error handling and response accumulation are well-structured.


15-22: LGTM!

Constructor properly initializes adapter properties with sensible defaults, following the established pattern from BaseAdapter.

lua/vibing/infrastructure/adapter/manager.lua (2)

24-40: LGTM!

The adapter selection logic is clean and correctly implements the opt-in strategy for Ollama integration. The guard conditions properly handle nil values and config flags.


12-18: LGTM!

The constructor properly sets up the metatable for method inheritance and stores all necessary references.

lua/vibing/init.lua (2)

45-58: LGTM!

The adapter initialization correctly creates the AgentSDK adapter unconditionally and the Ollama adapter conditionally based on configuration. The AdapterManager is properly constructed with both.


190-216: LGTM!

The public API methods provide clean access patterns with appropriate error handling. get_adapter_for() correctly throws on uninitialized state, while get_adapter() and get_ollama_adapter() gracefully return nil for backward compatibility.

Comment on lines +121 to +127
```sh
MODEL="qwen2.5-coder:0.5b"
echo "[vibing.nvim] Checking model: $MODEL"
if ollama list 2>/dev/null | grep -q "$MODEL"; then
  echo -e "${GREEN}[vibing.nvim] ✓ Model $MODEL already downloaded${NC}"
else
  echo "[vibing.nvim] Downloading model $MODEL (~400MB, this may take a few minutes)..."
  if ollama pull "$MODEL"; then
```
⚠️ Potential issue | 🟡 Minor

Model version mismatch with config default.

The build script downloads qwen2.5-coder:0.5b but lua/vibing/config.lua defaults to qwen2.5-coder:1.5b. Users enabling Ollama may expect the configured default model to be available.

Suggested fix: Align with config default
```diff
     # 4. Pull the model if not already downloaded
-    MODEL="qwen2.5-coder:0.5b"
+    MODEL="qwen2.5-coder:1.5b"
     echo "[vibing.nvim] Checking model: $MODEL"
```
🤖 Prompt for AI Agents
In @build.sh around lines 121-127, the build script sets MODEL="qwen2.5-coder:0.5b", which conflicts with the default in lua/vibing/config.lua (qwen2.5-coder:1.5b). Update the build logic so the model string matches the config default (replace the MODEL value with qwen2.5-coder:1.5b) or, better, read the default from lua/vibing/config.lua at runtime and use that value for MODEL; ensure the echoed messages and the ollama pull/check logic referencing MODEL reflect the same model string.
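The "read the default from the config" option could be sketched like this (the grep pattern assumes the config keeps a single-line model = "..." assignment, which may not hold as the file evolves):

```shell
#!/bin/sh
# Hypothetical sketch: derive MODEL from the Lua config default instead of
# hardcoding it, so build.sh and lua/vibing/config.lua cannot drift apart.
CONFIG_FILE="lua/vibing/config.lua"

# Grab the first `model = "..."` assignment; fall back to the documented default.
MODEL=$(grep -o 'model = "[^"]*"' "$CONFIG_FILE" 2>/dev/null | head -n 1 | cut -d'"' -f2)
MODEL="${MODEL:-qwen2.5-coder:1.5b}"

echo "[vibing.nvim] Checking model: $MODEL"
```

The fallback keeps the script usable even when it runs outside the repository root.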

Comment on lines +34 to +36
```lua
-- Adapter selection: pick the appropriate adapter based on the action name
local use_case = (action and action_or_prompt == "doc") and "doc" or nil
local adapter = vibing.get_adapter_for(use_case)
```
⚠️ Potential issue | 🟠 Major

Missing nil-check for adapter in predefined action path.

Unlike M.custom() (lines 83-88) which checks if adapter is nil, the predefined action path passes adapter directly to Execution functions without validation. If get_adapter_for() returns nil, this could cause errors downstream.

Proposed fix: Add nil-check for consistency
```diff
   -- Adapter selection: pick the appropriate adapter based on the action name
   local use_case = (action and action_or_prompt == "doc") and "doc" or nil
   local adapter = vibing.get_adapter_for(use_case)
+
+  if not adapter then
+    notify.error("No adapter configured", "Inline")
+    return
+  end

   -- If not a predefined action, treat as custom natural language instruction
   if not action then
```
🤖 Prompt for AI Agents
In @lua/vibing/application/inline/use_case.lua around lines 34-36, the predefined-action branch sets use_case and calls vibing.get_adapter_for(use_case) but never checks for a nil adapter. Add the same nil-check behavior used in M.custom() to validate the adapter before calling the Execution functions: after computing adapter from vibing.get_adapter_for(use_case), verify adapter is non-nil and handle the nil case (log/return an error or early-return) so downstream calls into the Execution functions do not receive a nil adapter.

Comment on lines +43 to +46
```lua
local content = msg.content
if #content > 300 then
  content = content:sub(1, 300) .. "..."
end
```
⚠️ Potential issue | 🟡 Minor

Byte-length truncation may break multi-byte characters.

Using #content counts bytes, not characters. For UTF-8 content (Japanese, Chinese, etc.), truncating at byte 300 may split a multi-byte character, producing invalid UTF-8 or corrupted text.

🔧 Suggested fix using vim.fn.strcharpart for safe truncation

```diff
-    if #content > 300 then
-      content = content:sub(1, 300) .. "..."
+    -- Use vim.fn.strcharlen for character count, vim.fn.strcharpart for safe slicing
+    if vim.fn.strcharlen(content) > 300 then
+      content = vim.fn.strcharpart(content, 0, 300) .. "..."
     end
```
🤖 Prompt for AI Agents
In @lua/vibing/core/utils/title_generator.lua around lines 43-46, the code truncates using byte length (#content), which can split multi-byte UTF-8 characters. Change the truncation to count characters and cut at a safe byte boundary using Neovim's helper: compute the byte index for the 300th character with vim.str_byteindex(msg.content, 300) (with a fallback to manual UTF-8 handling), then, if that byte index is less than #msg.content, set content = msg.content:sub(1, byte_index) .. "..." so you never slice mid-character; update references to content/msg.content in this block accordingly.
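For contexts where Neovim helpers are unavailable, the same guarantee can be had in plain Lua by backing the cut point off any UTF-8 continuation byte. A sketch (the helper name is illustrative, not part of the plugin):

```lua
-- Hypothetical helper: truncate a UTF-8 string to at most `limit` bytes
-- without ever cutting inside a multi-byte character.
local function truncate_utf8(s, limit)
  if #s <= limit then
    return s
  end
  local i = limit
  -- UTF-8 continuation bytes are 0x80-0xBF; back off until the byte
  -- just past the cut is a character boundary (ASCII or a lead byte).
  while i > 0 do
    local b = s:byte(i + 1)
    if b < 0x80 or b >= 0xC0 then
      break
    end
    i = i - 1
  end
  return s:sub(1, i) .. "..."
end
```

This runs in LuaJIT/Lua 5.1 (Neovim's runtime), which lacks the utf8 standard library, so it relies only on byte inspection.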

Comment on lines +133 to +144
```lua
function OllamaAdapter:supports(feature)
  if feature == "streaming" then
    return true
  elseif feature == "cancel" then
    return true
  elseif feature == "tools" then
    return false -- Ollama does not support tool execution
  elseif feature == "session" then
    return false -- no session management
  end
  return false
end
```
⚠️ Potential issue | 🟠 Major

supports("cancel") returns true but cancel() method is not implemented.

The adapter claims to support cancellation but lacks the cancel() method required by the adapter pattern interface. This inconsistency will cause runtime errors if cancel() is called on the Ollama adapter.

🔧 Add cancel() implementation

```diff
+---Cancel the in-flight request
+function OllamaAdapter:cancel()
+  if self.job_id then
+    vim.fn.jobstop(self.job_id)
+    self.job_id = nil
+  end
+end
+
 ---Feature support check
 ---@param feature string Feature name
 ---@return boolean
 function OllamaAdapter:supports(feature)
```

Based on learnings, all Lua modules should implement the Adapter pattern interface with methods: execute(), stream(), cancel(), and supports().

🤖 Prompt for AI Agents
In @lua/vibing/infrastructure/adapter/ollama.lua around lines 133-144, supports("cancel") returns true but OllamaAdapter lacks a cancel() implementation. Add a cancel(self) method on OllamaAdapter that follows the adapter interface (matching the execute()/stream() patterns) to gracefully stop an in-flight request (e.g., via a flag or by aborting the request handle), ensure cancel() is exported on the module/class and that supports() returns true only when cancel is implemented, and update internal request tracking (the same request ID/handle used by stream()/execute()) so cancel() can locate and abort the active operation.

Comment on lines +76 to +88
on_exit = function(_, exit_code, _)
if exit_code ~= 0 then
if on_done then
vim.schedule(function()
on_done(false, {
error = "HTTP request failed",
details = stderr_buffer,
exit_code = exit_code,
})
end)
end
end
end,
⚠️ Potential issue | 🟡 Minor

Potential missing on_done callback when stream ends without done flag.

If curl exits successfully (exit_code=0) but the response never contains done: true (e.g., malformed response, server crash mid-stream), on_done will never be called, potentially leaving the caller in a hanging state.

🔧 Consider calling on_done on successful exit as fallback
```diff
     on_exit = function(_, exit_code, _)
       if exit_code ~= 0 then
         if on_done then
           vim.schedule(function()
             on_done(false, {
               error = "HTTP request failed",
               details = stderr_buffer,
               exit_code = exit_code,
             })
           end)
         end
+      else
+        -- Fallback: if done was never received but curl succeeded,
+        -- call on_done with what we have (may indicate incomplete response)
+        -- Note: This is defensive - normally decoded.done triggers on_done
       end
     end,
```

Alternatively, track whether on_done was already called and invoke it on exit if not.

Committable suggestion skipped: line range outside the PR's diff.
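The "track whether on_done was already called" alternative can be expressed as a small call-once wrapper (a sketch; `once` is not an existing helper in this codebase):

```lua
-- Hypothetical call-once wrapper: the wrapped callback fires at most once,
-- so it is safe to invoke from both the done-chunk handler and on_exit.
local function once(fn)
  local called = false
  return function(...)
    if called then
      return
    end
    called = true
    return fn(...)
  end
end
```

With `finish = once(on_done)`, the stream handler calls `finish(true, result)` when `done: true` arrives, and `on_exit` calls `finish(false, err)` as a fallback; whichever runs second is a no-op, so the caller can never hang or be notified twice.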

@shabaraba shabaraba marked this pull request as draft January 20, 2026 02:14


Development

Successfully merging this pull request may close these issues.

feat: Implement inline chat and completion using Ollama qwen2.5-coder:0.5B
