
Conversation

@AbanoubGhadban AbanoubGhadban (Collaborator) commented Oct 7, 2025

Summary

This PR introduces async props functionality for React on Rails Pro, enabling server components to receive props asynchronously after the initial render has started. It builds on the NDJSON-based incremental rendering infrastructure to get the most out of React 18's streaming SSR capabilities.

Problem Solved:

  • React on Rails Pro currently fetches all props from the database before starting the render, negating the benefits of React's streaming features
  • Fast-rendering UI components are blocked by slow data fetching operations
  • No way to progressively enhance the page as data becomes available

Solution:

  • Introduces a new react_component_with_async_props helper that starts rendering immediately
  • Props are streamed to the component as they become available from the database
  • Uses bidirectional HTTP streaming with HTTPx for real-time prop updates
  • Integrates seamlessly with React Suspense for automatic re-rendering

Architecture Overview

The implementation consists of three main layers:

  1. Node Renderer (✅ Complete): NDJSON streaming endpoint with incremental rendering support
  2. Ruby/Rails Layer (🚧 To Do): Async props streaming with HTTPx bidirectional communication
  3. React Package (🚧 To Do): AsyncPropsManager with Suspense integration

Key Architectural Decision: sharedExecutionContext

The sharedExecutionContext is a critical security and isolation feature we introduced to safely share data between the initial rendering request and subsequent update chunks without using global variables.

Why not use global state?

  • Data Leakage Risk: Global variables in the VM persist across multiple rendering requests, causing data from one user's request to leak into another's
  • Concurrency Issues: The same VM handles multiple concurrent rendering requests; global state would cause race conditions and data corruption
  • Security Vulnerability: Sensitive data stored globally could be accessed by subsequent requests from different users

How sharedExecutionContext works:

  1. Created as a fresh Map for each rendering request
  2. Available only during the synchronous execution of code chunks
  3. Automatically cleaned up after each code execution to prevent leakage
  4. Provides isolated storage space for each rendering request
  5. Accessible to both initial request and all subsequent update chunks of the same request

Usage Pattern:

// Initial rendering request
const asyncPropsManager = ReactOnRails.createAsyncPropsManager();
sharedExecutionContext.set("asyncPropsManager", asyncPropsManager);

// Update chunk (can safely access the same context)
const asyncPropsManager = sharedExecutionContext.get("asyncPropsManager");
asyncPropsManager.setProp("articles", data);

This ensures complete isolation between concurrent requests while allowing safe communication within a single request's lifecycle.
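
To make that lifecycle concrete, here is a minimal TypeScript sketch of the idea rather than the actual vm.ts implementation; the names createRequestExecutionContext and RendererVmContext are illustrative only.

import vm from 'node:vm';

type RendererVmContext = vm.Context & Record<string, unknown>;

export function createRequestExecutionContext(vmContext: RendererVmContext) {
  // Fresh Map for each rendering request; never stored on a global.
  const sharedExecutionContext = new Map<string, unknown>();

  return {
    runChunk(jsCode: string): unknown {
      // Exposed only while this request's chunk executes inside the shared VM.
      vmContext.sharedExecutionContext = sharedExecutionContext;
      try {
        return vm.runInContext(jsCode, vmContext);
      } finally {
        // Detached immediately afterwards so concurrent requests sharing the
        // same VM never observe another request's data.
        delete vmContext.sharedExecutionContext;
      }
    },
  };
}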

Implementation Checklist

Phase 1: Node Renderer Infrastructure ✅

  • Create /bundles/{bundleTimestamp}/incremental-render/{pathSuffix} NDJSON endpoint
  • Implement handleIncrementalRenderRequest for processing initial requests
  • Implement handleIncrementalRenderStream for NDJSON stream parsing
  • Add sharedExecutionContext Map for safe cross-chunk data sharing (prevents data leakage between requests)
  • Support UpdateChunk with target bundle specification
  • Add onRequestEnd callback support for cleanup
  • Implement proper error handling for malformed chunks
  • Add comprehensive test coverage for incremental rendering
  • Support runOnOtherBundle for cross-bundle execution
  • Ensure sharedExecutionContext cleanup after each execution (security critical)
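
For orientation, the NDJSON stream consumed by this endpoint looks roughly as follows. The type names mirror ones mentioned elsewhere in this PR, but the exact fields shown here are an approximation, not the canonical protocol.

// First line of the request body: the initial render request.
type IncrementalRenderInitialRequest = {
  protocolVersion: string;
  password?: string;
  renderingRequest: string; // JS code that kicks off the render
  dependencyBundleTimestamps?: (string | number)[];
};

// Every following line: an update chunk targeted at a specific bundle's VM.
type UpdateChunk = {
  bundleTimestamp: string | number;
  updateChunk: string; // JS code applied to the same request's execution context
};

// On the wire, one JSON object per newline-delimited line, e.g.:
//   {"protocolVersion":"...","password":"...","renderingRequest":"..."}
//   {"bundleTimestamp":1700000000000,"updateChunk":"sharedExecutionContext.get('asyncPropsManager')..."}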

Phase 2: Ruby/Rails Implementation 🚧

  • Create AsyncPropsStreamer class for managing bidirectional streaming
    • Implement HTTPx bidirectional streaming with build_request
    • Add thread-safe prop sending with mutex
    • Handle stream lifecycle and error cases
  • Add react_component_with_async_props helper in react_on_rails_pro_helper.rb
    • Accept block for async prop definitions
    • Pass async props configuration through render options
  • Update Request class to support incremental rendering
    • Detect async props mode from render options
    • Use incremental render endpoint for async props
    • Start render immediately, stream props asynchronously
  • Modify ServerRenderingJsCode to initialize async props
    • Generate AsyncPropsManager initialization code
    • Store manager in sharedExecutionContext (not global)
    • Inject manager into rails context
    • No string manipulation - clean code generation
  • Create AsyncPropsYielder for developer API
    • Support ActiveRecord relations
    • Handle Proc evaluation
    • JSON serialization of values

Phase 3: React-on-Rails-Pro Package Implementation 🚧

  • Create AsyncPropsManager class
    • Manage props Map and promises for Suspense
    • Handle timeouts (30s default)
    • Implement setProp, getProp, endStream methods
    • Thread-safe prop updates
  • Add createAsyncPropsManager to ReactOnRails global
  • Create AsyncPropsProvider with react-server conditional exports
    • Server version with context provider
    • Client version as pass-through
  • Implement useAsyncProps hook
    • Server version that suspends on missing props
    • Client version that throws error
  • Add wrapAsyncPropsRenderer for server components
    • Server version wraps with provider
    • Client version as pass-through
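
A possible shape for AsyncPropsManager, as a sketch of the checklist above rather than the final implementation: each prop is backed by a promise that resolves when setProp is called, so awaiting getProp in a server component naturally suspends until the prop arrives or times out.

type Deferred<T> = { promise: Promise<T>; resolve: (value: T) => void; reject: (err: Error) => void };

function deferred<T>(): Deferred<T> {
  let resolve!: (value: T) => void;
  let reject!: (err: Error) => void;
  const promise = new Promise<T>((res, rej) => { resolve = res; reject = rej; });
  return { promise, resolve, reject };
}

export function createAsyncPropsManager(timeoutMs = 30_000) {
  const entries = new Map<string, Deferred<unknown>>();

  const entryFor = (name: string): Deferred<unknown> => {
    let entry = entries.get(name);
    if (!entry) {
      const created = deferred<unknown>();
      entries.set(name, created);
      // Reject rather than hang forever if the prop never arrives (30s default).
      setTimeout(() => created.reject(new Error(`Async prop "${name}" timed out`)), timeoutMs);
      entry = created;
    }
    return entry;
  };

  return {
    setProp: (name: string, value: unknown) => entryFor(name).resolve(value),
    // Returns a promise a server component can await, which is what lets Suspense kick in.
    getProp: (name: string) => entryFor(name).promise,
    // Reject everything still pending when the Rails stream closes.
    endStream: () => entries.forEach((entry) => entry.reject(new Error('Async props stream ended'))),
  };
}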

Phase 4: Testing & Documentation 🚧

  • Add integration tests with real database queries
  • Create unit tests for AsyncPropsManager
  • Test timeout and error scenarios
  • Test data isolation between concurrent requests
  • Verify no data leakage across requests
  • Add example implementation in dummy app
  • Write migration guide from regular props
  • Document performance benefits
  • Add security considerations documentation
  • Add troubleshooting guide
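
The isolation and leakage checks above could be written as a small Jest-style test along these lines; it re-implements the per-request Map handling inline instead of importing the real ExecutionContext, so treat it as a sketch.

import vm from 'node:vm';
import { expect, test } from '@jest/globals';

// Inline stand-in for the real per-request context handling in vm.ts.
function runForRequest(
  vmContext: vm.Context & Record<string, unknown>,
  requestContext: Map<string, unknown>,
  code: string,
) {
  vmContext.sharedExecutionContext = requestContext;
  try {
    return vm.runInContext(code, vmContext);
  } finally {
    delete vmContext.sharedExecutionContext;
  }
}

test('sharedExecutionContext does not leak between concurrent requests', () => {
  const vmContext = vm.createContext({}) as vm.Context & Record<string, unknown>;
  const requestA = new Map<string, unknown>();
  const requestB = new Map<string, unknown>();

  runForRequest(vmContext, requestA, "sharedExecutionContext.set('user', 'alice')");
  runForRequest(vmContext, requestB, "sharedExecutionContext.set('user', 'bob')");

  expect(runForRequest(vmContext, requestA, "sharedExecutionContext.get('user')")).toBe('alice');
  expect(runForRequest(vmContext, requestB, "sharedExecutionContext.get('user')")).toBe('bob');
});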

Technical Details

Usage Example

# Rails View
<%= react_component_with_async_props("ArticlesDashboard",
      props: { title: "Dashboard" }) do |async_props| %>
  <% async_props.send("featured", Article.featured.first) %>
  <% async_props.send("articles", Article.recent.limit(10)) %>
  <% async_props.send("categories", Category.all) %>
<% end %>

// React Server Component
function ArticlesDashboard({ title }) {
  const featured = useAsyncProps("featured");
  const articles = useAsyncProps("articles");
  const categories = useAsyncProps("categories");

  return (
    <div>
      <h1>{title}</h1>
      <Suspense fallback={<Spinner />}>
        <FeaturedArticle article={featured} />
      </Suspense>
      <Suspense fallback={<Spinner />}>
        <ArticlesList articles={articles} />
      </Suspense>
    </div>
  );
}
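
The useAsyncProps hook used above is also Phase 3 work. One way its server version could satisfy the "suspends on missing props" requirement is the classic throw-a-promise pattern; getAsyncPropsManager and tryGetProp below are hypothetical stand-ins for whatever AsyncPropsProvider ends up exposing.

type AsyncPropsManager = {
  tryGetProp: (name: string) => { done: boolean; value?: unknown };
  getProp: (name: string) => Promise<unknown>;
};

// Assumed to read the request-scoped manager from AsyncPropsProvider's context.
declare function getAsyncPropsManager(): AsyncPropsManager;

function useAsyncProps<T>(name: string): T {
  const manager = getAsyncPropsManager();
  const result = manager.tryGetProp(name);
  if (result.done) {
    return result.value as T;
  }
  // Suspense contract: throwing the pending promise shows the nearest fallback
  // and re-renders this component once the prop has been streamed from Rails.
  throw manager.getProp(name);
}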

Key Implementation Notes

  1. Request Isolation: Each request gets its own sharedExecutionContext, preventing data leakage
  2. Bidirectional Streaming: Uses HTTPx's stream_bidi plugin for real-time communication
  3. Immediate Rendering: Initial render starts before any async props are available
  4. Progressive Enhancement: UI updates as each prop becomes available
  5. Error Resilience: Individual prop failures don't break the entire render
  6. Timeout Handling: Props timeout after 30s or when stream ends
  7. Thread Safety: Concurrent prop sending with proper synchronization
  8. Security First: No global state usage, complete request isolation

Security Considerations

The sharedExecutionContext design ensures:

  • No Cross-Request Contamination: Each rendering request has isolated context
  • Automatic Cleanup: Context is cleared after each code execution
  • No Global State: Prevents accidental data exposure through global variables
  • Concurrent Request Safety: Multiple users can render simultaneously without interference

Example of unsafe pattern we avoid:

// ❌ UNSAFE - Global state causes data leakage
global.asyncPropsManager = createAsyncPropsManager(); // Shared across ALL requests!

// ✅ SAFE - Request-scoped context
sharedExecutionContext.set("asyncPropsManager", createAsyncPropsManager()); // Isolated per request

Testing Plan

  1. Unit Tests: AsyncPropsManager state management and timeouts
  2. Integration Tests: Full flow from Rails helper to rendered output
  3. Performance Tests: Comparison with regular rendering
  4. Error Tests: Network failures, timeouts, malformed data
  5. Concurrency Tests: Multiple async props arriving simultaneously
  6. Security Tests: Verify no data leakage between concurrent requests
  7. Isolation Tests: Confirm sharedExecutionContext isolation

Migration Path

Existing components can be progressively migrated:

  1. Wrap server component with wrapAsyncPropsRenderer
  2. Replace react_component with react_component_with_async_props
  3. Move slow queries into the async props block
  4. Add Suspense boundaries for loading states

Performance Benefits

  • Time to First Byte (TTFB): Reduced by starting render immediately
  • Progressive Loading: Users see content as it becomes available
  • Perceived Performance: Fast UI renders while data loads
  • Resource Efficiency: Database queries run in parallel

Breaking Changes

None - this is an additive feature that doesn't affect existing functionality.

Pull Request Checklist

  • Add/update tests to cover these changes
  • Update documentation
  • Update CHANGELOG file
  • Run full test suite
  • Test with real-world data loads
  • Add performance benchmarks
  • Verify security isolation between requests


- Introduced `IncrementalRenderRequestManager` to handle streaming NDJSON requests, managing state and processing of incremental render requests.
- Added `validateBundlesExist` utility function to check for the existence of required bundles, improving error handling for missing assets.
- Refactored the incremental render endpoint to utilize the new request manager, enhancing the response lifecycle and error management.
- Updated tests to cover scenarios for missing bundles and validate the new request handling logic.
- Replaced the `IncrementalRenderRequestManager` with `handleIncrementalRenderStream` to manage streaming NDJSON requests more efficiently.
- Enhanced error handling and validation during the rendering process.
- Updated the `run` function to utilize the new stream handler, improving the response lifecycle and overall performance.
- Removed the deprecated `IncrementalRenderRequestManager` class to streamline the codebase.
- Introduced improved error handling for malformed JSON chunks during the incremental rendering process.
- Added logging and reporting for errors in subsequent chunks while allowing processing to continue.
- Updated tests to verify behavior for malformed JSON in both initial and update chunks, ensuring robust error management.
…inability

- Introduced helper functions to reduce redundancy in test setup, including `getServerAddress`, `createHttpRequest`, and `createInitialObject`.
- Streamlined the handling of HTTP requests and responses in tests, enhancing clarity and organization.
- Updated tests to utilize new helper functions, ensuring consistent structure and easier future modifications.
- Replaced inline wait functions with a new `waitFor` utility to improve test reliability and readability.
- Updated tests to utilize `waitFor` for asynchronous expectations, ensuring proper handling of processing times.
- Simplified the test structure by removing redundant wait logic, enhancing maintainability.
…processing

- Introduced `createBasicTestSetup` and `createStreamingTestSetup` helper functions to streamline test initialization and improve readability.
- Added `sendChunksAndWaitForProcessing` to handle chunk sending and processing verification, reducing redundancy in test logic.
- Updated existing tests to utilize these new helpers, enhancing maintainability and clarity in the test structure.
- Added detailed error reporting in the `waitFor` function to include the last encountered error message when a timeout occurs.
- Refactored the `createStreamingResponsePromise` function to improve clarity and maintainability by renaming variables and returning received chunks alongside the promise.
- Updated tests to utilize the new structure, ensuring robust handling of streaming responses and error scenarios.
- Introduced `validateAndGetBundlePaths` and `buildVMsForBundles` functions to streamline bundle validation and VM building processes.
- Updated `handleIncrementalRenderRequest` and `handleRenderRequest` to utilize the new validation and VM building functions, improving code clarity and maintainability.
- Enhanced error handling for rendering execution and added support for incremental updates using an EventEmitter.
- Created a new `sharedRenderUtils` module to encapsulate shared rendering logic, promoting code reuse and organization.
…ation"

This reverts commit 26bac50ae9742e25ff75e5d1b27225a4495ef3dc.
…ment

- Removed unnecessary bundle validation checks from the incremental render request flow.
- Enhanced the `handleIncrementalRenderRequest` function to directly call `handleRenderRequest`, streamlining the rendering process.
- Updated the `IncrementalRenderInitialRequest` type to support a more flexible structure for dependency timestamps.
- Improved error handling to capture unexpected errors during the rendering process, ensuring robust responses.
- Added cleanup logic in tests to restore mocks after each test case.
- Removed individual protocol version and authentication checks from the request handling flow.
- Introduced a new `performRequestPrechecks` function to streamline the validation process for incoming requests.
- Updated the `authenticate` and `checkProtocolVersion` functions to accept request bodies directly, enhancing modularity.
- Improved error handling by ensuring consistent response structures across precheck validations.
- Updated the `/upload-assets` endpoint to differentiate between assets and bundles, allowing for more flexible uploads.
- Introduced logic to extract bundles prefixed with 'bundle_' and handle them separately.
- Integrated the `handleNewBundlesProvided` function to manage the processing of new bundles.
- Added comprehensive tests to verify the correct handling of uploads with various combinations of assets and bundles, including edge cases for empty requests and duplicate bundle hashes.
- Added tests to verify directory structure and file presence for uploaded bundles and assets.
- Implemented checks for scenarios with empty requests and duplicate bundle hashes, ensuring correct behavior without overwriting existing files.
- Improved coverage of the `/upload-assets` endpoint to handle various edge cases effectively.
- Implemented a new test case for the `/upload-assets` endpoint to verify that bundles are correctly placed in their own hash directories rather than the targetBundles directory.
- Ensured that the test checks for the existence of the bundle in the appropriate directory and confirms that the target bundle directory remains empty, enhancing coverage for asset upload scenarios.
- Implemented a suite of tests for the `/bundles/:bundleTimestamp/incremental-render/:renderRequestDigest` endpoint to verify successful rendering under various conditions, including pre-uploaded bundles and assets.
- Added scenarios to test failure cases, such as missing bundles, incorrect passwords, and invalid JSON payloads.
- Enhanced coverage for handling multiple dependency bundles and processing NDJSON chunks, ensuring robust error management and response validation.
- Simplified test structure by introducing helper functions to reduce code duplication for creating worker apps and uploading bundles.
- Improved test cases for the `/bundles/:bundleTimestamp/incremental-render/:renderRequestDigest` endpoint, ensuring robust validation of successful renders and error handling for various scenarios.
- Added tests for handling invalid JSON and missing required fields, enhancing coverage for edge cases in the rendering process.
- Updated tests to ensure proper handling of multiple dependency bundles and improved response validation for different payload conditions.
- Replaced the `runInVM` function with a new `ExecutionContext` class to manage VM contexts more effectively.
- Updated the `handleRenderRequest` function to utilize the new `ExecutionContext`, improving the handling of rendering requests.
- Enhanced error management by introducing `VMContextNotFoundError` for better clarity when VM contexts are missing.
- Refactored tests to align with the new execution context structure, ensuring consistent behavior across rendering scenarios.
…andling

- Updated the parameters for the `runOnOtherBundle` function to ensure correct execution order.
- Introduced a reference to `globalThis.runOnOtherBundle` in the server rendering code for better accessibility.
- Enhanced the test fixture to align with the changes in the global context, ensuring consistent behavior across rendering requests.
- Introduced `IncrementalRenderSink` type to manage streaming updates more effectively.
- Updated `handleIncrementalRenderRequest` to return an optional sink and handle execution context errors gracefully.
- Refactored the `run` function to utilize the new sink for processing updates, enhancing error logging for unexpected chunks.
- Simplified test setup by removing unused sink methods, ensuring tests focus on relevant functionality.
- Updated the `setResponse` call in the `run` function to correctly use `result.response`.
- Expanded the incremental render tests to cover new scenarios, including basic updates, multi-bundle interactions, and error handling for malformed update chunks.
- Introduced new helper functions in test fixtures to streamline the creation of async values and streams, enhancing the robustness of the tests.
- Improved the secondary bundle's functionality to support async value resolution and streaming, ensuring consistent behavior across bundles.

coderabbitai bot (Contributor) commented Oct 7, 2025

Walkthrough

Introduces incremental NDJSON streaming render endpoints and execution-context-based VM orchestration. Centralizes request prechecks (protocol/auth), adds bundle existence validation, and extends upload-assets to handle bundles. Updates Ruby client to optionally upload assets before streaming. Refactors RSC payload to alias global functions. Adds extensive tests and fixtures for async/streaming and multi-bundle flows.

Changes

  • Rails client streaming adjustments (react_on_rails_pro/lib/react_on_rails_pro/request.rb, react_on_rails_pro/lib/react_on_rails_pro/server_rendering_js_code.rb):
    In render_code_as_stream, conditionally uploads assets when send_bundle is true, logs, then always builds form without sending bundle; request streams as before. RSC payload now aliases globalThis.runOnOtherBundle to a local before invocation.
  • Prechecks consolidation (auth/protocol) (.../worker/authHandler.ts, .../worker/checkProtocolVersionHandler.ts, .../worker/requestPrechecks.ts):
    Auth and protocol checks now operate on body objects, not FastifyRequest. Introduces AuthBody, ProtocolVersionBody, and performRequestPrechecks(body) to centralize prechecks and return early error responses.
  • Incremental streaming render feature (.../worker/worker.ts, .../worker/handleIncrementalRenderStream.ts, .../worker/handleIncrementalRenderRequest.ts):
    Adds NDJSON parser, new /incremental-render endpoint, and stream handler that parses first render request then incremental updates. Integrates with incremental handler that validates chunks, delegates initial render, and applies updates via execution context.
  • Bundle existence and utils (.../shared/utils.ts):
    Adds validateBundlesExist(...) to check main/dependency bundles, returning 410 with no-cache headers if missing; otherwise null.
  • Execution context VM refactor (.../worker/vm.ts, .../worker/handleRenderRequest.ts):
    Replaces direct VM calls with buildExecutionContext(...) and ExecutionContext.runInVM. Exposes VMContext, VMContextNotFoundError, removeVM. handleRenderRequest returns { response, executionContext? }; integrates bundle-provided handling and missing bundle checks.
  • Upload-assets handling for bundles (.../worker.ts):
    Extends asset upload to detect bundle_* files, route through handleNewBundlesProvided, and clean up; logs updated to include assets/bundles.
  • Tests: incremental streaming (.../tests/incrementalRender.test.ts, .../tests/worker.test.ts):
    Adds comprehensive tests for NDJSON streaming, auth/protocol failures, missing bundles, multi-bundle updates, error resilience, and upload-assets with bundles.
  • Tests: VM/execution context migration (.../tests/vm.test.ts, .../tests/serverRenderRSCReactComponent.test.js, .../tests/helper.ts):
    Migrates from buildVM/runInVM/getVMContext to buildExecutionContext. Adds waitFor helper. Adjusts expectations to new interfaces.
  • Fixtures for async/streaming & cross-bundle (.../tests/fixtures/bundle.js, .../tests/fixtures/secondary-bundle.js, .../tests/fixtures/projects/spec-dummy/asyncComponentsTreeForTestingRenderingRequest.js):
    Adds async value and stream helpers on ReactOnRails for primary/secondary bundles. Uses local alias for runOnOtherBundle in test RSC payload.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant Client
  participant NodeRenderer as Node Renderer (Fastify)
  participant Prechecks as Prechecks
  participant StreamHandler as Incremental Stream Handler
  participant IncHandler as Incremental Render Handler
  participant ExecCtx as Execution Context / VM

  Client->>NodeRenderer: POST /bundles/:ts/incremental-render/:digest\nContent-Type: application/x-ndjson\n(first JSON line)
  NodeRenderer->>Prechecks: performRequestPrechecks(body)
  Prechecks-->>NodeRenderer: Error Response? (optional)
  alt Precheck failed
    NodeRenderer-->>Client: Error response (e.g., 401/400)
  else Precheck ok
    NodeRenderer->>StreamHandler: handleIncrementalRenderStream(request.raw, callbacks)
    StreamHandler->>IncHandler: onRenderRequestReceived(initialRequest)
    IncHandler->>ExecCtx: handleRenderRequest(...) → buildExecutionContext(...)
    ExecCtx-->>IncHandler: { response, executionContext? }
    IncHandler-->>StreamHandler: { response, shouldContinue }
    StreamHandler-->>Client: Send initial response headers/body start
    alt shouldContinue = true and executionContext present
      loop For each subsequent NDJSON line
        Client->>StreamHandler: update chunk (JSON line)
        StreamHandler->>IncHandler: onUpdateReceived(chunk)
        IncHandler->>ExecCtx: executionContext.runInVM(update, bundlePath)
        ExecCtx-->>IncHandler: Update applied / error
        IncHandler-->>StreamHandler: Ack/Result (non-blocking on errors)
        StreamHandler-->>Client: Stream update/event (as applicable)
      end
      StreamHandler-->>Client: Stream end
    else No executionContext or shouldContinue = false
      StreamHandler-->>Client: Complete initial response
    end
  end
Loading
sequenceDiagram
  autonumber
  participant Rails as Rails (render_code_as_stream)
  participant Uploader as upload_assets
  participant NodeRenderer as Node Renderer
  Rails->>Rails: if send_bundle\n  log "Sending bundle..."
  alt send_bundle
    Rails->>Uploader: upload_assets(...)
    Uploader-->>Rails: done
  end
  Rails->>Rails: form_with_code(js_code, false)
  Rails->>NodeRenderer: perform_request(stream: true)
  NodeRenderer-->>Rails: streamed response
Loading

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120 minutes

Possibly related PRs

Suggested reviewers

  • justin808
  • ihabadham
  • Judahmeek

Poem

A bundle hops, an NDJSON stream,
I twitch my whiskers—incremental dream.
Prechecks pass, the VMs awake,
We nibble bytes for streaming’s sake.
Two bundles chat across the wire—
Carrots cached, responses fire.
Thump-thump: shipped! 🥕✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 24.00% which is insufficient. The required threshold is 80.00%. You can run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (2 passed)
  • Title Check (✅ Passed): The title accurately and concisely describes the primary feature introduced—support for asynchronous props and incremental rendering in React Server Components—without extraneous detail, making it clear to reviewers and in project history what the change delivers.
  • Description Check (✅ Passed): Check skipped - CodeRabbit’s high-level summary is enabled.


claude bot commented Oct 7, 2025

Pull Request Review - NDJSON Communication for Node Renderer

This is a comprehensive review of the changes to implement NDJSON format for communication between Rails and the node renderer. Overall, this is a well-structured refactoring that introduces streaming capabilities, but there are several areas that need attention before merging.


🟢 Strengths

Architecture & Design

  • Clean separation of concerns: The new requestPrechecks.ts, handleIncrementalRenderRequest.ts, and handleIncrementalRenderStream.ts files properly separate HTTP framework logic from business logic
  • ExecutionContext abstraction: The introduction of ExecutionContext in vm.ts is excellent - it properly encapsulates VM management and allows incremental updates to share VM state
  • Backward compatibility: The existing /bundles/:bundleTimestamp/render/:renderRequestDigest endpoint is preserved, ensuring no breaking changes
  • Test coverage: Excellent comprehensive test suite in incrementalRender.test.ts covering edge cases like malformed JSON, authentication failures, streaming responses, and multi-bundle scenarios

Code Quality

  • Type safety: Good use of TypeScript assertions and type guards (assertIsUpdateChunk)
  • Error handling: Thoughtful error handling in handleIncrementalRenderStream.ts that differentiates between first-chunk errors (stop processing) vs update-chunk errors (log and continue)
  • Resource management: Proper VM pooling with LRU eviction in vm.ts:86-102

🟡 Areas for Improvement

1. Critical: Missing File in Diff

The diff references requestPrechecks.ts but the file content wasn't shown. This needs to be reviewed.

2. Security Concerns

Bundle Upload Validation (worker.ts:323-329)

Object.entries(req.body).forEach(([key, value]) => {
  if (isAsset(value)) {
    if (key.startsWith('bundle_')) {
      const timestamp = key.replace('bundle_', '');
      bundles.push({ timestamp, bundle: value });

Issue: There is no validation that timestamp is a safe value; a malicious timestamp containing ../ could lead to directory traversal.

Recommendation:

const timestamp = key.replace('bundle_', '');
// Validate timestamp format
if (!/^[\w-]+$/.test(timestamp)) {
  throw new Error('Invalid bundle timestamp format');
}

Unvalidated Bundle Execution (handleIncrementalRenderRequest.ts:75)

executionContext.runInVM(chunk.updateChunk, bundlePath)

Issue: chunk.updateChunk is arbitrary JavaScript code executed without validation. While this may be intentional for the design, ensure:

  1. Authentication is always enforced before creating the sink
  2. The connection is secure (HTTPS/encrypted)
  3. Consider rate limiting to prevent abuse

3. Error Handling Issues

Silent Failures in handleIncrementalRenderStream.ts:70-78

} catch (err) {
  const errorMessage = `Invalid JSON chunk: ${err instanceof Error ? err.message : String(err)}`;
  if (!hasReceivedFirstObject) {
    throw new Error(errorMessage);
  } else {
    const reportedMessage = `JSON parsing error in update chunk...`;
    console.error(reportedMessage);  // ⚠️ Using console.error instead of log
    errorReporter.message(reportedMessage);
    continue;
  }
}

Issue: Inconsistent logging - uses console.error instead of the project's log utility.

Recommendation:

import log from '../shared/log';
// ...
log.error({ msg: reportedMessage, err });

Async Error Handling (handleIncrementalRenderRequest.ts:74-76)

executionContext.runInVM(chunk.updateChunk, bundlePath).catch((err: unknown) => {
  log.error({ msg: 'Error running incremental render chunk', err, chunk });
});

Issue: Errors are logged but not surfaced to the client. The client has no way to know an update chunk failed.

Recommendation: Consider adding a way to signal errors back to the client or maintaining an error log that can be queried.

4. Performance & Resource Management

Missing Cleanup in Incremental Rendering

The IncrementalRenderSink doesn't have a cleanup/close method. If the client disconnects or the request ends, there's no mechanism to clean up resources or notify the execution context.

Recommendation: Add a close() method to the sink and call it in onRequestEnded:

export type IncrementalRenderSink = {
  add: (chunk: unknown) => void;
  close: () => void;
};

// In handleIncrementalRenderStream.ts
onRequestEnded: () => {
  incrementalSink?.close();
}

Memory Leak Risk in VM Context (vm.ts:316)

const sharedExecutionContext = new Map();

Issue: sharedExecutionContext is created per ExecutionContext but never explicitly cleared. If many execution contexts are created, this could accumulate memory.

Recommendation: Document the lifecycle of this map or add explicit cleanup.

5. Code Quality Issues

Repeated Timestamp Updates in vm.ts:312

vmContext.lastUsed = Date.now();

This timestamp updating happens in multiple places. Consider extracting to a helper method:

function touchVMContext(vmContext: VMContext): void {
  vmContext.lastUsed = Date.now();
}

Inconsistent Nullability Checks

  • worker.ts:436: Uses assetsToCopy: Asset[] | null | undefined
  • Should standardize on either null or undefined, not both

Test with .only (incrementalRender.test.ts:677)

test.only('basic incremental update...', async () => {

Critical: This will cause all other tests in this suite to be skipped in CI! Must be removed.

6. Missing Documentation

Missing JSDoc for Public APIs

  • handleIncrementalRenderRequest lacks documentation about when sink is undefined vs defined
  • ExecutionContext type needs documentation about its lifecycle and thread-safety guarantees
  • runOnOtherBundle function (vm.ts:343) is undocumented - this is a critical feature

Recommendation: Add JSDoc:

/**
 * Handles an incremental render request with streaming updates.
 * 
 * @param initial - The initial render request containing bundleTimestamp and renderingRequest
 * @returns Promise resolving to response and optional sink. Sink is undefined if an error occurred
 *          during setup (e.g., bundle not found). When defined, sink can be used to send update
 *          chunks to the running VM context.
 * @throws Never throws - errors are returned in the response object
 */
export async function handleIncrementalRenderRequest(...)

7. Ruby Side Changes

Unused Variable (server_rendering_js_code.rb:39)

const runOnOtherBundle = globalThis.runOnOtherBundle;

Issue: Variable is declared but never used in the visible code.

Question: Is this intentional to make it available in a certain scope? If so, add a comment explaining why.

Inconsistent Boolean Handling (request.rb:31-36)

if send_bundle
  Rails.logger.info { "[ReactOnRailsPro] Sending bundle to the node renderer" }
  upload_assets
end

form = form_with_code(js_code, false)  # Always passing false now

Question: Why always pass false to form_with_code now? This seems like a behavior change that should be explained in the PR description or comments.


🔴 Required Changes Before Merge

  1. Remove .only from test (incrementalRender.test.ts:677)
  2. Add bundle timestamp validation in worker.ts upload-assets endpoint
  3. Replace console.error with log in handleIncrementalRenderStream.ts
  4. Update CHANGELOG as noted in PR checklist
  5. Add documentation for the new NDJSON endpoint and how to use it
  6. Explain Ruby-side changes in PR description

📋 PR Checklist Status

From the PR template:

  • ❌ Add/update test to cover these changes - Tests exist but have .only
  • ❌ Update documentation - No user-facing docs for new endpoint
  • ❌ Update CHANGELOG file - Not done

🎯 Recommendations for Next Steps

  1. Fill out the PR description - Remove the template text and explain:

    • What problem does NDJSON communication solve?
    • What are the performance/memory benefits?
    • Are there any breaking changes?
    • What's the migration path for existing code?
  2. Consider adding integration tests that test the actual NDJSON endpoint end-to-end without mocking handleIncrementalRenderRequest

  3. Add performance benchmarks if this is meant to improve performance

  4. Security audit - Have someone review the arbitrary code execution paths

  5. Consider API versioning - If this is a significant protocol change, consider versioning the API endpoints


Overall Assessment

This is high-quality work with excellent test coverage and clean architecture. The main issues are:

  • Missing documentation/changelog
  • Test with .only that will break CI
  • Some security hardening needed
  • PR description needs completion

Recommendation: Request changes to address the critical issues above before merging.

Great work on the comprehensive test suite and the clean separation of concerns! 🎉

coderabbitai bot (Contributor) left a comment

Actionable comments posted: 6

🧹 Nitpick comments (1)
react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (1)

390-398: Consider clearing in-flight creation promises on reset (optional).

resetVM() clears contexts but not vmCreationPromises; after reset, late-resolving promises can repopulate vmContexts unexpectedly.

Option:

 export function resetVM() {
   // Clear all VM contexts
   vmContexts.clear();
+  vmCreationPromises.clear();
 }
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 7bce94a and 8107864.

📒 Files selected for processing (19)
  • react_on_rails_pro/lib/react_on_rails_pro/request.rb (1 hunks)
  • react_on_rails_pro/lib/react_on_rails_pro/server_rendering_js_code.rb (1 hunks)
  • react_on_rails_pro/packages/node-renderer/src/shared/utils.ts (2 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker.ts (9 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker/authHandler.ts (2 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker/checkProtocolVersionHandler.ts (1 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts (1 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderStream.ts (1 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker/handleRenderRequest.ts (5 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker/requestPrechecks.ts (1 hunks)
  • react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (7 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/bundle.js (1 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/projects/spec-dummy/asyncComponentsTreeForTestingRenderingRequest.js (1 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/secondary-bundle.js (1 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/helper.ts (3 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/incrementalRender.test.ts (1 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/serverRenderRSCReactComponent.test.js (2 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/vm.test.ts (27 hunks)
  • react_on_rails_pro/packages/node-renderer/tests/worker.test.ts (2 hunks)
🧰 Additional context used
📓 Path-based instructions (3)
**/*.{js,jsx,ts,tsx,css,scss,json,yml,yaml,md}

📄 CodeRabbit inference engine (CLAUDE.md)

Prettier is the sole authority for formatting all non-Ruby files; never manually format them

Files:

  • react_on_rails_pro/packages/node-renderer/src/worker/requestPrechecks.ts
  • react_on_rails_pro/packages/node-renderer/src/shared/utils.ts
  • react_on_rails_pro/packages/node-renderer/tests/serverRenderRSCReactComponent.test.js
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/bundle.js
  • react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderStream.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/secondary-bundle.js
  • react_on_rails_pro/packages/node-renderer/tests/worker.test.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/checkProtocolVersionHandler.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/vm.ts
  • react_on_rails_pro/packages/node-renderer/tests/incrementalRender.test.ts
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/projects/spec-dummy/asyncComponentsTreeForTestingRenderingRequest.js
  • react_on_rails_pro/packages/node-renderer/src/worker/authHandler.ts
  • react_on_rails_pro/packages/node-renderer/src/worker.ts
  • react_on_rails_pro/packages/node-renderer/tests/helper.ts
  • react_on_rails_pro/packages/node-renderer/tests/vm.test.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/handleRenderRequest.ts
**/*.{js,jsx,ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use ESLint for JS/TS code (lint via rake lint or yarn lint)

Files:

  • react_on_rails_pro/packages/node-renderer/src/worker/requestPrechecks.ts
  • react_on_rails_pro/packages/node-renderer/src/shared/utils.ts
  • react_on_rails_pro/packages/node-renderer/tests/serverRenderRSCReactComponent.test.js
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/bundle.js
  • react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderStream.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/secondary-bundle.js
  • react_on_rails_pro/packages/node-renderer/tests/worker.test.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/checkProtocolVersionHandler.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/vm.ts
  • react_on_rails_pro/packages/node-renderer/tests/incrementalRender.test.ts
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/projects/spec-dummy/asyncComponentsTreeForTestingRenderingRequest.js
  • react_on_rails_pro/packages/node-renderer/src/worker/authHandler.ts
  • react_on_rails_pro/packages/node-renderer/src/worker.ts
  • react_on_rails_pro/packages/node-renderer/tests/helper.ts
  • react_on_rails_pro/packages/node-renderer/tests/vm.test.ts
  • react_on_rails_pro/packages/node-renderer/src/worker/handleRenderRequest.ts
{Gemfile,Rakefile,config.ru,**/*.{rb,rake,gemspec,ru}}

📄 CodeRabbit inference engine (CLAUDE.md)

{Gemfile,Rakefile,config.ru,**/*.{rb,rake,gemspec,ru}}: All Ruby code must pass RuboCop with zero offenses before commit/push
RuboCop is the sole authority for Ruby file formatting; never manually format Ruby files

Files:

  • react_on_rails_pro/lib/react_on_rails_pro/server_rendering_js_code.rb
  • react_on_rails_pro/lib/react_on_rails_pro/request.rb
🧠 Learnings (1)
📚 Learning: 2025-07-08T05:57:29.630Z
Learnt from: AbanoubGhadban
PR: shakacode/react_on_rails#1745
File: node_package/src/RSCRequestTracker.ts:8-14
Timestamp: 2025-07-08T05:57:29.630Z
Learning: The global `generateRSCPayload` function in React on Rails Pro (RORP) is provided by the framework during rendering requests, not implemented in application code. The `declare global` statements are used to document the expected interface that RORP will inject at runtime.

Applied to files:

  • react_on_rails_pro/lib/react_on_rails_pro/server_rendering_js_code.rb
  • react_on_rails_pro/packages/node-renderer/tests/fixtures/projects/spec-dummy/asyncComponentsTreeForTestingRenderingRequest.js
🧬 Code graph analysis (14)
react_on_rails_pro/packages/node-renderer/src/worker/requestPrechecks.ts (2)
react_on_rails_pro/packages/node-renderer/src/worker/checkProtocolVersionHandler.ts (2)
  • ProtocolVersionBody (7-9)
  • checkProtocolVersion (11-27)
react_on_rails_pro/packages/node-renderer/src/worker/authHandler.ts (2)
  • AuthBody (11-13)
  • authenticate (15-27)
react_on_rails_pro/packages/node-renderer/tests/serverRenderRSCReactComponent.test.js (1)
react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (1)
  • buildExecutionContext (304-388)
react_on_rails_pro/packages/node-renderer/tests/fixtures/bundle.js (2)
react_on_rails_pro/spec/dummy/client/node-renderer.js (1)
  • require (8-8)
react_on_rails_pro/packages/node-renderer/tests/fixtures/secondary-bundle.js (6)
  • sharedExecutionContext (43-43)
  • sharedExecutionContext (49-49)
  • promiseData (8-8)
  • promiseData (24-24)
  • promise (9-12)
  • stream (31-31)
react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderStream.ts (1)
react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts (1)
  • ResponseResult (98-98)
react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts (2)
react_on_rails_pro/packages/node-renderer/src/worker/handleRenderRequest.ts (1)
  • handleRenderRequest (183-264)
react_on_rails_pro/packages/node-renderer/src/shared/utils.ts (1)
  • getRequestBundleFilePath (163-166)
react_on_rails_pro/packages/node-renderer/tests/fixtures/secondary-bundle.js (1)
react_on_rails_pro/packages/node-renderer/tests/fixtures/bundle.js (6)
  • sharedExecutionContext (46-46)
  • sharedExecutionContext (53-53)
  • promiseData (10-10)
  • promiseData (27-27)
  • promise (11-14)
  • stream (34-34)
react_on_rails_pro/packages/node-renderer/tests/worker.test.ts (1)
react_on_rails_pro/packages/node-renderer/tests/helper.ts (8)
  • getFixtureBundle (22-24)
  • getFixtureSecondaryBundle (26-28)
  • getFixtureAsset (30-32)
  • getOtherFixtureAsset (34-36)
  • assetPath (98-100)
  • assetPathOther (102-104)
  • BUNDLE_TIMESTAMP (17-17)
  • SECONDARY_BUNDLE_TIMESTAMP (18-18)
react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (1)
react_on_rails_pro/packages/node-renderer/src/shared/utils.ts (5)
  • smartTrim (21-42)
  • getRequestBundleFilePath (163-166)
  • isReadableStream (133-137)
  • handleStreamError (139-144)
  • formatExceptionMessage (65-75)
react_on_rails_pro/packages/node-renderer/tests/incrementalRender.test.ts (4)
react_on_rails_pro/packages/node-renderer/src/worker.ts (1)
  • disableHttp2 (105-107)
react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts (3)
  • IncrementalRenderSink (6-9)
  • ResponseResult (98-98)
  • IncrementalRenderResult (35-38)
react_on_rails_pro/packages/node-renderer/src/shared/utils.ts (1)
  • ResponseResult (44-49)
react_on_rails_pro/packages/node-renderer/tests/helper.ts (5)
  • createVmBundle (60-63)
  • BUNDLE_TIMESTAMP (17-17)
  • waitFor (155-191)
  • createSecondaryVmBundle (65-68)
  • SECONDARY_BUNDLE_TIMESTAMP (18-18)
react_on_rails_pro/packages/node-renderer/src/worker/authHandler.ts (1)
react_on_rails_pro/packages/node-renderer/src/shared/configBuilder.ts (1)
  • getConfig (88-94)
react_on_rails_pro/packages/node-renderer/src/worker.ts (6)
react_on_rails_pro/packages/node-renderer/src/worker/requestPrechecks.ts (1)
  • performRequestPrechecks (13-27)
react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts (4)
  • IncrementalRenderSink (6-9)
  • IncrementalRenderInitialRequest (29-33)
  • handleIncrementalRenderRequest (46-96)
  • ResponseResult (98-98)
react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderStream.ts (1)
  • handleIncrementalRenderStream (30-114)
react_on_rails_pro/packages/node-renderer/src/shared/utils.ts (4)
  • errorResponseResult (51-58)
  • formatExceptionMessage (65-75)
  • ResponseResult (44-49)
  • Asset (88-92)
react_on_rails_pro/packages/node-renderer/src/worker/handleRenderRequest.ts (1)
  • handleNewBundlesProvided (158-176)
react_on_rails_pro/packages/node-renderer/src/worker/authHandler.ts (2)
  • authenticate (15-27)
  • AuthBody (11-13)
react_on_rails_pro/packages/node-renderer/tests/helper.ts (2)
react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (1)
  • buildExecutionContext (304-388)
react_on_rails_pro/packages/node-renderer/src/shared/errorReporter.ts (2)
  • error (54-57)
  • message (46-49)
react_on_rails_pro/packages/node-renderer/tests/vm.test.ts (1)
react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (1)
  • buildExecutionContext (304-388)
react_on_rails_pro/packages/node-renderer/src/worker/handleRenderRequest.ts (2)
react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (3)
  • ExecutionContext (295-302)
  • buildExecutionContext (304-388)
  • VMContextNotFoundError (104-109)
react_on_rails_pro/packages/node-renderer/src/shared/utils.ts (3)
  • ResponseResult (44-49)
  • validateBundlesExist (173-197)
  • workerIdLabel (15-18)
🪛 Biome (2.1.2)
react_on_rails_pro/packages/node-renderer/tests/fixtures/bundle.js

[error] 8-8: This is an unexpected use of the debugger statement.

Unsafe fix: Remove debugger statement

(lint/suspicious/noDebugger)


[error] 23-23: This is an unexpected use of the debugger statement.

Unsafe fix: Remove debugger statement

(lint/suspicious/noDebugger)

⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (9)
  • GitHub Check: dummy-app-integration-tests (3.2, 20)
  • GitHub Check: dummy-app-integration-tests (3.4, 22)
  • GitHub Check: rspec-package-tests (3.4, latest)
  • GitHub Check: rspec-package-tests (3.2, minimum)
  • GitHub Check: rspec-package-tests (3.4, minimum)
  • GitHub Check: rspec-package-tests (3.2, latest)
  • GitHub Check: claude-review
  • GitHub Check: build
  • GitHub Check: build-and-test
🔇 Additional comments (15)
react_on_rails_pro/lib/react_on_rails_pro/request.rb (1)

30-36: LGTM! Asset upload correctly separated from rendering request.

The logic properly handles the streaming architecture:

  • When send_bundle is true, assets are uploaded separately via upload_assets before creating the rendering request
  • Line 36 correctly passes false to form_with_code because the bundle is already on the node renderer after the separate upload
  • If send_bundle is false, the upload is skipped and the form is created without the bundle, assuming it's already available on the renderer

This separation allows the rendering request to proceed without re-uploading bundles on each call, which is appropriate for the streaming use case.

react_on_rails_pro/lib/react_on_rails_pro/server_rendering_js_code.rb (1)

39-46: LGTM: local alias for runOnOtherBundle improves stability.

Aliasing once avoids global lookups and aligns with ExecutionContext injection.

Confirm that ExecutionContext always sets globalThis.runOnOtherBundle before this code executes (vm.ts runInVM does so).

react_on_rails_pro/packages/node-renderer/src/worker/authHandler.ts (1)

15-27: Verify authenticate usage and signature changes
No internal call sites of authenticate detected; ensure external consumers are updated for the optional body parameter and use body?.password.

react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (1)

14-15: Ensure TypeScript ≥5.3.
import type { … } from 'pkg' with { "resolution-mode": "import" } is stable since TS 5.3; no special tsconfig flags required.

react_on_rails_pro/packages/node-renderer/tests/worker.test.ts (6)

324-378: LGTM!

The test comprehensively verifies bundle and asset upload behavior, including directory structure validation and file placement verification.


380-414: LGTM!

Good edge case coverage for bundle-only uploads, with proper verification that no assets are accidentally copied.


416-442: LGTM!

Appropriate edge case test for empty upload requests, verifying directory creation with no files.


444-525: LGTM!

Thorough test for race condition handling during duplicate bundle uploads. The verification of file preservation (size and modification time) is especially valuable for confirming the silent skip behavior.


527-571: LGTM!

Important behavioral test confirming bundles are placed in their own hash directories rather than targetBundles directories. This validates the separation of concerns between bundle storage and asset distribution.


573-938: LGTM!

Comprehensive incremental render test suite with excellent coverage of success paths, error cases, and edge conditions. The helper functions effectively reduce duplication while maintaining readability.

react_on_rails_pro/packages/node-renderer/tests/helper.ts (2)

62-68: LGTM!

The migration from buildVM to buildExecutionContext correctly aligns with the new VM architecture.


147-191: LGTM!

The waitFor helper implements a robust retry mechanism with proper timeout handling and error reporting. The eslint-disable for no-await-in-loop is appropriate since sequential polling is intentional.

react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts (3)

71-82: Verify error handling strategy for incremental updates.

The sink.add method uses a fire-and-forget pattern where errors are logged but not propagated to the caller (Line 75: .catch() without rethrowing). While this may be intentional for non-blocking incremental updates, consider whether:

  1. Callers need to be notified of update failures
  2. There should be a mechanism to track failed updates
  3. Critical errors should halt subsequent updates

If the current behavior is correct for your use case, this is fine. Otherwise, consider adding error propagation or a failure tracking mechanism.


16-27: LGTM!

The type guard properly validates all required properties and their types for UpdateChunk.


46-96: LGTM!

The handler correctly delegates initial validation to handleRenderRequest and provides an appropriate sink for incremental updates with proper error handling boundaries.

Comment on lines +35 to +114
  let hasReceivedFirstObject = false;
  const decoder = new StringDecoder('utf8');
  let buffer = '';

  try {
    for await (const chunk of request.raw) {
      const str = decoder.write(chunk);
      buffer += str;

      // Process all complete JSON objects in the buffer
      let boundary = buffer.indexOf('\n');
      while (boundary !== -1) {
        const rawObject = buffer.slice(0, boundary).trim();
        buffer = buffer.slice(boundary + 1);
        boundary = buffer.indexOf('\n');

        if (rawObject) {
          let parsed: unknown;
          try {
            parsed = JSON.parse(rawObject);
          } catch (err) {
            const errorMessage = `Invalid JSON chunk: ${err instanceof Error ? err.message : String(err)}`;

            if (!hasReceivedFirstObject) {
              // Error in first chunk - throw error to stop processing
              throw new Error(errorMessage);
            } else {
              // Error in subsequent chunks - log and report but continue processing
              const reportedMessage = `JSON parsing error in update chunk: ${err instanceof Error ? err.message : String(err)}`;
              console.error(reportedMessage);
              errorReporter.message(reportedMessage);
              // Skip this malformed chunk and continue with next ones
              // eslint-disable-next-line no-continue
              continue;
            }
          }

          if (!hasReceivedFirstObject) {
            hasReceivedFirstObject = true;
            try {
              // eslint-disable-next-line no-await-in-loop
              const result = await onRenderRequestReceived(parsed);
              const { response, shouldContinue: continueFlag } = result;

              void onResponseStart(response);

              if (!continueFlag) {
                return;
              }
            } catch (err) {
              // Error in first chunk processing - throw error to stop processing
              const error = err instanceof Error ? err : new Error(String(err));
              error.message = `Error processing initial render request: ${error.message}`;
              throw error;
            }
          } else {
            try {
              // eslint-disable-next-line no-await-in-loop
              await onUpdateReceived(parsed);
            } catch (err) {
              // Error in update chunk processing - log and report but continue processing
              const errorMessage = `Error processing update chunk: ${err instanceof Error ? err.message : String(err)}`;
              console.error(errorMessage);
              errorReporter.message(errorMessage);
              // Continue processing other chunks
            }
          }
        }
      }
    }
  } catch (err) {
    const error = err instanceof Error ? err : new Error(String(err));
    // Update the error message in place to retain the original stack trace, rather than creating a new error object
    error.message = `Error while handling the request stream: ${error.message}`;
    throw error;
  }

  // Stream ended normally
  void onRequestEnded();
}
⚠️ Potential issue | 🟠 Major

Flush the decoder/buffer so the last NDJSON chunk isn’t lost

If the client closes the stream without a trailing newline (allowed by NDJSON), buffer still holds the final JSON object when the loop exits. Because we never flush the decoder or re-run the parsing logic, that last chunk is silently dropped. We need to reuse the same parsing pathway after for await finishes, and exit early if the first object signalled shouldContinue: false.

-  let hasReceivedFirstObject = false;
+  let hasReceivedFirstObject = false;
+  let shouldContinueProcessing = true;
   const decoder = new StringDecoder('utf8');
   let buffer = '';
 
+  const processObject = async (rawObject: string) => {
+    if (!rawObject) {
+      return;
+    }
+
+    let parsed: unknown;
+    try {
+      parsed = JSON.parse(rawObject);
+    } catch (err) {
+      const message = err instanceof Error ? err.message : String(err);
+      if (!hasReceivedFirstObject) {
+        throw new Error(`Invalid JSON chunk: ${message}`);
+      }
+      const reportedMessage = `JSON parsing error in update chunk: ${message}`;
+      console.error(reportedMessage);
+      errorReporter.message(reportedMessage);
+      return;
+    }
+
+    if (!hasReceivedFirstObject) {
+      hasReceivedFirstObject = true;
+      try {
+        const { response, shouldContinue: continueFlag } = await onRenderRequestReceived(parsed);
+        void onResponseStart(response);
+        if (!continueFlag) {
+          shouldContinueProcessing = false;
+        }
+      } catch (err) {
+        const error = err instanceof Error ? err : new Error(String(err));
+        error.message = `Error processing initial render request: ${error.message}`;
+        throw error;
+      }
+    } else if (shouldContinueProcessing) {
+      try {
+        await onUpdateReceived(parsed);
+      } catch (err) {
+        const message = err instanceof Error ? err.message : String(err);
+        const errorMessage = `Error processing update chunk: ${message}`;
+        console.error(errorMessage);
+        errorReporter.message(errorMessage);
+      }
+    }
+  };
+
   try {
     for await (const chunk of request.raw) {
       const str = decoder.write(chunk);
@@
-        if (rawObject) {
-          let parsed: unknown;
-          try {
-            parsed = JSON.parse(rawObject);
-          } catch (err) {
-            const errorMessage = `Invalid JSON chunk: ${err instanceof Error ? err.message : String(err)}`;
-
-            if (!hasReceivedFirstObject) {
-              // Error in first chunk - throw error to stop processing
-              throw new Error(errorMessage);
-            } else {
-              // Error in subsequent chunks - log and report but continue processing
-              const reportedMessage = `JSON parsing error in update chunk: ${err instanceof Error ? err.message : String(err)}`;
-              console.error(reportedMessage);
-              errorReporter.message(reportedMessage);
-              // Skip this malformed chunk and continue with next ones
-              // eslint-disable-next-line no-continue
-              continue;
-            }
-          }
-
-          if (!hasReceivedFirstObject) {
-            hasReceivedFirstObject = true;
-            try {
-              // eslint-disable-next-line no-await-in-loop
-              const result = await onRenderRequestReceived(parsed);
-              const { response, shouldContinue: continueFlag } = result;
-
-              void onResponseStart(response);
-
-              if (!continueFlag) {
-                return;
-              }
-            } catch (err) {
-              // Error in first chunk processing - throw error to stop processing
-              const error = err instanceof Error ? err : new Error(String(err));
-              error.message = `Error processing initial render request: ${error.message}`;
-              throw error;
-            }
-          } else {
-            try {
-              // eslint-disable-next-line no-await-in-loop
-              await onUpdateReceived(parsed);
-            } catch (err) {
-              // Error in update chunk processing - log and report but continue processing
-              const errorMessage = `Error processing update chunk: ${err instanceof Error ? err.message : String(err)}`;
-              console.error(errorMessage);
-              errorReporter.message(errorMessage);
-              // Continue processing other chunks
-            }
-          }
+        if (rawObject) {
+          // eslint-disable-next-line no-await-in-loop
+          await processObject(rawObject);
+          if (!shouldContinueProcessing) {
+            return;
+          }
         }
       }
     }
   } catch (err) {
@@
-  // Stream ended normally
-  void onRequestEnded();
+  const flush = decoder.end();
+  if (flush) {
+    buffer += flush;
+  }
+
+  if (buffer.trim()) {
+    await processObject(buffer.trim());
+    if (!shouldContinueProcessing) {
+      return;
+    }
+  }
+
+  // Stream ended normally
+  void onRequestEnded();

Comment on lines +320 to +324
      const vmContext = mapBundleFilePathToVMContext.get(bundleFilePath);
      if (!vmContext) {
        throw new VMContextNotFoundError(bundleFilePath);
      }

🛠️ Refactor suggestion | 🟠 Major

Avoid hard failure when VMContext is missing; lazily build and cache.

runOnOtherBundle may target a bundle that was not pre-supplied in bundlePaths. Fall back to getOrBuildVMContext to build it on demand.

Apply this diff:

-      const vmContext = mapBundleFilePathToVMContext.get(bundleFilePath);
-      if (!vmContext) {
-        throw new VMContextNotFoundError(bundleFilePath);
-      }
+      let vmContext = mapBundleFilePathToVMContext.get(bundleFilePath);
+      if (!vmContext) {
+        vmContext = await getOrBuildVMContext(bundleFilePath, buildVmsIfNeeded);
+        mapBundleFilePathToVMContext.set(bundleFilePath, vmContext);
+      }
🤖 Prompt for AI Agents
In react_on_rails_pro/packages/node-renderer/src/worker/vm.ts around lines 320
to 324, the current code throws VMContextNotFoundError when
mapBundleFilePathToVMContext lacks an entry; instead, call
getOrBuildVMContext(bundleFilePath) to lazily create and cache the VM context
and assign it into mapBundleFilePathToVMContext before continuing. Replace the
throw with an awaitable call to getOrBuildVMContext, store the returned context
in the map under bundleFilePath, and use that context so runOnOtherBundle can
target bundles not pre-supplied.

Comment on lines +335 to +338
        const debugOutputPathCode = path.join(bundleFilePath, 'code.js');
        log.debug(`Full code executed written to: ${debugOutputPathCode}`);
        await writeFileAsync(debugOutputPathCode, renderingRequest);
      }

⚠️ Potential issue | 🔴 Critical

Fix debug output path: joining to a file path creates an invalid path.

path.join(bundleFilePath, 'code.js'|'result.json') yields an invalid path like "<file>.js/code.js", which will fail. Use the bundle's directory instead.

Apply this diff:

-        const debugOutputPathCode = path.join(bundleFilePath, 'code.js');
+        const debugOutputPathCode = path.join(path.dirname(bundleFilePath), 'code.js');
         log.debug(`Full code executed written to: ${debugOutputPathCode}`);
         await writeFileAsync(debugOutputPathCode, renderingRequest);
@@
-        const debugOutputPathResult = path.join(bundleFilePath, 'result.json');
+        const debugOutputPathResult = path.join(path.dirname(bundleFilePath), 'result.json');
         log.debug(`Wrote result to file: ${debugOutputPathResult}`);
         await writeFileAsync(debugOutputPathResult, result);

Also applies to: 371-374

🤖 Prompt for AI Agents
In react_on_rails_pro/packages/node-renderer/src/worker/vm.ts around lines
335-338 (and similarly 371-374), the code currently uses
path.join(bundleFilePath, 'code.js'|'result.json') which produces invalid paths
like "<file>.js/code.js"; replace these joins to use the bundle directory
instead: compute const bundleDir = path.dirname(bundleFilePath) (or inline
path.dirname(bundleFilePath)) and use path.join(bundleDir, 'code.js') and
path.join(bundleDir, 'result.json') when writing debug files so the paths point
to the directory containing the bundle file.


  // Get or create async value promise
  getAsyncValue: function() {
    debugger;

⚠️ Potential issue | 🟠 Major

Remove debugger statements (violates lint rules).

These trigger lint errors and should not be committed.

Apply this diff:

   getAsyncValue: function() {
-    debugger;
     if (!sharedExecutionContext.has('asyncPromise')) {
@@
   setAsyncValue: function(value) {
-    debugger;
     if (!sharedExecutionContext.has('asyncPromise')) {
       ReactOnRails.getAsyncValue();
     }

As per coding guidelines

Also applies to: 23-23

🧰 Tools
🪛 Biome (2.1.2)

[error] 8-8: This is an unexpected use of the debugger statement.

Unsafe fix: Remove debugger statement

(lint/suspicious/noDebugger)

🤖 Prompt for AI Agents
In react_on_rails_pro/packages/node-renderer/tests/fixtures/bundle.js around
lines 8 and 23, there are debugger; statements causing lint failures; remove
those debugger statements (delete the entire debugger; lines) and run the linter
to confirm no remaining debugger occurrences in the file or tests before
committing.

Comment on lines +29 to +35
  getStreamValues: function() {
    if (!sharedExecutionContext.has('secondaryStream')) {
      const stream = new PassThrough();
      sharedExecutionContext.set('secondaryStream', { stream });
    }
    return sharedExecutionContext.get('secondaryStream').stream;
  },

⚠️ Potential issue | 🔴 Critical

PassThrough is not defined; add the import. Also return the value for parity with the primary fixture.

Without requiring 'stream', calling new PassThrough() throws a ReferenceError. Align addStreamValue so it returns the written value, as the bundle.js fixture does.

Add this import at the top of the file (outside the shown range):

const { PassThrough } = require('stream');

And update addStreamValue to return the written value:

   addStreamValue: function(value) {
     if (!sharedExecutionContext.has('secondaryStream')) {
       // Create the stream first if it doesn't exist
       ReactOnRails.getStreamValues();
     }
     const { stream } = sharedExecutionContext.get('secondaryStream');
     stream.write(value);
+    return value;
   },

Also applies to: 38-45

🤖 Prompt for AI Agents
In react_on_rails_pro/packages/node-renderer/tests/fixtures/secondary-bundle.js
around lines 29 to 35 (and similarly for lines 38 to 45), the code uses
PassThrough but never imports it and addStreamValue does not return the written
value for parity with the primary fixture; add "const { PassThrough } =
require('stream');" at the top of the file (outside the shown range) and modify
the addStreamValue implementation(s) to return the value that was written to the
stream after writing so callers receive the same return value behavior as the
primary bundle.

Comment on lines +676 to +710
  describe('incremental render update chunk functionality', () => {
    test.only('basic incremental update - initial request gets value, update chunks set value', async () => {
      await createVmBundle(TEST_NAME);
      const SERVER_BUNDLE_TIMESTAMP = String(BUNDLE_TIMESTAMP);

      // Create the HTTP request
      const req = createHttpRequest(SERVER_BUNDLE_TIMESTAMP);

      // Set up response handling
      const responsePromise = setupResponseHandler(req, true);

      // Send the initial object that gets the async value (should resolve after setAsyncValue is called)
      const initialObject = {
        ...createInitialObject(SERVER_BUNDLE_TIMESTAMP),
        renderingRequest: 'ReactOnRails.getStreamValues()',
      };
      req.write(`${JSON.stringify(initialObject)}\n`);

      // Send update chunks that set the async value
      const updateChunk1 = {
        bundleTimestamp: SERVER_BUNDLE_TIMESTAMP,
        updateChunk: 'ReactOnRails.addStreamValue("first update");ReactOnRails.endStream();',
      };
      req.write(`${JSON.stringify(updateChunk1)}\n`);

      // End the request
      req.end();

      // Wait for the response
      const response = await responsePromise;

      // Verify the response
      expect(response.statusCode).toBe(200);
      expect(response.data).toBe('first update'); // Should resolve with the first setAsyncValue call
    });

⚠️ Potential issue | 🔴 Critical

Remove test.only to re-enable the full suite

Leaving test.only here will make Jest run just this spec, silently skipping the rest of the incremental render suite in CI.

-    test.only('basic incremental update - initial request gets value, update chunks set value', async () => {
+    test('basic incremental update - initial request gets value, update chunks set value', async () => {
🤖 Prompt for AI Agents
In react_on_rails_pro/packages/node-renderer/tests/incrementalRender.test.ts
around lines 676 to 710 the test is marked with test.only which forces Jest to
run only this spec and skips the rest of the suite; remove the .only so the test
reads test('basic incremental update - initial request gets value, update chunks
set value', async () => { ... }) and commit the change so the full test suite
runs in CI (run the test suite locally to verify no other accidental .only
remains).

@AbanoubGhadban changed the title from "[WIP] Make node renderer use ndjson format for communication" to "feat: Implement async props with incremental rendering for React Server Components" on Oct 8, 2025

claude bot commented Oct 8, 2025

Code Review: Async Props with Incremental Rendering

Thank you for this substantial feature implementation! The architecture is well-designed with excellent separation of concerns and solid TypeScript usage. However, I've identified several critical security and reliability issues that must be addressed before merging.


🚨 Critical Issues (Must Fix - P0)

1. Memory Leak & Data Leakage: Missing sharedExecutionContext Cleanup

Severity: CRITICAL - Security & Memory
Location: react_on_rails_pro/packages/node-renderer/src/worker/vm.ts (runInVM function)

The sharedExecutionContext Map is created per request but never explicitly cleaned up. This creates two serious problems:

Problem 1: Memory Leak

  • Each request creates a new Map that may never be garbage collected
  • If the sink reference is retained, it keeps the entire execution context alive

Problem 2: Data Leakage Between Users (CRITICAL SECURITY ISSUE)

  • User A's request creates sharedExecutionContext with sensitive data
  • If not cleaned up, User B's request could access User A's data
  • This violates request isolation guarantees

Required Fix: Add cleanup in finally block to delete context.sharedExecutionContext

Recommended Test: Verify context is cleaned up between requests and not accessible to subsequent users
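
A minimal sketch of the cleanup pattern. The real runInVM signature differs; VmExecutionContext and executeInVm below are illustrative stand-ins, not the actual vm.ts internals:

interface VmExecutionContext {
  sharedExecutionContext?: Map<string, unknown>;
}

// Hypothetical stand-in for executing the rendering request inside the VM.
async function executeInVm(code: string, context: VmExecutionContext): Promise<string> {
  context.sharedExecutionContext?.set('example', code.length);
  return `rendered:${code.length}`;
}

async function runInVM(renderingRequest: string, context: VmExecutionContext): Promise<string> {
  // Fresh, request-scoped storage; never a module-level global.
  context.sharedExecutionContext = new Map<string, unknown>();
  try {
    return await executeInVm(renderingRequest, context);
  } finally {
    // Runs even if rendering throws, so nothing from this request can leak
    // into the next request that reuses the same VM context.
    context.sharedExecutionContext?.clear();
    delete context.sharedExecutionContext;
  }
}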


2. Security: No Bundle Timestamp Validation

Severity: HIGH - Security
Location: react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderRequest.ts:71-78

Update chunks can specify arbitrary bundleTimestamp values without validation that they match the initial request's bundle.

Attack Scenario:

  1. Client sends initial request for bundle A (User 1's session)
  2. Client sends update chunk with bundleTimestamp for bundle B (User 2's session)
  3. Malicious code runs in User 2's context

Required Fix: Add allowedBundleTimestamps validation to ensure update chunks only reference authorized bundles from the initial request
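
As a sketch only; the chunk shape and helper name below are assumptions for illustration, not the handler's real types:

interface UpdateChunk {
  bundleTimestamp: string;
  updateChunk: string;
}

// Seed the set from the initial render request, then check every update chunk against it.
function assertChunkTargetsAllowedBundle(
  chunk: UpdateChunk,
  allowedBundleTimestamps: ReadonlySet<string>,
): void {
  if (!allowedBundleTimestamps.has(chunk.bundleTimestamp)) {
    throw new Error(
      `Update chunk targets bundle ${chunk.bundleTimestamp}, which was not part of the initial render request`,
    );
  }
}

// Usage (hypothetical):
// const allowed = new Set([initialRequest.bundleTimestamp]);
// assertChunkTargetsAllowedBundle(parsedUpdateChunk, allowed);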


3. Race Condition: VM Context Eviction During Rendering

Severity: HIGH - Reliability
Location: react_on_rails_pro/packages/node-renderer/src/worker/vm.ts:119-128

Between waiting for VM creation and accessing the context, another concurrent request could trigger manageVMPoolSize which might evict the VM being used, causing crashes.

Required Fix: Add inUse flag to VMContext interface and filter out in-use contexts from eviction logic
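
One possible shape for the guard; VMContext here is simplified, and activeRenders, evictableContexts, and withVmContext are illustrative names rather than the existing vm.ts API:

interface VMContext {
  bundleFilePath: string;
  activeRenders: number; // 0 means the context is safe to evict
}

// Eviction should only consider contexts with no in-flight renders.
function evictableContexts(pool: Map<string, VMContext>): VMContext[] {
  return [...pool.values()].filter((ctx) => ctx.activeRenders === 0);
}

// Mark the context as in use before any await point and release it afterwards,
// even if rendering throws.
async function withVmContext<T>(ctx: VMContext, render: () => Promise<T>): Promise<T> {
  ctx.activeRenders += 1;
  try {
    return await render();
  } finally {
    ctx.activeRenders -= 1;
  }
}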


4. DoS: Unbounded NDJSON Buffer Growth

Severity: HIGH - Security (DoS)
Location: react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderStream.ts:34-50

If a malicious client sends data without newline characters, the buffer string will grow indefinitely, causing memory exhaustion.

Required Fix: Add MAX_BUFFER_SIZE limit (e.g., 10MB) and throw error if exceeded
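
A sketch of the bound; the limit value and error wording are up to the implementer, and string length counts UTF-16 code units, which is a close enough proxy here:

const MAX_BUFFER_SIZE = 10 * 1024 * 1024; // roughly 10MB of buffered, not-yet-terminated NDJSON

function appendToBuffer(buffer: string, decodedChunk: string): string {
  const next = buffer + decodedChunk;
  if (next.length > MAX_BUFFER_SIZE) {
    // Abort the request rather than letting a newline-free stream grow the buffer forever.
    throw new Error(`NDJSON buffer exceeded ${MAX_BUFFER_SIZE} characters without a complete object`);
  }
  return next;
}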


5. Missing Stream Timeout

Severity: MEDIUM - Resource Leak
Location: react_on_rails_pro/packages/node-renderer/src/worker/handleIncrementalRenderStream.ts

The stream processing has no timeout. If a client opens a connection but never sends data or closes the stream, server resources are held indefinitely.

Required Fix: Add Promise.race with timeout (e.g., 30 seconds)
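
For example, a generic wrapper along these lines; the timeout value and the handleStream name are placeholders, not existing code:

const STREAM_TIMEOUT_MS = 30_000;

function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  let timer: NodeJS.Timeout | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Stream processing timed out after ${ms}ms`)), ms);
  });
  // Clear the timer whichever side settles first so it does not keep the process alive.
  return Promise.race([work, timeout]).finally(() => {
    if (timer) clearTimeout(timer);
  });
}

// Usage (handleStream stands in for the NDJSON stream handler):
// await withTimeout(handleStream(request), STREAM_TIMEOUT_MS);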


⚠️ High-Priority Issues (Should Fix - P1)

  1. Incomplete Error Reporting - Update chunk errors should be reported to errorReporter
  2. Empty Stream Edge Case - Throw error if stream ends without initial request
  3. DoS: Unlimited Update Chunks - Add MAX_UPDATE_CHUNKS limit
  4. Unhandled Promise Rejections - Add catch handlers for fire-and-forget async operations (items 3 and 4 are sketched below)
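
A combined sketch for items 3 and 4; the limit, counter placement, and helper names are assumptions:

const MAX_UPDATE_CHUNKS = 1000;
let processedUpdateChunks = 0;

// Call once per update chunk; aborts the stream when the cap is exceeded.
function registerUpdateChunk(): void {
  processedUpdateChunks += 1;
  if (processedUpdateChunks > MAX_UPDATE_CHUNKS) {
    throw new Error(`Received more than ${MAX_UPDATE_CHUNKS} update chunks; aborting stream`);
  }
}

// Fire-and-forget with an explicit rejection handler instead of a bare `void somePromise`.
function fireAndForget(work: Promise<unknown>, description: string): void {
  work.catch((err: unknown) => {
    console.error(`${description} failed:`, err);
  });
}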

✅ Strengths

  • ✨ Excellent architecture with clean separation of concerns
  • ✨ Strong TypeScript usage with comprehensive type definitions
  • ✨ Well-designed error handling (different strategies for initial vs. update chunks)
  • ✨ Good modularity and code organization
  • ✨ Comprehensive test coverage (982 lines of tests)
  • ✨ Thoughtful security design (sharedExecutionContext concept is brilliant - just needs cleanup)

🎯 Summary

Overall: Request Changes

This is a well-engineered feature with solid fundamentals, but it has critical security issues that must be fixed:

Must Fix (P0):

  1. Add sharedExecutionContext cleanup (security + memory leak)
  2. Validate bundle timestamps in update chunks (security)
  3. Fix VM context eviction race condition (reliability)
  4. Add NDJSON buffer size limit (DoS protection)
  5. Add stream timeout (resource leak)

Should Fix (P1):
6. Improve error reporting for update chunks
7. Handle empty stream edge case
8. Limit update chunk count
9. Add error handlers for fire-and-forget promises

Once these issues are addressed, particularly the sharedExecutionContext cleanup and bundle validation, this will be an excellent addition to the codebase!

Please let me know if you need clarification on any of these issues.
