Support debugging use cases for Dataproc and Serverless Spark #2405

@dborowitz

Description

What are you trying to do that currently feels hard or impossible?

I'd like to add more tools specifically to support debugging use cases for the existing Serverless Spark source, as well as the closely related Cloud Dataproc clusters. Unlike the depth-first approach I took for Serverless, this time I want to go breadth-first, prioritizing:

  • List and get APIs across multiple supported resource types (clusters, jobs, sessions)
  • Fetching logs from Cloud Logging associated with these resources: not just the logs URL that's already in the tools' output, but actually running the LQL query from that link and returning the results as structured JSON
  • (If necessary) exposing paths to other outputs stored on GCS as part of the "get" tools. (Actually fetching the files from GCS is likely out of scope for MCP Toolbox.)
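As a sketch of the log-fetching bullet: the logs URL that Dataproc emits encodes a Cloud Logging filter over the `cloud_dataproc_cluster` monitored-resource type, and a tool could run that same filter through the `google-cloud-logging` client and return structured entries instead of a link. The function names below (`dataproc_cluster_log_filter`, `fetch_cluster_logs`) are illustrative only, not existing MCP Toolbox APIs, and the exact filter a given console link encodes may include additional terms.

```python
def dataproc_cluster_log_filter(project_id: str, cluster_name: str,
                                cluster_uuid: str) -> str:
    # Build a Cloud Logging filter for one Dataproc cluster, using the
    # cloud_dataproc_cluster monitored-resource type and its labels.
    # (Serverless Spark batches use a different resource type.)
    return (
        'resource.type="cloud_dataproc_cluster"\n'
        f'resource.labels.project_id="{project_id}"\n'
        f'resource.labels.cluster_name="{cluster_name}"\n'
        f'resource.labels.cluster_uuid="{cluster_uuid}"'
    )


def fetch_cluster_logs(project_id: str, cluster_name: str, cluster_uuid: str,
                       limit: int = 100) -> list[dict]:
    # Requires `pip install google-cloud-logging` and application-default
    # credentials; imported lazily so the filter builder above stays
    # dependency-free.
    from google.cloud import logging as cloud_logging

    client = cloud_logging.Client(project=project_id)
    entries = client.list_entries(
        filter_=dataproc_cluster_log_filter(project_id, cluster_name,
                                            cluster_uuid),
        order_by=cloud_logging.DESCENDING,
        max_results=limit,
    )
    # Return structured, JSON-serializable records rather than a console URL.
    return [
        {
            "timestamp": entry.timestamp.isoformat(),
            "severity": entry.severity,
            "payload": entry.payload,
        }
        for entry in entries
    ]
```

A "get logs" tool built this way could take the same identifiers the existing "get" tools already return, so the model can chain a resource lookup directly into a log fetch.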

Suggested Solution(s)

No response

Alternatives Considered

No response

Additional Details

No response

Metadata

Labels

  • priority: p2 (Moderately-important priority. Fix may not be included in next release.)
  • type: feature request ('Nice-to-have' improvement, new feature or different behavior or design.)
