
Feature Request: Allow robots module to emit URL tasks for discovered paths #2405

@AmandracOP


Hi @kazet,

While exploring the Artemis modules and running the project locally, I noticed something about the robots module that I wanted to ask about.

Currently, the module downloads and parses robots.txt, extracts Allow and Disallow paths, and then checks those paths mainly for directory index exposure. However, many of these paths can point to interesting application endpoints such as /admin, /backup, /dev, etc.

Right now those discovered paths are only used within the robots module itself. I was wondering if it might make sense to instead (or additionally) emit URL tasks for these paths using self.add_task, so they can go through the rest of the Artemis pipeline.

That way other modules such as:

  • sql_injection_detector
  • lfi_detector
  • admin_panel_login_bruter
  • webapp_identifier
  • nuclei
  • directory_index

could analyze those endpoints as well.

The idea would roughly be:

robots.txt → extract paths → emit URL tasks → other Artemis modules analyze those endpoints

This might help reuse the discovered endpoints more effectively and increase scanning coverage without duplicating logic across modules.
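To make the idea concrete, here is a minimal sketch of the extraction and task-building step. Note this is illustrative only: `extract_robots_paths` and `build_url_tasks` are hypothetical helpers, and the `{"type": "url", ...}` payload shape is an assumption, not Artemis's actual task schema — the real implementation would go through the module's existing `self.add_task` plumbing.

```python
from urllib.parse import urljoin

def extract_robots_paths(robots_txt: str) -> list[str]:
    """Extract path values from Allow/Disallow directives in robots.txt."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        directive, _, value = line.partition(":")
        if directive.strip().lower() in ("allow", "disallow"):
            path = value.strip()
            # Skip empty values, wildcard rules, and the bare "/" rule,
            # which don't point at a specific endpoint worth scanning.
            if path and "*" not in path and path != "/":
                paths.append(path)
    return paths

def build_url_tasks(base_url: str, robots_txt: str) -> list[dict]:
    """Turn each discovered path into a URL task payload.

    The dict shape here is a placeholder; in Artemis each entry would
    instead be passed to self.add_task as a proper URL task.
    """
    return [
        {"type": "url", "url": urljoin(base_url, path)}
        for path in extract_robots_paths(robots_txt)
    ]

sample = "User-agent: *\nDisallow: /admin\nAllow: /public\nDisallow: /*.bak\n"
tasks = build_url_tasks("https://example.com", sample)
# Wildcard rules like /*.bak are filtered out; /admin and /public become tasks.
```

Deduplication against already-emitted URLs would presumably also be needed so the same endpoint isn't scanned twice, but Artemis's existing task-dedup machinery may already cover that.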

Before trying to implement this, I wanted to check whether this approach fits the intended design of the Artemis task pipeline.

If it does, I’d be happy to work on a PR for it.

Thanks!
