Commit d20fa45

Add in guard clause to check that page isn't nil before trying to use it for processing pages
1 parent 9373ab6 commit d20fa45

File tree

2 files changed: +3 −1 lines changed

lib/msf/core/auxiliary/http_crawler.rb (+1)

@@ -243,6 +243,7 @@ def crawl_target(t)
   # Specific module implementations should redefine this method
   # with whatever is meaningful to them.
   def crawler_process_page(t, page, cnt)
+    return if page.nil? # Skip over pages that don't contain any info aka page is nil. We can't process these types of pages since there is no data to process.
     msg = "[#{"%.5d" % cnt}/#{"%.5d" % max_page_count}] #{page ? page.code || "ERR" : "ERR"} - #{@current_site.vhost} - #{page.url}"
     case page.code
     when 301,302

modules/auxiliary/scanner/http/crawler.rb (+2 −1)

@@ -63,7 +63,8 @@ def for_each_page( &block )
   # - The occurence of any form (web.form :path, :type (get|post|path_info), :params)
   #
   def crawler_process_page(t, page, cnt)
-    msg = "[#{"%.5d" % cnt}/#{"%.5d" % max_page_count}] #{page.code || "ERR"} - #{t[:vhost]} - #{page.url}"
+    return if page.nil? # Skip over pages that don't contain any info aka page is nil. We can't process these types of pages since there is no data to process.
+    msg = "[#{"%.5d" % cnt}/#{"%.5d" % max_page_count}] #{page ? page.code || "ERR" : "ERR"} - #{t[:vhost]} - #{page.url}"
     if page.error
       print_error("Error accessing page #{page.error.to_s}")
       elog(page.error)
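Both hunks apply the same guard-clause pattern: return from the method before any attribute of a nil page is touched. A minimal, self-contained sketch of that behavior, using a hypothetical Page struct and process_page helper rather than the module's actual API:

```ruby
# Hypothetical stand-in for the crawler's page object (not the real class).
Page = Struct.new(:code, :url)

# Sketch of the guard-clause pattern from this commit: a nil page carries
# no data, so bail out before formatting the status line.
def process_page(page, cnt, max)
  return nil if page.nil? # guard clause: nothing to process
  format("[%.5d/%.5d] %s - %s", cnt, max, page.code || "ERR", page.url)
end
```

Without the guard, calling `page.url` on a nil page would raise `NoMethodError for nil`; with it, a nil page is simply skipped while real pages are still formatted with their status code (or "ERR" when the code is missing).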
