Looks like I figured out what the problem was. During my experiments I stopped using a shared request queue and instead sent requests directly to the crawlers. That introduced a new issue: after the first requests were added to the queue, crawling started, but re-adding the same requests did not start a new crawl. It turned out that the repeated requests were marked as `wasAlreadyPresent` and were never added to the queue again.
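To illustrate the behavior described above, here is a minimal sketch (not the actual Crawlee implementation) of how a request queue deduplicates by a unique key: re-adding a request whose key has already been seen is reported as `wasAlreadyPresent` and never enqueued again, and supplying a distinct `uniqueKey` is one way to force a re-enqueue. The class and field names here are illustrative assumptions, not Crawlee internals.

```typescript
interface Request {
  url: string;
  uniqueKey?: string;
}

interface AddResult {
  wasAlreadyPresent: boolean;
}

// Toy queue that deduplicates on uniqueKey (falling back to the URL),
// mimicking the wasAlreadyPresent behavior from the discussion above.
class RequestQueueSketch {
  private seen = new Set<string>();

  addRequest(req: Request): AddResult {
    const key = req.uniqueKey ?? req.url;
    if (this.seen.has(key)) {
      // Duplicate: reported back, but not enqueued a second time.
      return { wasAlreadyPresent: true };
    }
    this.seen.add(key);
    return { wasAlreadyPresent: false };
  }
}

const q = new RequestQueueSketch();
console.log(q.addRequest({ url: "https://example.com" }).wasAlreadyPresent); // false
console.log(q.addRequest({ url: "https://example.com" }).wasAlreadyPresent); // true: deduplicated
// A distinct uniqueKey makes the "same" URL count as a new request.
console.log(q.addRequest({ url: "https://example.com", uniqueKey: "run-2" }).wasAlreadyPresent); // false
```

This matches the symptom in the question: the crawl only restarts when the repeated requests are given keys the queue has not seen before.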

Answer selected by Cit1zeN4
Category: Q&A