
Scrapyrt scrape multiple spiders asynchronously at once instead of overwhelming the server with request #130

@xaander1

Description


@pawelmhm Requesting the ability to scrape multiple spiders asynchronously at once, instead of overwhelming the server with requests.
Here is what I mean:

{
    "request": {
        "url": ["https://www.site1.com", "https://www.site2.com", "https://www.site3.com"],
        "callback": "parse_product",
        "dont_filter": "True"
    },
    "spider_name": ["Site1", "Site2", "Site3"]
}

This would enable scraping with multiple spiders at once in real time.
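For contrast, the current API accepts only a single spider per request. A single-spider POST body to scrapyrt's /crawl.json endpoint looks roughly like this (spider name and URL here are placeholders, following the shape documented in the scrapyrt README):

```json
{
    "request": {
        "url": "https://www.site1.com/product/1234",
        "callback": "parse_product",
        "dont_filter": "True"
    },
    "spider_name": "Site1"
}
```

The proposal above generalizes "url" and "spider_name" from scalars to parallel lists.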

The alternative would be to write an API that uses requests to send these requests one by one asynchronously and then combines the results, which feels a little inelegant and resource-intensive... built-in support would be nice.
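The client-side workaround described above could be sketched as follows. This is a minimal, hypothetical fan-out: one scrapyrt request per (spider, url) pair, sent concurrently from a thread pool, with the resulting items merged. The endpoint, port, and payload shape are assumptions based on scrapyrt's documented /crawl.json POST API; spider names and URLs are placeholders from this issue.

```python
import json
import concurrent.futures
from urllib import request as urlrequest

# Assumed scrapyrt endpoint (default port per the scrapyrt docs).
SCRAPYRT = "http://localhost:9080/crawl.json"


def crawl_one(spider_name, url, post=None):
    """Send one single-spider scrapyrt request and return its items.

    `post` lets a caller inject a stub transport (e.g. for testing);
    by default it POSTs the payload to the scrapyrt endpoint.
    """
    payload = json.dumps({
        "spider_name": spider_name,
        "request": {
            "url": url,
            "callback": "parse_product",
            "dont_filter": "True",
        },
    }).encode()
    if post is None:
        def post(data):
            req = urlrequest.Request(
                SCRAPYRT, data=data,
                headers={"Content-Type": "application/json"},
            )
            with urlrequest.urlopen(req) as resp:
                return json.load(resp)
    # scrapyrt responses carry scraped items under the "items" key.
    return post(payload).get("items", [])


def crawl_all(jobs, post=None):
    """Fan out one request per (spider_name, url) pair and merge items."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(crawl_one, s, u, post) for s, u in jobs]
        items = []
        for fut in futures:  # submit order, so results stay ordered
            items.extend(fut.result())
    return items
```

Usage would be something like `crawl_all([("Site1", "https://www.site1.com"), ("Site2", "https://www.site2.com")])`. It works, but every pair still costs a full HTTP round trip and a thread, which is the overhead built-in multi-spider support would avoid.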
