Hello,
Per the README, downloads are processed a month at a time.
Is there an estimate of the average size of the data scraped in each of these monthly chunks, as well as an estimate of the final total size of the scraped results?
It might also be useful to add the total post-scrape disk space requirement to the README, as I imagine disk space could be a prohibitive requirement for some users.
Thank you!