Created June 29, 2016 09:52
Scrapy spider returning multiple requests from a callback after a delay. Based on https://gist.github.com/dangra/2781744
$ scrapy runspider delayspider.py
2016-06-29 11:52:19 [scrapy] INFO: Scrapy 1.1.0 started (bot: scrapybot)
2016-06-29 11:52:19 [scrapy] INFO: Overridden settings: {}
2016-06-29 11:52:19 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.logstats.LogStats']
2016-06-29 11:52:19 [scrapy] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.chunked.ChunkedTransferMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2016-06-29 11:52:19 [scrapy] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2016-06-29 11:52:19 [scrapy] INFO: Enabled item pipelines:
[]
2016-06-29 11:52:19 [scrapy] INFO: Spider opened
2016-06-29 11:52:19 [scrapy] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2016-06-29 11:52:20 [scrapy] DEBUG: Crawled (200) <GET https://www.wikipedia.org> (referer: None)
2016-06-29 11:52:25 [scrapy] DEBUG: Redirecting (301) to <GET https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Accueil_principal> from <GET https://fr.wikipedia.org>
2016-06-29 11:52:25 [scrapy] DEBUG: Redirecting (301) to <GET https://it.wikipedia.org/wiki/Pagina_principale> from <GET https://it.wikipedia.org>
2016-06-29 11:52:25 [scrapy] DEBUG: Crawled (200) <GET https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Accueil_principal> (referer: https://www.wikipedia.org)
2016-06-29 11:52:25 [scrapy] DEBUG: Crawled (200) <GET https://it.wikipedia.org/wiki/Pagina_principale> (referer: https://www.wikipedia.org)
2016-06-29 11:52:30 [scrapy] DEBUG: Filtered duplicate request: <GET https://it.wikipedia.org> - no more duplicates will be shown (see DUPEFILTER_DEBUG to show all duplicates)
2016-06-29 11:52:30 [scrapy] INFO: Closing spider (finished)
2016-06-29 11:52:30 [scrapy] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 1492,
 'downloader/request_count': 5,
 'downloader/request_method_count/GET': 5,
 'downloader/response_bytes': 60980,
 'downloader/response_count': 5,
 'downloader/response_status_count/200': 3,
 'downloader/response_status_count/301': 2,
 'dupefilter/filtered': 4,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2016, 6, 29, 9, 52, 30, 941093),
 'log_count/DEBUG': 6,
 'log_count/INFO': 7,
 'request_depth_max': 2,
 'response_received_count': 3,
 'scheduler/dequeued': 5,
 'scheduler/dequeued/memory': 5,
 'scheduler/enqueued': 5,
 'scheduler/enqueued/memory': 5,
 'start_time': datetime.datetime(2016, 6, 29, 9, 52, 19, 949044)}
2016-06-29 11:52:30 [scrapy] INFO: Spider closed (finished)
from twisted.internet import reactor, defer
import scrapy

DELAY = 5  # seconds


class MySpider(scrapy.Spider):
    name = 'wikipedia'
    start_urls = ['https://www.wikipedia.org']

    def parse(self, response):
        # Requests to return after the delay
        nextreqs = [scrapy.Request('https://it.wikipedia.org'),
                    scrapy.Request('https://fr.wikipedia.org')]
        # Return a Deferred; Scrapy waits for it to fire, and
        # reactor.callLater fires it with the request list after DELAY seconds.
        dfd = defer.Deferred()
        reactor.callLater(DELAY, dfd.callback, nextreqs)
        return dfd
What about something like this?