Commit 180272c0 authored by Julia Medina

Move scrapy/contrib/spidermiddleware to scrapy/spidermiddlewares

Parent c97a69c9
@@ -149,7 +149,7 @@ middleware (enabled by default) whose purpose is to filter out requests to
 domains outside the ones covered by the spider.
 For more info see:
-:class:`~scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware`.
+:class:`~scrapy.spidermiddlewares.offsite.OffsiteMiddleware`.
 What is the recommended way to deploy a Scrapy crawler in production?
 ---------------------------------------------------------------------
......
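For downstream code that imports these classes directly, the rename above implies an import-path update. A minimal compatibility sketch (illustrative only; the fallback to the legacy path is an assumption for projects that need to run on both pre- and post-rename Scrapy, not something this commit adds):

```python
# Illustrative import shim: prefer the new location, fall back to the old one.
try:
    from scrapy.spidermiddlewares.offsite import OffsiteMiddleware
except ImportError:  # older Scrapy releases that still ship scrapy.contrib
    from scrapy.contrib.spidermiddleware.offsite import OffsiteMiddleware
```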
@@ -906,11 +906,11 @@ SPIDER_MIDDLEWARES_BASE
 Default::
     {
-        'scrapy.contrib.spidermiddleware.httperror.HttpErrorMiddleware': 50,
-        'scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware': 500,
-        'scrapy.contrib.spidermiddleware.referer.RefererMiddleware': 700,
-        'scrapy.contrib.spidermiddleware.urllength.UrlLengthMiddleware': 800,
-        'scrapy.contrib.spidermiddleware.depth.DepthMiddleware': 900,
+        'scrapy.spidermiddlewares.httperror.HttpErrorMiddleware': 50,
+        'scrapy.spidermiddlewares.offsite.OffsiteMiddleware': 500,
+        'scrapy.spidermiddlewares.referer.RefererMiddleware': 700,
+        'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware': 800,
+        'scrapy.spidermiddlewares.depth.DepthMiddleware': 900,
     }
 A dict containing the spider middlewares enabled by default in Scrapy. You
@@ -1001,7 +1001,7 @@ URLLENGTH_LIMIT
 Default: ``2083``
-Scope: ``contrib.spidermiddleware.urllength``
+Scope: ``spidermiddlewares.urllength``
 The maximum URL length to allow for crawled URLs. For more information about
 the default value for this setting see: http://www.boutell.com/newfaq/misc/urllength.html
......
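As a usage sketch for the two settings touched above (a hypothetical project ``settings.py``, not part of this diff): ``URLLENGTH_LIMIT`` is overridden directly, and entries in ``SPIDER_MIDDLEWARES`` must now use the ``scrapy.spidermiddlewares`` paths so they match the keys in ``SPIDER_MIDDLEWARES_BASE``:

```python
# myproject/settings.py (hypothetical example)

# Allow longer URLs than the default 2083 characters; enforced by
# scrapy.spidermiddlewares.urllength.UrlLengthMiddleware.
URLLENGTH_LIMIT = 5000

# Disable a default middleware by mapping its new import path to None.
# The key has to match SPIDER_MIDDLEWARES_BASE exactly to take effect.
SPIDER_MIDDLEWARES = {
    'scrapy.spidermiddlewares.offsite.OffsiteMiddleware': None,
}
```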
@@ -43,7 +43,7 @@ value. For example, if you want to disable the off-site middleware::
     SPIDER_MIDDLEWARES = {
         'myproject.middlewares.CustomSpiderMiddleware': 543,
-        'scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware': None,
+        'scrapy.spidermiddlewares.offsite.OffsiteMiddleware': None,
     }
 Finally, keep in mind that some middlewares may need to be enabled through a
@@ -55,7 +55,7 @@ Writing your own spider middleware
 Each middleware component is a Python class that defines one or more of the
 following methods:
-.. module:: scrapy.contrib.spidermiddleware
+.. module:: scrapy.spidermiddlewares
 .. class:: SpiderMiddleware
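To ground the renamed base module, here is a hedged sketch of a user-defined spider middleware (names such as ``myproject.middlewares.CustomSpiderMiddleware`` and the ``/logout`` filter are hypothetical); only ``process_spider_output`` is implemented, the other hook methods are optional:

```python
# myproject/middlewares.py (hypothetical example)
from scrapy.http import Request


class CustomSpiderMiddleware(object):
    """Drop outgoing requests that point at an assumed '/logout' URL."""

    def process_spider_output(self, response, result, spider):
        for item_or_request in result:
            if isinstance(item_or_request, Request) and '/logout' in item_or_request.url:
                continue  # filter this request out
            yield item_or_request  # pass items and other requests through
```

It would then be enabled through ``SPIDER_MIDDLEWARES = {'myproject.middlewares.CustomSpiderMiddleware': 543}``, as in the example earlier in this diff.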
@@ -178,7 +178,7 @@ For a list of the components enabled by default (and their orders) see the
 DepthMiddleware
 ---------------
-.. module:: scrapy.contrib.spidermiddleware.depth
+.. module:: scrapy.spidermiddlewares.depth
    :synopsis: Depth Spider Middleware
 .. class:: DepthMiddleware
@@ -199,7 +199,7 @@ DepthMiddleware
 HttpErrorMiddleware
 -------------------
-.. module:: scrapy.contrib.spidermiddleware.httperror
+.. module:: scrapy.spidermiddlewares.httperror
    :synopsis: HTTP Error Spider Middleware
 .. class:: HttpErrorMiddleware
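As a quick, hedged illustration of what this middleware does (not part of the diff): non-2xx responses are filtered out before they reach spider callbacks unless the spider opts in, for example via the ``handle_httpstatus_list`` attribute:

```python
from scrapy.spider import Spider


class NotFoundAwareSpider(Spider):  # hypothetical spider
    name = 'notfound_aware'
    start_urls = ['http://example.com/possibly-missing-page']
    # Let 404 responses through HttpErrorMiddleware so parse() can inspect them.
    handle_httpstatus_list = [404]

    def parse(self, response):
        if response.status == 404:
            self.log('gone: %s' % response.url)
```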
@@ -264,7 +264,7 @@ Pass all responses, regardless of its status code.
 OffsiteMiddleware
 -----------------
-.. module:: scrapy.contrib.spidermiddleware.offsite
+.. module:: scrapy.spidermiddlewares.offsite
    :synopsis: Offsite Spider Middleware
 .. class:: OffsiteMiddleware
@@ -298,7 +298,7 @@ OffsiteMiddleware
 RefererMiddleware
 -----------------
-.. module:: scrapy.contrib.spidermiddleware.referer
+.. module:: scrapy.spidermiddlewares.referer
    :synopsis: Referer Spider Middleware
 .. class:: RefererMiddleware
@@ -323,7 +323,7 @@ Whether to enable referer middleware.
 UrlLengthMiddleware
 -------------------
-.. module:: scrapy.contrib.spidermiddleware.urllength
+.. module:: scrapy.spidermiddlewares.urllength
    :synopsis: URL Length Spider Middleware
 .. class:: UrlLengthMiddleware
......
@@ -77,7 +77,7 @@ scrapy.Spider
     An optional list of strings containing domains that this spider is
     allowed to crawl. Requests for URLs not belonging to the domain names
     specified in this list won't be followed if
-    :class:`~scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware` is enabled.
+    :class:`~scrapy.spidermiddlewares.offsite.OffsiteMiddleware` is enabled.
     .. attribute:: start_urls
......
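To make the ``allowed_domains`` behaviour described above concrete, a hedged sketch (the spider and domain names are made up):

```python
from scrapy.spider import Spider
from scrapy.http import Request


class ExampleSpider(Spider):  # hypothetical spider
    name = 'example'
    allowed_domains = ['example.com']   # covers example.com and its subdomains
    start_urls = ['http://example.com/']

    def parse(self, response):
        # Followed: the host is within allowed_domains.
        yield Request('http://shop.example.com/catalog', callback=self.parse)
        # Dropped by scrapy.spidermiddlewares.offsite.OffsiteMiddleware
        # (when enabled): other.org is not an allowed domain.
        yield Request('http://other.org/page', callback=self.parse)
```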
@@ -224,11 +224,11 @@ SPIDER_MIDDLEWARES = {}
 SPIDER_MIDDLEWARES_BASE = {
     # Engine side
-    'scrapy.contrib.spidermiddleware.httperror.HttpErrorMiddleware': 50,
-    'scrapy.contrib.spidermiddleware.offsite.OffsiteMiddleware': 500,
-    'scrapy.contrib.spidermiddleware.referer.RefererMiddleware': 700,
-    'scrapy.contrib.spidermiddleware.urllength.UrlLengthMiddleware': 800,
-    'scrapy.contrib.spidermiddleware.depth.DepthMiddleware': 900,
+    'scrapy.spidermiddlewares.httperror.HttpErrorMiddleware': 50,
+    'scrapy.spidermiddlewares.offsite.OffsiteMiddleware': 500,
+    'scrapy.spidermiddlewares.referer.RefererMiddleware': 700,
+    'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware': 800,
+    'scrapy.spidermiddlewares.depth.DepthMiddleware': 900,
     # Spider side
 }
......
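The numbers are the processing order of each component. A standalone sketch of the merge rule that combines a project's ``SPIDER_MIDDLEWARES`` with this base dict (an illustration of the documented behaviour, not Scrapy's actual implementation):

```python
def merged_middlewares(base, user):
    """Illustrative merge: user entries override base ones, ``None`` disables
    a component, and the survivors are sorted by their order value."""
    combined = dict(base)
    combined.update(user)
    enabled = dict((path, order) for path, order in combined.items()
                   if order is not None)
    return [path for path, _ in sorted(enabled.items(), key=lambda kv: kv[1])]


# Example: disable OffsiteMiddleware, insert a custom component at order 543.
print(merged_middlewares(
    {'scrapy.spidermiddlewares.httperror.HttpErrorMiddleware': 50,
     'scrapy.spidermiddlewares.offsite.OffsiteMiddleware': 500},
    {'scrapy.spidermiddlewares.offsite.OffsiteMiddleware': None,
     'myproject.middlewares.CustomSpiderMiddleware': 543},
))
```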
 from unittest import TestCase
-from scrapy.contrib.spidermiddleware.depth import DepthMiddleware
+from scrapy.spidermiddlewares.depth import DepthMiddleware
 from scrapy.http import Response, Request
 from scrapy.spider import Spider
 from scrapy.statscol import StatsCollector
@@ -37,7 +37,7 @@ class TestDepthMiddleware(TestCase):
         rdm = self.stats.get_value('request_depth_max', spider=self.spider)
         self.assertEquals(rdm, 1)
     def tearDown(self):
         self.stats.close_spider(self.spider, '')
@@ -8,7 +8,7 @@ from scrapy.utils.test import get_crawler
 from tests.mockserver import MockServer
 from scrapy.http import Response, Request
 from scrapy.spider import Spider
-from scrapy.contrib.spidermiddleware.httperror import HttpErrorMiddleware, HttpError
+from scrapy.spidermiddlewares.httperror import HttpErrorMiddleware, HttpError
 from scrapy.settings import Settings
......
@@ -4,7 +4,7 @@ from six.moves.urllib.parse import urlparse
 from scrapy.http import Response, Request
 from scrapy.spider import Spider
-from scrapy.contrib.spidermiddleware.offsite import OffsiteMiddleware
+from scrapy.spidermiddlewares.offsite import OffsiteMiddleware
 from scrapy.utils.test import get_crawler
 class TestOffsiteMiddleware(TestCase):
......
@@ -2,7 +2,7 @@ from unittest import TestCase
 from scrapy.http import Response, Request
 from scrapy.spider import Spider
-from scrapy.contrib.spidermiddleware.referer import RefererMiddleware
+from scrapy.spidermiddlewares.referer import RefererMiddleware
 class TestRefererMiddleware(TestCase):
......
 from unittest import TestCase
-from scrapy.contrib.spidermiddleware.urllength import UrlLengthMiddleware
+from scrapy.spidermiddlewares.urllength import UrlLengthMiddleware
 from scrapy.http import Response, Request
 from scrapy.spider import Spider
......
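In the same spirit as the updated tests above, a minimal sketch of a test written against the new import path (the URLs and length limit are made up for illustration; the ``maxlength`` keyword mirrors how the existing urllength tests construct the middleware):

```python
from unittest import TestCase

from scrapy.http import Response, Request
from scrapy.spider import Spider
from scrapy.spidermiddlewares.urllength import UrlLengthMiddleware


class UrlLengthSketchTest(TestCase):

    def test_long_urls_are_filtered(self):
        mw = UrlLengthMiddleware(maxlength=25)
        spider = Spider('sketch')
        response = Response('http://scrapytest.org')
        short_req = Request('http://scrapytest.org/')
        long_req = Request('http://scrapytest.org/a_rather_long_request_url')
        out = list(mw.process_spider_output(response, [short_req, long_req], spider))
        # Only the request whose URL fits within maxlength is passed on.
        self.assertEqual(out, [short_req])
```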