Commit a5f44503 authored by Mikhail Korobov, committed by GitHub

Merge pull request #2302 from Granitosaurus/pipeline_doc_fix

[MRG+1] Fix JsonWriterPipeline example in docs
@@ -106,9 +106,12 @@ format::
 
     class JsonWriterPipeline(object):
 
-        def __init__(self):
+        def open_spider(self, spider):
             self.file = open('items.jl', 'wb')
 
+        def close_spider(self, spider):
+            self.file.close()
+
         def process_item(self, item, spider):
             line = json.dumps(dict(item)) + "\n"
             self.file.write(line)
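The net effect of this hunk is that the example pipeline now opens its file in `open_spider` and closes it in `close_spider`, instead of opening it in `__init__` and never releasing it. A minimal runnable sketch of the fixed pipeline follows; the `path` parameter and the dict-based item are additions here for testability (the docs example hard-codes `'items.jl'` and, being Python 2 era, opens it in binary mode):

```python
import json
import os
import tempfile

class JsonWriterPipeline:
    # Sketch of the corrected example: the output file is opened in
    # open_spider and closed in close_spider, so the resource is tied
    # to the spider's lifetime rather than leaking from __init__.
    # The `path` parameter is an addition for testability.

    def __init__(self, path="items.jl"):
        self.path = path
        self.file = None

    def open_spider(self, spider):
        # Text mode here, since json.dumps returns str; the original
        # (Python 2 era) example used 'wb'.
        self.file = open(self.path, "w")

    def close_spider(self, spider):
        self.file.close()

    def process_item(self, item, spider):
        line = json.dumps(dict(item)) + "\n"
        self.file.write(line)
        return item  # pipelines hand the item on to later stages

# Demo: drive the pipeline hooks by hand with a stand-in spider.
path = os.path.join(tempfile.mkdtemp(), "items.jl")
pipeline = JsonWriterPipeline(path)
pipeline.open_spider(spider=None)
pipeline.process_item({"name": "scrapy"}, spider=None)
pipeline.close_spider(spider=None)
with open(path) as f:
    lines = f.readlines()
```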
@@ -126,14 +129,7 @@ MongoDB address and database name are specified in Scrapy settings;
 MongoDB collection is named after item class.
 
 The main point of this example is to show how to use :meth:`from_crawler`
-method and how to clean up the resources properly.
-
-.. note::
-
-    Previous example (JsonWriterPipeline) doesn't clean up resources properly.
-    Fixing it is left as an exercise for the reader.
-
-::
+method and how to clean up the resources properly.::
 
     import pymongo
 
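The `from_crawler` pattern that paragraph refers to can be sketched without a running MongoDB: the classmethod receives the crawler, reads connection settings from `crawler.settings`, and returns a configured pipeline instance. The `FakeCrawler` stub is an assumption made so the sketch runs stand-alone; the setting names `MONGO_URI`/`MONGO_DATABASE` follow the shape of the docs example, and the actual pymongo connection calls are deliberately omitted.

```python
class MongoPipeline:
    # Sketch of the from_crawler pattern: configuration is pulled from
    # the crawler's settings rather than hard-coded. The real docs
    # example opens a pymongo.MongoClient in open_spider and closes it
    # in close_spider; that part is stubbed out here so no MongoDB
    # server is needed.

    def __init__(self, mongo_uri, mongo_db):
        self.mongo_uri = mongo_uri
        self.mongo_db = mongo_db
        self.client = None  # would hold the pymongo client once opened

    @classmethod
    def from_crawler(cls, crawler):
        # crawler.settings supports dict-style .get() with a default.
        return cls(
            mongo_uri=crawler.settings.get("MONGO_URI"),
            mongo_db=crawler.settings.get("MONGO_DATABASE", "items"),
        )

class FakeCrawler:
    # Minimal stand-in for scrapy.crawler.Crawler: only .settings is
    # used by the pipeline above, and a plain dict provides .get().
    def __init__(self, settings):
        self.settings = settings

crawler = FakeCrawler({"MONGO_URI": "mongodb://localhost:27017"})
pipeline = MongoPipeline.from_crawler(crawler)
```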