looyolo / scrapy
Commit 0665175d
Authored on Mar 16, 2012 by Pablo Hoffman

removed some obsolete remaining code related to sqlite support in scrapy

Parent: 68c1c98d
Showing 5 changed files with 0 additions and 27 deletions (+0 −27)
docs/topics/commands.rst              +0 −10
docs/topics/settings.rst              +0 −12
scrapy/settings/default_settings.py   +0 −2
scrapyd/environ.py                    +0 −2
scrapyd/tests/test_environ.py         +0 −1
docs/topics/commands.rst

@@ -35,8 +35,6 @@ structure by default, similar to this::

             spider1.py
             spider2.py
             ...
-    .scrapy/
-        scrapy.db

 The directory where the ``scrapy.cfg`` file resides is known as the *project
 root directory*. That file contains the name of the python module that defines
@@ -45,12 +43,6 @@ the project settings. Here is an example::

     [settings]
     default = myproject.settings

-By default, Scrapy projects use a SQLite_ database to store persistent runtime
-data of the project, such as the spider queue (the list of spiders that are
-scheduled to run). By default, this SQLite database is stored in the *project
-data directory* which, by default, is the ``.scrapy`` directory inside the
-project root directory mentioned above.
-
 Using the ``scrapy`` tool
 =========================
@@ -466,5 +458,3 @@ commands for your Scrapy project.

 Example::

     COMMANDS_MODULE = 'mybot.commands'
-
-.. _SQLite: http://en.wikipedia.org/wiki/SQLite
docs/topics/settings.rst

@@ -883,18 +883,6 @@ Example::

     SPIDER_MODULES = ['mybot.spiders_prod', 'mybot.spiders_dev']

-.. setting:: SQLITE_DB
-
-SQLITE_DB
----------
-
-Default: ``'scrapy.db'``
-
-The location of the project SQLite database, used for storing the spider queue
-and other persistent data of the project. If a relative path is given, is taken
-relative to the project data dir. For more info see:
-:ref:`topics-project-structure`.
-
 .. setting:: STATS_CLASS

 STATS_CLASS
scrapy/settings/default_settings.py

@@ -246,8 +246,6 @@ SPIDER_MIDDLEWARES_BASE = {

 SPIDER_MODULES = []

-SQLITE_DB = 'scrapy.db'
-
 STATS_CLASS = 'scrapy.statscol.MemoryStatsCollector'
 STATS_ENABLED = True
 STATS_DUMP = True
scrapyd/environ.py

@@ -27,8 +27,6 @@ class Environment(object):

         env['SCRAPY_JOB'] = message['_job']
         if project in self.settings:
             env['SCRAPY_SETTINGS_MODULE'] = self.settings[project]
-        dbpath = os.path.join(self.dbs_dir, '%s.db' % project)
-        env['SCRAPY_SQLITE_DB'] = dbpath
         env['SCRAPY_LOG_FILE'] = self._get_log_file(message)
         return env
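To make the hunk above easier to follow: after this commit, the per-job environment no longer carries a ``SCRAPY_SQLITE_DB`` entry. A minimal standalone sketch of that environment-building logic, using hypothetical names (``get_environment``, ``logs_dir``) rather than Scrapyd's exact API, could look like:

```python
import os

def get_environment(message, slot, settings, logs_dir="logs"):
    """Sketch of building a per-job environment dict (post-commit behavior).

    `message` is the queued job message; `settings` maps project names to
    their settings modules. Names and signature are illustrative only.
    """
    project = message["_project"]
    env = {}
    env["SCRAPY_SLOT"] = str(slot)
    env["SCRAPY_PROJECT"] = project
    env["SCRAPY_SPIDER"] = message["_spider"]
    env["SCRAPY_JOB"] = message["_job"]
    if project in settings:
        env["SCRAPY_SETTINGS_MODULE"] = settings[project]
    # Note: no SCRAPY_SQLITE_DB here -- that variable (and the per-project
    # .db path) is exactly what this commit removed.
    env["SCRAPY_LOG_FILE"] = os.path.join(
        logs_dir, project, message["_spider"], "%s.log" % message["_job"]
    )
    return env
```

The environment stays a plain dict of strings so it can be passed directly to a spawned spider process.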
scrapyd/tests/test_environ.py

@@ -29,6 +29,5 @@ class EnvironmentTest(unittest.TestCase):

         self.assertEqual(env['SCRAPY_SLOT'], '3')
         self.assertEqual(env['SCRAPY_SPIDER'], 'myspider')
         self.assertEqual(env['SCRAPY_JOB'], 'ID')
-        self.assert_(env['SCRAPY_SQLITE_DB'].endswith('mybot.db'))
         self.assert_(env['SCRAPY_LOG_FILE'].endswith(os.path.join('mybot', 'myspider', 'ID.log')))
         self.failIf('SCRAPY_SETTINGS_MODULE' in env)
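The surviving assertions use ``assert_`` and ``failIf``, ``TestCase`` aliases that were already deprecated in this era and have since been removed from the standard library. A modernized sketch of the same checks, run against a hypothetical stand-in env dict rather than the real ``Environment`` object, might read:

```python
import os
import unittest

class EnvironmentTestSketch(unittest.TestCase):
    def _fake_env(self):
        # Hypothetical stand-in for the dict the real get_environment
        # would return after this commit (no SCRAPY_SQLITE_DB key).
        return {
            "SCRAPY_SLOT": "3",
            "SCRAPY_SPIDER": "myspider",
            "SCRAPY_JOB": "ID",
            "SCRAPY_LOG_FILE": os.path.join("mybot", "myspider", "ID.log"),
        }

    def test_get_environment(self):
        env = self._fake_env()
        self.assertEqual(env["SCRAPY_SLOT"], "3")
        self.assertEqual(env["SCRAPY_SPIDER"], "myspider")
        self.assertEqual(env["SCRAPY_JOB"], "ID")
        # assertTrue replaces the deprecated assert_;
        # assertNotIn replaces failIf(... in ...)
        self.assertTrue(env["SCRAPY_LOG_FILE"].endswith(
            os.path.join("mybot", "myspider", "ID.log")))
        self.assertNotIn("SCRAPY_SETTINGS_MODULE", env)
```

The modern names make the intent of each check explicit and survive on current Python versions.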