喜羊羊3508 / Dak  (mirror; last synced successfully 11 months ago)
Commit fc2f0edf
Authored Aug 16, 2008 by Joerg Jaspert
Merge commit 'mhy/checksums'

* commit 'mhy/checksums':
  revert change to get_files_id

Parents: cce84a5c 6c793f19
Changes: showing 4 changed files with 11 additions and 9 deletions (+11 −9)

ChangeLog                   +4 −0
dak/process_accepted.py     +1 −1
dak/process_unchecked.py    +2 −2
daklib/database.py          +4 −6
ChangeLog  (view file @ fc2f0edf)

2008-08-15  Mark Hymers  <mhy@debian.org>

	* dak/process_accepted.py, dak/process_unchecked.py, daklib/database.py:
	Don't change get_files_id to use sha1sum and sha256sum.

	* setup/init_pool.sql, dak/check_archive.py, dak/decode_dot_dak.py,
	dak/process_accepted.py, dak/process_unchecked.py, daklib/database.py,
	daklib/queue.py, daklib/utils.py: Attempt to add sha1sum and
	...
dak/process_accepted.py  (view file @ fc2f0edf)

@@ -311,7 +311,7 @@ def install ():
             # files id is stored in dsc_files by check_dsc().
             files_id = dsc_files[dsc_file].get("files id", None)
             if files_id == None:
-                files_id = database.get_files_id(filename, dsc_files[dsc_file]["size"], dsc_files[dsc_file]["md5sum"], files[file]["sha1sum"], files[file]["sha256sum"], dsc_location_id)
+                files_id = database.get_files_id(filename, dsc_files[dsc_file]["size"], dsc_files[dsc_file]["md5sum"], dsc_location_id)
             # FIXME: needs to check for -1/-2 and or handle exception
             if files_id == None:
                 files_id = database.set_files_id(filename, dsc_files[dsc_file]["size"], dsc_files[dsc_file]["md5sum"], files[file]["sha1sum"], files[file]["sha256sum"], dsc_location_id)
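The FIXME in this hunk is worth spelling out: get_files_id() can return -1 (multiple matches) or -2 (size/md5sum mismatch), but install() only guards against None before falling through, so an error code can be stored as a "files id". A minimal sketch (with a hypothetical stub in place of dak's database module) shows the failure mode:

```python
# Hypothetical stub mirroring get_files_id()'s contract as shown in the
# diffs: an integer id, None (not in the pool), -1 (multiple matches),
# or -2 (size/md5sum mismatch with the existing copy).
def get_files_id_stub(filename):
    # Pretend the pool already holds a copy with a different md5sum.
    return -2

def install_lookup(filename):
    files_id = get_files_id_stub(filename)
    # As in the hunk above: only None triggers the insert path, so the
    # -1/-2 error codes slip through, which is what the FIXME warns about.
    if files_id == None:
        files_id = -999  # stand-in for database.set_files_id(...)
    return files_id

print(install_lookup("pool/main/d/dak/dak_1.0.dsc"))  # prints -2
```

Compare this with check_files() in the next diff, which does reject uploads on -1 and -2.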
dak/process_unchecked.py  (view file @ fc2f0edf)

@@ -630,11 +630,11 @@ def check_files():
             # Check the md5sum & size against existing files (if any)
             files[f]["pool name"] = utils.poolify (changes["source"], files[f]["component"])
-            files_id = database.get_files_id (files[f]["pool name"] + f, files[f]["size"], files[f]["md5sum"], files[f]["sha1sum"], files[f]["sha256sum"], files[f]["location id"])
+            files_id = database.get_files_id (files[f]["pool name"] + f, files[f]["size"], files[f]["md5sum"], files[f]["location id"])
             if files_id == -1:
                 reject("INTERNAL ERROR, get_files_id() returned multiple matches for %s." % (f))
             elif files_id == -2:
-                reject("md5sum, sha1sum, sha256sum and/or size mismatch on existing copy of %s." % (f))
+                reject("md5sum and/or size mismatch on existing copy of %s." % (f))
             files[f]["files id"] = files_id
             # Check for packages that have moved from one component to another
daklib/database.py  (view file @ fc2f0edf)

@@ -317,7 +317,7 @@ def get_or_set_fingerprint_id (fingerprint):
 ################################################################################

-def get_files_id (filename, size, md5sum, sha1sum, sha256sum, location_id):
+def get_files_id (filename, size, md5sum, location_id):
     global files_id_cache

     cache_key = "%s_%d" % (filename, location_id)

@@ -326,7 +326,7 @@ def get_files_id (filename, size, md5sum, sha1sum, sha256sum, location_id):
         return files_id_cache[cache_key]
     size = int(size)
-    q = projectB.query("SELECT id, size, md5sum, sha1sum, sha256sum FROM files WHERE filename = '%s' AND location = %d" % (filename, location_id))
+    q = projectB.query("SELECT id, size, md5sum FROM files WHERE filename = '%s' AND location = %d" % (filename, location_id))
     ql = q.getresult()
     if ql:
         if len(ql) != 1:

@@ -334,9 +334,7 @@ def get_files_id (filename, size, md5sum, sha1sum, sha256sum, location_id):
         ql = ql[0]
         orig_size = int(ql[1])
         orig_md5sum = ql[2]
-        orig_sha1sum = ql[3]
-        orig_sha256sum = ql[4]
-        if orig_size != size or orig_md5sum != md5sum or orig_sha1sum != sha1sum or orig_sha256sum != sha256sum:
+        if orig_size != size or orig_md5sum != md5sum:
             return -2
         files_id_cache[cache_key] = ql[0]
         return files_id_cache[cache_key]

@@ -367,7 +365,7 @@ def set_files_id (filename, size, md5sum, sha1sum, sha256sum, location_id):
     projectB.query("INSERT INTO files (filename, size, md5sum, sha1sum, sha256sum, location) VALUES ('%s', %d, '%s', '%s', '%s', %d)"
                    % (filename, long(size), md5sum, sha1sum, sha256sum, location_id))
-    return get_files_id (filename, size, md5sum, sha1sum, sha256sum, location_id)
+    return get_files_id (filename, size, md5sum, location_id)

 ### currval has issues with postgresql 7.1.3 when the table is big
 ### it was taking ~3 seconds to return on auric which is very Not
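For context, the reverted get_files_id() contract can be sketched end to end. dak itself queries PostgreSQL through pygresql's projectB handle; the sqlite3 version below only illustrates the behaviour, with the table and column names taken from the diff and everything else hypothetical:

```python
import sqlite3

files_id_cache = {}

def get_files_id(db, filename, size, md5sum, location_id):
    """Sketch of the reverted contract: return the cached or looked-up
    files.id, None if the file is absent, -1 on multiple matches, and
    -2 if size or md5sum disagree with the stored copy."""
    # Note the cache key ignores checksums, as in the original code.
    cache_key = "%s_%d" % (filename, location_id)
    if cache_key in files_id_cache:
        return files_id_cache[cache_key]
    size = int(size)
    rows = db.execute(
        "SELECT id, size, md5sum FROM files WHERE filename = ? AND location = ?",
        (filename, location_id)).fetchall()
    if not rows:
        return None
    if len(rows) != 1:
        return -1
    files_id, orig_size, orig_md5sum = rows[0]
    if int(orig_size) != size or orig_md5sum != md5sum:
        return -2
    files_id_cache[cache_key] = files_id
    return files_id

# Tiny in-memory demo (filenames are made up for illustration).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (id INTEGER PRIMARY KEY, filename TEXT,"
           " size INTEGER, md5sum TEXT, location INTEGER)")
db.execute("INSERT INTO files VALUES (1, 'pool/main/d/dak_1.0.dsc', 100, 'abc', 7)")
db.execute("INSERT INTO files VALUES (2, 'pool/main/e/eek_1.0.dsc', 50, 'def', 7)")
print(get_files_id(db, 'pool/main/d/dak_1.0.dsc', 100, 'abc', 7))  # 1
print(get_files_id(db, 'pool/main/e/eek_1.0.dsc', 50, 'zzz', 7))   # -2 (md5sum mismatch)
print(get_files_id(db, 'pool/main/x/missing.dsc', 1, 'a', 7))      # None
```

One design consequence visible here, and in the original: because the cache key is only filename plus location, a second call for a cached file skips the size/md5sum check entirely.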