Commit 17169725 authored by wanghaoshuang

Avoid duplicate computation.

Parent 1cfa65ec
...
@@ -80,8 +80,12 @@ def sensitivity(program,
             loss = (baseline - pruned_metric) / baseline
             _logger.info("pruned param: {}; {}; loss={}".format(name, ratio,
                                                                 loss))
-            sensitivities[name]['pruned_percent'].append(ratio)
-            sensitivities[name]['loss'].append(loss)
+            for brother in pruner.pruned_list[0]:
+                if brother in sensitivities:
+                    sensitivities[name]['pruned_percent'].append(ratio)
+                    sensitivities[name]['loss'].append(loss)
             _save_sensitivities(sensitivities, sensitivities_file)
             # restore pruned parameters
...
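For context, here is a minimal, self-contained sketch of the idea behind this change: when several parameters are pruned together as one group, the loss measured for one of them can be recorded for every parameter in the group, so later iterations can skip the redundant evaluation. The helper names (sensitivity_sketch, eval_func, get_brothers) and the toy usage below are hypothetical illustrations, not the repository's API; the actual change keeps this state in sensitivities and pruner.pruned_list as shown in the hunk above.

# Hypothetical sketch (not PaddlePaddle code): share one evaluation result
# among all parameters pruned together, then skip ratios already recorded.

def sensitivity_sketch(param_names, ratios, eval_func, get_brothers, baseline):
    """Return {param_name: {'pruned_percent': [...], 'loss': [...]}}."""
    sensitivities = {n: {'pruned_percent': [], 'loss': []} for n in param_names}
    for name in param_names:
        for ratio in ratios:
            # Skip if a previously evaluated "brother" already recorded this ratio.
            if ratio in sensitivities[name]['pruned_percent']:
                continue
            pruned_metric = eval_func(name, ratio)  # the expensive evaluation
            loss = (baseline - pruned_metric) / baseline
            # Record the result for every parameter pruned together with `name`.
            for brother in get_brothers(name):
                if brother in sensitivities:
                    sensitivities[brother]['pruned_percent'].append(ratio)
                    sensitivities[brother]['loss'].append(loss)
    return sensitivities


if __name__ == "__main__":
    # Toy usage: conv1 and conv2 are pruned as one group, so conv2 reuses
    # conv1's measurements instead of calling eval_func again.
    groups = {"conv1": ["conv1", "conv2"], "conv2": ["conv1", "conv2"]}
    result = sensitivity_sketch(
        param_names=["conv1", "conv2"],
        ratios=[0.1, 0.2],
        eval_func=lambda name, ratio: 0.9 - ratio,  # pretend accuracy metric
        get_brothers=lambda name: groups[name],
        baseline=0.9)
    print(result)

Under this reading, the cost drops from one evaluation per parameter to one per pruning group, which is where the duplicate computation is avoided.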