Commit 9e5a3a08 authored by wgzqz

Change the default norm of gradient method to be L∞

Parent 65d90682
@@ -32,7 +32,7 @@ class GradientMethodAttack(Attack):
         super(GradientMethodAttack, self).__init__(model)
         self.support_targeted = support_targeted

-    def _apply(self, adversary, norm_ord=2, epsilons=0.01, steps=100):
+    def _apply(self, adversary, norm_ord=np.inf, epsilons=0.01, steps=100):
         """
         Apply the gradient attack method.
         :param adversary(Adversary):
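For context, the new default means each gradient step is scaled under the L∞ norm rather than L2. Below is a minimal, illustrative sketch of how a norm_ord argument typically changes the step direction; normalize_gradient is a hypothetical helper written for this note, not the repository's actual GradientMethodAttack implementation, which may differ in detail.

    import numpy as np

    def normalize_gradient(gradient, norm_ord=np.inf):
        """Scale a gradient to unit length under the chosen norm.

        With norm_ord=np.inf this reduces to the sign of the gradient,
        as in FGSM; with norm_ord=2 the gradient is divided by its
        Euclidean norm.
        """
        if norm_ord == np.inf:
            return np.sign(gradient)
        norm = np.linalg.norm(gradient.reshape(-1), ord=norm_ord)
        # Guard against division by zero for an all-zero gradient.
        return gradient / max(norm, 1e-12)

    # Example: one attack step of size epsilon under each norm.
    gradient = np.array([0.3, -0.7, 0.1])
    epsilon = 0.01
    step_linf = epsilon * normalize_gradient(gradient)        # L-inf step
    step_l2 = epsilon * normalize_gradient(gradient, 2)       # L2 step

Under the L∞ norm the normalized gradient is simply its sign, which is the customary choice for fast gradient-sign-style attacks; under L2 the gradient keeps its direction but is rescaled to unit Euclidean length.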