Add an advanced activation layer for ReLU (#10322)
The `max_value` argument cannot be used in a layer, except via a custom layer or a `Lambda`. Hence, similarly to `LeakyReLU` or, for example, `Softmax`, this PR adds an advanced activation layer for ReLU, which also enables a capped ReLU to be used.
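A minimal sketch of how the new layer could be used, assuming the `tf.keras` API where this layer landed as `keras.layers.ReLU(max_value=...)`; the input values here are illustrative:

```python
import numpy as np
from tensorflow.keras.layers import ReLU

# Capped ReLU: negatives are zeroed and positives are clipped at max_value
# (max_value=6.0 gives the common "ReLU6" activation).
layer = ReLU(max_value=6.0)

x = np.array([[-3.0, 2.0, 10.0]], dtype="float32")
y = layer(x).numpy()
print(y)  # [[0. 2. 6.]]
```

With `max_value=None` (the default) the layer behaves as a standard uncapped ReLU, so no `Lambda` wrapper is needed in either case.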