Unverified · Commit 8e7c8117 · authored by Nyakku Shigure · committed by GitHub

[CodeStyle] bump common hooks and remove all tabs in python files (#54796)

Parent 7f6bb160
.pre-commit-config.yaml

@@ -10,7 +10,7 @@ exclude: |
 repos:
   # Common hooks
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.1.0
+    rev: v4.4.0
     hooks:
       - id: check-added-large-files
       - id: check-merge-conflict
@@ -22,7 +22,7 @@ repos:
       - id: trailing-whitespace
         files: (.*\.(py|bzl|md|rst|c|cc|cxx|cpp|cu|h|hpp|hxx|xpu|kps|cmake|yaml|yml|hook)|BUILD|.*\.BUILD|WORKSPACE|CMakeLists\.txt)$
   - repo: https://github.com/Lucas-C/pre-commit-hooks.git
-    rev: v1.1.14
+    rev: v1.5.1
     hooks:
       - id: remove-crlf
       - id: remove-tabs
@@ -33,11 +33,9 @@ repos:
         name: Tabs remover (Python)
         files: (.*\.(py|bzl)|BUILD|.*\.BUILD|WORKSPACE)$
         args: [--whitespaces-count, '4']
-        # Exclude the fluid directory.
-        # And exclude some unit test files that require tabs.
+        # Exclude some unit test files that require tabs.
         exclude: |
           (?x)^(
-            python/paddle/fluid/.+|
             test/dygraph_to_static/test_error.py
           )$
   - repo: local
......
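The `remove-tabs` hook configured above, with `args: [--whitespaces-count, '4']`, rewrites tab characters in matching files as four spaces. A minimal sketch of that effect, assuming a plain tab-to-spaces substitution (the `remove_tabs` helper below is an illustration, not the actual Lucas-C/pre-commit-hooks implementation):

```python
# Illustrative sketch (an assumption, not the real hook's code):
# replace every tab character with a fixed number of spaces,
# mirroring `remove-tabs` run with `--whitespaces-count 4`.

def remove_tabs(text: str, whitespaces_count: int = 4) -> str:
    """Replace each tab character with `whitespaces_count` spaces."""
    return text.replace("\t", " " * whitespaces_count)

# A tab-indented snippet becomes four-space indented.
source = "def f():\n\treturn 1\n"
print(remove_tabs(source))
```

Files that legitimately need literal tabs (here, `test/dygraph_to_static/test_error.py`) are kept out of the hook's reach via the `exclude` pattern rather than by disabling the hook.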
@@ -409,7 +409,7 @@ def scaled_dot_product_attention(
     queries, keys, values, num_heads=1, dropout_rate=0.0
 ):
     r"""
-	:api_attr: Static Graph
+    :api_attr: Static Graph
     This interface Multi-Head Attention using scaled dot product.
     Attention mechanism can be seen as mapping a query and a set of key-value
......
@@ -3908,7 +3908,7 @@ Lamb = LambOptimizer
 class ModelAverage(Optimizer):
     r"""
-	:api_attr: Static Graph
+    :api_attr: Static Graph
     The ModelAverage optimizer accumulates specific continuous historical parameters
     during training. The accumulated historical range can be controlled by the passed
......