From fb72b192e777a41625ead3db95fd5f9cdda729b2 Mon Sep 17 00:00:00 2001
From: Wilber
Date: Wed, 12 Aug 2020 11:01:08 +0800
Subject: [PATCH] [DOC] Fix dead link (#26154)

---
 paddle/fluid/inference/analysis/README.md     | 18 +++++++++---------
 .../tests/api/int8_mkldnn_quantization.md     |  2 +-
 2 files changed, 10 insertions(+), 10 deletions(-)

diff --git a/paddle/fluid/inference/analysis/README.md b/paddle/fluid/inference/analysis/README.md
index 70adb4a974..9a53ce53ab 100644
--- a/paddle/fluid/inference/analysis/README.md
+++ b/paddle/fluid/inference/analysis/README.md
@@ -6,13 +6,13 @@ and make the various optimization features be pluggable and co-exist in a pipeli
 
 We borrowed some concepts from LLVM, such as
 
-- [Pass](./pass.h)es to implement optimization that traverse the inference program,
-- [DataFlowGraph](./data_flow_graph.h) to represent the data flow graph built from a program,
-- [PassManager](./pass_manager.h) to manage a sequence of `Pass`es over a graph.
+- [Pass](../../framework/ir/pass.h)es to implement optimization that traverse the inference program,
+- [Graph](../../framework/ir/graph.h) to represent the data flow graph built from a program,
+- [PassManager](./ir_pass_manager.h) to manage a sequence of `Pass`es over a graph.
 
 There are some other basic concepts here
 
-- [Node](./node.h), the node in a `DataFlowGraph`,
+- [Node](../../framework/ir/node.h), the node in a `Graph`,
 - `Function`, the Operator in Fluid,
 - `Value`, the Variable in Fluid;
 - [Argument](./argument.h), the argument that treat as the input and output of all `Pass`es in the pipeline,
@@ -21,9 +21,9 @@ There are some other basic concepts here
 
 The `inference/analysis` module make all the passes in a pipeline, and works in such way:
 
-1. Build a `DataFlowGraph` from a Fluid inference ProgramDesc,
-2. Call the middle passes one by one, the same `DataFlowGraph` is passed across all the passes,
-3. Transform a new ProgramDesc from the modified `DataFlowGraph`.
+1. Build a `Graph` from a Fluid inference ProgramDesc,
+2. Call the middle passes one by one, the same `Graph` is passed across all the passes,
+3. Transform a new ProgramDesc from the modified `Graph`.
 
 The new optimization features can be added as an independent `Pass` and controlled by gflags, each pass will generate unified debug information or visualization for better debugging.
 
@@ -54,5 +54,5 @@ It can be used as a helper class that draws the modified graph after each pass.
 There is some helper legacy/function/class for analysis.
 
 - [dot.h](./dot.h) give a easy to use interface for generating `DOT` codes,
-- [graph_traits.h](./graph_traits.h) contains the interfaces of the graph traversal algorithms, it uses `iterator`to make the algorithms easy to share across different passes,
-there are some implementations in [data_flow_graph.cc](./data_flow_graph.cc) , such as BFS and DFS..
+- [graph_traits.h](../../framework/ir/graph_traits.h) contains the interfaces of the graph traversal algorithms, it uses `iterator`to make the algorithms easy to share across different passes,
+there are some implementations in [graph_helper.cc](../../framework/ir/graph_helper.cc) , such as BFS and DFS..
diff --git a/paddle/fluid/inference/tests/api/int8_mkldnn_quantization.md b/paddle/fluid/inference/tests/api/int8_mkldnn_quantization.md
index 1fc35f86bc..51870e8041 100644
--- a/paddle/fluid/inference/tests/api/int8_mkldnn_quantization.md
+++ b/paddle/fluid/inference/tests/api/int8_mkldnn_quantization.md
@@ -18,7 +18,7 @@ For reference, please examine the code of unit test enclosed in [analyzer_int8_i
 
 * ### Create Analysis config
 
-INT8 quantization is one of the optimizations in analysis config. More information about analysis config can be found [here](https://github.com/PaddlePaddle/FluidDoc/blob/develop/doc/fluid/advanced_usage/deploy/inference/native_infer_en.md#upgrade-performance-based-on-contribanalysisconfig-prerelease)
+INT8 quantization is one of the optimizations in analysis config. More information about analysis config can be found [here](https://www.paddlepaddle.org.cn/documentation/docs/en/advanced_guide/inference_deployment/inference/native_infer_en.html#a-name-use-analysisconfig-to-manage-inference-configurations-use-analysisconfig-to-manage-inference-configurations-a)
 
 * ### Create quantize config by analysis config
 
--
GitLab