Quantization-Aware Training (QAT): adds fake-quant logic to the given quantizable layers, i.e. inserts quant-dequant computation for both activation inputs and weight inputs.
"""
def __init__(self,
             config=None,
             weight_preprocess=None,
             act_preprocess=None,
             weight_quantize=None,
             act_quantize=None):
"""
Args:
config(dict, optional): Configuration for quantization. If None, the default config is used.
    Default: None.
weight_quantize(class, optional): Defines how to quantize weights. This lets users quickly
    test whether a custom quantization method works. The class must implement both the
    quantization and dequantization steps: its input is the non-quantized weight and it
    returns the dequantized weight. If None, the quantization op defined by
    'weight_quantize_type' is used.
    Default: None.
act_quantize(class, optional): Defines how to quantize activations. This lets users quickly
    test whether a custom quantization method works. The class must implement both the
    quantization and dequantization steps: its input is the non-quantized activation and it
    returns the dequantized activation. If None, the quantization op defined by
    'activation_quantize_type' is used.
    Default: None.
weight_preprocess(class, optional): Defines how to preprocess weights before quantization.
    This lets users quickly test whether a custom preprocessing method works. Its input is
    the non-quantized weight and it returns the processed weight to be quantized. If None,
    the preprocess method defined by 'weight_preprocess_type' is used.
    Default: None.
act_preprocess(class, optional): Defines how to preprocess activations before quantization.
    This lets users quickly test whether a custom preprocessing method works. Its input is
    the non-quantized activation and it returns the processed activation to be quantized.
    If None, the preprocess method defined by 'activation_preprocess_type' is used.
    Default: None.
"""
if config is None:
    config = _quant_config_default
else:
    assert isinstance(config, dict), "config must be dict"