Casts all floating point parameters and buffers to ``float`` data type.
Parameters:
    excluded_layers(nn.Layer|list|None, optional): Specify the layers that need to keep the original data type. If excluded_layers is None, casts all floating point parameters and buffers. Default: None.
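
Examples:
    A minimal usage sketch, mirroring the ``float16`` example below (the ``Model`` layer is illustrative only):

    .. code-block:: python

        import paddle

        class Model(paddle.nn.Layer):
            def __init__(self):
                super().__init__()
                self.linear = paddle.nn.Linear(1, 1)
                self.dropout = paddle.nn.Dropout(p=0.5)

            def forward(self, input):
                out = self.linear(input)
                out = self.dropout(out)
                return out

        model = Model()
        model.float()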
Casts all floating point parameters and buffers to ``float16`` data type.
.. note::
    ``nn.BatchNorm`` does not support ``float16`` weights, so it will not be converted by default.
Parameters:
    excluded_layers(nn.Layer|list|None, optional): Specify the layers that need to keep the original data type. If excluded_layers is None, casts all floating point parameters and buffers except ``nn.BatchNorm``. Default: None.
Returns:
    Layer: self
Examples:
    .. code-block:: python

        import paddle

        class Model(paddle.nn.Layer):
            def __init__(self):
                super().__init__()
                self.linear = paddle.nn.Linear(1, 1)
                self.dropout = paddle.nn.Dropout(p=0.5)

            def forward(self, input):
                out = self.linear(input)
                out = self.dropout(out)
                return out

        model = Model()
        model.float16()
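        # A quick check (illustrative): parameters touched by the cast should now
        # report a float16 dtype, assuming this Paddle build supports float16;
        # otherwise the original data type is kept.
        print(model.linear.weight.dtype)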
'''
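# Guard: warn and keep the original data type when this Paddle build lacks float16 support.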
if paddle.amp.is_float16_supported() is False:
    warnings.warn(
        "Paddle compiled by the user does not support float16, so keep original data type."
    )
Casts all floating point parameters and buffers to ``bfloat16`` data type.
.. note::
    ``nn.BatchNorm`` does not support ``bfloat16`` weights, so it will not be converted by default.
Parameters:
    excluded_layers(nn.Layer|list|None, optional): Specify the layers that need to keep the original data type. If excluded_layers is None, casts all floating point parameters and buffers except ``nn.BatchNorm``. Default: None.
Returns:
    Layer: self
Examples:
    .. code-block:: python

        import paddle

        class Model(paddle.nn.Layer):
            def __init__(self):
                super().__init__()
                self.linear = paddle.nn.Linear(1, 1)
                self.dropout = paddle.nn.Dropout(p=0.5)

            def forward(self, input):
                out = self.linear(input)
                out = self.dropout(out)
                return out

        model = Model()
        model.bfloat16()
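        # A quick check (illustrative): paddle.amp.is_bfloat16_supported() reports
        # whether the cast can take effect; if unsupported, the original dtype is kept.
        print(model.linear.weight.dtype)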
'''
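# Guard: warn and keep the original data type when this Paddle build lacks bfloat16 support.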
if paddle.amp.is_bfloat16_supported() is False:
    warnings.warn(
        "Paddle compiled by the user does not support bfloat16, so keep original data type."
    )