common.encoder¶
Please refer to ding/ding/docs/source/api_doc/model/common/encoder.py for usage.
ConvEncoder¶
- class ding.model.common.encoder.ConvEncoder(obs_shape: ding.utils.type_helper.SequenceType, hidden_size_list: ding.utils.type_helper.SequenceType = [32, 64, 64, 128], activation: Optional[torch.nn.modules.module.Module] = ReLU(), norm_type: Optional[str] = None)[source]¶
- Overview:
    The Convolution Encoder used in models, to encode raw 2-dim observations.
- Interfaces:
    __init__, forward
- __init__(obs_shape: ding.utils.type_helper.SequenceType, hidden_size_list: ding.utils.type_helper.SequenceType = [32, 64, 64, 128], activation: Optional[torch.nn.modules.module.Module] = ReLU(), norm_type: Optional[str] = None) None [source]¶
- Overview:
Init the Convolution Encoder according to arguments.
- Arguments:
    - obs_shape (SequenceType): Sequence of in_channel, plus the output size
    - hidden_size_list (SequenceType): The collection of hidden_size
    - activation (nn.Module): The type of activation to use in the conv layers and ResBlock; if None, defaults to nn.ReLU()
    - norm_type (str): The type of normalization to use, see ding.torch_utils.ResBlock for more details
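To make the constructor arguments concrete, here is a minimal, hedged sketch of a ConvEncoder-style module in plain PyTorch. It is an illustrative re-implementation, not the ding source: the class name `SimpleConvEncoder`, the kernel/stride choices, and the final flatten-plus-linear projection are assumptions, but the roles of `obs_shape`, `hidden_size_list`, and `activation` follow the argument list above (conv channels walk `hidden_size_list[:-1]`, and the encoder outputs a vector of size `hidden_size_list[-1]`).

```python
import torch
import torch.nn as nn


class SimpleConvEncoder(nn.Module):
    """Illustrative sketch of a ConvEncoder-style module (not the ding code).

    Stacks conv layers whose output channels follow hidden_size_list[:-1],
    flattens, then projects to hidden_size_list[-1].
    """

    def __init__(self, obs_shape, hidden_size_list=(32, 64, 64, 128), activation=None):
        super().__init__()
        activation = activation if activation is not None else nn.ReLU()
        layers = []
        in_ch = obs_shape[0]  # obs_shape = (C, H, W)
        for out_ch in hidden_size_list[:-1]:
            # kernel_size/stride are assumed values for the sketch
            layers += [nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=2, padding=1), activation]
            in_ch = out_ch
        layers.append(nn.Flatten())
        self.conv = nn.Sequential(*layers)
        # Probe with a dummy input to size the final linear projection
        with torch.no_grad():
            flat_size = self.conv(torch.zeros(1, *obs_shape)).shape[1]
        self.fc = nn.Linear(flat_size, hidden_size_list[-1])

    def forward(self, x):
        return self.fc(self.conv(x))


enc = SimpleConvEncoder((4, 84, 84))
out = enc(torch.randn(2, 4, 84, 84))
print(out.shape)  # torch.Size([2, 128])
```

A 2-dim (image-like) observation of shape `(4, 84, 84)` is thus encoded to a 128-dim feature vector, the last entry of the default `hidden_size_list`.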
FCEncoder¶
- class ding.model.common.encoder.FCEncoder(obs_shape: int, hidden_size_list: ding.utils.type_helper.SequenceType, res_block: bool = False, activation: Optional[torch.nn.modules.module.Module] = ReLU(), norm_type: Optional[str] = None)[source]¶
- Overview:
    The FCEncoder used in models, to encode raw 1-dim observations.
- Interfaces:
    __init__, forward
- __init__(obs_shape: int, hidden_size_list: ding.utils.type_helper.SequenceType, res_block: bool = False, activation: Optional[torch.nn.modules.module.Module] = ReLU(), norm_type: Optional[str] = None) None [source]¶
- Overview:
Init the FC Encoder according to arguments.
- Arguments:
    - obs_shape (int): Observation shape
    - hidden_size_list (SequenceType): The collection of hidden_size
    - res_block (bool): Whether to use res_block
    - activation (nn.Module): The type of activation to use in the ResFCBlock; if None, defaults to nn.ReLU()
    - norm_type (str): The type of normalization to use, see ding.torch_utils.ResFCBlock for more details
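Analogously, a minimal, hedged sketch of an FCEncoder-style module in plain PyTorch. Again this is an illustration, not the ding source: the class name `SimpleFCEncoder` is an assumption, the `res_block`/`norm_type` options are omitted for brevity, and only the documented roles of `obs_shape`, `hidden_size_list`, and `activation` are shown (a chain of linear layers walking `hidden_size_list`).

```python
import torch
import torch.nn as nn


class SimpleFCEncoder(nn.Module):
    """Illustrative sketch of an FCEncoder-style module (not the ding code).

    Maps a 1-dim observation of size obs_shape through linear layers whose
    output sizes walk hidden_size_list; the last entry is the feature size.
    """

    def __init__(self, obs_shape, hidden_size_list, activation=None):
        super().__init__()
        activation = activation if activation is not None else nn.ReLU()
        layers = []
        in_size = obs_shape
        for out_size in hidden_size_list:
            layers += [nn.Linear(in_size, out_size), activation]
            in_size = out_size
        self.main = nn.Sequential(*layers)

    def forward(self, x):
        return self.main(x)


enc = SimpleFCEncoder(8, [128, 128, 64])
out = enc(torch.randn(2, 8))
print(out.shape)  # torch.Size([2, 64])
```

An 8-dim observation vector is thus encoded to a 64-dim feature, the last entry of `hidden_size_list`.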