mindchemistry.cell.FCNet
- class mindchemistry.cell.FCNet(channels, weight_init='normal', has_bias=True, bias_init='zeros', has_dropout=False, dropout_rate=0.5, has_layernorm=False, layernorm_epsilon=1e-7, has_activation=True, act='relu')[source]
- The Fully Connected Network. Applies a series of fully connected layers to the incoming data.
- Parameters
- channels (List) – The list of channel numbers for each fully connected layer.
- weight_init (Union[str, float, mindspore.common.initializer, List]) – The initializer of the layer weights. If weight_init is a list, each element corresponds to one layer. Default: 'normal'.
- has_bias (Union[bool, List]) – Whether the dense layers have a bias. If has_bias is a list, each element corresponds to one dense layer. Default: True.
- bias_init (Union[str, float, mindspore.common.initializer, List]) – The initializer of the bias of the dense layers. If bias_init is a list, each element corresponds to one dense layer. Default: 'zeros'.
- has_dropout (Union[bool, List]) – Whether each linear block has a dropout layer. If has_dropout is a list, each element corresponds to one layer. Default: False.
- dropout_rate (Union[float, List]) – The dropout rate for the dropout layer; must be a float in range (0, 1]. If dropout_rate is a list, each element corresponds to one dropout layer. Default: 0.5.
- has_layernorm (Union[bool, List]) – Whether each linear block has a layer normalization layer. If has_layernorm is a list, each element corresponds to one layer. Default: False.
- layernorm_epsilon (Union[float, List]) – The epsilon hyperparameter for the layer normalization layer. If layernorm_epsilon is a list, each element corresponds to one layer normalization layer. Default: 1e-7.
- has_activation (Union[bool, List]) – Whether each linear block has an activation layer. If has_activation is a list, each element corresponds to one layer. Default: True.
- act (Union[str, None, List]) – The activation function in the linear blocks. If act is a list, each element corresponds to one activation layer. Default: 'relu'. A list-valued configuration is sketched below.
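The per-layer switches above can also be passed as lists. Below is a minimal sketch, assuming each list holds one entry per linear block (i.e. len(channels) - 1 entries); the specific values are illustrative only:

  >>> import numpy as np
  >>> from mindchemistry.cell import FCNet
  >>> from mindspore import Tensor
  >>> # Assumption: one list entry per linear block (len(channels) - 1 = 3 here)
  >>> net = FCNet([3, 16, 16, 8],
  ...             has_dropout=[False, True, False],
  ...             dropout_rate=0.3,
  ...             has_layernorm=[True, True, False],
  ...             has_activation=[True, True, False])
  >>> inputs = Tensor(np.ones((4, 3), np.float32))
  >>> output = net(inputs)
  >>> print(output.shape)
  (4, 8)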
 
- Inputs:
- input (Tensor) - The shape of Tensor is \((*, channels[0])\). 
 
- Outputs:
- output (Tensor) - The shape of Tensor is \((*, channels[-1])\). 
 
- Supported Platforms:
- Ascend
- Examples
  >>> import numpy as np
  >>> from mindchemistry.cell import FCNet
  >>> from mindspore import Tensor
  >>> inputs = Tensor(np.array([[180, 234, 154], [244, 48, 247]], np.float32))
  >>> net = FCNet([3, 16, 32, 16, 8])
  >>> output = net(inputs)
  >>> print(output.shape)
  (2, 8)