mindscience.e3nn.nn.FullyConnectedNet
- class mindscience.e3nn.nn.FullyConnectedNet(h_list, act=None, out_act=False, init_method='normal', dtype=mindspore.float32)
Fully-connected neural network with a normalized activation acting on scalars. It stacks multiple dense layers and automatically rescales the activation function so that signal magnitudes stay stable during the forward and backward passes.
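The normalization can be pictured as rescaling the activation so that its output has unit second moment when the input is standard-normal. The sketch below only illustrates that idea; the helper normalize_activation is hypothetical and not part of the mindscience API.

from mindspore import ops

# Hypothetical illustration: rescale `act` so that act(z) has unit second
# moment for standard-normal z. The constant is estimated once by sampling.
def normalize_activation(act, num_samples=100_000):
    z = ops.standard_normal((num_samples,))
    c = ops.sqrt(ops.mean(act(z) ** 2))
    return lambda x: act(x) / c

norm_tanh = normalize_activation(ops.tanh)  # behaves like tanh, rescaled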
- Parameters
h_list (list[int]) – A list of input, internal and output dimensions for dense layers.
act (Func, optional) – Activation function, which will be automatically normalized. Default: None.
out_act (bool, optional) – Whether to apply the activation function to the output. Default: False.
init_method (Union[str, mindspore.common.initializer], optional) – Method used to initialize parameters. Default: 'normal'.
dtype (mindspore.dtype, optional) – The data type of the input tensor. Default: mindspore.float32.
- Inputs:
input (Tensor) - The shape of Tensor is \((h\_list[0])\).
- Outputs:
output (Tensor) - The shape of Tensor is \((h\_list[-1])\).
- Raises
TypeError – If the elements of h_list are not int.
Examples
>>> import mindspore as ms
>>> from mindspore import ops
>>> from mindscience.e3nn.nn import FullyConnectedNet
>>> fc = FullyConnectedNet([4, 10, 20, 12, 6], ops.tanh)
>>> fc
FullyConnectedNet [4, 10, 20, 12, 6]
>>> v = ms.Tensor([.1, .2, .3, .4])
>>> grad = ops.grad(fc, weights=fc.trainable_params())
>>> fc(v).shape
(6,)
>>> [x.shape for x in grad(v)[1]]
[(4, 10), (10, 20), (20, 12), (12, 6)]
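A further doctest-style sketch, assuming the constructor accepts the keyword arguments documented above; ops.silu is an arbitrary activation chosen for illustration, and the expected output shape follows from h_list[-1].

>>> fc2 = FullyConnectedNet([4, 8, 3], act=ops.silu, out_act=True)
>>> fc2(v).shape
(3,)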