mindspore.nn.Dense

class mindspore.nn.Dense(in_channels, out_channels, weight_init='normal', bias_init='zeros', has_bias=True, activation=None)[source]

The dense connected layer.

Applies the dense connected layer to the input. This layer implements the operation as:

\[\text{outputs} = \text{activation}(\text{X} * \text{kernel} + \text{bias}),\]

where \(X\) is the input tensor, \(\text{activation}\) is the activation function passed as the activation argument (if passed in), \(\text{kernel}\) is a weight matrix with the same data type as \(X\) created by the layer, and \(\text{bias}\) is a bias vector with the same data type as \(X\) created by the layer (only if has_bias is True).
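
As a point of reference, the computation can be sketched in plain NumPy; this is an illustrative sketch only, not the layer's actual implementation. The weight shape (out_channels, in_channels) below follows the weight_init constraints listed under Raises, so the product is written as x @ weight.T:

>>> import numpy as np
>>> # Minimal NumPy sketch of outputs = activation(X * kernel + bias),
>>> # here with no activation; the weight uses the documented
>>> # (out_channels, in_channels) layout, hence the transpose.
>>> x = np.random.randn(2, 3).astype(np.float32)       # (batch, in_channels)
>>> weight = np.random.randn(4, 3).astype(np.float32)  # (out_channels, in_channels)
>>> bias = np.zeros(4, dtype=np.float32)                # (out_channels,)
>>> outputs = x @ weight.T + bias
>>> print(outputs.shape)
(2, 4)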

Parameters
  • in_channels (int) – The number of channels in the input space.

  • out_channels (int) – The number of channels in the output space.

  • weight_init (Union[Tensor, str, Initializer, numbers.Number]) – The trainable weight_init parameter. The dtype is the same as x. If a str is passed, it refers to an initializer function; see initializer for the available values. Default: ‘normal’.

  • bias_init (Union[Tensor, str, Initializer, numbers.Number]) – The trainable bias_init parameter. The dtype is the same as x. If a str is passed, it refers to an initializer function; see initializer for the available values. Default: ‘zeros’.

  • has_bias (bool) – Specifies whether the layer uses a bias vector. Default: True.

  • activation (Union[str, Cell, Primitive, None]) – The activation function applied to the output of the fully connected layer. Both an activation name, e.g. ‘relu’, and a mindspore activation function, e.g. mindspore.ops.ReLU(), are supported, as sketched after this list. Default: None.
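
A short construction sketch using an explicit weight_init tensor, an explicit bias_init tensor, and a named activation; the concrete values are illustrative only:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> # Illustrative values: weight_init must be (out_channels, in_channels),
>>> # bias_init must be (out_channels,), and 'relu' is a supported activation name.
>>> weight = Tensor(np.ones((4, 3)), mindspore.float32)
>>> bias = Tensor(np.zeros(4), mindspore.float32)
>>> net = nn.Dense(3, 4, weight_init=weight, bias_init=bias, activation='relu')
>>> x = Tensor(np.array([[1., 2., 3.]]), mindspore.float32)
>>> print(net(x).shape)
(1, 4)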

Inputs:
  • x (Tensor) - Tensor of shape \((*, in\_channels)\). The in_channels in Args should be equal to \(in\_channels\) in Inputs.

Outputs:

Tensor of shape \((*, out\_channels)\).
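
A short sketch of the shape rule with a rank-3 input, assuming the layer accepts any number of leading dimensions as the \((*, in\_channels)\) notation indicates:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> # Leading dimensions (*) pass through unchanged; only the last axis
>>> # maps from in_channels to out_channels.
>>> x = Tensor(np.ones((8, 5, 3)), mindspore.float32)  # (*, in_channels) with * = (8, 5)
>>> net = nn.Dense(3, 4)
>>> print(net(x).shape)
(8, 5, 4)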

Raises
  • TypeError – If in_channels or out_channels is not an int.

  • TypeError – If has_bias is not a bool.

  • TypeError – If activation is not one of str, Cell, Primitive, None.

  • ValueError – If the shape of weight_init does not have length 2, or shape[0] of weight_init is not equal to out_channels, or shape[1] of weight_init is not equal to in_channels.

  • ValueError – If the shape of bias_init does not have length 1, or shape[0] of bias_init is not equal to out_channels.
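
To illustrate the weight_init shape check, a hedged sketch that passes a mismatched weight for a 3 -> 4 layer and, per the description above, should raise ValueError:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> # A (3, 3) weight violates the required (out_channels, in_channels) = (4, 3) shape.
>>> bad_weight = Tensor(np.ones((3, 3)), mindspore.float32)
>>> try:
...     net = nn.Dense(3, 4, weight_init=bad_weight)
... except ValueError:
...     print('ValueError raised')
ValueError raised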

Supported Platforms:

Ascend GPU CPU

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.array([[180, 234, 154], [244, 48, 247]]), mindspore.float32)
>>> net = nn.Dense(3, 4)
>>> output = net(x)
>>> print(output.shape)
(2, 4)