mindelec.architecture.ResBlock

class mindelec.architecture.ResBlock(in_channels, out_channels, weight_init='normal', bias_init='zeros', has_bias=True, activation=None)[source]

A dense (fully-connected) layer block with a residual connection.

Parameters
  • in_channels (int) – The number of channels in the input space.

  • out_channels (int) – The number of channels in the output space.

  • weight_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the trainable weight parameter. The dtype is the same as the input x. When a str is passed, it refers to a predefined initializer (see the function initializer). Default: “normal”.

  • bias_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the trainable bias parameter. The dtype is the same as the input x. When a str is passed, it refers to a predefined initializer (see the function initializer). Default: “zeros”.

  • has_bias (bool) – Specifies whether the layer uses a bias vector. Default: True.

  • activation (Union[str, Cell, Primitive, None]) – Activation function applied to the output of the dense layer (see the sketch after this list). Default: None.
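
The following is a minimal sketch, not taken from the original reference, showing how the non-default constructor arguments can be combined; the strings 'xavier_uniform' and 'relu' are assumed here to be valid initializer and activation names, respectively.

>>> from mindelec.architecture import ResBlock
>>> # residual dense block with an explicit weight initializer and ReLU activation
>>> net = ResBlock(16, 16, weight_init='xavier_uniform', activation='relu')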

Inputs:
  • input (Tensor) - Tensor of shape \((*, in\_channels)\).

Outputs:

Tensor of shape \((*, out\_channels)\).

Raises
  • ValueError – If in_channels is not equal to out_channels (illustrated in the sketch after this list).

  • TypeError – If activation is not one of str, Cell or Primitive.
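
A small sketch, not part of the original reference, illustrating the documented ValueError; it assumes the channel check happens at construction time, and since the exact error message is implementation-specific, the exception is caught and a fixed string is printed instead.

>>> from mindelec.architecture import ResBlock
>>> try:
...     ResBlock(3, 4)  # in_channels != out_channels triggers ValueError
... except ValueError:
...     print("in_channels must equal out_channels")
in_channels must equal out_channels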

Supported Platforms:

Ascend

Examples

>>> import numpy as np
>>> from mindelec.architecture import ResBlock
>>> from mindspore import Tensor
>>> inputs = Tensor(np.array([[180, 234, 154], [244, 48, 247]], np.float32))
>>> net = ResBlock(3, 3)
>>> output = net(inputs)
>>> print(output.shape)
(2, 3)
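
A further hedged sketch building on the example above, assuming 'relu' is an accepted activation name; because the residual connection requires matching channel counts, the output keeps the input shape.

>>> import numpy as np
>>> from mindspore import Tensor
>>> from mindelec.architecture import ResBlock
>>> x = Tensor(np.random.rand(2, 3).astype(np.float32))
>>> net = ResBlock(3, 3, activation='relu')
>>> print(net(x).shape)
(2, 3)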