mindspore.nn.TimeDistributed

class mindspore.nn.TimeDistributed(layer, time_axis, reshape_with_axis=None)

The time distributed layer.

Time distributed is a wrapper which allows a layer to be applied to every temporal slice of an input. The input is expected to be at least 3-dimensional. There are two cases in the implementation: when reshape_with_axis is provided, the reshape method is chosen, which is more efficient; otherwise, the input is divided along the time axis, which is more general. For example, reshape_with_axis cannot be provided when wrapping Batch Normalization; see the second example below.

Parameters
  • layer (Union[Cell, Primitive]) – The Cell or Primitive which will be wrapped.

  • time_axis (int) – The axis of the time steps.

  • reshape_with_axis (int) – The axis which will be merged with time_axis when reshaping. Default: None.

Inputs:
  • input (Tensor) - Tensor of shape \((N, T, *)\), where \(*\) means any number of additional dimensions.

Outputs:

Tensor of shape \((N, T, *)\).

Supported Platforms:

Ascend GPU CPU

Raises

TypeError – If layer is not a Cell or Primitive.

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> x = Tensor(np.random.random([32, 10, 3]), mindspore.float32)
>>> dense = nn.Dense(3, 6)
>>> net = nn.TimeDistributed(dense, time_axis=1, reshape_with_axis=0)
>>> output = net(x)
>>> print(output.shape)
(32, 10, 6)
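
When reshape_with_axis is omitted, the wrapper divides the input along time_axis and applies the layer to each slice. A minimal sketch of that path, wrapping nn.BatchNorm1d, which, as noted above, must use this path; the choice of BatchNorm1d over 3 features is illustrative, not prescribed by the API:

>>> x = Tensor(np.random.random([32, 10, 3]), mindspore.float32)
>>> bn = nn.BatchNorm1d(3)
>>> # No reshape_with_axis: each (32, 3) time slice is normalized independently,
>>> # then the slices are stacked back along time_axis.
>>> net = nn.TimeDistributed(bn, time_axis=1)
>>> output = net(x)
>>> print(output.shape)
(32, 10, 3)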