mindspore.mint.nn.PReLU
- class mindspore.mint.nn.PReLU(num_parameters=1, init=0.25, dtype=None)[source]
Apply the element-wise PReLU function.
PReLU is defined as:
\[\text{PReLU}(x_i) = \max(0, x_i) + w \cdot \min(0, x_i),\]
where \(x_i\) is an element of a channel of the input.
Here \(w\) is a learnable parameter with a default initial value of 0.25. When num_parameters equals the number of channels, a separate \(w\) is learned for each channel. When called with the default num_parameters=1, a single parameter \(w\) is shared across all channels.
PReLU Activation Function Graph:
Note
The channel dim is the 2nd dim of the input. When the input has fewer than 2 dims, there is no channel dim and the number of channels is 1.
In GE mode, the rank of the input tensor must be greater than 1; otherwise, an error will be triggered.
- Parameters
num_parameters (int, optional) – number of \(w\) to learn. Legitimate values are 1 or the number of channels of the input tensor. Default: 1.
init (float, optional) – the initial value of \(w\). Default: 0.25.
dtype (mindspore.dtype, optional) – the dtype of \(w\). Default: None. Supported data types are {float16, float32, bfloat16}.
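When num_parameters equals the channel count, each channel gets its own slope. The broadcasting this implies can be sketched in NumPy (an illustrative helper, assuming the channel dim is dim 1 as stated in the Note):

```python
import numpy as np

def prelu_per_channel(x, w):
    # One learnable slope per channel; channel dim is dim 1 of the input.
    x = np.asarray(x, dtype=np.float32)
    w = np.asarray(w, dtype=np.float32)
    # Reshape w to (1, C, 1, ...) so it broadcasts along the channel dim.
    shape = [1] * x.ndim
    shape[1] = w.size
    w = w.reshape(shape)
    return np.maximum(0, x) + w * np.minimum(0, x)

x = np.array([[[[-1.0, 2.0]], [[-1.0, 2.0]]]])  # shape (1, 2, 1, 2): 2 channels
out = prelu_per_channel(x, [0.1, 0.5])
# Channel 0 negatives are scaled by 0.1, channel 1 negatives by 0.5.
print(out)
```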
- Inputs:
input (Tensor) - The input tensor.
- Outputs:
Tensor, with the same shape as input.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> x = mindspore.tensor([[[[0.1, 0.6], [0.9, 0.9]]]], mindspore.float32)
>>> prelu = mindspore.mint.nn.PReLU()
>>> output = prelu(x)
>>> print(output)
[[[[0.1 0.6]
   [0.9 0.9]]]]