mindspore.ops.SpaceToBatch
- class mindspore.ops.SpaceToBatch(block_size, paddings)[source]
- SpaceToBatch is deprecated. Please use mindspore.ops.SpaceToBatchND instead.
- Divides spatial dimensions into blocks and combines the block size with the original batch.
- This operation divides the spatial dimensions (H, W) into blocks of size block_size; the output tensor's H and W dimensions are the corresponding number of blocks after the division. The output tensor's batch dimension is the product of the original batch and the square of block_size. Before the division, the spatial dimensions of the input are zero-padded according to paddings if necessary.
- Parameters
- block_size (int) – The block size for dividing the spatial dimensions, with a value greater than or equal to 2.
- paddings (Union[tuple, list]) – The padding values for the H and W dimensions, containing 2 sublists. Each sublist contains 2 integer values. All values must be greater than or equal to 0. paddings[i] specifies the paddings for spatial dimension i, which corresponds to input dimension i+2. It is required that input_shape[i+2]+paddings[i][0]+paddings[i][1] is divisible by block_size (see the sketch below).
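- The following is a minimal sketch of the divisibility requirement above, written in plain Python rather than as a MindSpore API call; the names input_shape, block_size and paddings are illustrative.
>>> block_size = 2
>>> paddings = [[0, 1], [0, 0]]
>>> input_shape = (1, 1, 3, 4)   # (n, c, h, w)
>>> # Padded H and padded W must both be divisible by block_size.
>>> all((input_shape[i + 2] + paddings[i][0] + paddings[i][1]) % block_size == 0 for i in range(2))
True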
 
 - Inputs:
- input_x (Tensor) - The input tensor. It must be a 4-D tensor. The data type is Number. 
 
- Outputs:
- Tensor, the output tensor with the same data type as the input. Assume the input shape is \((n, c, h, w)\) with \(block\_size\) and \(paddings\). The shape of the output tensor will be \((n', c', h', w')\), where
- \(n' = n*(block\_size*block\_size)\)
- \(c' = c\)
- \(h' = (h+paddings[0][0]+paddings[0][1])//block\_size\)
- \(w' = (w+paddings[1][0]+paddings[1][1])//block\_size\)
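- As a sketch of the shape rule above, the helper below (hypothetical, not part of MindSpore) computes the output shape from the input shape, block_size and paddings.
>>> def space_to_batch_out_shape(input_shape, block_size, paddings):
...     n, c, h, w = input_shape
...     h_pad = h + paddings[0][0] + paddings[0][1]
...     w_pad = w + paddings[1][0] + paddings[1][1]
...     return (n * block_size * block_size, c, h_pad // block_size, w_pad // block_size)
...
>>> space_to_batch_out_shape((1, 1, 2, 2), 2, [[0, 0], [0, 0]])
(4, 1, 1, 1)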
 - Raises
- TypeError – If block_size is not an int. 
- ValueError – If block_size is less than 2. 
 
 - Supported Platforms:
- Deprecated 
- Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> block_size = 2
>>> paddings = [[0, 0], [0, 0]]
>>> space_to_batch = ops.SpaceToBatch(block_size, paddings)
>>> input_x = Tensor(np.array([[[[1, 2], [3, 4]]]]), mindspore.float32)
>>> output = space_to_batch(input_x)
>>> print(output)
[[[[1.]]]
 [[[2.]]]
 [[[3.]]]
 [[[4.]]]]
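- Because this operator is deprecated, a minimal migration sketch using mindspore.ops.SpaceToBatchND is given below. It assumes SpaceToBatchND takes a per-dimension block_shape (here [2, 2]) together with the same paddings; check the SpaceToBatchND documentation for the exact signature.
>>> # Assumed equivalent call; block_shape and paddings are passed positionally.
>>> space_to_batch_nd = ops.SpaceToBatchND([2, 2], [[0, 0], [0, 0]])
>>> output_nd = space_to_batch_nd(input_x)
>>> print(output_nd.shape)
(4, 1, 1, 1)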