Function Differences with tf.keras.initializers.RandomUniform


tf.keras.initializers.RandomUniform

tf.keras.initializers.RandomUniform(
    minval=-0.05, maxval=0.05, seed=None, dtype=tf.dtypes.float32
)

For more information, see tf.keras.initializers.RandomUniform.

mindspore.common.initializer.Uniform

class mindspore.common.initializer.Uniform(scale=0.07)

For more information, see mindspore.common.initializer.Uniform.

Usage

TensorFlow: The lower and upper bounds of the uniform distribution are specified by the parameters minval and maxval respectively, i.e. samples are drawn from U(minval, maxval). Default values: minval=-0.05, maxval=0.05.
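
For example, a minimal sketch of passing explicit bounds in TensorFlow (the bounds -0.1 and 0.1 below are arbitrary values chosen for illustration):

import tensorflow as tf

# Draw samples from U(-0.1, 0.1); the lower and upper bounds can be set independently.
init = tf.keras.initializers.RandomUniform(minval=-0.1, maxval=0.1)
x = init(shape=(1, 2))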

MindSpore: The range of the uniform distribution is specified by a single parameter scale, i.e. samples are drawn from U(-scale, scale). Default value: scale=0.07.
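
Correspondingly, a minimal sketch of setting the range in MindSpore (the scale value 0.1 is an arbitrary illustration value; only the half-width can be set, so the range is always symmetric around zero):

import mindspore as ms
from mindspore.common.initializer import Uniform, initializer

# Draw samples from U(-0.1, 0.1)
x = initializer(Uniform(scale=0.1), shape=[1, 2], dtype=ms.float32)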

Code Example

import tensorflow as tf

init = tf.keras.initializers.RandomUniform()
x = init(shape=(1, 2))

with tf.Session() as sess:  # TF 1.x graph mode; in TF 2.x, x is an EagerTensor and can be printed directly
    print(x.eval())

# Out: random values, each within [-0.05, 0.05]

import mindspore as ms
from mindspore.common.initializer import Uniform, initializer

x = initializer(Uniform(), shape=[1, 2], dtype=ms.float32)
print(x)

# Out (values are random, within [-0.07, 0.07]):
# [[0.01140347 0.0076657 ]]
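
To reproduce TensorFlow's default range U(-0.05, 0.05) in MindSpore, the scale can be set explicitly (a sketch reusing the imports above):

x = initializer(Uniform(scale=0.05), shape=[1, 2], dtype=ms.float32)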