Composite operators.

Pre-defined combinations of operators.

mindspore.ops.composite.core(fn=None, **flags)[source]

A decorator that adds a flag to a function.

By default, this decorator marks the function with core=True, setting that flag on the graph.

  • fn (Function) – Function to add flag. Default: None.

  • flags (dict) – Flags to set, passed as keyword arguments. The flag core indicates that this is a core function; other flags may also be set. Default: None.

mindspore.ops.composite.add_flags(fn=None, **flags)[source]

A decorator to add flags to a function.


Only bool values are supported.

  • fn (Function) – Function or cell to add flags to. Default: None.

  • flags (dict) – Flags passed as keyword arguments. Default: None.


Function, fn with the added flags.


>>> add_flags(net, predict=True)
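Both core and add_flags follow the same flag-decorator pattern: record boolean flags on the decorated function so the graph compiler can read them later. A minimal pure-Python sketch of that pattern (the attribute name `_mindspore_flags` is hypothetical; the real decorators also propagate the flags to the compiled graph):

```python
# Sketch of the flag-decorator pattern shared by `core` and `add_flags`
# (illustrative only; attribute name is hypothetical).
def add_flags(fn=None, **flags):
    def deco(fn):
        # Merge new flags with any flags set by an earlier decorator.
        existing = getattr(fn, "_mindspore_flags", {})
        fn._mindspore_flags = {**existing, **flags}
        return fn
    # Support both bare use, add_flags(fn), and parameterized use,
    # add_flags(predict=True)(fn).
    return deco if fn is None else deco(fn)

def core(fn=None, **flags):
    # `core` is `add_flags` with core=True set by default.
    flags.setdefault("core", True)
    return add_flags(fn, **flags)

@core
def net(x):
    return x + 1
```

Calling `add_flags(net, predict=True)` afterwards merges the new flag with the existing core=True.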
class mindspore.ops.composite.MultitypeFuncGraph(name)[source]

Generates a multi-type dispatch graph.

MultitypeFuncGraph is a class used to generate graphs for functions with different types as inputs.


name (str) – Operator name.


ValueError – Raised when no matching fn can be found for the given args.


>>> # `add` is a metagraph object which will add two objects according to
>>> # input type using ".register" decorator.
>>> add = MultitypeFuncGraph('add')

Register a function for the given type string.
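The register-and-dispatch mechanism can be sketched in pure Python. This is a simplified stand-in: it dispatches on Python types at call time, whereas the real MultitypeFuncGraph registers against type strings and resolves the overload during graph compilation:

```python
class MultitypeFuncGraph:
    """Toy dispatcher: routes a call to the function registered for the
    argument types (simplified stand-in for graph-based dispatch)."""
    def __init__(self, name):
        self.name = name
        self._registry = {}

    def register(self, *types):
        def deco(fn):
            self._registry[types] = fn
            return fn
        return deco

    def __call__(self, *args):
        key = tuple(type(a) for a in args)
        if key not in self._registry:
            raise ValueError(f"Cannot find matching fn for the given args: {key}")
        return self._registry[key](*args)

# `add` dispatches to a different implementation per input type.
add = MultitypeFuncGraph('add')

@add.register(int, int)
def _add_int(x, y):
    return x + y

@add.register(str, str)
def _add_str(x, y):
    return x + y
```

With this sketch, add(1, 2) uses the int overload and add('a', 'b') the str overload; unregistered type combinations raise ValueError, mirroring the error described above.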

class mindspore.ops.composite.GradOperation(name, get_all=False, get_by_list=False, sens_param=False)[source]

A metafuncgraph object used to get the gradient of the output of a network (function).

GradOperation converts the network (function) into a back-propagation graph.

  • get_all (bool) – If True, get all the gradients w.r.t inputs. Default: False.

  • get_by_list (bool) – If True, get all the gradients w.r.t Parameter variables. If get_all and get_by_list are both False, get the gradient w.r.t first input. If get_all and get_by_list are both True, get the gradients w.r.t inputs and Parameter variables at the same time in the form of ((grads w.r.t inputs), (grads w.r.t parameters)). Default: False.

  • sens_param (bool) – Whether append sensitivity as input. If sens_param is False, a ‘ones_like(outputs)’ sensitivity will be attached automatically. Default: False.
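The calling convention of the flags above can be illustrated with a pure-Python sketch that substitutes central finite differences for the real back-propagation graph (the `grad` function and its scalar-only restriction are illustrative, not the MindSpore implementation):

```python
def grad(fn, get_all=False, eps=1e-6):
    """Numeric-gradient stand-in for GradOperation (finite differences,
    scalar inputs only; MindSpore instead builds a backprop graph)."""
    def grad_fn(*inputs):
        grads = []
        for i in range(len(inputs)):
            lo, hi = list(inputs), list(inputs)
            lo[i] -= eps
            hi[i] += eps
            # Central difference w.r.t. the i-th input.
            grads.append((fn(*hi) - fn(*lo)) / (2 * eps))
        # get_all=False: gradient w.r.t. the first input only (the default).
        return tuple(grads) if get_all else grads[0]
    return grad_fn

def f(x, y):
    return x * x + 3 * y

df_dx = grad(f)(2.0, 1.0)                    # gradient w.r.t. x only
all_grads = grad(f, get_all=True)(2.0, 1.0)  # gradients w.r.t. x and y
```

Here grad(f) returns only the gradient with respect to the first input, while get_all=True returns a tuple of gradients for every input, mirroring the get_all flag described above.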

class mindspore.ops.composite.HyperMap(ops=None)[source]

HyperMap applies the given operation to the input sequences.

The operation is applied element-wise, to every element of each sequence.


ops (Union[MultitypeFuncGraph, None]) – ops is the operation to apply. If ops is None, the operation should be passed as the first input to the instance.

  • args (Tuple[sequence]) - If ops is not None, all the inputs must be sequences of the same length, and the operation is applied to each row of the sequences. E.g., if the length of args is 2, then for each i within the length of the sequences, (args[0][i], args[1][i]) is an input of the operation.

    If ops is None, the first input is the operation and the others are its inputs.


Sequence, the output has the same type and length as the input sequences, and the value of each element is the result of applying the operation to the corresponding row of elements, e.g. operation(args[0][i], args[1][i]).
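The row-wise application can be sketched in pure Python (simplified: the real HyperMap also recurses into nested structures and compiles to a graph):

```python
class HyperMap:
    """Toy HyperMap: applies an operation element-wise across sequences
    (conceptual sketch, not the MindSpore implementation)."""
    def __init__(self, ops=None):
        self.ops = ops

    def __call__(self, *args):
        # If no operation was fixed at construction, the first input is the
        # operation and the remaining inputs are the sequences.
        if self.ops is not None:
            ops, seqs = self.ops, args
        else:
            ops, seqs = args[0], args[1:]
        assert len({len(s) for s in seqs}) == 1, "sequences must have equal length"
        # Apply the operation to each row (args[0][i], args[1][i], ...).
        return tuple(ops(*row) for row in zip(*seqs))

square = HyperMap(lambda x: x * x)
```

square((1, 2, 3)) maps the fixed operation over one sequence; an instance built with ops=None instead takes the operation as its first argument, e.g. HyperMap()(lambda a, b: a + b, (1, 2), (10, 20)).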

mindspore.ops.composite.normal(shape, mean, stddev, seed=0)[source]

Generates random numbers according to the Normal (or Gaussian) distribution \(N(\mu, \sigma^2)\).

  • shape (tuple) – The shape of random tensor to be generated.

  • mean (Tensor) – The mean μ distribution parameter, which specifies the location of the peak. With float32 data type.

  • stddev (Tensor) – The standard deviation σ distribution parameter. With float32 data type.

  • seed (int) – Seed is used as entropy source for Random number engines generating pseudo-random numbers. Default: 0.


Tensor. The output shape is the broadcasted shape of the input shape and the shapes of mean and stddev. The dtype is float32.


>>> shape = (4, 16)
>>> mean = Tensor(1.0, mstype.float32)
>>> stddev = Tensor(1.0, mstype.float32)
>>> C.set_seed(10)
>>> output = C.normal(shape, mean, stddev, seed=5)
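The sampling semantics of the example above can be sketched with the standard library. This is an illustration only: the real op samples on-device, accepts tensor-valued mean and stddev with broadcasting, and handles seeds differently.

```python
import random

def normal(shape, mean, stddev, seed=0):
    """Toy 2-D sketch of C.normal with scalar mean/stddev
    (illustrative stand-in, not the MindSpore op)."""
    rng = random.Random(seed)
    rows, cols = shape
    # Draw one Gaussian sample per output element.
    return [[rng.gauss(mean, stddev) for _ in range(cols)] for _ in range(rows)]

output = normal((4, 16), 1.0, 1.0, seed=5)
```

As in the example above, the result has shape (4, 16), with each element drawn from N(1.0, 1.0).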
mindspore.ops.composite.clip_by_value(x, clip_value_min, clip_value_max)[source]

Clips tensor values to a specified min and max.

Limits the value of \(x\) to a range whose lower limit is clip_value_min and whose upper limit is clip_value_max.


clip_value_min must be less than or equal to clip_value_max.

  • x (Tensor) – Input data.

  • clip_value_min (Tensor) – The minimum value.

  • clip_value_max (Tensor) – The maximum value.


Tensor, a clipped Tensor.
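Element-wise, the clipping rule is min(max(x, clip_value_min), clip_value_max). A scalar sketch (the real op works on tensors and broadcasts the bounds):

```python
def clip_by_value(x, clip_value_min, clip_value_max):
    """Scalar sketch of the element-wise clipping rule
    (illustrative; the real op operates on tensors)."""
    if clip_value_min > clip_value_max:
        raise ValueError("clip_value_min must be <= clip_value_max")
    # Values below the range are raised to the minimum, values above
    # are lowered to the maximum, values inside pass through unchanged.
    return min(max(x, clip_value_min), clip_value_max)
```

For example, clipping 5 into the range [0, 3] yields 3, while 2 is already in range and passes through.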