mindelec.loss.NetWithEval

class mindelec.loss.NetWithEval(net_without_loss, constraints, loss='l2', dataset_input_map=None)[source]

Encapsulation class of a network with a loss function for evaluation.

Parameters
  • net_without_loss (Cell) – The training network without loss definition.

  • constraints (Constraints) – The constraints function of the PDE problem.

  • loss (Union[str, dict, Cell]) – The name of the loss function, e.g. “l1”, “l2” or “mae”. Default: “l2”.

  • dataset_input_map (dict) – The input map of the dataset. Default: None.
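As a rough sketch of what the built-in loss names usually denote (assuming the common definitions; the exact reduction applied inside mindelec may differ), “l1”/“mae” compute a mean absolute deviation and “l2” a mean squared deviation:

```python
import numpy as np

# Illustrative only: common definitions behind the loss-name strings.
# The precise reductions used by mindelec may differ.
def l1_loss(pred, label):
    # "l1" / "mae": mean absolute deviation
    return np.mean(np.abs(pred - label))

def l2_loss(pred, label):
    # "l2": mean squared deviation
    return np.mean((pred - label) ** 2)

pred = np.array([0.5, 1.5, 2.0], dtype=np.float32)
label = np.array([1.0, 1.0, 2.0], dtype=np.float32)
print(l1_loss(pred, label))
print(l2_loss(pred, label))
```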

Inputs:
  • inputs (Tensor) - The input is a variable-length argument containing the network inputs and the label.

Outputs:

Tuple, containing a scalar loss Tensor, a network output Tensor of shape \((N, \ldots)\) and a label Tensor of shape \((N, \ldots)\).

Supported Platforms:

Ascend

Examples

>>> import numpy as np
>>> from mindelec.loss import Constraints, NetWithEval
>>> from mindspore import Tensor, nn
>>> class Net(nn.Cell):
...     def __init__(self, input_dim, output_dim):
...         super(Net, self).__init__()
...         self.fc1 = nn.Dense(input_dim, 64)
...         self.fc2 = nn.Dense(64, output_dim)
...
...     def construct(self, *inputs):
...         x = inputs[0]
...         out = self.fc1(x)
...         out = self.fc2(out)
...         return out
>>> net = Net(3, 3)
>>> # For details about how to build the Constraints, please refer to the tutorial
>>> # document on the official website.
>>> constraints = Constraints(dataset, pde_dict)
>>> loss_network = NetWithEval(net, constraints)
>>> inputs = Tensor(np.ones([1000, 3]).astype(np.float32) * 0.01)
>>> label = Tensor(np.ones([1000, 3]).astype(np.float32))
>>> output_data = loss_network(inputs, label)