mindspore_gl.nn.SGConv

class mindspore_gl.nn.SGConv(in_feat_size: int, out_feat_size: int, num_hops: int = 1, cached: bool = True, bias: bool = True, norm=None)[source]

Simplified Graph Convolutional layer. From the paper Simplifying Graph Convolutional Networks.

\[H^{K} = (\tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2})^K X \Theta\]

Where \(\tilde{A}=A+I\) is the adjacency matrix with added self-loops and \(\tilde{D}_{ii}=\sum_{j}\tilde{A}_{ij}\) is its diagonal degree matrix.
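
As a concrete illustration of the propagation rule above (a NumPy sketch, not the library implementation), the \(K\)-hop smoothing can be reproduced directly from the formula; the adjacency matrix, features and weights below are arbitrary examples:

>>> import numpy as np
>>> A = np.array([[0, 1, 0],
...               [1, 0, 1],
...               [0, 1, 0]], dtype=np.float32)       # example adjacency matrix, N = 3
>>> X = np.ones((3, 4), dtype=np.float32)             # node features, D_in = 4
>>> Theta = np.ones((4, 2), dtype=np.float32)         # weight matrix, D_in -> D_out = 2
>>> K = 2                                             # number of hops
>>> A_tilde = A + np.eye(3, dtype=np.float32)         # \tilde{A} = A + I
>>> d = A_tilde.sum(axis=1)                           # degrees of \tilde{A}
>>> D_inv_sqrt = np.diag(d ** -0.5)                   # \tilde{D}^{-1/2}
>>> S = D_inv_sqrt @ A_tilde @ D_inv_sqrt             # normalized adjacency
>>> H_K = np.linalg.matrix_power(S, K) @ X @ Theta    # H^{K}
>>> print(H_K.shape)
(3, 2)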

Note

Currently, only PYNATIVE mode is supported.

Parameters
  • in_feat_size (int) – Input node feature size.

  • out_feat_size (int) – Output node feature size.

  • num_hops (int, optional) – Number of hops. Default: 1.

  • cached (bool, optional) – Whether to cache and reuse the result of the feature propagation. Default: True.

  • bias (bool, optional) – Whether to use bias. Default: True.

  • norm (Cell, optional) – Normalization function Cell (see the sketch below). Default: None.
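
For instance (a minimal sketch, not taken from the original documentation), a layer with two propagation hops and a user-supplied normalization Cell could be constructed as follows; mindspore.nn.LayerNorm is only an illustrative choice, and any Cell that maps node features of shape \((N, D_{out})\) to the same shape is assumed to be acceptable:

>>> import mindspore.nn as nn
>>> from mindspore_gl.nn import SGConv
>>> norm = nn.LayerNorm((16,))   # assumed: the norm Cell is applied to the output node features
>>> conv = SGConv(in_feat_size=8, out_feat_size=16, num_hops=2, cached=False, norm=norm)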

Inputs:
  • x (Tensor) - The input node features. The shape is \((N, D_{in})\) where \(N\) is the number of nodes, and \(D_{in}\) should be equal to in_feat_size in Args.

  • in_deg (Tensor) - In degree for nodes. The shape is \((N, )\) where \(N\) is the number of nodes.

  • out_deg (Tensor) - Out degree for nodes. The shape is \((N, )\) where \(N\) is the number of nodes.

  • g (Graph) - The input graph.

Outputs:
  • Tensor, output node features with shape of \((N, D_{out})\), where \(D_{out}\) should be the same as out_feat_size in Args.

Raises
  • TypeError – If in_feat_size, out_feat_size or num_hops is not an int.

  • TypeError – If bias or cached is not a bool.

  • TypeError – If norm is not a Cell.

Supported Platforms:

Ascend GPU

Examples

>>> import mindspore as ms
>>> import mindspore.context as context
>>> from mindspore_gl.nn import SGConv
>>> from mindspore_gl import GraphField
>>> context.set_context(device_target="GPU", mode=context.PYNATIVE_MODE)
>>> n_nodes = 4
>>> n_edges = 8
>>> src_idx = ms.Tensor([0, 0, 0, 1, 1, 1, 2, 3], ms.int32)
>>> dst_idx = ms.Tensor([0, 1, 3, 1, 2, 3, 3, 2], ms.int32)
>>> in_deg = ms.Tensor([1, 2, 2, 3], ms.int32)
>>> out_deg = ms.Tensor([3, 3, 1, 1], ms.int32)
>>> feat_size = 4
>>> in_feat_size = feat_size
>>> nh = ms.ops.Ones()((n_nodes, feat_size), ms.float32)  # node features, shape (N, D_in)
>>> g = GraphField(src_idx, dst_idx, n_nodes, n_edges)
>>> sgconv = SGConv(in_feat_size, feat_size)
>>> res = sgconv(nh, in_deg, out_deg, *g.get_graph())
>>> print(res.shape)
(4, 4)