mindspore_gl.nn.AGNNConv

class mindspore_gl.nn.AGNNConv(init_beta: float = 1.0, learn_beta: bool = True)[source]

Attention-based Graph Neural Network layer, from the paper Attention-based Graph Neural Network for Semi-Supervised Learning.

\[H^{l+1} = P H^{l}\]

Computation of \(P\) is:

\[P_{ij} = \mathrm{softmax}_i ( \beta \cdot \cos(h_i^l, h_j^l))\]

where \(\beta\) is a single scalar parameter shared across all edges.
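For intuition, the propagation matrix \(P\) can be sketched densely in NumPy. This is a hypothetical helper, not part of mindspore_gl; it assumes a binary adjacency mask and at least one neighbour (e.g. a self-loop) per node:

```python
import numpy as np

def agnn_propagation(h, adj, beta=1.0):
    """Dense sketch of AGNN propagation: H^{l+1} = P H^{l}.

    h:    (N, F) node feature matrix H^{l}
    adj:  (N, N) binary mask, adj[i, j] = 1 if j is a neighbour of i
    beta: the scalar temperature parameter
    """
    # Cosine similarity between every pair of node feature vectors.
    norm = np.linalg.norm(h, axis=1, keepdims=True) + 1e-12
    cos = (h @ h.T) / (norm * norm.T)  # cos[i, j] = cos(h_i, h_j)
    # Softmax of beta * cos over each node's neighbours only.
    logits = np.where(adj > 0, beta * cos, -np.inf)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(logits) * (adj > 0)
    P = e / e.sum(axis=1, keepdims=True)
    return P @ h  # aggregated features H^{l+1}
```

Because each row of \(P\) is a softmax, it sums to 1, so the output for every node is a convex combination of its neighbours' features.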

Parameters
  • init_beta (float, optional) – Init \(\beta\), a single scalar parameter. Default: 1.0.

  • learn_beta (bool, optional) – Whether \(\beta\) is learnable. Default: True.

Inputs:
  • x (Tensor): The input node features. The shape is \((N,*)\) where \(N\) is the number of nodes, and \(*\) could be of any shape.

  • g (Graph): The input graph.

Outputs:
  • Tensor, the output node features, with the same shape as the input x.

Supported Platforms:

Ascend GPU

Examples

>>> import mindspore as ms
>>> from mindspore_gl.nn import AGNNConv
>>> from mindspore_gl import GraphField
>>> n_nodes = 4
>>> n_edges = 8
>>> feat_size = 16
>>> src_idx = ms.Tensor([0, 0, 0, 1, 1, 1, 2, 3], ms.int32)
>>> dst_idx = ms.Tensor([0, 1, 3, 1, 2, 3, 3, 2], ms.int32)
>>> ones = ms.ops.Ones()
>>> feat = ones((n_nodes, feat_size), ms.float32)
>>> graph_field = GraphField(src_idx, dst_idx, n_nodes, n_edges)
>>> conv = AGNNConv()
>>> ret = conv(feat, *graph_field.get_graph())
>>> print(ret.shape)
(4, 16)