mindsponge.cell

| API Name | Description | Supported Platforms |
| --- | --- | --- |
| mindsponge.cell.Attention | Multi-head attention, as described in the paper "Attention Is All You Need". | Ascend GPU |
| mindsponge.cell.GlobalAttention | Global gated self-attention, as described in the paper "Highly accurate protein structure prediction with AlphaFold". | Ascend GPU |
| mindsponge.cell.InvariantPointAttention | Invariant point attention module. | Ascend GPU |
| mindsponge.cell.MSAColumnAttention | MSA column-wise gated self-attention. | Ascend GPU |
| mindsponge.cell.MSAColumnGlobalAttention | MSA column-wise global attention. | Ascend GPU |
| mindsponge.cell.MSARowAttentionWithPairBias | MSA row-wise gated self-attention in which information from the pair activations is used as a bias on the attention matrix. | Ascend GPU |
| mindsponge.cell.OuterProductMean | Computes the correlation of the input tensor along its second dimension; the computed correlation can be used to update the correlation features. | Ascend GPU |
| mindsponge.cell.Transition | A 2-layer MLP in which the intermediate layer expands the number of input channels by a factor of num_intermediate_factor (see the sketch below). | Ascend GPU |
| mindsponge.cell.TriangleAttention | Triangle attention. | Ascend GPU |
| mindsponge.cell.TriangleMultiplication | Triangle multiplication layer. | Ascend GPU |
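
The following is a minimal sketch, in plain MindSpore, of the 2-layer transition MLP described for mindsponge.cell.Transition above. It is not the MindSponge implementation: the class name `TransitionSketch`, the layer choices (LayerNorm, ReLU), and the default expansion factor of 4 are illustrative assumptions; only the idea of widening the channels by `num_intermediate_factor` comes from the description in the table.

```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor


class TransitionSketch(nn.Cell):
    """Sketch of a 2-layer transition MLP (not the MindSponge implementation).

    The intermediate layer widens the channel dimension by
    ``num_intermediate_factor``, mirroring the description of
    mindsponge.cell.Transition in the table above.
    """

    def __init__(self, num_channels, num_intermediate_factor=4):
        super().__init__()
        hidden = num_channels * num_intermediate_factor
        self.norm = nn.LayerNorm((num_channels,))
        self.dense1 = nn.Dense(num_channels, hidden)   # expand channels
        self.relu = nn.ReLU()
        self.dense2 = nn.Dense(hidden, num_channels)   # project back

    def construct(self, act):
        act = self.norm(act)
        act = self.relu(self.dense1(act))
        return self.dense2(act)


# Hypothetical usage with a [batch, seq_len, channels] activation tensor.
x = Tensor(np.random.randn(1, 8, 64).astype(np.float32))
out = TransitionSketch(num_channels=64)(x)
print(out.shape)  # (1, 8, 64): the channel dimension is unchanged
```

The output shape matches the input, so a module of this form can be dropped into a residual update of MSA or pair activations; refer to the mindsponge.cell.Transition API page for the actual constructor arguments.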