mindspore_lite.LiteInfer

class mindspore_lite.LiteInfer(model_or_net, *net_inputs, context=None, model_group_id=None, config: dict = None)[source]

The LiteInfer class takes a training model or network as input and performs predictions directly.

Parameters
  • model_or_net (Model, Cell) – MindSpore Model or MindSpore nn.Cell.

  • net_inputs (Union[Tensor, Dataset, List, Tuple, Number, Bool]) – The inputs of the net. If the network has multiple inputs, pass them together. When the type is Dataset, it represents the preprocessing behavior of the net, and the data preprocessing operations will be serialized. In that case, you need to adjust the batch size of the dataset script manually, which affects the batch size of the 'net' input. Currently only the "image" column of the dataset can be parsed.

  • context (Context, optional) – Define the context used to transfer options during execution. Default: None. None means a Context with the CPU target.

  • model_group_id (int, optional) – Used to bind the model to a model group. Default: None.

  • config (dict, optional) –

    Enabled when the backend is 'lite'. config consists of two parts: config_path ('configPath', str) and config_item (str, dict). When config_item is set, it takes priority over config_path. Used to set the rank table file for inference. The content of the configuration file is as follows:

    [ascend_context]
    rank_table_file=[path_a](the storage path of the rank table file)
    

    When the config is set as

    config = {"ascend_context" : {"rank_table_file" : "path_b"}}
    

    path_b from config takes precedence and will be used to compile the model. Default: None.

Raises

ValueError – model_or_net is not a MindSpore Model or MindSpore nn.Cell.
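
A minimal construction sketch, not taken from the official examples: the network, input shape, and config values below are illustrative assumptions.

import numpy as np
import mindspore as ms
import mindspore.nn as nn
import mindspore_lite as mslite


class SimpleNet(nn.Cell):
    """Hypothetical single-layer network used only for illustration."""
    def __init__(self):
        super().__init__()
        self.dense = nn.Dense(4, 2)

    def construct(self, x):
        return self.dense(x)


net = SimpleNet()
net_input = ms.Tensor(np.ones((1, 4), dtype=np.float32))

# Context with the CPU target; other targets are configured the same way.
context = mslite.Context()
context.target = ["cpu"]

# config is optional; the "ascend_context"/"rank_table_file" entry mirrors the
# example above and only takes effect on an Ascend backend.
# config = {"ascend_context": {"rank_table_file": "path_b"}}
lite_infer = mslite.LiteInfer(net, net_input, context=context)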

__init__(model_or_net, *net_inputs, context=None, model_group_id=None, config: dict = None)[source]

Methods

__init__(model_or_net, *net_inputs[, ...])

get_inputs()

Obtains all input Tensors of the model.

get_model_info(key)

Obtains model info of the model.

get_outputs()

Obtains all output TensorMeta of the model.

predict(inputs)

Performs inference on the model.

resize(inputs, dims)

Resizes the shapes of inputs.

update_weights(weights)

Updates constant weights of the model nodes.
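
A minimal prediction sketch, continuing the construction example above. Filling input data with set_data_from_numpy, reading results with get_data_to_numpy, and the resize dimensions are assumptions based on the general mindspore_lite Tensor API, not on this page.

import numpy as np

inputs = lite_infer.get_inputs()          # all input Tensors of the model
inputs[0].set_data_from_numpy(np.ones((1, 4), dtype=np.float32))

outputs = lite_infer.predict(inputs)      # run inference
for out in outputs:
    print(out.shape, out.get_data_to_numpy())

# Optionally reshape the inputs before the next prediction, e.g. to batch size 2.
lite_infer.resize(inputs, [[2, 4]])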