[{"data":1,"prerenderedAt":601},["ShallowReactive",2],{"content-query-DOikUiweCX":3},{"_path":4,"_dir":5,"_draft":6,"_partial":6,"_locale":7,"title":8,"description":9,"date":10,"cover":11,"type":12,"body":13,"_type":595,"_id":596,"_source":597,"_file":598,"_stem":599,"_extension":600},"/technology-blogs/en/3134","en",false,"","Implementation of MindSpore-Powered Models in CV (1) — Basic and Advanced MindSpore APIs","This experiment introduces the data structures and data types of MindSpore and the basic modules used by MindSpore to set up a neural network, such as dataset loading, neural network setup, and model training and evaluation.","2024-03-25","https://obs-mindspore-file.obs.cn-north-4.myhuaweicloud.com/file/2024/05/31/be365f8c81d54d3db7d5513c5d5c6d6e.png","technology-blogs",{"type":14,"children":15,"toc":592},"root",[16,24,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,120,125,130,185,190,198,203,208,213,218,226,231,241,249,254,259,264,272,277,282,287,295,300,305,310,320,330,347,355,360,365,373,378,383,391,396,401,419,424,432,437,442,447,455,460,465,475,485,495,505,513,518,523,533,543,554,559,564,574,582,587],{"type":17,"tag":18,"props":19,"children":21},"element","h1",{"id":20},"implementation-of-mindspore-powered-models-in-cv-1-basic-and-advanced-mindspore-apis",[22],{"type":23,"value":8},"text",{"type":17,"tag":25,"props":26,"children":27},"p",{},[28],{"type":23,"value":29},"1. Introduction",{"type":17,"tag":25,"props":31,"children":32},{},[33],{"type":23,"value":34},"1.1 About This Experiment",{"type":17,"tag":25,"props":36,"children":37},{},[38],{"type":23,"value":39},"This experiment introduces the data structures and data types of MindSpore and the basic modules used by MindSpore to set up a neural network, such as dataset loading, neural network setup, and model training and evaluation. 
It aims to help trainees get familiar with the basic usage of MindSpore and master the basic development process of MindSpore.",{"type":17,"tag":25,"props":41,"children":42},{},[43],{"type":23,"value":44},"1.2 Objective",{"type":17,"tag":25,"props":46,"children":47},{},[48],{"type":23,"value":49},"● Understand the basic development process of MindSpore.",{"type":17,"tag":25,"props":51,"children":52},{},[53],{"type":23,"value":54},"● Understand the functions of the MindSpore basic modules.",{"type":17,"tag":25,"props":56,"children":57},{},[58],{"type":23,"value":59},"● Master the basic operations of MindSpore.",{"type":17,"tag":25,"props":61,"children":62},{},[63],{"type":23,"value":64},"1.3 Background Knowledge",{"type":17,"tag":25,"props":66,"children":67},{},[68],{"type":23,"value":69},"Knowledge of neural networks, perceptrons, multi-layer perceptrons, forward propagation, backward propagation, activation functions, loss functions, optimizers, validation methods.",{"type":17,"tag":25,"props":71,"children":72},{},[73],{"type":23,"value":74},"1.4 Experiment Design",{"type":17,"tag":25,"props":76,"children":77},{},[78],{"type":23,"value":79},"Tensor and data type, dataset loading, fully-connected network setup, model training, and model evaluation.",{"type":17,"tag":25,"props":81,"children":82},{},[83],{"type":23,"value":84},"2 Experiment Process",{"type":17,"tag":25,"props":86,"children":87},{},[88],{"type":23,"value":89},"2.1 Tensor and Data Type",{"type":17,"tag":25,"props":91,"children":92},{},[93],{"type":23,"value":94},"Tensor is a basic data structure in the MindSpore network computing. Tensors of different dimensions represent different data. 
For example, a 0-dimensional tensor represents a scalar, a 1-dimensional tensor represents a vector, a 2-dimensional tensor represents a matrix, and a 3-dimensional tensor can represent the three channels of an RGB image.",{"type":17,"tag":25,"props":96,"children":97},{},[98],{"type":23,"value":99},"MindSpore tensors support different data types, including int8, int16, int32, int64, uint8, uint16, uint32, uint64, float16, float32, float64, and bool_, which correspond to NumPy data types. During computation, MindSpore converts the Python int type to mindspore.int64 and the Python float type to mindspore.float32.",{"type":17,"tag":25,"props":101,"children":102},{},[103],{"type":23,"value":104},"Step 1 Specify the MindSpore data type.",{"type":17,"tag":25,"props":106,"children":107},{},[108],{"type":23,"value":109},"Import MindSpore and set the cell of Jupyter Notebook to output multiple lines at the same time.",{"type":17,"tag":111,"props":112,"children":114},"pre",{"code":113},"# Import MindSpore.\nimport mindspore\nfrom mindspore import dtype\nfrom mindspore import Tensor\n\n# Make the cell output multiple lines at the same time.\nfrom IPython.core.interactiveshell import InteractiveShell\nInteractiveShell.ast_node_interactivity = \"all\"\n\n# Specify the data type.\na = 1\ntype(a)\nb = Tensor(a, dtype.float64)\nb.dtype\n",[115],{"type":17,"tag":116,"props":117,"children":118},"code",{"__ignoreMap":7},[119],{"type":23,"value":113},{"type":17,"tag":25,"props":121,"children":122},{},[123],{"type":23,"value":124},"Step 2 Construct tensors.",{"type":17,"tag":25,"props":126,"children":127},{},[128],{"type":23,"value":129},"During tensor construction, the tensor, float, int, Boolean, tuple, list, and NumPy.array types can be input. 
The tuple and list can store only data of the float, int, and Boolean types.",{"type":17,"tag":25,"props":131,"children":132},{},[133,135,141,143,148,150,155,157,162,164,169,171,176,178,183],{"type":23,"value":134},"The data type can be specified during tensor initialization. However, if the data type is not specified, the initial values ",{"type":17,"tag":136,"props":137,"children":138},"strong",{},[139],{"type":23,"value":140},"int",{"type":23,"value":142},", ",{"type":17,"tag":136,"props":144,"children":145},{},[146],{"type":23,"value":147},"float",{"type":23,"value":149},", and ",{"type":17,"tag":136,"props":151,"children":152},{},[153],{"type":23,"value":154},"bool",{"type":23,"value":156}," respectively generate 0-dimensional tensors of the mindspore.int32, mindspore.float32, and mindspore.bool_ data types. The data types of the 1-dimensional tensors generated from the initial values ",{"type":17,"tag":136,"props":158,"children":159},{},[160],{"type":23,"value":161},"tuple",{"type":23,"value":163}," and ",{"type":17,"tag":136,"props":165,"children":166},{},[167],{"type":23,"value":168},"list",{"type":23,"value":170}," correspond to those of the elements stored in the tuple or list. If multiple data types are contained, the MindSpore data type corresponding to the highest-priority type is selected (Boolean \u003C int \u003C float). If the initial value is ",{"type":17,"tag":136,"props":172,"children":173},{},[174],{"type":23,"value":175},"Tensor",{"type":23,"value":177},", the generated tensor keeps the data type of the input tensor. 
If the initial value is ",{"type":17,"tag":136,"props":179,"children":180},{},[181],{"type":23,"value":182},"NumPy.array",{"type":23,"value":184},", the generated tensor data type corresponds to that of the NumPy.array.",{"type":17,"tag":25,"props":186,"children":187},{},[188],{"type":23,"value":189},"Create tensors from an array, a number, a Boolean, a tuple, a list, and a constant:",{"type":17,"tag":111,"props":191,"children":193},{"code":192},"import numpy as np\nfrom mindspore import Tensor\nfrom mindspore import dtype\n\n# Use an array to create a tensor.\nx = Tensor(np.array([[1, 2], [3, 4]]), dtype.int32)\nx\n\n# Use a number to create tensors.\ny = Tensor(1.0, dtype.int32)\nz = Tensor(2, dtype.int32)\ny\nz\n\n# Use a Boolean to create a tensor.\nm = Tensor(True, dtype.bool_)\nm\n\n# Use a tuple to create a tensor.\nn = Tensor((1, 2, 3), dtype.int16)\nn\n\n# Use a list to create a tensor.\np = Tensor([4.0, 5.0, 6.0], dtype.float64)\np\n\n# Use a constant to create a tensor.\nq = Tensor(1, dtype.float64)\nq\n",[194],{"type":17,"tag":116,"props":195,"children":196},{"__ignoreMap":7},[197],{"type":23,"value":192},{"type":17,"tag":25,"props":199,"children":200},{},[201],{"type":23,"value":202},"Step 3 Specify attributes of a tensor.",{"type":17,"tag":25,"props":204,"children":205},{},[206],{"type":23,"value":207},"Tensor attributes include shape and data type (dtype).",{"type":17,"tag":25,"props":209,"children":210},{},[211],{"type":23,"value":212},"Shape: a tuple",{"type":17,"tag":25,"props":214,"children":215},{},[216],{"type":23,"value":217},"Data type: a data type of MindSpore",{"type":17,"tag":111,"props":219,"children":221},{"code":220},"x = Tensor(np.array([[1, 2], [3, 4]]), dtype.int32)\nx_shape = x.shape # Shape\nx_dtype = x.dtype # Data type\n\nx_shape\nx_dtype\n\nx = Tensor(np.array([[1, 2], [3, 4]]), dtype.int32)\n\nx.shape # Shape\nx.dtype # Data type\nx.ndim  # Dimension\nx.size  # 
Size\n",[222],{"type":17,"tag":116,"props":223,"children":224},{"__ignoreMap":7},[225],{"type":23,"value":220},{"type":17,"tag":25,"props":227,"children":228},{},[229],{"type":23,"value":230},"Step 4 Convert tensors.",{"type":17,"tag":25,"props":232,"children":233},{},[234,239],{"type":17,"tag":136,"props":235,"children":236},{},[237],{"type":23,"value":238},"asnumpy()",{"type":23,"value":240},": converts a tensor to a NumPy array.",{"type":17,"tag":111,"props":242,"children":244},{"code":243},"y = Tensor(np.array([[True, True], [False, False]]), dtype.bool_)\n\n# Convert the tensor to a NumPy array.\ny_array = y.asnumpy()\n\ny\ny_array\n",[245],{"type":17,"tag":116,"props":246,"children":247},{"__ignoreMap":7},[248],{"type":23,"value":243},{"type":17,"tag":25,"props":250,"children":251},{},[252],{"type":23,"value":253},"2.2 Dataset Loading",{"type":17,"tag":25,"props":255,"children":256},{},[257],{"type":23,"value":258},"mindspore.dataset provides APIs to load and process datasets, such as MNIST, CIFAR-10, CIFAR-100, VOC, ImageNet, and CelebA.",{"type":17,"tag":25,"props":260,"children":261},{},[262],{"type":23,"value":263},"Step 1 Load the MNIST dataset.",{"type":17,"tag":111,"props":265,"children":267},{"code":266},"# mindspore.dataset.MnistDataset\nimport mindspore.dataset as ds\nimport matplotlib.pyplot as plt\n\ndataset_dir = \"./data/train\"  # Dataset path\n\n# Read three images from the MNIST dataset.\nmnist_dataset = ds.MnistDataset(dataset_dir=dataset_dir, num_samples=3)\n\n# Set the figure size.\nplt.figure(figsize=(8, 8))\ni = 1\n\n# Plot the three images as subplots.\nfor dic in mnist_dataset.create_dict_iterator(output_numpy=True):\n    plt.subplot(3, 3, i)\n    plt.imshow(dic['image'][:,:,0])\n    plt.axis('off')\n    i += 1\n\nplt.show()\n",[268],{"type":17,"tag":116,"props":269,"children":270},{"__ignoreMap":7},[271],{"type":23,"value":266},{"type":17,"tag":25,"props":273,"children":274},{},[275],{"type":23,"value":276},"MindSpore can load datasets 
in different data storage formats. You can directly use the corresponding classes in mindspore.dataset to load data files from disk.",{"type":17,"tag":25,"props":278,"children":279},{},[280],{"type":23,"value":281},"Step 2 Load the NumPy dataset.",{"type":17,"tag":25,"props":283,"children":284},{},[285],{"type":23,"value":286},"mindspore.dataset.NumpySlicesDataset",{"type":17,"tag":111,"props":288,"children":290},{"code":289},"import mindspore.dataset as ds\n\ndata = ds.NumpySlicesDataset([1, 2, 3], column_names=[\"col_1\"])\nfor x in data.create_dict_iterator():\n    print(x)\n",[291],{"type":17,"tag":116,"props":292,"children":293},{"__ignoreMap":7},[294],{"type":23,"value":289},{"type":17,"tag":25,"props":296,"children":297},{},[298],{"type":23,"value":299},"2.3 Fully-Connected Network Setup",{"type":17,"tag":25,"props":301,"children":302},{},[303],{"type":23,"value":304},"Step 1 Set up a fully-connected neural network.",{"type":17,"tag":25,"props":306,"children":307},{},[308],{"type":23,"value":309},"Fully-connected layer: mindspore.nn.Dense",{"type":17,"tag":25,"props":311,"children":312},{},[313,318],{"type":17,"tag":136,"props":314,"children":315},{},[316],{"type":23,"value":317},"in_channels",{"type":23,"value":319},": number of input channels",{"type":17,"tag":25,"props":321,"children":322},{},[323,328],{"type":17,"tag":136,"props":324,"children":325},{},[326],{"type":23,"value":327},"out_channels",{"type":23,"value":329},": number of output channels",{"type":17,"tag":25,"props":331,"children":332},{},[333,338,340,345],{"type":17,"tag":136,"props":334,"children":335},{},[336],{"type":23,"value":337},"weight_init",{"type":23,"value":339},": weight initialization. 
Default value: ",{"type":17,"tag":136,"props":341,"children":342},{},[343],{"type":23,"value":344},"'normal'",{"type":23,"value":346},".",{"type":17,"tag":111,"props":348,"children":350},{"code":349},"import numpy as np\nimport mindspore\nimport mindspore.nn as nn\nfrom mindspore import Tensor\n\n# Construct the input tensor.\ninput_x = Tensor(np.array([[1, 1, 1], [2, 2, 2]]), mindspore.float32)\nprint(input_x)\n\n# Construct a fully-connected network. Set both in_channels and out_channels to 3.\nnet = nn.Dense(in_channels=3, out_channels=3, weight_init=1)\noutput = net(input_x)\nprint(output)\n",[351],{"type":17,"tag":116,"props":352,"children":353},{"__ignoreMap":7},[354],{"type":23,"value":349},{"type":17,"tag":25,"props":356,"children":357},{},[358],{"type":23,"value":359},"Step 2 Construct an activation function.",{"type":17,"tag":25,"props":361,"children":362},{},[363],{"type":23,"value":364},"Rectified linear unit (ReLU) activation function: mindspore.nn.ReLU",{"type":17,"tag":111,"props":366,"children":368},{"code":367},"input_x = Tensor(np.array([-1, 2, -3, 2, -1]), mindspore.float16)\n\nrelu = nn.ReLU()\noutput = relu(input_x)\nprint(output)\n",[369],{"type":17,"tag":116,"props":370,"children":371},{"__ignoreMap":7},[372],{"type":23,"value":367},{"type":17,"tag":25,"props":374,"children":375},{},[376],{"type":23,"value":377},"Step 3 Build a model.",{"type":17,"tag":25,"props":379,"children":380},{},[381],{"type":23,"value":382},"Base class of all neural networks: mindspore.nn.Cell",{"type":17,"tag":111,"props":384,"children":386},{"code":385},"import mindspore.nn as nn\n\nclass MyCell(nn.Cell):\n\n    # Define operators.\n    def __init__(self):\n        super(MyCell, self).__init__()\n\n        # Fully-connected layer (3 input and 3 output channels as an example)\n        self.fc = nn.Dense(in_channels=3, out_channels=3)\n\n        # Activation function\n        self.relu = nn.ReLU()\n\n    # Build the network.\n    def construct(self, x):\n        x = self.fc(x)\n        x = self.relu(x)\n        return 
x\n",[387],{"type":17,"tag":116,"props":388,"children":389},{"__ignoreMap":7},[390],{"type":23,"value":385},{"type":17,"tag":25,"props":392,"children":393},{},[394],{"type":23,"value":395},"2.4 Model Training and Evaluation",{"type":17,"tag":25,"props":397,"children":398},{},[399],{"type":23,"value":400},"Step 1 Define a loss function.",{"type":17,"tag":25,"props":402,"children":403},{},[404,406,411,413,418],{"type":23,"value":405},"The cross-entropy loss function is used for classification models. If the label data is not encoded in one-hot mode, set ",{"type":17,"tag":136,"props":407,"children":408},{},[409],{"type":23,"value":410},"sparse",{"type":23,"value":412}," to ",{"type":17,"tag":136,"props":414,"children":415},{},[416],{"type":23,"value":417},"True",{"type":23,"value":346},{"type":17,"tag":25,"props":420,"children":421},{},[422],{"type":23,"value":423},"mindspore.nn.SoftmaxCrossEntropyWithLogits",{"type":17,"tag":111,"props":425,"children":427},{"code":426},"import numpy as np\nimport mindspore\nimport mindspore.nn as nn\nfrom mindspore import Tensor\n\n# Cross-entropy loss function\nloss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)\n\nnp.random.seed(0)\nlogits = Tensor(np.random.randint(0, 9, [1, 10]), mindspore.float32)\nprint(logits)\n\nlabels_np = np.ones([1,]).astype(np.int32)\nlabels = Tensor(labels_np)\nprint(labels)\n\noutput = loss(logits, labels)\nprint(output)\n",[428],{"type":17,"tag":116,"props":429,"children":430},{"__ignoreMap":7},[431],{"type":23,"value":426},{"type":17,"tag":25,"props":433,"children":434},{},[435],{"type":23,"value":436},"Step 2 Define an optimizer.",{"type":17,"tag":25,"props":438,"children":439},{},[440],{"type":23,"value":441},"Common deep learning optimizers include SGD, Adam, FTRL, LazyAdam, Momentum, RMSProp, LARS, ProximalAdagrad, and LAMB.",{"type":17,"tag":25,"props":443,"children":444},{},[445],{"type":23,"value":446},"Momentum optimizer: mindspore.nn.Momentum",{"type":17,"tag":111,"props":448,"children":450},{"code":449},"# optim = nn.Momentum(params, 
learning_rate=0.1, momentum=0.9, weight_decay=0.0) # params: the parameters to update, e.g., net.trainable_params().\n",[451],{"type":17,"tag":116,"props":452,"children":453},{"__ignoreMap":7},[454],{"type":23,"value":449},{"type":17,"tag":25,"props":456,"children":457},{},[458],{"type":23,"value":459},"Step 3 Build the model.",{"type":17,"tag":25,"props":461,"children":462},{},[463],{"type":23,"value":464},"mindspore.Model",{"type":17,"tag":25,"props":466,"children":467},{},[468,473],{"type":17,"tag":136,"props":469,"children":470},{},[471],{"type":23,"value":472},"network",{"type":23,"value":474},": neural network",{"type":17,"tag":25,"props":476,"children":477},{},[478,483],{"type":17,"tag":136,"props":479,"children":480},{},[481],{"type":23,"value":482},"loss_fn",{"type":23,"value":484},": loss function",{"type":17,"tag":25,"props":486,"children":487},{},[488,493],{"type":17,"tag":136,"props":489,"children":490},{},[491],{"type":23,"value":492},"optimizer",{"type":23,"value":494},": optimizer",{"type":17,"tag":25,"props":496,"children":497},{},[498,503],{"type":17,"tag":136,"props":499,"children":500},{},[501],{"type":23,"value":502},"metrics",{"type":23,"value":504},": evaluation metrics",{"type":17,"tag":111,"props":506,"children":508},{"code":507},"from mindspore import Model\nimport mindspore.nn as nn\n\n# Define the neural network.\nnet = nn.Dense(in_channels=3, out_channels=3, weight_init=1)\n# Define the loss function.\nloss = nn.SoftmaxCrossEntropyWithLogits()\n# Define the optimizer.\noptim = nn.Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)\n# Build the model.\nmodel = Model(network=net, loss_fn=loss, optimizer=optim, metrics=None)\n",[509],{"type":17,"tag":116,"props":510,"children":511},{"__ignoreMap":7},[512],{"type":23,"value":507},{"type":17,"tag":25,"props":514,"children":515},{},[516],{"type":23,"value":517},"Step 4 Train the 
model.",{"type":17,"tag":25,"props":519,"children":520},{},[521],{"type":23,"value":522},"model.train",{"type":17,"tag":25,"props":524,"children":525},{},[526,531],{"type":17,"tag":136,"props":527,"children":528},{},[529],{"type":23,"value":530},"epoch",{"type":23,"value":532},": number of training epochs",{"type":17,"tag":25,"props":534,"children":535},{},[536,541],{"type":17,"tag":136,"props":537,"children":538},{},[539],{"type":23,"value":540},"train_dataset",{"type":23,"value":542},": training dataset",{"type":17,"tag":25,"props":544,"children":545},{},[546,548,552],{"type":23,"value":547},"# model.train(epoch=10, train_dataset=train_dataset) # ",{"type":17,"tag":136,"props":549,"children":550},{},[551],{"type":23,"value":540},{"type":23,"value":553}," is the passed parameter.",{"type":17,"tag":25,"props":555,"children":556},{},[557],{"type":23,"value":558},"Step 5 Evaluate the model.",{"type":17,"tag":25,"props":560,"children":561},{},[562],{"type":23,"value":563},"model.eval",{"type":17,"tag":25,"props":565,"children":566},{},[567,572],{"type":17,"tag":136,"props":568,"children":569},{},[570],{"type":23,"value":571},"valid_dataset",{"type":23,"value":573},": validation dataset",{"type":17,"tag":111,"props":575,"children":577},{"code":576},"# model.eval(valid_dataset=test_dataset) # test_dataset is the passed parameter.\n",[578],{"type":17,"tag":116,"props":579,"children":580},{"__ignoreMap":7},[581],{"type":23,"value":576},{"type":17,"tag":25,"props":583,"children":584},{},[585],{"type":23,"value":586},"3 Experiment Summary",{"type":17,"tag":25,"props":588,"children":589},{},[590],{"type":23,"value":591},"This experiment introduces the data structures and types of MindSpore and the basic modules it uses to set up a neural network. 
It aims to help trainees learn how to load datasets, set up neural networks, train and evaluate models, become familiar with the basic usage of MindSpore, and master its basic development process.",{"title":7,"searchDepth":593,"depth":593,"links":594},4,[],"markdown","content:technology-blogs:en:3134.md","content","technology-blogs/en/3134.md","technology-blogs/en/3134","md",1776506110636]