{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Building Single-Operator and Multi-Layer Networks\n", "\n", "`Ascend` `GPU` `CPU` `Model Development`\n", "\n", "[![Run Online](https://gitee.com/mindspore/docs/raw/r1.6/resource/_static/logo_modelarts.png)](https://authoring-modelarts-cnnorth4.huaweicloud.com/console/lab?share-url-b64=aHR0cHM6Ly9taW5kc3BvcmUtd2Vic2l0ZS5vYnMuY24tbm9ydGgtNC5teWh1YXdlaWNsb3VkLmNvbS9ub3RlYm9vay9tYXN0ZXIvcHJvZ3JhbW1pbmdfZ3VpZGUvemhfY24vbWluZHNwb3JlX2J1aWxkX25ldC5pcHluYg==&imageid=65f636a0-56cf-49df-b941-7d2a07ba8c8c) [![Download Notebook](https://gitee.com/mindspore/docs/raw/r1.6/resource/_static/logo_notebook.png)](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/notebook/r1.6/programming_guide/zh_cn/mindspore_build_net.ipynb) [![Download Sample Code](https://gitee.com/mindspore/docs/raw/r1.6/resource/_static/logo_download_code.png)](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/notebook/r1.6/programming_guide/zh_cn/mindspore_build_net.py) [![View Source](https://gitee.com/mindspore/docs/raw/r1.6/resource/_static/logo_source.png)](https://gitee.com/mindspore/docs/blob/r1.6/docs/mindspore/programming_guide/source_zh_cn/build_net.ipynb)\n", "\n", "## Overview\n", "\n", "MindSpore's Cell class is the base class for building all networks and the basic unit of a network. To define a network, inherit from Cell and override the `__init__` and `construct` methods. MindSpore's ops module provides implementations of basic operators, and the nn module wraps these basic operators further, so users can flexibly pick the operators they need.\n", "\n", "Cell itself has module-management capabilities: one Cell can be composed of multiple Cells, which makes it easy to assemble more complex networks. To better build and manage complex networks, `mindspore.nn` provides containers for managing the submodules or layers in a network, in two forms: CellList and SequentialCell.\n", "\n", "## Running Basic Operators\n", "\n", "Building a network is inseparable from basic operators. The operations module is MindSpore's basic computation unit and wraps operators of different types, for example:\n", "\n", "- array_ops: array operators\n", "\n", "- math_ops: math operators\n", "\n", "- nn_ops: network operators\n", "\n", "> For more on operator usage, see [Operators](https://www.mindspore.cn/docs/programming_guide/zh-CN/r1.6/operators.html).\n", "\n", "Run two basic operators directly, `ops.Mul()` and `ops.Add()`:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "ExecuteTime": { "end_time": "2022-01-04T03:00:13.230180Z", "start_time": "2022-01-04T03:00:11.250401Z" } }, "outputs": [ { "name": 
"stdout", "output_type": "stream", "text": [ "[ 3. 6. 11.]\n" ] } ], "source": [ "import numpy as np\n", "import mindspore\n", "from mindspore import Tensor, ops\n", "\n", "x = Tensor(np.array([1, 2, 3]), mindspore.float32)\n", "y = Tensor(np.array([2, 2, 2]), mindspore.float32)\n", "\n", "mul = ops.Mul()\n", "add = ops.Add()\n", "output = add(mul(x, x), y)\n", "print(output)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Building and Executing a Network with Cell\n", "\n", "### Basic Usage of Cell\n", "\n", "MindSpore provides the Cell class so that users can conveniently define and execute their own networks. By inheriting from nn.Cell, the user declares each layer in the `__init__` constructor and implements the connections between the layers in `construct`, which completes the forward construction of the neural network. Note that `construct` comes with certain restrictions: methods from third-party libraries cannot be used inside it; in general, use MindSpore Tensor and Cell instances.\n", "\n", "Combine simple ops operators into a Cell:" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "ExecuteTime": { "end_time": "2022-01-04T03:00:13.271689Z", "start_time": "2022-01-04T03:00:13.232410Z" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[ 3. 6. 11.]\n" ] } ], "source": [ "import numpy as np\n", "import mindspore\n", "from mindspore import Parameter, ops, Tensor, nn\n", "\n", "class Net(nn.Cell):\n", "    def __init__(self):\n", "        super(Net, self).__init__()\n", "        self.mul = ops.Mul()\n", "        self.add = ops.Add()\n", "        self.weight = Parameter(Tensor(np.array([2, 2, 2]), mindspore.float32))\n", "\n", "    def construct(self, x):\n", "        return self.add(self.mul(x, x), self.weight)\n", "\n", "net = Net()\n", "input = Tensor(np.array([1, 2, 3]))\n", "output = net(input)\n", "print(output)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### nn Wrappers of Basic Operators\n", "\n", "Although the diverse operators in the ops module can basically meet the needs of network construction, MindSpore further wraps complex operators into nn layers to offer more convenient, easy-to-use interfaces for complex deep networks. The nn module includes various layers, loss functions, optimizers, and more for the user's convenience.\n", "\n", "Build a network with Cell based on the layers provided by nn:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "ExecuteTime": { "end_time": "2022-01-04T03:00:13.423711Z", "start_time": "2022-01-04T03:00:13.273285Z" } }, "outputs": [], "source": [ "import numpy as np\n", "import mindspore\n", "from mindspore import Tensor, nn\n", "\n", "class ConvBNReLU(nn.Cell):\n", "    def __init__(self):\n", "        super(ConvBNReLU, self).__init__()\n", "        self.conv = nn.Conv2d(3, 64, 3)\n", "        self.bn = nn.BatchNorm2d(64)\n", "        self.relu = nn.ReLU()\n", "\n", "    def construct(self, x):\n", "        x = self.conv(x)\n", "        x = self.bn(x)\n", "        out = self.relu(x)\n", "        return out\n", "\n", "net = ConvBNReLU()\n", "input = Tensor(np.ones([1, 3, 64, 32]), mindspore.float32)\n", "output = net(input)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### CellList and SequentialCell\n", "\n", "To make it easier to manage and assemble more complex networks, `mindspore.nn` provides containers for managing the submodules or layers in a network, in two forms: CellList and SequentialCell.\n", "\n", "- mindspore.nn.CellList: a list that stores Cells; the stored Cells can be layers or constructed network blocks. CellList supports the append, extend, and insert methods. When executing the network, the Cells can be iterated over with a for loop inside the `construct` method to produce the output.\n", "\n", "- mindspore.nn.SequentialCell: a sequential container that accepts its submodules as a list or an OrderedDict. Unlike CellList, SequentialCell implements the `construct` method internally and can therefore produce the output directly.\n", "\n", "Define and execute a network with CellList that contains, in order, the ConvBNReLU block defined earlier, a Conv2d layer, a BatchNorm2d layer, and a ReLU layer:" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "ExecuteTime": { "end_time": "2022-01-04T03:00:13.571293Z", "start_time": "2022-01-04T03:00:13.425734Z" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "MyNet<\n", "  (build_block): CellList<\n", "    (0): ConvBNReLU<\n", "      (conv): Conv2d\n", "      (bn): BatchNorm2d\n", "      (relu): ReLU<>\n", "      >\n", "    (1): Conv2d\n", "    (2): BatchNorm2d\n", "    (3): ReLU<>\n", "    >\n", "  >\n", "(1, 4, 64, 32)\n" ] } ], "source": [ "import numpy as np\n", "import mindspore\n", "from mindspore import Tensor, nn\n", "\n", "class MyNet(nn.Cell):\n", "\n", "    def __init__(self):\n", "        super(MyNet, self).__init__()\n", "        # Put the ConvBNReLU block defined in the previous step into a list\n", "        layers = [ConvBNReLU()]\n", "        # Manage the network with a CellList\n", "        self.build_block = nn.CellList(layers)\n", "        # Add a Conv2d layer and a ReLU layer with the append method\n", "        self.build_block.append(nn.Conv2d(64, 4, 4))\n", "        self.build_block.append(nn.ReLU())\n", "        # Insert a BatchNorm2d layer between the Conv2d and ReLU layers with the insert method\n", "        self.build_block.insert(-1, nn.BatchNorm2d(4))\n", "\n", "    def construct(self, x):\n", "        # Execute the network with a for loop\n", "        for layer in self.build_block:\n", "            x = layer(x)\n", "        return x\n", "\n", "net = MyNet()\n", "print(net)\n", "\n", "input = Tensor(np.ones([1, 3, 64, 32]), mindspore.float32)\n", "output = net(input)\n", "print(output.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Build a network with SequentialCell, taking a list as input; the network contains, in order, the ConvBNReLU block defined earlier, a Conv2d layer, a BatchNorm2d layer, and a ReLU layer:\n" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "ExecuteTime": { "end_time": "2022-01-04T03:00:13.691370Z", "start_time": "2022-01-04T03:00:13.573311Z" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "MyNet<\n", "  (build_block): SequentialCell<\n", "    (0): ConvBNReLU<\n", "      (conv): Conv2d\n", "      (bn): BatchNorm2d\n", "      (relu): ReLU<>\n", "      >\n", "    (1): Conv2d\n", "    (2): BatchNorm2d\n", "    (3): ReLU<>\n", "    >\n", "  >\n", "(1, 4, 64, 32)\n" ] } ], "source": [ "import numpy as np\n", "import mindspore\n", "from mindspore import Tensor, nn\n", "\n", "class MyNet(nn.Cell):\n", "\n", "    def __init__(self):\n", "        super(MyNet, self).__init__()\n", "        # Put the ConvBNReLU block defined in the previous step into a list\n", "        layers = [ConvBNReLU()]\n", "        # Add layers to the list\n", "        layers.extend([\n", "            nn.Conv2d(64, 4, 4),\n", "            nn.BatchNorm2d(4),\n", "            nn.ReLU()\n", "        ])\n", "        # Manage the network with a SequentialCell\n", "        self.build_block = nn.SequentialCell(layers)\n", "\n", "    def construct(self, x):\n", "        return self.build_block(x)\n", "\n", "net = MyNet()\n", "print(net)\n", "\n", "input = Tensor(np.ones([1, 3, 64, 32]), mindspore.float32)\n", "output = net(input)\n", "print(output.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "SequentialCell also supports an OrderedDict as input:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "ExecuteTime": { "end_time": "2022-01-04T03:00:13.791963Z", "start_time": "2022-01-04T03:00:13.692414Z" } }, "outputs": [ { "name": "stdout", "output_type": "stream", 
"text": [ "MyNet<\n", "  (build_block): SequentialCell<\n", "    (ConvBNReLU): ConvBNReLU<\n", "      (conv): Conv2d\n", "      (bn): BatchNorm2d\n", "      (relu): ReLU<>\n", "      >\n", "    (conv): Conv2d\n", "    (norm): BatchNorm2d\n", "    (relu): ReLU<>\n", "    >\n", "  >\n", "(1, 4, 64, 32)\n" ] } ], "source": [ "import numpy as np\n", "import mindspore\n", "from mindspore import Tensor, nn\n", "from collections import OrderedDict\n", "\n", "class MyNet(nn.Cell):\n", "\n", "    def __init__(self):\n", "        super(MyNet, self).__init__()\n", "        layers = OrderedDict()\n", "        # Add the Cells to the dict\n", "        layers[\"ConvBNReLU\"] = ConvBNReLU()\n", "        layers[\"conv\"] = nn.Conv2d(64, 4, 4)\n", "        layers[\"norm\"] = nn.BatchNorm2d(4)\n", "        layers[\"relu\"] = nn.ReLU()\n", "        # Manage the network with a SequentialCell\n", "        self.build_block = nn.SequentialCell(layers)\n", "\n", "    def construct(self, x):\n", "        return self.build_block(x)\n", "\n", "net = MyNet()\n", "print(net)\n", "\n", "input = Tensor(np.ones([1, 3, 64, 32]), mindspore.float32)\n", "output = net(input)\n", "print(output.shape)" ] } ], "metadata": { "kernelspec": { "display_name": "MindSpore", "language": "python", "name": "mindspore" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.5" } }, "nbformat": 4, "nbformat_minor": 4 }