{ "cells": [ { "attachments": {}, "cell_type": "markdown", "id": "2579fa00-0dc6-4672-8a96-5a6bb315881f", "metadata": {}, "source": [ "# 编程范式\n", "\n", "[![查看源文件](https://mindspore-website.obs.cn-north-4.myhuaweicloud.com/website-images/r2.0/resource/_static/logo_source.png)](https://gitee.com/mindspore/docs/blob/r2.0/docs/mindspore/source_zh_cn/design/programming_paradigm.ipynb)" ] }, { "cell_type": "markdown", "id": "65f5d9f4-b38f-4c3a-bf28-5a4bc9510b6e", "metadata": {}, "source": [ "编程范式(Programming paradigm)是指编程语言的编程风格或编程方式。通常情况下,AI框架均依赖前端编程接口所使用的编程语言的编程范式进行神经网络的构造和训练。MindSpore作为AI+科学计算融合计算框架,分别面向AI、科学计算场景,提供了面向对象编程和函数式编程的支持。同时为提升框架使用的灵活性和易用性,提出了函数式+面向对象融合编程范式,有效地体现了函数式自动微分机制的优势。\n", "\n", "下面分别介绍MindSpore支持的三类编程范式及其简单示例。" ] }, { "attachments": {}, "cell_type": "markdown", "id": "62dc35ab-780a-484d-a6bb-113b8e5af1d4", "metadata": {}, "source": [ "## 面向对象编程\n", "\n", "面向对象编程(Object-oriented programming, OOP),是指一种将程序分解为封装数据及相关操作的模块(类)而进行的编程方式,对象为类(class)的实例。面向对象编程将对象作为程序的基本单元,将程序和数据封装其中,以提高软件的重用性、灵活性和扩展性,对象里的程序可以访问及经常修改对象相关联的数据。" ] }, { "attachments": {}, "cell_type": "markdown", "id": "d2f9816d-fa7a-4524-accb-4fa3c30a2dc4", "metadata": {}, "source": [ "在一般的编程场景中,代码(code)和数据(data)是两个核心构成部分。面向对象编程是针对特定对象(Object)来设计数据结构,定义类(Class)。类通常由以下两部分构成,分别对应了code和data:\n", "\n", "- 方法(Methods)\n", "- 属性(Attributes)\n", "\n", "对于同一个Class实例化(instantiation)后得到的不同对象而言,方法和属性相同,不同的是属性的值。不同的属性值决定了对象的内部状态,因此OOP能够很好地进行状态管理。\n", "\n", "下面为Python构造简单类的示例:\n", "\n", "```python\n", "class Sample: #class declaration\n", " def __init__(self, name): # class constructor (code)\n", " self.name = name # attribute (data)\n", "\n", " def set_name(self, name): # method declaration (code)\n", " self.name = name # method implementation (code)\n", "```" ] }, { "cell_type": "markdown", "id": "b5584f1a-2bf6-4ea1-a393-99aa015926d8", "metadata": {}, "source": [ "对于构造神经网络来说,首要的组件就是网络层(Layer),一个神经网络层包含以下部分:\n", "\n", "- Tensor操作 (Operation)\n", "- 权重 (Weights)\n", "\n", "此二者恰好与类的Methods和Attributes一一对应,同时权重本身就是神经网络层的内部状态,因此使用类来构造Layer天然符合其定义。此外,我们在编程时希望使用神经网络层进行堆叠,构造深度神经网络,使用OOP编程可以很容易地通过Layer对象组合构造新的Layer类。\n", "\n", "下面为使用MindSpore构造神经网络类的示例:\n", "\n", "```python\n", "from mindspore import nn, Parameter\n", "from mindspore.common.initializer import initializer\n", "\n", "class Linear(nn.Cell):\n", " def __init__(self, in_features, out_features, has_bias): # class constructor (code)\n", " super().__init__()\n", " self.weight = Parameter(initializer('normal', [out_features, in_features], mindspore.float32), 'weight') # layer weight (data)\n", " self.bias = Parameter(initializer('zeros', [out_features], mindspore.float32), 'bias') # layer weight (data)\n", "\n", " def construct(self, inputs): # method declaration (code)\n", " output = ops.matmul(inputs, self.weight.transpose(0, 1)) # tensor transformation (code)\n", " output = output + self.bias # tensor transformation (code)\n", " return output\n", "```" ] }, { "cell_type": "markdown", "id": "72d6cbde", "metadata": {}, "source": [ "除神经网络层的构造使用面向对象编程范式外,MindSpore支持纯面向对象编程方式构造神经网络训练逻辑,此时神经网络的正向计算、反向传播、梯度优化等操作均使用类进行构造。下面是纯面向对象编程的示例:\n", "\n", "```python\n", "import mindspore\n", "import mindspore.nn as nn\n", "from mindspore import value_and_grad\n", "\n", "class TrainOneStepCell(nn.Cell):\n", " def __init__(self, network, optimizer):\n", " super().__init__()\n", " self.network = network\n", " self.optimizer = optimizer\n", " self.grad_fn = value_and_grad(self.network, None, self.optimizer.parameters)\n", "\n", " def construct(self, *inputs):\n", " loss, grads = 
{ "cell_type": "markdown", "id": "eea3afc9-7172-4175-a836-374f97589b71", "metadata": {}, "source": [ "## Functional Programming\n", "\n", "Functional programming is a programming paradigm that treats computation as the evaluation of functions and avoids program state and mutable objects.\n", "\n", "In functional programming, functions are first-class citizens: they can be bound to names (including local identifiers), passed as arguments, and returned from other functions, just like any other data type. This allows programs to be written in a declarative and composable style, in which small functions are combined in a modular way. Functional programming is sometimes treated as a synonym for pure functional programming, a subset of functional programming in which all functions are deterministic mathematical functions, i.e. pure functions. When a pure function is called with given arguments, it always returns the same result and is not affected by any mutable state or other side effects.\n", "\n", "Functional programming has two core characteristics that make it well suited to scientific computing:\n", "\n", "1. The semantics of a programming function are exactly equivalent to those of a mathematical function.\n", "2. Determinism: given the same input, it always returns the same output, with no side effects.\n", "\n", "Because of this determinism and the restriction of side effects, programs have fewer bugs, are easier to debug and test, and are better suited to formal verification.\n", "\n", "MindSpore supports pure functional programming. Combined with the numerical computing interfaces provided by `mindspore.numpy` and `mindspore.scipy`, it enables convenient scientific computing. The following is an example of functional programming:" ] },
{ "cell_type": "markdown", "id": "edf6610b-aa3f-4073-880c-fc88f416f26d", "metadata": {}, "source": [ "```python\n", "import mindspore.numpy as mnp\n", "from mindspore import grad\n", "\n", "grad_tanh = grad(mnp.tanh)\n", "print(grad_tanh(2.0))\n", "# 0.070650816\n", "\n", "print(grad(grad(mnp.tanh))(2.0))\n", "print(grad(grad(grad(mnp.tanh)))(2.0))\n", "# -0.13621868\n", "# 0.25265405\n", "```\n", "\n", "To support the functional programming paradigm, MindSpore provides a variety of function transformation interfaces, covering automatic differentiation, automatic vectorization, automatic parallelism, just-in-time compilation, data sinking, and more. They are briefly introduced below:\n", "\n", "- Automatic differentiation: `grad` and `value_and_grad`, which provide differentiation as a function transformation;\n", "- Automatic vectorization: `vmap`, a higher-order function that maps a function fn along argument axes;\n", "- Automatic parallelism: `shard`, functional operator sharding, which specifies the distribution strategy of a function's input/output Tensors;\n", "- Just-in-time compilation: `jit`, which compiles a Python function into a callable MindSpore graph;\n", "- Data sinking: `data_sink`, which transforms a function into one that can run in data-sinking mode.\n", "\n", "Based on these function transformation interfaces, complex functionality can be implemented quickly and efficiently under the functional programming paradigm; a short sketch of `vmap` and `jit` is given in the next cell." ] },
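{ "cell_type": "markdown", "id": "edf6610b-vmap-jit-sketch", "metadata": {}, "source": [ "The following is a minimal illustrative sketch of the `vmap` and `jit` interfaces listed above; the helper functions `vec_dot` and `scaled_tanh` are made up for illustration, and the commented outputs are the values the code is expected to produce:\n", "\n", "```python\n", "import mindspore.numpy as mnp\n", "from mindspore import vmap, jit\n", "\n", "def vec_dot(a, b):\n", "    return mnp.dot(a, b) # dot product of two 1-D vectors\n", "\n", "# vmap maps vec_dot along the leading (batch) axis of both inputs,\n", "# turning a per-sample function into a batched one without explicit loops.\n", "batched_dot = vmap(vec_dot, in_axes=(0, 0))\n", "\n", "a = mnp.ones((4, 3))\n", "b = mnp.full((4, 3), 2.0)\n", "print(batched_dot(a, b)) # expected: [6. 6. 6. 6.]\n", "\n", "# jit compiles a Python function into a callable MindSpore graph.\n", "@jit\n", "def scaled_tanh(x):\n", "    return 2.0 * mnp.tanh(x)\n", "\n", "print(scaled_tanh(mnp.array([0.0, 1.0])))\n", "```" ] },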
{ "cell_type": "markdown", "id": "680306ff-ecba-4b1f-be44-df6ac1935f91", "metadata": {}, "source": [ "## Functional + Object-Oriented Fusion Programming" ] },
{ "cell_type": "markdown", "id": "d74658ba-41e7-4178-932b-ae8cd888902a", "metadata": {}, "source": [ "Considering the flexibility and ease-of-use requirements of building and training neural network models, and drawing on its own functional automatic differentiation mechanism, MindSpore designed a functional + object-oriented fusion programming paradigm for AI model training. It combines the advantages of object-oriented and functional programming, and uses a single automatic differentiation mechanism to cover both deep learning backpropagation and scientific computing automatic differentiation, so that AI and scientific computing modeling are compatible at the lowest level. The typical workflow of functional + object-oriented fusion programming is as follows:\n", "\n", "1. Build the neural network as a class;\n", "2. Instantiate the neural network object;\n", "3. Construct the forward function, connecting the neural network and the loss function;\n", "4. Use function transformation to obtain the gradient computation (backpropagation) function;\n", "5. Construct the training step function;\n", "6. Call the training step function to train the model.\n", "\n", "The following is a simple example of functional + object-oriented fusion programming:" ] },
{ "cell_type": "markdown", "id": "673cec96-9965-46da-b5a4-2d705ee869e3", "metadata": {}, "source": [ "```python\n", "import mindspore.nn as nn\n", "from mindspore import value_and_grad\n", "\n", "# Class definition\n", "class Net(nn.Cell):\n", "    def __init__(self):\n", "        ......\n", "    def construct(self, inputs):\n", "        ......\n", "\n", "# Object instantiation\n", "net = Net() # network\n", "loss_fn = nn.CrossEntropyLoss() # loss function\n", "optimizer = nn.Adam(net.trainable_params(), lr) # optimizer\n", "\n", "# define forward function\n", "def forward_fn(inputs, targets):\n", "    logits = net(inputs)\n", "    loss = loss_fn(logits, targets)\n", "    return loss, logits\n", "\n", "# get grad function\n", "grad_fn = value_and_grad(forward_fn, None, optimizer.parameters, has_aux=True)\n", "\n", "# define train step function\n", "def train_step(inputs, targets):\n", "    (loss, logits), grads = grad_fn(inputs, targets) # get values and gradients\n", "    optimizer(grads) # update parameters with gradients\n", "    return loss, logits\n", "\n", "for i in range(epochs):\n", "    for inputs, targets in dataset():\n", "        loss, logits = train_step(inputs, targets)\n", "```" ] },
{ "cell_type": "markdown", "id": "49aadacc-d515-47a9-945b-f255e02508ae", "metadata": {}, "source": [ "As the example above shows, object-oriented programming is used when constructing the neural network, so the way network layers are built matches common AI programming habits. For forward computation and backpropagation, MindSpore uses functional programming: the forward computation is constructed as a function, `grad_fn` is then obtained through function transformation, and finally the gradients of the weights are obtained by executing `grad_fn`.\n", "\n", "Functional + object-oriented fusion programming preserves the ease of use of building neural networks while improving the flexibility of training steps such as forward computation and backpropagation; it is the default programming paradigm recommended by MindSpore." ] } ], "metadata": { "kernelspec": { "display_name": "MindSpore", "language": "python", "name": "mindspore" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.5" }, "vscode": { "interpreter": { "hash": "8c9da313289c39257cb28b126d2dadd33153d4da4d524f730c81a4aaccbd2ca7" } } }, "nbformat": 4, "nbformat_minor": 5 }