[{"data":1,"prerenderedAt":641},["ShallowReactive",2],{"content-query-bjM2OBAW1O":3},{"_path":4,"_dir":5,"_draft":6,"_partial":6,"_locale":7,"title":8,"description":9,"date":10,"cover":11,"type":12,"category":13,"body":14,"_type":635,"_id":636,"_source":637,"_file":638,"_stem":639,"_extension":640},"/technology-blogs/zh/622","zh",false,"","【跟着小Mi一起机器学习吧！】逻辑回归（一）","什么是机器学习，什么是机器学习，如果你想知道什么是机器学习，那么小Mi带你一起研究！","2021-06-22","https://obs-mindspore-file.obs.cn-north-4.myhuaweicloud.com/file/2021/06/22/94c2f3394942423aac7bf4ec37d9323e.png","technology-blogs","基础知识",{"type":15,"children":16,"toc":632},"root",[17,25,31,37,49,60,67,72,77,88,93,99,146,153,164,174,191,200,259,264,270,275,291,298,361,471,482,487,494,593,604,611,627],{"type":18,"tag":19,"props":20,"children":22},"element","h1",{"id":21},"跟着小mi一起机器学习吧逻辑回归一",[23],{"type":24,"value":8},"text",{"type":18,"tag":26,"props":27,"children":28},"p",{},[29],{"type":24,"value":30},"小Mi学习，向上积极！在前面几周的学习中，小Mi终于带着大家完完整整学完了线性回归，同时小Mi也收到了大家的很多反馈，在后续的学习中，小Mi会一一改进的！今天我们就开启新的章节学习—logistic回归(Logistic Regression) 算法吧（冲鸭）！",{"type":18,"tag":19,"props":32,"children":34},{"id":33},"_1-分类问题",[35],{"type":24,"value":36},"1 分类问题",{"type":18,"tag":26,"props":38,"children":39},{},[40,42,47],{"type":24,"value":41},"在logistic回归算法中，我们遇到的通常是分类问题：所需预测的变量y是一个离散值。在介绍篇中，小Mi就已经带着大家稍微了解了哪些问题是分类问题，比如垃圾邮件分类问题，还有对肿瘤进行分类的例子，确定其是恶性肿瘤还是良性肿瘤。在所有这些问题中，我们尝试预测的变量y都只有两个取值，0或1，垃圾邮件或非垃圾邮件，恶性或良性。我们将因变量(dependent variable)可能属于的两个类分别称为负向类（negative class）和正向类（positive class），则因变量",{"type":18,"tag":43,"props":44,"children":46},"img",{"alt":7,"src":45},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/103537jcrmo35zi6rcg9i9.png",[],{"type":24,"value":48}," ，其中0表示负向类，1 
表示正向类。两个类别中，例如垃圾邮件或非垃圾邮件，到底哪个是正类/1，哪个是负类/0，这都是任意的，并没有什么区别。但通常来说，我们默认负类表示没有某样东西，例如没有恶性肿瘤，而1即正类，表示具有我们要寻找的东西。当然啦，这都不重要，重要的是如何解决分类问题，对不对！当然还会有多分类问题，例如变量y可以选取0，1，2，3这几个值，这就是多分类问题，现在我们就先从二分类问题开始了解吧！",{"type":18,"tag":26,"props":50,"children":51},{},[52,54,58],{"type":24,"value":53},"说到这小Mi不由得灵机一动，话说之前学习的线性回归是否适用于分类问题呢？废话不多说，我们一起来验证一下！假设下图中的数据集，是对肿瘤进行恶性或良性分类，得到的数据只有两个值，0/否或者1/是，根据线性回归算法用直线对数据进行拟合。最终得到的假设为：",{"type":18,"tag":43,"props":55,"children":57},{"alt":7,"src":56},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/103634ctn6tteuv76xzke7.png",[],{"type":24,"value":59},"，而分类器输出的阈值设为0.5，即纵坐标值如果大于等于0.5，预测y为1；如果小于0.5，预测y为0。",{"type":18,"tag":26,"props":61,"children":62},{},[63],{"type":18,"tag":43,"props":64,"children":66},{"alt":7,"src":65},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/103721rtj7viuv5etbxeel.png",[],{"type":18,"tag":26,"props":68,"children":69},{},[70],{"type":24,"value":71},"在这个例子中，似乎线性回归拟合的效果还不错，但是，尝试增加些难度，假如我们还有另一个训练样本，位于最右端，如果依旧运行线性回归，我们会得到另一条蓝色的直线。",{"type":18,"tag":26,"props":73,"children":74},{},[75],{"type":24,"value":76},"如果还跟之前一样将阈值设为0.5，这时候就能明显看出来线性回归的效果相当差劲了。毫无疑问，学习算法会将这个样本判定为恶性，但加了这个样本后，线性回归对数据的拟合直线从红色变成了蓝色这条直线，成了一个更坏的假设。因此，把线性回归应用于分类问题，通常并不是一个好主意。在这个例子中，在额外加这个样本之前，线性回归运气很好，我们得到了一个效果不错的假设。但对数据集进行线性回归，只是偶尔效果会很好，所以并不推荐将线性回归用于分类问题。",{"type":18,"tag":26,"props":78,"children":79},{},[80,82,86],{"type":24,"value":81},"总的来说，如果我们要用线性回归算法来解决一个分类问题，首先分类问题中，取值只有0或者1，但如果假设函数的输出值远大于1，或者远小于0，就无法解决这种情况了。所以我们的逻辑回归算法就闪亮登场啦！这个算法的性质是：它的输出值永远在0到1之间，即",{"type":18,"tag":43,"props":83,"children":85},{"alt":7,"src":84},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/103802fbkvhw3c0cshdon3.png",[],{"type":24,"value":87},"。",{"type":18,"tag":26,"props":89,"children":90},{},[91],{"type":24,"value":92},"BTW，我们把logistic回归视为一种分类算法，因为名字中有“回归”，有些时候可能会让你产生误解，但logistic回归实际上是一种分类算法，适用于标签y取离散值0或1的情况，如：1 0 0 
1。",{"type":18,"tag":19,"props":94,"children":96},{"id":95},"_2-假说表示",[97],{"type":24,"value":98},"2 假说表示",{"type":18,"tag":26,"props":100,"children":101},{},[102,104,108,110,114,116,120,122,126,128,132,134,138,140,144],{"type":24,"value":103},"那么在logistic回归中，我们要使用哪个方程来表示我们的假设呢？此前，我们提到希望我们分类器的输出值在0和1之间，另外，当我们使用线性回归的时候，假设形式为",{"type":18,"tag":43,"props":105,"children":107},{"alt":7,"src":106},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/103845fmsuc4ua7giylwak.png",[],{"type":24,"value":109},"，现在我们可以稍作修改，假设",{"type":18,"tag":43,"props":111,"children":113},{"alt":7,"src":112},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/103905obmkh4zye0rxwwi2.png",[],{"type":24,"value":115},"，sigmoid/logistic函数",{"type":18,"tag":43,"props":117,"children":119},{"alt":7,"src":118},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/104045vzndiqrzhsnqbw9v.png",[],{"type":24,"value":121},"，z是一个实数，将这两个函数结合起来，得出的结果为：",{"type":18,"tag":43,"props":123,"children":125},{"alt":7,"src":124},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/104152vdmoddzt1n3efiyi.png",[],{"type":24,"value":127},"，",{"type":18,"tag":43,"props":129,"children":131},{"alt":7,"src":130},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/104225j4wvlnpid2liik2v.png",[],{"type":24,"value":133},"的图像如下图所示，介于0~1之间。现在需要做的就是用参数",{"type":18,"tag":43,"props":135,"children":137},{"alt":7,"src":136},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/104307uzwrrv7zacunar62.png",[],{"type":24,"value":139},"来拟合我们的数据。因此，拿到一个数据集，我们需要给参数",{"type":18,"tag":43,"props":141,"children":143},{"alt":7,"src":142},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/104326idl3nrfobr5jtkya.png",[],{"type":24,"value":145},"选定一个值，假设函数会帮助我们做出预测。",{"type":18,"tag":26,"props":147,"children":148},{},[149],{"type":18,"tag":43,"props":150,"children":152},{"alt":7,"src":151},"https://bbs-img.huaweiclou
d.com/data/forums/attachment/forum/202106/22/104422y8jkokqdjwxlqzdx.png",[],{"type":18,"tag":26,"props":154,"children":155},{},[156,162],{"type":18,"tag":157,"props":158,"children":159},"strong",{},[160],{"type":24,"value":161},"python",{"type":24,"value":163},"代码实现：",{"type":18,"tag":165,"props":166,"children":168},"pre",{"code":167},"import numpy as np\n\ndef sigmoid(z):\n    # sigmoid/logistic函数：将任意实数z映射到(0, 1)区间\n    return 1 / (1 + np.exp(-z))\n\n# 例如：sigmoid(0) = 0.5\n",[169],{"type":18,"tag":170,"props":171,"children":172},"code",{"__ignoreMap":7},[173],{"type":24,"value":167},{"type":18,"tag":26,"props":175,"children":176},{},[177,179,183,185,189],{"type":24,"value":178},"在这里，假设函数",{"type":18,"tag":43,"props":180,"children":182},{"alt":7,"src":181},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/104528xxszh3p7usy9tkmj.png",[],{"type":24,"value":184},"的输出，假若输出某个数字，可以把这个数字当作对于输入",{"type":18,"tag":43,"props":186,"children":188},{"alt":7,"src":187},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/1045575fl8jcji7fmkqckt.png",[],{"type":24,"value":190},"，y=1的概率估计。",{"type":18,"tag":26,"props":192,"children":193},{},[194,196],{"type":24,"value":195},"举例：",{"type":18,"tag":43,"props":197,"children":199},{"alt":7,"src":198},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/105508fabxemnran2gfosy.png",[],{"type":18,"tag":26,"props":201,"children":202},{},[203,205,208,210,214,216,219,221,224,226,230,232,235,237,241,243,247,249,252,254,257],{"type":24,"value":204},"我们使用肿瘤分类的例子，可能有一个特征向量",{"type":18,"tag":43,"props":206,"children":207},{"alt":7,"src":187},[],{"type":24,"value":209},"，同样，",{"type":18,"tag":43,"props":211,"children":213},{"alt":7,"src":212},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/105555wqfgstvno5ibuujm.png",[],{"type":24,"value":215},"，然后我们的一个特征是肿瘤的大小，假设有一个病人，他的肿瘤大小确定，把他的特征向量",{"type":18,"tag":43,"props":217,"children":218},{"alt":7,"src":187},[],{"type":24,"value":220},"带入我们的假设中，并且假设输出为0.7，说明对于一个特征为",{"type":18,"t
ag":43,"props":222,"children":223},{"alt":7,"src":187},[],{"type":24,"value":225},"的患者，y=1的概率是0.7，也就是说，该病人有70%的可能性是恶性肿瘤，可以更加正式地写成数学表达式，假设函数的输出等于",{"type":18,"tag":43,"props":227,"children":229},{"alt":7,"src":228},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/105648mj6fthliplmhcbzu.png",[],{"type":24,"value":231},"（知识点：表示在给定x的条件下y=1的概率，这里病人的特征x也就是肿瘤的大小，这个概率的参数是",{"type":18,"tag":43,"props":233,"children":234},{"alt":7,"src":142},[],{"type":24,"value":236},"），所以基本上是依赖假设函数来估计y=1的概率，因为这是一个分类任务，y必须是0或1，无论是在训练集中还是未来可能走进医生办公室的新患者，同时也可以计算y=0的概率，另外，还有个表达式需要了解一下：",{"type":18,"tag":43,"props":238,"children":240},{"alt":7,"src":239},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/105817ffcuexoylkdrl1s8.png",[],{"type":24,"value":242},"（可以参照",{"type":18,"tag":43,"props":244,"children":246},{"alt":7,"src":245},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/105851snbjv8ufnsntfmvt.png",[],{"type":24,"value":248},"来理解），给定参数",{"type":18,"tag":43,"props":250,"children":251},{"alt":7,"src":142},[],{"type":24,"value":253},"，对具有特征x的病人，y=0的概率与同样给定参数",{"type":18,"tag":43,"props":255,"children":256},{"alt":7,"src":142},[],{"type":24,"value":258},"，对具有特征x的病人，y=1的概率相加和为1。",{"type":18,"tag":26,"props":260,"children":261},{},[262],{"type":24,"value":263},"本节介绍的是logistic回归中假设函数的表示方法，下面将更直观地认识假设函数是什么样子的，会涉及到决策边界这个概念，也会有一些可视化的展现帮助大家更好地理解。",{"type":18,"tag":19,"props":265,"children":267},{"id":266},"_3-判定边界",[268],{"type":24,"value":269},"3 判定边界",{"type":18,"tag":26,"props":271,"children":272},{},[273],{"type":24,"value":274},"现在讲下决策边界(decision 
boundary)的概念。这个概念能更好地帮助我们理解逻辑回归的假设函数在计算什么。",{"type":18,"tag":26,"props":276,"children":277},{},[278,280,284,285,289],{"type":24,"value":279},"Logistic回归的表达式可以写成：",{"type":18,"tag":43,"props":281,"children":283},{"alt":7,"src":282},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/105943r37axnv9hqqbwnfo.png",[],{"type":24,"value":127},{"type":18,"tag":43,"props":286,"children":288},{"alt":7,"src":287},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110017i94xjadneamqatll.png",[],{"type":24,"value":290},"（sigmoid函数），g(z)图像如下图所示，从零开始慢慢增加至1，并逐渐逼近1。现在可以进一步理解，这个假设函数何时会将y预测为1，什么时候又将y预测为0，并且可以更好地理解这个假设函数的形状，特别是当我们的数据有多个特征的时候。",{"type":18,"tag":26,"props":292,"children":293},{},[294],{"type":18,"tag":43,"props":295,"children":297},{"alt":7,"src":296},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110057yj6u8a7sbr6ufkmq.png",[],{"type":18,"tag":26,"props":299,"children":300},{},[301,303,307,309,313,315,319,321,324,326,329,331,334,336,339,341,344,346,349,351,354,356,359],{"type":24,"value":302},"那么什么时候",{"type":18,"tag":43,"props":304,"children":306},{"alt":7,"src":305},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110146fmwl0rhkqwmtrkhd.png",[],{"type":24,"value":308},"将大于或等于0.5呢？我们最终预测y=1，看sigmoid函数的曲线图，只要z大于或等于0，g(z)就大于等于0.5，由于logistic回归的假设函数",{"type":18,"tag":43,"props":310,"children":312},{"alt":7,"src":311},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/1102328cn9egvns8eo3jx6.png",[],{"type":24,"value":314},"，只要",{"type":18,"tag":43,"props":316,"children":318},{"alt":7,"src":317},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110251rq5u4kcqh7utlcbu.png",[],{"type":24,"value":320},"大于等于0，",{"type":18,"tag":43,"props":322,"children":323},{"alt":7,"src":305},[],{"type":24,"value":325},"就大于或等于0.5，所以这里",{"type":18,"tag":43,"props":327,"children":328},{"alt":7,"src":317},[],{"type":24,"value":330},"取代了z的位置，所以，我们的假设函数将会预测y=1，只要",{"ty
pe":18,"tag":43,"props":332,"children":333},{"alt":7,"src":317},[],{"type":24,"value":335},"大于或等于0。假设函数预测y=0的情况，类似地，",{"type":18,"tag":43,"props":337,"children":338},{"alt":7,"src":305},[],{"type":24,"value":340},"将会小于0.5，只要g(z)小于0.5，这是因为z的取值，使得g(z)小于0.5的部分，是z小于0的部分，当g(z)小于0.5时，我们的假设函数将会预测y=0，根据与之前类似的原因，",{"type":18,"tag":43,"props":342,"children":343},{"alt":7,"src":311},[],{"type":24,"value":345},"，因此，只要",{"type":18,"tag":43,"props":347,"children":348},{"alt":7,"src":317},[],{"type":24,"value":350},"小于0，我们就能预测y等于0。总体来说，如果我们决定要预测y=1或y=0，取决于估值概率是大于等于0.5还是小于0.5，换句话说，我们将预测y=1的话，只需要",{"type":18,"tag":43,"props":352,"children":353},{"alt":7,"src":317},[],{"type":24,"value":355},"大于或等于0，另外，我们将预测y=0，只需要",{"type":18,"tag":43,"props":357,"children":358},{"alt":7,"src":317},[],{"type":24,"value":360},"小于0。通过这些，我们能更好地理解logistic回归的假设函数是如何做出预测的。",{"type":18,"tag":26,"props":362,"children":363},{},[364,366,370,372,376,378,382,384,388,390,394,396,399,401,405,407,411,412,416,418,422,424,428,430,433,435,438,440,444,446,450,451,455,457,461,462,465,466,469],{"type":24,"value":365},"现在，我们假设有一个训练集，假定假设函数是",{"type":18,"tag":43,"props":367,"children":369},{"alt":7,"src":368},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110533fz7nhhctpscmgmgo.png",[],{"type":24,"value":371},"（如何模拟此模型中的参数后续将会讨论，目前假设我们已经拟合好了参数），目前我们选择",{"type":18,"tag":43,"props":373,"children":375},{"alt":7,"src":374},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110617deuzgiiugggtbrvs.png",[],{"type":24,"value":377},"为3，",{"type":18,"tag":43,"props":379,"children":381},{"alt":7,"src":380},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/1106415bdh89g2whik9eod.png",[],{"type":24,"value":383},"和",{"type":18,"tag":43,"props":385,"children":387},{"alt":7,"src":386},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110658jjyyjmmizsywlncr.png",[],{"type":24,"value":389},"均为1，这意味着我的参数向量",{"type":18,"tag":43,"props":391,"children":39
3},{"alt":7,"src":392},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110738ag6lsvkoqrzfaf1u.png",[],{"type":24,"value":395},"，有了这样的一个参数选择后，开始试着找出假设函数何时将预测y=1/0，使用公式，我们会发现y更有可能是1，或者说y=1的概率大于等于0.5，只要满足",{"type":18,"tag":43,"props":397,"children":398},{"alt":7,"src":317},[],{"type":24,"value":400},"大于0，即",{"type":18,"tag":43,"props":402,"children":404},{"alt":7,"src":403},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110816cl65dxffcgqidjuw.png",[],{"type":24,"value":406},"大于0。转换一下，对于任何样本",{"type":18,"tag":43,"props":408,"children":410},{"alt":7,"src":409},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110840m2dedfmba0m6efq2.png",[],{"type":24,"value":383},{"type":18,"tag":43,"props":413,"children":415},{"alt":7,"src":414},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110852rrfapzfrvrh9xu85.png",[],{"type":24,"value":417},"，只需要满足",{"type":18,"tag":43,"props":419,"children":421},{"alt":7,"src":420},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110914yrd23tw01p4ltf1d.png",[],{"type":24,"value":423},"即可。而在图片中也可以直接显示出来，画出",{"type":18,"tag":43,"props":425,"children":427},{"alt":7,"src":426},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/110944xth9cejs2n3hjmol.png",[],{"type":24,"value":429},"这条直线，直线的右半部分将预测y=1，左半部分将预测y=0。因此，这条直线就被称为决策边界，在该实例中就是",{"type":18,"tag":43,"props":431,"children":432},{"alt":7,"src":426},[],{"type":24,"value":434},"这条直线对应的一系列点，也恰好对应",{"type":18,"tag":43,"props":436,"children":437},{"alt":7,"src":305},[],{"type":24,"value":439},"正好等于0.5的区域，将整个平面分成了两部分。决策边界是假设函数的一个属性，包括参数",{"type":18,"tag":43,"props":441,"children":443},{"alt":7,"src":442},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/113202aq4fsrdnxuxibmdt.png",[],{"type":24,"value":445},"、",{"type":18,"tag":43,"props":447,"children":449},{"alt":7,"src":448},"https://bbs-img.huaweicloud.com/data/forums/attachment/foru
m/202106/22/111058rs0uzytkbgpoglpm.png",[],{"type":24,"value":383},{"type":18,"tag":43,"props":452,"children":454},{"alt":7,"src":453},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/111112r5e3tdozdihtygzc.png",[],{"type":24,"value":456},"，由这些参数决定。之后，我们将讨论如何拟合参数，即使用训练集数据来确定参数的取值，但是，一旦我们有确定的参数取值，如参数",{"type":18,"tag":43,"props":458,"children":460},{"alt":7,"src":459},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/113205gh4eroumgr1v1apn.png",[],{"type":24,"value":445},{"type":18,"tag":43,"props":463,"children":464},{"alt":7,"src":448},[],{"type":24,"value":383},{"type":18,"tag":43,"props":467,"children":468},{"alt":7,"src":453},[],{"type":24,"value":470},"，我们就可以完全确定决策边界。我们实际上并不需要通过绘制训练集来确定决策边界。",{"type":18,"tag":26,"props":472,"children":473},{},[474,478],{"type":18,"tag":43,"props":475,"children":477},{"alt":7,"src":476},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/111158xfraakojukcxjzoz.png",[],{"type":18,"tag":43,"props":479,"children":481},{"alt":7,"src":480},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/111216pvflpaoex3yoxzix.png",[],{"type":18,"tag":26,"props":483,"children":484},{},[485],{"type":24,"value":486},"现在我们看一个更加复杂的例子，和往常一样，使用x表示我们的正样本，圆圈表示我们的负样本，给定一个这样的数据集，呈现这样的分布情况，怎样才能使用logistic回归来拟合这些数据呢？",{"type":18,"tag":26,"props":488,"children":489},{},[490],{"type":18,"tag":43,"props":491,"children":493},{"alt":7,"src":492},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/111301emchv7tahpgsv6ol.png",[],{"type":18,"tag":26,"props":495,"children":496},{},[497,499,503,505,509,510,514,516,520,521,525,527,531,533,536,538,541,542,546,548,552,553,556,558,562,564,568,570,574,576,580,582,586,588,592],{"type":24,"value":498},"之前谈到多项式回归或线性回归时，我们提到可以在特征中添加额外的高阶多项式项，我们也可以对logistic回归使用相同的方法，具体地说，假设函数为:",{"type":18,"tag":43,"props":500,"children":502},{"alt":7,"src":501},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/11133
7admjpwolhtwn0kbh.png",[],{"type":24,"value":504},"，添加了两个额外的特征",{"type":18,"tag":43,"props":506,"children":508},{"alt":7,"src":507},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/111438hqegcsgjxfjhipli.png",[],{"type":24,"value":383},{"type":18,"tag":43,"props":511,"children":513},{"alt":7,"src":512},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/1114537t23ijmr1v9o0x6c.png",[],{"type":24,"value":515},"，所以我们现在有五个参数（如何选择",{"type":18,"tag":43,"props":517,"children":519},{"alt":7,"src":518},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/1122020bwddf6yznxvndyk.png",[],{"type":24,"value":127},{"type":18,"tag":43,"props":522,"children":524},{"alt":7,"src":523},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112221lkvbbf9ifetxuhcv.png",[],{"type":24,"value":526},"，...，",{"type":18,"tag":43,"props":528,"children":530},{"alt":7,"src":529},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112234tqytvzhlbps2k3gb.png",[],{"type":24,"value":532},"的取值），假使我们已经确定，",{"type":18,"tag":43,"props":534,"children":535},{"alt":7,"src":518},[],{"type":24,"value":537},"为-1，",{"type":18,"tag":43,"props":539,"children":540},{"alt":7,"src":523},[],{"type":24,"value":383},{"type":18,"tag":43,"props":543,"children":545},{"alt":7,"src":544},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112321ms29qovz8xudr2pz.png",[],{"type":24,"value":547},"均为0，",{"type":18,"tag":43,"props":549,"children":551},{"alt":7,"src":550},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/1123417gajdb3z1vyp7nji.png",[],{"type":24,"value":383},{"type":18,"tag":43,"props":554,"children":555},{"alt":7,"src":529},[],{"type":24,"value":557},"取值为1，这意味着在这个参数的选择下，得到的参数向量是",{"type":18,"tag":43,"props":559,"children":561},{"alt":7,"src":560},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112525yyjnlbyv6llmuwmg.png",[],{"type":24,"value":563},"。
也就是说，预测y=1时，只需要",{"type":18,"tag":43,"props":565,"children":567},{"alt":7,"src":566},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112547c24rxxguyanmcszo.png",[],{"type":24,"value":569},"即可，更为简单地说也就是只要",{"type":18,"tag":43,"props":571,"children":573},{"alt":7,"src":572},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112607g32o0myqwbqktl7c.png",[],{"type":24,"value":575},"就行。从而可以知道决策边界是",{"type":18,"tag":43,"props":577,"children":579},{"alt":7,"src":578},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112632gao51kiuaxkkyols.png",[],{"type":24,"value":581},"，这是以原点为中心半径为1的圆，只要在圆外面，都将预测y=1，而在圆内的话，都将预测y=0。通过在特征中增加这些复杂的多项式，还可以得到更为复杂的决策边界，并不全是以直线划分正负样本。再次强调，决策边界并不是训练集的属性，而是假设本身及其参数的属性，只要给定了参数向量",{"type":18,"tag":43,"props":583,"children":585},{"alt":7,"src":584},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112720iwcct1wgp6wlvu0j.png",[],{"type":24,"value":587},"，决策边界就确定了，并不是用训练集来确定决策边界，而是用训练集来拟合参数",{"type":18,"tag":43,"props":589,"children":591},{"alt":7,"src":590},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112734f3tscmuqdbor3646.png",[],{"type":24,"value":87},{"type":18,"tag":26,"props":594,"children":595},{},[596,598,602],{"type":24,"value":597},"还有比这更为复杂的决策边界吗？比如我们有更高阶的多项式：",{"type":18,"tag":43,"props":599,"children":601},{"alt":7,"src":600},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112803tocrjmh0vlfjewz4.png",[],{"type":24,"value":603},"，这样就可以得到更为复杂的决策边界，而logistic回归可以用于寻找决策边界，例如这样一个椭圆，参数不同，也许会得到另一个不同的决策边界，还有一些其他有趣的形状。因此，这些高阶多项式特征可以让我们得到非常复杂的决策边界。从而可以让我们更加清晰地认识到，什么样的假设函数，我们可以使用logistic回归来表示。",{"type":18,"tag":26,"props":605,"children":606},{},[607],{"type":18,"tag":43,"props":608,"children":610},{"alt":7,"src":609},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112824wtekpadj2iczvdzc.png",[],{"type":18,"tag":26,"props":612,"children":613},{},[614,616,620,622,625],{"type":24,"value":615}
,"现在我们知道了",{"type":18,"tag":43,"props":617,"children":619},{"alt":7,"src":618},"https://bbs-img.huaweicloud.com/data/forums/attachment/forum/202106/22/112846ty6vznwc8ecq3e8u.png",[],{"type":24,"value":621},"，接下来将学习如何自动选择参数",{"type":18,"tag":43,"props":623,"children":624},{"alt":7,"src":590},[],{"type":24,"value":626},"，使我们能在给定一个训练集时，根据数据自动拟合参数。",{"type":18,"tag":26,"props":628,"children":629},{},[630],{"type":24,"value":631},"好啦，今天小Mi带领大家全新认识了逻辑回归的基本函数表示，下期我们还将进一步学习其代价函数、梯度下降法以及多类别的分类问题。我们，下期再见呦~",{"title":7,"searchDepth":633,"depth":633,"links":634},4,[],"markdown","content:technology-blogs:zh:622.md","content","technology-blogs/zh/622.md","technology-blogs/zh/622","md",1776506138876]