Chapter 1. Convex Optimization: Introduction

Least-squares problems have many statistical interpretations. For example, given linear measurements corrupted by Gaussian noise, the maximum likelihood estimate of the vector $x$ is exactly the solution of a least-squares problem.
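This equivalence follows from writing out the Gaussian log-likelihood; a brief sketch (with noise variance $\sigma^2$ and measurements $b_i = \alpha_i^T x + \varepsilon_i$):

```latex
% Measurement model with i.i.d. Gaussian noise
b_i = \alpha_i^T x + \varepsilon_i, \qquad \varepsilon_i \sim \mathcal{N}(0, \sigma^2)

% Log-likelihood of x given b_1, \dots, b_k
\ell(x) = -\frac{k}{2}\log(2\pi\sigma^2)
          - \frac{1}{2\sigma^2} \sum_{i=1}^{k} \left( \alpha_i^T x - b_i \right)^2

% Maximizing \ell(x) drops the constant and the positive factor 1/(2\sigma^2),
% leaving exactly a least-squares problem
\hat{x}_{\mathrm{ML}} = \arg\min_x \sum_{i=1}^{k} \left( \alpha_i^T x - b_i \right)^2
```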

To check whether an optimization problem is a least-squares problem:
1. Verify that the objective function is quadratic.
2. Verify that the quadratic form is positive semidefinite.
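As a concrete check (a minimal sketch with made-up data, using NumPy): the least-squares objective $\|Ax - b\|^2$ expands to a quadratic whose matrix $A^T A$ is always positive semidefinite, which can be confirmed numerically through its eigenvalues.

```python
import numpy as np

# Made-up data: f(x) = ||A x - b||^2 expands to x^T P x - 2 q^T x + r
# with P = A^T A, which is positive semidefinite by construction.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

P = A.T @ A                        # quadratic term of the objective
eigvals = np.linalg.eigvalsh(P)    # eigenvalues of the symmetric matrix P
assert eigvals.min() >= -1e-10     # PSD check: no negative eigenvalues

# The least-squares solution minimizes this convex quadratic.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
```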

$$\chi(M_g) = 2 - 2g$$

Weighted least squares:

$$\sum_{i=1}^{k} w_i \left( \alpha_i^T x - b_i \right)^2$$
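A common way to solve the weighted problem in practice is to scale each equation by $\sqrt{w_i}$ and hand the result to an ordinary least-squares solver; a sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 4))     # row i holds alpha_i^T
b = rng.standard_normal(30)
w = rng.uniform(0.5, 2.0, size=30)   # positive weights w_i (made up)

# Multiplying row i by sqrt(w_i) turns
#   sum_i w_i (alpha_i^T x - b_i)^2
# into an ordinary least-squares objective with the same minimizer.
sw = np.sqrt(w)
x_w, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
```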

Regularized least squares:

$$\sum_{i=1}^{k} \left( \alpha_i^T x - b_i \right)^2 + \rho \sum_{i=1}^{n} x_i^2$$
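For $\rho > 0$ this objective has a unique closed-form minimizer, $x = (A^T A + \rho I)^{-1} A^T b$; a sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 4))     # row i holds alpha_i^T
b = rng.standard_normal(30)
rho = 0.5                            # regularization weight (made up)

# Setting the gradient of  sum_i (alpha_i^T x - b_i)^2 + rho * ||x||^2
# to zero gives the normal equations  (A^T A + rho I) x = A^T b;
# for rho > 0 the matrix is positive definite, hence invertible.
n = A.shape[1]
x_reg = np.linalg.solve(A.T @ A + rho * np.eye(n), A.T @ b)
```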

In a linear programming problem, the objective function and all constraint functions are linear:

$$\text{minimize } c^T x \quad \text{s.t. } a_i^T x \le b_i$$
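Problems of this form are solved routinely by off-the-shelf solvers; a minimal sketch using `scipy.optimize.linprog` on a made-up LP (note that `linprog` additionally imposes $x \ge 0$ by default):

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP (made-up data): minimize c^T x  s.t.  a_i^T x <= b_i,  x >= 0
c = [-1.0, -2.0]                 # i.e. maximize x1 + 2*x2
A_ub = [[1.0, 1.0],              # x1 + x2 <= 4
        [1.0, 0.0],              # x1      <= 3
        [0.0, 1.0]]              #      x2 <= 3
b_ub = [4.0, 3.0, 3.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub)   # default bounds give x >= 0
# res.x ≈ [1, 3], optimal value res.fun = -7
```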

Most local optimization methods require only that the objective and constraint functions be differentiable, so modeling a practical problem as a nonlinear optimization problem is fairly straightforward. Once the model is built, the art of local optimization lies in solving the problem (in the sense of finding a locally optimal point). Convex optimization is exactly the opposite: the art and the difficulty lie in **formulating the problem**. Once a problem has been modeled as a convex optimization problem, solving it is comparatively easy.

"...in fact, the great watershed in optimization isn't between linearity and nonlinearity, but convexity and nonconvexity." by R. Tyrrell Rockafellar, in *SIAM Review*, 1993

References:

  1. [Optimization Problem Types - Convex Optimization](http://www.solver.com/convex-optimization)
  2. [Advanced Tutorial - Gradients, Linearity, and Sparsity](http://www.solver.com/advanced-tutorial)

