
# Proving That the Logistic Regression Cost Function Is Convex

2021-09-11

Tags: #MachineLearning #LogisticRegression #ConvexOptimization #CostFunction

First, recall how to prove that a function is convex.
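As a reminder, the criterion applied below is the standard second-order condition for a twice-differentiable function:

$$f \text{ twice differentiable:} \qquad f \text{ convex} \iff \nabla^{2} f(\theta) \succeq 0 \quad \forall \theta$$

For a one-variable function this reduces to $f^{\prime\prime}(x) \ge 0$; for the multivariate $J(\theta)$ below, checking the diagonal entries $\frac{\partial^{2} J}{\partial \theta_{j}^{2}}$ alone is not enough, so the proof ends by assembling the full Hessian.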

# Proof

# The Original Function

From Part.13_Cost_Function-Logistic_Regression(ML_Andrew.Ng.): $$\begin{aligned} h&=g(X \theta) \\ J(\theta)&=-\frac{1}{m} \cdot\left[y^{T} \log (h)+(1-y)^{T} \log (1-h)\right] \end{aligned}$$
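For concreteness, here is a minimal NumPy sketch of this vectorized cost; the names `sigmoid`, `cost`, `X`, `y`, and `theta` are illustrative assumptions, not from the original post:

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # J(theta) = -(1/m) * [y^T log(h) + (1 - y)^T log(1 - h)],  h = g(X theta)
    m = X.shape[0]
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
```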

# First Derivative

We already found the first derivative when working through gradient descent (derivation):

$$\frac{\partial}{\partial \theta_{j}} J(\theta)=\frac{1}{m} \sum_{i=1}^{m}\left(h(x^{(i)})-y^{(i)}\right) x_{j}^{(i)}$$
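A matching NumPy sketch of this gradient, with the same illustrative names as above (`gradient` is a hypothetical helper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient(theta, X, y):
    # dJ/dtheta_j = (1/m) * sum_i (h(x_i) - y_i) * x_ij, vectorized as X^T (h - y) / m
    m = X.shape[0]
    h = sigmoid(X @ theta)
    return X.T @ (h - y) / m
```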

# Second Derivative

Recall that the sigmoid satisfies $g^{\prime}(z)=g(z)\left(1-g(z)\right)$. With $$h(x)=g\left(\theta^Tx\right) $$ the chain rule gives $$\begin{aligned} \frac{\partial}{\partial \theta_{j}} h(x)&= \frac{\partial}{\partial \theta_{j}} g\left(\theta^Tx\right)\\ &=g\left(\theta^Tx\right)\left(1-g\left(\theta^Tx\right)\right) \frac{\partial}{\partial \theta_{j}} \theta^Tx\\ &=g\left(\theta^Tx\right)\left(1-g\left(\theta^Tx\right)\right)x_j\\ &=h(x)\left(1-h(x)\right)x_j \end{aligned}$$

Differentiating the gradient once more with respect to $\theta_j$:

$$\begin{aligned} \frac{\partial^2}{\partial \theta_{j}^2} J(\theta) &=\frac{\partial}{\partial \theta_{j}}\left[\frac{1}{m} \sum_{i=1}^{m}\left(h(x^{(i)})-y^{(i)}\right) x_{j}^{(i)}\right]\\ &=\frac{1}{m} \sum_{i=1}^{m}\left(\frac{\partial}{\partial \theta_{j}}h(x^{(i)})\right) x_{j}^{(i)} \\ &=\frac{1}{m} \sum_{i=1}^{m}g\left(\theta^Tx^{(i)}\right)\left(1-g\left(\theta^Tx^{(i)}\right)\right)\left(x_j^{(i)}\right)^2 \\ &=\frac{1}{m} \sum_{i=1}^{m}h(x^{(i)})\left(1-h(x^{(i)})\right)\left(x_j^{(i)}\right)^2 \end{aligned}$$

Since the sigmoid output satisfies $0<h(x^{(i)})<1$, every summand $h(x^{(i)})\left(1-h(x^{(i)})\right)\left(x_j^{(i)}\right)^2$ is nonnegative, so $\frac{\partial^2}{\partial \theta_{j}^2} J(\theta) \ge 0$ for every $j$. The same computation for the mixed partials gives the full Hessian $$H=\frac{1}{m} X^{T} D X, \qquad D=\operatorname{diag}\left(h(x^{(i)})\left(1-h(x^{(i)})\right)\right),$$ and for any vector $v$ we have $v^{T} H v=\frac{1}{m}(X v)^{T} D(X v) \ge 0$ because $D$ has positive diagonal entries. Hence the Hessian is positive semidefinite and $J(\theta)$ is convex.
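To make the conclusion tangible, here is a minimal NumPy sketch that builds the Hessian $H=\frac{1}{m} X^{T} D X$ and spot-checks positive semidefiniteness on random data (all names and the random data are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def hessian(theta, X):
    # H = (1/m) * X^T D X with D = diag(h * (1 - h)), h = sigmoid(X theta).
    # (X.T * w) @ X computes X^T diag(w) X without materializing D.
    m = X.shape[0]
    h = sigmoid(X @ theta)
    return (X.T * (h * (1 - h))) @ X / m

# Spot-check: eigenvalues of H should all be >= 0 (up to floating-point error),
# i.e. the Hessian is positive semidefinite, consistent with J being convex.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
theta = rng.normal(size=3)
print(np.linalg.eigvalsh(hessian(theta, X)))
```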