Cyan's Blog


Last updated Feb 11, 2022

# The Relation between Softmax and Logistic Regression

2022-02-11

Tags: #SoftmaxRegression #LogisticRegression #Classification #MulticlassClassification

Ref: Unsupervised Feature Learning and Deep Learning Tutorial

For a two-class problem, we can exploit the overparameterization of softmax and subtract one of the parameter vectors from both, so that one exponent becomes $\vec{0}^{\top} x$. The hypothesis then becomes

$$\begin{aligned} h(x) &=\frac{1}{\exp \left(\left(\theta^{(1)}-\theta^{(2)}\right)^{\top} x\right)+\exp \left(\overrightarrow{0}^{\top} x\right)}\begin{bmatrix} \exp \left(\overrightarrow{0}^{\top} x\right) \\ \exp \left(\left(\theta^{(1)}-\theta^{(2)}\right)^{\top} x\right) \end{bmatrix} \\ &=\begin{bmatrix} \frac{1}{1+\exp \left(\left(\theta^{(1)}-\theta^{(2)}\right)^{\top} x\right)} \\ \frac{\exp \left(\left(\theta^{(1)}-\theta^{(2)}\right)^{\top} x\right)}{1+\exp \left(\left(\theta^{(1)}-\theta^{(2)}\right)^{\top} x\right)} \end{bmatrix} \\ &=\begin{bmatrix} \frac{1}{1+\exp \left(\left(\theta^{(1)}-\theta^{(2)}\right)^{\top} x\right)} \\ 1-\frac{1}{1+\exp \left(\left(\theta^{(1)}-\theta^{(2)}\right)^{\top} x\right)} \end{bmatrix} \end{aligned}$$
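The subtraction step above relies on softmax being invariant to shifting every parameter vector by a common vector $\psi$. A minimal numeric sketch of that invariance, with made-up parameter values:

```python
import math

def softmax(zs):
    # subtract the max for numerical stability; this itself is an
    # instance of the shift-invariance being demonstrated
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

# hypothetical parameters and input, chosen only for illustration
theta1 = [0.5, -1.2, 0.3]
theta2 = [-0.4, 0.8, 1.1]
x = [1.0, 2.0, -0.5]

# original two-class softmax probabilities
p_original = softmax([dot(theta1, x), dot(theta2, x)])

# subtract psi = theta2 from both parameter vectors:
# theta1 -> theta1 - theta2, theta2 -> 0
shifted1 = [a - b for a, b in zip(theta1, theta2)]
p_shifted = softmax([dot(shifted1, x), 0.0])

print(p_original, p_shifted)  # the two distributions coincide
```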

Replacing $\theta^{(2)}-\theta^{(1)}$ with a single parameter vector $\theta'$, this becomes $$\begin{bmatrix} \frac{1}{1+\exp \left(-(\theta')^{\top} x\right)}\\ {1-\frac{1}{1+\exp \left(-(\theta')^{\top} x\right)}} \end{bmatrix}$$ We can see that the hypothesis predicts the probability of the first class as $$\frac{1}{1+\exp \left(-(\theta')^{\top} x\right)}$$ which is exactly the logistic regression hypothesis.
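The equivalence can be checked numerically: the sigmoid of $(\theta')^{\top} x$ with $\theta' = \theta^{(2)}-\theta^{(1)}$ reproduces one of the two-class softmax probabilities, and its complement reproduces the other. A small sketch, with all parameter values made up for illustration:

```python
import math

def softmax(zs):
    m = max(zs)  # shift by the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

# hypothetical parameters and input
theta1 = [0.5, -1.2, 0.3]
theta2 = [-0.4, 0.8, 1.1]
x = [1.0, 2.0, -0.5]

p = softmax([dot(theta1, x), dot(theta2, x)])

# single logistic-regression parameter vector theta' = theta2 - theta1
theta_prime = [b - a for a, b in zip(theta1, theta2)]
p_sig = sigmoid(dot(theta_prime, x))

# sigmoid(theta'^T x) equals the softmax probability of the class
# parameterized by theta2; 1 - sigmoid gives the other class
print(p_sig, p[1])      # equal
print(1.0 - p_sig, p[0])  # equal
```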

The probability of the second class is $${1-\frac{1}{1+\exp \left(-(\theta')^{\top} x\right)}}$$ which logistic regression does not model explicitly, since it is simply the complement of the first.