Cyan's Blog


Relation Between Linear Regression & Gradient Descent

Last updated Aug 5, 2021

# The Relationship Between Gradient Descent and Linear Regression

2021-08-05

Tags: #MachineLearning #LinearRegression #GradientDescent

```mermaid
graph TD
	A([Gradient Descent])-->B([Gradient Descent + Squared Loss])-->C([Gradient Descent + Squared Loss + Linear Regression])
```

Therefore,

$$ \frac{\partial}{\partial \theta_{j}} J(\theta) =\frac 1 m \sum_{i=1}^{m} \left(h_{\theta}(x^{(i)})-y^{(i)}\right) f_{j}(x^{(i)}) $$
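The derivation behind this "therefore" falls outside this excerpt; as a sketch, assuming the usual squared-loss cost $J(\theta)=\frac{1}{2m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)})-y^{(i)}\right)^2$ and a hypothesis that is linear in its parameters, $h_{\theta}(x)=\sum_{j} \theta_{j} f_{j}(x)$, the chain rule gives:

$$ \begin{aligned} \frac{\partial}{\partial \theta_{j}} J(\theta) &= \frac{\partial}{\partial \theta_{j}}\,\frac{1}{2m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)})-y^{(i)}\right)^{2} \\ &= \frac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)})-y^{(i)}\right)\frac{\partial h_{\theta}(x^{(i)})}{\partial \theta_{j}} \\ &= \frac{1}{m}\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)})-y^{(i)}\right) f_{j}(x^{(i)}) \end{aligned} $$

The factor of $\frac{1}{2}$ in the cost exists precisely so that it cancels against the exponent when differentiating.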

The update rule becomes:

$$ \begin{array}{l} \text{repeat until convergence } \{ \\ \quad \theta_{j}:=\theta_{j}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}(x^{(i)})-y^{(i)}\right) f_{j}(x^{(i)}) \\ \} \\ \text{(simultaneously update } j=0, \cdots, j=n\text{)} \end{array} $$

For linear regression the basis functions are just the features themselves, $f_{j}(x^{(i)}) = x_{j}^{(i)}$, so the rule specializes to:

$$ \begin{array}{l} \text{repeat until convergence } \{ \\ \quad \theta_{j}:=\theta_{j}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}(x^{(i)})-y^{(i)}\right) x_{j}^{(i)} \\ \} \\ \text{(simultaneously update } j=0, \cdots, j=n\text{)} \end{array} $$
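The update rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation: the function name, learning rate, and iteration count are choices made here for the example. A column of ones is prepended to the feature matrix so that $x_{0}^{(i)}=1$, which makes $\theta_{0}$ the intercept and lets one matrix expression update every $\theta_{j}$ simultaneously, exactly as the "simultaneously update" note requires.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, iterations=1000):
    """Batch gradient descent for linear regression with squared loss.

    X : (m, n) feature matrix; y : (m,) targets.
    Returns theta of shape (n + 1,), with theta[0] the intercept.
    """
    m = X.shape[0]
    Xb = np.hstack([np.ones((m, 1)), X])  # prepend x_0 = 1 for the intercept
    theta = np.zeros(Xb.shape[1])
    for _ in range(iterations):
        h = Xb @ theta                     # h_theta(x^(i)) for all i at once
        grad = (Xb.T @ (h - y)) / m        # (1/m) * sum_i (h - y) * x_j^(i)
        theta = theta - alpha * grad       # simultaneous update of all theta_j
    return theta
```

Because the gradient is computed for all $j$ before `theta` is overwritten, no update "leaks" into another within the same iteration; updating `theta[j]` one at a time inside the loop would break that simultaneity.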