
Part.5_Gradient_Descent(ML_Andrew.Ng.)

Last updated Aug 2, 2021

# Gradient Descent

2021-08-02

Tags: #MachineLearning #GradientDescent

Gradient descent is the standard method for minimizing a cost function. So we have our hypothesis function and we have a way of measuring how well it fits the data. Now we need to estimate the parameters in the hypothesis function. That’s where gradient descent comes in.

# Intuition

Picture the cost function as a surface: the horizontal axes are the hypothesis parameters $\theta_0, \theta_1, \cdots$, and the height is the value of the cost function, i.e. the deviation of the hypothesis from the true values. The goal is to reach the point where the cost is smallest, i.e. where the hypothesis deviates least from the data. Updating a parameter corresponds to taking a step along that parameter's axis.
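For concreteness, with the linear-regression hypothesis $h_\theta(x) = \theta_0 + \theta_1 x$ from the earlier parts of this series, the surface being descended is the squared-error cost:

$$ J\left(\theta_{0}, \theta_{1}\right)=\frac{1}{2 m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right)^{2} $$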

# Algorithm

$$ \begin{array}{l} \text{repeat until convergence } \lbrace \\ \quad \theta_{j} := \theta_{j} - \alpha \dfrac{\partial}{\partial \theta_{j}} J\left(\theta_{0}, \cdots, \theta_{n}\right) \quad \text{(simultaneously update } j = 0, \cdots, j = n\text{)} \\ \rbrace \end{array} $$
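A minimal sketch of this update rule in Python/NumPy for the linear-regression cost above; the function, the synthetic data, and the hyperparameter values are illustrative choices, not from the course:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.5, num_iters=1000):
    """Batch gradient descent for the squared-error cost of linear regression.

    X: (m, n) design matrix whose first column is all ones (for theta_0).
    y: (m,) target vector.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        error = X @ theta - y         # h_theta(x^(i)) - y^(i) for every example
        grad = (X.T @ error) / m      # dJ/dtheta_j for all j at once
        theta = theta - alpha * grad  # simultaneous update of every theta_j
    return theta

# Tiny illustration on synthetic data y = 2 + 3x
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=50)
X = np.column_stack([np.ones_like(x), x])
y = 2 + 3 * x + rng.normal(0, 0.05, size=50)
print(gradient_descent(X, y))         # approximately [2, 3]
```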

# Learning Rate

The learning rate $\alpha$ determines the size of each step.

# No need to decrease Learning Rate over time


As we approach a local minimum, gradient descent will automatically take smaller steps. So, no need to decrease α over time.
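A quick numeric check of this claim on the toy objective $J(\theta) = \theta^2$ (my own example, not from the course): the derivative $2\theta$ shrinks as $\theta$ approaches the minimum, so the step $\alpha \cdot 2\theta$ shrinks even though $\alpha$ is fixed.

```python
# Fixed learning rate on J(theta) = theta^2, whose derivative is 2*theta.
theta, alpha = 1.0, 0.1
for i in range(5):
    step = alpha * 2 * theta          # alpha * dJ/dtheta
    theta -= step
    print(f"iter {i}: step = {step:.4f}, theta = {theta:.4f}")
# Steps shrink on their own (0.2000, 0.1600, 0.1280, ...) with alpha held constant.
```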

# Choosing a Reasonable Learning Rate

Tune the size of each step carefully: if $\alpha$ is too small, convergence is slow; if $\alpha$ is too large, the update can overshoot the minimum and fail to converge, or even diverge.
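The same toy objective $J(\theta) = \theta^2$ makes the trade-off visible; the three values of $\alpha$ below are arbitrary illustrations:

```python
# Same toy objective J(theta) = theta^2; only the learning rate changes.
def run(alpha, iters=20, theta=1.0):
    for _ in range(iters):
        theta -= alpha * 2 * theta
    return theta

print(run(0.01))   # too small: still far from 0 after 20 steps (~0.67)
print(run(0.1))    # reasonable: close to 0 (~0.012)
print(run(1.1))    # too large: |theta| grows every step, the iteration diverges
```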

# Simultaneous Update Required

All parameters must be updated simultaneously: compute every new $\theta_j$ from the old parameter values before overwriting any of them.
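A minimal sketch of the difference on a toy two-parameter cost (the cost and its partial derivatives are my own example):

```python
# Toy cost J(t0, t1) = t0**2 + t1**2 + t0*t1; partials: 2*t0 + t1 and 2*t1 + t0.
def dJ_dt0(t0, t1): return 2 * t0 + t1
def dJ_dt1(t0, t1): return 2 * t1 + t0

alpha, t0, t1 = 0.1, 1.0, -1.0

# Correct: evaluate both partials at the old point, then assign together.
new_t0 = t0 - alpha * dJ_dt0(t0, t1)
new_t1 = t1 - alpha * dJ_dt1(t0, t1)
t0, t1 = new_t0, new_t1

# Incorrect: overwriting t0 first means dJ_dt1 is evaluated at a mixed
# point (new t0, old t1), which is no longer gradient descent on J.
t0 = t0 - alpha * dJ_dt0(t0, t1)
t1 = t1 - alpha * dJ_dt1(t0, t1)
```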

# Different Gradient Descent Methods

See: Different_Gradient_Descent_Methods

# Linear Regression & Gradient Descent

See: Linear_Regression&Gradient_Descent, Relation_Between_Linear_Regression&Gradient_Descent_梯度下降和线性回归的关系

# Logistic Regression & Gradient Descent

See: Part.14_Logistic_Regression&Gradient_Descent(ML_Andrew.Ng.)

# Convex Optimization and the Linear Regression Problem

See: 凸优化与线性回归问题