Cyan's Blog


Part.15_Advanced_Optimization(ML_Andrew.Ng.)

Last updated Aug 19, 2021

# Advanced Optimization

2021-08-19

Tags: #Octave #MachineLearning #GradientDescent #LinearRegression #LogisticRegression

Links: other notes on gradient descent: Gradient_Descent, Different_Gradient_Descent_Methods

In Octave, you only need to write code that computes $J(\theta)$ and $\frac{\partial}{\partial \theta_{j}} J(\theta)$; the built-in optimization routines can then find the optimal parameter values for you.

```matlab
function [jVal, gradient] = costFunction(theta)
  jVal = [...code to compute J(theta)...];
  gradient = [...code to compute derivative of J(theta)...];
end
```
```matlab
% 'GradObj','on' tells fminunc that costFunction also returns the gradient
options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2,1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);
```
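The same pattern, a cost function that returns both $J(\theta)$ and its gradient, handed to a generic minimizer, can be sketched outside Octave as well. The Python below is a minimal illustration, not part of the lecture: the quadratic cost $J(\theta) = (\theta_1-5)^2 + (\theta_2-5)^2$ (minimum at $\theta = [5, 5]$) is an assumed example, and a plain fixed-step gradient-descent loop stands in for `fminunc`.

```python
# Illustrative sketch: the cost below and the simple minimizer are
# assumptions standing in for costFunction and fminunc, respectively.

def cost_function(theta):
    """Return (jVal, gradient), mirroring Octave's [jVal, gradient]."""
    j_val = (theta[0] - 5) ** 2 + (theta[1] - 5) ** 2
    gradient = [2 * (theta[0] - 5), 2 * (theta[1] - 5)]
    return j_val, gradient

def minimize(cost_fn, theta, lr=0.1, max_iter=100):
    """Fixed-step gradient descent; a stand-in for fminunc's job."""
    for _ in range(max_iter):
        _, grad = cost_fn(theta)
        theta = [t - lr * g for t, g in zip(theta, grad)]
    return theta

opt_theta = minimize(cost_function, [0.0, 0.0])  # converges toward [5, 5]
```

A real `fminunc` uses more sophisticated line searches and step sizes, which is exactly why it typically converges faster than hand-rolled gradient descent.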