Machine Learning Notes (II) - from Andrew Ng's lecture videos
Skipping the section on how to use Octave; I'll come back to it later if it's needed.
Week three:
Logistic Regression:
Used for 0-1 (binary) classification.
Hypothesis Representation:
h_θ(x) = g(θᵀx), where g(z) = 1 / (1 + e^(−z)) is the sigmoid function, or logistic function.
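A minimal sketch of this hypothesis in Python (the course itself uses Octave; the function names here are my own):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, X):
    """h_theta(x) = g(theta^T x), computed for each row of X.
    X is assumed to already contain the bias column of ones."""
    return sigmoid(X @ theta)
```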
Decision boundary:
The decision boundary is the set where θᵀx = 0; predict y = 1 when θᵀx ≥ 0 (equivalently, when h_θ(x) ≥ 0.5).
There may also be non-linear decision boundaries, obtained by constructing polynomial terms of x (see the sketch below).
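A small illustration of the prediction rule and a hypothetical quadratic feature mapping (my own sketch, not code from the course):

```python
import numpy as np

def predict(theta, X):
    """Predict y = 1 exactly where theta^T x >= 0 (i.e. h_theta(x) >= 0.5)."""
    return (X @ theta >= 0).astype(int)

def map_poly_features(x1, x2):
    """Hypothetical quadratic feature map for a non-linear boundary:
    (x1, x2) -> (1, x1, x2, x1^2, x2^2, x1*x2)."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
```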
Cost function: Cost(h_θ(x), y) = −log(h_θ(x)) if y = 1, and −log(1 − h_θ(x)) if y = 0.
Simplified cost function and gradient descent:
Since y takes only two values (0 or 1), the two cases combine into a single expression (see the formulas below).
Take the partial derivatives of the above to minimize it.
(It looks like the denominator was omitted in that step.)
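For reference, the combined cost and the resulting gradient descent update are the standard forms from the course:

```latex
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[\, y^{(i)}\log h_\theta(x^{(i)})
          + (1-y^{(i)})\log\big(1-h_\theta(x^{(i)})\big) \Big]

\theta_j := \theta_j - \alpha\,\frac{1}{m}\sum_{i=1}^{m}
            \big(h_\theta(x^{(i)}) - y^{(i)}\big)\, x_j^{(i)}
```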
Advanced optimization:
Conjugate gradient, BFGS, L-BFGS (still need to look these up and study them); a sketch of how to call them follows below.
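A minimal sketch using SciPy's optimizers in place of Octave's fminunc (my own sketch, not from the course; X is assumed to already include the bias column):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """Unregularized logistic regression cost J(theta)."""
    h = sigmoid(X @ theta)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def grad(theta, X, y):
    """Gradient of J(theta)."""
    return X.T @ (sigmoid(X @ theta) - y) / len(y)

# These methods pick the step size themselves, so there is no learning rate
# alpha to hand-tune.  'CG' is conjugate gradient; 'BFGS' / 'L-BFGS-B' are the
# quasi-Newton methods mentioned in the lecture.
# res = minimize(cost, np.zeros(X.shape[1]), args=(X, y), jac=grad, method='L-BFGS-B')
# theta_opt = res.x
```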
Multi-class classification: One-vs-all:
Run logistic regression once per class (that class vs. all the others); once the parameters are fitted, predict the class whose classifier outputs the largest value. This is called One-vs-all (one-versus-rest); a sketch follows below.
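A compact sketch of one-vs-all, assuming class labels 0..num_classes-1 and X with a bias column (function names are my own):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_one_vs_all(X, y, num_classes):
    """Fit one binary logistic regression per class (class c vs. everything else)."""
    n = X.shape[1]
    thetas = np.zeros((num_classes, n))
    for c in range(num_classes):
        y_c = (y == c).astype(float)  # relabel: 1 for class c, 0 for the rest
        cost = lambda t, yc=y_c: -np.mean(
            yc * np.log(sigmoid(X @ t)) + (1 - yc) * np.log(1 - sigmoid(X @ t)))
        thetas[c] = minimize(cost, np.zeros(n), method='L-BFGS-B').x
    return thetas

def predict_one_vs_all(thetas, X):
    """For each example, pick the class whose classifier gives the largest h_theta(x)."""
    return np.argmax(sigmoid(X @ thetas.T), axis=1)
```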
Regularization: The problem of overfitting
Overfitting: address it by reducing the number of features or by regularization.
Regularized linear regression (formulas below):
Gradient descent:
Normal equation:
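For reference, the regularized gradient descent update (for j ≥ 1; θ_0 is updated without the penalty term) and the regularized normal equation, as given in the course:

```latex
\theta_j := \theta_j\Big(1 - \alpha\frac{\lambda}{m}\Big)
          - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)\,x_j^{(i)},
          \qquad j = 1,\dots,n

\theta = \left( X^{T}X + \lambda
          \begin{bmatrix} 0 & & & \\ & 1 & & \\ & & \ddots & \\ & & & 1 \end{bmatrix}
          \right)^{-1} X^{T} y
```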
Regularized logistic regression: as with linear regression, add an extra regularization term to J(θ).
Note: the added regularization term sums from j = 1; θ_0 is not penalized. (The regularized cost is shown below.)
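For reference, the regularized logistic regression cost from the course:

```latex
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[\, y^{(i)}\log h_\theta(x^{(i)})
          + (1-y^{(i)})\log\big(1-h_\theta(x^{(i)})\big) \Big]
          + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^{2}
```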