
Deep Learning UFLDL New Tutorial Study Notes 1: Linear Regression

1 Preface

Andrew Ng's UFLDL tutorial was updated at the end of September 2014!

For anyone just starting to study Deep Learning, this is really great news!


Compared with the old tutorial, the new one adds material on Convolutional Neural Networks; anyone who follows the field knows what a major impact CNNs have had on Computer Vision. The content and exercises have also been reorganized.


The new UFLDL tutorial is available at:

http://ufldl.stanford.edu/tutorial/


2 Linear Regression: A Brief Overview of the Theory

Most readers are probably already familiar with linear regression. Briefly: in a linear regression problem, a target value y depends on a set of input values x. We want to find the hypothesis that best describes the relationship between y and x, and then use that hypothesis to predict the y corresponding to a new input x.
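
In the linear case the hypothesis is simply a weighted sum of the input features (an intercept can be absorbed by appending a constant 1 to x), which is the form the formulas below assume:

h_\theta(x) = \sum_j \theta_j x_j = \theta^\top x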


This is a simple optimization problem. We need a cost function that measures, over the training set, the gap between the observed y and the y predicted by the hypothesis h; minimizing this cost function, for example with gradient descent, yields the optimal parameters of h and hence the best hypothesis.

Because the computer "learns" suitable parameters theta from the training samples, this is one of the most basic machine learning algorithms.


cost function:

J(\theta) = \frac{1}{2}\sum_i \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 = \frac{1}{2}\sum_i \left( \theta^\top x^{(i)} - y^{(i)} \right)^2

Differentiating the cost function J(\theta) above with respect to a particular parameter \theta_j gives:

\frac{\partial J(\theta)}{\partial \theta_j} = \sum_i x_j^{(i)} \left( h_\theta(x^{(i)}) - y^{(i)} \right)
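
As a rough illustration of how these formulas would be used by plain batch gradient descent (the exercise itself hands them to minFunc instead), a single update step might look like the sketch below; it assumes X is n x m with one example per column, y is a 1 x m row vector, theta is n x 1, and alpha is an arbitrarily chosen learning rate.

% Illustrative batch gradient-descent step (not part of the exercise code).
alpha = 0.01;                  % assumed learning rate, chosen arbitrarily
err   = theta' * X - y;        % 1 x m row of errors h_theta(x^(i)) - y^(i)
grad  = X * err';              % n x 1 gradient, matching the formula above
theta = theta - alpha * grad;  % move theta downhill along the gradient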

3 Linear Regression Exercise

3.1 Analysis of ex1a_linreg.m

%
%This exercise uses data from the UCI repository:
% Bache, K. & Lichman, M. (2013). UCI Machine Learning Repository
% http://archive.ics.uci.edu/ml
% Irvine, CA: University of California, School of Information and Computer Science.
%
%Data created by:
% Harrison, D. and Rubinfeld, D.L.
%   "Hedonic prices and the demand for clean air"
% J. Environ. Economics & Management, vol.5, 81-102, 1978.
%
addpath ../common
addpath ../common/minFunc_2012/minFunc
addpath ../common/minFunc_2012/minFunc/compiled

% Load housing data from file.
data = load('housing.data');  % housing data, 506x14
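
The snippet above stops after loading the data. From memory, the rest of ex1a_linreg.m transposes the data so that examples sit in columns, adds an intercept row of ones, splits the examples into training and test sets, and hands linear_regression to minFunc. A rough sketch of those steps follows; the actual split size and option values in the script may differ.

% Sketch of the remaining steps in ex1a_linreg.m (details may differ).
data = data';                              % examples in columns: 14 x 506
data = [ones(1, size(data,2)); data];      % prepend an intercept feature of 1s
data = data(:, randperm(size(data,2)));    % shuffle the examples

train.X = data(1:end-1, 1:400);            % first 400 examples for training
train.y = data(end,     1:400);
test.X  = data(1:end-1, 401:end);          % remaining examples for testing
test.y  = data(end,     401:end);

theta   = rand(size(train.X,1), 1);        % random initial parameters
options = struct('MaxIter', 200);          % assumed option value
theta   = minFunc(@linear_regression, theta, options, train.X, train.y);

train_rms = sqrt(mean((theta' * train.X - train.y).^2));  % RMS training error
test_rms  = sqrt(mean((theta' * test.X  - test.y).^2));   % RMS testing error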


3.2 linear_regression.m Code

function [f,g] = linear_regression(theta, X,y)
  %
  % Arguments:
  %   theta - A vector containing the parameter values to optimize.
  %   X - The examples stored in a matrix.
  %       X(i,j) is the i'th coordinate of the j'th example.
  %   y - The target value for each example.  y(j) is the target for example j.
  %
  
  m=size(X,2);
  n=size(X,1);

  f=0;
  g=zeros(size(theta));

  %
  % TODO:  Compute the linear regression objective by looping over the examples in X.
  %        Store the objective function value in 'f'.
  %
  % TODO:  Compute the gradient of the objective with respect to theta by looping over
  %        the examples in X and adding up the gradient for each example.  Store the
  %        computed gradient in 'g'.
  
%%% YOUR CODE HERE %%%

% Step 1 : Compute f cost function
for i = 1:m
    f = f + (theta' * X(:,i) - y(i))^2;
end

f = 1/2*f;

% Step 2: Compute gradient 

for j = 1:n
    for i = 1:m
        g(j) = g(j) + X(j,i) * (theta' * X(:,i) - y(i));
    end
    
end
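
The loops follow the TODO comments literally; the same objective and gradient can also be computed in a few vectorized lines. A sketch that should produce the same f and g:

% Vectorized equivalent of the two loops above.
err = theta' * X - y;       % 1 x m row vector of per-example errors
f   = 0.5 * sum(err.^2);    % squared-error objective
g   = X * err';             % n x 1 gradient: sum over examples of x^(i) * err(i)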

3.3 Results

Optimization took 3.374166 seconds.
RMS training error: 4.679871
RMS testing error: 4.865463


