Coursera ML (3) - Multivariate Linear Regression in Python

Multivariate Linear Regression and Programming Exercise 1


Gradient Descent for Multiple Variables

  • Suppose we have $n$ features; the hypothesis is
    $h_\theta(x) = \theta^T x = \theta_0 x_0 + \theta_1 x_1 + \dots + \theta_n x_n$, with $x_0 = 1$.

  • Cost Function
    $J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$

  • Gradient Descent Algorithm: repeat until convergence, updating all $\theta_j$ simultaneously:
    $\theta_j := \theta_j - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right) x_j^{(i)}$

  • Feature Scaling: get every feature into approximately $[-1, 1]$. Just normalize all the features :) (see the sketch after this list)

  • Learning Rate: not too big (gradient descent may fail to converge), not too small (convergence is too slow).

  • Polynomial Regression: use feature scaling here as well, since powers of a feature have very different ranges (somewhat like normalizing dimensions).
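
A minimal sketch of mean normalization, assuming a NumPy array `X` with one column per feature; the function name `feature_normalize` is illustrative, not from the course code:

import numpy as np

def feature_normalize(X):
    # Scale each column (feature) to roughly [-1, 1]:
    # subtract the column mean, then divide by the column standard deviation.
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

# Example: two features with very different ranges.
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0]])
X_norm, mu, sigma = feature_normalize(X)
print(X_norm)  # each column now has mean 0 and unit variance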

Programming Exercise 1

Download the code and data

Programming assignment (Exercise 1) from Andrew Ng's Stanford machine learning course on Coursera. The official download link seems to be blocked, so here is a mirror:
http://home.ustc.edu.cn/~mmmwhy/machine-learning-ex1.zip

Deriving the formulas again:

There are really only two formulas here:

  • computeCost
  • gradientDescent
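
In vectorized form, with design matrix $X \in \mathbb{R}^{m \times (n+1)}$ and target vector $y \in \mathbb{R}^m$, the two formulas are:

$$J(\theta) = \frac{1}{2m}\,(X\theta - y)^T (X\theta - y)$$

$$\theta := \theta - \frac{\alpha}{m}\, X^T (X\theta - y)$$

These are exactly what computeCost and gradientDescent implement below.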

Python implementation of the fit

The original assignment uses MATLAB; I reimplemented it in Python, and the results are the same:

import numpy as np
import matplotlib.pyplot as plt


def readfile(path):
    # Read the comma-separated data file; prepend the intercept term x0 = 1.
    X = []
    y = []
    with open(path, 'r') as f:
        for line in f:
            population, profit = line.split(',')
            X.append([1, float(population)])
            y.append(float(profit))
    return X, y


def dataplot(X, theta, y):
    # Scatter plot of the training data plus the fitted regression line.
    plt.plot(X[:, 1], y, 'rx', markersize=10)
    plt.ylabel('Profit in $10,000s')
    plt.xlabel('Population of City in 10,000s')
    plt.plot(X[:, 1], X * theta, '-')
    plt.show()


def computeCost(X, y, theta):
    # J(theta) = 1/(2m) * sum((X*theta - y)^2)
    m = len(y)
    J = 0
    for i in range(m):
        J = J + float((X[i] * theta - y[i]) ** 2)
    return J / (2 * m)


def gradientDescent(X, y, theta, alpha, num_iters):
    # Batch gradient descent: theta := theta - alpha/m * X^T (X*theta - y)
    m = len(y)
    J_history = np.zeros(num_iters)
    for i in range(num_iters):
        S = X.T * (X * theta - np.mat(y).T) / m
        theta = theta - alpha * S
        J_history[i] = computeCost(X, y, theta)  # track cost to check convergence
    return theta


if __name__ == "__main__":
    theta = np.mat([[0.0], [0.0]])
    alpha = 0.01
    iterations = 1500
    path = r"C:\Users\wing\Documents\MATLAB\ex1\ex1data1.txt"

    x, y = readfile(path)  # lowercase x is a plain list; uppercase X is a matrix
    X = np.mat(x)
    J = computeCost(X, y, theta)  # should be about 32.07 with theta = 0
    theta = gradientDescent(X, y, theta, alpha, iterations)
    dataplot(X, theta, y)


The output figure is a bit small, but it will do.
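
As a quick sanity check (not part of the original exercise code), the closed-form normal equation $\theta = (X^T X)^{-1} X^T y$ should give nearly the same parameters as gradient descent; a minimal sketch, reusing X and y from the script above:

# Normal-equation sanity check: solve X^T X theta = X^T y directly.
theta_exact = np.linalg.solve(X.T * X, X.T * np.mat(y).T)
print(theta_exact)  # should be close to the theta found by gradient descent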
