Coursera ML(2)-Model and Cost Function

Model and Cost Function / Parameter Learning / Gradient Descent For Linear Regression

Model and Cost Function

Cost Function: $J(\theta_0,\theta_1)= \frac{1}{2m}\sum_{i=1}^m\left(h_{\theta}(x^{(i)})-y^{(i)}\right)^2$

Model Representation

For linear regression with one variable, the hypothesis is $h_\theta(x) = \theta_0 + \theta_1 x$, where $\theta_0$ and $\theta_1$ are the parameters we want to learn.


Cost Function

We can measure the accuracy of our hypothesis function by using a cost function. This takes an average difference (actually a fancier version of an average) of all the results of the hypothesis with inputs from x's and the actual output y's. In other words, it measures how closely a straight line can be fitted to our data.
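As a concrete sketch (not from the course materials; the variable names and toy dataset are illustrative), the squared-error cost above can be computed with NumPy:

```python
import numpy as np

def compute_cost(X, y, theta):
    """J(theta) = 1/(2m) * sum((h_theta(x) - y)^2).

    X: (m, 2) design matrix with a leading column of ones,
    y: (m,) targets, theta: (2,) parameters [theta_0, theta_1].
    """
    m = len(y)
    predictions = X @ theta            # h_theta(x) = theta_0 + theta_1 * x
    errors = predictions - y
    return (errors @ errors) / (2 * m)

# A hypothesis h(x) = 1 + 2x fits this toy data exactly, so the cost is zero.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
print(compute_cost(X, y, np.array([1.0, 2.0])))  # → 0.0
```

Parameters that fit the data poorly yield a larger cost, which is what gradient descent will minimize below.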

Parameter Learning

Gradient descent idea

It turns out that if you're standing at a point on the hill and look all around, the best direction for a little step downhill is roughly the steepest one. Take that step, and from the new point on the hill look all around again and ask which direction you should step in to take another small step downhill; then take a step in that direction, and repeat.

Gradient descent algorithm

$$\text{repeat until convergence: } \lbrace \; \theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0,\theta_1) \quad \text{(simultaneously for } j = 0 \text{ and } j = 1\text{)} \; \rbrace$$

where $\alpha$ is the learning rate.
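A minimal sketch of this idea on a single parameter (the toy cost $J(\theta) = (\theta - 3)^2$ and the step count are assumptions for illustration, not from the course):

```python
def gradient_descent_1d(grad, theta=0.0, alpha=0.1, n_iters=100):
    """Repeatedly step against the gradient to walk downhill on J."""
    for _ in range(n_iters):
        theta = theta - alpha * grad(theta)  # theta := theta - alpha * dJ/dtheta
    return theta

# Minimize J(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3);
# the iterates converge toward the minimizer theta = 3.
theta_min = gradient_descent_1d(lambda t: 2.0 * (t - 3.0))
```

A fixed iteration count stands in here for a real convergence check (e.g., stopping when the update becomes smaller than a tolerance).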

Gradient Descent For Linear Regression

$$\begin{align*} \text{repeat until convergence: } \lbrace & \newline \theta_0 := & \theta_0 - \alpha \frac{1}{m} \sum\limits_{i=1}^{m}(h_\theta(x_{i}) - y_{i}) \newline \theta_1 := & \theta_1 - \alpha \frac1m \sum\limits_{i=1}^m\left((h_\theta(x_i) - y_i) x_i\right) \newline \rbrace& \end{align*}$$

where $m$ is the size of the training set, $\theta_0$ is a parameter updated simultaneously with $\theta_1$, and $x_i$, $y_i$ are values of the given training set (data).
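The two simultaneous updates above translate directly into a short NumPy loop. This is a sketch under assumed settings: the learning rate, iteration count, and toy dataset are illustrative choices, not from the course:

```python
import numpy as np

def gradient_descent(x, y, alpha=0.1, n_iters=5000):
    """Fit h(x) = theta0 + theta1 * x by simultaneous gradient updates."""
    m = len(y)
    theta0, theta1 = 0.0, 0.0
    for _ in range(n_iters):
        h = theta0 + theta1 * x              # hypothesis on all examples
        grad0 = (h - y).sum() / m            # (1/m) * sum(h(x_i) - y_i)
        grad1 = ((h - y) * x).sum() / m      # (1/m) * sum((h(x_i) - y_i) * x_i)
        theta0, theta1 = theta0 - alpha * grad0, theta1 - alpha * grad1
    return theta0, theta1

# Toy data generated from y = 1 + 2x; the loop should recover theta ~ (1, 2).
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
t0, t1 = gradient_descent(x, y)
```

Note that both gradients are computed from the same `h` before either parameter is changed, which is exactly the simultaneous update the formulas require.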


From now on, bravely dream and run toward that dream.