https://www.bilibili.com/video/BV164411S78V
Linear Regression and Gradient Descent
Notation:
\(m\) = number of training examples, \(n\) = number of features, \(x\) = input variables/features, \(y\) = output/target variable
\((x, y)\) = a training example; the \(i\)-th one is \((x^{(i)}, y^{(i)})\)
\(h_\theta(x)=\theta_0+\theta_1x_1+\theta_2x_2+\dots+\theta_nx_n\)
Defining \(x_0 = 1\), this becomes \(h_\theta(x) = \sum_{j=0}^{n}\theta_j x_j=\theta^T x\) (the sum uses \(j\) to index features, since \(i\) is reserved for training examples).
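The vectorized hypothesis \(h_\theta(x)=\theta^T x\) can be sketched in plain Python; the helper name `h` and the sample numbers below are illustrative, not from the notes:

```python
def h(theta, x):
    # Hypothesis h_theta(x) = theta^T x, where x[0] = 1 is prepended
    # so that theta[0] acts as the intercept term theta_0.
    return sum(t * xj for t, xj in zip(theta, x))

# theta = [1, 2], one feature x1 = 3  ->  h = 1 + 2*3 = 7
print(h([1.0, 2.0], [1.0, 3.0]))  # 7.0
```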
\(\min_{\theta}\ J(\theta) = \frac{1}{2m} \sum_{i=1}^m\left(h_\theta(x^{(i)})-y^{(i)}\right)^2\)
(least-squares linear regression)
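A minimal sketch of minimizing \(J(\theta)\) with batch gradient descent, the method named in the title. The update rule \(\theta_j := \theta_j - \alpha\,\frac{1}{m}\sum_{i=1}^m (h_\theta(x^{(i)})-y^{(i)})\,x_j^{(i)}\) is the standard one for this cost function; the function names, toy data, and learning rate \(\alpha = 0.1\) are my own choices:

```python
def h(theta, x):
    # Hypothesis h_theta(x) = theta^T x (x[0] is the constant 1).
    return sum(t * xj for t, xj in zip(theta, x))

def cost(theta, X, y):
    # J(theta) = (1/2m) * sum of squared errors over the m examples.
    m = len(X)
    return sum((h(theta, x) - yi) ** 2 for x, yi in zip(X, y)) / (2 * m)

def gradient_step(theta, X, y, alpha):
    # One batch gradient-descent update of every parameter simultaneously:
    # theta_j := theta_j - alpha * (1/m) * sum_i (h(x^(i)) - y^(i)) * x_j^(i)
    m = len(X)
    errors = [h(theta, x) - yi for x, yi in zip(X, y)]
    return [tj - alpha * sum(e * x[j] for e, x in zip(errors, X)) / m
            for j, tj in enumerate(theta)]

# Toy data generated by y = 2 * x1, with x0 = 1 prepended to each example.
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [2.0, 4.0, 6.0]
theta = [0.0, 0.0]
for _ in range(2000):
    theta = gradient_step(theta, X, y, alpha=0.1)
print(theta)  # converges toward [0.0, 2.0]
```

Each iteration uses all \(m\) examples (hence "batch"); with a suitable \(\alpha\) the cost \(J(\theta)\) decreases monotonically toward the least-squares solution.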