deriviation_of_a_and_b_in_a_simple_regression
The regression line $\hat{Y_i} = a + bX_i$ is obtained by choosing $a$ and $b$ so that the sum of squared residuals $\sum{(Y_i - (a + bX_i))^2}$ is as small as possible: take the derivative with respect to each parameter and set it to zero.
<WRAP box>
\begin{eqnarray*}
\text{for a (constant):} \\
\\
\dfrac{\partial}{\partial a} \sum{(Y_i - (a + bX_i))^2} & = & \sum{\dfrac{\partial}{\partial a} (Y_i - (a + bX_i))^2} \\
& = & \sum{2 (Y_i - (a + bX_i))} \times (-1) \\
& \because & \dfrac{\partial}{\partial a} (Y_i - (a + bX_i)) = -1 \\
& = & -2 \sum{(Y_i - (a + bX_i))} \\
\\
\text{Setting the derivative to zero:} \\
-2 \sum{(Y_i - (a + bX_i))} & = & 0 \\
\sum{Y_i} & = & na + b \sum{X_i} \\
\dfrac{\sum{Y_i}}{n} & = & a + b \dfrac{\sum{X_i}}{n} \\
a & = & \overline{Y} - b \overline{X} \\
\end{eqnarray*}
</WRAP>
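A quick numeric check of this result, offered only as a sketch rather than part of the derivation: with made-up sample data (the data and variable names below are hypothetical), fixing any slope b and setting $a = \overline{Y} - b\overline{X}$ makes the derivative $-2\sum{(Y_i - (a + bX_i))}$ vanish.

<code python>
import numpy as np

rng = np.random.default_rng(42)        # hypothetical sample data, for illustration only
X = rng.normal(50, 10, size=100)
Y = 3 + 0.8 * X + rng.normal(0, 5, size=100)

b = 0.8                                # any fixed slope works for this check
a = Y.mean() - b * X.mean()            # a = Ybar - b * Xbar, from the derivation above

# d/da SS = -2 * sum(Y_i - (a + b X_i)) should be zero at this a
print(-2 * np.sum(Y - (a + b * X)))    # ~0, up to floating-point error
</code>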
<WRAP box>
\begin{eqnarray*}
\text{for b (coefficient):} \\
\\
\dfrac{\partial}{\partial b} \sum{(Y_i - (a + bX_i))^2} & = & \sum{\dfrac{\partial}{\partial b} (Y_i - (a + bX_i))^2} \\
& = & \sum{2 (Y_i - (a + bX_i))} \times (-X_i) \\
& \because & \dfrac{\partial}{\partial b} (Y_i - (a + bX_i)) = -X_i \\
& = & -2 \sum{X_i (Y_i - (a + bX_i))} \\
\\
\text{Setting the derivative to zero and substituting } a = \overline{Y} - b \overline{X}: \\
\sum{X_i \left( Y_i - \overline{Y} - b (X_i - \overline{X}) \right)} & = & 0 \\
\sum{X_i (Y_i - \overline{Y})} & = & b \sum{X_i (X_i - \overline{X})} \\
b & = & \dfrac{\sum{X_i (Y_i - \overline{Y})}}{\sum{X_i (X_i - \overline{X})}} \\
b & = & \dfrac{ \sum{(Y_i - \overline{Y})(X_i - \overline{X})} } {\sum{(X_i - \overline{X})(X_i - \overline{X})}} \\
& \because & \sum{\overline{X} (Y_i - \overline{Y})} = 0 \;\text{ and }\; \sum{\overline{X} (X_i - \overline{X})} = 0 \\
b & = & \dfrac{\text{SP}}{\text{SS}_\text{x}} = \dfrac{\text{Cov}(X, Y)}{\text{Var}(X)} \\
\end{eqnarray*}
</WRAP>
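The closed-form slope can be verified the same way, again only as a sketch on hypothetical data: SP/SS$_\text{x}$ equals Cov(X, Y)/Var(X) because the common $(n-1)$ denominators cancel, and both agree with a library least-squares fit (numpy's polyfit).

<code python>
import numpy as np

rng = np.random.default_rng(42)                 # hypothetical sample data
X = rng.normal(50, 10, size=100)
Y = 3 + 0.8 * X + rng.normal(0, 5, size=100)

SP  = np.sum((X - X.mean()) * (Y - Y.mean()))   # sum of cross products
SSx = np.sum((X - X.mean()) ** 2)               # sum of squares of X
b = SP / SSx
a = Y.mean() - b * X.mean()

b_cov = np.cov(X, Y)[0, 1] / np.var(X, ddof=1)  # Cov(X, Y) / Var(X); (n - 1) cancels

print(b, b_cov)                                 # identical slopes
print(np.polyfit(X, Y, 1), (b, a))              # library fit returns [slope, intercept]
</code>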
The slope and intercept above (b and a) define the line that minimizes ss.res, the sum of the squared errors left over after predicting with the regression line.
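As a last sanity check, again a sketch on hypothetical data: nudging the derived a and b in any direction can only increase ss.res.

<code python>
import numpy as np

rng = np.random.default_rng(42)        # hypothetical sample data
X = rng.normal(50, 10, size=100)
Y = 3 + 0.8 * X + rng.normal(0, 5, size=100)

b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a = Y.mean() - b * X.mean()

def ss_res(a_, b_):
    """Sum of squared residuals for the candidate line a_ + b_ * X."""
    return np.sum((Y - (a_ + b_ * X)) ** 2)

print(ss_res(a, b))                    # the minimum
print(ss_res(a + 0.5, b),              # all strictly larger than the minimum
      ss_res(a, b + 0.01),
      ss_res(a - 0.5, b - 0.01))
</code>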
- | |||
- | |||
- | |||
- | {{: | ||
- | {{: |
deriviation_of_a_and_b_in_a_simple_regression.1716420442.txt.gz · Last modified: 2024/05/23 08:27 by hkimscil