derivation of a and b in regression
derivative with respect to a
derivative with respect to b
to understand [[:gradient descent]]
[{{:r.regressionline3.png}}]
  
\begin{eqnarray*}
\sum{(Y_i - \hat{Y_i})^2} 
& = & \sum{(Y_i - (a + bX_i))^2}  \;\;\; \because \hat{Y_i} = a + bX_i \\
& = & \text{SSE or SS.residual} \;\;\; \text{(and this should be the least value.)}
\end{eqnarray*}
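
A quick way to see what is being minimized: with some made-up data, ss.res is just a number that depends on the candidate pair (a, b), and a pair close to the data gives a smaller value. The data, the seed, and the candidate values in the sketch below are arbitrary, for illustration only.

<code r>
# made-up data, for illustration only
set.seed(101)
x <- 1:20
y <- 3 + 2 * x + rnorm(20, sd = 2)   # roughly y = 3 + 2x plus noise

# ss.res for a candidate intercept a and slope b
ss.res <- function(a, b) sum((y - (a + b * x))^2)

ss.res(3, 2)    # close to the data-generating line: small
ss.res(0, 1)    # a poor guess: much larger
</code>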
\begin{eqnarray*}
\text{for a (constant)} \\ 
\\
\dfrac{\text{d}}{\text{da}} \sum{(Y_i - (a + bX_i))^2} 
& = & \sum \dfrac{\text{d}}{\text{da}} {(Y_i - (a + bX_i))^2} \\ 
& & \because (Y_i - (a + bX_i))^2 = \text{residual}^2 \\ 
& & \therefore{} \\ 
& = & \sum \dfrac{\text{d} \, \text{residual}^2}{\text{da}} \\ 
& = & \sum \dfrac{\text{d} \, \text{residual}^2}{\text{d} \, \text{residual}} * \dfrac{\text{d} \, \text{residual}}{\text{da}} \\ 
& = & \sum{2 * \text{residual}} * {\dfrac{\text{d} \, \text{residual}}{\text{da}}} \;\;\;\; \\ 
& = & \sum{2 * \text{residual}} * {\dfrac{\text{d}{(Y_i - (a + bX_i))}}{\text{da}}} \;\;\;\; \\ 
& = & \sum{2 * \text{residual}} * (0 - 1 - 0) \;\;\;\; \\ 
& & \because \dfrac{\text{d}Y_i}{\text{da}} = 0, \;\;\; \dfrac{\text{da}}{\text{da}} = 1, \;\;\; \dfrac{\text{d}(bX_i)}{\text{da}} = 0 \\ 
& = & \sum{2 * \text{residual}} * (-1) \;\;\;\; \\
& = & -2 \sum{(Y_i - (a + bX_i))} \\ 
\\
\text{set this to 0 (zero) and solve for a:} \\ 
-2 \sum{(Y_i - (a + bX_i))} & = & 0 \\ 
\sum{Y_i} - n a - b \sum{X_i} & = & 0 \\ 
a & = & \overline{Y} - b \overline{X} \\ 
\end{eqnarray*}
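
The derivative formula above can be checked numerically: the slope of ss.res with respect to a (holding b fixed) should equal -2 * sum(residual). A minimal sketch, again with made-up data and an arbitrary point (a, b):

<code r>
# made-up data, for illustration only
set.seed(101)
x <- 1:20
y <- 3 + 2 * x + rnorm(20, sd = 2)
ss.res <- function(a, b) sum((y - (a + b * x))^2)

a <- 1; b <- 1.5; h <- 1e-6   # arbitrary point and step size
(ss.res(a + h, b) - ss.res(a - h, b)) / (2 * h)   # numerical d(ss.res)/da
-2 * sum(y - (a + b * x))                         # the formula derived above
</code>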
\begin{eqnarray*}
\text{for b, (coefficient)} \\ 
\\
\dfrac{\text{d}}{\text{db}} \sum{(Y_i - (a + bX_i))^2}  & = & \sum \dfrac{\text{d}}{\text{db}} {(Y_i - (a + bX_i))^2} \\ 
& = & \sum{2 (Y_i - (a + bX_i))} * (-X_i) \;\;\;\; \\
& \because & \dfrac{\text{d}}{\text{db}} (Y_i - (a+bX_i)) = -X_i \\
\text{set this to 0 (zero), substitute } a = \overline{Y} - b\overline{X} \text{, and solve for b:} \\ 
\sum{\left((Y_i - \overline{Y}) - b(X_i - \overline{X})\right) X_i} & = & 0 \\ 
b & = & \dfrac{\sum{(Y_i - \overline{Y}) X_i}}{\sum{(X_i - \overline{X}) X_i}} \\
b & = & \dfrac{ \sum{(Y_i - \overline{Y})(X_i - \overline{X})} } {\sum{(X_i - \overline{X})(X_i - \overline{X})}} \;\;\; \because \sum{(Y_i - \overline{Y})\,\overline{X}} = 0 \text{ and } \sum{(X_i - \overline{X})\,\overline{X}} = 0 \\
b & = & \dfrac{ \text{SP} } {\text{SS}_\text{x}} = \dfrac{\text{Cov(X, Y)}} {\text{Var(X)}} = \dfrac{\text{Cov(X, Y)}} {\text{Cov(X, X)}} \\
\end{eqnarray*} 
</WRAP>
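
A numerical check of the result, with the same kind of made-up data: b computed as SP / SS.x (equivalently cov(x, y) / var(x)) and a computed as mean(y) - b * mean(x) should match the coefficients that lm() reports, and at that (a, b) both derivatives above should be (numerically) zero.

<code r>
# made-up data, for illustration only
set.seed(101)
x <- 1:20
y <- 3 + 2 * x + rnorm(20, sd = 2)

sp   <- sum((x - mean(x)) * (y - mean(y)))   # SP
ss.x <- sum((x - mean(x))^2)                 # SS.x
b <- sp / ss.x                               # = cov(x, y) / var(x)
a <- mean(y) - b * mean(x)

c(a = a, b = b)
cov(x, y) / var(x)    # the same b
coef(lm(y ~ x))       # lm() gives the same a and b

-2 * sum(y - (a + b * x))         # d(ss.res)/da at the solution: ~0
-2 * sum((y - (a + b * x)) * x)   # d(ss.res)/db at the solution: ~0
</code>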
The slope and intercept (a and b) above are the values that make the sum of the squared errors left over after predicting with the regression line (ss.res) as small as possible.

The above obtains a and b through a proof. Is there also [[:gradient descent|a way for an application like R to find a and b]]?
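
One such way uses the two derivatives derived above: start from an arbitrary (a, b) and repeatedly move a and b a small step against their derivatives. The sketch below only illustrates that idea (lm() itself solves for a and b directly rather than iterating); the data, starting values, learning rate, and number of iterations are arbitrary choices. See [[:gradient descent]] for details.

<code r>
# made-up data, for illustration only
set.seed(101)
x <- 1:20
y <- 3 + 2 * x + rnorm(20, sd = 2)

a <- 0; b <- 0      # arbitrary starting guess
lr <- 1e-4          # learning rate (arbitrary, must be small enough)

for (i in 1:20000) {
  resid  <- y - (a + b * x)
  grad.a <- -2 * sum(resid)        # d(ss.res)/da
  grad.b <- -2 * sum(resid * x)    # d(ss.res)/db
  a <- a - lr * grad.a             # step against the gradient
  b <- b - lr * grad.b
}

c(a = a, b = b)
coef(lm(y ~ x))     # should be approximately the same
</code>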