gradient_descent (last modified 2025/12/18 18:48 by hkimscil)
====== Gradient Descent ======
<tabbed>
  * :gradient descent:code01
  * :gradient descent:code02
  * :gradient descent:output01
  * :gradient descent:output02
</tabbed>
  
====== R code: Idea ======
<code>
library(tidyverse)
library(data.table)
library(ggplot2)
library(ggpmisc)
...
</code>
& \because & \dfrac{\text{d}}{\text{da}} (Y_i - (a+bX_i)) = -1 \\
& = & 2 * \sum{(Y_i - (a + bX_i))} * -1 \\
& = & -2 * \sum{\text{residual}} \\
& \rightarrow & -2 * \frac{\sum{\text{residual}}}{n} \\
& = & -2 * \overline{\text{residual}} \\
\end{eqnarray*}
See the gradient function in the R code below.
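The closed form d(SSE)/da = -2 * sum(residual) can be sanity-checked against a finite-difference derivative. A minimal sketch (Python here for brevity; the data, the starting values of a and b, and the function names are illustrative assumptions, not taken from this page's R code):

```python
# Hypothetical data and coefficients, chosen only to illustrate the check.
def sse(a, b, x, y):
    # sum of squared residuals for the line y = a + b*x
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def grad_a(a, b, x, y):
    # closed form derived above: -2 * sum(residual)
    return -2 * sum(yi - (a + b * xi) for xi, yi in zip(x, y))

x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 8.1]
a, b, h = 0.5, 1.5, 1e-6

# central finite difference of SSE with respect to a
numeric = (sse(a + h, b, x, y) - sse(a - h, b, x, y)) / (2 * h)
print(abs(numeric - grad_a(a, b, x, y)) < 1e-4)  # True
```

Because SSE is quadratic in a, the central difference agrees with the closed form up to floating-point noise.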
& = & -2 * \sum{X_i (Y_i - (a + bX_i))} \\
& = & -2 * \sum{(X_i * \text{residual})} \\
& \rightarrow & -2 * \frac{\sum{(X_i * \text{residual})}}{n} \\
& = & -2 * \overline{X_i * \text{residual}} \\
\end{eqnarray*}
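The same check works for b, where the closed form is d(SSE)/db = -2 * sum(x * residual). Again a sketch with invented data and names (not from the R code on this page):

```python
# Hypothetical data and coefficients, for illustration only.
def sse(a, b, x, y):
    # sum of squared residuals for the line y = a + b*x
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def grad_b(a, b, x, y):
    # closed form derived above: -2 * sum(x * residual)
    return -2 * sum(xi * (yi - (a + b * xi)) for xi, yi in zip(x, y))

x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 8.1]
a, b, h = 0.5, 1.5, 1e-6

# central finite difference of SSE with respect to b
numeric = (sse(a, b + h, x, y) - sse(a, b - h, x, y)) / (2 * h)
print(abs(numeric - grad_b(a, b, x, y)) < 1e-4)  # True
```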
  
The explanation above assumed differentiating the sum of squares, but it can equally be understood by substituting the mean square (the sum of squares divided by N). The code below (assuming familiarity with differentiation) shows how the msr (mean square residual) changes as the values of b and a change.
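Using the mean-based gradients, the descent step is a ← a - lr * (-2 * mean(residual)) and b ← b - lr * (-2 * mean(x * residual)). A minimal sketch of that loop (Python for illustration; the data, learning rate, and iteration count are invented, not taken from the R code below):

```python
# Toy data lying exactly on y = 1 + 2x (hypothetical, for illustration).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [3.0, 5.0, 7.0, 9.0, 11.0]
n = len(x)

a, b, lr = 0.0, 0.0, 0.05  # start from zero; lr is an assumed learning rate
for _ in range(5000):
    res = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    grad_a = -2 * sum(res) / n                               # -2 * mean(residual)
    grad_b = -2 * sum(xi * ri for xi, ri in zip(x, res)) / n # -2 * mean(x * residual)
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 3), round(b, 3))  # approaches a = 1, b = 2
```

Dividing by n only rescales the gradient, so minimizing msr and minimizing the sum of squares give the same a and b; the division just makes the step size independent of sample size.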
  
<code>
...
</code>
& = & k + \frac{m * x}{\sigma} - \frac{m * \mu}{\sigma}  \\
& = & k - \frac{m * \mu}{\sigma} + \frac{m * x}{\sigma}  \\
& = & \underbrace{k - \frac{\mu}{\sigma} * m}_\text{ 1 } + \underbrace{\frac{m}{\sigma}}_\text{ 2 } * x \\
& & \text{therefore, a and b that we try to get are } \\
a & = & k - \frac{\mu}{\sigma} * m \\
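The back-transformation can be checked numerically: given coefficients k and m fitted on the standardized predictor z = (x - mu)/sigma, recover the raw-scale intercept a and slope b. A sketch with invented values (k and m are assumed, not computed from a real fit; sigma uses the sample standard deviation, as R's sd() does):

```python
# Hypothetical predictor values and standardized-scale coefficients.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
mu = sum(x) / len(x)
sigma = (sum((xi - mu) ** 2 for xi in x) / (len(x) - 1)) ** 0.5

k, m = 7.0, 4.2426       # assumed fit on the standardized scale: y = k + m*z

a = k - m * mu / sigma   # raw-scale intercept, term 1 above
b = m / sigma            # raw-scale slope, term 2 above

# Predictions must be identical on both scales for any x.
z0 = (x[0] - mu) / sigma
print(abs((k + m * z0) - (a + b * x[0])) < 1e-9)  # True
```

This holds for every x, since a + b*x = k - m*mu/sigma + (m/sigma)*x = k + m*(x - mu)/sigma = k + m*z.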