ML Supervised Learning

CSH_tech · August 31, 2023

Regression

  • Model: LinearRegression, Ridge, Lasso, etc.

  • Evaluation metric: $R^2$

    $R^2 = 1 - \frac{\sum_{i=1}^{n}(\text{error}_i)^2}{\sum_{i=1}^{n}(\text{deviation}_i)^2} = 1 - \frac{\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}{\sum_{i=1}^{n}(y_i - \bar{y})^2}$

    where $y_i$ is the actual value, $\hat{y}_i$ the predicted value, and $\bar{y}$ the mean of the actual values.

  • Cost function: MSE (Mean Squared Error)

    $MSE = \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2 = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (w_1 x_i + w_0)\bigr)^2$

    where the prediction of simple linear regression is $\hat{y}_i = w_1 x_i + w_0$ ($w_0$ is the bias, multiplied by a constant input of 1).
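A minimal sketch of how the $R^2$ and MSE formulas above map onto the library metrics (the toy data and random seed are made up for illustration; assumes NumPy and scikit-learn are installed):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# Toy data (hypothetical): y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 3 * X.ravel() + 2 + rng.normal(scale=0.1, size=10)

model = LinearRegression().fit(X, y)
y_pred = model.predict(X)

# MSE = (1/n) * sum((y_i - y_hat_i)^2)
mse_manual = np.mean((y - y_pred) ** 2)

# R^2 = 1 - sum((y_i - y_hat_i)^2) / sum((y_i - y_bar)^2)
r2_manual = 1 - np.sum((y - y_pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

The hand-computed values agree with `mean_squared_error(y, y_pred)` and `r2_score(y, y_pred)`, which implement exactly these formulas.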

Classification

  • Model: LogisticRegression, SGDClassifier, etc.
  • Evaluation metrics: Accuracy, Precision, Recall, F1-score


    https://medium.com/@shrutisaxena0617/precision-vs-recall-386cf9f89488

    $\text{Accuracy} = \frac{\text{True Positive} + \text{True Negative}}{\text{Total}}$

    $\text{Precision} = \frac{\text{True Positive}}{\text{True Positive} + \text{False Positive}}$

    $\text{Recall} = \frac{\text{True Positive}}{\text{True Positive} + \text{False Negative}}$

    $\text{F1-Score} = 2 \cdot \frac{\text{Recall} \cdot \text{Precision}}{\text{Recall} + \text{Precision}}$

  • Cost function: log_loss, hinge

    For a true label $y \in \{0, 1\}$ and a predicted probability $\hat{p}$, the log loss (binary cross-entropy) of one sample is

    $loss = -\bigl(y\log_e(\hat{p}) + (1-y)\log_e(1-\hat{p})\bigr)$

    $\hat{p} = \frac{1}{1+e^{-z}} = \frac{1}{1+e^{-(w_1\cdot{x}+w_0)}}$

  • hinge
    • A function that outputs 0 when its argument $1 - \widehat{y}\cdot{y}$ is less than or equal to 0 (a confidently correct prediction), and the argument itself when it is greater than 0.
    • $loss = \max\{0,\ 1 - \widehat{y}\cdot{y}\}$

      https://www.baeldung.com/cs/hinge-loss-vs-logistic-loss
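The four classification metrics above can be computed by hand from the confusion-matrix cells and checked against scikit-learn; the labels below are a made-up toy example:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Hypothetical labels: 8 samples, positive class = 1
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 1, 0, 0, 0])

# Count the confusion-matrix cells directly
tp = np.sum((y_true == 1) & (y_pred == 1))  # true positives
tn = np.sum((y_true == 0) & (y_pred == 0))  # true negatives
fp = np.sum((y_true == 0) & (y_pred == 1))  # false positives
fn = np.sum((y_true == 1) & (y_pred == 0))  # false negatives

accuracy = (tp + tn) / len(y_true)                    # (TP + TN) / Total
precision = tp / (tp + fp)                            # TP / (TP + FP)
recall = tp / (tp + fn)                               # TP / (TP + FN)
f1 = 2 * recall * precision / (recall + precision)    # harmonic mean
```

Each hand-computed value matches the corresponding `sklearn.metrics` function, since those functions implement the same formulas.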
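Both loss functions can likewise be written out from their formulas and verified against `sklearn.metrics.log_loss` and `sklearn.metrics.hinge_loss`; the decision values and labels below are arbitrary illustration:

```python
import numpy as np
from sklearn.metrics import log_loss, hinge_loss

# Hypothetical decision values z = w1*x + w0 and true labels
z = np.array([2.0, -1.0, 0.5, -2.0])
y = np.array([1, 0, 1, 0])

# Sigmoid turns decision values into probabilities: p_hat = 1 / (1 + e^-z)
p_hat = 1 / (1 + np.exp(-z))

# Log loss (binary cross-entropy), averaged over samples
manual_log = -np.mean(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))

# Hinge loss: map labels to {-1, +1}, then average max(0, 1 - y*z)
y_pm = 2 * y - 1
manual_hinge = np.mean(np.maximum(0.0, 1 - y_pm * z))
```

Note that the hinge loss works on raw decision values $z$ with labels in $\{-1, +1\}$, while the log loss works on sigmoid-transformed probabilities with labels in $\{0, 1\}$.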