Machine Learning Day 14

<Bayesian Inference>

<Bayesian Updating>
When a prior dataset can be roughly represented by a normal distribution, Bayesian statistics shows that sample information from the same process can be combined with it to obtain a posterior normal distribution. The posterior is a weighted combination of the prior and the sample.

<Bayes' Theorem>

P(A|B) = P(B|A)P(A) / P(B)

=> * Prior Probability: P(A)
=> * Likelihood: P(B|A)
=> * Posterior Probability: P(A|B)
=> * Normalizing Constant: P(B) = Σ (joint probabilities)
It can also be written as

P(A|B) = P(A∩B) / P(B)
  • Joint Probability: P(A∩B) = Likelihood × Prior Probability
    Bayesian updating: revising a prior probability into a posterior probability by observing an action or acquiring new information.

Therefore,

P(B|A)P(A) = P(A∩B)
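The update rule above can be sketched as a small helper function (a hypothetical name, not from the post): multiply each prior by its likelihood to get the joint probabilities, sum them to get the evidence P(B), and divide.

```python
def bayes_update(priors, likelihoods):
    """priors[i] = P(A_i), likelihoods[i] = P(B | A_i); returns P(A_i | B)."""
    joints = [l * p for l, p in zip(likelihoods, priors)]  # P(B ∩ A_i) = P(B|A_i)P(A_i)
    evidence = sum(joints)                                 # P(B) = Σ (joint probabilities)
    return [j / evidence for j in joints]                  # P(A_i | B)

# Two hypotheses A and ¬A with P(A) = P(¬A) = 0.5, P(B|A) = 0.9, P(B|¬A) = 0.3:
print(bayes_update([0.5, 0.5], [0.9, 0.3]))  # [0.75, 0.25]
```

The same function covers every exercise below, since each is just a prior vector and a likelihood vector.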





<Exercise.1>

blue: posterior probability (0.428, 0.5718)

-> With just one additional piece of information, the probability roughly doubled: 20% -> 43%.
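The exercise's inputs are not shown above, so this is only a sketch: it assumes a prior of 0.2 and likelihoods of 0.75 vs 0.25 (hypothetical values, chosen because they reproduce the stated posterior of roughly 0.428).

```python
# Assumed inputs (hypothetical; the exercise's actual figures are not shown):
prior = [0.2, 0.8]          # P(A), P(¬A)
likelihood = [0.75, 0.25]   # P(B|A), P(B|¬A)

joint = [l * p for l, p in zip(likelihood, prior)]  # [0.15, 0.2]
evidence = sum(joint)                               # P(B) = 0.35
posterior = [j / evidence for j in joint]
print(posterior)  # roughly [0.4286, 0.5714]
```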





<Exercise. 2>

  • Prior Probability:

P(C)=0.001, P(¬C)=0.999
  • Likelihood:

    P(P|C)=0.95, P(N|C)=0.05, P(P|¬C)=0.02, P(N|¬C)=0.98
  • Joint Probability:

    P(P∩C) = 0.95 × 0.001 = 0.00095
    P(N∩C) = 0.05 × 0.001 = 0.00005
    P(P∩¬C) = 0.02 × 0.999 = 0.01998
    P(N∩¬C) = 0.98 × 0.999 = 0.97902
  • Posterior Probability:

    P(C|P) = 0.00095 / (0.00095+0.01998) = 0.04539
    P(¬C|P) = 0.01998 / (0.00095+0.01998) = 0.95461
    P(C|N) = 0.00005 / (0.00005+0.97902) = 0.00005
    P(¬C|N) = 0.97902 / (0.00005+0.97902) = 0.99995
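The cancer-test numbers can be checked directly in code. C = has cancer, P = test positive; the joint probabilities are likelihood × prior, and the evidence P(P) is their sum.

```python
# Priors and likelihoods from the exercise:
p_c, p_not_c = 0.001, 0.999        # P(C), P(¬C)
p_pos_c, p_pos_not_c = 0.95, 0.02  # P(P|C), P(P|¬C)

joint_pos_c = p_pos_c * p_c              # P(P∩C)  = 0.00095
joint_pos_not_c = p_pos_not_c * p_not_c  # P(P∩¬C) = 0.01998
p_pos = joint_pos_c + joint_pos_not_c    # evidence P(P) = 0.02093

p_c_given_pos = joint_pos_c / p_pos      # P(C|P)
print(round(p_c_given_pos, 5))  # 0.04539
```

Even with a positive result, the posterior probability of cancer is under 5%, because the 0.1% prior dominates the fairly accurate test.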





<Exercise.3>

  • Likelihood:

    P(F|C_A)=0.6, P(M|C_A)=0.4
    P(F|C_B)=0.5, P(M|C_B)=0.5
    P(F|C_C)=0.4, P(M|C_C)=0.6
  • Joint Probability:

    P(F∩C_A) = 0.6 × 1/3 = 0.2      P(M∩C_A) = 0.4 × 1/3 = 0.1333
    P(F∩C_B) = 0.5 × 1/3 = 0.1667   P(M∩C_B) = 0.5 × 1/3 = 0.1667
    P(F∩C_C) = 0.4 × 1/3 = 0.1333   P(M∩C_C) = 0.6 × 1/3 = 0.2
  • Posterior Probability:

    P(C_A|F) = 0.2 / (0.2+0.1667+0.1333) = 0.4
    P(C_B|F) = 0.1667 / (0.2+0.1667+0.1333) = 0.3333
    P(C_C|F) = 0.1333 / (0.2+0.1667+0.1333) = 0.2667
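Exercise 3 in code: three classes with equal priors of 1/3, so given that the student is female (F), the posteriors are proportional to the likelihoods P(F|C_i).

```python
# Equal priors over the three classes, likelihoods P(F|C_i) from the exercise:
priors = {"A": 1/3, "B": 1/3, "C": 1/3}
p_f_given = {"A": 0.6, "B": 0.5, "C": 0.4}

joints = {c: p_f_given[c] * priors[c] for c in priors}  # P(F∩C_i)
p_f = sum(joints.values())                              # evidence P(F) = 0.5
posteriors = {c: joints[c] / p_f for c in joints}       # P(C_i|F)
print(posteriors)  # A: 0.4, B: ~0.3333, C: ~0.2667
```

With a uniform prior, Bayes' theorem simply renormalizes the likelihoods, which is why the posteriors are 0.6/1.5, 0.5/1.5, 0.4/1.5.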
