News feed
In the news feed you will find updates to pages, schedule changes, and posts from teachers (when they also need to reach previously registered students).
October 2013
Note: for those who have their course project presentations on Thursday 17 October 2013, the location has been changed to the following:
Location:
Conference Room SIP (Floor 3)
Osquldas väg 10, 100 44 Stockholm
Note: for those who have their course project presentations on Friday 18 October 2013, the location has been changed to the following:
Location:
Conference Room SIP (Floor 3)
Osquldas väg 10, 100 44 Stockholm
October 2012
Bayesian learning:
My understanding of probabilities is that:
P[A|B]= integral{ P[A|C,B] * P[C|B] dC} (integration with respect to C)
But expression 8.10 claims that:
P[A|B] = integral{ P[A|C] * P[C|B] dC} (where A = X_{T+1}, B = X_obs, C = W)
Is 8.10 a special case?
In that case I would guess that the information from all X_obs is included in W after training, and therefore they are redundant and either one of them can be removed from P[A|C,B]; i.e. P[A|B] = P[A|C].
Please help, thanks!
Hello Oussama,
Your understanding is correct. Equation (8.10) is a special case, as you suggest.
Specifically, all observations X_t are conditionally independent given w. Hence P[X_{T+1} | W=w,x_obs] = P[X_{T+1} | W=w] in this case.
Best,
Gustav
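For readers following this thread later, the two steps behind (8.10) can be written out as a short derivation. This is only a sketch in the notation of the exchange above (W for the model parameters, x_obs for the observed training data), restating the marginalization and conditional-independence steps already mentioned in the reply:

\begin{align*}
P(X_{T+1} \mid x_{\mathrm{obs}})
  &= \int P(X_{T+1} \mid w, x_{\mathrm{obs}})\, p(w \mid x_{\mathrm{obs}})\, \mathrm{d}w
     && \text{marginalization over } w \text{ (always valid)} \\
  &= \int P(X_{T+1} \mid w)\, p(w \mid x_{\mathrm{obs}})\, \mathrm{d}w
     && \text{conditional independence given } w \text{, giving the form of (8.10)}
\end{align*}

Note that only the first factor drops x_obs; the posterior p(w | x_obs) still carries all the information from the training data.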