Bayes' rule follows from the product rule of probability, which shows the simple relationship between joint and conditional probabilities. The joint probability of A and B can be written in two ways: P(A and B) = P(A|B) * P(B), and P(A and B) = P(B|A) * P(A). Equating the right-hand sides of both equations, we get:

P(A|B) = (P(B|A) * P(A)) / P(B)   ...(a)

The above equation (a) is called Bayes' rule or Bayes' theorem, and it is the basis of most modern AI systems for probabilistic inference. Its terms are read as follows. P(A) is called the prior probability: the probability of the hypothesis before considering the evidence. P(B) is called the marginal probability: the pure probability of the evidence. P(B|A) is called the likelihood: assuming the hypothesis is true, the probability of observing the evidence. P(A|B) is known as the posterior, which is what we want to calculate: the probability of hypothesis A given that evidence B has occurred.
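Equation (a) can be sketched as a short function. The diagnostic-test numbers below are hypothetical, chosen only to illustrate the calculation; the marginal P(B) is expanded via the law of total probability, P(B) = P(B|A)P(A) + P(B|not A)P(not A).

```python
def posterior(prior, likelihood, marginal):
    """Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / marginal

# Hypothetical example: a disease affects 1% of people (prior).
# The test detects it 90% of the time (likelihood), and gives a
# false positive 10% of the time for healthy people.
prior = 0.01
likelihood = 0.90
# Marginal: P(positive) = P(pos|disease)P(disease) + P(pos|healthy)P(healthy)
marginal = 0.90 * 0.01 + 0.10 * 0.99  # = 0.108

p = posterior(prior, likelihood, marginal)
print(round(p, 4))  # 0.0833
```

Even with a positive test, the posterior is only about 8.3%, because the prior probability of the disease is so low. This is exactly the kind of update over evidence that probabilistic AI systems perform.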