> P(H|E) = P(H and E) / P(E)

This is the definition of Conditional Probability.

> P(H|E) = (P(E|H)*P(H))/P(E)

Bayes' Rule is derived from the definition of conditional probability:
since P(E|H) = P(H and E) / P(H), we have P(H and E) = P(E|H)*P(H), and
substituting this into the definition above gives Bayes' Rule. Hence,
any assumption made in the definition of conditional probability also
holds for Bayes' Rule.

> Is it because the former requires P(H and E) assumes independence,

P(H and E) does not assume independence (or dependence, for that
matter). It is simply the joint probability of H and E. Independence
would be the special case P(H and E) = P(H)*P(E), but the definition
makes no such assumption.

> Why is the definition of conditional probability not as useful as the

Bayes' Rule?

That is because in most applications, when you want to predict P(H|E),
the quantity you can actually estimate from data is P(E|H). For
example, if you want to predict which disease is afflicting a person
based on his symptoms, i.e. P(Disease | Symptoms), the data you have
records which symptoms each disease produces, from which you can
estimate P(Symptoms | Disease).
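
To make this concrete, here is a small sketch of the diagnosis example
with made-up numbers (the prior and likelihoods below are hypothetical,
chosen only for illustration):

```python
# Hypothetical numbers: estimate P(Disease | Symptom) via Bayes' Rule
# from quantities we could plausibly get from data.
p_disease = 0.01                 # prior P(Disease)
p_symptom_given_disease = 0.90   # likelihood P(Symptom | Disease)
p_symptom_given_healthy = 0.05   # P(Symptom | no Disease)

# Total probability: P(Symptom) = P(S|D)*P(D) + P(S|~D)*P(~D)
p_symptom = (p_symptom_given_disease * p_disease
             + p_symptom_given_healthy * (1 - p_disease))

# Bayes' Rule: P(D|S) = P(S|D) * P(D) / P(S)
p_disease_given_symptom = p_symptom_given_disease * p_disease / p_symptom
print(round(p_disease_given_symptom, 3))  # prints 0.154
```

Note that P(Disease | Symptom) comes out quite different from
P(Symptom | Disease), which is exactly why Bayes' Rule is needed to
convert one into the other.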

Regards.

--- In aima-talk@yahoogroups.com, "Jim Davies" <jim.davies@...> wrote:

>

> P(H|E) = P(H and E) / P(E)

>

> is not as useful as Bayes' Rule

>

> P(H|E) = (P(E|H)*P(H))/P(E)

>

> But I'm not sure why. Is it because the former requires P(H and E)

assumes independence, which we might not have?

>

> JimDavies

>