Conditional Independence Relation
A Conditional Independence Relation is a statistical relation between events [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math] given an event [math]\displaystyle{ C }[/math] that holds if and only if [math]\displaystyle{ \Pr(A \mid B \cap C) = \Pr(A \mid C) }[/math], where [math]\displaystyle{ \Pr }[/math] is a conditional probability function (a small numerical check of this identity is sketched after the list below).
- AKA: CI, [math]\displaystyle{ CI(A,B,C) }[/math].
- …
- Counter-Example(s):
- See: Conditional, Independence Relation, Statistical Independence Relation.
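The defining identity can be checked numerically. The following is a minimal illustrative sketch (not from the cited sources): it builds a toy joint distribution over three binary events [math]\displaystyle{ A, B, C }[/math] that factorizes as [math]\displaystyle{ \Pr(C)\Pr(A \mid C)\Pr(B \mid C) }[/math], so the relation holds by construction, and then verifies [math]\displaystyle{ \Pr(A \mid B \cap C) = \Pr(A \mid C) }[/math]. All probability values are made up for illustration.

```python
# Sketch: verify Pr(A | B ∩ C) = Pr(A | C) on a toy joint distribution
# over three binary events, constructed so that A ⟂ B | C holds.
from itertools import product

p_c = {1: 0.3, 0: 0.7}            # Pr(C = c)          (made-up values)
p_a_given_c = {1: 0.9, 0: 0.2}    # Pr(A = 1 | C = c)  (made-up values)
p_b_given_c = {1: 0.6, 0: 0.1}    # Pr(B = 1 | C = c)  (made-up values)

def bern(p, x):                    # Pr(X = x) for a Bernoulli(p) variable
    return p if x == 1 else 1.0 - p

# Joint Pr(A=a, B=b, C=c) built from the conditional factorization.
joint = {(a, b, c): p_c[c] * bern(p_a_given_c[c], a) * bern(p_b_given_c[c], b)
         for a, b, c in product((0, 1), repeat=3)}

def pr(event):                     # probability of the outcomes satisfying `event`
    return sum(p for outcome, p in joint.items() if event(*outcome))

c = 1
pr_a_given_bc = pr(lambda a, b, cc: a == 1 and b == 1 and cc == c) / \
                pr(lambda a, b, cc: b == 1 and cc == c)
pr_a_given_c = pr(lambda a, b, cc: a == 1 and cc == c) / \
               pr(lambda a, b, cc: cc == c)

print(pr_a_given_bc, pr_a_given_c)   # both equal 0.9, so CI(A, B, C) holds
```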
References
2011
- http://en.wikipedia.org/wiki/Conditional_independence
- In probability theory, two events [math]\displaystyle{ R }[/math] and [math]\displaystyle{ B }[/math] are conditionally independent given a third event [math]\displaystyle{ Y }[/math] precisely if the occurrence or non-occurrence of [math]\displaystyle{ R }[/math] and the occurrence or non-occurrence of [math]\displaystyle{ B }[/math] are independent events in their conditional probability distribution given Y. In other words, [math]\displaystyle{ R }[/math] and [math]\displaystyle{ B }[/math] are conditionally independent if and only if, given knowledge of whether [math]\displaystyle{ Y }[/math] occurs, knowledge of whether [math]\displaystyle{ R }[/math] occurs provides no information on the likelihood of [math]\displaystyle{ B }[/math] occurring, and knowledge of whether [math]\displaystyle{ B }[/math] occurs provides no information on the likelihood of [math]\displaystyle{ R }[/math] occurring. In the standard notation of probability theory, [math]\displaystyle{ R }[/math] and [math]\displaystyle{ B }[/math] are conditionally independent given [math]\displaystyle{ Y }[/math] if and only if [math]\displaystyle{ \Pr(R \cap B \mid Y) = \Pr(R \mid Y)\Pr(B \mid Y),\, }[/math] or equivalently, [math]\displaystyle{ \Pr(R \mid B \cap Y) = \Pr(R \mid Y).\, }[/math] Two random variables [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] are conditionally independent given a third random variable [math]\displaystyle{ Z }[/math] if and only if they are independent in their conditional probability distribution given Z. That is, [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] are conditionally independent given [math]\displaystyle{ Z }[/math] if and only if, given any value of [math]\displaystyle{ Z }[/math], the probability distribution of [math]\displaystyle{ X }[/math] is the same for all values of [math]\displaystyle{ Y }[/math] and the probability distribution of [math]\displaystyle{ Y }[/math] is the same for all values of X.
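The two equivalent conditions quoted above can be checked on a small discrete example. The following is an illustrative sketch (not part of the quoted source): it builds a joint table [math]\displaystyle{ \Pr(X, Y, Z) }[/math] that factorizes as [math]\displaystyle{ \Pr(Z)\Pr(X \mid Z)\Pr(Y \mid Z) }[/math], then verifies both the product form [math]\displaystyle{ \Pr(X, Y \mid Z) = \Pr(X \mid Z)\Pr(Y \mid Z) }[/math] and the fact that [math]\displaystyle{ \Pr(X \mid Y, Z) }[/math] does not depend on the value of [math]\displaystyle{ Y }[/math]. The variable sizes and random seed are arbitrary choices.

```python
# Sketch: check both equivalent conditional-independence conditions for
# discrete random variables X, Y, Z with a joint table that factorizes
# as Pr(Z) Pr(X|Z) Pr(Y|Z).
import numpy as np

rng = np.random.default_rng(0)
p_z = rng.dirichlet(np.ones(2))                    # Pr(Z), 2 values
p_x_given_z = rng.dirichlet(np.ones(3), size=2)    # rows: Pr(X | Z=z), 3 values
p_y_given_z = rng.dirichlet(np.ones(4), size=2)    # rows: Pr(Y | Z=z), 4 values

# Joint Pr(X=x, Y=y, Z=z) with conditional independence built in.
p = np.einsum('z,zx,zy->xyz', p_z, p_x_given_z, p_y_given_z)

p_xy_given_z = p / p.sum(axis=(0, 1), keepdims=True)     # Pr(X, Y | Z)
p_x_z = p.sum(axis=1) / p.sum(axis=(0, 1))               # Pr(X | Z), shape (x, z)
p_y_z = p.sum(axis=0) / p.sum(axis=(0, 1))               # Pr(Y | Z), shape (y, z)

# Condition 1: Pr(X, Y | Z) = Pr(X | Z) Pr(Y | Z)
product_form = np.einsum('xz,yz->xyz', p_x_z, p_y_z)
print(np.allclose(p_xy_given_z, product_form))           # True

# Condition 2: Pr(X | Y, Z) is the same for every value of Y.
p_x_given_yz = p / p.sum(axis=0, keepdims=True)
print(np.allclose(p_x_given_yz, p_x_given_yz[:, :1, :])) # True
```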
2005
- (Rue & Held, 2005) ⇒ Havard Rue, and Leonhard Held. (2005). “Gaussian Markov Random Fields: Theory and Applications." CRC Press. ISBN:1584884320
- QUOTE: Conditional independence is a powerful concept. Let [math]\displaystyle{ \mathbf{x} = (x_1,x_2,x_3)^T }[/math] be a random vector, then [math]\displaystyle{ x_1 }[/math] and [math]\displaystyle{ x_2 }[/math] are conditionally independent given [math]\displaystyle{ x_3 }[/math] if, for known value of [math]\displaystyle{ x_3 }[/math], discovering [math]\displaystyle{ x_2 }[/math] tells you nothing new about the distribution of [math]\displaystyle{ x_1 }[/math].
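In the Gaussian Markov random field setting of Rue & Held, conditional independence corresponds to zeros in the precision matrix: for a Gaussian vector, [math]\displaystyle{ x_1 }[/math] and [math]\displaystyle{ x_2 }[/math] are conditionally independent given the remaining components exactly when the corresponding off-diagonal entry of [math]\displaystyle{ Q = \Sigma^{-1} }[/math] is zero. The following is an illustrative sketch (the specific 3×3 precision matrix is made up): it sets [math]\displaystyle{ Q_{12} = 0 }[/math] and confirms, via the standard Gaussian conditioning formula, that the conditional covariance of [math]\displaystyle{ (x_1, x_2) }[/math] given [math]\displaystyle{ x_3 }[/math] has a zero off-diagonal entry.

```python
# Sketch: for a zero-mean Gaussian vector x = (x1, x2, x3), x1 and x2 are
# conditionally independent given x3 exactly when Q[0, 1] = 0 in the
# precision matrix Q = inv(Sigma).  The matrix below is a made-up example.
import numpy as np

Q = np.array([[2.0, 0.0, 0.5],     # precision (inverse covariance) matrix
              [0.0, 3.0, 1.0],     # Q[0, 1] = 0  =>  x1 ⟂ x2 | x3
              [0.5, 1.0, 4.0]])
Sigma = np.linalg.inv(Q)           # covariance of (x1, x2, x3)

# Conditional covariance of (x1, x2) given x3, from the standard Gaussian
# conditioning formula: Sigma_aa - Sigma_ab inv(Sigma_bb) Sigma_ba.
a, b = [0, 1], [2]
S_aa = Sigma[np.ix_(a, a)]
S_ab = Sigma[np.ix_(a, b)]
S_bb = Sigma[np.ix_(b, b)]
cond_cov = S_aa - S_ab @ np.linalg.inv(S_bb) @ S_ab.T

print(np.round(cond_cov, 6))       # off-diagonal entry is 0: x1 ⟂ x2 | x3
```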