Abstract Random Vector
An Abstract Random Vector is an abstract vector that represents a random vector (a tuple of random variables taking values in a vector space).
- Context:
- It can range from being a Bivariate Random Vector to being an n-Variate Random Vector (see the sketch after this definition block).
- It can range from being a Discrete Random Vector to being a Continuous Random Vector.
- …
- Counter-Example(s):
- See: Random Vector Dataset, Random Variable Set, MANOVA.
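The following minimal Python sketch (not part of the page's sources; the distributions, sample sizes, and parameter values are illustrative assumptions) draws samples from a discrete bivariate random vector and from a continuous 3-variate random vector, matching the two ranges listed under Context.
```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete bivariate random vector (X, Y):
# X = number of heads in two coin flips, Y = one roll of a fair die.
X = rng.binomial(n=2, p=0.5, size=5)
Y = rng.integers(1, 7, size=5)
discrete_samples = np.column_stack([X, Y])      # each row is one realization of (X, Y)

# Continuous 3-variate random vector (X1, X2, X3) ~ N(mu, Sigma).
mu = np.zeros(3)
Sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])
continuous_samples = rng.multivariate_normal(mu, Sigma, size=5)

print(discrete_samples)    # 5 realizations of a discrete bivariate random vector
print(continuous_samples)  # 5 realizations of a continuous 3-variate random vector
```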
References
2006
- (Dubnicka, 2006e) ⇒ Suzanne R. Dubnicka. (2006). “Random Vectors and Multivariate Distributions - Handout 5.” Kansas State University, Introduction to Probability and Statistics I, STAT 510 - Fall 2006.
- TERMINOLOGY: If X and Y are random variables, then (X, Y) is called a bivariate random vector. In general, if [math]\displaystyle{ X_1, X_2, \ldots, X_n }[/math] denote n random variables, then [math]\displaystyle{ X = (X_1, X_2, \ldots, X_n) }[/math] is called an n-variate random vector. For much of this chapter, we will consider the n = 2 bivariate case. However, all ideas discussed herein extend naturally to higher dimensional settings.
- TERMINOLOGY: Let X and Y be discrete random variables. Then (X, Y) is called a discrete random vector, and the joint probability mass function (pmf) of X and Y is given by [math]\displaystyle{ p_{X,Y}(x, y) = P(X = x, Y = y) }[/math].
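As an illustration of this definition (a hypothetical two-coin example, not taken from the handout), the following Python snippet tabulates a joint pmf by enumerating equally likely outcomes:
```python
from itertools import product
from collections import defaultdict

# Two fair coin flips: X = result of the first flip (0 or 1),
# Y = total number of heads.  Enumerating the four equally likely
# outcomes tabulates the joint pmf p_{X,Y}(x, y) = P(X = x, Y = y).
joint_pmf = defaultdict(float)
for flip1, flip2 in product([0, 1], repeat=2):
    x, y = flip1, flip1 + flip2
    joint_pmf[(x, y)] += 1 / 4      # each outcome has probability 1/4

for (x, y), p in sorted(joint_pmf.items()):
    print(f"p_XY({x}, {y}) = {p}")

# Any joint pmf must sum to 1 over all (x, y) pairs.
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-12
```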
- TERMINOLOGY: Suppose that (X, Y) is a discrete random vector with joint pmf [math]\displaystyle{ p_{X,Y}(x, y) }[/math]. We define the conditional probability mass function (pmf) of X, given Y = y, as [math]\displaystyle{ p_{X|Y}(x|y) = \frac{p_{X,Y}(x, y)}{p_Y(y)} }[/math], whenever [math]\displaystyle{ p_Y(y) \gt 0 }[/math]. Similarly, the conditional probability mass function of Y, given X = x, is defined as [math]\displaystyle{ p_{Y|X}(y|x) = \frac{p_{X,Y}(x, y)}{p_X(x)} }[/math], whenever [math]\displaystyle{ p_X(x) \gt 0 }[/math].
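A small Python sketch of this conditional-pmf formula, using an assumed joint pmf table (the numbers are illustrative, not from the handout):
```python
from collections import defaultdict

# An assumed joint pmf table for (X, Y); the numbers are purely illustrative.
joint_pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 1): 0.25, (1, 2): 0.25}

# Marginal pmf of Y: p_Y(y) = sum over x of p_{X,Y}(x, y).
marginal_Y = defaultdict(float)
for (x, y), p in joint_pmf.items():
    marginal_Y[y] += p

# Conditional pmf of X given Y = y: p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y),
# defined whenever p_Y(y) > 0.
y = 1
conditional = {x: p / marginal_Y[y]
               for (x, yy), p in joint_pmf.items() if yy == y}
print(conditional)   # {0: 0.5, 1: 0.5} -- sums to 1, as a pmf must
```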
- TERMINOLOGY: Suppose that (X, Y) is a continuous random vector with joint pdf [math]\displaystyle{ f_{X,Y}(x, y) }[/math]. We define the conditional probability density function (pdf) of X, given Y = y, as [math]\displaystyle{ f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} }[/math].
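The same ratio definition can be checked numerically for a continuous case. The sketch below (an assumption-laden illustration, not from the handout) uses a bivariate normal with an assumed correlation of 0.6 and compares the ratio f_{X,Y}(x, y)/f_Y(y) against the known closed-form conditional normal density; SciPy is assumed to be available.
```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Bivariate normal (X, Y) with assumed (illustrative) parameters.
mu = np.array([0.0, 0.0])
rho = 0.6
Sigma = np.array([[1.0, rho],
                  [rho, 1.0]])
x, y = 0.8, -0.5

# Conditional pdf via the definition: f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y).
f_xy = multivariate_normal(mu, Sigma).pdf([x, y])
f_y = norm(loc=mu[1], scale=np.sqrt(Sigma[1, 1])).pdf(y)
cond_by_definition = f_xy / f_y

# For this bivariate normal, X | Y = y is N(rho * y, 1 - rho^2).
cond_closed_form = norm(loc=rho * y, scale=np.sqrt(1 - rho**2)).pdf(x)

print(cond_by_definition, cond_closed_form)   # the two values agree
```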
- TERMINOLOGY: Suppose that (X, Y) is a random vector (discrete or continuous) with joint cdf [math]\displaystyle{ F_{X,Y}(x, y) }[/math], and denote the marginal cdfs of X and Y by [math]\displaystyle{ F_X(x) }[/math] and [math]\displaystyle{ F_Y(y) }[/math], respectively. We say that the random variables X and Y are independent if and only if [math]\displaystyle{ F_{X,Y}(x, y) = F_X(x)F_Y(y) }[/math] for all values of x and y. Otherwise, we say that X and Y are dependent.
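The factorization criterion can be verified directly for a joint pmf built as a product of assumed marginals (an illustrative construction, not from the handout): the joint cdf then factors into the marginal cdfs at every point.
```python
from itertools import product
import numpy as np

# Assumed marginal pmfs for X and Y (illustrative values).
p_X = {0: 0.3, 1: 0.7}
p_Y = {0: 0.5, 1: 0.2, 2: 0.3}

# Joint pmf built under independence: p_{X,Y}(x, y) = p_X(x) * p_Y(y).
joint = {(x, y): p_X[x] * p_Y[y] for x, y in product(p_X, p_Y)}

def joint_cdf(a, b):
    """F_{X,Y}(a, b) = P(X <= a, Y <= b)."""
    return sum(p for (x, y), p in joint.items() if x <= a and y <= b)

def marginal_cdf(pmf, a):
    """F(a) = P(value <= a) for a one-dimensional pmf."""
    return sum(p for v, p in pmf.items() if v <= a)

# Independence <=> F_{X,Y}(x, y) = F_X(x) * F_Y(y) at every point.
for a, b in product(p_X, p_Y):
    assert np.isclose(joint_cdf(a, b), marginal_cdf(p_X, a) * marginal_cdf(p_Y, b))
print("The joint cdf factorizes, so X and Y are independent in this construction.")
```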
2005
- (Rue & Held, 2005) ⇒ Havard Rue, and Leonhard Held. (2005). “Gaussian Markov Random Fields: Theory and Applications.” CRC Press. ISBN:1584884320
- QUOTE: Let [math]\displaystyle{ \mathbf{x} = (x_1,x_2,x_3)^T }[/math] be a random vector, then [math]\displaystyle{ x_1 }[/math] and [math]\displaystyle{ x_2 }[/math] are conditionally independent given [math]\displaystyle{ x_3 }[/math] if, for a known value of [math]\displaystyle{ x_3 }[/math], discovering [math]\displaystyle{ x_2 }[/math] tells you nothing new about the distribution of [math]\displaystyle{ x_1 }[/math].
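For a Gaussian random vector, this kind of conditional independence is encoded by a zero in the precision (inverse covariance) matrix, which is the starting point of Rue & Held's GMRF framework. The sketch below is an illustrative numerical check under an assumed trivariate Gaussian whose precision entry Q[0, 1] is zero; it is not code from the book.
```python
import numpy as np

# An assumed trivariate Gaussian x = (x1, x2, x3)^T specified through its
# precision matrix Q, with Q[0, 1] = 0 encoding "x1 and x2 are conditionally
# independent given x3" (the GMRF view of conditional independence).
Q = np.array([[2.0, 0.0, 0.8],
              [0.0, 3.0, 1.1],
              [0.8, 1.1, 4.0]])
Sigma = np.linalg.inv(Q)                      # covariance of x

# Conditional covariance of (x1, x2) given x3, by standard Gaussian conditioning:
# Sigma_AA - Sigma_A3 * Sigma_33^{-1} * Sigma_3A.
A = [0, 1]
Sigma_AA = Sigma[np.ix_(A, A)]
Sigma_A3 = Sigma[A, 2:3]
cond_cov = Sigma_AA - Sigma_A3 @ Sigma_A3.T / Sigma[2, 2]

# The off-diagonal conditional covariance is numerically zero: once x3 is known,
# discovering x2 tells you nothing new about x1.
print(cond_cov[0, 1])   # ~0 up to floating-point error
```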