2006 AdvancesinGaussianProcesses
- (Rasmussen, 2006) ⇒ Carl Edward Rasmussen. (2006). “Advances in Gaussian Processes.” Tutorial at Advances in Neural Information Processing Systems 19 (NIPS 2006).
Subject Headings: Gaussian Process Model, Gaussian Process Regression.
Notes
Cited By
Quotes
The Gaussian Distribution
The Gaussian distribution is given by
- [math]\displaystyle{ p(\mathbf{x}|\mu, \Sigma) = \mathcal{N}(\mu,\Sigma) = (2\pi)^{-D/2}|\Sigma|^{-1/2} \exp\bigl(-\tfrac{1}{2}(\mathbf{x} - \mu)^T \Sigma^{-1}(\mathbf{x} - \mu)\bigr) }[/math]
where [math]\displaystyle{ \mu }[/math] is the mean vector and [math]\displaystyle{ \Sigma }[/math] the covariance matrix.
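The following NumPy sketch (an illustration, not code from the tutorial) evaluates this density directly from the formula and checks the result against scipy.stats.multivariate_normal:

```python
# Minimal sketch: evaluate the multivariate Gaussian density p(x | mu, Sigma)
# exactly as written above, and compare against SciPy's implementation.
import numpy as np
from scipy.stats import multivariate_normal

def gaussian_density(x, mu, Sigma):
    """Density of N(mu, Sigma) at x, following the formula above."""
    D = len(mu)
    diff = x - mu
    norm_const = (2 * np.pi) ** (-D / 2) * np.linalg.det(Sigma) ** (-0.5)
    quad = diff @ np.linalg.solve(Sigma, diff)   # (x - mu)^T Sigma^{-1} (x - mu)
    return norm_const * np.exp(-0.5 * quad)

# Example values (chosen for illustration only).
mu = np.array([0.0, 1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
x = np.array([0.3, 0.7])

print(gaussian_density(x, mu, Sigma))
print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))  # should agree
```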
Both the conditionals and the marginals of a joint Gaussian are again Gaussian.
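For reference, the standard partitioned-Gaussian identities behind this statement (given here in generic notation, not quoted from the slides): if

- [math]\displaystyle{ \begin{bmatrix}\mathbf{x}\\ \mathbf{y}\end{bmatrix} \sim \mathcal{N}\left(\begin{bmatrix}\mathbf{a}\\ \mathbf{b}\end{bmatrix}, \begin{bmatrix}A & C\\ C^T & B\end{bmatrix}\right) }[/math]

then the marginal and conditional of [math]\displaystyle{ \mathbf{x} }[/math] are

- [math]\displaystyle{ \mathbf{x} \sim \mathcal{N}(\mathbf{a}, A), \qquad \mathbf{x}\,|\,\mathbf{y} \sim \mathcal{N}\bigl(\mathbf{a} + C B^{-1}(\mathbf{y} - \mathbf{b}),\; A - C B^{-1} C^T\bigr) }[/math]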
What is a Gaussian Process?
A Gaussian process is a generalization of a multivariate Gaussian distribution to infinitely many variables.
Informally: infinitely long vector [math]\displaystyle{ \simeq }[/math] function
- Definition: a Gaussian process is a collection of random variables, any finite number of which have (consistent) Gaussian distributions.
A Gaussian distribution is fully specified by a mean vector, [math]\displaystyle{ \mu }[/math], and covariance matrix [math]\displaystyle{ \Sigma }[/math]:
- [math]\displaystyle{ \mathbf{f} = (f_1,\dots,f_n)^T \sim \mathcal{N}(\mu, \Sigma), \quad \text{indexes } i = 1,\dots,n }[/math]
A Gaussian process is fully specified by a mean function m(x) and covariance function k(x,x'):
- [math]\displaystyle{ f(x) \sim \mathcal{GP}\bigl(m(x), k(x,x')\bigr), \quad \text{indexes } x }[/math]
…
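The following NumPy sketch (an illustrative example, not code from the tutorial) draws sample functions from a zero-mean GP prior with an assumed squared-exponential covariance, using the defining property that any finite collection of function values is jointly Gaussian:

```python
# Minimal sketch: sample from a GP prior f(x) ~ GP(m(x), k(x,x')) by
# evaluating the GP at a finite set of inputs and sampling the resulting
# multivariate Gaussian N(m, K).
import numpy as np

def squared_exponential(x1, x2, lengthscale=1.0, signal_var=1.0):
    """k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2)); an example covariance function."""
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return signal_var * np.exp(-0.5 * sqdist / lengthscale**2)

x = np.linspace(-5, 5, 100)           # finite set of index points
m = np.zeros_like(x)                  # mean function m(x) = 0
K = squared_exponential(x, x)         # covariance matrix K_ij = k(x_i, x_j)

rng = np.random.default_rng(0)
# Any finite number of function values has a joint Gaussian distribution,
# so sampling the GP at these inputs reduces to sampling f ~ N(m, K).
samples = rng.multivariate_normal(m, K + 1e-10 * np.eye(len(x)), size=3)
print(samples.shape)                  # (3, 100): three sample functions
```

The small jitter term added to the diagonal of K is a common numerical safeguard to keep the covariance matrix positive definite.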
References
Author | Title | Type | Venue | Year
---|---|---|---|---
Carl Edward Rasmussen | Advances in Gaussian Processes | Tutorial | Advances in Neural Information Processing Systems 19 (NIPS 2006) | 2006