Gaussian Process Algorithm
A Gaussian Process Algorithm is a kernel-based, nonparametric statistical modeling algorithm that fits a Gaussian Process model.
- Context:
- It can be implemented by a Gaussian Process Training System to solve a Gaussian Process Modeling Task.
- It can be well-suited to Small Regression Datasets (a minimal implementation sketch appears after this list).
- It can range from being a Gaussian Process-based Regression Algorithm to being a Gaussian Process-based Classification Algorithm.
- …
- Counter-Example(s):
- See: Gaussian Process Model, Dirichlet Process, Stacked Gaussian Process.
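A minimal implementation sketch (illustrative, not drawn from the references below): a from-scratch Gaussian Process regression fit in NumPy, assuming a squared-exponential (RBF) kernel and a small synthetic dataset; all names and hyper-parameter values are placeholders.

```python
# Minimal Gaussian Process regression sketch (NumPy only).
# The RBF kernel, its hyper-parameters, and the toy data are illustrative choices.
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix between two sets of 1-D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior predictive mean and covariance of a zero-mean GP."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                   # exact "training": O(n^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                        # predictive mean
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                        # predictive covariance
    return mean, cov

# Toy usage on a small regression dataset.
X_train = np.linspace(-3, 3, 20)
y_train = np.sin(X_train) + 0.1 * np.random.randn(20)
X_test = np.linspace(-3, 3, 100)
mean, cov = gp_posterior(X_train, y_train, X_test)
```

The Cholesky factorization of the n-by-n kernel matrix is the step that gives exact GP training its cubic cost, as the 2016 scalability quote below notes.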
References
2017
- http://quora.com/What-are-some-advantages-of-using-Gaussian-Process-Models-vs-Neural-Networks/answer/Yoshua-Bengio?srid=uuoZN
- QUOTE: An advantage of Gaussian Processes is that, like other kernel methods, they can be optimized exactly, given the values of their hyper-parameters (such as the weight decay and the spread of a Gaussian kernel), and this often allows a fine and precise trade-off between fitting the data and smoothing. On small datasets they are very good because of this well-tuned smoothing and because they are still computationally affordable. They are my method of choice for small regression datasets (less than 1000 or 2000 examples). On the other hand, if you want to capture a complicated function (with many many ups and downs, i.e., not necessarily very smooth), then you need a model that can scale to large datasets and that can generalize non-locally (which kernel machines with standard generic kernels, typically local, do not provide). Modern variants of neural networks (so-called Deep Learning) are more attractive with respect to these two properties, so I would prefer them for larger datasets where there is a lot of structure to be extracted from the data (the target function is not smooth).
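A hedged sketch of the small-dataset workflow described in the quote above, assuming scikit-learn's GaussianProcessRegressor is available (the library, kernel, and data are illustrative assumptions, not part of the quote). Fitting maximizes the marginal likelihood over the kernel hyper-parameters; given those values, the posterior is computed exactly.

```python
# Small-dataset GP regression with hyper-parameter tuning (illustrative).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

X = np.random.uniform(-3, 3, size=(200, 1))            # small dataset (n < 1000)
y = np.sin(X).ravel() + 0.1 * np.random.randn(200)

# Length scale and noise level are tuned by maximizing the marginal likelihood;
# given them, the fit balances data fidelity against smoothing exactly.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)
mean, std = gpr.predict(np.linspace(-3, 3, 50).reshape(-1, 1), return_std=True)
```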
2016
- Scaleable Gaussian Processes for Scientific Discovery http://slideshot.epfl.ch/play/k5FuJcUA0L0c
- QUOTE: GPs are ML methods for function approximation
2016
- https://people.orie.cornell.edu/andrew/code/
- QUOTE: Kernel methods, such as Gaussian processes, have had an exceptionally consequential impact on machine learning theory and practice. However, these methods face two fundamental open questions: …
Scalability: Gaussian processes, for example, scale as O(n^3) for training, O(n^2) for storage, and O(n^2) per test prediction, for n training points. This scalability typically limits GP applicability to at most a few thousand datapoints. Moreover, standard approaches to GP scalability typically only work for up to about fifty thousand training points, and sacrifice too much model structure for kernel learning.
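To make the quoted costs concrete, a small illustrative timing harness (an assumption of this article, not part of the cited code page) that measures the Cholesky step dominating exact GP training:

```python
# Time the n x n Cholesky factorization as n grows (illustrative only).
import time
import numpy as np

for n in (500, 1000, 2000, 4000):
    X = np.random.randn(n, 1)
    K = np.exp(-0.5 * (X - X.T) ** 2) + 1e-6 * np.eye(n)   # O(n^2) storage
    t0 = time.time()
    np.linalg.cholesky(K)                                   # O(n^3) training step
    print(f"n={n}: {time.time() - t0:.3f} s")
```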
2011
- (Quadrianto; Kersting; Xu, 2011) ⇒ Novi Quadrianto; Kristian Kersting; Zhao Xu. (2011). “Gaussian Processes.” In: (Sammut & Webb, 2011) p.428
2009
- Zoubin Ghahramani. (2009). http://learning.eng.cam.ac.uk/zoubin/gp.html
- QUOTE: Gaussian processes are a non-parametric method for doing Bayesian inference and learning on unknown functions. They can be used for non-linear regression, time-series modelling, classification, and many other problems.
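For reference, the standard zero-mean GP regression equations that the quote alludes to (generic notation, not taken from the quoted page); here K, K_*, and K_{**} denote the train-train, train-test, and test-test kernel matrices:

```latex
% Standard GP regression posterior (zero-mean prior, Gaussian observation noise).
\begin{align}
  f &\sim \mathcal{GP}\bigl(0,\, k(x, x')\bigr), \qquad
  y_i = f(x_i) + \varepsilon_i, \quad \varepsilon_i \sim \mathcal{N}(0, \sigma^2) \\
  \mu_* &= K_*^{\top} (K + \sigma^2 I)^{-1} \mathbf{y} \\
  \Sigma_* &= K_{**} - K_*^{\top} (K + \sigma^2 I)^{-1} K_*
\end{align}
```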