Basis Pursuit Denoising Task
A Basis Pursuit Denoising Task is a Mathematical Optimization Task that ...
- See: Lasso (Statistics), Applied Mathematics, Statistics, Mathematical Optimization, Sparse Vector, Convex Optimization, Quadratic Programming, Regularization (Mathematics), Occam's Razor, Lasso (Statistics)#Lasso method, Image Compression, Compressed Sensing.
References
2017
- (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/Basis_pursuit_denoising Retrieved:2017-8-27.
- In applied mathematics and statistics, basis pursuit denoising (BPDN) refers to a mathematical optimization problem of the form: : [math]\displaystyle{ \min_x \frac{1}{2}\|y-Ax\|^2_2+\lambda\|x\|_1, }[/math] where [math]\displaystyle{ \lambda }[/math] is a parameter that controls the trade-off between sparsity and reconstruction fidelity, [math]\displaystyle{ x }[/math] is an [math]\displaystyle{ N \times 1 }[/math] solution vector, [math]\displaystyle{ y }[/math] is an [math]\displaystyle{ M \times 1 }[/math] vector of observations, [math]\displaystyle{ A }[/math] is an [math]\displaystyle{ M \times N }[/math] transform matrix and [math]\displaystyle{ M \lt N }[/math]. This is an instance of convex optimization and also of quadratic programming.
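The sketch below is a minimal illustration of solving this unconstrained formulation with the iterative soft-thresholding algorithm (ISTA), whose per-iteration proximal step for the [math]\displaystyle{ \ell_1 }[/math] term is elementwise soft thresholding. The problem sizes, data, and [math]\displaystyle{ \lambda }[/math] value are illustrative assumptions, not anything prescribed by the source.
```python
# Minimal sketch: solve min_x 0.5*||y - A x||_2^2 + lam*||x||_1 with ISTA.
# Dimensions, data, and lam below are illustrative assumptions.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def bpdn_ista(A, y, lam, n_iter=500):
    """Iterative soft-thresholding for the unconstrained BPDN objective."""
    # Step size 1/L, where L is the Lipschitz constant of the smooth part's
    # gradient, i.e. the largest eigenvalue of A^T A (squared spectral norm).
    L = np.linalg.norm(A, 2) ** 2
    t = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)              # gradient of 0.5*||y - Ax||^2
        x = soft_threshold(x - t * grad, t * lam)
    return x

# Illustrative underdetermined system (M < N) with a sparse ground truth.
rng = np.random.default_rng(0)
M, N = 50, 200
A = rng.standard_normal((M, N))
x_true = np.zeros(N)
x_true[rng.choice(N, 5, replace=False)] = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(M)

x_hat = bpdn_ista(A, y, lam=0.1)
print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-3)))
```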
Some authors refer to basis pursuit denoising as the following closely related problem: : [math]\displaystyle{ \min_x \|x\|_1 \;\textrm{subject} \ \textrm{to}\;\;\|y-Ax\|^2_2 \le \delta }[/math] which, for any given [math]\displaystyle{ \lambda }[/math] , is equivalent to the unconstrained formulation for some (usually unknown a priori) value of [math]\displaystyle{ \delta }[/math] . The two problems are quite similar. In practice, the unconstrained formulation, for which most specialized and efficient computational algorithms are developed, is usually preferred.
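As a hedged illustration of this constrained variant, the sketch below poses it directly with the CVXPY modeling library; the choice of CVXPY, the data, and the value of [math]\displaystyle{ \delta }[/math] are assumptions made for demonstration, not choices made in the source.
```python
# Sketch of the constrained form: min_x ||x||_1  s.t.  ||y - A x||_2^2 <= delta.
# CVXPY is used only as a convenient general-purpose modeler (an assumption);
# A, y, and delta are illustrative.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
M, N = 50, 200
A = rng.standard_normal((M, N))
y = rng.standard_normal(M)
delta = 0.5                      # noise-level budget, assumed known here

x = cp.Variable(N)
objective = cp.Minimize(cp.norm1(x))
constraints = [cp.sum_squares(y - A @ x) <= delta]
prob = cp.Problem(objective, constraints)
prob.solve()

print("l1 norm of solution:", np.linalg.norm(x.value, 1))
```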
Either type of basis pursuit denoising solves a regularization problem with a trade-off between having a small residual (making [math]\displaystyle{ y }[/math] close to [math]\displaystyle{ Ax }[/math] in terms of the squared error) and making [math]\displaystyle{ x }[/math] simple in the [math]\displaystyle{ \ell_1 }[/math]-norm sense. It can be thought of as a mathematical statement of Occam's razor: finding the simplest possible explanation (i.e., one that yields [math]\displaystyle{ \min_x \|x\|_1 }[/math]) capable of accounting for the observations [math]\displaystyle{ y }[/math]. The sketch after this paragraph makes the trade-off concrete by sweeping the regularization weight.
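The following sketch sweeps the regularization weight and reports how the number of nonzero coefficients and the residual change. It uses scikit-learn's Lasso purely as a convenience (an assumption, not the source's method); scikit-learn scales the data-fit term by [math]\displaystyle{ 1/(2M) }[/math], so its alpha corresponds to [math]\displaystyle{ \lambda / M }[/math] in the BPDN form above.
```python
# Illustration of the sparsity/fidelity trade-off: larger regularization
# weights zero out more coefficients but leave a larger residual.
# scikit-learn's Lasso is an assumed convenience; its alpha ~ lambda / M.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
M, N = 50, 200
A = rng.standard_normal((M, N))
x_true = np.zeros(N)
x_true[:5] = 1.0                                  # sparse ground truth
y = A @ x_true + 0.05 * rng.standard_normal(M)

for alpha in (0.001, 0.01, 0.1, 1.0):
    model = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000).fit(A, y)
    residual = np.linalg.norm(y - A @ model.coef_)
    nnz = int(np.sum(model.coef_ != 0))
    print(f"alpha={alpha:<6} nonzeros={nnz:<4} residual={residual:.3f}")
```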
Exact solutions to basis pursuit denoising are often the best computationally tractable approximation of an underdetermined system of equations. Basis pursuit denoising has potential applications in statistics (cf. the LASSO method of regularization), image compression and compressed sensing.
As [math]\displaystyle{ \lambda \rightarrow 0 }[/math] (or when [math]\displaystyle{ \delta = 0 }[/math] ), this problem becomes basis pursuit.
Basis pursuit denoising was introduced by Chen and Donoho in 1994, in the field of signal processing. In statistics, it is well-known under the name LASSO, after being introduced by Tibshirani in 1996.