Linear Matrix Inequality
A Linear Matrix Inequality is a constraint of the form [math]\displaystyle{ \operatorname{LMI}(u) := F_0 + u_1F_1 + \cdots + u_qF_q \preceq 0, }[/math] where [math]\displaystyle{ u }[/math] is a vector of decision variables and [math]\displaystyle{ F_0, F_1, \dots, F_q }[/math] are given symmetric matrices. A numerical check of such a constraint is sketched after the list below.
- AKA: LMI.
- See: Convex Optimization, Convex Set, Symmetric Matrix, Positive Semidefinite Matrix.
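The following minimal sketch (not from the cited sources) checks such a constraint numerically: it forms [math]\displaystyle{ \operatorname{LMI}(u) = F_0 + u_1F_1 + \cdots + u_qF_q }[/math] for made-up symmetric matrices and a candidate [math]\displaystyle{ u }[/math], and tests negative semidefiniteness via the largest eigenvalue.

```python
# Minimal sketch (illustrative data only): check whether a candidate u satisfies
#   LMI(u) = F_0 + u_1 F_1 + ... + u_q F_q  <=  0   (negative semidefinite).
import numpy as np

def lmi_matrix(F0, Fs, u):
    """Return F0 + sum_i u[i] * Fs[i], which is affine in u."""
    return F0 + sum(ui * Fi for ui, Fi in zip(u, Fs))

def satisfies_lmi(F0, Fs, u, tol=1e-9):
    """True if the LMI matrix is negative semidefinite (largest eigenvalue <= tol)."""
    return np.linalg.eigvalsh(lmi_matrix(F0, Fs, u)).max() <= tol

# Made-up 2x2 symmetric matrices and candidate decision vectors.
F0 = np.array([[-2.0, 0.0], [0.0, -2.0]])
F1 = np.array([[1.0, 0.0], [0.0, 0.0]])
F2 = np.array([[0.0, 1.0], [1.0, 0.0]])

print(satisfies_lmi(F0, [F1, F2], u=[1.0, 0.5]))  # True: all eigenvalues stay <= 0
print(satisfies_lmi(F0, [F1, F2], u=[5.0, 0.0]))  # False: an eigenvalue becomes 3 > 0
```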
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/linear_matrix_inequality Retrieved:2015-6-7.
- In convex optimization, a linear matrix inequality (LMI) is an expression of the form: [math]\displaystyle{ \operatorname{LMI}(y):=A_0+y_1A_1+y_2A_2+\cdots+y_m A_m\geq0 }[/math] where
- [math]\displaystyle{ y=[y_i\,,~i\!=\!1,\dots, m] }[/math] is a real vector,
- [math]\displaystyle{ A_0, A_1, A_2,\dots,A_m }[/math] are [math]\displaystyle{ n\times n }[/math] symmetric matrices [math]\displaystyle{ \mathbb{S}^n }[/math] ,
- [math]\displaystyle{ B\geq0 }[/math] is a generalized inequality meaning [math]\displaystyle{ B }[/math] is a positive semidefinite matrix belonging to the positive semidefinite cone [math]\displaystyle{ \mathbb{S}_+ }[/math] in the subspace of symmetric matrices [math]\displaystyle{ \mathbb{S} }[/math] .
- This linear matrix inequality specifies a convex constraint on y.
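As an illustration of the quoted definition (in the [math]\displaystyle{ \geq 0 }[/math] convention), the sketch below poses an LMI-constrained problem as a small semidefinite program with CVXPY. Because the left-hand side is affine in [math]\displaystyle{ y }[/math], the feasible set is the preimage of the positive semidefinite cone under an affine map, hence convex. The matrices [math]\displaystyle{ A_0, A_1, A_2 }[/math] and the objective are made-up placeholders, not taken from the cited source.

```python
# Illustrative sketch (assumed, made-up data): the LMI  A0 + y1*A1 + y2*A2 >= 0
# imposed as a convex (positive semidefinite) constraint on y, solved with CVXPY.
import cvxpy as cp
import numpy as np

A0 = np.array([[1.0, 0.0], [0.0, -1.0]])
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
A2 = np.array([[1.0, 0.0], [0.0, 1.0]])

y = cp.Variable(2)
lmi = A0 + y[0] * A1 + y[1] * A2        # affine in y, hence a convex constraint

# ">> 0" is CVXPY's positive-semidefinite constraint, matching the ">= 0" above.
problem = cp.Problem(cp.Minimize(cp.sum_squares(y)), [lmi >> 0])
problem.solve()

print(y.value)  # a minimum-norm feasible y (roughly [0, 1] for this data)
print(np.linalg.eigvalsh(A0 + y.value[0] * A1 + y.value[1] * A2))  # all >= -tolerance
```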
2004
- (Lanckriet et al., 2004a) ⇒ Gert R. G. Lanckriet, Nello Cristianini, Peter Bartlett, Laurent El Ghaoui, and Michael I. Jordan. (2004). “Learning the Kernel Matrix with Semidefinite Programming.” In: The Journal of Machine Learning Research, 5.
- QUOTE: A linear matrix inequality, abbreviated LMI, is a constraint of the form: [math]\displaystyle{ F(u) := F_0 + u_1F_1 + \cdots + u_qF_q \preceq 0. }[/math] Here, [math]\displaystyle{ u }[/math] is the vector of decision variables, and [math]\displaystyle{ F_0, \dots, F_q }[/math] are given symmetric [math]\displaystyle{ p \times p }[/math] matrices. The notation [math]\displaystyle{ F(u) \preceq 0 }[/math] means that the symmetric matrix [math]\displaystyle{ F(u) }[/math] is negative semidefinite.
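This definition uses the negative-semidefinite ([math]\displaystyle{ \preceq 0 }[/math]) convention, whereas the Wikipedia excerpt above uses [math]\displaystyle{ \geq 0 }[/math]; the two are interchangeable, since [math]\displaystyle{ F(u) \preceq 0 }[/math] holds exactly when [math]\displaystyle{ -F(u) \succeq 0 }[/math]. A minimal numerical check, reusing the made-up matrices from the first sketch:

```python
# Sketch (made-up data reused from the first example): the two sign conventions
# coincide, because F(u) <= 0 holds exactly when -F(u) >= 0.
import numpy as np

F0 = np.array([[-2.0, 0.0], [0.0, -2.0]])
F1 = np.array([[1.0, 0.0], [0.0, 0.0]])
F2 = np.array([[0.0, 1.0], [1.0, 0.0]])
u = [1.0, 0.5]

Fu = F0 + u[0] * F1 + u[1] * F2
print(np.linalg.eigvalsh(Fu).max() <= 1e-9)    # F(u) is negative semidefinite
print(np.linalg.eigvalsh(-Fu).min() >= -1e-9)  # equivalently, -F(u) is positive semidefinite
```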