# Polynomial SOS

## From Wikipedia, the free encyclopedia

In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if there exist forms $g_{1}(x),\ldots ,g_{k}(x)$ of degree m such that

$h(x)=\sum _{i=1}^{k}g_{i}(x)^{2}.$

Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for n = 2, for 2m = 2, or for n = 3 and 2m = 4, a form is SOS if and only if it is positive. The same is also valid for the analogous problem on positive symmetric forms.
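The definition can be checked mechanically with a computer algebra system. The sketch below (using sympy; the particular forms $g_{i}$ are an illustrative choice, not taken from the text) verifies that a sum of squares of degree-m forms is indeed a form of degree 2m:

```python
import sympy as sp

x1, x2 = sp.symbols("x1 x2")

# An illustrative SOS certificate: forms g_i of degree m = 2.
g = [x1**2 - x2**2, x1 * x2]

# Their sum of squares must be a form (homogeneous polynomial) of degree 2m = 4.
h = sp.expand(sum(gi**2 for gi in g))
poly = sp.Poly(h, x1, x2)
print(h)                                          # x1**4 - x1**2*x2**2 + x2**4
print(poly.is_homogeneous, poly.total_degree())   # True 4
```

Any such h is nonnegative by construction, since it is a pointwise sum of squares of real numbers.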

Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found. Moreover, every real nonnegative form can be approximated as closely as desired (in the $l_{1}$-norm of its coefficient vector) by a sequence of forms $\{f_{\epsilon }\}$ that are SOS.

## Square matricial representation (SMR)

Establishing whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as

$h(x)=x^{\{m\}'}\left(H+L(\alpha )\right)x^{\{m\}}$

where $x^{\{m\}}$ is a vector containing a basis for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying

$h(x)=x^{\left\{m\right\}'}Hx^{\{m\}}$

and $L(\alpha )$ is a linear parameterization of the linear space

${\mathcal {L}}=\left\{L=L':~x^{\{m\}'}Lx^{\{m\}}=0\right\}.$

The dimension of the vector $x^{\{m\}}$ is given by

$\sigma (n,m)={\binom {n+m-1}{m}}$

whereas the dimension of the vector $\alpha$ is given by

$\omega (n,2m)={\frac {1}{2}}\sigma (n,m)\left(1+\sigma (n,m)\right)-\sigma (n,2m).$

Then, h(x) is SOS if and only if there exists a vector $\alpha$ such that
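As a sanity check on the two counting formulas, a short script (the function names are ours) reproduces the dimensions that appear in the examples below:

```python
from math import comb

def sigma(n: int, m: int) -> int:
    # Number of monomials of degree m in n variables: C(n+m-1, m).
    return comb(n + m - 1, m)

def omega(n: int, m: int) -> int:
    # Dimension of the parameter vector alpha for forms of degree 2m
    # (the text writes this as omega(n, 2m); this helper takes m).
    s = sigma(n, m)
    return s * (s + 1) // 2 - sigma(n, 2 * m)

# n = 2, m = 2: basis (x1^2, x1*x2, x2^2) has 3 elements, one parameter alpha_1.
print(sigma(2, 2), omega(2, 2))   # 3 1

# n = 3, m = 2: 6 basis monomials, six parameters alpha_1..alpha_6.
print(sigma(3, 2), omega(3, 2))   # 6 6
```

These match the examples that follow: a 3-vector $x^{\{m\}}$ with a single $\alpha _{1}$ for the binary quartic, and a 6-vector with $\alpha _{1},\ldots ,\alpha _{6}$ for the ternary quartic.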

$H+L(\alpha )\geq 0,$

meaning that the matrix $H+L(\alpha )$ is positive semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression $h(x)=x^{\{m\}'}\left(H+L(\alpha )\right)x^{\{m\}}$ was introduced in  with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.

### Examples

• Consider the form of degree 4 in two variables $h(x)=x_{1}^{4}-x_{1}^{2}x_{2}^{2}+x_{2}^{4}$. We have
$m=2,~x^{\{m\}}=\left({\begin{array}{c}x_{1}^{2}\\x_{1}x_{2}\\x_{2}^{2}\end{array}}\right),~H+L(\alpha )=\left({\begin{array}{ccc}1&0&-\alpha _{1}\\0&-1+2\alpha _{1}&0\\-\alpha _{1}&0&1\end{array}}\right).$
Since there exists $\alpha$ such that $H+L(\alpha )\geq 0$, namely $\alpha =1$, it follows that h(x) is SOS.
• Consider the form of degree 4 in three variables $h(x)=2x_{1}^{4}-2.5x_{1}^{3}x_{2}+x_{1}^{2}x_{2}x_{3}-2x_{1}x_{3}^{3}+5x_{2}^{4}+x_{3}^{4}$. We have
$m=2,~x^{\{m\}}=\left({\begin{array}{c}x_{1}^{2}\\x_{1}x_{2}\\x_{1}x_{3}\\x_{2}^{2}\\x_{2}x_{3}\\x_{3}^{2}\end{array}}\right),~H+L(\alpha )=\left({\begin{array}{cccccc}2&-1.25&0&-\alpha _{1}&-\alpha _{2}&-\alpha _{3}\\-1.25&2\alpha _{1}&0.5+\alpha _{2}&0&-\alpha _{4}&-\alpha _{5}\\0&0.5+\alpha _{2}&2\alpha _{3}&\alpha _{4}&\alpha _{5}&-1\\-\alpha _{1}&0&\alpha _{4}&5&0&-\alpha _{6}\\-\alpha _{2}&-\alpha _{4}&\alpha _{5}&0&2\alpha _{6}&0\\-\alpha _{3}&-\alpha _{5}&-1&-\alpha _{6}&0&1\end{array}}\right).$
Since $H+L(\alpha )\geq 0$ for $\alpha =(1.18,-0.43,0.73,1.13,-0.37,0.57)$, it follows that h(x) is SOS.
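Both examples can be verified numerically. The sketch below (numpy; variable and function names are ours) checks positive semidefiniteness of the first Gram matrix at $\alpha =1$, checks that $x^{\{m\}'}\left(H+L(\alpha )\right)x^{\{m\}}$ reproduces h(x) at random points in both examples (for the second, for every $\alpha$, as the SMR construction requires), and evaluates the smallest eigenvalue at the $\alpha$ reported in the second example:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Example 1: h = x1^4 - x1^2 x2^2 + x2^4, Gram matrix at alpha = 1 ---
M1 = np.array([[ 1.0, 0.0, -1.0],
               [ 0.0, 1.0,  0.0],
               [-1.0, 0.0,  1.0]])
assert np.linalg.eigvalsh(M1).min() >= -1e-12        # positive semidefinite

for _ in range(100):
    x1, x2 = rng.standard_normal(2)
    b = np.array([x1**2, x1 * x2, x2**2])            # x^{m}
    assert abs(b @ M1 @ b - (x1**4 - x1**2 * x2**2 + x2**4)) < 1e-9
# The eigendecomposition of M1 yields h = (x1^2 - x2^2)^2 + (x1*x2)^2.

# --- Example 2: the identity holds for every alpha ---
def gram(a1, a2, a3, a4, a5, a6):
    return np.array([
        [ 2.0,    -1.25,      0.0,  -a1,    -a2,  -a3],
        [-1.25,  2 * a1, 0.5 + a2,  0.0,    -a4,  -a5],
        [ 0.0, 0.5 + a2,   2 * a3,   a4,     a5, -1.0],
        [ -a1,      0.0,       a4,  5.0,    0.0,  -a6],
        [ -a2,      -a4,       a5,  0.0, 2 * a6,  0.0],
        [ -a3,      -a5,     -1.0,  -a6,    0.0,  1.0]])

def h2(x1, x2, x3):
    return (2 * x1**4 - 2.5 * x1**3 * x2 + x1**2 * x2 * x3
            - 2 * x1 * x3**3 + 5 * x2**4 + x3**4)

for _ in range(50):
    a = rng.standard_normal(6)
    x1, x2, x3 = rng.standard_normal(3)
    b = np.array([x1**2, x1 * x2, x1 * x3, x2**2, x2 * x3, x3**2])
    assert abs(b @ gram(*a) @ b - h2(x1, x2, x3)) < 1e-8

# Smallest eigenvalue at the alpha reported in the example.
M2 = gram(1.18, -0.43, 0.73, 1.13, -0.37, 0.57)
print(np.linalg.eigvalsh(M2).min())
```

Note that the random-point identity check succeeds for arbitrary $\alpha$: that is exactly the defining property of the space ${\mathcal {L}}$.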

## Generalizations

### Matrix SOS

A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if there exist matrix forms $G_{1}(x),\ldots ,G_{k}(x)$ of degree m such that

$F(x)=\sum _{i=1}^{k}G_{i}(x)'G_{i}(x).$

#### Matrix SMR

Establishing whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case any F(x) can be written according to the SMR as

$F(x)=\left(x^{\{m\}}\otimes I_{r}\right)'\left(H+L(\alpha )\right)\left(x^{\{m\}}\otimes I_{r}\right)$

where $\otimes$ is the Kronecker product of matrices, H is any symmetric matrix satisfying

$F(x)=\left(x^{\{m\}}\otimes I_{r}\right)'H\left(x^{\{m\}}\otimes I_{r}\right)$

and $L(\alpha )$ is a linear parameterization of the linear space

${\mathcal {L}}=\left\{L=L':~\left(x^{\{m\}}\otimes I_{r}\right)'L\left(x^{\{m\}}\otimes I_{r}\right)=0\right\}.$

The dimension of the vector $\alpha$ is given by

$\omega (n,2m,r)={\frac {1}{2}}r\left(\sigma (n,m)\left(r\sigma (n,m)+1\right)-(r+1)\sigma (n,2m)\right).$

Then, F(x) is SOS if and only if there exists a vector $\alpha$ such that the following LMI holds:

$H+L(\alpha )\geq 0.$

The expression $F(x)=\left(x^{\{m\}}\otimes I_{r}\right)'\left(H+L(\alpha )\right)\left(x^{\{m\}}\otimes I_{r}\right)$ was introduced in  in order to establish whether a matrix form is SOS via an LMI.
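The matrix SMR can be illustrated on a small case. In the sketch below, the matrix form F(x) = G(x)′G(x) with G(x) = x₁A + x₂B is an assumed example chosen so that F is SOS by construction (it is not from the text); the script checks the Kronecker-product identity, and that the r = 1 case of ω(n, 2m, r) reduces to the scalar count ω(n, 2m):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(2)
r, n = 2, 2                          # 2x2 matrix form in two variables, m = 1
A, B = rng.standard_normal((2, r, r))

def G(x):
    # Matrix form of degree m = 1 (an illustrative choice): G(x) = x1*A + x2*B.
    return x[0] * A + x[1] * B

def F(x):
    # F(x) = G(x)' G(x) is matrix SOS by construction (k = 1).
    return G(x).T @ G(x)

# SMR with x^{m} = (x1, x2): H = [A B]' [A B] satisfies
# F(x) = (x^{m} kron I_r)' H (x^{m} kron I_r).
C = np.hstack([A, B])
H = C.T @ C                          # symmetric and positive semidefinite

for _ in range(20):
    x = rng.standard_normal(n)
    v = np.kron(x.reshape(-1, 1), np.eye(r))   # x^{m} kron I_r
    assert np.allclose(v.T @ H @ v, F(x))

def sigma(n, m):
    return comb(n + m - 1, m)

def omega_r(n, m, r):
    # omega(n, 2m, r) from the text; this helper takes m rather than 2m.
    s = sigma(n, m)
    return r * (s * (r * s + 1) - (r + 1) * sigma(n, 2 * m)) // 2

# For r = 1 this reduces to the scalar count omega(n, 2m).
assert omega_r(2, 2, 1) == 1
assert omega_r(3, 2, 1) == 6
```

Choosing H = C′C is one valid Gram matrix; as in the scalar case, the full set of valid matrices is H + L(α) over the parameterized space ${\mathcal {L}}$.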

### Noncommutative polynomial SOS

Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1,...,Xn) and equipped with the involution T, such that T fixes R and X1,...,Xn and reverses words formed by X1,...,Xn. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form f = f^T. If evaluating a symmetric noncommutative polynomial f at real r × r matrices yields a positive semidefinite matrix for every choice of matrices and every dimension r, then f is said to be matrix-positive.

A noncommutative polynomial is SOS if there exist noncommutative polynomials $h_{1},\ldots ,h_{k}$ such that

$f(X)=\sum _{i=1}^{k}h_{i}(X)^{T}h_{i}(X).$

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive. Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.
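A minimal numerical illustration of one direction of this equivalence (the polynomial f and its decomposition below are an assumed example, not from the text): a noncommutative SOS polynomial, evaluated at symmetric matrices of any size, yields a positive semidefinite matrix, since each term evaluates to a Gram matrix:

```python
import numpy as np

rng = np.random.default_rng(3)

def f(X1, X2):
    # An assumed SOS example: f = h1^T h1 + h2^T h2 with the noncommutative
    # "words" h1 = X1*X2 and h2 = X1 + X2.
    h1 = X1 @ X2
    h2 = X1 + X2
    return h1.T @ h1 + h2.T @ h2

# Matrix-positivity check: evaluate at random symmetric matrices of
# several sizes r; each value must be positive semidefinite.
for r in (1, 2, 5):
    for _ in range(10):
        S = rng.standard_normal((2, r, r))
        X1, X2 = [(M + M.T) / 2 for M in S]    # symmetric arguments
        val = f(X1, X2)
        assert np.allclose(val, val.T)
        assert np.linalg.eigvalsh(val).min() >= -1e-9
```

Substituting symmetric matrices makes the evaluation respect the involution (transpose fixes each Xi and reverses words), which is why each summand becomes a Gram matrix of the form B′B.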

This page is based on a Wikipedia article. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company and has no affiliation with Wikimedia Foundation.