Hessenberg matrix

In linear algebra, a Hessenberg matrix is a special kind of square matrix, one that is "almost" triangular. To be exact, an upper Hessenberg matrix has zero entries below the first subdiagonal, and a lower Hessenberg matrix has zero entries above the first superdiagonal.[1] They are named after Karl Hessenberg.[2]

Definitions

Upper Hessenberg matrix

A square ${\displaystyle n\times n}$ matrix ${\displaystyle A}$ is said to be in upper Hessenberg form or to be an upper Hessenberg matrix if ${\displaystyle a_{i,j}=0}$ for all ${\displaystyle i,j}$ with ${\displaystyle i>j+1}$.

An upper Hessenberg matrix is called unreduced if all subdiagonal entries are nonzero, i.e. if ${\displaystyle a_{i+1,i}\neq 0}$ for all ${\displaystyle i\in \{1,\ldots ,n-1\}}$.[3]
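The two conditions above translate directly into code. The following pure-Python sketch (the helper names are illustrative, not from any library) checks them with 0-based indices, under which the condition ${\displaystyle i>j+1}$ reads the same:

```python
def is_upper_hessenberg(A):
    """True if A[i][j] == 0 whenever i > j + 1 (0-based indices)."""
    n = len(A)
    return all(A[i][j] == 0
               for i in range(n) for j in range(n) if i > j + 1)

def is_unreduced(A):
    """True if every subdiagonal entry A[i+1][i] is nonzero."""
    n = len(A)
    return all(A[i + 1][i] != 0 for i in range(n - 1))
```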

Lower Hessenberg matrix

A square ${\displaystyle n\times n}$ matrix ${\displaystyle A}$ is said to be in lower Hessenberg form or to be a lower Hessenberg matrix if its transpose ${\displaystyle A^{\mathsf {T}}}$ is an upper Hessenberg matrix or equivalently if ${\displaystyle a_{i,j}=0}$ for all ${\displaystyle i,j}$ with ${\displaystyle j>i+1}$.

A lower Hessenberg matrix is called unreduced if all superdiagonal entries are nonzero, i.e. if ${\displaystyle a_{i,i+1}\neq 0}$ for all ${\displaystyle i\in \{1,\ldots ,n-1\}}$.

Examples

Consider the following matrices.

${\displaystyle A={\begin{bmatrix}1&4&2&3\\3&4&1&7\\0&2&3&4\\0&0&1&3\\\end{bmatrix}}}$
${\displaystyle B={\begin{bmatrix}1&2&0&0\\5&2&3&0\\3&4&3&7\\5&6&1&1\\\end{bmatrix}}}$
${\displaystyle C={\begin{bmatrix}1&2&0&0\\5&2&0&0\\3&4&3&7\\5&6&1&1\\\end{bmatrix}}}$

The matrix ${\displaystyle A}$ is an unreduced upper Hessenberg matrix, ${\displaystyle B}$ is an unreduced lower Hessenberg matrix, and ${\displaystyle C}$ is a lower Hessenberg matrix that is not unreduced.
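These three claims can be verified mechanically against the definitions. The following self-contained sketch (helper names are illustrative, not from a library) checks each matrix with 0-based indices:

```python
def is_upper_hessenberg(M):
    """Zero below the first subdiagonal: M[i][j] == 0 for i > j + 1."""
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i > j + 1)

def is_lower_hessenberg(M):
    """Zero above the first superdiagonal: M[i][j] == 0 for j > i + 1."""
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if j > i + 1)

A = [[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]]
B = [[1, 2, 0, 0], [5, 2, 3, 0], [3, 4, 3, 7], [5, 6, 1, 1]]
C = [[1, 2, 0, 0], [5, 2, 0, 0], [3, 4, 3, 7], [5, 6, 1, 1]]

print(is_upper_hessenberg(A))   # True
print(is_lower_hessenberg(B))   # True
# C is lower Hessenberg, but its superdiagonal entry C[1][2] is zero,
# so it is not unreduced.
print(is_lower_hessenberg(C), C[1][2] != 0)   # True False
```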

Computer programming

Many linear algebra algorithms require significantly less computational effort when applied to triangular matrices, and this improvement often carries over to Hessenberg matrices as well. If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to a triangular one, reduction to Hessenberg form is often the next best thing. In fact, any matrix can be reduced to Hessenberg form in a finite number of steps, for example through Householder transformations, which are unitary similarity transforms. Subsequent reduction of a Hessenberg matrix to a triangular matrix requires iterative procedures such as shifted QR factorization; in eigenvalue algorithms, shifted QR steps are combined with deflation. Reducing a general matrix to Hessenberg form first, and only then iterating down to a triangular matrix, economizes the arithmetic involved in the QR algorithm for eigenvalue problems.
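The Householder reduction mentioned above can be sketched in a few dozen lines. This is a minimal pure-Python, real-valued textbook version, not a library routine (production code would call an optimized implementation such as LAPACK's Hessenberg reduction); the function name `hessenberg_reduce` is illustrative. At step ${\displaystyle k}$ it builds a Householder reflector that zeroes the entries below the subdiagonal in column ${\displaystyle k}$, and applies it from both sides so the result stays similar to the input:

```python
import math

def hessenberg_reduce(A):
    """Reduce a real square matrix to upper Hessenberg form by
    Householder similarity transforms (an O(n^3) textbook sketch)."""
    n = len(A)
    H = [row[:] for row in A]                 # work on a copy
    for k in range(n - 2):
        # Build a Householder vector v that zeroes H[k+2:][k].
        x = [H[i][k] for i in range(k + 1, n)]
        norm_x = math.sqrt(sum(t * t for t in x))
        if norm_x == 0.0:
            continue                          # column already reduced
        v = x[:]
        v[0] += math.copysign(norm_x, x[0])   # sign choice avoids cancellation
        norm_v = math.sqrt(sum(t * t for t in v))
        v = [t / norm_v for t in v]
        m = n - k - 1
        # Apply P = I - 2 v v^T from the left (rows k+1..n-1) ...
        for j in range(n):
            s = 2 * sum(v[i] * H[k + 1 + i][j] for i in range(m))
            for i in range(m):
                H[k + 1 + i][j] -= s * v[i]
        # ... and from the right (columns k+1..n-1): a similarity transform.
        for i in range(n):
            s = 2 * sum(H[i][k + 1 + j] * v[j] for j in range(m))
            for j in range(m):
                H[i][k + 1 + j] -= s * v[j]
    return H
```

Because each reflector is orthogonal, the result has the same eigenvalues (and in particular the same trace) as the input, with all entries below the first subdiagonal reduced to zero up to rounding error.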

Properties

For ${\displaystyle n\in \{1,2\}}$, it is vacuously true that every ${\displaystyle n\times n}$ matrix is both upper Hessenberg and lower Hessenberg, since no index pair ${\displaystyle (i,j)}$ with ${\displaystyle i>j+1}$ or ${\displaystyle j>i+1}$ exists in such a matrix.[4]

The product of a Hessenberg matrix with a triangular matrix is again Hessenberg. More precisely, if ${\displaystyle A}$ is upper Hessenberg and ${\displaystyle T}$ is upper triangular, then ${\displaystyle AT}$ and ${\displaystyle TA}$ are upper Hessenberg.
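This closure property follows from an index argument: for ${\displaystyle i>j+1}$, every term ${\displaystyle a_{ik}t_{kj}}$ in the product vanishes, because either ${\displaystyle k<i-1}$ (so ${\displaystyle a_{ik}=0}$) or ${\displaystyle k>j}$ (so ${\displaystyle t_{kj}=0}$). A quick numeric check, using the matrix ${\displaystyle A}$ from the Examples section and an arbitrary upper triangular matrix (helper names illustrative):

```python
def matmul(X, Y):
    """Naive matrix product."""
    n, m, p = len(X), len(Y), len(Y[0])
    return [[sum(X[i][k] * Y[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def is_upper_hessenberg(M):
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i > j + 1)

H = [[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]]  # upper Hessenberg
T = [[2, 1, 0, 5], [0, 3, 4, 1], [0, 0, 1, 2], [0, 0, 0, 6]]  # upper triangular

print(is_upper_hessenberg(matmul(H, T)))  # True
print(is_upper_hessenberg(matmul(T, H)))  # True
```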

A matrix that is both upper Hessenberg and lower Hessenberg is a tridiagonal matrix, of which symmetric or Hermitian Hessenberg matrices are important examples. A Hermitian matrix can be reduced to a tridiagonal real symmetric matrix.[5]

Hessenberg operator

The Hessenberg operator is an infinite-dimensional Hessenberg matrix. It commonly occurs as the generalization of the Jacobi operator to a system of orthogonal polynomials for the space of square-integrable holomorphic functions over some domain, that is, a Bergman space. In this case, the Hessenberg operator is the right-shift operator ${\displaystyle S}$, given by

${\displaystyle [Sf](z)=zf(z)}$.

The eigenvalues of each principal submatrix of the Hessenberg operator are given by the characteristic polynomial for that submatrix. These polynomials are called the Bergman polynomials, and provide an orthogonal polynomial basis for Bergman space.