
Hessenberg matrix

From Wikipedia, the free encyclopedia

In linear algebra, a Hessenberg matrix is a special kind of square matrix, one that is "almost" triangular. To be exact, an upper Hessenberg matrix has zero entries below the first subdiagonal, and a lower Hessenberg matrix has zero entries above the first superdiagonal.[1] They are named after Karl Hessenberg.[2]


Upper Hessenberg matrix

A square matrix $A = (a_{ij})$ is said to be in upper Hessenberg form or to be an upper Hessenberg matrix if $a_{ij} = 0$ for all $i, j$ with $i > j + 1$.

An upper Hessenberg matrix is called unreduced if all subdiagonal entries are nonzero, i.e. if $a_{i+1,i} \neq 0$ for all $i \in \{1, \dots, n-1\}$.[3]

Lower Hessenberg matrix

A square matrix $A = (a_{ij})$ is said to be in lower Hessenberg form or to be a lower Hessenberg matrix if its transpose is an upper Hessenberg matrix or, equivalently, if $a_{ij} = 0$ for all $i, j$ with $j > i + 1$.

A lower Hessenberg matrix is called unreduced if all superdiagonal entries are nonzero, i.e. if $a_{i,i+1} \neq 0$ for all $i \in \{1, \dots, n-1\}$.
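
Stated in code, the two definitions reduce to entrywise checks. The following Python sketch is illustrative only; the helper names are ours and not from any standard library:

    import numpy as np

    def is_upper_hessenberg(A):
        # Upper Hessenberg: a_ij = 0 whenever i > j + 1
        # (zeros below the first subdiagonal).
        n = A.shape[0]
        return all(A[i, j] == 0 for i in range(n) for j in range(n) if i > j + 1)

    def is_unreduced_upper_hessenberg(A):
        # Unreduced: additionally, every subdiagonal entry a_{i+1,i} is nonzero.
        n = A.shape[0]
        return is_upper_hessenberg(A) and all(A[i + 1, i] != 0 for i in range(n - 1))

Since a matrix is (unreduced) lower Hessenberg exactly when its transpose is (unreduced) upper Hessenberg, the same helpers cover the lower case via A.T.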


Examples

Consider the following matrices:

$$A = \begin{bmatrix} 1 & 4 & 2 & 3 \\ 3 & 4 & 1 & 7 \\ 0 & 2 & 3 & 4 \\ 0 & 0 & 1 & 3 \end{bmatrix}, \qquad B = \begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 3 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 0 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}.$$

The matrix $A$ is an unreduced upper Hessenberg matrix, $B$ is an unreduced lower Hessenberg matrix, and $C$ is a lower Hessenberg matrix that is not unreduced (its superdiagonal entry $c_{23}$ is zero).
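
With the hypothetical helpers sketched above, these classifications can be verified directly:

    A = np.array([[1, 4, 2, 3], [3, 4, 1, 7], [0, 2, 3, 4], [0, 0, 1, 3]])
    B = np.array([[1, 2, 0, 0], [5, 2, 3, 0], [3, 4, 3, 7], [5, 6, 1, 1]])
    C = np.array([[1, 2, 0, 0], [5, 2, 0, 0], [3, 4, 3, 7], [5, 6, 1, 1]])

    print(is_unreduced_upper_hessenberg(A))                               # True
    print(is_upper_hessenberg(B.T), is_unreduced_upper_hessenberg(B.T))   # True True
    print(is_upper_hessenberg(C.T), is_unreduced_upper_hessenberg(C.T))   # True False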

Computer programming

Many linear algebra algorithms require significantly less computational effort when applied to triangular matrices, and this improvement often carries over to Hessenberg matrices as well. If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to a triangular one, reduction to Hessenberg form is often the next best thing. In fact, reduction of any matrix to Hessenberg form can be achieved in a finite number of steps, for example through Householder transformations, which are unitary similarity transforms. Subsequent reduction of a Hessenberg matrix to a triangular matrix can then be achieved through iterative procedures such as shifted QR factorization, combined with deflation steps in eigenvalue algorithms. Reducing a general matrix to a Hessenberg matrix and then reducing it further to a triangular matrix, instead of reducing the general matrix directly to a triangular one, often economizes the arithmetic involved in the QR algorithm for eigenvalue problems.
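
As an illustration of the finite, direct character of this first step, here is a minimal Python sketch of Householder reduction to upper Hessenberg form. The function name hessenberg_by_householder is ours, and the sketch assumes a real input matrix; production code would instead call scipy.linalg.hessenberg, which wraps the corresponding LAPACK routine.

    import numpy as np

    def hessenberg_by_householder(A):
        # Reduce a real square matrix to upper Hessenberg form by a
        # sequence of Householder similarity transforms.
        H = np.array(A, dtype=float)
        n = H.shape[0]
        for k in range(n - 2):
            # Build a Householder vector v that reflects the column
            # segment H[k+1:, k] onto a multiple of the first unit vector.
            x = H[k + 1:, k]
            alpha = -np.copysign(np.linalg.norm(x), x[0])
            v = x.copy()
            v[0] -= alpha
            vnorm = np.linalg.norm(v)
            if vnorm == 0:  # column already has the required zeros
                continue
            v /= vnorm
            # Two-sided application of P = I - 2 v v^T is a similarity
            # transform: it preserves eigenvalues and zeroes H[k+2:, k].
            H[k + 1:, k:] -= 2.0 * np.outer(v, v @ H[k + 1:, k:])
            H[:, k + 1:] -= 2.0 * np.outer(H[:, k + 1:] @ v, v)
        return H

Each pass applies one reflection from both sides, so exactly $n - 2$ passes suffice, in contrast to the iterative shifted QR phase that follows.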


Properties

For $n \in \{1, 2\}$, it is vacuously true that every $n \times n$ matrix is both upper Hessenberg and lower Hessenberg.[4]

The product of a Hessenberg matrix with a triangular matrix is again Hessenberg. More precisely, if $A$ is upper Hessenberg and $T$ is upper triangular, then $AT$ and $TA$ are upper Hessenberg.
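
To see why, consider the $(i, j)$ entry of the product:

$$(AT)_{ij} = \sum_{k} a_{ik}\, t_{kj}.$$

A term can be nonzero only if $k \ge i - 1$ (otherwise $a_{ik} = 0$, as $A$ is upper Hessenberg) and $k \le j$ (otherwise $t_{kj} = 0$, as $T$ is upper triangular). Both conditions are compatible only when $i \le j + 1$, so $(AT)_{ij} = 0$ whenever $i > j + 1$; the argument for $TA$ is analogous.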

A matrix that is both upper Hessenberg and lower Hessenberg is a tridiagonal matrix, of which symmetric and Hermitian Hessenberg matrices are important examples. A Hermitian matrix can be reduced to a tridiagonal real symmetric matrix.[5]
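
The last step can be seen with a short phase argument (a standard computation, sketched here): if $T$ is Hermitian and tridiagonal with subdiagonal entries $t_{j+1,j} = r_j e^{i\theta_j}$, conjugate by the unitary diagonal matrix $D = \operatorname{diag}(e^{i\phi_1}, \ldots, e^{i\phi_n})$ with $\phi_1 = 0$ and $\phi_{j+1} = \phi_j + \theta_j$. Then

$$(D^* T D)_{j+1,j} = e^{-i\phi_{j+1}}\, t_{j+1,j}\, e^{i\phi_j} = r_j \ge 0,$$

while the diagonal entries, already real by Hermiticity, are unchanged, so $D^* T D$ is real, symmetric, and tridiagonal.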

Hessenberg operator

The Hessenberg operator is an infinite-dimensional Hessenberg matrix. It commonly occurs as the generalization of the Jacobi operator to a system of orthogonal polynomials for the space of square-integrable holomorphic functions over some domain, that is, a Bergman space. In this case, the Hessenberg operator is the right-shift operator $S$, given by

$$[Sf](z) = z f(z).$$
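
To see why this operator has Hessenberg form (a standard computation, sketched here), expand it against an orthonormal polynomial basis $\{p_n\}$ of the Bergman space:

$$z\, p_n(z) = \sum_{m=0}^{n+1} h_{mn}\, p_m(z), \qquad h_{mn} = \langle z\, p_n, p_m \rangle.$$

Since $z\, p_n$ is a polynomial of degree $n + 1$, it is orthogonal to every $p_m$ with $m > n + 1$, so $h_{mn} = 0$ below the first subdiagonal and the matrix $(h_{mn})$ is upper Hessenberg.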
The eigenvalues of each principal submatrix of the Hessenberg operator are given by the characteristic polynomial for that submatrix. These polynomials are called the Bergman polynomials, and provide an orthogonal polynomial basis for the Bergman space.

References


  1. ^ Horn & Johnson (1985), p. 28; Stoer & Bulirsch (2002), p. 251.
  2. ^ Biswa Nath Datta (2010), Numerical Linear Algebra and Applications, 2nd ed., Society for Industrial and Applied Mathematics (SIAM), ISBN 978-0-89871-685-6, p. 307.
  3. ^ Horn & Johnson (1985), p. 35.
  4. ^
  5. ^ "Computational Routines (eigenvalues) in LAPACK". Retrieved 2020-05-24.

