Block matrix pseudoinverse

From Wikipedia, the free encyclopedia

In mathematics, a block matrix pseudoinverse is a formula for the pseudoinverse of a partitioned matrix. This is useful for decomposing or approximating many algorithms that update parameters in signal processing and are based on the least squares method.

Derivation

Consider a column-wise partitioned matrix:

$$[\mathbf{A}, \mathbf{B}], \qquad \mathbf{A} \in \mathbb{R}^{m \times n}, \quad \mathbf{B} \in \mathbb{R}^{m \times p}, \quad m \ge n + p.$$

If the above matrix is of full column rank, the Moore–Penrose inverses of it and of its transpose are

$$[\mathbf{A}, \mathbf{B}]^+ = \left([\mathbf{A}, \mathbf{B}]^\mathsf{T} [\mathbf{A}, \mathbf{B}]\right)^{-1} [\mathbf{A}, \mathbf{B}]^\mathsf{T}, \qquad \begin{bmatrix} \mathbf{A}^\mathsf{T} \\ \mathbf{B}^\mathsf{T} \end{bmatrix}^+ = [\mathbf{A}, \mathbf{B}] \left([\mathbf{A}, \mathbf{B}]^\mathsf{T} [\mathbf{A}, \mathbf{B}]\right)^{-1}.$$

This computation of the pseudoinverse requires the inversion of an (n + p)-square matrix and does not take advantage of the block form.

To reduce computational costs to n- and p-square matrix inversions and to introduce parallelism, treating the blocks separately, one derives [1]

$$[\mathbf{A}, \mathbf{B}]^+ = \begin{bmatrix} (\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1} \mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \\ (\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1} \mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \end{bmatrix}, \qquad \begin{bmatrix} \mathbf{A}^\mathsf{T} \\ \mathbf{B}^\mathsf{T} \end{bmatrix}^+ = \left[\, \mathbf{P}_B^\perp \mathbf{A} (\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1}, \;\; \mathbf{P}_A^\perp \mathbf{B} (\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1} \,\right],$$

where the orthogonal projection matrices are defined by

$$\mathbf{P}_A^\perp = \mathbf{I} - \mathbf{A} (\mathbf{A}^\mathsf{T} \mathbf{A})^{-1} \mathbf{A}^\mathsf{T}, \qquad \mathbf{P}_B^\perp = \mathbf{I} - \mathbf{B} (\mathbf{B}^\mathsf{T} \mathbf{B})^{-1} \mathbf{B}^\mathsf{T}.$$

The above formulas are not necessarily valid if $[\mathbf{A}, \mathbf{B}]$ does not have full rank – for example, if $\mathbf{A} \neq \mathbf{0}$, then

$$[\mathbf{A}, \mathbf{A}]^+ = \frac{1}{2} \begin{bmatrix} \mathbf{A}^+ \\ \mathbf{A}^+ \end{bmatrix} \neq \begin{bmatrix} \mathbf{A}^+ \\ \mathbf{0} \end{bmatrix}.$$

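The block formula is straightforward to verify numerically. The following is a minimal NumPy sketch (not part of the article; the random test matrices and variable names are illustrative assumptions): it assembles $[\mathbf{A}, \mathbf{B}]^+$ from the projection formulas above and compares the result with the directly computed Moore–Penrose inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 12, 4, 3                  # m >= n + p, so [A, B] generically has full column rank
A = rng.standard_normal((m, n))     # illustrative random blocks, not from the article
B = rng.standard_normal((m, p))
M = np.hstack([A, B])               # the partitioned matrix [A, B]

I = np.eye(m)
P_A_perp = I - A @ np.linalg.inv(A.T @ A) @ A.T   # projector onto the orthogonal complement of range(A)
P_B_perp = I - B @ np.linalg.inv(B.T @ B) @ B.T   # projector onto the orthogonal complement of range(B)

# Block formula: [A, B]^+ built from one n-square and one p-square inversion.
top = np.linalg.inv(A.T @ P_B_perp @ A) @ A.T @ P_B_perp
bot = np.linalg.inv(B.T @ P_A_perp @ B) @ B.T @ P_A_perp
M_pinv_block = np.vstack([top, bot])

# Agreement with the direct pseudoinverse of the full matrix.
assert np.allclose(M_pinv_block, np.linalg.pinv(M))
```
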
Application to least squares problems

Given the same matrices as above, we consider the following least squares problems, which appear as multiple-objective optimizations or constrained problems in signal processing. Finally, we can implement a parallel algorithm for least squares based on the following results.

Column-wise partitioning in over-determined least squares

Suppose a solution $\mathbf{x} = \begin{bmatrix} \mathbf{x}_1 \\ \mathbf{x}_2 \end{bmatrix}$ solves the over-determined system

$$[\mathbf{A}, \mathbf{B}] \begin{bmatrix} \mathbf{x}_1 \\ \mathbf{x}_2 \end{bmatrix} = \mathbf{d}, \qquad \mathbf{d} \in \mathbb{R}^{m \times 1}.$$

Using the block matrix pseudoinverse, we have

$$\mathbf{x} = [\mathbf{A}, \mathbf{B}]^+ \, \mathbf{d} = \begin{bmatrix} (\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1} \mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \\ (\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1} \mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \end{bmatrix} \mathbf{d}.$$

Therefore, we have a decomposed solution:

$$\mathbf{x}_1 = (\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1} \mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \, \mathbf{d}, \qquad \mathbf{x}_2 = (\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1} \mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \, \mathbf{d}.$$

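As a sketch of how the decomposition can be used in code (illustrative only; the data and names are assumptions, not from the article), the two sub-solutions can be computed independently and then checked against an ordinary least-squares solve of the full system:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, p = 20, 5, 3
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, p))
d = rng.standard_normal(m)

I = np.eye(m)
P_A_perp = I - A @ np.linalg.inv(A.T @ A) @ A.T
P_B_perp = I - B @ np.linalg.inv(B.T @ B) @ B.T

# Decomposed solution: each block reduces to an independent n- or p-square solve.
x1 = np.linalg.solve(A.T @ P_B_perp @ A, A.T @ P_B_perp @ d)
x2 = np.linalg.solve(B.T @ P_A_perp @ B, B.T @ P_A_perp @ d)

# Reference: standard least squares on the full partitioned matrix.
x_ref, *_ = np.linalg.lstsq(np.hstack([A, B]), d, rcond=None)
assert np.allclose(np.concatenate([x1, x2]), x_ref)
```
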
Row-wise partitioning in under-determined least squares

Suppose a solution $\mathbf{x}$ solves the under-determined system

$$\begin{bmatrix} \mathbf{A}^\mathsf{T} \\ \mathbf{B}^\mathsf{T} \end{bmatrix} \mathbf{x} = \begin{bmatrix} \mathbf{e} \\ \mathbf{f} \end{bmatrix}, \qquad \mathbf{e} \in \mathbb{R}^{n \times 1}, \quad \mathbf{f} \in \mathbb{R}^{p \times 1}.$$

The minimum-norm solution is given by

$$\mathbf{x} = \begin{bmatrix} \mathbf{A}^\mathsf{T} \\ \mathbf{B}^\mathsf{T} \end{bmatrix}^+ \begin{bmatrix} \mathbf{e} \\ \mathbf{f} \end{bmatrix}.$$

Using the block matrix pseudoinverse, we have

$$\mathbf{x} = \left[\, \mathbf{P}_B^\perp \mathbf{A} (\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1}, \;\; \mathbf{P}_A^\perp \mathbf{B} (\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1} \,\right] \begin{bmatrix} \mathbf{e} \\ \mathbf{f} \end{bmatrix} = \mathbf{P}_B^\perp \mathbf{A} (\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1} \mathbf{e} + \mathbf{P}_A^\perp \mathbf{B} (\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1} \mathbf{f}.$$

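A minimal sketch along the same lines (assuming random full-rank blocks; not code from the article) assembles the minimum-norm solution from the two independent terms and compares it with the pseudoinverse of the stacked system:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, p = 12, 4, 3                   # n + p equations in m unknowns
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, p))
e = rng.standard_normal(n)
f = rng.standard_normal(p)

I = np.eye(m)
P_A_perp = I - A @ np.linalg.inv(A.T @ A) @ A.T
P_B_perp = I - B @ np.linalg.inv(B.T @ B) @ B.T

# Minimum-norm solution assembled from the two independent terms.
x = (P_B_perp @ A @ np.linalg.solve(A.T @ P_B_perp @ A, e)
     + P_A_perp @ B @ np.linalg.solve(B.T @ P_A_perp @ B, f))

# Reference: direct pseudoinverse of the stacked (under-determined) system.
M = np.vstack([A.T, B.T])            # (n + p) x m
x_ref = np.linalg.pinv(M) @ np.concatenate([e, f])
assert np.allclose(x, x_ref)
```
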
Comments on matrix inversion

Instead of $\left([\mathbf{A}, \mathbf{B}]^\mathsf{T} [\mathbf{A}, \mathbf{B}]\right)^{-1}$, we need to calculate directly or indirectly[citation needed][original research?]

$$(\mathbf{A}^\mathsf{T} \mathbf{A})^{-1}, \quad (\mathbf{B}^\mathsf{T} \mathbf{B})^{-1}, \quad (\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1}, \quad (\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1}.$$

In a dense and small system, we can use singular value decomposition, QR decomposition, or Cholesky decomposition to replace the matrix inversions with numerical routines. In a large system, we may employ iterative methods such as Krylov subspace methods.

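For instance, the explicit inverses can be avoided by factoring the symmetric positive definite Gram matrices. The sketch below (an illustration under these assumptions, using SciPy's Cholesky routines; it is not code from the article) applies $\mathbf{P}_B^\perp$ and solves for $\mathbf{x}_1$ without ever forming $(\mathbf{B}^\mathsf{T} \mathbf{B})^{-1}$ explicitly.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(3)
m, n, p = 200, 30, 20                # illustrative sizes
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, p))
d = rng.standard_normal(m)

# Apply P_B_perp = I - B (B^T B)^{-1} B^T without forming the inverse:
# the inner inverse becomes a Cholesky solve with the SPD matrix B^T B.
cB = cho_factor(B.T @ B)
def apply_P_B_perp(v):
    return v - B @ cho_solve(cB, B.T @ v)

# x1 from the decomposed over-determined formula, again via a Cholesky solve.
PBd = apply_P_B_perp(d)
PBA = apply_P_B_perp(A)              # also works column-wise when v is a matrix
x1 = cho_solve(cho_factor(A.T @ PBA), A.T @ PBd)
```
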
Considering parallel algorithms, we can compute $(\mathbf{A}^\mathsf{T} \mathbf{A})^{-1}$ and $(\mathbf{B}^\mathsf{T} \mathbf{B})^{-1}$ in parallel, and then compute $(\mathbf{A}^\mathsf{T} \mathbf{P}_B^\perp \mathbf{A})^{-1}$ and $(\mathbf{B}^\mathsf{T} \mathbf{P}_A^\perp \mathbf{B})^{-1}$, also in parallel.

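The two-stage parallel structure might be sketched as follows (the thread pool and helper function are illustrative assumptions, not a prescription from the article; with a multithreaded BLAS, each stage can also be parallelized internally):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(4)
m, n, p = 500, 60, 40
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, p))

def gram_inv(M):
    # (M^T M)^{-1}; in practice one would keep a factorization instead of the inverse.
    return np.linalg.inv(M.T @ M)

with ThreadPoolExecutor(max_workers=2) as pool:
    # Stage 1: the two Gram-matrix inversions are independent.
    AtA_inv, BtB_inv = pool.map(gram_inv, (A, B))

    # Stage 2: the projected Gram-matrix inversions are again independent of each other.
    P_A_perp = np.eye(m) - A @ AtA_inv @ A.T
    P_B_perp = np.eye(m) - B @ BtB_inv @ B.T
    f1 = pool.submit(np.linalg.inv, A.T @ P_B_perp @ A)
    f2 = pool.submit(np.linalg.inv, B.T @ P_A_perp @ B)
    inv1, inv2 = f1.result(), f2.result()
```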

References

  1. ^ J.K. Baksalary and O.M. Baksalary (2007). "Particular formulae for the Moore–Penrose inverse of a columnwise partitioned matrix". Linear Algebra Appl. 421: 16–23. doi:10.1016/j.laa.2006.03.031.
