Rank factorization


In mathematics, given a field $\mathbb{F}$, nonnegative integers $m, n$, and a matrix $A \in \mathbb{F}^{m \times n}$, a rank decomposition or rank factorization of $A$ is a factorization of $A$ of the form $A = CF$, where $C \in \mathbb{F}^{m \times r}$ and $F \in \mathbb{F}^{r \times n}$, where $r$ is the rank of $A$.


Existence

Every finite-dimensional matrix has a rank decomposition: Let $A$ be an $m \times n$ matrix whose column rank is $r$. Therefore, there are $r$ linearly independent columns in $A$; equivalently, the dimension of the column space of $A$ is $r$. Let $c_1, c_2, \ldots, c_r$ be any basis for the column space of $A$ and place them as column vectors to form the $m \times r$ matrix $C = \begin{pmatrix} c_1 & c_2 & \cdots & c_r \end{pmatrix}$. Therefore, every column vector of $A$ is a linear combination of the columns of $C$. To be precise, if $A = \begin{pmatrix} a_1 & a_2 & \cdots & a_n \end{pmatrix}$ is an $m \times n$ matrix with $a_j$ as the $j$-th column, then

$$a_j = f_{1j} c_1 + f_{2j} c_2 + \cdots + f_{rj} c_r,$$

where the $f_{ij}$'s are the scalar coefficients of $a_j$ in terms of the basis $c_1, c_2, \ldots, c_r$. This implies that $A = CF$, where $f_{ij}$ is the $(i,j)$-th element of $F$.
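This construction translates almost line by line into a small SymPy sketch. The matrix $A$ below is only an illustrative value, and recovering the coefficients via the normal equations $(C^\mathrm{T} C)^{-1} C^\mathrm{T} A$ is one of several equivalent ways to read off $F$, not part of the statement above:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])            # illustrative rank-2 matrix

basis = A.columnspace()               # c_1, ..., c_r: a basis of the column space of A
C = sp.Matrix.hstack(*basis)          # m x r matrix with the basis as columns

# Each column a_j of A satisfies a_j = C f_j; since C has full column rank,
# the coefficients are f_j = (C^T C)^{-1} C^T a_j, collected here into F.
F = (C.T * C).inv() * C.T * A

assert C * F == A                     # A = CF
assert C.cols == A.rank()             # C has exactly r = rank(A) columns
```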

Non-uniqueness

If $A = C_1 F_1$ is a rank factorization, taking $C_2 = C_1 R$ and $F_2 = R^{-1} F_1$ gives another rank factorization for any invertible matrix $R$ of compatible dimensions.

Conversely, if $A = C_1 F_1 = C_2 F_2$ are two rank factorizations of $A$, then there exists an invertible matrix $R$ such that $C_1 = C_2 R$ and $F_1 = R^{-1} F_2$.[1]
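A quick numerical illustration of this non-uniqueness, using NumPy and made-up values for $C_1$, $F_1$, and $R$:

```python
import numpy as np

# Illustrative values: a rank-1 factorization of a 2 x 3 matrix.
C1 = np.array([[1.],
               [2.]])
F1 = np.array([[3., 0., 1.]])
A = C1 @ F1

R = np.array([[5.]])                  # any invertible r x r matrix (here r = 1)
C2 = C1 @ R
F2 = np.linalg.inv(R) @ F1

assert np.allclose(C2 @ F2, A)        # (C1 R)(R^{-1} F1) is again a rank factorization of A
```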

Construction

Rank factorization from reduced row echelon forms

In practice, we can construct one specific rank factorization as follows: we can compute $B$, the reduced row echelon form of $A$. Then $C$ is obtained by removing from $A$ all non-pivot columns (which can be determined by looking for columns in $B$ which do not contain a pivot), and $F$ is obtained by eliminating any all-zero rows of $B$.

Note: For a full-rank square matrix (i.e. when $n = m = r$), this procedure will yield the trivial result $C = A$ and $F = B = I_n$ (the $n \times n$ identity matrix).
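The procedure can be sketched in a few lines of SymPy; the helper name rank_factorization is illustrative only, and the test matrix is the one used in the example below:

```python
import sympy as sp

def rank_factorization(A: sp.Matrix):
    """Return (C, F) with A = C*F, following the rref-based construction above."""
    B, pivots = A.rref()                               # reduced row echelon form and pivot column indices
    r = len(pivots)                                    # r = rank(A)
    C = A.extract(list(range(A.rows)), list(pivots))   # keep only the pivot columns of A
    F = B[:r, :]                                       # drop the all-zero rows of B
    return C, F

A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])
C, F = rank_factorization(A)
assert C * F == A
```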

Example

Consider the matrix

$$A = \begin{pmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{pmatrix} \sim \begin{pmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix} = B,$$

where $B$ is in reduced row echelon form.

Then $C$ is obtained by removing the third column of $A$, the only one which is not a pivot column, and $F$ by getting rid of the last row of zeroes from $B$, so

$$C = \begin{pmatrix} 1 & 3 & 4 \\ 2 & 7 & 9 \\ 1 & 5 & 1 \\ 1 & 2 & 8 \end{pmatrix}, \qquad F = \begin{pmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$

It is straightforward to check that

$$A = \begin{pmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{pmatrix} = \begin{pmatrix} 1 & 3 & 4 \\ 2 & 7 & 9 \\ 1 & 5 & 1 \\ 1 & 2 & 8 \end{pmatrix} \begin{pmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} = CF.$$

Proof

Let $P$ be an $n \times n$ permutation matrix such that $AP = \begin{pmatrix} C & D \end{pmatrix}$ in block partitioned form, where the columns of $C$ are the $r$ pivot columns of $A$. Every column of $D$ is a linear combination of the columns of $C$, so there is a matrix $G$ such that $D = CG$, where the columns of $G$ contain the coefficients of each of those linear combinations. So $AP = \begin{pmatrix} C & CG \end{pmatrix} = C \begin{pmatrix} I_r & G \end{pmatrix}$, $I_r$ being the $r \times r$ identity matrix. We will show now that $FP = \begin{pmatrix} I_r & G \end{pmatrix}$.

Transforming $A$ into its reduced row echelon form $B$ amounts to left-multiplying by a matrix $E$ which is a product of elementary matrices, so $EAP = BP = EC \begin{pmatrix} I_r & G \end{pmatrix}$, where $EC = \begin{pmatrix} I_r \\ 0 \end{pmatrix}$. We then can write $BP = \begin{pmatrix} I_r & G \\ 0 & 0 \end{pmatrix}$, which allows us to identify $FP = \begin{pmatrix} I_r & G \end{pmatrix}$, i.e. the nonzero $r$ rows of the reduced echelon form, with the same permutation on the columns as we did for $A$. We thus have $AP = CFP$, and since $P$ is invertible this implies $A = CF$, and the proof is complete.
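For readers who want to trace the proof on a concrete matrix, here is a SymPy sketch that builds the permutation $P$ and the blocks $C$, $D$, $G$, and checks the identities used above. The test matrix is the one from the example; the variable names simply mirror the proof:

```python
import sympy as sp

A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])
B, pivots = A.rref()                  # reduced row echelon form; pivots = (0, 1, 3)
r = len(pivots)

# Permutation P moving the pivot columns of A to the front, so that AP = (C, D).
order = list(pivots) + [j for j in range(A.cols) if j not in pivots]
P = sp.zeros(A.cols, A.cols)
for new, old in enumerate(order):
    P[old, new] = 1

AP = A * P
C, D = AP[:, :r], AP[:, r:]

# FP = (I_r, G): the nonzero rows of the rref, with the same column permutation.
F = B[:r, :]
FP = F * P
assert FP[:, :r] == sp.eye(r)         # identity block
assert C * FP[:, r:] == D             # D = C G
assert A == C * F                     # the factorization itself
```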

Singular value decomposition

If $\mathbb{F} \in \{\mathbb{R}, \mathbb{C}\}$, then one can also construct a full-rank factorization of $A$ via a singular value decomposition

$$A = U \Sigma V^* = \begin{pmatrix} U_1 & U_2 \end{pmatrix} \begin{pmatrix} \Sigma_r & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} V_1^* \\ V_2^* \end{pmatrix} = U_1 \Sigma_r V_1^*.$$

Since $U_1$ is a full-column-rank matrix and $\Sigma_r V_1^*$ is a full-row-rank matrix, we can take $C = U_1$ and $F = \Sigma_r V_1^*$.
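A minimal NumPy sketch of this construction; the singular-value cutoff used to estimate the numerical rank $r$ is an implementation choice, not part of the statement above:

```python
import numpy as np

A = np.array([[1., 3., 1., 4.],
              [2., 7., 3., 9.],
              [1., 5., 3., 1.],
              [1., 2., 0., 8.]])      # illustrative value, rank 3

U, s, Vh = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s > 1e-10 * s[0]))     # numerical rank via a relative singular-value cutoff

C = U[:, :r]                          # U_1: full column rank
F = s[:r, None] * Vh[:r, :]           # Sigma_r V_1^*: full row rank

assert np.allclose(C @ F, A)
```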

Consequences

rank(A) = rank(A^T)

An immediate consequence of rank factorization is that the rank of $A$ is equal to the rank of its transpose $A^\mathrm{T}$. Since the columns of $A$ are the rows of $A^\mathrm{T}$, the column rank of $A$ equals its row rank.[2]

Proof: To see why this is true, let us first define rank to mean column rank. Since $A = CF$, it follows that $A^\mathrm{T} = F^\mathrm{T} C^\mathrm{T}$. From the definition of matrix multiplication, this means that each column of $A^\mathrm{T}$ is a linear combination of the columns of $F^\mathrm{T}$. Therefore, the column space of $A^\mathrm{T}$ is contained within the column space of $F^\mathrm{T}$ and, hence, $\operatorname{rank}(A^\mathrm{T}) \leq \operatorname{rank}(F^\mathrm{T})$.

Now, $F^\mathrm{T}$ is $n \times r$, so there are $r$ columns in $F^\mathrm{T}$ and, hence, $\operatorname{rank}(A^\mathrm{T}) \leq r = \operatorname{rank}(A)$. This proves that $\operatorname{rank}(A^\mathrm{T}) \leq \operatorname{rank}(A)$.

Now apply the result to $A^\mathrm{T}$ to obtain the reverse inequality: since $(A^\mathrm{T})^\mathrm{T} = A$, we can write $\operatorname{rank}(A) = \operatorname{rank}((A^\mathrm{T})^\mathrm{T}) \leq \operatorname{rank}(A^\mathrm{T})$. This proves $\operatorname{rank}(A) \leq \operatorname{rank}(A^\mathrm{T})$.

We have, therefore, proved $\operatorname{rank}(A^\mathrm{T}) \leq \operatorname{rank}(A)$ and $\operatorname{rank}(A) \leq \operatorname{rank}(A^\mathrm{T})$, so $\operatorname{rank}(A) = \operatorname{rank}(A^\mathrm{T})$.
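A trivial numerical spot-check of this consequence, with an illustrative matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 1., 1.]])          # illustrative rank-2 matrix
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 2
```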

Notes

  1. ^ Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882. JSTOR 2690882.
  2. ^ Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388

References

  • Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388
  • Lay, David C. (2005), Linear Algebra and its Applications (3rd ed.), Addison Wesley, ISBN 978-0-201-70970-4
  • Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations, Johns Hopkins Studies in Mathematical Sciences (3rd ed.), The Johns Hopkins University Press, ISBN 978-0-8018-5414-9
  • Stewart, Gilbert W. (1998), Matrix Algorithms. I. Basic Decompositions, SIAM, ISBN 978-0-89871-414-2
  • Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882. JSTOR 2690882.