
Real rank (C*-algebras)

In mathematics, the real rank of a C*-algebra is a noncommutative analogue of Lebesgue covering dimension. The notion was first introduced by Lawrence G. Brown and Gert K. Pedersen.[1]


Definition

The real rank of a unital C*-algebra A, denoted RR(A), is the smallest non-negative integer n such that for every (n + 1)-tuple (x0, x1, ..., xn) of self-adjoint elements of A and every ε > 0, there exists an (n + 1)-tuple (y0, y1, ..., yn) of self-adjoint elements of A such that y0² + y1² + ... + yn² is invertible and ‖(x0 − y0)² + (x1 − y1)² + ... + (xn − yn)²‖ < ε. If no such integer exists, then the real rank of A is infinite. The real rank of a non-unital C*-algebra is defined to be the real rank of its unitalization.
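The case n = 0 says that RR(A) = 0 exactly when every self-adjoint element of A can be approximated arbitrarily well by invertible self-adjoint elements. As an illustration not drawn from the article, the following LaTeX sketch checks this condition for the scalars A = ℂ, whose self-adjoint elements are the real numbers:

    % Worked check that RR(C) = 0: any real x is approximated within
    % epsilon by an invertible (i.e. nonzero) real y.
    \[
      y =
      \begin{cases}
        x, & x \neq 0, \\
        \sqrt{\varepsilon/2}, & x = 0,
      \end{cases}
      \qquad\text{so } y^{2} > 0 \text{ is invertible and }
      \bigl\| (x - y)^{2} \bigr\| \le \varepsilon/2 < \varepsilon .
    \]

Consistently, the commutative formula in the next section gives RR(C(X)) = dim(X) = 0 when X is a one-point space.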

Comparisons with dimension

If X is a locally compact Hausdorff space, then RR(C0(X)) = dim(X), where dim is the Lebesgue covering dimension of X. As a result, real rank is regarded as a noncommutative generalization of dimension, but it can behave quite differently from dimension. For example, most noncommutative tori have real rank zero, despite being noncommutative versions of the two-dimensional torus. For locally compact Hausdorff spaces, being zero-dimensional is equivalent to being totally disconnected; the analogous relationship fails for C*-algebras: while AF-algebras have real rank zero, the converse is false. Formulas that hold for dimension need not generalize to real rank. For example, Brown and Pedersen conjectured that RR(A ⊗ B) ≤ RR(A) + RR(B), by analogy with the fact that dim(X × Y) ≤ dim(X) + dim(Y). They proved a special case: if A is AF and B has real rank zero, then A ⊗ B has real rank zero. In general, however, the conjecture is false: there are C*-algebras A and B with real rank zero such that A ⊗ B has real rank greater than zero.[2]
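To see the commutative formula force positive real rank in the simplest case, take X = [0, 1]. The following sketch (an illustration, not part of the original text) shows that invertible self-adjoint elements are not dense in C[0, 1], so RR(C[0, 1]) ≥ 1 = dim([0, 1]):

    % A sign-changing function cannot be uniformly approximated by
    % invertible (i.e. nowhere-vanishing) real-valued functions.
    \[
      f(t) = 2t - 1 \in C[0,1]_{\mathrm{sa}}, \qquad f(0) = -1, \quad f(1) = 1 .
    \]
    % If ||f - g|| < 1, then g(0) < 0 < g(1), so by the intermediate
    % value theorem g vanishes somewhere and is not invertible.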

Real rank zero

C*-algebras with real rank zero are of particular interest. By definition, a unital C*-algebra A has real rank zero if and only if the invertible self-adjoint elements of A are dense in the self-adjoint elements of A. This condition is equivalent to the following previously studied conditions:

  • (FS) the self-adjoint elements of A with finite spectrum are dense in the self-adjoint elements of A;
  • (HP) every hereditary C*-subalgebra of A has an approximate identity consisting of projections.
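For a concrete commutative instance of condition (FS) above, consider C(K) for the Cantor set K; the sketch below is illustrative and not taken from the article:

    % K is totally disconnected: partition K into clopen sets U_1, ..., U_m
    % on which a given self-adjoint f in C(K) varies by less than epsilon.
    \[
      g = \sum_{j=1}^{m} c_j \,\chi_{U_j}, \qquad c_j \in \mathbb{R},
      \qquad \| f - g \|_{\infty} < \varepsilon .
    \]
    % Each chi_{U_j} is continuous because U_j is clopen; g has finite
    % spectrum {c_1, ..., c_m}, so (FS) holds and RR(C(K)) = 0 = dim(K).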

This equivalence can be used to give many examples of C*-algebras with real rank zero, including AW*-algebras, Bunce–Deddens algebras,[3] and von Neumann algebras. Moreover, simple unital purely infinite C*-algebras have real rank zero, including the Cuntz algebras and Cuntz–Krieger algebras. Since simple graph C*-algebras are either AF or purely infinite, every simple graph C*-algebra has real rank zero.

Having real rank zero is a property preserved under taking direct limits, passing to hereditary C*-subalgebras, and strong Morita equivalence. In particular, if A has real rank zero, then Mn(A), the algebra of n × n matrices over A, has real rank zero for any integer n ≥ 1.
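The matrix-algebra consequence can be made explicit: Mn(A) is strongly Morita equivalent to A, with the equivalence implemented by the standard bimodule. In symbols (an illustrative note, not from the article):

    % A^n is an M_n(A)-A imprimitivity bimodule, witnessing strong Morita
    % equivalence; preservation of real rank zero under this equivalence
    % then gives RR(M_n(A)) = 0 whenever RR(A) = 0.
    \[
      M_n(A) \cong M_n(\mathbb{C}) \otimes A, \qquad
      A \;\sim_{\mathrm{Morita}}\; M_n(A) \quad \text{via } A^{n} .
    \]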

References

  1. ^ Brown, Lawrence G.; Pedersen, Gert K. (July 1991). "C*-algebras of real rank zero". Journal of Functional Analysis. 99 (1): 131–149. doi:10.1016/0022-1236(91)90056-B. Zbl 0776.46026.
  2. ^ Kodaka, Kazunori; Osaka, Hiroyuki (July 1995). "Real Rank of Tensor Products of C*-algebras". Proceedings of the American Mathematical Society. 123 (7): 2213–2215. doi:10.1090/S0002-9939-1995-1264820-4. Zbl 0835.46053.
  3. ^ Blackadar, Bruce; Kumjian, Alexander (March 1985). "Skew Products of Relations and the Structure of Simple C*-Algebras". Mathematische Zeitschrift. 189 (1): 55–63. doi:10.1007/BF01246943. Zbl 0613.46049.