In linear algebra, an orthogonal transformation is a linear transformation T : V → V on a real inner product space V that preserves the inner product. That is, for each pair u, v of elements of V, we have[1]

  ⟨u, v⟩ = ⟨Tu, Tv⟩.
Since the lengths of vectors and the angles between them are defined through the inner product, orthogonal transformations preserve lengths of vectors and angles between them. In particular, orthogonal transformations map orthonormal bases to orthonormal bases.
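This defining property can be checked numerically for a plane rotation, which is a standard example of an orthogonal transformation (a minimal sketch; the angle and the vectors are arbitrary choices, not taken from the text):

```python
import math

def dot(u, v):
    """Standard Euclidean inner product on R^2."""
    return u[0] * v[0] + u[1] * v[1]

def rotate(theta, v):
    """Apply the rotation by angle theta -- an orthogonal transformation."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

u, v = (3.0, 1.0), (-2.0, 4.0)
Tu, Tv = rotate(0.7, u), rotate(0.7, v)

# The inner product -- and hence lengths and angles -- is unchanged.
preserved = math.isclose(dot(u, v), dot(Tu, Tv))
```

Since lengths and angles are defined through the inner product, this single check already implies that both are preserved.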
Orthogonal transformations in two- or three-dimensional Euclidean space are rigid rotations, reflections, or combinations of a rotation and a reflection (also known as improper rotations). Reflections are transformations that reverse direction front to back, orthogonal to the mirror plane, like (real-world) mirrors do. The matrices corresponding to proper rotations (without reflection) have determinant +1. Transformations with reflection are represented by matrices with determinant −1. This allows the concepts of rotation and reflection to be generalized to higher dimensions.
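The determinant criterion can be illustrated directly (a minimal sketch; the specific rotation angle and the choice of reflection axis are arbitrary examples):

```python
import math

def det2(m):
    """Determinant of a 2x2 matrix stored as a tuple of rows ((a, b), (c, d))."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

theta = 0.5
rotation = ((math.cos(theta), -math.sin(theta)),
            (math.sin(theta),  math.cos(theta)))
reflection = ((1.0,  0.0),   # reflection across the x-axis
              (0.0, -1.0))

det_rot = det2(rotation)    # +1 for a proper rotation
det_ref = det2(reflection)  # -1 for a reflection
```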
In finite-dimensional spaces, the matrix representation (with respect to an orthonormal basis) of an orthogonal transformation is an orthogonal matrix. Its rows are mutually orthogonal vectors with unit norm, so that the rows constitute an orthonormal basis of V. The columns of the matrix form another orthonormal basis of V.
The inverse of an orthogonal transformation is another orthogonal transformation. Its matrix representation is the transpose of the matrix representation of the original transformation.
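Both facts — orthonormal columns and transpose-equals-inverse — amount to the single identity QᵀQ = I, which can be verified for a sample rotation matrix (a minimal sketch; the angle is an arbitrary choice):

```python
import math

def transpose(m):
    """Transpose a matrix stored as a tuple of rows."""
    return tuple(zip(*m))

def matmul(a, b):
    """Product of two 2x2 matrices stored as tuples of rows."""
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(2))
                       for j in range(2)) for i in range(2))

theta = 1.2
Q = ((math.cos(theta), -math.sin(theta)),
     (math.sin(theta),  math.cos(theta)))

# Q^T Q equals the identity, so the transpose is the inverse
# and the columns of Q form an orthonormal basis.
QtQ = matmul(transpose(Q), Q)
```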
Transcription
In the last couple of videos, we've seen that if we have some n-by-n matrix C — a square matrix — whose columns form an orthonormal set, that just means each column has been normalized, so each has length 1 if you view it as a column vector, and they're all mutually orthogonal to each other. So if you dot a column with itself you get 1, and if you dot it with any of the other columns you get 0. We've seen this multiple times. If you have a matrix like this — and I actually forgot to tell you the name earlier — this is called an orthogonal matrix. We've already seen that the transpose of this matrix is the same thing as its inverse, which makes it super, duper useful to deal with. Now, this statement leads to some other interesting things. So far we've mainly used this for a change of basis. I can draw the diagram that you're probably tired of by now: that's the standard basis, and here I have x in coordinates with respect to another basis. We've seen I can multiply that guy by C to get the one up there, or multiply this guy by C inverse to get the one right here. In that world, we viewed C as just a change of basis: we're representing the same vector, just changing the coordinates we use to represent it. But we also know that any matrix-vector product is a linear transformation. So this change of basis is really just a linear transformation. What I want to show you in this video — and you can view it either as a change of basis or as a linear transformation — is that when you multiply this orthogonal matrix times some vector, it preserves — let me write this down — lengths and angles. So let's have a little touchy-feely discussion of what that means. Let's view it as a transformation.
Let's say I have some set of vectors in my domain. I'll draw one like this guy, and another like that, and there's some angle between them. Angles are easy to visualize in R2 or R3; maybe a little harder once we get to higher dimensions. But that's the angle between them. Now, if we're saying that multiplying these vectors by C preserves angles and lengths, we can view it as a transformation — maybe it rotates them or something like that. So maybe that pink vector will now look like this, but it's going to have the same length: this length is going to be the same as that length. And even more, when I said it preserves lengths and angles, this yellow vector is going to look something like this, where the angle is going to be the same — this theta is going to be that theta. That's what I mean by preserving angles. If we didn't have this property, we could imagine a transformation that doesn't preserve angles. Let me draw one that doesn't: let's say this guy got a lot longer, and this guy also got longer and got distorted a little bit, so the angle also changed. That transformation right there is not preserving angles. So when you have a change-of-basis matrix, or a transformation matrix, that's orthogonal, all it's essentially doing to your vectors is kind of rotating them around, but it's not going to really distort them. I'll write that in quotes because that's not a mathematically rigorous term. So: no distortion of vectors. I've shown you the intuition of what that means; let's actually prove to ourselves that this is the case.
So, I'm saying that if this pink vector here is x, and this pink vector here is C times x, I'm claiming that the length of x is equal to the length of Cx. Let's see if that's actually the case. The length of Cx squared is the same thing as Cx dot Cx. And here it's always useful to remind myself that if I take two vectors — let me do it over here — y dot y is the same thing as y transpose times y, if you view them as matrices. y transpose times y is just the row (y1, y2, ..., yn) times the column (y1, y2, ..., yn), and if you do this 1-by-n times n-by-1 matrix product, you get a 1-by-1 matrix — just a number — equal to y1 times y1 plus y2 times y2, all the way to yn times yn. So this is the same thing as y dot y. I think I did this about ten or twenty videos ago, but it's always a good refresher. So let's use this property right here. These two dotted with each other is the same thing as taking the transpose of one times the other — turning a vector-vector dot product into a matrix-matrix product. So this is the same thing as (Cx) transpose times Cx: you can view the first factor as a 1-by-n matrix times the n-by-1 column vector Cx. These are the same thing. Now, we also know that (AB) transpose is the same thing as B transpose times A transpose; we saw that a long time ago. So this thing right here is going to be equal to x transpose times C transpose — just switch the order and take the transpose of each — and then you have that times Cx. And now we know that C transpose is the same thing as C inverse. This is where we need the orthogonality of the matrix C: it has to be a square matrix where all of its columns are mutually orthogonal and all normalized. And so C transpose times C just becomes the identity matrix. I could write the identity matrix there, but it's just going to disappear.
So this is going to be equal to x transpose times x, which is the same thing as x dot x, which is the same thing as the length of x squared. So the length of Cx squared is the same as the length of x squared, and that tells us that the length of Cx is the length of x, because both of these are positive quantities. So I've shown you that orthogonal matrices definitely preserve length. Let's see if they preserve angles. For that we actually have to define angles. Throughout our mathematical careers, we understood what angles mean in R2 or R3. But in linear algebra we like to be general, and we defined an angle using the dot product: we used the law of cosines and an analogy to a triangle in R2, and we said the dot product v dot w is equal to the product of the lengths of those two vectors times the cosine of the angle between them. Or you could say the cosine of the angle between two vectors is defined as the dot product of those two vectors divided by the product of their lengths. This was the definition, so that we can extend the idea of an angle to arbitrarily high dimensions — to R-googol if we had to. So let's see what the angle is if we multiply these guys by C. Call our new angle theta-sub-C — the angle once we perform our transformation on all of these characters. Its cosine is Cv dot Cw over the length of Cv times the length of Cw. Now, we already know that lengths are preserved: the lengths of Cv and Cw are just the lengths of v and w. We just proved that. Let me write that: the cosine of theta-sub-C is equal to Cv dot Cw over the length of v times the length of w, because we've already shown that it preserves lengths. We'll see what this top part equals, using the same general property.
The dot product is equal to the transpose of one guy, viewed as a matrix, times the second guy. So the numerator is equal to (Cw) transpose times Cv, all over those lengths. And this is going to be equal to — I'll write it down here to have some space — we can switch these guys and take their transposes: w transpose times C transpose times C times v, all over the product of their lengths, the lengths of v and w. And C transpose times C is the identity matrix, so this is equal to w transpose times v over the product of their lengths. And that is the same thing as v dot w over their lengths — which is the cosine of theta. So you notice: by our definition of an angle as the dot product divided by the vector lengths, when you perform a transformation — or, you can imagine, a change of basis, either way — with an orthogonal matrix C, the angle between the transformed vectors does not change. It is the same as the angle between the vectors before they were transformed. Which is a really neat thing to know: changes of basis or transformations with orthogonal matrices don't distort the vectors. They might rotate them around or shift them a little bit, but the angles between them don't change.
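Both claims worked out in the transcript — that an orthogonal C preserves the length of a vector and the angle between two vectors — can be checked numerically (a minimal sketch; the 2-by-2 rotation C and the vectors are arbitrary examples):

```python
import math

def dot(u, v):
    """Euclidean dot product."""
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

def apply(C, v):
    """Multiply the matrix C (tuple of rows) by the vector v."""
    return tuple(dot(row, v) for row in C)

theta = 0.6
C = ((math.cos(theta), -math.sin(theta)),
     (math.sin(theta),  math.cos(theta)))

x, w = (2.0, -1.0), (0.5, 3.0)
Cx, Cw = apply(C, x), apply(C, w)

# ||Cx|| = ||x||: lengths are preserved.
same_length = math.isclose(norm(Cx), norm(x))

# cos(angle) before and after the transformation: angles are preserved too.
cos_before = dot(x, w) / (norm(x) * norm(w))
cos_after = dot(Cx, Cw) / (norm(Cx) * norm(Cw))
```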
Examples
Consider the inner-product space R² with the standard Euclidean inner product and standard basis. Then, the matrix transformation

  T = ( cos θ  −sin θ
        sin θ   cos θ )

is orthogonal. To see this, consider

  Tu = (u₁ cos θ − u₂ sin θ, u₁ sin θ + u₂ cos θ),
  Tv = (v₁ cos θ − v₂ sin θ, v₁ sin θ + v₂ cos θ).

Then,

  ⟨Tu, Tv⟩ = (u₁ cos θ − u₂ sin θ)(v₁ cos θ − v₂ sin θ) + (u₁ sin θ + u₂ cos θ)(v₁ sin θ + v₂ cos θ)
           = u₁v₁ (cos²θ + sin²θ) + u₂v₂ (sin²θ + cos²θ)
           = u₁v₁ + u₂v₂ = ⟨u, v⟩.

The previous example can be extended to construct all orthogonal transformations. For example, the following matrices define orthogonal transformations on R³ — a rotation about the z-axis and a reflection through the xy-plane:

  ( cos θ  −sin θ   0          ( 1   0   0
    sin θ   cos θ   0            0   1   0
    0       0       1 ),         0   0  −1 ).
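The three-dimensional case can be sanity-checked in code by confirming MᵀM = I for a rotation about the z-axis and a reflection through the xy-plane (a minimal sketch; the angle is an arbitrary choice):

```python
import math

def transpose(m):
    """Transpose a matrix stored as a tuple of rows."""
    return tuple(zip(*m))

def matmul(a, b):
    """Product of two square matrices stored as tuples of rows."""
    n = len(a)
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(n))
                       for j in range(n)) for i in range(n))

def is_orthogonal(m, tol=1e-12):
    """Check M^T M = I entrywise."""
    n = len(m)
    p = matmul(transpose(m), m)
    return all(abs(p[i][j] - (1.0 if i == j else 0.0)) <= tol
               for i in range(n) for j in range(n))

theta = 0.9
c, s = math.cos(theta), math.sin(theta)

# Rotation about the z-axis and reflection through the xy-plane:
# both define orthogonal transformations on R^3.
rotation = ((c,  -s,  0.0),
            (s,   c,  0.0),
            (0.0, 0.0, 1.0))
reflection = ((1.0, 0.0,  0.0),
              (0.0, 1.0,  0.0),
              (0.0, 0.0, -1.0))
```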
See also
References
 ^ Rowland, Todd. "Orthogonal Transformation". MathWorld. Retrieved 4 May 2012.