Visual understanding of multiplication by the transpose of a matrix. If A is an orthogonal matrix and B = Aᵀ is its transpose, the ij-th element of the product AB = AAᵀ will vanish if i ≠ j, because the i-th row of A is orthogonal to the j-th row of A. An orthogonal matrix is the real specialization of a unitary matrix, and thus always a normal matrix.
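A minimal sketch of this, assuming NumPy for the arithmetic: a 2 × 2 rotation matrix is orthogonal, so the off-diagonal entries of AAᵀ come out numerically zero and the product is the identity.

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal, so each row is orthogonal to every
# other row and the off-diagonal entries of A @ A.T vanish.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

P = A @ A.T
print(np.round(P, 12))             # off-diagonal entries are ~0
print(np.allclose(P, np.eye(2)))   # True: A @ A.T is the identity
```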
Equivalently, it is the group of n × n orthogonal matrices, where the group operation is given by matrix multiplication (an orthogonal matrix is a real matrix whose inverse equals its transpose). The orthogonal group is an algebraic group and a Lie group. It is compact. The orthogonal group in dimension n has two connected components.
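As a rough illustration of the group structure (the helper random_orthogonal below is an illustrative name, and NumPy is assumed), the product of two orthogonal matrices is again orthogonal, and each inverse equals its transpose:

```python
import numpy as np

def random_orthogonal(n, seed=0):
    # QR factorization of a random square matrix yields an orthogonal Q
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

Q1 = random_orthogonal(3, seed=1)
Q2 = random_orthogonal(3, seed=2)

# Inverse equals transpose, and the product of two orthogonal matrices
# is again orthogonal (closure under matrix multiplication).
print(np.allclose(np.linalg.inv(Q1), Q1.T))              # True
print(np.allclose((Q1 @ Q2) @ (Q1 @ Q2).T, np.eye(3)))   # True
```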
Normal matrix: A square matrix that commutes with its conjugate transpose, AA∗ = A∗A. They are the matrices to which the spectral theorem applies.
Orthogonal matrix: A matrix whose inverse is equal to its transpose, A⁻¹ = Aᵀ. They form the orthogonal group.
Orthonormal matrix: A matrix whose columns are orthonormal vectors.
Partially isometric matrix
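A short sketch of the normality condition, assuming NumPy; the example uses a rotation matrix, since any orthogonal (or unitary) matrix is normal:

```python
import numpy as np

# An orthogonal matrix is normal: A @ A* = I = A* @ A.
c, s = np.cos(0.5), np.sin(0.5)
A = np.array([[c, -s],
              [s,  c]])            # a rotation: orthogonal, hence normal

A_star = A.conj().T                # conjugate transpose (here just the transpose)
print(np.allclose(A @ A_star, A_star @ A))   # True: A commutes with A*
```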
A square complex matrix whose transpose is equal to the matrix with every entry replaced by its complex conjugate (denoted here with an overline) is called a Hermitian matrix (equivalent to the matrix being equal to its conjugate transpose); that is, A is Hermitian if Aᵀ = A̅.
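A quick check of the Hermitian condition, assuming NumPy; the example matrix is only illustrative:

```python
import numpy as np

# A is Hermitian: its transpose equals its entrywise complex conjugate,
# equivalently A equals its conjugate transpose.
A = np.array([[2.0,     1 + 3j],
              [1 - 3j,  5.0   ]])

print(np.allclose(A.T, np.conj(A)))    # transpose == entrywise conjugate
print(np.allclose(A, A.conj().T))      # A == its conjugate transpose
```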
A matrix will preserve or reverse orientation according to whether its determinant is positive or negative. For an orthogonal matrix R, note that det Rᵀ = det R implies (det R)² = 1, so that det R = ±1. The subgroup of orthogonal matrices with determinant +1 is called the special orthogonal group, denoted SO(3).
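A small sketch, assuming NumPy, of the determinant constraint: a rotation about the z-axis has determinant +1 (so it lies in SO(3)), while a reflection has determinant −1:

```python
import numpy as np

c, s = np.cos(0.3), np.sin(0.3)
R = np.array([[c,   -s,   0.0],
              [s,    c,   0.0],
              [0.0,  0.0, 1.0]])   # rotation about the z-axis: det = +1
F = np.diag([1.0, 1.0, -1.0])      # reflection in the xy-plane: det = -1

# Both are orthogonal, and (det)^2 = 1 pins the determinant to +1 or -1.
print(np.isclose(np.linalg.det(R),  1.0))   # True -> R lies in SO(3)
print(np.isclose(np.linalg.det(F), -1.0))   # True -> F reverses orientation
```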
Then, any 3 × 3 orthogonal matrix is either a rotation or an improper rotation. A general 3 × 3 orthogonal matrix has only one real eigenvalue, either +1 or −1. When it is +1 the matrix is a rotation; when it is −1, the matrix is an improper rotation. If R has more than one invariant vector then φ = 0 and R = I, in which case any vector is an invariant vector.
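A sketch of the eigenvalue statement for a 3 × 3 rotation, assuming NumPy: +1 is the only real eigenvalue, and its eigenvector is the invariant vector (the rotation axis):

```python
import numpy as np

c, s = np.cos(1.0), np.sin(1.0)
R = np.array([[c,   -s,   0.0],
              [s,    c,   0.0],
              [0.0,  0.0, 1.0]])   # rotation by 1 radian about the z-axis

vals, vecs = np.linalg.eig(R)
real_mask = np.isclose(vals.imag, 0.0)
print(vals[real_mask])                  # [1.+0.j] -- the single real eigenvalue
axis = vecs[:, real_mask].real.ravel()
print(np.allclose(R @ axis, axis))      # True: the axis is an invariant vector
```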
In finite-dimensional spaces, the matrix representation (with respect to an orthonormal basis) of an orthogonal transformation is an orthogonal matrix. Its rows are mutually orthogonal vectors with unit norm, so that the rows constitute an orthonormal basis of V. The columns of the matrix form another orthonormal basis of V.
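A brief sketch, assuming NumPy: an orthogonal matrix obtained from a QR factorization preserves inner products, and both its rows and its columns form orthonormal bases:

```python
import numpy as np

rng = np.random.default_rng(42)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # a random orthogonal matrix

x = rng.standard_normal(3)
y = rng.standard_normal(3)

print(np.isclose(np.dot(Q @ x, Q @ y), np.dot(x, y)))   # inner products preserved
print(np.allclose(Q.T @ Q, np.eye(3)))                   # columns are orthonormal
print(np.allclose(Q @ Q.T, np.eye(3)))                   # rows are orthonormal
```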
One can always write A = V S Vᵀ, where V is a real orthogonal matrix, Vᵀ is the transpose of V, and S is a block upper triangular matrix called the real Schur form. The blocks on the diagonal of S are of size 1×1 (in which case they represent real eigenvalues) or 2×2 (in which case they are derived from complex conjugate eigenvalue pairs).
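A sketch using SciPy's scipy.linalg.schur (an assumption about tooling; the example matrix is illustrative): the real Schur form is block upper triangular, with a 2 × 2 diagonal block for the complex-conjugate eigenvalue pair:

```python
import numpy as np
from scipy.linalg import schur

A = np.array([[0.0, -2.0, 1.0],
              [1.0,  0.0, 3.0],
              [0.0,  0.0, 4.0]])   # eigenvalues: +/- i*sqrt(2) and 4

S, V = schur(A, output='real')     # A = V @ S @ V.T with V orthogonal
print(np.allclose(V @ S @ V.T, A))       # True: decomposition reproduces A
print(np.allclose(V @ V.T, np.eye(3)))   # True: V is orthogonal
print(np.round(S, 3))                    # block upper triangular real Schur form
```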