Matrices of the same size can be added and subtracted entrywise, and matrices of compatible sizes can be multiplied. These operations have many of the properties of ordinary arithmetic, except that matrix multiplication is not commutative; that is, AB and BA are not equal in general. Matrices consisting of only one column or one row define the components of vectors, while higher-dimensional (e.g., three-dimensional) arrays of numbers define the components of a generalization of a vector called a tensor. Matrices with entries in other fields or rings are also studied.
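To make the contrast concrete, here is a minimal NumPy sketch (the matrices are arbitrary examples chosen for illustration):

    import numpy as np

    # Entrywise addition and (non-commutative) matrix multiplication.
    A = np.array([[1, 2], [3, 4]])
    B = np.array([[0, 1], [1, 0]])

    print(A + B)                          # entrywise sum
    print(A @ B)                          # matrix product
    print(np.array_equal(A @ B, B @ A))   # False: AB and BA differ here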
Matrices are a key tool in linear algebra. One use of matrices is to represent linear transformations, which are higher-dimensional analogs of linear functions of the form f(x) = cx, where c is a constant; matrix multiplication corresponds to composition of linear transformations. Matrices can also keep track of the coefficients in a system of linear equations. For a square matrix, the determinant and inverse matrix (when it exists) govern the behavior of solutions to the corresponding system of linear equations, and eigenvalues and eigenvectors provide insight into the geometry of the associated linear transformation.
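As a rough illustration of both points, a NumPy sketch (the maps and the system below are arbitrary examples of my own):

    import numpy as np

    # Composing two linear maps corresponds to multiplying their matrices.
    R = np.array([[0., -1.], [1., 0.]])   # rotation by 90 degrees
    S = np.array([[2., 0.], [0., 2.]])    # scaling by 2
    x = np.array([1., 1.])
    print(np.allclose(S @ (R @ x), (S @ R) @ x))   # True

    # A nonzero determinant means the system A y = b has a unique solution.
    A = np.array([[2., 1.], [1., 3.]])
    b = np.array([3., 5.])
    print(np.linalg.det(A))        # 5.0, so A is invertible
    print(np.linalg.inv(A) @ b)    # the solution [0.8, 1.4]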
Eigenvalues
Eigenvalues are a special set of scalars associated with a linear system of equations (i.e., a matrix equation) that are sometimes also known as characteristic roots, characteristic values, proper values, or latent roots.
The determination of the eigenvalues and eigenvectors of a system is extremely important in physics and engineering, where it is equivalent to matrix diagonalization and arises in such common applications as stability analysis, the physics of rotating bodies, and small oscillations of vibrating systems, to name only a few. Each eigenvalue is paired with a corresponding so-called eigenvector (or, in general, a corresponding right eigenvector and a corresponding left eigenvector; there is no analogous distinction between left and right for eigenvalues).
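For instance, a short NumPy sketch (with an arbitrarily chosen symmetric matrix) recovers the eigenvalue/eigenvector pairing described above:

    import numpy as np

    # Eigenvalues are the roots of det(A - aI); numpy returns the
    # pairs (a, x) with A x = a x directly.
    A = np.array([[2., 1.], [1., 2.]])
    vals, vecs = np.linalg.eig(A)
    print(vals)                              # eigenvalues (3 and 1 here)
    x = vecs[:, 0]
    print(np.allclose(A @ x, vals[0] * x))   # True: A x = a x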
Hermitian matrix
A Hermitian matrix (or self-adjoint matrix) is a square matrix with complex entries which is equal to its own conjugate transpose - that is, the element in the i-th row and j-th column is equal to the complex conjugate of the element in the j-th row and i-th column, for all indices i and j:
a_ij = (a_ji)*.
If the conjugate transpose of a matrix A is denoted by A^H, then the Hermitian property can be written concisely as
A = A^H.
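A quick numerical check of the definition (the matrix is an arbitrary example):

    import numpy as np

    # A equals its own conjugate transpose, so it is Hermitian.
    A = np.array([[2, 1 - 1j],
                  [1 + 1j, 3]])
    print(np.allclose(A, A.conj().T))   # True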
Properties of Hermitian matrices
For complex square matrices A and B we have:
If A is Hermitian, then the main diagonal entries of A are all real. In order to specify the elements of A one may specify freely any real numbers for the main diagonal entries and any complex numbers for the entries above the main diagonal (the entries below the diagonal are then fixed by conjugation);
A + A^H, AA^H, and A^H A are all Hermitian for all A;
If A is Hermitian, then A^k is Hermitian for all k = 1, 2, 3, .... If A is nonsingular as well, then A^(-1) is Hermitian;
If A and B are Hermitian, then aA + bB is Hermitian for all real scalars a and b;
A - A^H is skew-Hermitian for all A;
If A and B are skew-Hermitian, then aA + bB is skew-Hermitian for all real scalars a and b;
If A is Hermitian, then iA is skew-Hermitian;
If A is skew-Hermitian, then iA is Hermitian;
Any A can be written as
A = (1/2)(A + A^H) + (1/2)(A - A^H),
where (1/2)(A + A^H) and (1/2)(A - A^H) respectively are the Hermitian and skew-Hermitian parts of A.
Theorem: Each complex square matrix A can be written uniquely as A = H + iK, where H and K are both Hermitian. It can also be written uniquely as A = H + S, where H is Hermitian and S is skew-Hermitian.
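A small NumPy sketch of this decomposition (with a randomly generated matrix) forms the Hermitian and skew-Hermitian parts explicitly and confirms they add back to A:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

    H = (A + A.conj().T) / 2    # Hermitian part
    S = (A - A.conj().T) / 2    # skew-Hermitian part

    print(np.allclose(A, H + S))           # True
    print(np.allclose(H, H.conj().T))      # True: H is Hermitian
    print(np.allclose(S, -S.conj().T))     # True: S is skew-Hermitian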
Theorem: Let A be Hermitian. Then
x^H A x is real for all complex vectors x;
all the eigenvalues of A are real; and
S^H A S is Hermitian for all complex square matrices S.
Theorem: Let a complex square matrix A be given. Then A is Hermitian if and only if at least one of the following holds:
x^H A x is real for all complex vectors x;
A is normal and all the eigenvalues of A are real; or
S^H A S is Hermitian for all complex square matrices S.
Theorem [the spectral theorem for Hermitian matrices]: Let a complex square matrix A be given. Then A is Hermitian if and only if there are a unitary matrix U and a real diagonal matrix D such that A = U D U^H. Moreover, A is real and Hermitian (i.e. real symmetric) if and only if there exist a real orthogonal matrix P and a real diagonal matrix D such that A = P D P^t.
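NumPy's eigh routine computes exactly such a factorization for a Hermitian input, so the theorem can be checked numerically (the matrix below is an arbitrary example):

    import numpy as np

    A = np.array([[2, 1 - 1j],
                  [1 + 1j, 3]])
    vals, U = np.linalg.eigh(A)      # real eigenvalues, unitary eigenvectors
    D = np.diag(vals)                # real diagonal matrix

    print(np.allclose(U @ D @ U.conj().T, A))       # True: A = U D U^H
    print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: U is unitary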
Theorem: Let F be a given family of Hermitian matrices. Then there exists a unitary matrix U such that U^H A U is diagonal for all A in F if and only if AB = BA for all A and B in F.
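The "if" direction can be illustrated by building two Hermitian matrices from a common unitary (a sketch with randomly generated data; the construction is mine, chosen for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    U, _ = np.linalg.qr(X)           # a random unitary matrix

    # Two Hermitian matrices sharing the eigenvector basis U.
    A = U @ np.diag([1., 2., 3.]) @ U.conj().T
    B = U @ np.diag([5., -1., .5]) @ U.conj().T

    print(np.allclose(A @ B, B @ A))               # True: the family commutes
    DA = U.conj().T @ A @ U
    print(np.allclose(DA, np.diag(np.diag(DA))))   # True: U diagonalizes A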
Positivity of Hermitian matrices
Definition: An n x n Hermitian matrix A is said to be positive definite if
x^H A x > 0 for all nonzero complex vectors x.
If x^H A x >= 0 for all x, then A is said to be positive semidefinite.
The following two theorems give useful and simple characterizations of the positivity of Hermitian matrices.
Theorem: A Hermitian matrix is positive semidefinite if and only if all of its eigenvalues are nonnegative. It is positive definite if and only if all of its eigenvalues are positive.
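In NumPy this test is a one-liner via eigvalsh (the matrices below are arbitrary illustrations):

    import numpy as np

    A = np.array([[2., 1.], [1., 2.]])   # eigenvalues 1 and 3
    B = np.array([[1., 1.], [1., 1.]])   # eigenvalues 0 and 2

    print(np.all(np.linalg.eigvalsh(A) > 0))   # True: positive definite
    evB = np.linalg.eigvalsh(B)
    print(np.all(evB > 0))                     # False: not positive definite
    print(np.all(evB >= -1e-12))               # True up to round-off:
                                               # positive semidefinite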
In the following we denote by A_k the leading principal submatrix of A determined by the first k rows and columns.
If A is positive definite, then all principal minors of A are positive; when A is Hermitian, the converse is also valid. However, an even stronger statement can be made.
Theorem: If A is Hermitian, then A is positive definite if and only if det A_k > 0 for k = 1, ..., n. More generally, the positivity of any nested sequence of n principal minors of A is a necessary and sufficient condition for A to be positive definite.
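A direct way to check the criterion numerically (leading_minors is a helper name of my own):

    import numpy as np

    def leading_minors(A):
        # Determinants of the leading principal submatrices A_1, ..., A_n.
        return [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]

    A = np.array([[2., 1., 0.],
                  [1., 2., 1.],
                  [0., 1., 2.]])

    print(leading_minors(A))                   # [2.0, 3.0, 4.0]: all positive
    print(np.all(np.linalg.eigvalsh(A) > 0))   # True: agrees with the
                                               # eigenvalue test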
Eigenvalues of a Hermitian matrix are always real
Let's take a real symmetric matrix A (the complex Hermitian case works the same way, with the transpose replaced by the conjugate transpose). The eigenvalue equation is:
Ax = ax
where the eigenvalue a is a root of the characteristic polynomial
p(a) = det(A - aI)
and x is just the corresponding eigenvector of a. The important part
is that x is not 0 (the zero vector).
Well, anyway. Let's calculate the following inner product
(here, x_i* is the complex conjugate of x_i):
<x, Ax> = sum_i x_i* (Ax)_i
= sum_i x_i* (sum_j A_ij x_j)
= sum_i sum_j x_i* A_ij x_j
That's the inner product expanded out, which we'll use later.
But for now, note that since x is an eigenvector, we know that
Ax = ax. We can use this fact to conclude:
<x, Ax> = sum_i x_i* (ax)_i
= sum_i x_i* a x_i
= a sum_i x_i* x_i
= a (sum_i |x_i|^2)
Note that sum_i |x_i|^2 is always positive since x is nonzero. We'll
use this fact later, too. Next, we should find the following inner
product (again, * denotes the complex conjugate):
<Ax, x> = sum_i (Ax)_i* x_i
= sum_i (sum_j A_ij x_j)* x_i
= sum_i (sum_j A_ij* x_j*) x_i
= sum_i sum_j x_i A_ij* x_j*
But now, we can use the fact that A^t = A and that A is real. In
particular, that A_ij* = A_ij, and A_ji = A_ij.
= sum_i sum_j x_i A_ji x_j*
= sum_j sum_i x_j* A_ji x_i
= sum_I sum_J x_I* A_IJ x_J (renaming j->I, i->J)
= sum_i sum_j x_i* A_ij x_j (dummy variables J->j, I->i)
= <x, Ax>
So, because A is real and symmetric, we have A = A^t and
<Ax, x> = <x, Ax>.
Now, take the eigenvalue equation again:
Ax = ax
Now, take the transpose and then complex conjugate:
(Ax)^t = (ax)^t
x^t A^t = a x^t
x^t A = a x^t (since A^t = A)
(x^t A)* = (a x^t)*
(x*)^t A* = a* (x*)^t
(x*)^t A = a* (x*)^t (since A* = A)
Now, just multiply both sides by x (on the right):
(x*)^t A x = a* (x*)^t x
sum_i (x*)_i (Ax)_i = a* sum_i (x*)_i x_i
sum_i x_i* (sum_j A_ij x_j) = a* sum_i x_i* x_i
sum_i sum_j x_i* A_ij x_j = a* (sum_i |x_i|^2)
or
<x, Ax> = a* (sum_i |x_i|^2)
But, we already found that <x, Ax> = a (sum_i |x_i|^2), so that
0 = a* (sum_i |x_i|^2) - a (sum_i |x_i|^2)
0 = (a* - a) (sum_i |x_i|^2)
Since sum_i |x_i|^2 > 0, we can divide this last equation by it,
which gives us
0 = a* - a
or
a = a*
Since a is any eigenvalue of A, we have proven that the complex
conjugate of a is a itself. This can only happen if a is real,
which concludes the proof.
Note that we spent most of the time doing inner product math in the
long-winded explanation given above. All we really wanted to say was
that a real symmetric (or Hermitian) matrix satisfies <x, Ax> = <Ax, x>,
which forces a = a* for every eigenvalue a.
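The result is easy to confirm numerically; a sketch with a randomly generated Hermitian matrix:

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    A = (X + X.conj().T) / 2             # a Hermitian matrix

    vals = np.linalg.eig(A)[0]           # generic solver: complex output type
    print(np.allclose(vals.imag, 0.0))   # True: the eigenvalues are real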