
Antisymmetric: see skew-symmetric.

**A** is upper bidiagonal if *a(i,j)*=0 unless *i=j* or *i=j*-1.

**A** is lower bidiagonal if *a(i,j)*=0 unless *i=j* or *i=j*+1.

A bidiagonal matrix is also tridiagonal, triangular and Hessenberg.

**A** is block diagonal if it has the form [**A 0** ... **0**; **0
B** ... **0**;...;**0 0** ... **Z**] where **A**, **B**, ..., **Z**
are matrices (not necessarily square).

- A matrix is block diagonal iff it is the direct sum of two or more smaller matrices.

A circulant matrix, **A**, is an *n*#*n* Toeplitz matrix
in which *a(i,j)* is a function of (*i-j*) modulo *n*. In other words, each
column of **A** is equal to the previous column rotated downwards by one element.

- If **a** is the first column of a circulant matrix **A**, then the eigenvalues of **A** are given by the polynomial *a*(1) + *a*(2)*x* + ... + *a*(*n*)*x*^{n-1} evaluated at the *n*'th roots of unity.
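As a numerical sketch (not from the manual, with a hypothetical first column `a`), the eigenvalue property above can be checked directly: the Fourier modes are eigenvectors of any circulant matrix, and the eigenvalues are the DFT of its first column.

```python
import numpy as np

# Hypothetical example: build a circulant matrix from its first column a;
# each column is the previous one rotated downwards by one element.
a = np.array([4.0, 1.0, 2.0, 3.0])
n = len(a)
A = np.column_stack([np.roll(a, k) for k in range(n)])

# Columns of F are the Fourier modes f_k(i) = w^{ik} with w = exp(2*pi*j/n).
# Then A f_k = p(w_k) f_k, where the eigenvalues p(w_k) are the DFT of a.
F = np.exp(2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)
lam = np.fft.fft(a)
assert np.allclose(A @ F, F * lam)   # column k of F is scaled by lam[k]
```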

A Circular matrix, **A**, is one for which inv(**A**) = conj(**A**).

- A matrix **A** is circular iff **A** = exp(*j***B**) where *j* = sqrt(-1), **B** is real and exp() is the matrix exponential function.
- If **A** = **B** + *j***C** where **B** and **C** are real and *j* = sqrt(-1), then **A** is circular iff **BC** = **CB** and also **BB** + **CC** = **I**.

If *p*(*x*) is a polynomial of the form *a*(0) + *a*(1)*x* + *a*(2)*x*^{2}
+ ... + *a*(*n*)*x*^{n} then the polynomial's companion
matrix is [**0 I**; -*a*(0:*n*-1)^{T}/*a*(*n*)].

The rows and columns are sometimes given in reverse order [-*a*(*n*-1:0)/*a*(*n*)
; **I 0**].

- The characteristic and minimal polynomials of a companion matrix both equal p(x).
- The eigenvalues of a companion matrix equal the roots of p(x).
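The eigenvalue property can be illustrated with a short sketch (the cubic below is a hypothetical example), building the companion form with the coefficient row at the bottom:

```python
import numpy as np

# Hypothetical cubic p(x) = -6 + 11x - 6x^2 + x^3 = (x-1)(x-2)(x-3),
# stored as coefficients a(0)..a(n).
a = np.array([-6.0, 11.0, -6.0, 1.0])
n = len(a) - 1

C = np.zeros((n, n))
C[:-1, 1:] = np.eye(n - 1)   # the shifted-identity block [0 I]
C[-1, :] = -a[:n] / a[n]     # bottom row: -a(0:n-1)/a(n)

roots = np.sort(np.linalg.eigvals(C).real)
assert np.allclose(roots, [1.0, 2.0, 3.0])   # the roots of p(x)
```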

A matrix **A** is convergent if **A**^{k} tends to **0** as *k*
tends to infinity.

- **A** is convergent iff all its eigenvalues have modulus < 1.
- **A** is convergent iff there exists a positive definite **X** such that **X** - **A**^{H}**XA** is positive definite (Stein's theorem).
- If **S**_{k} is defined as **I** + **A** + **A**^{2} + ... + **A**^{k}, then **A** is convergent iff **S**_{k} converges as *k* tends to infinity. If it does converge, its limit is (**I**-**A**)^{-1}.
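These properties can be checked numerically; the 2#2 matrix below is a hypothetical example with spectral radius below 1.

```python
import numpy as np

# Hypothetical example: both eigenvalues have modulus < 1.
A = np.array([[0.5, 0.2], [0.1, 0.3]])
assert max(abs(np.linalg.eigvals(A))) < 1              # so A is convergent
assert np.allclose(np.linalg.matrix_power(A, 100), 0)  # A^k tends to 0

# The partial sums S_k = I + A + ... + A^k tend to inv(I - A).
S, term = np.eye(2), np.eye(2)
for _ in range(200):
    term = term @ A
    S += term
assert np.allclose(S, np.linalg.inv(np.eye(2) - A))
```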

**WARNING**: The term reducible is sometimes used
instead of decomposable.

A matrix, **A**, is decomposable if there exists a permutation matrix **P**
such that **P**^{T}**AP** is of the form [**B 0**; **C D**]
where **B **and **D** are square.

A matrix that is not decomposable is indecomposable.

A matrix, **A**, is partly-decomposable if there exist permutation matrices **P**
and **Q** such that **P**^{T}**AQ** is of the form [**B 0**; **C
D**] where **B **and **D** are square.

A matrix that is not even partly-decomposable is fully-indecomposable.

- An indecomposable matrix has at least one non-zero off-diagonal element in each row and column.

A matrix **A**:*n*#*n* is *defective* if it does not have *n* linearly
independent eigenvectors, otherwise it is simple.

- **A** is defective iff **A** is not diagonable.
- **A** is defective iff the geometric and algebraic multiplicities differ for at least one eigenvalue.

A real or Hermitian square matrix **A** is positive
definite if **x**^{H}**Ax** > 0 for all non-zero **x**.

A real or Hermitian square matrix **A** is positive semi-definite or non-negative
definite if **x**^{H}**Ax** >= 0 for all non-zero **x**.

A real or Hermitian square matrix **A** is indefinite
if **x**^{H}**Ax** is > 0 for some **x** and < 0 for
some other **x**.

Note that for any non-Hermitian complex matrix **x**^{H}**Ax** is
complex for some values of **x**. Such matrices are therefore excluded from the concept
of definiteness.

- The following are equivalent:
  - **A** is Hermitian and +ve semidefinite
  - **A** = **B**^{H}**B** for some **B** (not necessarily square)
  - **A** = **C**^{2} for some Hermitian **C**.

- If **A** is +ve definite then inv(**A**) exists and is +ve definite.
- The matrix [*a b*; 0 *c*] is +ve definite iff *a* > 0 and *b*^{2} < 4*ac*.
- [*Real*]: A real matrix **A** is +ve definite iff its symmetric part **B** = (**A** + **A**^{T})/2 is positive definite. Indeed **x**^{T}**Ax** = **x**^{T}**Bx** for all **x**.
- **B**^{H}**B** is +ve definite iff the columns of **B** are linearly independent (**B** not necessarily square).
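A short sketch of these properties (the matrices `B` and `M` below are hypothetical examples); a Cholesky factorization succeeds exactly for Hermitian positive definite matrices, which gives a convenient numerical test.

```python
import numpy as np

# B has linearly independent columns, so B^T B should be +ve definite.
B = np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]])
A = B.T @ B
np.linalg.cholesky(A)                   # no exception: A is +ve definite
x = np.array([3.0, -2.0])
assert x @ A @ x > 0

# [a b; 0 c] with a > 0 and b^2 < 4ac: its symmetric part is +ve definite.
M = np.array([[2.0, 1.0], [0.0, 1.0]])  # b^2 = 1 < 8 = 4ac
S = (M + M.T) / 2
assert np.all(np.linalg.eigvalsh(S) > 0)
assert np.isclose(x @ M @ x, x @ S @ x)  # x'Ax = x'Bx for the symmetric part
```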

An *n*#*n* square matrix is derogatory if its minimal polynomial
is of lower order than *n*.

**A** is diagonal if *a(i,j)=0* unless *i=j*.

- Diagonal matrices are closed under addition, multiplication and (where possible) inversion.
- The determinant of a square diagonal matrix is the product of its diagonal elements.
- If **D** is diagonal, **DA** multiplies each row of **A** by a constant while **BD** multiplies each column of **B** by a constant.

A matrix is simple or *diagonable* if it is similar to a
diagonal matrix; otherwise it is defective.

A square matrix **A** is diagonally dominant if the absolute value of
each diagonal element is greater than the sum of the absolute values of the non-diagonal
elements in its row, that is, if |*a*(*i*,*i*)| > Sum(|*a*(*i*,*j*)|; *j* != *i*) for all *i*.

- [*Real*]: If the diagonal elements of a square matrix **A** are all > 0 and if **A** and **A**^{T} are both diagonally dominant then **A** is positive definite.
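This test is easy to sketch numerically (the matrix below is a hypothetical example satisfying all three conditions):

```python
import numpy as np

# Strict diagonal dominance of the rows of M.
def diag_dominant(M):
    d = np.abs(np.diag(M))
    return np.all(d > np.abs(M).sum(axis=1) - d)

# Hypothetical example: positive diagonal, and both A and A^T dominant.
A = np.array([[4.0, 1.0, -1.0],
              [1.0, 5.0, 1.0],
              [0.0, -1.0, 3.0]])
assert diag_dominant(A) and diag_dominant(A.T)
assert np.all(np.diag(A) > 0)
# x'Ax > 0 for all x iff the symmetric part of A is +ve definite:
assert np.all(np.linalg.eigvalsh((A + A.T) / 2) > 0)
```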

A real non-negative square matrix **A** is doubly-stochastic if its rows
and columns all sum to 1.

See under stochastic for properties.

An *essential* matrix is the product of a 3#3 orthogonal matrix and a
3#3 skew-symmetric matrix. In 3-D Euclidean space, a translation + rotation
transformation is associated with an essential matrix.

- If **E** is an essential matrix then so are **E**^{T}, *k***E** and **UEV** where *k* is a non-zero scalar and **U** and **V** are orthogonal.
- **E** is an essential matrix iff rank(**E**) = 2 and **EE**^{T}**E** = ½tr(**EE**^{T})**E**. This defines a set of nine homogeneous cubic equations.
- **E** is an essential matrix iff its singular values are *k*, *k* and 0 for some *k* > 0.
- If **E** is an essential matrix then **A** = *k***E** for some *k* iff **Ex** × **Ax** = 0 for all **x**, where × denotes the vector cross product.

The exchange matrix **J**:*n*#*n* is equal to [**e**(*n*) **e**(*n*-1)
... **e**(2) **e**(1)]. It is equal to **I** but with the columns in reverse
order.

- **J** is Hankel, orthogonal, symmetric, a permutation matrix, doubly stochastic and a square root of **I**.
- **JA**^{T}, **JAJ** and **A**^{T}**J** are versions of the matrix **A** that have been rotated anti-clockwise by 90, 180 and 270 degrees.
- **JA**, **JA**^{T}**J**, **AJ** and **A**^{T} are versions of the matrix **A** that have been reflected in lines at 0, 45, 90 and 135 degrees to the horizontal, measured anti-clockwise.
- det(**J**_{n#n}) = (-1)^{n(n-1)/2}, i.e. it equals +1 if *n* mod 4 equals 0 or 1, and -1 if *n* mod 4 equals 2 or 3.
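A few of these identities, sketched with a hypothetical 4#4 matrix `A`:

```python
import numpy as np

n = 4
J = np.eye(n)[:, ::-1]            # identity with its columns reversed
A = np.arange(16.0).reshape(n, n) # hypothetical test matrix

assert np.allclose(J @ J, np.eye(n))          # J is a square root of I
assert np.allclose(J @ A.T, np.rot90(A))      # rotated 90 deg anti-clockwise
assert np.allclose(J @ A @ J, A[::-1, ::-1])  # rotated 180 degrees
assert np.isclose(np.linalg.det(J), 1.0)      # n mod 4 == 0, so det(J) = +1
```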

[*Real*]: A Givens Reflection is an *n*#*n* matrix of the form **P**^{T}[**Q
0**; **0 I**]**P** where **P** is any permutation matrix and **Q**
is a matrix of the form [cos(*x*) sin(*x*); sin(*x*) -cos(*x*)].

- A Givens reflection is symmetric and orthogonal.
- [2*2]: A 2*2 matrix is a Givens reflection iff it is a Householder matrix.

[*Real*]: A Givens Rotation is an *n*#*n* matrix of the form **P**^{T}[**Q
0**; **0 I**]**P** where **P** is a permutation matrix and **Q**
is a matrix of the form [cos(*x*) sin(*x*); -sin(*x*) cos(*x*)].

- A Givens rotation is orthogonal with determinant +1, i.e. proper orthogonal.

An *n*#*n* Hadamard matrix has orthogonal columns whose elements are all
equal to +1 or -1.

- Hadamard matrices exist only for *n* = 1, *n* = 2 or *n* a multiple of 4.
- If **A** is an *n*#*n* Hadamard matrix then **A**^{T}**A** = *n***I**. Thus **A**/sqrt(*n*) is orthogonal.
- If **A** is an *n*#*n* Hadamard matrix then det(**A**) = ±*n*^{n/2}.
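These properties can be verified with Sylvester's doubling construction (an addition here; the manual does not give the construction), which builds a Hadamard matrix of order 2*n* from one of order *n*:

```python
import numpy as np

# Sylvester's construction: H -> [H H; H -H] doubles the order: 1 -> 2 -> 4.
H = np.array([[1.0]])
for _ in range(2):
    H = np.block([[H, H], [H, -H]])
n = H.shape[0]

assert np.all(np.abs(H) == 1)                           # entries are +1 or -1
assert np.allclose(H.T @ H, n * np.eye(n))              # A'A = nI
assert np.isclose(abs(np.linalg.det(H)), n ** (n / 2))  # |det| = n^{n/2}
```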

A real 2*n*#2*n* matrix, **A**, is Hamiltonian if **KA** is
symmetric where **K** = [**0** **I**; -**I** **0**].

*See also:* symplectic

A Hankel matrix has constant anti-diagonals. In other words *a(i,j)*
depends only on *(i+j).*

- A Hankel matrix is symmetric.
- [**A**: Hankel] If **J** is the exchange matrix, then **JAJ** is Hankel; **JA** and **AJ** are Toeplitz.
- [**A**, **B**: Hankel] **A** + **B** and **A** - **B** are Hankel.

A square matrix **A** is Hermitian if **A** = **A**^{H},
that is, *a(i,j)* = conj(*a(j,i)*).

For real matrices, *Hermitian* and symmetric are
equivalent. Except where stated, the following properties apply to real symmetric matrices
as well.

- [*Complex*]: **A** is Hermitian iff **x**^{H}**Ax** is real for all (complex) **x**.
- The following are equivalent:
  - **A** is Hermitian and +ve semidefinite
  - **A** = **B**^{H}**B** for some **B**
  - **A** = **C**^{2} for some Hermitian **C**.
- Any matrix **A** has a unique decomposition **A** = **B** + *j***C** where **B** and **C** are Hermitian: **B** = (**A**+**A**^{H})/2 and **C** = (**A**-**A**^{H})/2*j*.
- Hermitian matrices are closed under addition, multiplication by a scalar, raising to an integer power, and (if non-singular) inversion.
- The eigenvalues of a Hermitian matrix are all real.
- Hermitian matrices are normal.
- **A** is Hermitian iff **x**^{H}**Ay** = **x**^{H}**A**^{H}**y** for all **x** and **y**.

A Hessenberg matrix is like a triangular matrix except that the elements
adjacent to the main diagonal can be non-zero.

**A** is upper Hessenberg if *a(i,j)*=0 whenever *i*>*j*+1. It is like
an upper triangular matrix except for the elements immediately
below the main diagonal.

**A** is lower Hessenberg if *a(i,j)*=0 whenever *i<j*-1. It is
like a lower triangular matrix except for the elements
immediately above the main diagonal.

- A symmetric or Hermitian Hessenberg matrix is tridiagonal.
- If **A** is upper triangular and **B** is upper Hessenberg then **AB** is upper Hessenberg.

A Hilbert matrix is a square Hankel matrix with elements *a(i,j)*=1/(*i+j*-1).

- The inverse of a Hilbert matrix has integer elements.

A Householder matrix (also called Householder reflection or transformation)
is a matrix of the form **(I-2vv**^{H}**)** for some vector **v**
with ||**v**||=1.

Multiplying a vector by a Householder transformation reflects it in the hyperplane that
is orthogonal to **v**.

Householder matrices are important because they can be chosen to annihilate any contiguous block of elements in any chosen vector.

- A Householder matrix is symmetric and orthogonal.
- [*n*#*n*]: Given a vector **x**, we can choose a Householder matrix **P** such that **Px** = [*y* 0 0 ... 0]^{H}. To do so, we choose **v** = (**x** + *k***e**(1)) / ||**x** + *k***e**(1)|| where *k* = sgn(*x*(1)) ||**x**||.
- [2*2]: A 2*2 matrix is Householder iff it is a Givens Reflection.
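The annihilation construction above, sketched with a hypothetical vector `x`:

```python
import numpy as np

# Choose v = (x + k e(1)) / ||x + k e(1)|| with k = sgn(x(1)) ||x||.
x = np.array([3.0, 1.0, 5.0, 1.0])
k = np.sign(x[0]) * np.linalg.norm(x)
v = x + k * np.eye(len(x))[:, 0]
v /= np.linalg.norm(v)
P = np.eye(len(x)) - 2.0 * np.outer(v, v)

y = P @ x
assert np.allclose(y[1:], 0.0)                    # trailing block annihilated
assert np.isclose(abs(y[0]), np.linalg.norm(x))   # length is preserved
assert np.allclose(P, P.T)                        # symmetric ...
assert np.allclose(P @ P.T, np.eye(len(x)))       # ... and orthogonal
```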

The hypercompanion matrix of the polynomial *p*(*x*) = (*x*-*a*)^{n}
is *a***I**:*n*#*n* + [**0** **I**:*n*-1#*n*-1; **0**]. It is an
upper bidiagonal matrix that is zero except for the value *a*
along the main diagonal and the value 1 on the diagonal immediately above it.

- The characteristic and minimal polynomials of the hypercompanion matrix both equal *p*(*x*).

A matrix **P** is idempotent if **P**^{2} = **P**. An
idempotent matrix that is also Hermitian is called a projection matrix.

**WARNING:** Some people call any idempotent matrix a projection matrix and call it
an *orthogonal projection* matrix if it is also hermitian.

If **P** is idempotent:

- rank(**P**) = tr(**P**).
- The eigenvalues of **P** are all either 0 or 1. The geometric multiplicity of the eigenvalue 1 is rank(**P**).
- **P**^{H}, **I**-**P** and **I**-**P**^{H} are all idempotent.
- **P**(**I**-**P**) = (**I**-**P**)**P** = **0**.
- **Px** = **x** iff **x** lies in the range of **P**.
- The null space of **P** equals the range of **I**-**P**. In other words **Px** = **0** iff **x** lies in the range of **I**-**P**.
- **P** is its own generalized inverse, **P**^{#}.
- [**A**: n#n, **F**, **G**: n#r] If **A** = **FG**^{H} where **F** and **G** are of full rank, then **A** is idempotent iff **G**^{H}**F** = **I**.
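A sketch of the last property with a hypothetical full-rank factor `F`; choosing **G**^{T} as the pseudo-inverse of **F** guarantees **G**^{T}**F** = **I**, so **FG**^{T} is idempotent:

```python
import numpy as np

# Hypothetical full-column-rank 3#2 factor F; G^T = pinv(F) gives G^T F = I.
F = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
G = np.linalg.pinv(F).T
assert np.allclose(G.T @ F, np.eye(2))

P = F @ G.T
assert np.allclose(P @ P, P)                              # idempotent
assert np.isclose(np.trace(P), np.linalg.matrix_rank(P))  # rank(P) = tr(P)
```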

The identity matrix, **I**, has *a(i,i)* = 1 for all *i* and *a(i,j)* = 0
for all *i* != *j*.

An incidence matrix is one whose elements all equal 1 or 0.

An Integral matrix is one whose elements are all integers.

An Involutary matrix is one whose square equals the identity.

Jacobi: see under tridiagonal.

A matrix **A** is nilpotent to index *k* if **A**^{k}
= **0** but **A**^{k-1} != **0**.

- The determinant of a nilpotent matrix is 0.
- The eigenvalues of a nilpotent matrix are all 0.
- If **A** is nilpotent to index *k*, its minimal polynomial is *t*^{k}.
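A small sketch (hypothetical 3#3 example): a strictly upper triangular matrix is nilpotent, here with index *k* = 3.

```python
import numpy as np

# Strictly upper triangular, so nilpotent; index k = 3 for this example.
A = np.triu(np.ones((3, 3)), k=1)
assert np.any(np.linalg.matrix_power(A, 2) != 0)   # A^{k-1} != 0
assert np.all(np.linalg.matrix_power(A, 3) == 0)   # A^k = 0
assert np.allclose(np.linalg.eigvals(A), 0.0)      # eigenvalues all 0
assert np.isclose(np.linalg.det(A), 0.0)           # determinant is 0
```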

Non-negative: see under positive.

A square matrix **A** is normal if **A**^{H}**A** = **AA**^{H}.

- **A** is normal iff it is unitarily similar to a diagonal matrix.
- The following types of matrix are normal: diagonal, Hermitian, unitary, skew-Hermitian.
- A normal matrix is hermitian iff its eigenvalues are all real.
- The singular values of a normal matrix are the absolute values of the eigenvalues.
- [**A**: normal] The eigenvalues of **A**^{H} are the conjugates of the eigenvalues of **A** and have the same eigenvectors.
- A normal matrix is skew-Hermitian iff its eigenvalues all have zero real parts.
- A normal matrix is unitary iff its eigenvalues all have an absolute value of 1.
- Normal matrices are closed under raising to an integer power and (if non-singular) inversion.
- If **A** and **B** are normal and **AB** = **BA** then **AB** is normal.

A real square matrix **Q** is orthogonal if **Q**^{T}**Q** = **I**.
It is a *proper orthogonal* matrix if det(**Q**) = 1 and an *improper
orthogonal* matrix if det(**Q**) = -1.

For real matrices, *orthogonal* and unitary mean
the same thing. Most properties are listed under unitary.

Geometrically: Orthogonal matrices correspond to rotations and reflections.

- [2*2]: A 2*2 orthogonal matrix is either a Givens rotation or a Givens reflection.
- The determinant of an orthogonal matrix equals +-1.
- **Q** is a proper orthogonal matrix iff **Q** = exp(**K**) or **K** = ln(**Q**) for some real skew-symmetric **K**.
- For *a* = +1 or *a* = -1, there is a 1-to-1 correspondence between real skew-symmetric matrices, **K**, and orthogonal matrices, **Q**, not having *a* as an eigenvalue, given by **Q** = *a*(**K**-**I**)(**K**+**I**)^{-1} and **K** = (*a***I**+**Q**)(*a***I**-**Q**)^{-1}. These are Cayley's formulae.
  - For *a* = -1 this gives **Q** = (**I**-**K**)(**I**+**K**)^{-1} and **K** = (**I**-**Q**)(**I**+**Q**)^{-1}. Note that (**I**+**K**) is always non-singular.
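Cayley's formula with *a* = -1 can be sketched numerically (the skew-symmetric matrix `K` below is a hypothetical example):

```python
import numpy as np

# Hypothetical real skew-symmetric K.
K = np.array([[0.0, 2.0, -1.0],
              [-2.0, 0.0, 3.0],
              [1.0, -3.0, 0.0]])
I = np.eye(3)

Q = (I - K) @ np.linalg.inv(I + K)          # Cayley transform, a = -1
assert np.allclose(Q.T @ Q, I)              # Q is orthogonal
assert np.isclose(np.linalg.det(Q), 1.0)    # and proper

K2 = (I - Q) @ np.linalg.inv(I + Q)         # the inverse map
assert np.allclose(K2, K)                   # recovers K exactly
```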
A square matrix **A** is a permutation matrix if its columns are a
permutation of the columns of **I**.

- **A** is a permutation matrix iff **A**^{T} is a permutation matrix.
- The set of permutation matrices is closed under multiplication and inversion.
- A permutation matrix is orthogonal.

An *n*#*n* matrix **A** is persymmetric if it is symmetric about its
anti-diagonal, i.e. if *a(i,j)* = *a*(*n*+1-*j*, *n*+1-*i*).

- A Toeplitz matrix is persymmetric.

A polynomial matrix of order *p* is one whose elements are polynomials
of a single variable *x*. Thus **A** = **A**(0) + **A**(1)*x* + ... + **A**(*p*)*x*^{p}
where the **A**(*i*) are constant matrices.

See also regular.

A real matrix is positive if all its elements are strictly > 0.

A real matrix is non-negative if all its elements are >= 0.

Positive definite: see under definiteness.

If *k* is the eigenvalue of a matrix **A**
having the largest absolute value, then **A** is primitive if the absolute
values of all other eigenvalues are < |*k*|.

A projection matrix (or orthogonal projection matrix) is a square matrix that
is hermitian and idempotent: i.e.
**P**^{H}=**P**^{2}=**P**.

**WARNING:** Some people call any idempotent matrix a projection matrix and call it
an *orthogonal projection* matrix if it is also hermitian.

- **P** is positive semi-definite.
- **X**(**X**^{H}**X**)^{#}**X**^{H} is a projection matrix whose range is the subspace spanned by the columns of **X**.
- **xx**^{H}/**x**^{H}**x** is a projection matrix in the direction of **x**.

- If **P** and **Q** are projection matrices, then the following are equivalent:
  - **P** - **Q** is a projection matrix
  - **P** - **Q** is positive semidefinite
  - ||**Px**|| >= ||**Qx**|| for all **x**
  - **PQ** = **Q**
  - **QP** = **Q**
- [**A**: idempotent] **A** is a projection matrix iff ||**Ax**|| <= ||**x**|| for all **x**.

**WARNING**: The term reducible is sometimes used to mean decomposable.

A matrix **A**:*n*#*n* is reducible if it is similar to a block-diagonal matrix
of the form [**B 0**; **0 C**] where **B** and **C** are square.

A polynomial matrix, **A**, of order *p* is regular
if det(**A**) is non-zero.

- An *n*#*n* square polynomial matrix, **A**(*x*), of order *p* is regular iff det(**A**) is a polynomial in *x* of degree *np*.

An *n*#*n* square matrix is simple or diagonable
if it has *n* linearly independent eigenvectors, otherwise it is defective.

A matrix is singular if it has no inverse.

- A matrix **A** is singular iff det(**A**) = 0.

A square matrix **K** is skew-Hermitian (or *antihermitian*)
if **K** = -**K**^{H},
that is, *a(i,j)* = -conj(*a(j,i)*).

For real matrices, Skew-Hermitian and skew-symmetric are equivalent. The following properties apply also to real skew-symmetric matrices.

- **S** is Hermitian iff *j***S** is skew-Hermitian where *j* = sqrt(-1).
- **K** is skew-Hermitian iff **x**^{H}**Ky** = -**x**^{H}**K**^{H}**y** for all **x** and **y**.
- Skew-Hermitian matrices are closed under addition, multiplication by a scalar, raising to an odd power and (if non-singular) inversion.
- Skew-Hermitian matrices are normal.
- The eigenvalues of a skew-Hermitian matrix are either 0 or pure imaginary.
- Any matrix **A** has a unique decomposition **A** = **S** + **K** where **S** is Hermitian and **K** is skew-Hermitian.
- **K** is skew-Hermitian iff **K** = ln(**U**) or **U** = exp(**K**) for some unitary **U**.
- For any complex *a* with |*a*| = 1, there is a 1-to-1 correspondence between the unitary matrices, **U**, not having *a* as an eigenvalue and skew-Hermitian matrices, **K**, given by **U** = *a*(**K**-**I**)(**I**+**K**)^{-1} and **K** = (*a***I**+**U**)(*a***I**-**U**)^{-1}. These are Cayley's formulae.
  - Taking *a* = -1 gives **U** = (**I**-**K**)(**I**+**K**)^{-1} and **K** = (**I**-**U**)(**I**+**U**)^{-1}.

A square matrix** K** is skew-symmetric (or *antisymmetric*)
if **K **= -**K**^{T},
that is *a(i,j)*=-*a(j,i)*

For real matrices, skew-symmetric and Skew-Hermitian are equivalent. Most properties are listed under skew-Hermitian .

- Skew-symmetry is preserved by congruence.
- [*Real*]: The eigenvalues of a skew-symmetric matrix are either 0 or purely imaginary.
  - If **K** is skew-symmetric, then **I** - **K** is non-singular.
- [*Real*]: If **A** is skew-symmetric, then **x**^{T}**Ax** = 0 for all real **x**.
- There is a 1-to-1 correspondence between real skew-symmetric matrices, **K**, and orthogonal matrices, **Q**, not having *a* as an eigenvalue, given by **Q** = *a*(**K**-**I**)(**K**+**I**)^{-1} and **K** = (*a***I**+**Q**)(*a***I**-**Q**)^{-1} where *a* = +1 or -1. These are Cayley's formulae.
- **K** is real skew-symmetric iff **K** = ln(**Q**) or **Q** = exp(**K**) for some real proper orthogonal matrix **Q**.
- [3#3]: All 3#3 skew-symmetric matrices have the form **SKEW**(**a**) = [0 -*a*_{3} *a*_{2}; *a*_{3} 0 -*a*_{1}; -*a*_{2} *a*_{1} 0] for some vector **a**.
  - **SKEW**(*k***a**) = *k***SKEW**(**a**) for any scalar *k*.
  - The vector cross product is given by **a** × **b** = **SKEW**(**a**)**b** = -**SKEW**(**b**)**a**.
  - **SKEW**(**a**)**b** = 0 iff **a** = *k***b** for some scalar *k*.
  - The determinant det([**a** **b** **c**]) = **a**^{T}**SKEW**(**b**)**c** = **b**^{T}**SKEW**(**c**)**a** = **c**^{T}**SKEW**(**a**)**b**.
  - **a**^{T}**SKEW**(**b**)**a** = 0 for all **a** and **b**.

A matrix is sparse if it has relatively few non-zero elements.

A Stability or Stable matrix is one whose eigenvalues all have
strictly negative real parts.

A semi-stable matrix is one whose eigenvalues all have non-positive real parts.

A real non-negative square matrix **A** is stochastic
if all its rows sum to 1.

- The eigenvalues of **A** all have absolute value <= 1.
- 1 is an eigenvalue of **A** with eigenvector [1 1 ... 1]^{T}.

See also Doubly Stochastic
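The two eigenvalue properties above can be sketched with a hypothetical 3#3 stochastic matrix:

```python
import numpy as np

# Hypothetical stochastic matrix: non-negative with unit row sums.
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
assert np.all(A >= 0) and np.allclose(A.sum(axis=1), 1.0)

ones = np.ones(3)
assert np.allclose(A @ ones, ones)                  # eigenvalue 1, eigenvector [1 1 1]'
assert max(abs(np.linalg.eigvals(A))) <= 1 + 1e-12  # all |eigenvalues| <= 1
```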

A real non-negative square matrix **A** is sub-stochastic
if all its rows sum to <=1.

**A** is *subunitary* if ||**AA**^{H}**x**|| = ||**A**^{H}**x**||
for all **x**. **A** is also called a *partial isometry*.

The following are equivalent:

- **A** is subunitary
- **A**^{H}**A** is a projection matrix
- **AA**^{H}**A** = **A**
- **A**^{+} = **A**^{H}

- **A** is subunitary iff **A**^{H} is subunitary iff **A**^{+} is subunitary.
- If **A** is subunitary and non-singular then **A** is unitary.

A square matrix** A** is symmetric if **A** = **A**^{T},
that is *a(i,j) = a(j,i)*.

Most properties of real symmetric matrices are listed under Hermitian .

- [*Real*]: If **A** is symmetric, then **A** = **0** iff **x**^{T}**Ax** = 0 for all real **x**.
- [*Real*]: A symmetric matrix is orthogonally similar to a diagonal matrix.
- **A** is symmetric iff it is congruent to a diagonal matrix.
- Any square matrix may be uniquely decomposed as the sum of a symmetric matrix and a skew-symmetric matrix.

See also Hankel.

A real matrix, **A**, is symmetrizable if **A**^{T}**M
**= **MA** for some positive definite **M**.

A real matrix, **A**_{[2n#2n]}, is symplectic
if **A**^{T}**KA** = **K** where **K** = [**0** **I**;
-**I** **0**].

*See also*: hamiltonian

A Toeplitz matrix has constant diagonals. In other words *a(i,j)*
depends only on (*i-j*).

- A Toeplitz matrix is persymmetric and so, if it exists, is its inverse.
- [**A**, **B**: Toeplitz] **A** + **B** and **A** - **B** are Toeplitz.
- [**A**: Toeplitz] If **J** is the exchange matrix, then **JAJ** is Toeplitz; **JA** = **A**^{T}**J** and **AJ** = **JA**^{T} are Hankel.
- A Toeplitz matrix **A** may be decomposed as **A** = **TOE**(**b**) + **TOE**(**c**)^{T} for some vectors **b** and **c** (see the Notation section for the **TOE**() definition).

**A** is upper triangular if *a(i,j)* = 0 whenever *i* > *j*, and lower
triangular if *a(i,j)* = 0 whenever *i* < *j*.

A triangular matrix is *strictly triangular* if its diagonal elements all equal 0.

A triangular matrix is *unit triangular* if its diagonal elements all equal 1.

- [*Real*]: An orthogonal triangular matrix must be diagonal.
- The determinant of a triangular matrix is the product of its diagonal elements.
- If **A** is unit triangular then inv(**A**) exists and is unit triangular.
- A strictly triangular matrix is nilpotent.
- The set of upper triangular matrices is closed under multiplication, addition and (where possible) inversion.
- The set of lower triangular matrices is closed under multiplication, addition and (where possible) inversion.

**A** is tridiagonal if *a(i,j)* = 0 whenever |*i*-*j*| > 1. In other words its non-zero elements lie either on or immediately adjacent to the main diagonal.

- **A** is tridiagonal iff it is both upper and lower Hessenberg.

A complex square matrix **A** is unitary if **A**^{H}**A** = **I**.
**A** is also sometimes called an *isometry*.

A real unitary matrix is called orthogonal. The following properties apply to orthogonal matrices as well as to unitary matrices.

- Unitary matrices are closed under multiplication, raising to an integer power and inversion.
- **U** is unitary iff **U**^{H} is unitary.
- Unitary matrices are normal.
- **U** is unitary iff ||**Ux**|| = ||**x**|| for all **x**.
- The eigenvalues of a unitary matrix all have an absolute value of 1.
- The determinant of a unitary matrix has an absolute value of 1.
- A matrix is unitary iff its columns form an orthonormal basis.
- **U** is unitary iff **U** = exp(**K**) or **K** = ln(**U**) for some skew-Hermitian **K**.
- For any complex *a* with |*a*| = 1, there is a 1-to-1 correspondence between the unitary matrices, **U**, not having *a* as an eigenvalue and skew-Hermitian matrices, **K**, given by **U** = *a*(**K**-**I**)(**I**+**K**)^{-1} and **K** = (*a***I**+**U**)(*a***I**-**U**)^{-1}. These are Cayley's formulae.
  - Taking *a* = -1 gives **U** = (**I**-**K**)(**I**+**K**)^{-1} and **K** = (**I**-**U**)(**I**+**U**)^{-1}.

An *n*#*n* Vandermonde matrix is of the form [**x**^{n-1} ... **x**^{2}
**x** **1**] for some column vector **x**. A general element is given by *v(i,j)* = *x*(*i*)^{n-j}.
The rightmost column of the matrix has all elements equal to 1.

Vandermonde matrices arise in connection with fitting polynomials to data.
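For instance (a sketch with a hypothetical quadratic), fitting a polynomial through *n* points amounts to solving a Vandermonde system; `np.vander` uses the same column order as the definition above.

```python
import numpy as np

# Fit y = 1 + x^2 through three sample points.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 2.0, 5.0])
V = np.vander(x)                 # columns x^2, x, 1 as in the definition
c = np.linalg.solve(V, y)
assert np.allclose(c, [1.0, 0.0, 1.0])   # coefficients of x^2, x, 1
```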

The zero matrix, **0**, has *a(i,j)* = 0 for all *i*, *j*.

- [*Complex*]: **A** = **0** iff **x**^{H}**Ax** = 0 for all **x**.
- [*Real*]: If **A** is symmetric, then **A** = **0** iff **x**^{T}**Ax** = 0 for all **x**.
- [*Real*]: **A** = **0** iff **x**^{T}**Ay** = 0 for all **x** and **y**.
- **A** = **0** iff **A**^{H}**A** = **0**.

The Matrix Reference Manual is written by Mike Brookes, Imperial College, London, UK. Please send any comments or suggestions to mike.brookes@ic.ac.uk