basis for a subspace:
A basis for a subspace W is a set of vectors v1, ..., vk in W such that:
- v1, ..., vk are [linearly independent](#linearly independent); and
- v1, ..., vk span W.
characteristic polynomial of a matrix:
The characteristic polynomial of an n by n matrix A is the polynomial in t given by the formula det(A - t*I).
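For instance, a minimal sketch in Python (assuming SymPy is available; the matrix A is just an illustrative choice) that follows the formula directly:

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[2, 1],
               [1, 2]])
p = sp.expand((A - t * sp.eye(2)).det())   # det(A - t*I)
print(p)                # t**2 - 4*t + 3
print(sp.solve(p, t))   # [1, 3], the eigenvalues of A
```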
column space of a matrix:
The column space of a matrix is the subspace spanned by the columns of the matrix considered as vectors. See also: [row space](#row space).
consistent linear system:
A system of linear equations is consistent if it has at least one solution. See also: inconsistent.
defective matrix:
A matrix A is defective if A has an eigenvalue whose [geometric multiplicity](#geometric multiplicity) is less than its [algebraic multiplicity](#algebraic multiplicity).
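A minimal sketch, assuming SymPy, of the standard 2 by 2 example of a defective matrix:

```python
import sympy as sp

# The eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1,
# so this matrix is defective (and therefore not diagonalizable).
A = sp.Matrix([[1, 1],
               [0, 1]])
for eigenvalue, algebraic_mult, eigenvectors in A.eigenvects():
    print(eigenvalue, algebraic_mult, len(eigenvectors))   # 1 2 1
```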
diagonalizable matrix:
A matrix is diagonalizable if it is [similar](#similar) to a diagonal matrix.
row echelon form of a matrix:
A matrix is in row echelon form if:
- all rows that consist entirely of zeros are grouped together at the bottom of the matrix; and
- the first (counting left to right) nonzero entry in each nonzero row appears in a column to the right of the first nonzero entry in the preceding row (if there is a preceding row).
A matrix is in reduced row echelon form if:
- the matrix is in [row echelon form](#row echelon form);
- the first nonzero entry in each nonzero row is the number 1; and
- the first nonzero entry in each nonzero row is the only nonzero entry in its column.
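As an illustration, assuming SymPy, Matrix.rref() returns the reduced row echelon form of a matrix together with its pivot columns:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 7]])
R, pivot_columns = A.rref()
print(R)               # the reduced row echelon form, with rows (1, 2, 0) and (0, 0, 1)
print(pivot_columns)   # (0, 2)
```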
eigenspace of a matrix:
The eigenspace associated with the eigenvalue c of a matrix A is the [null space](#null space) of A - c*I.
eigenvalue of a matrix:
An eigenvalue of an n by n matrix A is a scalar c such that A*x = c*x holds for some nonzero vector x (where x is an n-tuple). See also: eigenvector.
eigenvector of a matrix:
An eigenvector of an n by n matrix A is a nonzero vector x such that A*x = c*x holds for some scalar c. See also: eigenvalue.
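Numerically, assuming NumPy, numpy.linalg.eig returns the eigenvalues and matching eigenvectors, so the defining equation A*x = c*x can be checked column by column:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, V = np.linalg.eig(A)
print(eigenvalues)                            # the eigenvalues of A, here 3 and 1
print(np.allclose(A @ V, V * eigenvalues))    # True: each column of V is an eigenvector
```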
equivalent linear systems:
Two systems of linear equations in n unknowns are equivalent if they have the same set of solutions.
homogeneous linear system:
A system of linear equations A*x = b is homogeneous if b = 0.
inconsistent linear system:
A system of linear equations is inconsistent if it has no solutions. See also: consistent.
inverse of a matrix:
The matrix B is an inverse for the matrix A if AB = BA = I.
invertible matrix:
A matrix is invertible if it has an inverse.
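A quick numerical check of the last two definitions, assuming NumPy (the matrix A is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.linalg.inv(A)     # raises LinAlgError if A has no inverse
# B is an inverse for A: AB = BA = I
print(np.allclose(A @ B, np.eye(2)), np.allclose(B @ A, np.eye(2)))   # True True
```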
least-squares solution of a linear system:
A least-squares solution to a system of linear equations A*x = b is a vector x that minimizes the length of the vector A*x - b.
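For example, assuming NumPy, numpy.linalg.lstsq computes such a minimizer for an overdetermined system:

```python
import numpy as np

# Fit b ~ x0 + x1*t at t = 0, 1, 2; this 3 by 2 system A*x = b has no exact solution.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([0.0, 1.0, 3.0])
x, residuals, rank, singular_values = np.linalg.lstsq(A, b, rcond=None)
print(x)   # approximately [-0.1667, 1.5], minimizing the length of A*x - b
```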
linear combination of vectors:
A vector v is a linear combination of the vectors v1, ..., vk if there exist scalars a1, ..., ak such that v = a1*v1 + ... + ak*vk.
linearly dependent vectors:
The vectors v1, ..., vk are linearly dependent if the equation a1*v1 + ... + ak*vk = 0 has a solution where not all the scalars a1, ..., ak are zero.
linearly independent vectors:
The vectors v1, ..., vk are linearly independent if the only solution to the equation a1*v1 + ... + ak*vk = 0 is the solution where all the scalars a1, ..., ak are zero.
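One practical test, assuming NumPy: vectors placed as the columns of a matrix are linearly independent exactly when the rank of that matrix equals the number of vectors:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])           # v3 = v1 + v2
M = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(M))           # 2 (< 3), so v1, v2, v3 are linearly dependent
print(np.linalg.matrix_rank(M[:, :2]))    # 2, so v1, v2 alone are linearly independent
```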
linear transformation:
A linear transformation from V to W is a function T from V to W such that:
- T(u+v) = T(u) + T(v) for all vectors u and v in V; and
- T(a*v) = a*T(v) for all vectors v in V and all scalars a.
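A small numerical sketch, assuming NumPy: multiplication by a fixed matrix defines a linear transformation, and both conditions can be spot-checked directly:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
T = lambda v: A @ v                          # T maps 2-space to 2-space

u, v, a = np.array([1.0, 2.0]), np.array([3.0, -1.0]), 2.5
print(np.allclose(T(u + v), T(u) + T(v)))    # True: T(u+v) = T(u) + T(v)
print(np.allclose(T(a * v), a * T(v)))       # True: T(a*v) = a*T(v)
```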
multiplicity of an eigenvalue:
The algebraic multiplicity of an eigenvalue c of a matrix A is the number of times the factor (t-c) occurs in the [characteristic polynomial](#characteristic polynomial) of A.
The geometric multiplicity of an eigenvalue c of a matrix A is the dimension of the eigenspace of c.
nonsingular matrix:
An n by n matrix A is nonsingular if the only solution to the equation A*x = 0 (where x is an n-tuple) is x = 0. See also: singular.
null space of a matrix:
The null space of an m by n matrix A is the set of all n-tuples x such that A*x = 0.
null space of a linear transformation:
The null space of a [linear transformation](#linear transformation) T is the set of vectors v in its domain such that T(v) = 0.
nullity of a matrix:
The nullity of a matrix is the dimension of its [null space](#null space of a matrix).
nullity of a linear transformation:
The nullity of a [linear transformation](#linear transformation) is the [dimension](#dimension) of its [null space](#null space).
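As an example, assuming SymPy, Matrix.nullspace() returns a basis for the null space of a matrix, and the nullity is the number of basis vectors:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])
basis = A.nullspace()   # basis vectors x with A*x = 0
print(len(basis))       # 2, the nullity of A (a rank 1 matrix with 3 columns)
```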
orthogonal set of vectors:
A set of n-tuples is orthogonal if the dot product of any two of them is 0.
orthogonal matrix:
A matrix A is orthogonal if A is invertible and its [inverse](#inverse) equals its transpose.
orthonormal set of vectors:
A set of n-tuples is orthonormal if it is orthogonal and each vector has length 1.
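For instance, assuming NumPy, the columns of a rotation matrix form an orthonormal set, and the matrix itself is orthogonal (its inverse equals its transpose):

```python
import numpy as np

theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(Q.T @ Q, np.eye(2)))      # True: the columns are orthonormal
print(np.allclose(np.linalg.inv(Q), Q.T))   # True: the inverse equals the transpose
```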
range of a matrix:
The range of an m by n matrix A is the set of all m-tuples A*x, where x is any n-tuple.
range of a linear transformation:
The range of a [linear transformation](#linear transformation) T is the set of all vectors T(v), where v is any vector in its domain.
rank of a matrix:
The rank of a matrix is the number of nonzero rows in any [row equivalent](#row equivalent) matrix that is in [row echelon form](#row echelon form).
rank of a linear transformation:
The rank of a linear transformation (and hence of any matrix regarded as a [linear transformation](#linear transformation)) is the dimension of its [range](#range of a linear transformation). Note: A theorem tells us that the two definitions of rank of a matrix are equivalent.
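A small check that the two descriptions of the rank of a matrix agree, assuming NumPy and SymPy:

```python
import numpy as np
import sympy as sp

A = [[1, 2, 3],
     [2, 4, 6],
     [1, 1, 1]]
R, pivot_columns = sp.Matrix(A).rref()
print(len(pivot_columns))                    # 2 nonzero rows in the echelon form
print(np.linalg.matrix_rank(np.array(A)))    # 2, the dimension of the range
```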
row equivalent matrices:
Two matrices are row equivalent if one can be obtained from the other by a sequence of elementary [row operations](#row operations).
row space of a matrix:
The row space of a matrix is the subspace spanned by the rows of the matrix considered as vectors. See also: [column space](#column space).
singular matrix:
An n by n matrix A is singular if the equation A*x = 0 (where x is an n-tuple) has a nonzero solution for x. See also: nonsingular.
span of a set of vectors:
The span of the vectors v1, ..., vk is the subspace V consisting of all [linear combinations](#linear combination) of v1, ..., vk. One also says that the subspace V is spanned by the vectors v1, ..., vk and that these vectors span V.
subspace:
A subset W of n-space is a subspace if:
- the zero vector is in W;
- x+y is in W whenever x and y are in W; and
- a*x is in W whenever x is in W and a is any scalar.
symmetric matrix:
A matrix is symmetric if it equals its transpose.