SolitaryRoad.com

Website owner: James Miller


VECTOR SPACE, SUBSPACE, BASIS, DIMENSION, LINEAR INDEPENDENCE

Vector spaces and subspaces – examples.

Let A and B be any two non-collinear vectors in the x-y plane. Then any other vector X in the
plane can be expressed as a linear combination of vectors A and B. That is, for any vector X
there exist numbers k_{1} and k_{2} such that X = k_{1}A + k_{2}B. Conversely, any linear combination of
vectors A and B gives a vector in the x-y plane. Note the closure idea involved. Any vector in
the plane can be obtained as a linear combination of A and B, and any linear combination gives
some vector in the plane. It is a closed system. It is a vector space.
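As a minimal numerical sketch of this fact (the vectors A, B and X below are hypothetical values chosen for illustration, not taken from the text), the coefficients k_{1} and k_{2} can be found by solving a 2×2 linear system whose columns are A and B:

```python
import numpy as np

# Two non-collinear vectors in the x-y plane (illustrative values).
A = np.array([1.0, 2.0])
B = np.array([3.0, 1.0])

# Any other vector X in the plane (illustrative value).
X = np.array([5.0, 5.0])

# Solve [A B] k = X for the coefficients k1, k2.
k = np.linalg.solve(np.column_stack([A, B]), X)

print(k)                     # the coefficients k1, k2
print(k[0] * A + k[1] * B)   # reconstructs X
```

The solve succeeds precisely because A and B are non-collinear, i.e. the 2×2 matrix [A B] is nonsingular.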

Let A, B and C be any three non-coplanar vectors in an x-y-z Cartesian coordinate system. Then
any vector in this x-y-z coordinate system can be expressed as a linear combination of A, B and
C. That is, there exist numbers k_{1}, k_{2} and k_{3} such that X = k_{1}A + k_{2}B + k_{3}C for any vector X.
Conversely, any linear combination of A, B and C gives some vector in the x-y-z coordinate
system. Note the closure idea involved. It is a closed system. It is a vector space.

Pass any plane through the origin of an x-y-z Cartesian coordinate system. Denote the plane by
K. Let A and B be any two non-collinear vectors lying in plane K. Then any linear combination
of vectors A and B is a vector lying in plane K (i.e. if c_{1} and c_{2} are any two numbers, the vector X
= c_{1}A + c_{2}B lies in plane K). Moreover, any vector lying in plane K can be expressed as a linear
combination of vectors A and B (i.e. for any vector X in plane K there exist numbers c_{1} and c_{2}
such that X = c_{1}A + c_{2}B ). Note the closure concept involved. It is a closed system. The
totality of all vectors in plane K constitutes a vector space. Vectors A and B constitute a basis for
the space – as would any other set of two non-collinear vectors lying in K. The dimension of the
space is “two” (it is a two-dimensional space). This space constitutes a two-dimensional
subspace of the three-dimensional space of the last paragraph. In fact, any plane passing through
the origin of the x-y-z coordinate system constitutes a two-dimensional subspace of three-dimensional space.

Pass any line through the origin of an x-y-z Cartesian coordinate system. Denote the line by L. Let A be any nonzero vector lying in the line. Then any multiple of vector A is a vector lying in line L. Moreover, any vector lying in line L can be expressed as a multiple of vector A. Note the closure concept involved. It is a closed system. The totality of all vectors in line L constitutes a vector space. Line L is a one-dimensional subspace of three-dimensional space.

Linearly dependent and independent sets of vectors

Linear combination of vectors. The vector c_{1}x_{1} + c_{2}x_{2} + ... + c_{m}x_{m } with arbitrary
numerical values for the coefficients c_{1}, c_{2}, ... ,c_{m} is called a linear combination of the vectors
x_{1}, x_{2}, ... ,x_{m} .

Linearly dependent and independent sets of vectors. A set of vectors x_{1}, x_{2}, ... ,x_{m}
is said to be linearly dependent if some one of the vectors in the set can be expressed as a linear
combination of one or more of the other vectors in the set. If none of the vectors in the set can be
expressed as a linear combination of any other vectors of the set, then the set is said to be
linearly independent.

Examples from three-dimensional space. To illustrate the concepts let us consider some examples from three dimensional space.

Let A, B and C be any three non-coplanar vectors in an x-y-z Cartesian coordinate system. Then any vector in this x-y-z coordinate system can be expressed as a linear combination of A, B and C. However, none of these three vectors A, B and C can be expressed as a linear combination of the other two. The vectors A, B and C constitute a linearly independent set. Now add another vector D to the set. Consider the set A, B, C and D. This set is a dependent set because vector D can be expressed as a linear combination of A, B and C.

Pass any plane through the origin of an x-y-z Cartesian coordinate system. Denote the plane by K. Let A and B be any two non-collinear vectors lying in plane K. Then any linear combination of vectors A and B is a vector lying in plane K. However, neither vector A nor B can be expressed as a linear combination of the other. The two vectors form a linearly independent set. Now add another vector C lying in plane K to the set. The set A, B and C is a dependent set because vector C can be expressed as a linear combination of A and B.

A necessary and sufficient condition for the independence of a set of vectors.

Theorem. A necessary and sufficient condition for the set of vectors x_{1}, x_{2}, ... ,x_{m} to be
linearly independent is that

c_{1}x_{1} + c_{2}x_{2} + ... + c_{m}x_{m} = 0

only when all the scalars c_{i} are zero.

What is the reasoning that leads to the assertion of this theorem? Well, a set of vectors x_{1}, x_{2},
... ,x_{m} is linearly dependent if some one of the vectors in the set can be expressed as a linear
combination of one or more of the other vectors in the set. This assertion is equivalent to the
assertion that a set of vectors is linearly dependent if there exist two or more vectors x_{i}, x_{j}, etc.
such that

c_{i}x_{i} + c_{j}x_{j} + ... = 0

where c_{i}, c_{j}, etc. are non-zero. Said differently, a set is linearly dependent if there exist two or
more non-zero c’s for which the following equation holds true:

c_{1}x_{1} + c_{2}x_{2} + ... + c_{m}x_{m} = 0

If there do not exist two or more non-zero c’s for which it holds, then the set of vectors is
independent. The case in which exactly one of the c’s is non-zero is impossible for non-zero
vectors, since c_{i}x_{i} = 0 with c_{i} non-zero would force x_{i} = 0. Thus the set of vectors is linearly independent if and only if

c_{1}x_{1} + c_{2}x_{2} + ... + c_{m}x_{m} = 0

only when all the scalars c_{i} are zero.

Linear dependence or independence of a set of vectors is determined from the
rank of a matrix formed from them. Consider a matrix formed from m n-vectors with
each vector corresponding to a row in the matrix. If the rank of the matrix is m the set of vectors
is linearly independent. If the rank is less than m the set of vectors is linearly dependent. If the
rank r is less than m then there are r vectors in the set which form a linearly independent set and
each of the remaining m - r vectors can be expressed as a linear combination of these
r independent vectors.
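This rank test can be sketched numerically with numpy (the vectors below are illustrative values; the fourth row is deliberately constructed as x_{1} + 2x_{2} so the full set is dependent):

```python
import numpy as np

# Rows of M are the vectors x1..x4 (m = 4 vectors in 3-space).
# The first three rows are independent; the fourth is x1 + 2*x2.
M = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [1.0, 2.0, 0.0],
])

r = np.linalg.matrix_rank(M)
print(r)                              # 3 < 4, so the four vectors are dependent
print(np.linalg.matrix_rank(M[:3]))  # 3 = 3, the first three are independent
```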

Space spanned by a set of vectors. Let x_{1}, x_{2}, ... ,x_{m } be a set of n-vectors in n-dimensional space. They may be a linearly independent set or a linearly dependent set of vectors,
it doesn’t matter. The set of all linear combinations of these vectors corresponds either to all of
n-dimensional space or to some subspace of n-dimensional space. The vector space generated by
all linear combinations of x_{1}, x_{2}, ... ,x_{m } is called the subspace spanned by x_{1}, x_{2}, ... ,x_{m }.

Example. Let us illustrate the concept with an example from three dimensional space.

Pass any plane through the origin of an x-y-z Cartesian coordinate system. Denote the plane by K. Let A, B, C, D and E be five vectors lying in plane K, not all lying in a single line. These five vectors form a set that spans plane K, a subspace of three dimensional space. Any linear combination of these vectors lies in plane K and no linear combination lies outside the plane. Now add to this set a vector F which lies outside this plane. The new set of vectors (vectors A, B, C, D, E and F) spans all of three dimensional space. Why? Because the set now contains three linearly independent vectors, and three independent vectors span all of three dimensional space.

Basis of a vector space. A basis of a vector space is any set of linearly independent vectors that spans the space. Each vector of the space is then a unique linear combination of the vectors of this basis.

Examples.

In three dimensional space any set of three non-coplanar vectors constitutes a basis for the space (choose any three non-coplanar vectors and they qualify as a basis). Any vector in the space can be expressed as a linear combination of these basis vectors and, conversely, any linear combination of these three basis vectors lies in three dimensional space. This basis plays the same role as a set of coordinate axes (it can be viewed as a non-orthogonal set of coordinate axes that serves as a reference frame in which we can express any other vector in the space).

Any two non-collinear vectors in three-dimensional space define a plane that constitutes a subspace of three-dimensional space, since any linear combination of these two vectors lies in the plane and, conversely, any vector in the plane can be expressed as a linear combination of these two vectors. Thus these two vectors constitute a basis for a two dimensional subspace of three dimensional space. Similarly, a single non-zero vector in 3-space constitutes a basis for a one dimensional subspace of 3-space.

In two dimensional space any set of two non-collinear vectors constitutes a basis for the space. These two basis vectors then serve as a non-orthogonal reference frame in which any other vector in the space can be expressed.

Dimension of a vector space. The dimension of a vector space is the number of linearly independent vectors required to span the space.

Subspaces. N-dimensional space V_{n}(F) has embedded in it subspaces of lesser dimensions.
For example, ordinary three-dimensional space has embedded in it two-dimensional subspaces in
the form of planes passing through the origin of the coordinate system and one-dimensional
subspaces in the form of lines passing through the origin. An r-dimensional subspace of V_{n}(F) is
denoted by V_{n}^{r}(F). A two-dimensional subspace of ordinary three-dimensional space V_{3}(R)
would, for example, be denoted by V_{3}^{2}(R).

Row space of a matrix. The row space of a matrix is that subspace spanned by the rows of the matrix (rows viewed as vectors). It is that space defined by all linear combinations of the rows of the matrix.

Consider a matrix containing five rows and three columns. The rows may be viewed as 3-vectors spanning some subspace of three-dimensional space. If the rows contain three linearly independent vectors they span all of three-dimensional space. If the rows contain only two linearly independent vectors they span the subspace of three-dimensional space defined by these two vectors (some plane passing through the origin). If all rows are multiples of some one row they represent a one-dimensional subspace of three dimensional space corresponding to some line passing through the origin.

The effect of the elementary row operations on a matrix is to produce other sets of rows in the
same row space. If the rows of a matrix A span some subspace K of n-space V_{n} then the
elementary row operations will produce another matrix whose row vectors span the same
subspace of V_{n} . Row-equivalent matrices have the same row space. The dimension of the row
space corresponds to the number of linearly independent vectors required to span the row space
— which is equal to the rank of the matrix.
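A small sketch of this invariance (the matrix below is an illustrative value, built so that its third row is the sum of the first two and its rank is 2):

```python
import numpy as np

# Illustrative matrix: third row = row1 + row2, so the rank is 2.
M = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [1.0, 3.0, 7.0]])

# An elementary row operation: replace row 3 by (row 3 - row 1).
M2 = M.copy()
M2[2] -= M2[0]

# Row-equivalent matrices have the same row space, hence the same rank.
print(np.linalg.matrix_rank(M), np.linalg.matrix_rank(M2))
```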

Column space of a matrix. The column space of a matrix is the subspace spanned by the columns of the matrix (columns viewed as vectors).

Theorem. For any m×n matrix the dimension of its row space is equal to the dimension of its column space, and both dimensions are equal to its rank. In other words, the number of linearly independent rows in a matrix is equal to the number of linearly independent columns.
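The equality of row rank and column rank can be sketched numerically (the matrix is an illustrative value with its third row equal to the sum of the first two):

```python
import numpy as np

# Illustrative 3x4 matrix: row3 = row1 + row2, so only two rows are independent.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

# The rank of A counts independent rows; the rank of A's transpose
# counts independent columns. The theorem says they coincide.
print(np.linalg.matrix_rank(A))    # row rank
print(np.linalg.matrix_rank(A.T))  # column rank, equal to the row rank
```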

Sum space of two vector spaces. The sum of two vector spaces P and Q is defined as the totality of all vectors x + y where x is in P and y is in Q. This is a vector space and we call it the sum space of P and Q. The sum space can be regarded as the space spanned by the union of the bases of the spaces P and Q.

Intersection space of two spaces. The intersection space of two vector spaces is the set of all vectors that belong to both spaces.

Example. A plane passing through the origin of an x-y-z Cartesian system in ordinary three-dimensional space represents a two-dimensional subspace of three-dimensional space. Consider two planes P and Q passing through the origin which are assumed not to coincide. The sum space of the two planes is the whole three dimensional space and the intersection space is a straight line (the line of their intersection).

Theorem. The dimensions p and q of two given spaces, the dimension t of their sum and the dimension s of their intersection satisfy the following relation:

p + q = t + s
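The relation can be checked numerically for a two-plane example like the one above (the basis vectors below are illustrative choices: the x-y plane and the x-z plane, which intersect in the x-axis):

```python
import numpy as np

# Bases (as rows) for two distinct planes P and Q through the origin.
P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # the x-y plane
Q = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])   # the x-z plane

p = np.linalg.matrix_rank(P)                  # dimension of P
q = np.linalg.matrix_rank(Q)                  # dimension of Q
t = np.linalg.matrix_rank(np.vstack([P, Q]))  # dimension of the sum space
s = p + q - t                                 # dimension of the intersection

print(p, q, t, s)   # the planes sum to all of 3-space and meet in a line
```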

Coordinate systems in vector spaces. Consider a k-dimensional vector space with
basis vectors A_{1}, A_{2} , ... ,A_{k} so that an arbitrary vector X of the space has a unique representation

X = u_{1}A_{1} + u_{2}A_{2} + ... + u_{k}A_{k}

The vectors A_{1}, A_{2} , ... ,A_{k} are called a coordinate system or reference system in the space and
u_{1}, u_{2}, ... ,u_{k} are the coordinates of X with respect to this system. Thus we can call the k-tuple
{u_{1}, u_{2}, ... ,u_{k} } the coordinate vector of X relative to the basis {A_{1}, A_{2} , ... ,A_{k} } and denote X by

X_{A} = {u_{1}, u_{2}, ... ,u_{k} }

where A denotes the basis {A_{1}, A_{2} , ... ,A_{k} }.

E-basis Coordinate System. The n-vectors

E_{1} = [1, 0, 0, ...., 0]

E_{2} = [0, 1, 0, ...., 0]

..............................

E_{n} = [0, 0, 0, ...., 1]

are called the elementary or unit n-vectors. The elementary vector E_{j}, whose j-th component is 1,
is called the j-th elementary n-vector. The elementary vectors E_{1}, E_{2}, ... ,E_{n} constitute an
important basis for V_{n}(F). Every vector X = [x_{1}, x_{2}, ... ,x_{n}] of n-space V_{n}(F), can be expressed
uniquely as the sum

X = x_{1}E_{1} + x_{2}E_{2} + ... + x_{n}E_{n}

of the elementary vectors. The components x_{1}, x_{2}, ... ,x_{n} of X are now called the coordinates of
X relative to the E-basis.

Changes in coordinates due to change in basis. Let us consider the problem of how the coordinates of vectors are changed on transition from one basis to another in an

n-dimensional vector space.

Let the original basis be the usual E-basis E_{1}, E_{2}, ... ,E_{n} . Let x_{1}, x_{2}, ... ,x_{n} be the coordinates of
a vector X with respect to the E-basis. Let Z_{1}, Z_{2}, ... ,Z_{n} be some other arbitrary basis. Then
there exist unique numbers a_{1}, a_{2}, ... ,a_{n} such that

X = a_{1}Z_{1} + a_{2}Z_{2} + ... + a_{n}Z_{n}

These numbers a_{1}, a_{2}, ... ,a_{n} represent the coordinates of X relative to the Z-basis. Writing X_{Z} for the column vector [a_{1}, a_{2}, ... ,a_{n}]^{T}, we have

X = [ Z_{1}, Z_{2}, ... ,Z_{n} ] X_{Z} = ZX_{Z}

where Z is the matrix [ Z_{1}, Z_{2}, ... ,Z_{n} ] whose columns are the basis vectors Z_{1}, Z_{2}, ... ,Z_{n} .

Thus we have the following result: the coordinates of a vector X with respect to the E-basis are related to its coordinates with respect to some other Z-basis by

X = ZX_{Z}

where matrix Z , whose columns are the new basis vectors Z_{1}, Z_{2}, ... ,Z_{n} , is called the
“matrix of the coordinate transformation”.
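Computing the Z-coordinates from the E-coordinates is then a matter of solving X = ZX_{Z} for X_{Z} (the basis vectors and X below are illustrative values):

```python
import numpy as np

# Columns of Z are the basis vectors Z1, Z2, Z3 (illustrative values).
Z = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

X = np.array([2.0, 3.0, 1.0])   # coordinates of X relative to the E-basis

# X = Z X_Z, so the Z-coordinates are obtained by solving the system.
X_Z = np.linalg.solve(Z, X)

print(X_Z)        # coordinates of X relative to the Z-basis
print(Z @ X_Z)    # recovers X
```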

Now let us ask what happens to the coordinates of a point if we move from some Z-basis given
by {Z_{1}, Z_{2}, ... ,Z_{n} } to some other W-basis given by {W_{1}, W_{2}, ... ,W_{n} }. We know

X = ZX_{Z}

X = WX_{W}

So

WX_{W} = ZX_{Z}

and

X_{W} = W^{-1}ZX_{Z}

Theorem. If a vector of V_{n}(F) has coordinates X_{Z }and X_{W} relative to bases {Z_{1}, Z_{2}, ... ,Z_{n} }
and {W_{1}, W_{2}, ... ,W_{n} } of V_{n}(F), there exists a nonsingular matrix P, determined solely by the
two bases and given by P = W^{-1}Z, such that X_{W} = PX_{Z} .
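A short sketch of this theorem with two hypothetical bases (all numerical values below are illustrative assumptions):

```python
import numpy as np

# Two bases of 3-space, stored as columns (illustrative values).
Z = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
W = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

X = np.array([3.0, 4.0, 5.0])     # a vector in E-coordinates

X_Z = np.linalg.solve(Z, X)       # coordinates of X relative to the Z-basis
P = np.linalg.solve(W, Z)         # the transition matrix P = W^{-1} Z
X_W = P @ X_Z                     # coordinates of X relative to the W-basis

print(np.allclose(W @ X_W, X))    # both coordinate vectors describe the same X
```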
