SPANNING, LINEAR INDEPENDENCE, BASIS

Spanning -- What does it Mean??

Before we can explain "spanning", we need to define a "linear combination" of vectors.

Def'n: if vector w can be expressed as k1v1 + k2v2 + .... + knvn , where k1, k2, .... kn are scalars, then
w is a linear combination of v1, v2, .... vn .
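The definition above can be sketched in a few lines of code. This is a minimal illustration, not from the lesson itself; the function name and the example vectors are made up.

```python
# A linear combination of vectors v1..vn with scalar weights k1..kn.
# Vectors are modelled as plain Python tuples (illustrative only).

def linear_combination(weights, vectors):
    """Return k1*v1 + k2*v2 + ... + kn*vn as a tuple."""
    dim = len(vectors[0])
    result = [0] * dim
    for k, v in zip(weights, vectors):
        for i in range(dim):
            result[i] += k * v[i]
    return tuple(result)

# Example: w = 2*(1, 0) + 3*(0, 1) = (2, 3)
w = linear_combination([2, 3], [(1, 0), (0, 1)])
print(w)  # (2, 3)
```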

Now that we know that, the best way to understand what "Spanning" means is to call it
"Generating" instead of spanning. Spanning makes us think of a bridge reaching OVER a space -- whereas Generating evokes the image of a garden where things
grow up or out FROM WITHIN a space.

Our text books define it this way:

Def'n: A set of vectors S = { v1, v2, .... vn } spans a vector space V,
if every vector in V can be expressed as a linear combination of { v1, v2, .... vn }.

The definition says that if every vector in V can be GENERATED by a
combination of some or all vectors in set S, then S spans V.

Think of the vectors in S as SEEDS for the Garden (the vector space V). We can mix them together with soil, sun and water (our linear combinations) to Grow or Generate all the plants in the garden (all the vectors in the Space).

Or consider them the ingredients (elements) such as flour, eggs, milk, etc., that we combine to Generate Yummy Baked Stuff (cakes, breads and cookies). We use a unique combination of those ingredients to generate the specific cake, bread or cookies on the menu, so we can say that those ingredients SPAN the Vector Space of Yummy Baked Stuff (cakes, breads and cookies).

_________________________

Notation: lin(S) or lin { v1, v2, .... vn } is the notation for the linear space spanned by S.

_________________________

Example: Determine if the vectors u and v span R².

Solution: If every vector (a, b) in R² can be expressed as k1u + k2v for some k1 and k2 in R, then the set of vectors { u, v } spans R².

Setting k1u + k2v = (a, b) and equating components, we rewrite this as a 2 × 2 linear system in k1 and k2.

Since the determinant of the system matrix = – 2, the system has a unique solution for every a and b in R, so u and v Span R², the 2-dimensional vectors. We can generate any vector in the Cartesian plane by a linear combination of these 2 vectors.
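The determinant test can be checked by hand or in code. The vectors u and v below are illustrative stand-ins (the lesson's own vectors are not reproduced here); they were chosen so the system matrix has determinant – 2, matching the text.

```python
# Sketch of the determinant test for spanning R^2.

def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - b * c

u = (1, 1)   # hypothetical vector, for illustration only
v = (1, -1)  # hypothetical vector, for illustration only

# The columns of the system matrix are u and v.
d = det2(u[0], v[0], u[1], v[1])
print(d)  # -2: nonzero, so {u, v} spans R^2
```

A nonzero determinant guarantees a unique (k1, k2) for every (a, b), which is exactly the spanning condition.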

However, we could do the same with 3 vectors: u, v and w.

The difference is that vector w can be GENERATED by the other 2, so it is unnecessary.

We can show that 3u – 2v = w which means that 3u – 2v – w = 0. So, though this set of 3 vectors SPANS R², it is a linearly dependent (definition follows) set of vectors.

Linear Independence -- What does it Mean??

The "street-talk" explanation of linear independence says: the only way to get the zero vector from a combination of the given vectors is to TAKE NONE OF THEM.

That means the only solution to k1v1 + k2v2 + .... + knvn = 0 is k1 = k2 = .... = kn = 0.

In the last example, we found 3u – 2v – w = 0 so, {u, v, w} is a linearly dependent set.

__________________________

Def'n: a set of vectors S = { v1, v2, .... vn } is linearly independent if k1v1 + k2v2 + ... + knvn = 0 has only the trivial solution. ie: all k's = 0.
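For two vectors in R², the definition reduces to a determinant check: k1u + k2v = 0 has only the trivial solution exactly when the matrix with columns u and v has nonzero determinant. The vectors in the example calls below are made up for illustration.

```python
# Independence test for a pair of vectors in R^2.

def independent_2d(u, v):
    """True if {u, v} is linearly independent in R^2."""
    return u[0] * v[1] - v[0] * u[1] != 0

print(independent_2d((1, 1), (1, -1)))  # True: neither is a multiple of the other
print(independent_2d((1, 2), (2, 4)))   # False: (2, 4) = 2*(1, 2)
```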

Example: the vectors u and v are linearly independent because
the only solution to k1u + k2v = 0 is 0u + 0v = 0 -- so they satisfy the definition.

________________________

Example: the vectors v1, v2 and v3
are linearly dependent because 3v1 + v2 – v3 = 0.

Note: If any vector in the set is a linear combination of other vectors, the set is dependent.
Here we can see that 3v1 + v2 = v3 .
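A dependence relation like this can be verified directly. The particular vectors below are made up for illustration (the lesson's own v1, v2, v3 aren't reproduced here); they were built so that v3 = 3v1 + v2.

```python
# Checking the dependence relation 3*v1 + v2 - v3 = 0 componentwise.

v1 = (1, 0, 2)   # hypothetical
v2 = (0, 1, 1)   # hypothetical
v3 = (3, 1, 7)   # = 3*v1 + v2, so the set is dependent

combo = tuple(3 * a + b - c for a, b, c in zip(v1, v2, v3))
print(combo)  # (0, 0, 0): a nontrivial combination gives the zero vector
```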

___________________

A familiar set of linearly independent vectors is { (1, 0, 0), (0, 1, 0), (0, 0, 1) }, denoted { i, j, k }, from which every 3-dimensional vector can be uniquely generated.

This set therefore is linearly independent and it spans R³, so it forms a BASIS FOR R³.

Put 'Em Together & What've You Got?? A Basis!

Def'n: If S = { v1, v2, .... vn } is a finite set of vectors in any vector space V,
and if S is linearly independent and S spans V, then S is a basis for V.

__________________________

The "Dimension" of vector space V is the number of vectors in a basis for V.

The vectors { i, j, k } form the STANDARD BASIS FOR R³.
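The "uniquely generated" claim is easy to see in code: the coefficients of any (a, b, c) with respect to { i, j, k } are just the vector's own components. The function name below is illustrative.

```python
# Every vector in R^3 is a*i + b*j + c*k for exactly one choice of a, b, c.

i, j, k = (1, 0, 0), (0, 1, 0), (0, 0, 1)

def from_standard_basis(a, b, c):
    """Rebuild (a, b, c) as the combination a*i + b*j + c*k."""
    return tuple(a * i[t] + b * j[t] + c * k[t] for t in range(3))

print(from_standard_basis(4, -2, 7))  # (4, -2, 7)
```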

_____________________________

Basis Theorems

Theorem: Suppose V is a vector space of dimension n.
1) Any set of n linearly independent vectors in V is a basis for V.
2) Any set of n vectors that spans V is a basis for V.
3) Any linearly independent set of fewer than n vectors in V can be extended to a basis for V.

______________________________

The first 2 statements of the last theorem tell us we need only show either linear independence or spanning but not both if a set has the right number of vectors in it. If the number of vectors in the set equals the dimension of the Vector Space, then the set need ONLY span or be linearly independent to be a basis.

The 3rd statement says that if we have a set with fewer vectors than the dimension, there are vectors in the space that we can add to the set to form a basis for the space. The set of vectors might span a subspace of V.

In the previous example we saw that v1, v2 and v3
are linearly dependent because 3v1 + v2 – v3 = 0. Since we have 3 vectors from R³, which form a SQUARE MATRIX, we could conclude the same thing from the zero determinant of the coefficient or system matrix.

Writing v1, v2 and v3 as the columns of matrix A, we find det(A) = 0; so the column vectors are linearly dependent
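The zero-determinant check can be sketched with a cofactor expansion. The columns below are the same illustrative vectors used earlier (not the lesson's own), built so that v3 = 3v1 + v2; the determinant therefore comes out 0.

```python
# 3x3 determinant by cofactor expansion along the first row.

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

v1, v2 = (1, 0, 2), (0, 1, 1)                   # hypothetical columns
v3 = tuple(3 * a + b for a, b in zip(v1, v2))   # dependent on v1 and v2

# Build A with v1, v2, v3 as its columns.
A = [[v1[r], v2[r], v3[r]] for r in range(3)]
print(det3(A))  # 0: the columns are linearly dependent
```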

lessons lnalg4 and lnalg4.2
on finding a basis for the row, column and null spaces
cover more on this topic
Practice is found in Assignment #5 and Test #3 (from Index Table)


(all content © MathRoom Learning Service; 2004 - ).