HKBU Math 2207 Linear Algebra Semester 1 2018, Week 2, Page 4 of 43
Parallelogram rule for addition of two vectors:
If u and v in R^2 are represented as points in the plane, then u + v corresponds to the fourth vertex of the parallelogram whose other vertices are 0, u and v. (Note that 0 = [0; 0].)
EXAMPLE: Let u = [1; 3] and v = [2; 1]. Then u + v = [3; 4].
[Graph: 0, u, v and u + v = [3; 4] drawn as vertices of a parallelogram on the (x1, x2)-axes.]
EXAMPLE: Let u = [1; 2]. Express u, 2u, and -(3/2)u on a graph.
[Graph: u, 2u and -(3/2)u drawn on the (x1, x2)-axes.]
Combining the operations of addition and scalar multiplication:
Definition: Given vectors v1 , v2 , . . . , vp in Rn and scalars c1 , c2 , . . . , cp , the
vector
c1 v1 + c2 v2 + · · · + cp vp
is a linear combination of v1 , v2 , . . . , vp with weights c1 , c2 , . . . , cp .
Example: u = [1; 3], v = [2; 1]. Some linear combinations of u and v are:
3u + 2v = [7; 11].      (1/3)u + 0v = [1/3; 1].
u - 3v = [-5; 0] (i.e. u + (-3)v).      0 = 0u + 0v = [0; 0].
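The arithmetic above is easy to reproduce numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

u = np.array([1, 3])
v = np.array([2, 1])

# Linear combinations are computed entrywise:
print(3 * u + 2 * v)   # 3u + 2v = [7; 11]
print(u / 3 + 0 * v)   # (1/3)u + 0v = [1/3; 1] (as floats)
print(u - 3 * v)       # u - 3v = [-5; 0]
print(0 * u + 0 * v)   # the zero vector [0; 0]
```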
Geometric interpretation of linear combinations: all the points you can go to if
you are only allowed to move in the directions of v1 , . . . , vp .
[Graph: the points reachable from 0 by moving in the directions v1 , . . . , vp , drawn on the (x1, x2)-axes.]
Definition: Suppose v1 , v2 , . . . , vp are in Rn . The span of v1 , v2 , . . . , vp ,
written
Span {v1 , v2 , . . . , vp } ,
is the set of all linear combinations of v1 , v2 , . . . , vp .
In other words, Span {v1 , v2 , . . . , vp } is the set of all vectors which can be
written as x1 v1 + x2 v2 + · · · + xp vp for some choice of weights x1 , x2 , . . . , xp .
Solution: Vector b is a linear combination of a1 and a2 if we can find weights x1, x2 such that
x1 a1 + x2 a2 = b.
Vector equation:
x1 [4; 2; 14] + x2 [3; 6; 10] = [-2; 8; -8]
Augmented matrix:
[  4  3 | -2 ]                    [ 1 0 | -2 ]
[  2  6 |  8 ]   row reduces to   [ 0 1 |  2 ]
[ 14 10 | -8 ]                    [ 0 0 |  0 ]
so x1 = -2, x2 = 2, and b is a linear combination of a1 and a2.
From the previous example, we see that the vector equation
x1 a1 + x2 a2 + · · · + xp ap = b
has the same solution set as the linear system whose augmented matrix is
[ a1 a2 . . . ap | b ].
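A numerical way to answer such "is b in the span?" questions is to ask whether [a1 a2]x = b has an exact solution. A sketch assuming NumPy is available, using the vectors of the example above:

```python
import numpy as np

a1 = np.array([4.0, 2.0, 14.0])
a2 = np.array([3.0, 6.0, 10.0])
b  = np.array([-2.0, 8.0, -8.0])

A = np.column_stack([a1, a2])   # the matrix [a1 a2]
x, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(x)                        # the weights, approximately [-2, 2]
# b is in Span{a1, a2} exactly when A @ x reproduces b:
print(np.allclose(A @ x, b))    # True
```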
The product of an m × p matrix A (m rows, p columns) and a vector x in Rp is the linear combination
of the columns of A using the entries of x as weights:
Ax = [ a1 a2 . . . ap ] [x1; . . . ; xp] = x1 a1 + x2 a2 + · · · + xp ap .
Example:
[  4  3 ] [ -2 ]        [  4 ]       [  3 ]   [ -2 ]
[  2  6 ] [  2 ]  =  -2 [  2 ]  + 2  [  6 ] = [  8 ]
[ 14 10 ]               [ 14 ]       [ 10 ]   [ -8 ]
(In fact, the three problems have the same solution set.)
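The definition of Ax can be checked directly: the matrix-vector product and the explicit column combination agree. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

A = np.array([[4, 3],
              [2, 6],
              [14, 10]])
x = np.array([-2, 2])

# Ax as a linear combination of the columns of A:
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(A @ x)    # the matrix-vector product: [-2  8 -8]
print(combo)    # the same vector
```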
Another way of saying this: The span of the columns of A is the set of vectors
b for which Ax = b has a solution.
One question of particular interest: when are the above statements true for all
vectors b in Rm ? i.e. when is Ax = b consistent for all right hand sides b, and
when is Span {a1 , a2 , . . . , ap }= Rm ?
Example: (m = 3) Let e1 = [1; 0; 0], e2 = [0; 1; 0], e3 = [0; 0; 1].
Then Span {e1 , e2 , e3 } = R3 , because [x; y; z] = x [1; 0; 0] + y [0; 1; 0] + z [0; 0; 1].
But for a more complicated set of vectors, the weights will be more complicated
functions of x, y, z. So we want a better way to answer this question.
Theorem 4: Existence of solutions to linear systems: For an m × n matrix
A, the following statements are logically equivalent (i.e. for any particular matrix
A, they are all true or all false):
a. For each b in Rm , the equation Ax = b has a solution.
b. Each b in Rm is a linear combination of the columns of A.
c. The columns of A span Rm (i.e. Span {a1 , a2 , . . . , ap } = Rm ).
d. rref(A) has a pivot in every row.
Warning: the theorem says nothing about the uniqueness of the solution.
Proof: (outline): By the previous discussion, (a), (b) and (c) are logically
equivalent. So, to finish the proof, we only need to show that (a) and (d) are
logically equivalent, i.e. we need to show that,
if (d) is true, then (a) is true;
if (d) is false, then (a) is false.
Proof: (continued)
Suppose (d) is true. Then, for every b in Rm , the augmented matrix [A|b]
row-reduces to [rref(A)|d] for some d in Rm . This does not have a row of the
form [0 . . . 0 | c] with c ≠ 0, so, by the Existence of Solutions Theorem (Week 1 p 25),
Ax = b is consistent. So (a) is true.
Suppose (d) is false. Then rref(A) does not have a pivot in every row, so its last
row is [0 . . . 0]. Let d = [1; . . . ; 1]. Then the linear system with augmented
matrix [rref(A)|d] is inconsistent. Now we apply the row operations in reverse to
get an equivalent linear system [A|b] that is inconsistent. So (a) is false.
Example: applying R2 → R2 - 2R1 (and, to reverse it, R2 → R2 + 2R1):
[ 1 3 | 1 ]                    [ 1 3 |  1 ]
[ 2 6 | 1 ]   row reduces to   [ 0 0 | -1 ]
Observe that rref(A) has at most one pivot per column (condition 5 of a reduced
echelon form). So if A has more rows than columns (a tall matrix), then
rref(A) cannot have a pivot in every row, so the statements above are all false.
In particular, a set of fewer than m vectors cannot span Rm .
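Condition (d) can be tested numerically: rref(A) has a pivot in every row exactly when the rank of A equals the number of rows m. A sketch assuming NumPy is available (its rank routine uses the SVD rather than row reduction, but computes the same number); the helper name `columns_span_Rm` is ours, not a library function:

```python
import numpy as np

def columns_span_Rm(A):
    """The columns of A span R^m iff rank(A) = m (a pivot in every row)."""
    m = A.shape[0]
    return np.linalg.matrix_rank(A) == m

# The three standard basis vectors span R^3:
print(columns_span_Rm(np.eye(3)))                    # True
# A tall matrix (more rows than columns) can never span:
print(columns_span_Rm(np.array([[1.0, 2.0],
                                [3.0, 5.0],
                                [5.0, 9.0]])))       # False
```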
Definition: A linear system is homogeneous if the right hand side is the zero
vector, i.e.
Ax = 0.
When we row-reduce [A|0], the right hand side stays 0, so the reduced echelon
form does not have a row of the form [0 . . . 0 | c] with c ≠ 0.
So a homogeneous system is always consistent.
EXAMPLE:
Geometric representation:
EXAMPLE: (same left hand side as before)
Geometric representation:
EXAMPLE: Compare the solution sets of:
Geometric representation:
Parallel Solution Sets of Ax = 0 and Ax = b
In our first example:
The solution set of Ax = 0 is a line through the origin parallel to v.
The solution set of Ax = b is a line through p parallel to v.
In both cases: to get the solution set of Ax = b, start with the solution set of
Ax = 0 and translate it by p.
p is called a particular solution (one solution out of many).
In general:
Theorem 6: Solutions and homogeneous equations: Suppose p is a solution
to Ax = b. Then the solution set to Ax = b is the set of all vectors of the form
w = p + vh , where vh is any solution of the homogeneous equation Ax = 0.
Proof: (outline)
We show that w = p + vh is a solution:
A(p + vh ) = Ap + Avh = b + 0 = b.
We also need to show that all solutions are of the form w = p + vh - see q25 in
Section 1.5 of the textbook.
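Theorem 6 can be illustrated numerically with a small hypothetical example (the matrix and vectors below are ours, chosen so that Ax = 0 has nonzero solutions); a sketch assuming NumPy is available:

```python
import numpy as np

# A 2x2 matrix with dependent columns, so Ax = 0 has nonzero solutions:
A = np.array([[1, 3],
              [2, 6]])
p  = np.array([1, 1])    # a particular solution of Ax = b
b  = A @ p               # b = [4; 8] by construction
vh = np.array([-3, 1])   # a homogeneous solution: A @ vh = 0

print(A @ vh)            # [0 0]
print(A @ (p + vh))      # [4 8] -- p + vh also solves Ax = b
```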
In this picture, the plane is Span {u, v, w} = Span {u, v}, so we do not
need to include w to describe this plane.
We can think of w as being too similar to u and v - and linear dependence is
the way to make this idea precise.
Definition: Vectors v1 , . . . , vp in Rn are linearly independent if the vector equation
x1 v1 + · · · + xp vp = 0
has only the trivial solution x1 = 0, . . . , xp = 0. They are linearly dependent if
there are weights c1 , . . . , cp , not all zero, such that
c1 v1 + · · · + cp vp = 0.
Special case of one vector: {v} is linearly independent exactly when v ≠ 0: the
equation xv = 0 reads [xv1; . . . ; xvn] = [0; . . . ; 0], and if some vi ≠ 0, then
x = 0 is the only solution.
Special case of two vectors: {u, v} is linearly dependent exactly when
x1 u + x2 v = 0 has a non-trivial solution, i.e. when one of u, v is a scalar
multiple of the other.
EXAMPLE: Let v1 = [1; 3; 5], v2 = [2; 5; 9], v3 = [3; -9; -3].
(a) Determine if the set {v1 , v2 , v3 } is linearly independent.
(b) If possible, find a linear dependence relation among v1 , v2 , v3 .
Solution: (a) Consider the vector equation
x1 [1; 3; 5] + x2 [2; 5; 9] + x3 [3; -9; -3] = [0; 0; 0].
Augmented matrix:
[ 1 2  3 | 0 ]                    [ 1 2  3 | 0 ]
[ 3 5 -9 | 0 ]   row reduces to   [ 0 1 18 | 0 ]
[ 5 9 -3 | 0 ]                    [ 0 0  0 | 0 ]
x3 is a free variable, so there is a non-trivial solution and the set is linearly dependent.
(b) Reduced echelon form:
[ 1 0 -33 | 0 ]
[ 0 1  18 | 0 ]
[ 0 0   0 | 0 ]
so x1 = 33x3 and x2 = -18x3 with x3 free. Taking x3 = 1:
33 [1; 3; 5] - 18 [2; 5; 9] + 1 [3; -9; -3] = [0; 0; 0],
or 33v1 - 18v2 + v3 = 0.
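The row reduction in this example can be reproduced symbolically; a sketch assuming SymPy is available:

```python
from sympy import Matrix

# Columns are v1, v2, v3 from the example:
A = Matrix([[1, 2, 3],
            [3, 5, -9],
            [5, 9, -3]])

R, pivots = A.rref()
print(R)        # Matrix([[1, 0, -33], [0, 1, 18], [0, 0, 0]])
print(pivots)   # (0, 1): no pivot in column 3, so x3 is free

# Check the dependence relation 33*v1 - 18*v2 + v3 = 0:
v1, v2, v3 = A.col(0), A.col(1), A.col(2)
print(33 * v1 - 18 * v2 + v3)   # the zero vector
```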
In particular: the row reduction algorithm produces at most one pivot in each row
of rref(A). So, if A has more columns than rows (a fat matrix), then rref(A)
cannot have a pivot in every column.
So a set of more than n vectors in Rn is always linearly dependent.
Exercise: Combine this with the Theorem of Existence of Solutions (p19) to show
that a set of n linearly independent vectors in Rn spans Rn .
Conceptual problems regarding linear independence:
In problems about linear independence (or spanning) that do not involve specific
numbers, it's often better not to compute, i.e. not to use row-reduction.
Example: Prove that, if {2u, v + w} is linearly dependent, then {u, v, w} is
linearly dependent.
Method:
Step 1 Rewrite the mathematical terms in the question as formulas. Be careful
to distinguish what we know (first line of the proof) and what we want to
show (last line of the proof).
What we know: there are scalars c1 , c2 not both zero such that
c1 (2u) + c2 (v + w) = 0.
What we want to show: there are scalars d1 , d2 , d3 not all zero such that
d1 u + d2 v + d3 w = 0.
(Be careful to choose different letters for the weights in the different
statements, because the weights in different statements will in general be
different.)
Step 2 Fill in the missing steps by rearranging (and sometimes combining)
vector equations.
Answer: We know there are scalars c1 , c2 not both zero such that
c1 (2u) + c2 (v + w) = 0.
Rearranging,
2c1 u + c2 v + c2 w = 0,
and 2c1 , c2 , c2 are not all zero (because c1 , c2 are not both zero), so this is a
linear dependence relation among u, v, w.
Partial summary of linear dependence:
The definition: x1 v1 + · · · + xp vp = 0 has a non-trivial solution (not all xi are
zero); equivalently, it has infinitely many solutions.
Equivalently: one of the vectors is a linear combination of the others (see p33,
also Theorem 7 in textbook). But it might not be the case that every vector in
the set is a linear combination of the others (see Q2c on the exercise sheet).
Computation: rref([ v1 . . . vp ]) has at least one free variable.
Informal idea: the vectors are in similar directions
Adding vectors to a linearly dependent set still makes a linearly dependent set (see
Q2d on exercise sheet).
Equivalently: removing vectors from a linearly independent set still makes a linearly
independent set (because "P implies Q" is equivalent to "(not Q) implies (not P)" -
this is the contrapositive).