
Remember from last week:

Fact: A linear system has either
• exactly one solution,
• infinitely many solutions, or
• no solutions.
We gave an algebraic proof via row reduction, but the picture, although not a
proof, is useful for understanding this fact.

HKBU Math 2207 Linear Algebra Semester 1 2018, Week 2, Page 1 of 43


Now we will think more geometrically about linear systems.
1.3-1.4 Span - related to existence of solutions
1.5 A geometric view of solution sets (a detour)
1.7 Linear independence - related to uniqueness of solutions

We are aiming to understand the two key concepts in three ways:

• The related computations: to solve problems about a specific linear system with numbers (p13-14, p37-38).
• The rigorous definition: to prove statements about an abstract linear system (p39-40).
• The conceptual idea: to guess whether statements are true, to develop a plan for a proof or counterexample, and to help you remember the main theorems (p11-12, p33-34).
1.3: Vector Equations
A column vector is a matrix with only one column.
Until Chapter 4, we will say vector to mean column vector.

A vector u is in R^n if it has n rows, i.e. u = [u1, u2, . . . , un]^T (a column vector).

Example: [1, 3]^T and [2, 1]^T are vectors in R^2.

Vectors in R^2 and R^3 have a geometric meaning: think of [x, y]^T as the point (x, y) in the plane.
There are two operations we can do on vectors:

• addition: if u = [u1, u2, . . . , un]^T and v = [v1, v2, . . . , vn]^T, then

    u + v = [u1 + v1, u2 + v2, . . . , un + vn]^T;

• scalar multiplication: if u = [u1, u2, . . . , un]^T and c is a number (a scalar), then

    cu = [cu1, cu2, . . . , cun]^T.

These satisfy the usual rules for arithmetic of numbers, e.g.

    u + v = v + u,    c(u + v) = cu + cv,    0u = 0 = [0, . . . , 0]^T.
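The two operations can be sketched in a few lines of Python (an illustrative sketch, representing a vector in R^n as a list of n numbers; the names vec_add and scalar_mul are mine, not the course's):

```python
# Minimal sketch of the two vector operations on lists of numbers.
# (Helper names are illustrative only.)

def vec_add(u, v):
    """Entrywise sum u + v of two vectors in the same R^n."""
    assert len(u) == len(v), "u and v must have the same number of entries"
    return [ui + vi for ui, vi in zip(u, v)]

def scalar_mul(c, u):
    """Scalar multiple cu: multiply every entry of u by the scalar c."""
    return [c * ui for ui in u]

u, v = [1, 3], [2, 1]
print(vec_add(u, v))       # [3, 4], the parallelogram-rule sum
print(scalar_mul(2, u))    # [2, 6], u stretched by a factor of 2
```

Note that the rules quoted above (commutativity, distributivity) follow entry by entry from ordinary arithmetic, which is exactly what these list comprehensions compute.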
Parallelogram rule for addition of two vectors:
If u and v in R^2 are represented as points in the plane, then u + v corresponds to the fourth vertex of the parallelogram whose other vertices are 0, u and v. (Note that 0 = [0, 0]^T.)

EXAMPLE: Let u = [1, 3]^T and v = [2, 1]^T.

[Figure: u, v and u + v plotted in the x1x2-plane, illustrating the parallelogram rule]

EXAMPLE: Let u = [1, -2]^T. Express u, 2u, and -(3/2)u on a graph.

[Figure: u, 2u, and -(3/2)u plotted in the x1x2-plane; all three lie on one line through the origin]
Combining the operations of addition and scalar multiplication:
Definition: Given vectors v1 , v2 , . . . , vp in R^n and scalars c1 , c2 , . . . , cp , the vector

    c1 v1 + c2 v2 + ... + cp vp

is a linear combination of v1 , v2 , . . . , vp with weights c1 , c2 , . . . , cp .

Example: u = [1, 3]^T, v = [2, 1]^T. Some linear combinations of u and v are:

    3u + 2v = [7, 11]^T.        (1/3)u + 0v = [1/3, 1]^T.
    u - 3v = [-5, 0]^T.         0 = 0u + 0v = [0, 0]^T.
    (i.e. u + (-3)v)
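A linear combination just chains the two operations; a small sketch (the helper name linear_combination is illustrative):

```python
# Compute c1*v1 + ... + cp*vp for vectors given as lists of equal length.

def linear_combination(weights, vectors):
    """Linear combination of `vectors` using the scalars in `weights`."""
    n = len(vectors[0])
    result = [0] * n
    for c, v in zip(weights, vectors):
        # accumulate c * v entrywise
        result = [r + c * vi for r, vi in zip(result, v)]
    return result

u, v = [1, 3], [2, 1]
print(linear_combination([3, 2], [u, v]))     # 3u + 2v -> [7, 11]
print(linear_combination([1, -3], [u, v]))    # u - 3v  -> [-5, 0]
```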
Geometric interpretation of linear combinations: all the points you can go to if
you are only allowed to move in the directions of v1 , . . . , vp .



EXAMPLE: Let v1 = [2, 1]^T and v2 = [-2, 2]^T. Express each of the following as a linear combination of v1 and v2:

    a = [0, 3]^T,   b = [-4, 1]^T,   c = [6, -6]^T,   d = [7, 4]^T

[Figure: grid of the x1x2-plane, both axes running from -8 to 8, for plotting the combinations]
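For two vectors in R^2, such weights can be found by solving a 2 × 2 linear system; a sketch using Cramer's rule (this assumes the two vectors are not multiples of each other, and the name weights_2d is mine):

```python
from fractions import Fraction

def weights_2d(v1, v2, b):
    """Solve x1*v1 + x2*v2 = b in R^2 by Cramer's rule.
    Assumes v1, v2 are not multiples of each other (nonzero determinant)."""
    det = v1[0] * v2[1] - v2[0] * v1[1]
    assert det != 0, "v1 and v2 must not be multiples of each other"
    x1 = Fraction(b[0] * v2[1] - v2[0] * b[1], det)
    x2 = Fraction(v1[0] * b[1] - b[0] * v1[1], det)
    return x1, x2

v1, v2 = [2, 1], [-2, 2]
print(weights_2d(v1, v2, [0, 3]))    # a = 1*v1 + 1*v2
print(weights_2d(v1, v2, [6, -6]))   # c = 0*v1 + (-3)*v2
```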
Definition: Suppose v1 , v2 , . . . , vp are in Rn . The span of v1 , v2 , . . . , vp ,
written
Span {v1 , v2 , . . . , vp } ,
is the set of all linear combinations of v1 , v2 , . . . , vp .

In other words, Span {v1 , v2 , . . . , vp } is the set of all vectors which can be
written as x1 v1 + x2 v2 + ... + xp vp for some choice of weights x1 , x2 , . . . , xp .



Example: Span of one vector in R^3

Span {0} = {0}, because c0 = 0 for all scalars c.

If u is not the zero vector, then Span {u} is a line through the origin in the direction u.

We can also say {u} spans a line through the origin.

[Figure: the line Span {u} through the origin, containing the vector u]


Example: Span of two vectors in R^3

e.g. v1 = [1, 1, 2]^T, v2 = [2, 2, 4]^T: here v2 = 2v1 , so Span {v1 , v2 } = Span {v1 } is a line through the origin.

e.g. v1 = [3, 1, 2]^T, v2 = [1, 4, 1]^T: here v2 is not a multiple of v1 . This is the plane spanned by {v1 , v2 }.

[Figure: the plane through the origin spanned by {v1 , v2 }]


EXAMPLE: Let a1 = [4, 2, 14]^T, a2 = [3, 6, 10]^T, and b = [-2, 8, -8]^T.
Determine if b is a linear combination of a1 and a2 .

Solution: Vector b is a linear combination of a1 and a2 if we can find weights x1 , x2 such that
x1 a1 + x2 a2 = b.

Vector equation:

    x1 [4, 2, 14]^T + x2 [3, 6, 10]^T = [-2, 8, -8]^T

Corresponding linear system:

     4x1 +  3x2 = -2
     2x1 +  6x2 =  8
    14x1 + 10x2 = -8

Corresponding augmented matrix:

    [ 4  3 | -2]
    [ 2  6 |  8]
    [14 10 | -8]

Reduced echelon form:

    [1 0 | -2]
    [0 1 |  2]
    [0 0 |  0]

The system is consistent with solution x1 = -2, x2 = 2, so b = -2a1 + 2a2 is a linear combination of a1 and a2 .
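The row reduction in this example can be reproduced with a short Gauss-Jordan sketch over exact fractions (a teaching sketch, not an optimized routine):

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form of a matrix (list of rows), using exact
    fractions. A plain Gauss-Jordan sketch."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        # find a row at or below pivot_row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, rows) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        M[pivot_row] = [x / M[pivot_row][col] for x in M[pivot_row]]
        for r in range(rows):
            if r != pivot_row and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [a - factor * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

# augmented matrix [a1 a2 | b] from the example above
aug = [[4, 3, -2], [2, 6, 8], [14, 10, -8]]
for row in rref(aug):
    print(row)   # pivot rows give x1 = -2, x2 = 2; last row is all zeros
```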
From the previous example, we see that the vector equation

    x1 a1 + x2 a2 + ... + xp ap = b

has the same solution set as the linear system whose augmented matrix is

    [a1 a2 . . . ap | b]

(the columns a1 , . . . , ap augmented by b).

In particular, b is a linear combination of a1 , a2 , . . . , ap (i.e. b is in
Span {a1 , a2 , . . . , ap }) if and only if there is a solution to the linear system with
augmented matrix

    [a1 a2 . . . ap | b].

We now develop a different way to write this equation.
1.4: The Matrix Equation Ax = b
We can think of the weights x1 , x2 , . . . , xp as a vector.

The product of an m × p matrix A (m rows, p columns) and a vector x in R^p is the linear combination of the columns of A using the entries of x as weights:

    Ax = [a1 a2 . . . ap] [x1 , . . . , xp ]^T = x1 a1 + x2 a2 + ... + xp ap .

Example:

    [ 4  3] [-2]        [ 4]       [ 3]   [-2]
    [ 2  6] [ 2] = (-2) [ 2] + (2) [ 6] = [ 8]
    [14 10]             [14]       [10]   [-8]



There is another way to compute Ax, one row of A at a time:

Example:

    [ 4  3] [-2]   [ 4(-2) +  3(2)]   [-2]
    [ 2  6] [ 2] = [ 2(-2) +  6(2)] = [ 8]
    [14 10]        [14(-2) + 10(2)]   [-8]

Warning: The product Ax is only defined if the number of columns of A equals
the number of rows of x. The number of rows of Ax is the number of rows of A.

It is easy to check that A(u + v) = Au + Av and A(cu) = cAu.

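Both ways of computing Ax can be sketched and compared (illustrative helper names; A is stored as a list of rows):

```python
# Two equivalent ways to compute Ax, as in the text: as a linear combination
# of the columns of A, and one row of A at a time.

def matvec_by_columns(A, x):
    """Ax = x1*a1 + ... + xp*ap: columns of A weighted by the entries of x."""
    m, p = len(A), len(A[0])
    assert p == len(x)          # columns of A must match entries of x
    result = [0] * m
    for j in range(p):
        for i in range(m):
            result[i] += x[j] * A[i][j]
    return result

def matvec_by_rows(A, x):
    """Entry i of Ax comes from row i of A combined with x."""
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

A = [[4, 3], [2, 6], [14, 10]]
x = [-2, 2]
print(matvec_by_columns(A, x))   # [-2, 8, -8]
print(matvec_by_rows(A, x))      # same answer, computed row by row
```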


We have three ways of viewing the same problem:
1. The system of linear equations with augmented matrix [A|b],
2. The vector equation x1 a1 + x2 a2 + ... + xp ap = b,
3. The matrix equation Ax = b.

So these three things are the same:


1. The system of linear equations with augmented matrix [A|b] has a solution,
2. b is a linear combination of the columns of A (or b is in the span of the
columns of A),
3. The matrix equation Ax = b has a solution.

(In fact, the three problems have the same solution set.)

Another way of saying this: The span of the columns of A is the set of vectors
b for which Ax = b has a solution.




One question of particular interest: when are the above statements true for all
vectors b in Rm ? i.e. when is Ax = b consistent for all right hand sides b, and
when is Span {a1 , a2 , . . . , ap }= Rm ?
Example: (m = 3) Let e1 = [1, 0, 0]^T, e2 = [0, 1, 0]^T, e3 = [0, 0, 1]^T.

Then Span {e1 , e2 , e3 } = R^3, because

    [x, y, z]^T = x [1, 0, 0]^T + y [0, 1, 0]^T + z [0, 0, 1]^T.
But for a more complicated set of vectors, the weights will be more complicated
functions of x, y, z. So we want a better way to answer this question.
Theorem 4: Existence of solutions to linear systems: For an m × n matrix
A, the following statements are logically equivalent (i.e. for any particular matrix
A, they are all true or all false):
a. For each b in R^m, the equation Ax = b has a solution.
b. Each b in R^m is a linear combination of the columns of A.
c. The columns of A span R^m (i.e. Span {a1 , a2 , . . . , an } = R^m).
d. rref(A) has a pivot in every row.
Warning: the theorem says nothing about the uniqueness of the solution.

Proof (outline): By the previous discussion, (a), (b) and (c) are logically
equivalent. So, to finish the proof, we only need to show that (a) and (d) are
logically equivalent, i.e. we need to show that:
if (d) is true, then (a) is true;
if (d) is false, then (a) is false.



a. For each b in Rm , the equation Ax = b has a solution.
d. rref(A) has a pivot in every row.

Proof (continued):
Suppose (d) is true. Then, for every b in R^m, the augmented matrix [A|b]
row-reduces to [rref(A)|d] for some d in R^m. This does not have a row of the
form [0 . . . 0 | c] with c ≠ 0, so, by the Existence of Solutions Theorem (Week 1 p25),
Ax = b is consistent. So (a) is true.

Suppose (d) is false. We want to find a counterexample to (a): i.e. we want to


find a vector b in Rm such that Ax = b has no solution.



a. For each b in Rm , the equation Ax = b has a solution.
d. rref(A) has a pivot in every row.
Proof (continued): Suppose (d) is false. We want to find a counterexample to
(a): i.e. we want to find a vector b in R^m such that Ax = b has no solution.

rref(A) does not have a pivot in every row, so its last row is [0 . . . 0].

Let d = [1, . . . , 1]^T. Then the linear system with augmented matrix
[rref(A)|d] is inconsistent. Now we apply the row operations in reverse to get
an equivalent linear system [A|b] that is inconsistent.

Example:

    [1 3 | 1]   --(R2 -> R2 - 2R1)-->   [1 3 | 1]
    [2 6 | 3]   <--(R2 -> R2 + 2R1)--   [0 0 | 1]
Theorem 4: Existence of solutions to linear systems: For an m × n matrix
A, the following statements are logically equivalent (i.e. for any particular matrix
A, they are all true or all false):
a. For each b in R^m, the equation Ax = b has a solution.
b. Each b in R^m is a linear combination of the columns of A.
c. The columns of A span R^m (i.e. Span {a1 , a2 , . . . , an } = R^m).
d. rref(A) has a pivot in every row.

Observe that rref(A) has at most one pivot per column (condition 5 of a reduced
echelon form). So if A has more rows than columns (a tall matrix), then
rref(A) cannot have a pivot in every row, so the statements above are all false.
In particular, a set of fewer than m vectors cannot span Rm .

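Condition (d) can be tested mechanically: count the pivots produced by forward elimination and compare with the number of rows (a sketch; columns_span_Rm is an illustrative name, not standard):

```python
from fractions import Fraction

def pivot_rows(A):
    """Number of pivots in an echelon form of A, found by plain forward
    elimination with exact fractions."""
    M = [[Fraction(x) for x in row] for row in A]
    rows = len(M)
    cols = len(M[0]) if rows else 0
    pivots = 0
    for col in range(cols):
        pr = next((r for r in range(pivots, rows) if M[r][col] != 0), None)
        if pr is None:
            continue          # no pivot in this column
        M[pivots], M[pr] = M[pr], M[pivots]
        for r in range(pivots + 1, rows):
            factor = M[r][col] / M[pivots][col]
            M[r] = [a - factor * b for a, b in zip(M[r], M[pivots])]
        pivots += 1
    return pivots

def columns_span_Rm(A):
    """Theorem 4: the columns of A span R^m iff there is a pivot in every row."""
    return pivot_rows(A) == len(A)

print(columns_span_Rm([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
print(columns_span_Rm([[1, 3], [2, 6]]))                   # False: a zero row appears
```

A tall matrix (more rows than columns) always fails this test, matching the observation above that fewer than m vectors cannot span R^m.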


1.5: Solution Sets of Linear Systems
Goal: use vector notation to give geometric descriptions of solution sets
to compare the solution sets of Ax = b and of Ax = 0.

Definition: A linear system is homogeneous if the right hand side is the zero
vector, i.e.
Ax = 0.
When we row-reduce [A|0], the right hand side stays 0, so the reduced echelon
form does not have a row of the form [0 . . . 0 | c] with c ≠ 0.
So a homogeneous system is always consistent.

In fact, x = 0 is always a solution, because A0 = 0. The solution x = 0 is called
the trivial solution.
A non-trivial solution x is a solution where at least one xi is non-zero.
If there are non-trivial solutions, what does the solution set look like?

EXAMPLE:

    2x1 + 4x2 -  6x3 = 0
    4x1 + 8x2 - 10x3 = 0

Corresponding augmented matrix:

    [2 4  -6 | 0]
    [4 8 -10 | 0]

Corresponding reduced echelon form:

    [1 2 0 | 0]
    [0 0 1 | 0]

Solution set: x1 = -2x2 , x3 = 0, with x2 free; i.e. all vectors x = x2 [-2, 1, 0]^T = x2 v.

Geometric representation: a line through the origin in the direction of v = [-2, 1, 0]^T.
EXAMPLE: (same left hand side as before)

    2x1 + 4x2 -  6x3 = 0
    4x1 + 8x2 - 10x3 = 4

Corresponding augmented matrix:

    [2 4  -6 | 0]
    [4 8 -10 | 4]

Corresponding reduced echelon form:

    [1 2 0 | 6]
    [0 0 1 | 2]

Solution set: x1 = 6 - 2x2 , x3 = 2, with x2 free; i.e. x = [6, 0, 2]^T + x2 [-2, 1, 0]^T = p + x2 v.

Geometric representation: a line through p = [6, 0, 2]^T parallel to v = [-2, 1, 0]^T.
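The parametric descriptions in these two examples can be spot-checked numerically: every t v should solve Ax = 0, and every p + t v should solve Ax = b (a small verification sketch, with v = (-2, 1, 0) and p = (6, 0, 2) as above):

```python
# Verify the parametric solution sets of the two examples above:
# x = t*v solves Ax = 0 and x = p + t*v solves Ax = b, with b = (0, 4).

def matvec(A, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

A = [[2, 4, -6], [4, 8, -10]]
v = [-2, 1, 0]     # direction of the solution line of Ax = 0
p = [6, 0, 2]      # particular solution of Ax = b

for t in [-2, 0, 1, 5]:
    x_h = [t * vi for vi in v]
    x = [pi + t * vi for pi, vi in zip(p, v)]
    print(matvec(A, x_h), matvec(A, x))   # always [0, 0] and [0, 4]
```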
EXAMPLE: Compare the solution sets of:

    x1 - 2x2 - 2x3 = 0        x1 - 2x2 - 2x3 = 3

Corresponding augmented matrices:

    [1 -2 -2 | 0]        [1 -2 -2 | 3]

These are already in reduced echelon form.

Solution sets: x1 = 2x2 + 2x3 (resp. x1 = 3 + 2x2 + 2x3 ), with x2 , x3 free; i.e.

    x = x2 [2, 1, 0]^T + x3 [2, 0, 1]^T = x2 u + x3 v
    (resp. x = [3, 0, 0]^T + x2 u + x3 v = p + x2 u + x3 v).

Geometric representation: a plane through the origin parallel to u and v (resp. the parallel plane through p = [3, 0, 0]^T).

[Figure: the two parallel planes in R^3]
Parallel Solution Sets of Ax = 0 and Ax = b
In our first example:
The solution set of Ax = 0 is a line through the origin parallel to v.
The solution set of Ax = b is a line through p parallel to v.

In our second example:


The solution set of Ax = 0 is a plane through the origin parallel to u and v.
The solution set of Ax = b is a plane through p parallel to u and v.

In both cases: to get the solution set of Ax = b, start with the solution set of
Ax = 0 and translate it by p.
p is called a particular solution (one solution out of many).

In general:
Theorem 6: Solutions and homogeneous equations: Suppose p is a solution
to Ax = b. Then the solution set to Ax = b is the set of all vectors of the form
w = p + vh , where vh is any solution of the homogeneous equation Ax = 0.
Proof: (outline)
We show that w = p + vh is a solution:

A(p + vh )
=Ap + Avh
=b + 0
=b.

We also need to show that all solutions are of the form w = p + vh ; see Question 25 in
Section 1.5 of the textbook.



How this theorem is useful: a shortcut to Q2b on the exercise sheet:

Example: Let A = [a1 a2 a3 a4] = [1 3 0 4; 2 6 0 8].

In Q2a, you found that the solution set to Ax = 0 is

    r [-3, 1, 0, 0]^T + s [0, 0, 1, 0]^T + t [-4, 0, 0, 1]^T,

where r, s, t can take any value.

In Q2b, you want to solve Ax = [3, 6]^T. Now [3, 6]^T = 0a1 + 1a2 + 0a3 + 0a4 = A [0, 1, 0, 0]^T, so [0, 1, 0, 0]^T is a particular solution. So the solution set is

    [0, 1, 0, 0]^T + r [-3, 1, 0, 0]^T + s [0, 0, 1, 0]^T + t [-4, 0, 0, 1]^T,

where r, s, t can take any value.
Notice that this solution looks different from the solution obtained from row-reduction:

    rref [1 3 0 4 | 3; 2 6 0 8 | 6] = [1 3 0 4 | 3; 0 0 0 0 | 0],

which gives a different particular solution [3, 0, 0, 0]^T.

But the solution sets are the same:

    [3, 0, 0, 0]^T + r [-3, 1, 0, 0]^T + s [0, 0, 1, 0]^T + t [-4, 0, 0, 1]^T
    = [3, 0, 0, 0]^T + [-3, 1, 0, 0]^T + (r - 1) [-3, 1, 0, 0]^T + s [0, 0, 1, 0]^T + t [-4, 0, 0, 1]^T
    = [0, 1, 0, 0]^T + (r - 1) [-3, 1, 0, 0]^T + s [0, 0, 1, 0]^T + t [-4, 0, 0, 1]^T,

and r, s, t taking any value is equivalent to r - 1, s, t taking any value.

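That the two answers agree can be spot-checked: both particular solutions solve Ax = b, and their difference solves Ax = 0, so it lies in the homogeneous solution set (a small sketch):

```python
# Check that the two particular solutions of Ax = b from the text differ
# by a solution of the homogeneous system Ax = 0.

def matvec(A, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

A = [[1, 3, 0, 4], [2, 6, 0, 8]]
b = [3, 6]
p_shortcut = [0, 1, 0, 0]   # particular solution from the shortcut
p_rref = [3, 0, 0, 0]       # particular solution from row reduction

print(matvec(A, p_shortcut))   # [3, 6]
print(matvec(A, p_rref))       # [3, 6]

# Their difference is a homogeneous solution: it equals (-1)*(-3, 1, 0, 0).
diff = [a - c for a, c in zip(p_rref, p_shortcut)]
print(diff, matvec(A, diff))   # [3, -1, 0, 0] maps to [0, 0]
```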


1.7: Linear Independence

[Figure: vectors u, v and w all lying in one plane through the origin]

In this picture, the plane is Span {u, v, w} = Span {u, v}, so we do not
need to include w to describe this plane.
We can think of w as being too similar to u and v; linear dependence is
the way to make this idea precise.



Definition: A set of vectors {v1 , . . . , vp } is linearly independent if the only
solution to the vector equation

    x1 v1 + ... + xp vp = 0

is the trivial solution (x1 = ... = xp = 0).

The opposite of linearly independent is linearly dependent:

Definition: A set of vectors {v1 , . . . , vp } is linearly dependent if there are
weights c1 , . . . , cp , not all zero, such that

    c1 v1 + ... + cp vp = 0.

The equation c1 v1 + ... + cp vp = 0 is a linear dependence relation.



    x1 v1 + ... + xp vp = 0

• The only solution is x1 = ... = xp = 0: linearly independent.
• There is a solution with some xi ≠ 0: linearly dependent.

Example: {[2, 1]^T, [3, 0]^T} is linearly independent because

    x1 [2, 1]^T + x2 [3, 0]^T = [0, 0]^T  means  2x1 + 3x2 = 0 and x1 = 0,

which forces x1 = 0, x2 = 0.

Example: {[2, 1]^T, [4, 2]^T} is linearly dependent because

    2 [2, 1]^T + (-1) [4, 2]^T = [0, 0]^T.

[Figure: the independent pair point in different directions; the dependent pair lie on one line through the origin]


    x1 v1 + ... + xp vp = 0

• The only solution is x1 = ... = xp = 0 (i.e. a unique solution): linearly independent. Informally: v1 , . . . , vp are in totally different directions; there is no relationship between v1 , . . . , vp .
• There is a solution with some xi ≠ 0 (i.e. infinitely many solutions): linearly dependent. Informally: v1 , . . . , vp are in similar directions.
Some easy cases:

• Sets containing the zero vector {0, v2 , . . . , vp }:

    (1)0 + (0)v2 + ... + (0)vp = 0,  so the set is linearly dependent.

• Sets containing one vector {v}: the equation xv = 0 reads

    [xv1 , . . . , xvn ]^T = [0, . . . , 0]^T.

  If some vi ≠ 0, then x = 0 is the only solution. So {v} is linearly independent if v ≠ 0.


Some easy cases:

• Sets containing two vectors {u, v}:

    x1 u + x2 v = 0

  If x1 ≠ 0, then u = -(x2 /x1 )v. If x2 ≠ 0, then v = -(x1 /x2 )u.

  So {u, v} is linearly dependent if and only if one of the vectors is a multiple of the other (see p34).

• Sets containing more vectors: x1 v1 + ... + xp vp = 0.

  A set of vectors is linearly dependent if and only if one of the vectors is a linear combination of the others. (If the weight xi in the linear dependence relation is non-zero, then vi is a linear combination of the other vs.)



How to determine if {v1 , v2 , . . . , vp } is linearly independent:

EXAMPLE: Let v1 = [1, 3, 5]^T, v2 = [2, 5, 9]^T, v3 = [-3, 9, 3]^T.

a. Determine if {v1 , v2 , v3 } is linearly independent.
b. If possible, find a linear dependence relation among v1 , v2 , v3 .

Solution: (a)

    x1 [1, 3, 5]^T + x2 [2, 5, 9]^T + x3 [-3, 9, 3]^T = [0, 0, 0]^T.

Augmented matrix:

    [1 2 -3 | 0]                    [1 2  -3 | 0]
    [3 5  9 | 0]  row reduces to    [0 1 -18 | 0]
    [5 9  3 | 0]                    [0 0   0 | 0]

x3 is a free variable, so there are nontrivial solutions:
{v1 , v2 , v3 } is a linearly dependent set.

(b) Reduced echelon form:

    [1 0  33 | 0]
    [0 1 -18 | 0]
    [0 0   0 | 0]

Let x3 = 1 (any nonzero number works). Then x1 = -33 and x2 = 18, so

    -33 [1, 3, 5]^T + 18 [2, 5, 9]^T + 1 [-3, 9, 3]^T = [0, 0, 0]^T

or

    -33 v1 + 18 v2 + v3 = 0

(one possible linear dependence relation).
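The same computation can be done with a short exact-arithmetic row reducer (a sketch, taking v1 = (1, 3, 5), v2 = (2, 5, 9), v3 = (-3, 9, 3) as in the example):

```python
from fractions import Fraction

def rref(M):
    """Gauss-Jordan elimination with exact fractions (teaching sketch)."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    pr = 0
    for col in range(cols):
        piv = next((r for r in range(pr, rows) if M[r][col] != 0), None)
        if piv is None:
            continue
        M[pr], M[piv] = M[piv], M[pr]
        M[pr] = [x / M[pr][col] for x in M[pr]]
        for r in range(rows):
            if r != pr and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pr])]
        pr += 1
    return M

# the vectors v1, v2, v3 as the columns of a matrix
V = [[1, 2, -3],
     [3, 5, 9],
     [5, 9, 3]]
R = rref(V)
for row in R:
    print(row)
# Column 3 has no pivot, so x3 is free: setting x3 = 1 gives
# x1 = -R[0][2] = -33 and x2 = -R[1][2] = 18, i.e. -33*v1 + 18*v2 + v3 = 0.
```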


A non-trivial solution to Ax = 0 is a linear dependence relation between the
columns of A: Ax = 0 means x1 a1 + ... + xn an = 0.

Theorem: Uniqueness of solutions for linear systems: For a matrix A, the


following are equivalent:
a. Ax = 0 has no non-trivial solution.
b. If Ax = b is consistent, then it has a unique solution.
c. The columns of A are linearly independent.
d. rref(A) has a pivot in every column (i.e. all variables are basic).

In particular: the row reduction algorithm produces at most one pivot in each row
of rref(A). So, if A has more columns than rows (a fat matrix), then rref(A)
cannot have a pivot in every column.
So a set of more than n vectors in Rn is always linearly dependent.
Exercise: Combine this with the Theorem of Existence of Solutions (p19) to show
that a set of n linearly independent vectors in R^n spans R^n.
Conceptual problems regarding linear independence:
In problems about linear independence (or spanning) that do not involve specific
numbers, it's often better not to compute, i.e. not to use row-reduction.
Example: Prove that, if {2u, v + w} is linearly dependent, then {u, v, w} is
linearly dependent.
Method:
Step 1 Rewrite the mathematical terms in the question as formulas. Be careful
to distinguish what we know (first line of the proof) and what we want to
show (last line of the proof).
What we know: there are scalars c1 , c2 not both zero such that
c1 (2u) + c2 (v + w) = 0.
What we want to show: there are scalars d1 , d2 , d3 not all zero such that
d1 u + d2 v + d3 w = 0.
(Be careful to choose different letters for the weights in the different
statements, because the weights in different statements will in general be
different.)
Example: Prove that, if {2u, v + w} is linearly dependent, then {u, v, w} is
linearly dependent.
Method:
Step 1 Rewrite the mathematical terms in the question as formulas.
What we know: there are scalars c1 , c2 not both zero such that
c1 (2u) + c2 (v + w) = 0.
What we want to show: there are scalars d1 , d2 , d3 not all zero such that
d1 u + d2 v + d3 w = 0.
Step 2 Fill in the missing steps by rearranging (and sometimes combining)
vector equations.
Answer: We know there are scalars c1 , c2 not both zero such that

    c1 (2u) + c2 (v + w) = 0,  i.e.  2c1 u + c2 v + c2 w = 0,

and 2c1 , c2 , c2 are not all zero, so this is a linear dependence relation among
u, v, w.
Partial summary of linear dependence:

The definition: x1 v1 + ... + xp vp = 0 has a non-trivial solution (not all xi are
zero); equivalently, it has infinitely many solutions.

Equivalently: one of the vectors is a linear combination of the others (see p33,
also Theorem 7 in the textbook). But it might not be the case that every vector in
the set is a linear combination of the others (see Q2c on the exercise sheet).

Computation: rref [v1 . . . vp] has at least one free variable (a column with no pivot).

Informal idea: the vectors are in similar directions.



Partial summary of linear dependence (continued):

Easy examples:
• Sets containing the zero vector;
• Sets containing too many vectors (more than n vectors in R^n);
• Multiples of vectors: e.g. [2, 1]^T, [4, 2]^T (this is the only possibility if the set has two vectors);
• Other examples: e.g. [1, 0, 0]^T, [0, 1, 0]^T, [1, 1, 0]^T.

Adding vectors to a linearly dependent set still makes a linearly dependent set (see
Q2d on the exercise sheet).
Equivalently: removing vectors from a linearly independent set still makes a linearly
independent set (because "P implies Q" means "(not Q) implies (not P)"; this is the
contrapositive).
