Vector spaces

Vector space

A vector space over a field 𝕂 is a non-empty set 𝑽 of elements called vectors, together with two operations, vector addition ⊕ and scalar multiplication ⊙,

⊕ : 𝑽 × 𝑽 → 𝑽        ⊙ : 𝕂 × 𝑽 → 𝑽

such that

» 𝒗 ⊕ 𝒖 = 𝒖 ⊕ 𝒗
» ( 𝒗 ⊕ 𝒖 ) ⊕ 𝒘 = 𝒗 ⊕ ( 𝒖 ⊕ 𝒘 )
» ∃! 𝒐 ∈ 𝑽 : ∀ 𝒗 ∈ 𝑽 , 𝒐 ⊕ 𝒗 = 𝒗
» ∀ 𝒗 ∈ 𝑽 , ∃! -𝒗 ∈ 𝑽 : 𝒗 ⊕ (-𝒗) = 𝒐
» 1 ⊙ 𝒗 = 𝒗
» α ⊙ ( β ⊙ 𝒗 ) = ( αβ ) ⊙ 𝒗
» α ⊙ ( 𝒗 ⊕ 𝒖 ) = ( α ⊙ 𝒗 ) ⊕ ( α ⊙ 𝒖 )
» ( α + β ) ⊙ 𝒗 = ( α ⊙ 𝒗 ) ⊕ ( β ⊙ 𝒗 )

Examples

  • Vectors from geometry
  • Polynomials of degree at most two
  • Complex numbers over the field of complex numbers

Counterexamples

  • Vectors with integer components
  • Polynomials p such that p(2) = 1
  • Real numbers over the field of complex numbers

Crazy Vector Space

𝑪𝑽𝑺 = { [ x, y ]ᵀ : x, y ∈ ℝ }

Operations

[ x1, y1 ] ⊕ [ x2, y2 ] = [ x1 + x2 - 2, y1 + y2 ]        α ⊙ [ x, y ] = [ αx - 2α + 2, αy ]
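
One way to check that these unusual operations really do satisfy the axioms is to test them numerically. Below is a minimal sketch assuming numpy, with invented names cvs_add and cvs_smul; note that the zero vector of 𝑪𝑽𝑺 works out to [2, 0].

    import numpy as np

    def cvs_add(v, u):
        # [x1, y1] ⊕ [x2, y2] = [x1 + x2 - 2, y1 + y2]
        return np.array([v[0] + u[0] - 2, v[1] + u[1]])

    def cvs_smul(a, v):
        # α ⊙ [x, y] = [αx - 2α + 2, αy]
        return np.array([a * v[0] - 2 * a + 2, a * v[1]])

    o = np.array([2.0, 0.0])               # the zero vector of 𝑪𝑽𝑺
    v, u = np.array([5.0, -1.0]), np.array([3.0, 4.0])

    assert np.allclose(cvs_add(o, v), v)   # 𝒐 ⊕ 𝒗 = 𝒗
    assert np.allclose(cvs_smul(1, v), v)  # 1 ⊙ 𝒗 = 𝒗
    assert np.allclose(cvs_smul(0, v), o)  # 0 ⊙ 𝒗 = 𝒐
    assert np.allclose(cvs_smul(2, cvs_smul(3, v)), cvs_smul(6, v))
    assert np.allclose(cvs_smul(2, cvs_add(v, u)),
                       cvs_add(cvs_smul(2, v), cvs_smul(2, u)))

The checks pass because the change of variables [x, y] ↦ [x - 2, y] turns ⊕ and ⊙ into the ordinary operations on ℝ².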

Multiplication with zero

∀ 𝒗 ∈ 𝑽 : 0 ⊙ 𝒗 = 𝒐

Multiplication with negative one

∀ 𝒗 ∈ 𝑽 : (-1) ⊙ 𝒗 = -𝒗

Scalar multiple of zero vector

∀ α ∈ 𝕂 : α ⊙ 𝒐 = 𝒐
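
All three identities follow from the axioms alone. For example, distributivity gives

0 ⊙ 𝒗 = (0 + 0) ⊙ 𝒗 = (0 ⊙ 𝒗) ⊕ (0 ⊙ 𝒗)

and adding -(0 ⊙ 𝒗) to both sides leaves 𝒐 = 0 ⊙ 𝒗.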

Linear combinations

Linear combination

A vector 𝒖 is a linear combination of

𝒘1 , 𝒘2 , …, 𝒘k

if there are constants

α1 , α2 , …, αk

such that

𝒖 = α1 𝒘1 + α2 𝒘2 + ⋯ + αk 𝒘k
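
Concretely, assuming numpy and three vectors in ℝ³ picked purely for illustration:

    import numpy as np

    w1 = np.array([1., 0., 0.])
    w2 = np.array([1., 1., 0.])
    w3 = np.array([1., 1., 1.])
    a1, a2, a3 = 2.0, -1.0, 3.0

    u = a1 * w1 + a2 * w2 + a3 * w3     # 𝒖 = α1 𝒘1 + α2 𝒘2 + α3 𝒘3
    print(u)                            # [4. 2. 3.]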

Linear combination of subsets

If 𝒖 is a linear combination of a subset of 𝒘1 , 𝒘2 , …, 𝒘k then it is a linear combination of all the vectors 𝒘1 , 𝒘2 , …, 𝒘k (take the coefficients of the omitted vectors to be zero).

transitivity

Suppose

  • 𝒖 is a linear combination of 𝒘1 , 𝒘2 , …, 𝒘k and
  • each 𝒘j is a linear combination of 𝒗1 , 𝒗2 , …, 𝒗t

then 𝒖 is a linear combination of 𝒗1 , 𝒗2 , …, 𝒗t .
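
In matrix terms: collect the coefficients of 𝒖 in the 𝒘j into a row vector a, let row j of a matrix G hold the coefficients of 𝒘j in the 𝒗i, and let the rows of V be the 𝒗i; then 𝒖 = a (G V) = (a G) V, so a G holds the coefficients of 𝒖 in the 𝒗i. A small numpy sketch with matrices invented for illustration:

    import numpy as np

    V = np.array([[1., 0., 0.],          # rows are 𝒗1, 𝒗2
                  [0., 1., 0.]])
    G = np.array([[1., 2.],              # row j: coefficients of 𝒘j in the 𝒗i
                  [3., -1.]])
    W = G @ V                            # rows are 𝒘1, 𝒘2
    a = np.array([2., 5.])               # 𝒖 = 2 𝒘1 + 5 𝒘2

    u = a @ W
    assert np.allclose(u, (a @ G) @ V)   # the same 𝒖 as a combination of the 𝒗i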

matrix multiplication

Let C = AB. Then

  • rows of C are linear combinations of the rows of B
  • columns of C are linear combinations of the columns of A
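
A quick numerical check of both claims, with small matrices invented for illustration; row i of C combines the rows of B using row i of A as coefficients, and column j of C combines the columns of A using column j of B:

    import numpy as np

    A = np.array([[1., 2.],
                  [0., 3.]])
    B = np.array([[4., 5., 6.],
                  [7., 8., 9.]])
    C = A @ B

    # row 0 of C = 1 * (row 0 of B) + 2 * (row 1 of B)
    assert np.allclose(C[0], 1 * B[0] + 2 * B[1])
    # column 0 of C = 4 * (column 0 of A) + 7 * (column 1 of A)
    assert np.allclose(C[:, 0], 4 * A[:, 0] + 7 * A[:, 1])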

Linear (in)dependence

Linear (in)dependence

Let { 𝒖1 , 𝒖2 , …, 𝒖k } be a set of vectors. If

α1 𝒖1 + α2 𝒖2 + ⋯ + αk 𝒖k = 𝒐  ⇒  α1 = 0 , α2 = 0 , …, αk = 0

then the vectors { 𝒖1 , 𝒖2 , …, 𝒖k } are linearly independent; otherwise they are linearly dependent.
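
In ℝⁿ this can be tested by stacking the vectors as columns of a matrix and comparing its rank with the number of vectors. A sketch assuming numpy (the helper name independent is invented, and the rank computation is reliable only for small, well-scaled examples):

    import numpy as np

    def independent(vectors):
        # full column rank ⇔ α1 𝒖1 + ⋯ + αk 𝒖k = 𝒐 only for α1 = ⋯ = αk = 0
        M = np.column_stack(vectors)
        return np.linalg.matrix_rank(M) == len(vectors)

    print(independent([np.array([1., 0.]), np.array([0., 1.])]))  # True
    print(independent([np.array([1., 2.]), np.array([2., 4.])]))  # False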

Standard basis

The standard basis vectors are linearly independent, i.e., the columns and rows of 𝑰 are linearly independent.

One vector case

The set { 𝒖1 } is linearly independent if and only if 𝒖1 ≠ 𝒐 .

Sets with zero vector

Let { 𝒖1 , 𝒖2 , …, 𝒖k } be a set of vectors such that there is an index i with

𝒖i = 𝒐 .

Then the set is linearly dependent.

Sets with equal vectors

Let { 𝒖1 , 𝒖2 , …, 𝒖k } be a set of vectors such that there are indices i ≠ j with

𝒖i = 𝒖j .

Then the set is linearly dependent.

Linear dependence and combinations

Let { 𝒖1 , 𝒖2 , …, 𝒖k } be a set of linearly dependent vectors with k ≥ 2. Then there is t such that 𝒖t is a linear combination of the remaining vectors in the set.

Main Lemma

Main Lemma

Let { 𝒖1 , 𝒖2 , …, 𝒖m } and { 𝒘1 , 𝒘2 , …, 𝒘n } be two non-empty sets of vectors such that

𝒖1 = γ1,1 𝒘1 + γ1,2 𝒘2 + ⋯ + γ1,n 𝒘n
𝒖2 = γ2,1 𝒘1 + γ2,2 𝒘2 + ⋯ + γ2,n 𝒘n
⋮
𝒖m = γm,1 𝒘1 + γm,2 𝒘2 + ⋯ + γm,n 𝒘n

If m > n then { 𝒖1 , 𝒖2 , …, 𝒖m } are linearly dependent.
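
For instance, any three vectors built from two fixed vectors must be linearly dependent. A numpy spot-check with vectors picked arbitrarily:

    import numpy as np

    w1, w2 = np.array([1., 0., 2.]), np.array([0., 1., 1.])
    u1 = 2 * w1 + 1 * w2                # m = 3 vectors from n = 2 spanners
    u2 = 1 * w1 - 1 * w2
    u3 = 3 * w1 + 4 * w2

    M = np.column_stack([u1, u2, u3])
    print(np.linalg.matrix_rank(M))     # 2 < 3, so u1, u2, u3 are dependent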

Square matrices

A square matrix of order n has linearly dependent rows if and only if it has linearly dependent columns.

Subspaces

subspace

Let 𝑽 be a vector space and 𝑼 ⊆ 𝑽. If 𝑼 is a vector space under the same operations then 𝑼 is a subspace of 𝑽.

trivial subspaces

Let 𝑽 be a vector space. Then 𝑽 itself is a subspace of 𝑽 and { 𝒐 } is a subspace of 𝑽.

subspace condition

A non-empty subset 𝑼 ⊆ 𝑽 is a subspace of 𝑽 if and only if for all 𝒖 , 𝒘 ∈ 𝑼 and for all α , β ∈ 𝕂

α 𝒖 + β 𝒘 ∈ 𝑼
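
As an illustration, take 𝑼 = { (x, y, z) ∈ ℝ³ : x + y + z = 0 }. The condition can be spot-checked on random samples, a heuristic test rather than a proof:

    import numpy as np

    def in_U(v):
        # membership test for 𝑼 = { v ∈ ℝ³ : v1 + v2 + v3 = 0 }
        return abs(v.sum()) < 1e-12

    rng = np.random.default_rng(0)
    for _ in range(1000):
        u = rng.normal(size=3); u -= u.mean()   # force u into 𝑼
        w = rng.normal(size=3); w -= w.mean()   # force w into 𝑼
        a, b = rng.normal(size=2)
        assert in_U(a * u + b * w)              # closure under combinations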

Span

Span

Let 𝑺 = { 𝒖1 , 𝒖2 , …, 𝒖k } be a set of vectors. The set of all linear combinations of these vectors is the span of 𝑺 and is denoted by ⟨𝑺⟩ .

⟨𝑺⟩ = ⟨ 𝒖1 , 𝒖2 , …, 𝒖k ⟩

If the set is empty,

⟨∅⟩ = { 𝒐 }

consistency condition

A system of linear equations A 𝒙 = 𝒃 has a solution if and only if 𝒃 is in the span of the columns of A.

expanding spans

⟨ 𝑺 ∪ { 𝒖 } ⟩ = ⟨𝑺⟩ if and only if 𝒖 ∈ ⟨𝑺⟩ .

spans are vector spaces

The span of a set of vectors is a vector space.

Basis and dimension

Basis

The set 𝑩 = { 𝒃1 , 𝒃2 , …, 𝒃d } is a basis for a vector space 𝑽 if

  • ⟨𝑩⟩ = 𝑽
  • 𝑩 is linearly independent

Size

Let 𝑩1 and 𝑩2 be bases for a vector space 𝑽. Then

| 𝑩1 | = | 𝑩2 |

dimension

Let 𝑩 be a basis for a vector space 𝑽. The size of 𝑩 is called the dimension of 𝑽.

finite dimensions

A vector space is called finite dimensional if its basis has a finite number of vectors.

extending to a basis

Any linearly independent set can be extended to a basis.

diminishing to a basis

Any spanning set contains a basis.

Coordinates

uniqueness

Let { 𝒃1 , 𝒃2 , …, 𝒃d } be a linearly independent set. If

  • 𝒖 = α1 𝒃1 + α2 𝒃2 + ⋯ + αd 𝒃d
  • 𝒖 = β1 𝒃1 + β2 𝒃2 + ⋯ + βd 𝒃d

Then

α1 = β1 , α2 = β2 , …, αd = βd

coordinates

Let 𝒖 ∈ 𝑽, let 𝑩 = { 𝒃1 , 𝒃2 , …, 𝒃d } be a basis for 𝑽, and

𝒖 = ξ1 𝒃1 + ξ2 𝒃2 + ⋯ + ξd 𝒃d

Then ξ1 , ξ2 , …, ξd are the coordinates of 𝒖 with respect to the basis 𝑩.
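
Concretely, in ℝᵈ the coordinates are found by solving the linear system whose coefficient matrix has the basis vectors as columns. A sketch with a basis invented for illustration:

    import numpy as np

    B = np.array([[1., 1., 0.],
                  [0., 1., 1.],
                  [1., 0., 1.]]).T      # columns are 𝒃1, 𝒃2, 𝒃3
    u = np.array([2., 3., 1.])

    xi = np.linalg.solve(B, u)          # coordinates of 𝒖 with respect to 𝑩
    assert np.allclose(B @ xi, u)
    print(xi)                           # [2. 1. 0.]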

Basis description

In a vector space 𝑽 a subset 𝑩 is a basis for 𝑽 if and only if every vector in 𝑽 can be represented in a unique way as a linear combination of the vectors in 𝑩.

Matrix Rank

independent columns

Let 𝐴 be a square matrix for which there is a matrix 𝐵 such that

𝐴𝐵 = 𝐼

Then the columns of 𝐴 are linearly independent.

rank

The rank of a matrix 𝐴 is the maximal number of linearly independent columns of the matrix and is denoted by

rank(𝐴)
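
With numpy the rank can be computed directly. For example, a 3×3 matrix whose third column is the sum of the first two has rank 2 (matrix invented for illustration):

    import numpy as np

    A = np.array([[1., 0., 1.],
                  [0., 1., 1.],
                  [2., 3., 5.]])        # column 2 = column 0 + column 1
    print(np.linalg.matrix_rank(A))     # 2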

independent rows and columns

Let 𝐴 be an m×n matrix. The number of linearly independent rows equals the number of linearly independent columns.

product of elementary matrices

Let 𝐴 be a square matrix that has linearly independent rows. Then 𝐴 can be written as a product of elementary matrices.

consistency condition

A system of linear equations 𝐴 𝒙 = 𝒃 is consistent if and only if

rank(𝐴) = rank([ 𝐴 | 𝒃 ])
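
A sketch of this test in numpy, on a small system invented for illustration:

    import numpy as np

    A = np.array([[1., 2.],
                  [2., 4.]])
    b = np.array([3., 6.])              # 𝒃 = 3 × (first column of 𝐴)

    Ab = np.column_stack([A, b])        # augmented matrix [ 𝐴 | 𝒃 ]
    print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(Ab))
    # True; replacing b with [3., 7.] makes the system inconsistent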

dual condition

A system of linear equations 𝐴 𝒙 = 𝒃 is consistent if and only if for every 𝒚

𝐴T 𝒚 = 𝒐 ⇒ 𝒃T 𝒚 = 0
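
Numerically, one can take a basis of { 𝒚 : 𝐴T 𝒚 = 𝒐 } from the SVD and check that 𝒃 is orthogonal to it. A sketch for the same square system as above:

    import numpy as np

    A = np.array([[1., 2.],
                  [2., 4.]])
    b = np.array([3., 6.])

    U, s, Vt = np.linalg.svd(A.T)
    null = Vt[s < 1e-12]                # rows span { 𝒚 : 𝐴T 𝒚 = 𝒐 } (square 𝐴)
    print(np.allclose(null @ b, 0))     # True ⇔ 𝐴 𝒙 = 𝒃 is consistent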