These notes give the definition of a vector space and several of the concepts related to these spaces. Examples are drawn from the vector space of vectors in $\mathbb{R}^2$.
A nonempty set $V$ of elements $\mathbf{u}, \mathbf{v}, \ldots$ is called a vector space if in $V$ there are two algebraic operations (called addition and scalar multiplication), so that the following properties hold.
Addition associates with every pair of vectors $\mathbf{u}$ and $\mathbf{v}$ a unique vector which is called the sum of $\mathbf{u}$ and $\mathbf{v}$ and is written $\mathbf{u} + \mathbf{v}$. In the case of the space of 2-dimensional vectors, the summation is componentwise (i.e. if $\mathbf{u} = ( u_1, u_2 )$ and $\mathbf{v} = ( v_1, v_2 )$, then $\mathbf{u} + \mathbf{v} = ( u_1 + v_1, u_2 + v_2 )$), which is best illustrated by the ``parallelogram rule'': the sum corresponds to the diagonal of the parallelogram whose sides are the two vectors.
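For example, componentwise addition gives
\[ (1, 2) + (3, 1) = ( 1 + 3,\; 2 + 1 ) = (4, 3). \]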
Addition satisfies the following: for all $\mathbf{u}$, $\mathbf{v}$ and $\mathbf{w}$ in $V$,
- $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$ (commutativity),
- $( \mathbf{u} + \mathbf{v} ) + \mathbf{w} = \mathbf{u} + ( \mathbf{v} + \mathbf{w} )$ (associativity),
- there is a unique vector $\mathbf{0}$ (the zero vector) such that $\mathbf{u} + \mathbf{0} = \mathbf{u}$ for every $\mathbf{u}$, and
- every $\mathbf{u}$ has a unique additive inverse $-\mathbf{u}$ such that $\mathbf{u} + ( -\mathbf{u} ) = \mathbf{0}$.
The use of an additive inverse allows us to define a subtraction operation on vectors. Simply define
\[ \mathbf{u} - \mathbf{v} = \mathbf{u} + ( -\mathbf{v} ). \]
In the space of 2-dimensional vectors, the difference $\mathbf{u} - \mathbf{v}$ is frequently portrayed as the vector joining the tip of $\mathbf{v}$ to the tip of $\mathbf{u}$. Since vectors are determined by direction and length, and not by position, this vector and $\mathbf{u} + ( -\mathbf{v} )$ are equivalent.
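For example,
\[ (4, 3) - (1, 2) = ( 4 - 1,\; 3 - 2 ) = (3, 1). \]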
Scalar multiplication associates with every vector $\mathbf{u}$ and every scalar $c$ another unique vector, usually written $c\mathbf{u}$. In the case of 2-dimensional vectors, scalar multiplication is also componentwise: $c\mathbf{u} = ( c u_1, c u_2 )$.
For scalar multiplication the following properties hold: for all vectors $\mathbf{u}$, $\mathbf{v}$ and scalars $c$, $k$,
- $c ( \mathbf{u} + \mathbf{v} ) = c\mathbf{u} + c\mathbf{v}$,
- $( c + k ) \mathbf{u} = c\mathbf{u} + k\mathbf{u}$,
- $c ( k \mathbf{u} ) = ( ck ) \mathbf{u}$, and
- $1 \mathbf{u} = \mathbf{u}$.
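For example,
\[ 3 \, (2, -1) = (6, -3), \]
and the first distributive property can be checked directly:
\[ 3 \, \big( (2, -1) + (1, 1) \big) = 3 \, (3, 0) = (9, 0) = (6, -3) + (3, 3) = 3 \, (2, -1) + 3 \, (1, 1). \]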
Examples of vector spaces abound in mathematics. The most obvious examples are the usual vectors in $\mathbb{R}^2$, from which we have drawn our illustrations in the sections above. But we frequently utilize several other vector spaces: the 3-dimensional space of vectors, the vector space of all polynomials of a fixed degree, and vector spaces of $m \times n$ matrices. We briefly discuss these below.
The Vector Space of 3-Dimensional Vectors
The vectors in $\mathbb{R}^3$ also form a vector space, where in this case the vector operations of addition and scalar multiplication are done componentwise. That is, if $\mathbf{u} = ( u_1, u_2, u_3 )$ and $\mathbf{v} = ( v_1, v_2, v_3 )$ are vectors, then addition is
\[ \mathbf{u} + \mathbf{v} = ( u_1 + v_1,\; u_2 + v_2,\; u_3 + v_3 ) \]
and scalar multiplication is $c\mathbf{u} = ( c u_1, c u_2, c u_3 )$. The axioms are easily verified (for example, the additive inverse of $\mathbf{u}$ is just $-\mathbf{u} = ( -u_1, -u_2, -u_3 )$, and the zero vector is just $( 0, 0, 0 )$).
Here the axioms simply state what we have always been taught about these sets of vectors.
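For example, commutativity in $\mathbb{R}^3$ follows directly from the commutativity of real-number addition in each component:
\[ \mathbf{u} + \mathbf{v} = ( u_1 + v_1, u_2 + v_2, u_3 + v_3 ) = ( v_1 + u_1, v_2 + u_2, v_3 + u_3 ) = \mathbf{v} + \mathbf{u}. \]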
The Vector Space of Polynomials
The set of quadratic polynomials of the form $p(t) = a_2 t^2 + a_1 t + a_0$ forms a vector space, where two polynomials are added by adding the coefficients of like terms, and a polynomial is multiplied by a scalar by multiplying each coefficient by that scalar. The axioms are again easily verified by performing the operations individually on like terms.
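For example,
\[ ( 2t^2 + t + 1 ) + ( t^2 - 3t + 4 ) = 3t^2 - 2t + 5 \quad \text{and} \quad 2 \, ( t^2 - 3t + 4 ) = 2t^2 - 6t + 8. \]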
A simple extension of the above is to consider the set of polynomials of degree less than or equal to $n$. It is easily seen that these also form a vector space.
The Vector Space of Matrices
The set of $m \times n$ matrices forms a vector space. Two $m \times n$ matrices can be added componentwise, and a matrix can be multiplied by a scalar. All axioms are easily verified.
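For example, with $2 \times 2$ matrices,
\[ \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} + \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 3 \\ 4 & 4 \end{pmatrix} \quad \text{and} \quad 2 \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 2 & 4 \\ 6 & 8 \end{pmatrix}. \]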
Given a vector space $V$, the concept of a basis for the vector space is fundamental for much of the work that we will do in computer graphics. This section discusses several topics relating to linear combinations of vectors, linear independence, and bases.
Given vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$ in $V$ and scalars $c_1, c_2, \ldots, c_n$, the expression
\[ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n \]
is called a linear combination of the vectors. This element is clearly a member of the vector space $V$ (just repeatedly apply the summation and scalar multiplication axioms).
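For example, in $\mathbb{R}^2$,
\[ 2 \, (1, 0) + 3 \, (0, 1) - (1, 1) = (2, 0) + (0, 3) + (-1, -1) = (1, 2) \]
is a linear combination of the vectors $(1, 0)$, $(0, 1)$, and $(1, 1)$.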
The set $S$ that contains all possible linear combinations of $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$ is called the span of $\{ \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n \}$. We frequently say that $S$ is spanned (or generated) by those vectors. It is straightforward to show that the span of any set of vectors is again a vector space.
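For example, the span of the two vectors $(1, 0, 0)$ and $(0, 1, 0)$ in $\mathbb{R}^3$ consists of all vectors of the form
\[ c_1 \, (1, 0, 0) + c_2 \, (0, 1, 0) = ( c_1, c_2, 0 ), \]
that is, the $xy$-plane.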
Given a set of vectors $\{ \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n \}$ from a vector space $V$, this set is called linearly independent in $V$ if the equation
\[ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n = \mathbf{0} \]
has only the trivial solution $c_1 = c_2 = \cdots = c_n = 0$.
If a set of vectors is not linearly independent, then it is called linearly dependent. This implies that the equation above has a nonzero solution, that is, there exist $c_1, c_2, \ldots, c_n$, not all zero, such that
\[ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n = \mathbf{0}. \]
Any set of vectors containing the zero vector $\mathbf{0}$ is linearly dependent, since we can take the coefficient of $\mathbf{0}$ to be nonzero and all the other coefficients to be zero.
To give an example of a linearly independent set that everyone has seen, consider the three vectors
\[ \mathbf{i} = ( 1, 0, 0 ), \quad \mathbf{j} = ( 0, 1, 0 ), \quad \mathbf{k} = ( 0, 0, 1 ). \]
Consider the equation
\[ c_1 \mathbf{i} + c_2 \mathbf{j} + c_3 \mathbf{k} = ( c_1, c_2, c_3 ) = ( 0, 0, 0 ). \]
This clearly forces $c_1 = c_2 = c_3 = 0$, so the set is linearly independent.
Let $\{ \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n \}$ be a set of vectors in a vector space $V$ and let $S$ be the span of $\{ \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n \}$. If $\{ \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n \}$ is linearly independent, then we say that these vectors form a basis for $S$, and $S$ has dimension $n$. Since these vectors span $S$, any vector $\mathbf{v} \in S$ can be written uniquely as
\[ \mathbf{v} = c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n \]
(the span guarantees that such a representation exists; linear independence guarantees that it is unique).
If $S$ is the entire vector space $V$, we say that $\{ \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n \}$ forms a basis for $V$, and $V$ has dimension $n$.
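For example, the vectors $\mathbf{i}$, $\mathbf{j}$, $\mathbf{k}$ above span $\mathbb{R}^3$ and are linearly independent, so they form a basis for $\mathbb{R}^3$, and $\mathbb{R}^3$ has dimension 3. Any vector $( x, y, z )$ can be written uniquely as
\[ ( x, y, z ) = x \, \mathbf{i} + y \, \mathbf{j} + z \, \mathbf{k}. \]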