By definition, a vector space $V$ over a field $K$ is a set $V$ with two operations, $\boxplus$ (addition) and $\boxdot$ (scalar multiplication), satisfying a list of axioms.
These are listed in the article vector space.
There are four axioms for addition and four axioms for scalar multiplication, so eight in total.
So if we want to show that a set forms a vector space, we first have to define operations $\boxplus$ and $\boxdot$ and then prove that the axioms are satisfied.
In defining the operations, note that the sum of two vectors and the product of a scalar with a vector must again be vectors from the set $V$, i.e., for all $v,w\in V$ and $\lambda \in K$ we have $v\boxplus w,\ \lambda \boxdot v\in V$.
This property is called closure and is an important part of the well-definedness of the operations!
Then we work through the axioms in the order given in the definition.
Now we want to demonstrate the whole procedure with an example.
As an example we choose the space of polynomials of degree less than or equal to $n$ (for a fixed $n\in \mathbb{N}$). We prove that those polynomials form a vector space.
First we need to precisely define the polynomial space $K[X]_{\leq n}$, i.e., the set of our vectors.
On this set we introduce two operations, an addition and a multiplication by scalars from $K$:
Definition (Addition and scalar multiplication on the polynomial space)

We define the addition as follows:

$$
\begin{aligned}
\boxplus \colon K[X]_{\leq n}\times K[X]_{\leq n} &\to K[X]_{\leq n},\\
\left(\sum_{i=0}^{n}a_{i}X^{i},\ \sum_{i=0}^{n}b_{i}X^{i}\right) &\mapsto \sum_{i=0}^{n}(a_{i}+b_{i})X^{i}.
\end{aligned}
$$
The scalar multiplication works very similarly:

$$
\begin{aligned}
\boxdot \colon K\times K[X]_{\leq n} &\to K[X]_{\leq n},\\
\left(\lambda ,\ \sum_{i=0}^{n}a_{i}X^{i}\right) &\mapsto \sum_{i=0}^{n}(\lambda \cdot a_{i})X^{i}.
\end{aligned}
$$
We want to point out that the sums on the right-hand side of each map again run only from $0$ to $n$.
So we again get polynomials of degree at most $n$, and we actually end up in $K[X]_{\leq n}$.
This is important to obtain well-defined operations $\boxplus$ and $\boxdot$ acting on $K[X]_{\leq n}$.
We now want to show that these polynomials indeed form a vector space, so we need to establish the eight vector space axioms.
First, $K[X]_{\leq n}$ together with the operation $\boxplus$ must form an abelian group. That is, the following axioms are satisfied:
Associative law: For all $f,g,h\in K[X]_{\leq n}$ we have: $f\boxplus (g\boxplus h)=(f\boxplus g)\boxplus h$.

Commutative law: For all $f,g\in K[X]_{\leq n}$ we have: $f\boxplus g=g\boxplus f$.

Existence of a neutral element: There exists an element $0\in K[X]_{\leq n}$ such that for all $f\in K[X]_{\leq n}$ we have $f\boxplus 0=f$.

Existence of an inverse element: For every $f\in K[X]_{\leq n}$ there exists an element $g\in K[X]_{\leq n}$ such that $f\boxplus g=0$.
In addition, the following axioms of the scalar multiplication $\boxdot$ must be satisfied:

Scalar distributive law: For all $\lambda ,\mu \in K$ and all $f\in K[X]_{\leq n}$ we have: $(\lambda +\mu )\boxdot f=(\lambda \boxdot f)\boxplus (\mu \boxdot f)$.

Vectorial distributive law: For all $\lambda \in K$ and all $f,g\in K[X]_{\leq n}$ we have: $\lambda \boxdot (f\boxplus g)=(\lambda \boxdot f)\boxplus (\lambda \boxdot g)$.

Associative law for scalars: For all $\lambda ,\mu \in K$ and all $f\in K[X]_{\leq n}$ we have: $(\lambda \cdot \mu )\boxdot f=\lambda \boxdot (\mu \boxdot f)$.

Neutral element of scalar multiplication: For all $f\in K[X]_{\leq n}$ and for $1\in K$ (the neutral element of multiplication in $K$) we have: $1\boxdot f=f$.
We will now prove each of these axioms individually.
We start with the associativity of addition, which follows from the associativity of addition in $K$.
Proof (Associativity of addition)

Let $f=\sum_{i=0}^{n}f_{i}X^{i},\ g=\sum_{i=0}^{n}g_{i}X^{i},\ h=\sum_{i=0}^{n}h_{i}X^{i}\in K[X]_{\leq n}$.
Then, we have:

$$
\begin{aligned}
(f\boxplus g)\boxplus h&=\left(\sum_{i=0}^{n}f_{i}X^{i}\boxplus \sum_{i=0}^{n}g_{i}X^{i}\right)\boxplus \sum_{i=0}^{n}h_{i}X^{i}\\
&=\sum_{i=0}^{n}(f_{i}+g_{i})X^{i}\boxplus \sum_{i=0}^{n}h_{i}X^{i}\\
&=\sum_{i=0}^{n}((f_{i}+g_{i})+h_{i})X^{i}\\
&\qquad \downarrow \ \text{associativity in }K\\
&=\sum_{i=0}^{n}(f_{i}+(g_{i}+h_{i}))X^{i}\\
&=\sum_{i=0}^{n}f_{i}X^{i}\boxplus \sum_{i=0}^{n}(g_{i}+h_{i})X^{i}\\
&=\sum_{i=0}^{n}f_{i}X^{i}\boxplus \left(\sum_{i=0}^{n}g_{i}X^{i}\boxplus \sum_{i=0}^{n}h_{i}X^{i}\right)\\
&=f\boxplus (g\boxplus h).
\end{aligned}
$$
This shows the associativity of addition.
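The chain of equalities above can be spot-checked numerically. A small illustrative sketch (coefficient tuples over $K=\mathbb{Z}$; the name `boxplus` is ours, not from the text):

```python
def boxplus(f, g):
    # coefficient-wise addition on tuples of equal length (degree <= n)
    return tuple(a + b for a, b in zip(f, g))

# three polynomials in Z[X]_<=2, given by their coefficient tuples
f, g, h = (1, 2, 3), (4, 5, 6), (7, 8, 9)
lhs = boxplus(boxplus(f, g), h)  # (f ⊞ g) ⊞ h
rhs = boxplus(f, boxplus(g, h))  # f ⊞ (g ⊞ h)
print(lhs == rhs)  # True: both equal (12, 15, 18)
```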
Now follows the commutativity of addition. As above, this follows from the commutativity of addition in $K$:
Proof (Commutativity of addition)

Let $f=\sum_{i=0}^{n}f_{i}X^{i},\ g=\sum_{i=0}^{n}g_{i}X^{i}\in K[X]_{\leq n}$.
Then, we have:

$$
\begin{aligned}
f\boxplus g&=\sum_{i=0}^{n}f_{i}X^{i}\boxplus \sum_{i=0}^{n}g_{i}X^{i}\\
&=\sum_{i=0}^{n}(f_{i}+g_{i})X^{i}\\
&\qquad \downarrow \ \text{commutativity in }K\\
&=\sum_{i=0}^{n}(g_{i}+f_{i})X^{i}\\
&=\sum_{i=0}^{n}g_{i}X^{i}\boxplus \sum_{i=0}^{n}f_{i}X^{i}\\
&=g\boxplus f.
\end{aligned}
$$
This shows the commutativity of addition.
Now we have to prove that a zero exists, i.e., a neutral element with respect to addition. To do this, we must first find a candidate. There is an "obvious" one here: the zero polynomial $0=\sum_{i=0}^{n}0\cdot X^{i}$.
This is indeed the neutral element of the polynomial addition:

Proof (0 is the neutral element of addition)

Let $f=\sum_{i=0}^{n}f_{i}X^{i}\in K[X]_{\leq n}$.
Then, we have:

$$
\begin{aligned}
0\boxplus f&=\sum_{i=0}^{n}0\cdot X^{i}\boxplus \sum_{i=0}^{n}f_{i}X^{i}\\
&=\sum_{i=0}^{n}(0+f_{i})X^{i}\\
&\qquad \downarrow \ 0\text{ is the neutral element of addition in }K\\
&=\sum_{i=0}^{n}f_{i}X^{i}\\
&=f.
\end{aligned}
$$
Since $f$ was arbitrarily chosen, and since $f\boxplus 0=0\boxplus f$ by the commutativity shown above, $0$ is the neutral element with respect to addition.
The next step is the existence of an additive inverse. Here again there is an obvious choice: for $f=\sum_{i=0}^{n}f_{i}X^{i}$, the additive inverse is given by $g=\sum_{i=0}^{n}(-f_{i})X^{i}$. Indeed, $f\boxplus g=\sum_{i=0}^{n}(f_{i}+(-f_{i}))X^{i}=\sum_{i=0}^{n}0\cdot X^{i}=0$.
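The neutral element and the additive inverse can likewise be checked on concrete coefficient tuples (an illustrative sketch over $K=\mathbb{Z}$; the helper name `neg` is ours):

```python
def boxplus(f, g):
    # coefficient-wise addition on tuples of equal length (degree <= n)
    return tuple(a + b for a, b in zip(f, g))

def neg(f):
    # additive inverse: negate each coefficient
    return tuple(-a for a in f)

n = 2
zero = (0,) * (n + 1)   # the zero polynomial: all n+1 coefficients are 0
f = (5, -3, 7)          # 5 - 3X + 7X^2
print(boxplus(zero, f) == f)       # True: 0 ⊞ f = f
print(boxplus(f, neg(f)) == zero)  # True: f ⊞ (-f) = 0
```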
The proofs of the two distributive laws follow from the distributive law in $K$ and work similarly, so we show only the scalar distributive law here:

Proof (Scalar distributive law)

Let $f=\sum_{i=0}^{n}f_{i}X^{i}\in K[X]_{\leq n}$ and $\lambda ,\mu \in K$.
Then, we have:

$$
\begin{aligned}
(\lambda +\mu )\boxdot f&=(\lambda +\mu )\boxdot \sum_{i=0}^{n}f_{i}X^{i}\\
&=\sum_{i=0}^{n}((\lambda +\mu )\cdot f_{i})X^{i}\\
&\qquad \downarrow \ \text{distributive law in }K\\
&=\sum_{i=0}^{n}((\lambda \cdot f_{i})+(\mu \cdot f_{i}))X^{i}\\
&=\sum_{i=0}^{n}(\lambda \cdot f_{i})X^{i}\boxplus \sum_{i=0}^{n}(\mu \cdot f_{i})X^{i}\\
&=\left(\lambda \boxdot \sum_{i=0}^{n}f_{i}X^{i}\right)\boxplus \left(\mu \boxdot \sum_{i=0}^{n}f_{i}X^{i}\right)\\
&=(\lambda \boxdot f)\boxplus (\mu \boxdot f).
\end{aligned}
$$
This shows the distributive law for the addition $+$ in $K$ with respect to $\boxdot$.
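For the scalar distributive law, the same kind of numeric spot-check works (illustrative sketch over $K=\mathbb{Z}$, with our own function names):

```python
def boxplus(f, g):
    # coefficient-wise addition of polynomials of degree <= n
    return tuple(a + b for a, b in zip(f, g))

def boxdot(lam, f):
    # scalar multiplication: multiply every coefficient by lam
    return tuple(lam * a for a in f)

lam, mu = 3, -2
f = (1, 4, -5)  # 1 + 4X - 5X^2
lhs = boxdot(lam + mu, f)                     # (λ + μ) ⊡ f
rhs = boxplus(boxdot(lam, f), boxdot(mu, f))  # (λ ⊡ f) ⊞ (μ ⊡ f)
print(lhs == rhs)  # True: both equal (1, 4, -5)
```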
Associative law with respect to multiplication
Next we have to establish the associative law with respect to scalar multiplication.
This follows (similarly to the first two axioms) from the associativity of multiplication in $K$:
Proof (Associative law with respect to multiplication)

Let $f=\sum_{i=0}^{n}f_{i}X^{i}\in K[X]_{\leq n}$ and $\lambda ,\mu \in K$.
Then, we have:

$$
\begin{aligned}
\lambda \boxdot (\mu \boxdot f)&=\lambda \boxdot \left(\mu \boxdot \sum_{i=0}^{n}f_{i}X^{i}\right)\\
&=\lambda \boxdot \sum_{i=0}^{n}(\mu \cdot f_{i})X^{i}\\
&=\sum_{i=0}^{n}(\lambda \cdot (\mu \cdot f_{i}))X^{i}\\
&\qquad \downarrow \ \text{associativity of multiplication in }K\\
&=\sum_{i=0}^{n}((\lambda \cdot \mu )\cdot f_{i})X^{i}\\
&=(\lambda \cdot \mu )\boxdot \sum_{i=0}^{n}f_{i}X^{i}\\
&=(\lambda \cdot \mu )\boxdot f.
\end{aligned}
$$
This shows the associative law for scalar multiplication.
And finally, we have to establish the unit property:
Proof (Unit property)

Let $f=\sum_{i=0}^{n}f_{i}X^{i}\in K[X]_{\leq n}$.
Then, we have:

$$
\begin{aligned}
1\boxdot f&=1\boxdot \sum_{i=0}^{n}f_{i}X^{i}\\
&=\sum_{i=0}^{n}(1\cdot f_{i})X^{i}\\
&\qquad \downarrow \ 1\text{ is the neutral element with respect to the multiplication in }K\\
&=\sum_{i=0}^{n}f_{i}X^{i}\\
&=f.
\end{aligned}
$$
So $K[X]_{\leq n}$ satisfies the unit property.
We have established all eight vector space axioms, and hence the polynomial space $(K[X]_{\leq n},\boxplus ,\boxdot )$ is a vector space.
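As a closing sanity check, all eight axioms can be tested on random coefficient tuples. This sketch (with $K=\mathbb{Q}$ via Python's `Fraction` and our own `boxplus`/`boxdot`) is of course only a numeric test, not a substitute for the proofs above:

```python
import random
from fractions import Fraction

def boxplus(f, g):
    return tuple(a + b for a, b in zip(f, g))

def boxdot(lam, f):
    return tuple(lam * a for a in f)

def rand_poly(n):
    # random polynomial of degree <= n with small rational coefficients
    return tuple(Fraction(random.randint(-9, 9)) for _ in range(n + 1))

random.seed(0)
n = 4
zero = (Fraction(0),) * (n + 1)
for _ in range(100):
    f, g, h = rand_poly(n), rand_poly(n), rand_poly(n)
    lam, mu = Fraction(random.randint(-9, 9)), Fraction(random.randint(-9, 9))
    assert boxplus(boxplus(f, g), h) == boxplus(f, boxplus(g, h))      # associativity
    assert boxplus(f, g) == boxplus(g, f)                              # commutativity
    assert boxplus(f, zero) == f                                       # neutral element
    assert boxplus(f, boxdot(Fraction(-1), f)) == zero                 # inverse element
    assert boxdot(lam + mu, f) == boxplus(boxdot(lam, f), boxdot(mu, f))   # scalar distributivity
    assert boxdot(lam, boxplus(f, g)) == boxplus(boxdot(lam, f), boxdot(lam, g))  # vectorial distributivity
    assert boxdot(lam * mu, f) == boxdot(lam, boxdot(mu, f))           # scalar associativity
    assert boxdot(Fraction(1), f) == f                                 # unit property
print("all eight axioms hold on random samples")
```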