Vector Spaces in Abstract Algebra
To put it plainly, "this is kind of a mind fuck." Class notes are available under April 25th. This material is needed to understand Fields, specifically Extension Fields and Splitting Fields.
How to think about it
If you have done linear algebra or multivariable calc, your intuition might work against you here. We are used to two- or three-dimensional vector spaces such as $\mathbb{R}^2$ and $\mathbb{R}^3$, where vectors are arrows with magnitude and direction.
However, in abstract algebra the goal is to strip away the physical attributes and think more abstractly. Then you can notice that we can define vectors in many settings: polynomials, matrices, continuous functions, and field extensions all form vector spaces.
Many examples from the book were done on paper.
Definition of a Vector Space
A vector space $V$ over a field $F$ is an abelian group under vector addition, together with a scalar multiplication $F \times V \to V$ satisfying, for all $a, b \in F$ and $u, v \in V$:
- Associativity with the scalars: $(ab)v = a(bv)$
- Distributivity of scalars over vectors: $a(u + v) = au + av$
- Distributivity of vectors over scalars: $(a + b)v = av + bv$
- Identity: $1v = v$

Notice that most of the time two vectors cannot be multiplied; it is always scalar $\cdot$ vector. They can, however, be added. I'll differentiate the additive scalar identity and the additive vector identity with $0_F$ and $0_V$.
Examples:
$\mathbb{Q}(\sqrt{2}) = \{a + b\sqrt{2} : a, b \in \mathbb{Q}\}$ over $\mathbb{Q}$ is a vector space.
Clearly, you can multiply a scalar from $\mathbb{Q}$ by an element $a + b\sqrt{2}$ and stay inside $\mathbb{Q}(\sqrt{2})$.
Let's show the following axioms within $\mathbb{Q}(\sqrt{2})$, for scalars $p, q \in \mathbb{Q}$:
- Associativity: $(pq)(a + b\sqrt{2}) = p\big(q(a + b\sqrt{2})\big)$
- First distributivity: $p\big((a + b\sqrt{2}) + (c + d\sqrt{2})\big) = p(a + b\sqrt{2}) + p(c + d\sqrt{2})$
- Second distributivity: $(p + q)(a + b\sqrt{2}) = p(a + b\sqrt{2}) + q(a + b\sqrt{2})$
- Identity: $1 \cdot (a + b\sqrt{2}) = a + b\sqrt{2}$

Clearly each of these follows from the field axioms of $\mathbb{R}$, so $\mathbb{Q}(\sqrt{2})$ forms a vector space over the field $\mathbb{Q}$.
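As a quick sanity check, the scalar axioms can be verified with exact rational arithmetic. This is a minimal sketch assuming the example is $\mathbb{Q}(\sqrt{2})$ over $\mathbb{Q}$, with $a + b\sqrt{2}$ stored as the pair $(a, b)$ (the helper names are my own):

```python
from fractions import Fraction as Q

# Represent a + b*sqrt(2) in Q(sqrt(2)) as the pair (a, b), a and b rational.
def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def smul(q, x):
    # Scalar multiplication by q in Q.
    return (q * x[0], q * x[1])

x, y = (Q(1), Q(2)), (Q(3, 2), Q(-1))   # 1 + 2*sqrt(2) and 3/2 - sqrt(2)
p, q = Q(2, 3), Q(5)

assert smul(p * q, x) == smul(p, smul(q, x))              # (pq)x = p(qx)
assert smul(q, add(x, y)) == add(smul(q, x), smul(q, y))  # q(x + y) = qx + qy
assert smul(p + q, x) == add(smul(p, x), smul(q, x))      # (p + q)x = px + qx
assert smul(Q(1), x) == x                                 # 1x = x
```

Since `Fraction` arithmetic is exact, the asserts check the axioms with no floating-point noise.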
Def: $F[x]$, the set of all polynomials with coefficients in a field $F$, is a vector space over $F$.
Vector addition is just polynomial addition, and scalar multiplication is just multiplying the polynomial by an element of $F$.
This is easy to see, so I will skip the proof.
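To see polynomials behaving as vectors, here is a minimal sketch with coefficient lists standing in for elements of $F[x]$ (helper names are my own):

```python
from itertools import zip_longest

# A polynomial is stored as its coefficient list [c0, c1, c2, ...],
# so [1, 0, 3] means 1 + 3x^2.
def poly_add(p, q):
    # Vector addition = polynomial addition, padding the shorter list with 0.
    return [a + b for a, b in zip_longest(p, q, fillvalue=0)]

def poly_smul(c, p):
    # Scalar multiplication scales every coefficient.
    return [c * a for a in p]

p = [1, 0, 3]   # 1 + 3x^2
q = [2, 5]      # 2 + 5x
print(poly_add(p, q))    # [3, 5, 3]  i.e. 3 + 5x + 3x^2
print(poly_smul(2, p))   # [2, 0, 6]  i.e. 2 + 6x^2
```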
The set $C[a, b]$ of all continuous real-valued functions on a closed interval $[a, b]$ is a vector space over $\mathbb{R}$.
$\mathbb{R}^n$ is a vector space over $\mathbb{R}$.
Let's first define vector addition and scalar multiplication; then it is easy to prove the axioms hold. I did the first few here and the rest on paper, but it is clear they hold.
Definition:
Vector addition:
For $u = (u_1, \dots, u_n)$ and $v = (v_1, \dots, v_n)$ in $\mathbb{R}^n$, define $u + v = (u_1 + v_1, \dots, u_n + v_n)$.
Scalar Multiplication:
For $\alpha \in \mathbb{R}$, define $\alpha v = (\alpha v_1, \dots, \alpha v_n)$.
Proof of the first two axioms:
Let's prove that $\mathbb{R}^n$ satisfies them.
- Associativity: $(\alpha\beta)v = \big((\alpha\beta)v_1, \dots, (\alpha\beta)v_n\big) = \big(\alpha(\beta v_1), \dots, \alpha(\beta v_n)\big) = \alpha(\beta v)$
- First distributive property: $\alpha(u + v) = \big(\alpha(u_1 + v_1), \dots, \alpha(u_n + v_n)\big) = (\alpha u_1 + \alpha v_1, \dots, \alpha u_n + \alpha v_n) = \alpha u + \alpha v$
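The componentwise arguments can be spot-checked numerically. A small sketch with made-up helper names (values chosen to be exact in floating point):

```python
# Componentwise operations on R^n, with vectors as tuples.
def vadd(u, v):
    return tuple(a + b for a, b in zip(u, v))

def smul(c, v):
    return tuple(c * a for a in v)

u, v = (1.0, 2.0, 3.0), (4.0, -1.0, 0.5)
a, b = 2.0, -3.0

assert smul(a * b, u) == smul(a, smul(b, u))                # (ab)u = a(bu)
assert smul(a, vadd(u, v)) == vadd(smul(a, u), smul(a, v))  # a(u+v) = au + av
assert smul(a + b, u) == vadd(smul(a, u), smul(b, u))       # (a+b)u = au + bu
```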
Properties:
If $V$ is a vector space over $F$, then:
- $0_F\, v = 0_V$ for all $v \in V$
- $\alpha\, 0_V = 0_V$ for all $\alpha \in F$
- If $\alpha v = 0_V$, then either $\alpha = 0_F$ or $v = 0_V$
- $(-1)v = -v$ and $-(\alpha v) = (-\alpha)v = \alpha(-v)$ for all $\alpha \in F$ and $v \in V$
Subspaces
Similar to subgroups for groups and subrings for rings, we have subspaces for vector spaces.
Definition of a Subspace
Let $W$ be a nonempty subset of a vector space $V$ over $F$. Then $W$ is a subspace of $V$ if it is:
- Closed under vector addition and scalar multiplication, i.e.
$u + v \in W$ and $\alpha u \in W$ for all $u, v \in W$ and all scalars $\alpha \in F$.
Example
A quick example is the subset $W = \{(x, y, 0) : x, y \in \mathbb{R}\}$ of $\mathbb{R}^3$.
Clearly, adding two such vectors or scaling one keeps the third coordinate at $0$, so $W$ is a subspace.
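The closure conditions can be checked directly. A small sketch, assuming the subspace is the plane $W = \{(x, y, 0)\}$ inside $\mathbb{R}^3$ (a hypothetical choice for illustration):

```python
# Membership test for the hypothetical subspace W = {(x, y, 0)} in R^3.
def in_W(v):
    return len(v) == 3 and v[2] == 0

def vadd(u, v):
    return tuple(a + b for a, b in zip(u, v))

def smul(c, v):
    return tuple(c * a for a in v)

u, w = (1, 4, 0), (-2, 7, 0)
assert in_W(vadd(u, w))    # closed under vector addition
assert in_W(smul(10, u))   # closed under scalar multiplication
```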
Linear Combination
Definition of Linear Combination
A linear combination of vectors $v_1, \dots, v_n \in V$ is any vector of the form $\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n$ with $\alpha_i \in F$.
Definition of Spanning set
The span of a set of vectors $S = \{v_1, \dots, v_n\}$ is the set of all linear combinations of the $v_i$.
If a set $S$ satisfies $\operatorname{Span}(S) = V$, then $S$ is a spanning set for $V$.
If $S$ is a subset of a vector space $V$, then the span of $S$ is a subspace of $V$.
Firstly, the span of $S$ is all the vectors of the form $\alpha_1 v_1 + \cdots + \alpha_n v_n$.
Vector addition closure:
For two elements of $\operatorname{Span}(S)$, say $u = \alpha_1 v_1 + \cdots + \alpha_n v_n$ and $w = \beta_1 v_1 + \cdots + \beta_n v_n$,
then $u + w = (\alpha_1 + \beta_1)v_1 + \cdots + (\alpha_n + \beta_n)v_n$,
which is also in $\operatorname{Span}(S)$.
Scalar multiplication closure:
For any scalar $\beta$, $\beta u = (\beta\alpha_1)v_1 + \cdots + (\beta\alpha_n)v_n$, which is again in $\operatorname{Span}(S)$.
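The closure argument can be mirrored in code: adding two linear combinations of $S$ just adds their coefficient tuples, so the result is again a linear combination. A sketch with made-up helper names:

```python
# An element of Span(S) is determined by its coefficients (a1, ..., an)
# on the spanning vectors v1, ..., vn (tuples over R).
def combo(coeffs, S):
    # Compute alpha_1 v_1 + ... + alpha_n v_n.
    n = len(S[0])
    out = [0] * n
    for a, v in zip(coeffs, S):
        for i in range(n):
            out[i] += a * v[i]
    return tuple(out)

S = [(1, 0, 2), (0, 1, 1)]
alphas, betas = (2, 3), (-1, 4)

lhs = tuple(x + y for x, y in zip(combo(alphas, S), combo(betas, S)))
rhs = combo(tuple(a + b for a, b in zip(alphas, betas)), S)  # coefficients add
assert lhs == rhs   # the sum is again a linear combination of S
```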
Linear Independence (in Abstract Algebra)
Like in linear algebra, a set of vectors is said to be linearly dependent if there exist scalars, not all zero, such that the linear combination equals $0$. Otherwise, it's linearly independent.
I.e., a set $\{v_1, \dots, v_n\}$ is linearly independent if $\alpha_1 v_1 + \cdots + \alpha_n v_n = 0$ implies $\alpha_1 = \alpha_2 = \cdots = \alpha_n = 0$.
Another way to look at it: a set is linearly independent if no vector can be written as a linear combination of the other vectors.
Byproducts of linear independence:
If $\{v_1, \dots, v_n\}$ is a set of linearly independent vectors and $\alpha_1 v_1 + \cdots + \alpha_n v_n = \beta_1 v_1 + \cdots + \beta_n v_n$, then we know $\alpha_i = \beta_i$ for each $i$.
This follows as subtracting the two sides gives $(\alpha_1 - \beta_1)v_1 + \cdots + (\alpha_n - \beta_n)v_n = 0$,
which by linear independence means $\alpha_i - \beta_i = 0$ for every $i$.
Thus every vector in the span has a unique representation in terms of $v_1, \dots, v_n$.
A set of vectors $\{v_1, \dots, v_n\}$ in a vector space $V$ is linearly dependent if and only if one of the $v_i$ is a linear combination of the rest.
Essentially, take a dependence relation $\alpha_1 v_1 + \cdots + \alpha_n v_n = 0$ where at least one scalar (say $\alpha_1$) is nonzero.
Then we can write $v_1 = -\frac{\alpha_2}{\alpha_1}v_2 - \cdots - \frac{\alpha_n}{\alpha_1}v_n$.
This works for any $\alpha_i \neq 0$.
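One concrete way to test linear independence is Gaussian elimination over $\mathbb{Q}$: the vectors are independent exactly when the matrix they form has full row rank. A stdlib-only sketch (function names are my own):

```python
from fractions import Fraction

def rank(rows):
    # Row-reduce over Q; the rank is the number of pivots found.
    M = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def independent(vectors):
    return rank(vectors) == len(vectors)

print(independent([(1, 0, 1), (0, 1, 1)]))             # True
print(independent([(1, 0, 1), (0, 1, 1), (1, 1, 2)]))  # False: v3 = v1 + v2
```

Using `Fraction` keeps the elimination exact, so there is no floating-point ambiguity about whether a pivot is "really" zero.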
Suppose that $V$ is spanned by $n$ vectors. If $m > n$, then any set of $m$ vectors in $V$ must be linearly dependent.
- We are saying that if $V$ is all the possible linear combinations of $n$ vectors, then any larger set of vectors must be linearly dependent.
Basis:
Think of a basis as the generators for a vector space: the vector space is the span of the basis vectors, and the basis has to be linearly independent.
Basis for $\mathbb{R}^2$:
The most common basis of $\mathbb{R}^2$ is the standard basis $\{(1, 0), (0, 1)\}$.
Note, one vector space can have multiple bases, such as $\{(1, 1), (1, -1)\}$ for $\mathbb{R}^2$.
In general, there is no unique basis for a vector space. In the example above there are actually infinitely many bases.
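For $\mathbb{R}^2$ specifically, two vectors form a basis exactly when their $2 \times 2$ determinant is nonzero, which makes "multiple bases" easy to check. A small sketch (the basis candidates are my own illustrative picks):

```python
# Two vectors u, v form a basis of R^2 iff det([u v]) = u1*v2 - u2*v1 != 0.
def is_basis_r2(u, v):
    return u[0] * v[1] - u[1] * v[0] != 0

print(is_basis_r2((1, 0), (0, 1)))    # True: the standard basis
print(is_basis_r2((1, 1), (1, -1)))   # True: a different basis
print(is_basis_r2((1, 2), (2, 4)))    # False: the vectors are dependent
```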
Basis for $\mathbb{Q}(\sqrt{2})$:
Remember: $\mathbb{Q}(\sqrt{2}) = \{a + b\sqrt{2} : a, b \in \mathbb{Q}\}$.
The sets $\{1, \sqrt{2}\}$ and $\{1 + \sqrt{2},\; 1 - \sqrt{2}\}$ are both linearly independent over $\mathbb{Q}$ and span $\mathbb{Q}(\sqrt{2})$, so each is a basis.
All bases for a vector space have the same length (number of vectors).
This length of the basis is called the dimension of the vector space.
Ending theorems and notes
Let $V$ be a vector space of dimension $n$. Then:
- Any set of $n$ linearly independent vectors is a basis for $V$.
- Any set of $n$ vectors that spans $V$ is a basis for $V$.
- For any $m < n$, for every set of $m$ linearly independent vectors there exists a set of $n - m$ vectors which can be adjoined to create a basis for $V$ (the basis has to have length $n$).
Linear Transformations Abstract Algebra
For vector spaces $V$ and $W$ over the same field $F$, a map $T: V \to W$ satisfying $T(u + v) = T(u) + T(v)$ and $T(\alpha v) = \alpha T(v)$ for all $u, v \in V$ and $\alpha \in F$
is a linear transformation from $V$ to $W$.
Notice that this is just a homomorphism preserving the structure of vector spaces.
Kernel:
The kernel in linear algebra is just the null space: it's all the vectors in $V$ that map to $0_W$, i.e. $\ker T = \{v \in V : T(v) = 0_W\}$.
The kernel is always a subspace of $V$.
Proof:
- Closure under vector addition
  - Let $u$ and $v$ both be in $\ker T$. Then $T(u + v) = T(u) + T(v) = 0 + 0 = 0$, which means $u + v$ is clearly also in the kernel.
- Closure under scalar multiplication
  - We have to show that if $v$ is in the kernel, then $\alpha v$ is also in the kernel. Assume $v$ is in the kernel; then $T(\alpha v) = \alpha T(v) = \alpha \cdot 0 = 0$.
  - So clearly $\alpha v$ is also in the kernel.
Also, remember that a homomorphism is injective if and only if its kernel is trivial.
Example
Let's define a linear transformation $T: M_{2 \times 2}(\mathbb{R}) \to \mathbb{R}^2$ with the following:
$$T\begin{pmatrix} a & b \\ c & d \end{pmatrix} = (a + b,\; c + d)$$
Now, this is a linear transformation, as for any two matrices $A$ and $B$ and any scalar $\alpha$:
First, let's notice that it respects vector addition: $T(A + B) = T(A) + T(B)$, since each entry of $A + B$ is the sum of the corresponding entries.
Now let's notice that it respects scalar multiplication: $T(\alpha A) = \alpha T(A)$.
The kernel of this linear transformation is also easy to compute; we just need all matrices that map to $(0, 0)$.
Clearly this happens when the following system of equations holds true: $a + b = 0$ and $c + d = 0$.
Given this, this only happens when $b = -a$ and $d = -c$.
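The linearity and kernel conditions can be checked directly in code. This sketch assumes a map like $T\begin{pmatrix} a & b \\ c & d \end{pmatrix} = (a + b, c + d)$ (one plausible concrete choice for the example; helper names are my own):

```python
# Hypothetical map on 2x2 real matrices: T([[a, b], [c, d]]) = (a + b, c + d).
def T(A):
    return (A[0][0] + A[0][1], A[1][0] + A[1][1])

def madd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def msmul(c, A):
    return [[c * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, -1], [5, 2]]

# Linearity: T respects matrix addition and scalar multiplication.
assert T(madd(A, B)) == tuple(x + y for x, y in zip(T(A), T(B)))
assert T(msmul(3, A)) == tuple(3 * x for x in T(A))

# Kernel membership: b = -a and d = -c forces T to return (0, 0).
K = [[7, -7], [-2, 2]]
assert T(K) == (0, 0)
```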