You might want to understand 1A. Preliminaries and Introduction prior to this.

To put it plainly, “this is kinda a mind fuck.” Class notes are accessible under April 25th. This is needed to understand Fields, specifically Extension Fields and Splitting Fields.

How to think about it

If you have done linear algebra or multivariable calc, this might work against you here. We are usually used to two or three dimensional vector spaces, such as $\mathbb{R}^2$ or $\mathbb{R}^3$, which can be added together or multiplied by a scalar.

However, in abstract algebra, the goal is to strip out the physical attributes and think of it more abstractly. Then you can notice that we can define vectors in $n$ dimensions with some simple definitions, and many more things that we usually do not think about can become vectors, such as sets in Cryptography or Abstract Algebra for ML, where you might have attributes in 10!+ dimensions.

Many examples from the book were done on paper.

Definition of a Vector Space

A vector space $V$ over a field $F$ is an abelian group with a scalar product $\alpha v$ defined for all $\alpha \in F$ and all $v \in V$, which satisfies the following axioms given $\alpha, \beta \in F$ (scalars) and $u, v \in V$ (vectors).

  1. $\alpha(\beta v) = (\alpha\beta)v$, or associativity between the scalars
  2. $(\alpha + \beta)v = \alpha v + \beta v$, distributivity of scalar sums over vectors
  3. $\alpha(u + v) = \alpha u + \alpha v$, distributivity of scalars over vector sums
  4. $1v = v$, the identity. Notice, most of the time two vectors can not be multiplied. It is always scalar times vector. They can, however, be added. I’ll differentiate the additive scalar identity and additive vector identity with $0$ and $\mathbf{0}$.

Examples:

$\mathbb{R}^n$ over $\mathbb{R}$ is a vector space.

Clearly, you can multiply a scalar from $\mathbb{R}$ with a vector from $\mathbb{R}^n$. We define scalar multiplication and vector addition as such: $\alpha(x_1, \dots, x_n) = (\alpha x_1, \dots, \alpha x_n)$ and $(x_1, \dots, x_n) + (y_1, \dots, y_n) = (x_1 + y_1, \dots, x_n + y_n)$. Let's show the following axioms within $\mathbb{R}^n$.

  1. Associativity: $\alpha(\beta(x_1, \dots, x_n)) = \alpha(\beta x_1, \dots, \beta x_n) = (\alpha\beta x_1, \dots, \alpha\beta x_n) = (\alpha\beta)(x_1, \dots, x_n)$
  2. First Distributivity: $(\alpha + \beta)(x_1, \dots, x_n) = ((\alpha + \beta)x_1, \dots, (\alpha + \beta)x_n) = (\alpha x_1 + \beta x_1, \dots, \alpha x_n + \beta x_n) = \alpha(x_1, \dots, x_n) + \beta(x_1, \dots, x_n)$
  3. Second distributivity: $\alpha\big((x_1, \dots, x_n) + (y_1, \dots, y_n)\big) = (\alpha(x_1 + y_1), \dots, \alpha(x_n + y_n)) = (\alpha x_1 + \alpha y_1, \dots, \alpha x_n + \alpha y_n) = \alpha(x_1, \dots, x_n) + \alpha(y_1, \dots, y_n)$
  4. Identity: $1(x_1, \dots, x_n) = (x_1, \dots, x_n)$. Clearly $\mathbb{R}^n$ is a vector space over $\mathbb{R}$. For the future I will be less rigorous with examples unless absolutely needed, and once we have built more intuition. (A quick numeric check follows below.)
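As a quick numeric sanity check of the first axiom (my own numbers, not from the book), take $n = 2$, $\alpha = 2$, $\beta = 3$, and $v = (1, 4)$:

$$\alpha(\beta v) = 2\big(3(1, 4)\big) = 2(3, 12) = (6, 24) = 6(1, 4) = (\alpha\beta)v.$$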

$F[x]$ forms a vector space over the field $F$.

Def:

Vector addition is just polynomial addition, and scalar multiplication is just multiplying the polynomial by an element of $F$. This is easy to see, so I will skip the proof; a concrete instance is below.
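For instance (a concrete sketch of my own, taking $F = \mathbb{R}$), in $\mathbb{R}[x]$:

$$(1 + 2x) + (3x + x^2) = 1 + 5x + x^2, \qquad 3(1 + 2x) = 3 + 6x.$$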

The set of all continuous real-valued functions on a closed interval $[a, b]$ is a vector space over $\mathbb{R}$.

$\mathbb{Q}(\sqrt{2}) = \{a + b\sqrt{2} : a, b \in \mathbb{Q}\}$ is a vector space over $\mathbb{Q}$.

Let's first define vector addition and scalar multiplication; then it is easy to prove the axioms hold. I did the first few here and the rest on paper, but it is clear to see they hold.

Definition:
Vector addition:

For $u, v \in \mathbb{Q}(\sqrt{2})$, let $u = a + b\sqrt{2}$ and $v = c + d\sqrt{2}$ for $a, b, c, d \in \mathbb{Q}$. Then we define $u + v$ as $$u + v = (a + c) + (b + d)\sqrt{2}.$$

Scalar Multiplication

For $\alpha \in \mathbb{Q}$ and the same $u$ we used above, $$\alpha u = \alpha a + (\alpha b)\sqrt{2}.$$

Proof (of the first two axioms):

Let's prove that $\mathbb{Q}(\sqrt{2})$ is a vector space over $\mathbb{Q}$.

  1. Associativity: $\alpha(\beta u) = \alpha(\beta a + \beta b\sqrt{2}) = (\alpha\beta)a + (\alpha\beta)b\sqrt{2} = (\alpha\beta)u$
  2. First distributive property: $(\alpha + \beta)u = (\alpha + \beta)a + (\alpha + \beta)b\sqrt{2} = (\alpha a + \alpha b\sqrt{2}) + (\beta a + \beta b\sqrt{2}) = \alpha u + \beta u$
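For completeness, here is a sketch of the remaining two axioms (the ones done on paper), using the same $u$ and $v$ as above:

$$\alpha(u + v) = \alpha\big((a + c) + (b + d)\sqrt{2}\big) = (\alpha a + \alpha c) + (\alpha b + \alpha d)\sqrt{2} = \alpha u + \alpha v,$$

$$1u = 1 \cdot a + (1 \cdot b)\sqrt{2} = a + b\sqrt{2} = u.$$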

Properties:

If $V$ is a vector space over $F$, then the following properties hold:

  1. $0v = \mathbf{0}$ for all $v \in V$
  2. $\alpha\mathbf{0} = \mathbf{0}$ for all $\alpha \in F$
  3. If $\alpha v = \mathbf{0}$, then either $\alpha = 0$ or $v = \mathbf{0}$
  4. $(-1)v = -v$ for all $v \in V$
  5. $-(\alpha v) = (-\alpha)v = \alpha(-v)$ for all $v \in V$ and $\alpha \in F$
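As a sketch of why the first property holds (a derivation from the axioms, not from the book): since $0 = 0 + 0$ in $F$,

$$0v = (0 + 0)v = 0v + 0v,$$

and adding $-(0v)$ to both sides gives $\mathbf{0} = 0v$.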

Subspaces

Similar to subgroups for groups and subrings for rings, we have subspaces for vector spaces.

Definition of a Subspace

Let $V$ be a vector space over $F$, and let $W$ be some subset of $V$. Then $W$ is said to be a subspace of $V$ iff:

  1. $W$ is closed under vector addition and scalar multiplication, i.e. $u + v \in W$ and $\alpha v \in W$ for all $u, v \in W$ and all scalars $\alpha \in F$.

Example

A quick example is the subset $W$ of $F[x]$ where polynomials in $W$ have no odd-powered terms. (Remember $F[x]$ is a vector space over $F$.) Clearly $p + q \in W$ for $p, q \in W$, as the odd terms will always have coefficients $0 + 0 = 0$. Similarly $\alpha p \in W$ for any $\alpha \in F$, as again the coefficients of the odd terms are $\alpha \cdot 0 = 0$.
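A concrete instance (my own numbers): take $p = 1 + x^2$ and $q = 3x^2 + x^4$ in $W$. Then

$$p + q = 1 + 4x^2 + x^4 \in W, \qquad 5p = 5 + 5x^2 \in W,$$

so both results still have only even-powered terms.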

Linear Combination

Definition of Linear Combination

$v = \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n$ is a linear combination of the vectors $v_1, v_2, \dots, v_n$.

Definition of Spanning set

The spanning set of vectors $S = \{v_1, \dots, v_n\}$ is the set of all possible linear combinations of those vectors.

If a set $V$ is the spanning set of vectors $S$, we say $V$ is spanned by $S$.

If $S$ is a subset of a vector space $V$, then the span of $S$ is a subspace of $V$.

Firstly, the span of $S$ is all the vectors of the form $\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n$.

Vector addition closure:

For two elements $u, w$ in $\operatorname{Span}(S)$, we have $u = \alpha_1 v_1 + \cdots + \alpha_n v_n$ and $w = \beta_1 v_1 + \cdots + \beta_n v_n$. Then $$u + w = (\alpha_1 + \beta_1)v_1 + \cdots + (\alpha_n + \beta_n)v_n,$$ which is also in $\operatorname{Span}(S)$.

Scalar multiplication closure:

For any scalar $\gamma \in F$, this obviously holds, as $$\gamma u = (\gamma\alpha_1)v_1 + \cdots + (\gamma\alpha_n)v_n \in \operatorname{Span}(S).$$
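A concrete span to keep in mind (my own example): in $\mathbb{R}^3$, take $S = \{(1,0,0), (0,1,0)\}$. Then

$$\operatorname{Span}(S) = \{\alpha(1,0,0) + \beta(0,1,0) : \alpha, \beta \in \mathbb{R}\} = \{(\alpha, \beta, 0) : \alpha, \beta \in \mathbb{R}\},$$

the $xy$-plane, which is indeed a subspace of $\mathbb{R}^3$.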

Linear Independence (in Abstract Algebra)

Like in linear algebra, a set of vectors $\{v_1, \dots, v_n\}$ is said to be linearly dependent if there exist scalars $\alpha_1, \dots, \alpha_n$, not all zero, such that the linear combination $\alpha_1 v_1 + \cdots + \alpha_n v_n$ is $\mathbf{0}$. Otherwise, it is linearly independent.

I.e., a set $\{v_1, \dots, v_n\}$ is linearly independent if $$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = \mathbf{0} \implies \alpha_1 = \alpha_2 = \cdots = \alpha_n = 0.$$

Another way to look at it: a set is linearly independent if no vector in it can be written as a linear combination of the other vectors.
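A worked check (my own example): are $(1, 2)$ and $(3, 4)$ linearly independent in $\mathbb{R}^2$? Set a linear combination equal to $\mathbf{0}$:

$$\alpha(1, 2) + \beta(3, 4) = (0, 0) \implies \begin{cases} \alpha + 3\beta = 0 \\ 2\alpha + 4\beta = 0 \end{cases} \implies \alpha = \beta = 0,$$

so the set is linearly independent.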

Byproducts of linear independence:

If $\{v_1, \dots, v_n\}$ is a set of linearly independent vectors, and $\alpha_1 v_1 + \cdots + \alpha_n v_n = \beta_1 v_1 + \cdots + \beta_n v_n$, then we know $\alpha_i = \beta_i$ for all $i$.

This follows as $\alpha_1 v_1 + \cdots + \alpha_n v_n = \beta_1 v_1 + \cdots + \beta_n v_n$ means $(\alpha_1 - \beta_1)v_1 + \cdots + (\alpha_n - \beta_n)v_n = \mathbf{0}$. Thus every $\alpha_i - \beta_i = 0$, or $\alpha_i = \beta_i$.

A set $\{v_1, \dots, v_n\}$ of vectors in a vector space $V$ is linearly dependent iff one of the $v_i$ is a linear combination of the rest.

Essentially, take the following dependence relation $\alpha_1 v_1 + \cdots + \alpha_n v_n = \mathbf{0}$, where at least one scalar (say $\alpha_1$) is not zero. Then we can write $v_1$ as $$v_1 = -\frac{\alpha_2}{\alpha_1}v_2 - \cdots - \frac{\alpha_n}{\alpha_1}v_n.$$ This works for any $v_i$ whose coefficient is nonzero when the set is linearly dependent.
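For instance (again my own numbers), $\{(1, 2), (2, 4)\}$ is linearly dependent since $2(1, 2) + (-1)(2, 4) = (0, 0)$, and solving for the first vector as above gives

$$(1, 2) = -\frac{-1}{2}(2, 4) = \tfrac{1}{2}(2, 4).$$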

Suppose that $V$ is spanned by $n$ vectors. If $m > n$, then any set of $m$ vectors in $V$ must be linearly dependent.

  1. We are saying that $V$ is all the possible linear combinations of $n$ vectors. Any larger set of vectors must be linearly dependent.

Basis:

Think of a basis as the generators of a vector space: the vector space is the span of the basis vectors, and the basis has to be linearly independent.

Basis for $\mathbb{R}^3$

The standard basis of $\mathbb{R}^3$ would be $\{e_1 = (1,0,0),\ e_2 = (0,1,0),\ e_3 = (0,0,1)\}$, where the entire space can be generated as a linear combination of these linearly independent vectors.

Note, one space can have multiple bases; for example, $\{(3,2,1), (3,2,0), (3,0,0)\}$ is also a basis for $\mathbb{R}^3$.

In general, there is no unique basis for a vector space. In the example above there are actually infinitely many bases.

Basis for $\mathbb{Q}(\sqrt{2})$

Remember: $\mathbb{Q}(\sqrt{2}) = \{a + b\sqrt{2} : a, b \in \mathbb{Q}\}$. The sets $\{1, \sqrt{2}\}$ and $\{1 + \sqrt{2}, 1 - \sqrt{2}\}$ are both bases for this space.
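To see why the second set also works (a quick check of my own), note that it recovers the first basis:

$$1 = \tfrac{1}{2}\big((1 + \sqrt{2}) + (1 - \sqrt{2})\big), \qquad \sqrt{2} = \tfrac{1}{2}\big((1 + \sqrt{2}) - (1 - \sqrt{2})\big),$$

so anything spanned by $\{1, \sqrt{2}\}$ is also spanned by $\{1 + \sqrt{2}, 1 - \sqrt{2}\}$.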

All bases for a vector space are of the same length.

This length of the basis is called the dimension of the vector space.

Ending theorems and notes

Let $V$ be a vector space of dimension $n$. Then we have the following:

  1. Any set of $n$ linearly independent vectors is a basis for $V$
  2. Any set of $n$ vectors that spans $V$ is a basis for $V$
  3. For any $m < n$, for every set of $m$ linearly independent vectors there exists a set of $n - m$ vectors which can be adjoined to create a basis for $V$ (the basis has to be length $n$; an illustration follows below)
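As an illustration of the third point (my own example, with $n = 3$ and $m = 1$): the linearly independent set $\{(1, 1, 0)\}$ in $\mathbb{R}^3$ can be joined with $n - m = 2$ more vectors to form a basis,

$$\{(1, 1, 0),\ (0, 1, 0),\ (0, 0, 1)\},$$

which is linearly independent and spans $\mathbb{R}^3$.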

Linear Transformations (in Abstract Algebra)

For some vector spaces $V$ and $W$ over $F$, we can define a linear transformation as the following map $T : V \to W$ preserving scalar multiplication and vector addition: $$T(u + v) = T(u) + T(v) \qquad \text{and} \qquad T(\alpha v) = \alpha T(v).$$ This is a linear transformation from $V$ to $W$. Note: $u, v \in V$ and $\alpha \in F$.

Notice that this is just a homomorphism preserving the structures of vector spaces.

Kernel:

The kernel in linear algebra is just the null space: it is all the vectors $v \in V$ such that $T(v) = \mathbf{0}$, i.e. $\ker(T) = \{v \in V : T(v) = \mathbf{0}\}$.

The kernel is always a subspace of $V$.

Proof:

  1. Closure with vector addition
    1. Let $u, v$ both be in $\ker(T)$. Then $$T(u + v) = T(u) + T(v) = \mathbf{0} + \mathbf{0} = \mathbf{0},$$ which means $u + v$ is clearly also in the kernel.
  2. Closure with scalar multiplication
    1. We have to show that if $v$ is in the kernel then $\alpha v$ is also in the kernel. Assume $v$ is in the kernel, then: $$T(\alpha v) = \alpha T(v) = \alpha\mathbf{0} = \mathbf{0}.$$
    2. So clearly $\alpha v$ is also in the kernel.

Also, remember that a homomorphism is injective if and only if the kernel is trivial.
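A sketch of one direction (trivial kernel implies injective), using linearity:

$$T(u) = T(v) \implies T(u) - T(v) = \mathbf{0} \implies T(u - v) = \mathbf{0} \implies u - v \in \ker(T) \implies u - v = \mathbf{0} \implies u = v.$$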

Example

Let's define a linear transformation $T : \mathbb{R}^2 \to \mathbb{R}^2$ with the following, writing vectors as $2 \times 1$ matrices and taking, for example, $A = \begin{pmatrix} 3 & 1 \\ 1 & 2 \end{pmatrix}$: $$T(v) = Av.$$ Now, this is a linear transformation, as for any two matrices $u, v \in \mathbb{R}^2$, matrix multiplication distributes over addition and commutes with scaling. First, let's notice that it holds for vector addition: $$T(u + v) = A(u + v) = Au + Av = T(u) + T(v).$$ Now let's notice that it holds for scalar multiplication: $$T(\alpha v) = A(\alpha v) = \alpha(Av) = \alpha T(v).$$ The kernel of this linear transformation is also easy to compute; we just need all $v = (x, y)$ such that $T(v) = \mathbf{0}$.

Clearly this happens when the following system of equations holds true: $$3x + y = 0, \qquad x + 2y = 0.$$ Given this, this only happens when $x$ and $y$ are both zero, so the kernel has to be $\{(0, 0)\}$, or trivial.
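A quick sanity check on the example matrix above (an observation of mine, using the determinant): since

$$\det A = \det\begin{pmatrix} 3 & 1 \\ 1 & 2 \end{pmatrix} = 3 \cdot 2 - 1 \cdot 1 = 5 \neq 0,$$

the system $Av = \mathbf{0}$ has only the trivial solution, which matches the kernel we found; by the fact above, $T$ is also injective.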