Algebraic Structures
This chapter covers notes on algebraic structures.
Operations
We begin by defining an operation. As we saw in the last chapter, an operation is a kind of map. We now provide an explicit definition:
operation. Let $A$ be a set and $n \in \mathbb{N}$. Then the operation $\star$ is a function $\star : A^n \to A$ that assigns every tuple $(a_1, \ldots, a_n) \in A^n$ exactly one element $\star(a_1, \ldots, a_n) \in A$, called the result of $\star$ on $(a_1, \ldots, a_n)$. We call the symbol $\star$ an operator, the function $\star$ the operation, $A$ the carrier, and $n$ the arity of $\star$. If $n = 3$, then $\star : A^3 \to A$, in which case we call $\star$ a ternary operation. If $n = 2$, then $\star : A^2 \to A$, and $\star$ is called a binary operation. If $n = 1$, then $\star : A \to A$, and $\star$ is called a unary operation. If $n = 0$, then $\star$ is called a nullary operation, or constant.
In these materials, we will focus primarily on binary and unary operations. From the definition above, we can say that, given $a, b \in A$, a unary operation $\star$ may be written as $\star(a)$ or $\star a$, and a binary operation may be written as $\star(a, b)$ or $a \star b$. However, we don't usually think of operations as functions (though we will soon enough), and the way operations are written depends heavily on the problem at hand:
| Operation | Equivalent | Type | Example |
|---|---|---|---|
| $\star\,a\,b$ | $\star(a, b)$ | Binary prefix | $+\,a\,b$ |
| $a \star b$ | $\star(a, b)$ | Binary infix | $a + b$ |
| $a\,b\,\star$ | $\star(a, b)$ | Binary postfix | $a\,b\,+$ |
| $\star a$ | $\star(a)$ | Unary prefix | $-a$ |
| $a \star$ | $\star(a)$ | Unary postfix | $a!$ |
| $\lvert a \rvert$ | $\star(a)$ | Delimited | $\lvert -3 \rvert$ |
In general, we'll use the notation $a \star b$ for binary operations, and the notation $\star a$ for unary operations. For each of the operations above, we call $a$ and $b$ the operands of $\star$.
Something to observe right off the bat: Since $\star$ is a function, it follows that, where $a = c$ and $b = d$, we have $a \star b = c \star d$. Furthermore, note that the operator must always return some element in $A$. It can't return an element not in the carrier. In other words, for the operation $\star : A^2 \to A$, the only things that exist in the universe are the things in its carrier $A$. For example, suppose $A = \{1, -1\}$.
One operation we might define on this set is $\times$ (multiplication). This is a valid binary operation, since the operator always returns an element of the set: $1 \times 1 = 1$, $1 \times (-1) = -1$, and $(-1) \times (-1) = 1$.
Returning to the definition of an operation, there are two nuances we must always keep in mind. First, $a \star b$ must be uniquely defined. That is, it should be crystal clear what the element $a \star b$ is. For example, suppose we defined an operation $\star$ on the integers as: $a \star b$ is an integer whose square is $ab$. This would not qualify as an operation. Why? Because if $a = 2$ and $b = 8$, then $ab = 16$, and both $4^2 = 16$ and $(-4)^2 = 16$, so $2 \star 8$ could be either $4$ or $-4$.
Second, given a carrier set $A$ and $a, b \in A$, the result $a \star b$ must be in $A$ for $\star$ to be an operation. If $a \star b \notin A$ for some $a, b \in A$, then $\star$ is what we call a partial operation. If $a \star b \in A$ for all $a, b \in A$, we say that $A$ is closed under $\star$.
closed carrier. Let $A$ be a carrier set and $\star$ be an operation on $A$. We say that $A$ is closed under $\star$ if, and only if, for all $a, b$ in $A$, it is true that $a \star b \in A$. Otherwise, we say that $A$ is not closed under $\star$.
For example, subtraction is a partial operation on the set of natural numbers $\mathbb{N}$. Why? Because subtraction does not always return an element of the carrier set: $3 - 5 = -2$, and $-2 \notin \mathbb{N}$. Likewise, division is a partial operation on the set of integers, because $a \div b$ does not always result in an integer: $1 \div 2 = 0.5 \notin \mathbb{Z}$. Division, however, is an operation on the set of positive real numbers $\mathbb{R}^{+}$, because $a \div b$ always returns a positive real number when $a, b \in \mathbb{R}^{+}$. It is not an operation on the set of all real numbers, because $a \div 0$ does not return a real number.
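On a finite carrier, closure can be tested exhaustively. Here is a minimal Python sketch (the helper name `is_closed` and the sample sets are our own, for illustration); for an infinite carrier like $\mathbb{N}$ we can only probe a finite sample, but a single failing pair such as $3 - 5$ already settles the question:

```python
def is_closed(carrier, op):
    """A carrier is closed under op iff every result lands back in the carrier."""
    return all(op(a, b) in carrier for a in carrier for b in carrier)

# Multiplication is closed on {1, -1}: every product is again 1 or -1.
signs = {1, -1}
print(is_closed(signs, lambda a, b: a * b))       # True

# Subtraction is only a partial operation on a sample of naturals:
# e.g., 3 - 5 = -2 falls outside the set.
naturals = set(range(10))
print(is_closed(naturals, lambda a, b: a - b))    # False
```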
Counting Binary Operations
Suppose we have a carrier set $A = \{a, b\}$. Then the Cartesian product of this set is:

$$A \times A = \{(a, a),\ (a, b),\ (b, a),\ (b, b)\}.$$
Following the definition of an operation, an operation $\star : A \times A \to A$ maps each of the pairs above to an element of $A$. For example, some mappings might be: $(a, a) \mapsto a$, $(a, b) \mapsto b$, $(b, a) \mapsto b$, $(b, b) \mapsto a$.
Thinking more carefully about this, we can draw a few inferences. First, an operation is a mapping from each element of the Cartesian product of the carrier set to an element of the carrier set. Thus, given a carrier set $A$ where the cardinality of $A$ is $n$, the cardinality of $A \times A$ is $n^2$. And since a binary operation is a function from each element of $A \times A$ to an element of $A$, we have $n^{(n^2)}$ possible operations. Thus, for our simple carrier set $A = \{a, b\}$, we get $2^{(2^2)} = 16$ possible operations.
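The count $n^{(n^2)}$ can be verified by brute force for a two-element carrier. A sketch (the function name is ours); each operation is represented as a table mapping every pair in $A \times A$ to an element of $A$:

```python
from itertools import product

def all_binary_operations(carrier):
    """Enumerate every binary operation on a finite carrier as a dict
    mapping each pair in A x A to an element of A."""
    elems = list(carrier)
    pairs = list(product(elems, repeat=2))        # the Cartesian product A x A
    for outputs in product(elems, repeat=len(pairs)):
        yield dict(zip(pairs, outputs))

A = ['a', 'b']
ops = list(all_binary_operations(A))
print(len(ops))                                   # 16
print(len(A) ** (len(A) ** 2))                    # 16, i.e. n ** (n ** 2)
```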
Notice what this means: Without further information, if $\star$ is an operation, then $a \star b$ is a unique element. That is, $a \star b$ must be treated as a different element from $b \star a$. This means that we can never assert that $a \star b = b \star a$ unless we assume, or better yet prove, that $\star$ is commutative (we'll study commutativity shortly). Likewise, given $a$, $b$, and $c$ in the carrier set, without more, the element $(a \star b) \star c$ is a distinct element from $a \star (b \star c)$. Which means that we cannot assert that $(a \star b) \star c = a \star (b \star c)$ without assuming or proving that $\star$ is associative.
Algebraic Structure
An algebraic structure, despite its complicated-sounding name, has a surprisingly short definition:
algebraic structure. An algebraic structure, or algebra, is a tuple:

$$(A, \{\star_1, \star_2, \ldots, \star_n\})$$

where $A$ is the carrier set, and $\star_1, \star_2, \ldots, \star_n$ are operations on $A$.
From the definition above, an algebra or algebraic structure is a tuple with two objects: (1) a set, and (2) a set of operations (functions) that take some member(s) of the set as arguments, and return some member of the set as their image. That set could be anything: a set of numbers, vectors, tuples, matrices, circuits, Boolean values, chess pieces, politicians, etc. Given that operations are functions, functions are relations, and relations are really just rules for how objects are paired, it follows that the moment we take a set and define rules for how things in that set can be combined, we've created an algebra.
Within reason, we may denote an algebra with the notational form $(A, \{\star_1, \ldots, \star_n\})$. For example, the algebra $(\mathbb{R}, \{+\})$ is an algebra of the reals and addition. Working with just a single operation, we can omit the braces and simply write $(\mathbb{R}, +)$. If the algebra includes multiple operations, we adopt the convention of using braces. For example, $(\mathbb{R}, \{+, \times, -, \div\})$ is an algebra of the reals and the operations of addition, multiplication, subtraction, and division.
Axiomatic Properties
We begin by presenting axiomatic properties. These are properties of an algebra's carrier set or operations that appear so frequently in proofs that it would be inconvenient to state the same assumptions over and over again. Accordingly, we encapsulate the body of assumptions into a single package called an axiomatic property. This practice isn't unique to abstract algebra; we see it in all other areas of mathematics.
Commutativity
We say that an operation $\star$ is commutative on a set $A$ if, and only if, for all $a, b \in A$, it is true that $a \star b = b \star a$.
commutativity. Let $A$ be a carrier set and $\star$ an operation on $A$. Then $\star$ is commutative if, and only if, for all $a, b \in A$, it is true that $a \star b = b \star a$. Otherwise, we say that the operation is noncommutative.
Some examples: The operation of $+$ on the integers is commutative: $a + b = b + a$. Likewise, the operation of $\times$ is commutative on the integers: $a \times b = b \times a$. The operation of division on the reals, however, is not: $1 \div 2 \neq 2 \div 1$. We say that addition is commutative on the reals, and division is noncommutative on the reals. Subtraction is not commutative on the integers either, with the single exception that $a - b = b - a$ when $a = b$. This does not, however, motivate us to say "partially commutative," because an operation is commutative if, and only if, the property holds for all members of the carrier. Even if there were such a motivation, it wouldn't be very useful. (We follow the ancient mantra of mathematics: if it does not need to be said/defined, do not say/define it.)
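Because commutativity must hold for all pairs, a single counterexample refutes it. A minimal sketch over a finite sample of the integers (the helper name `is_commutative` and the sample range are ours):

```python
def is_commutative(carrier, op):
    """An operation is commutative iff op(a, b) == op(b, a) for ALL pairs."""
    return all(op(a, b) == op(b, a) for a in carrier for b in carrier)

sample = range(-5, 6)
print(is_commutative(sample, lambda a, b: a + b))  # True
print(is_commutative(sample, lambda a, b: a * b))  # True
print(is_commutative(sample, lambda a, b: a - b))  # False: 1 - 2 != 2 - 1
```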
A few more bits of detail: First, since $\star$ is a function, the commutative property essentially tells us that $\star(a, b) = \star(b, a)$. In the world of functions, we say that $\star$ is a symmetric function. Given $a \star b$ where $\star$ is commutative, we may use the cute phrase, "$a$ commutes with $b$, and $b$ commutes with $a$." Thus, when an operation $\star$ on a set is commutative, the elements that are placed on the bus are free to sit wherever they'd like; $\star$ will send them where they must regardless.
Associativity
Next, we say that an operation $\star$ is associative if, and only if, for all $a, b, c \in A$, we can rest assured that $(a \star b) \star c = a \star (b \star c)$. Associativity is a property that arises when we have more than two operands; with just one or two operands, associativity is unnecessary, because commutativity/noncommutativity answers all of our questions. When we have more than two operands, however, we're confronted with a question that commutativity can't answer: Given $a \star b \star c$, can we do $b \star c$ first? Put differently, associativity tells us whether the grouping of the operands is at our discretion. If an operation is nonassociative, then no: there are rules to follow. If an operation is associative, then yes: any grouping yields the same result.
associativity. Let $A$ be a carrier set and $\star$ an operation on $A$. Then $\star$ is associative if, and only if, for all $a, b, c \in A$, it is true that $(a \star b) \star c = a \star (b \star c)$. Otherwise, we say that the operation is nonassociative.
Once more, since $\star$ is just a function, we can think of associativity in functional terms as $\star(\star(a, b), c) = \star(a, \star(b, c))$. Importantly, the fact that an operation is associative does not imply that it's commutative. For example, the concatenation operation ++ on strings is associative, but noncommutative. For those unfamiliar, ++ is an operation that takes two sequences of symbols, and concatenates, or joins them, into a single sequence of symbols. For example, "ab" ++ "cd" = "abcd". Thus, it follows that ("ab" ++ "cd") ++ "ef" = "ab" ++ ("cd" ++ "ef") = "abcdef", but it's not the case that "ab" ++ "cd" = "cd" ++ "ab".
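Python's `+` on strings is concatenation, so the associative-but-noncommutative behavior is easy to witness directly:

```python
a, b, c = "ab", "cd", "ef"

# Associative: the grouping does not matter.
print((a + b) + c == a + (b + c))  # True: both sides are "abcdef"

# Noncommutative: the order does matter.
print(a + b)                       # abcd
print(b + a)                       # cdab
print(a + b == b + a)              # False
```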
The same goes in the other direction: commutativity does not imply associativity. Perhaps the most important example of this in real life is floating-point addition (and, by extension, multiplication). $a + b = b + a$ is always true of floats, but $(a + b) + c$ is not guaranteed to equal $a + (b + c)$ because of precision loss, and the likelihood of a mismatch grows as the operands' magnitudes drift apart. Failing to consider this fact can kill; the Patriot Missile Failure of 1991 killed 28 U.S. soldiers and injured about 100 others because of incorrect floating-point arithmetic.
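Both claims are easy to check with IEEE 754 doubles, which is what Python floats are. The first example is the classic 0.1/0.2/0.3 rounding mismatch; the second shows how a large magnitude gap swallows small addends one at a time:

```python
x, y, z = 0.1, 0.2, 0.3

print(x + y == y + x)               # True: float addition is commutative
print((x + y) + z == x + (y + z))   # False: rounding differs per grouping
print((x + y) + z)                  # 0.6000000000000001
print(x + (y + z))                  # 0.6

# Magnitude gaps make it worse: 1.0 is below the spacing of doubles near 1e16,
# so added one at a time it vanishes, while 1.0 + 1.0 = 2.0 survives.
big, tiny = 1e16, 1.0
print((big + tiny) + tiny == big + (tiny + tiny))  # False
```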
Identity
The next property we're interested in is whether the carrier set contains an identity.
identity element. Let $A$ be a carrier set and $\star$ be an operation on $A$. If, and only if, there exists an element $e \in A$ such that, for all $a \in A$, $e \star a = a$ and $a \star e = a$, then we say that $e$ is an identity element of $A$.
For example, for the algebra $(\mathbb{Z}, +)$, the identity element is $0$. For the algebra $(\mathbb{Z}, \times)$, the identity element is $1$. Not all algebras have identity elements. The pair $(\mathbb{O}, +)$, where $\mathbb{O}$ is the set of all odd natural numbers, has no identity element: adding one odd natural to another always returns an even natural (indeed, $\mathbb{O}$ is not even closed under $+$), so no element of $\mathbb{O}$ can leave another unchanged.
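On a finite carrier we can search for an identity directly. A sketch (the helper name `find_identity` and the sample sets are ours); note that both sides of the definition are checked:

```python
def find_identity(carrier, op):
    """Return the element e with op(e, a) == op(a, e) == a for all a, else None."""
    for e in carrier:
        if all(op(e, a) == a and op(a, e) == a for a in carrier):
            return e
    return None

sample = range(-5, 6)
print(find_identity(sample, lambda a, b: a + b))  # 0
print(find_identity(sample, lambda a, b: a * b))  # 1

odds = [1, 3, 5, 7, 9]
print(find_identity(odds, lambda a, b: a + b))    # None
```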
We must be careful with the identity element's definition; it's deceptively simple. There are three pieces to the definition: (1) $e \star a = a$, (2) $a \star e = a$, and (3) conditions 1 and 2 hold for all the elements of the carrier set. Thus, if we want to prove that some element is an identity element, we must prove all three of these pieces. As it turns out, we can satisfy condition 1 without satisfying condition 2, condition 2 without condition 1, or neither condition 1 nor condition 2. This leads to the notion of a one-sided identity.
one-sided identity. Let $A$ be a carrier set and $\star$ an operation on $A$. Given an element $e \in A$, we say that $e$ is a left-sided identity if, and only if, $e \star a = a$ for all $a \in A$. We say that $e$ is a right-sided identity if, and only if, $a \star e = a$ for all $a \in A$. If $e$ is left-sided but not right-sided, then $e$ is a strictly left-sided identity, and if $e$ is right-sided but not left-sided, then $e$ is a strictly right-sided identity. If $e$ is either strictly left-sided or strictly right-sided, we say that $e$ is a one-sided identity. If $e$ is both left-sided and right-sided, then $e$ is an identity element.
Suppose we're given an algebra $(A, \star)$. We're told that $A$ has the left identity $e_L$ and the right identity $e_R$. From the definition above, we can infer that, given $a \in A$, $e_L \star a = a$ and $a \star e_R = a$.

What if we compare $e_L$ and $e_R$? Well, a simple substitution shows:

$$e_L = e_L \star e_R = e_R.$$

The first equality holds because $e_R$ is a right identity, and the second because $e_L$ is a left identity. That is, $e_L = e_R$. Consequently, we have the following lemma.
lemma. Given a carrier set $A$ and a binary operation $\star$ on $A$, there is at most one identity element with respect to $\star$.
Inverse
If a carrier set has an identity element, we often want to ask whether every element of the carrier set has a left inverse, a right inverse, or both a left and a right inverse. Here is the definition of a left inverse:
left inverse. Let $A$ be a carrier set, $\star$ an operation on $A$, and $e$ the identity element of $A$. We say that $a \in A$ has a left inverse if, and only if, there exists an element $a^{-1} \in A$ such that $a^{-1} \star a = e$.
And here is the definition of the right inverse:
right inverse. Let $A$ be a carrier set, $\star$ an operation on $A$, and $e$ the identity element of $A$. We say that $a \in A$ has a right inverse if, and only if, there exists an element $a^{-1} \in A$ such that $a \star a^{-1} = e$.
If every element of the carrier set has both a right and a left inverse, we say that every element of the carrier set has an inverse.
inverse. Let $A$ be a carrier set, $\star$ be an operation on $A$, and $e$ the identity element of $A$. An element $a \in A$ has an inverse if, and only if, there exists an element $a^{-1} \in A$ such that $a \star a^{-1} = a^{-1} \star a = e$.
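On a finite carrier, the inverse condition can also be checked by search. A sketch (the helper name `has_inverses` is ours, and modular arithmetic on $\{0, \ldots, 4\}$ is our stand-in finite example):

```python
def has_inverses(carrier, op, e):
    """True iff every a has some b with op(a, b) == op(b, a) == e."""
    return all(
        any(op(a, b) == e and op(b, a) == e for b in carrier)
        for a in carrier
    )

mod5 = range(5)

def add_mod5(a, b):
    return (a + b) % 5

def mul_mod5(a, b):
    return (a * b) % 5

# Addition mod 5: the inverse of a is (5 - a) % 5, so every element has one.
print(has_inverses(mod5, add_mod5, 0))   # True

# Multiplication mod 5: fails, because 0 has no multiplicative inverse.
print(has_inverses(mod5, mul_mod5, 1))   # False
```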
Distributivity
The third property we'll consider requires us to introduce a second binary operator ⍝. Thus, we now have an algebra $(A, \{\star, ⍝\})$, where $A$ is a carrier set.¹ With two operations, a question we're interested in is whether the operations are left distributive, right distributive, or both left and right distributive.
Left-distributivity is defined as follows:
left-distributivity. Let $A$ be a carrier, and $\star$ and ⍝ be operations on $A$. We say that ⍝ is left-distributive over $\star$ if, and only if, for all $a, b, c \in A$: $a ⍝ (b \star c) = (a ⍝ b) \star (a ⍝ c)$.
Right-distributivity is defined as follows:
right-distributivity. Let $A$ be a carrier, and $\star$ and ⍝ be operations on $A$. We say that ⍝ is right-distributive over $\star$ if, and only if, for all $a, b, c \in A$: $(a \star b) ⍝ c = (a ⍝ c) \star (b ⍝ c)$.
If an operation ⍝ is both left- and right-distributive over an operation $\star$, we say that ⍝ distributes over $\star$, or ⍝ is distributive over $\star$.
distributivity. Let $A$ be a carrier, and $\star$ and ⍝ be operations on $A$. We say that ⍝ is distributive over $\star$ if, and only if, for all $a, b, c \in A$: $a ⍝ (b \star c) = (a ⍝ b) \star (a ⍝ c)$ and $(a \star b) ⍝ c = (a ⍝ c) \star (b ⍝ c)$.
A few things to observe from the definition of distributivity: First, it's a property that always involves two distinct operations. Accordingly, the statement "multiplication is distributive" is akin to an incomplete sentence in natural language; it's ambiguous. Why? Because the adjective "distributive" entails that the distributive "thing" distributes. And if something distributes, there's always a distributor (the thing that distributes) and a distributee (the thing that receives the distribution). Thus, when we state that "⍝ is distributive" without more, the audience is told the distributor, but not the distributee, which leaves them with no way of verifying whether the statement is true or false; it's ambiguous. And in the world of mathematics, ambiguous statements are useless. Accordingly, to state that something is distributive, the statement must always be in either of these forms:
⍝ is distributive over $\star$

⍝ distributes over $\star$
Second, distributivity is our first encounter with a property that relates one operation to another. We'll see that there are many other such properties, but distributivity is by far the most common. Casting distributivity in functional terms, suppose $f(a, b) = a ⍝ b$ and $g(a, b) = a \star b$. If ⍝ is left-distributive over $\star$, we get:

$$f(a, g(b, c)) = g(f(a, b), f(a, c)).$$

If ⍝ is right-distributive over $\star$, we get:

$$f(g(a, b), c) = g(f(a, c), f(b, c)).$$

And if ⍝ is distributive over $\star$, both equations hold.
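The functional formulation translates directly into code. A sketch over a finite sample of the integers (the helper names are ours), using $f$ for the distributor and $g$ for the distributee as above:

```python
def is_left_distributive(carrier, f, g):
    """f is left-distributive over g iff f(a, g(b, c)) == g(f(a, b), f(a, c))."""
    return all(
        f(a, g(b, c)) == g(f(a, b), f(a, c))
        for a in carrier for b in carrier for c in carrier
    )

def is_right_distributive(carrier, f, g):
    """f is right-distributive over g iff f(g(a, b), c) == g(f(a, c), f(b, c))."""
    return all(
        f(g(a, b), c) == g(f(a, c), f(b, c))
        for a in carrier for b in carrier for c in carrier
    )

sample = range(-3, 4)
mul = lambda a, b: a * b
add = lambda a, b: a + b

print(is_left_distributive(sample, mul, add))   # True:  a*(b+c) == a*b + a*c
print(is_right_distributive(sample, mul, add))  # True:  (a+b)*c == a*c + b*c
print(is_left_distributive(sample, add, mul))   # False: a+(b*c) != (a+b)*(a+c)
```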
Axiomatic Algebras
Having defined the axiomatic properties, we turn to axiomatic algebras. Like the axiomatic properties, an axiomatic algebra is an algebra that mathematicians have given a special name, because (1) the algebra has some interesting combination of the axiomatic properties and (2) the algebra appears over and over again in many proofs.
Before we look at the algebras themselves, let's establish some conventions associated with algebras. As we saw, an algebra is a tuple $(A, \{\star_1, \ldots, \star_n\})$, where $A$ is some set of elements and $\{\star_1, \ldots, \star_n\}$ is a set of operations on those elements. Now that we're discussing algebras, we'll want to refer to them without having to write the full tuple over and over again. This is particularly the case once we start talking about larger algebras. The convention in abstract algebra is to use Fraktur letters for algebraic structures. For example, an algebra $(A, \star)$ may be denoted $\mathfrak{A}$. This notation has the benefits of (1) conciseness and (2) the ability to distinguish the carrier $A$ and the algebra $\mathfrak{A}$.
Semigroup
The two most basic algebras are the semigroup and the commutative semigroup. As we'll see later, recognizing these algebras can allow us to deduce some interesting propositions. First, let's consider the semigroup, a very simple algebra:
semigroup. Let $A$ be a carrier set and $\star$ an operation on $A$. Then the algebra $(A, \star)$ is a semigroup if, and only if, $\star$ is associative on $A$.
For example, the algebras $(\mathbb{Z}, +)$, $(\mathbb{Q}, +)$, and $(\mathbb{R}, +)$ are all semigroups. The algebra $(\mathbb{Z}, -)$ is not a semigroup, because subtraction is not associative on the integers. The algebra $(\mathbb{N}, -)$ is not a semigroup, because $-$ is not even an operation on $\mathbb{N}$. Of note, we often refer to algebras with the following syntactic form:
⟨operation name⟩ ⟨algebra name⟩ of the ⟨carrier name⟩
For example, the semigroup $(\mathbb{Z}, +)$ is called the additive semigroup of the integers, and the semigroup $(\mathbb{N}, \times)$ is called the multiplicative semigroup of the naturals.
Commutative Semigroup
The next simplest algebra is the commutative semigroup:
commutative semigroup. Let $A$ be a carrier set and $\star$ an operation on $A$. Then the algebra $(A, \star)$ is a commutative semigroup if, and only if, $(A, \star)$ is a semigroup and $\star$ is commutative on $A$.
For example, the algebras $(\mathbb{Z}, +)$, $(\mathbb{Q}, +)$, $(\mathbb{R}, +)$, $(\mathbb{Z}, \times)$, $(\mathbb{Q}, \times)$, and $(\mathbb{R}, \times)$ are all commutative semigroups. The operations of addition and multiplication are commutative on $\mathbb{Z}$, $\mathbb{Q}$, and $\mathbb{R}$.
Monoid
The third simplest algebra is the monoid.
monoid. Let $A$ be a carrier set and $\star$ an operation on $A$. Then the algebra $(A, \star)$ is a monoid if, and only if, the following propositions are true: (1) $(A, \star)$ is a semigroup, and (2) there exists an identity element $e \in A$ such that $a \star e = e \star a = a$ for all $a \in A$.
For example, the algebras $(\mathbb{Z}, +)$, $(\mathbb{Q}, +)$, and $(\mathbb{R}, +)$ are monoids, because their carrier sets contain the additive identity $0$. Likewise, the algebras $(\mathbb{Z}, \times)$, $(\mathbb{Q}, \times)$, and $(\mathbb{R}, \times)$ are monoids, because their carrier sets contain the multiplicative identity $1$. In contrast, the algebra $(2\mathbb{Z}, \times)$ is a semigroup but not a monoid, because the multiplicative identity $1$ is not a member of the set of even integers $2\mathbb{Z}$.
Commutative Monoids
If, in addition to the monoid axioms, $\star$ is commutative on $A$, we say that $(A, \star)$ is a commutative monoid.
Groups
The fourth simplest algebra is the group. By far, groups are the most common of the simple algebras. In fact, they're so common that there's an entire subfield of abstract algebra dedicated to them, called group theory.
group. Let $A$ be a carrier set and $\star$ be an operation on $A$. Then the algebra $(A, \star)$ is a group if, and only if, all of the following propositions are true: (1) $\star$ is associative. (2) There exists an identity element $e \in A$ such that $a \star e = e \star a = a$ for all $a \in A$. And (3) for each $a \in A$, there exists an inverse $a^{-1} \in A$ such that $a \star a^{-1} = a^{-1} \star a = e$. If the carrier $A$ is a finite set, then we say that the group is a finite group.
Notice that from the definition above, the group is really just a monoid with an addon: every element $a$ has an inverse $a^{-1}$ such that $a \star a^{-1} = a^{-1} \star a = e$.
Let's consider some examples. The algebra $(\mathbb{Z}, +)$ is a group, called the additive group of integers. Likewise, the algebra $(\mathbb{Q}, +)$ is a group, called the additive group of rationals. The algebra $(\mathbb{N}, +)$, however, is not a group: although it contains the additive identity $0$ (we define the naturals as including zero in these materials), no positive natural number has an additive inverse in $\mathbb{N}$.
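All three group axioms can be checked mechanically on a finite carrier. A sketch (the helper name `is_group` is ours, and addition modulo 6 is our stand-in finite example, not one drawn from the text above):

```python
def is_group(carrier, op):
    """Check associativity, identity, and inverses over a finite carrier."""
    elems = list(carrier)
    # (1) Associativity.
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a in elems for b in elems for c in elems):
        return False
    # (2) Identity.
    e = next((x for x in elems
              if all(op(x, a) == a and op(a, x) == a for a in elems)), None)
    if e is None:
        return False
    # (3) Inverses.
    return all(any(op(a, b) == e and op(b, a) == e for b in elems)
               for a in elems)

# Addition mod 6 forms a finite group on {0, ..., 5}.
print(is_group(range(6), lambda a, b: (a + b) % 6))  # True

# Multiplication mod 6 does not: 0, 2, 3, and 4 have no inverses.
print(is_group(range(6), lambda a, b: (a * b) % 6))  # False
```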
Importantly, the definition of a group does not require commutativity. Thus, a given group $(A, \star)$, without further information, does not tell us whether $\star$ is commutative on $A$. If it is commutative, then we say that $(A, \star)$ is an Abelian group.
Properties of Groups
Recall that in our discussion of the identity element, we showed that an algebra can have at most one identity element. Since a group, by definition, must have an identity element, we have the following property:
group identity. Given a group $(A, \star)$, there exists exactly one element $e \in A$, called the group identity, such that, for all $a \in A$, it follows that $a \star e = e \star a = a$.
Now consider this question: Can an element have two inverses? Well, let's suppose $a$ has the inverses $b$ and $c$. Then we have:

$$a \star b = b \star a = e \quad \text{and} \quad a \star c = c \star a = e.$$

But since $\star$ is associative by definition, it follows that:

$$b = b \star e = b \star (a \star c) = (b \star a) \star c = e \star c = c.$$
Hence, we have the following property:
group inverse. Given a group $(A, \star)$ with identity $e$, each element $a \in A$ has exactly one inverse $a^{-1}$, called the group inverse of $a$, such that $a \star a^{-1} = a^{-1} \star a = e$.
Abelian Group
The term "group," without more, communicates a monoid with the additional property that every element has an inverse. When the monoid is also commutative, we have the Abelian group (named after the mathematician Niels Abel, a forefather of group theory).
abelian group. Let $A$ be a carrier set and $\star$ be an operation on $A$. Then the algebra $(A, \star)$ is an Abelian group if, and only if, all of the following propositions are true: (1) $\star$ is associative. (2) $\star$ is commutative. (3) There exists an identity element $e \in A$. And (4) for each $a \in A$, there exists an inverse $a^{-1} \in A$.
The algebras $(\mathbb{Z}, +)$, $(\mathbb{Q}, +)$, and $(\mathbb{R}, +)$ are all Abelian groups. In contrast, the algebra $(\Sigma^{*}, \text{++})$, where $\Sigma^{*}$ is the set of all strings over an alphabet $\Sigma$ and ++ is concatenation, is a semigroup but not an Abelian group (recall the earlier demonstration that string concatenation is associative but noncommutative).
Rings
We now turn to our first "big" structure: the ring. We say "big" because we're now looking at a structure with two operations.
ring. Let $\mathfrak{R}$ be the algebra $(A, \{+, \times\})$, where $A$ is a set, and $+$ and $\times$ are operations on $A$. Then $\mathfrak{R}$ is a ring if, and only if, the following propositions are true:

- $(A, +)$ is an Abelian group.
- For all $a, b, c \in A$, it follows that $(a \times b) \times c = a \times (b \times c)$. That is, $\times$ is associative.
- For all $a, b, c \in A$: $a \times (b + c) = (a \times b) + (a \times c)$ and $(a + b) \times c = (a \times c) + (b \times c)$. That is, $\times$ is distributive over $+$.
commutative ring. A ring $(A, \{+, \times\})$ is a commutative ring if, and only if, $\times$ is commutative on $A$.
The structure $(\mathbb{Z}, \{+, \times\})$ is a commutative ring. It has an additive identity, 0, and a multiplicative identity, 1. It also has additive inverses; for every $a \in \mathbb{Z}$, there exists $-a \in \mathbb{Z}$ such that $a + (-a) = 0$.
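The ring axioms, too, are finitely checkable. A sketch (the helper name `is_commutative_ring` is ours, and the integers modulo 6 are our stand-in finite example of a commutative ring); since we also require $\times$ to be commutative, checking left-distributivity suffices:

```python
def is_commutative_ring(carrier, add, mul):
    """Check the commutative-ring axioms over a finite carrier."""
    A = list(carrier)
    # (1) (A, +) is an Abelian group.
    if not all(add(add(a, b), c) == add(a, add(b, c))
               for a in A for b in A for c in A):
        return False
    if not all(add(a, b) == add(b, a) for a in A for b in A):
        return False
    zero = next((z for z in A if all(add(z, a) == a for a in A)), None)
    if zero is None or not all(any(add(a, b) == zero for b in A) for a in A):
        return False
    # (2) Multiplication is associative and commutative.
    if not all(mul(mul(a, b), c) == mul(a, mul(b, c))
               for a in A for b in A for c in A):
        return False
    if not all(mul(a, b) == mul(b, a) for a in A for b in A):
        return False
    # (3) Multiplication distributes over addition.
    return all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
               for a in A for b in A for c in A)

n = 6
print(is_commutative_ring(range(n),
                          lambda a, b: (a + b) % n,
                          lambda a, b: (a * b) % n))  # True
```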
Ordered Rings
The structure $(\mathbb{Z}, \{+, \times\})$ is an example of an ordered ring.
ordered ring. A ring $(A, \{+, \times\})$ is ordered if, and only if, there exists a nonempty set $P \subseteq A$, called the positive elements of $A$, with the following properties:

- If $a, b \in P$, then $a + b \in P$.
- If $a, b \in P$, then $a \times b \in P$.
- If $a \in A$, then exactly one of the following is true: $a \in P$, $-a \in P$, or $a = 0$.
Fields
field. A field is a structure $(A, \{+, \times\})$, where $A$ is a set and $+$ and $\times$ are binary operations on $A$, satisfying the following properties, called the field axioms:

- $(A, +)$ is an Abelian group.
- $(A \setminus \{0\}, \times)$ is an Abelian group, where $0$ is the additive identity.
- $\times$ is distributive over $+$ in $A$.
Isomorphisms
Consider the following pairs of strings: "madam" and "rotor," "paper" and "title," and "egg" and "add."
Do we see a pattern to these strings? Let's create a grid for each pair above. We'll have one row per string, one cell per character, and the following coloring rule: For all the characters of a given string $s$, if there exists a pair of positions $i \neq j$ such that $s_i = s_j$, then $s_i$ and $s_j$ are assigned the color powder blue. Otherwise, no color is assigned. Putting it all together, we have:
| m | a | d | a | m |
|---|---|---|---|---|
| r | o | t | o | r |

| p | a | p | e | r |
|---|---|---|---|---|
| t | i | t | l | e |

| e | g | g |
|---|---|---|
| a | d | d |
Notice that "madam" and "rotor" share one relationship, "paper" and "title" share another, and "egg" and "add" share yet another. All of these relationships are examples of isomorphisms: the relationship of structural similarity between objects. The string "rotor" is an isomorph of "madam," and the string "madam" is an isomorph of "rotor." Likewise, "paper" is an isomorph of "title" and vice versa, and "egg" is an isomorph of "add" and vice versa.
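One way to capture this particular structural similarity in code is to reduce each string to its repetition pattern: replace each character by the position at which it first appeared. Two strings are then isomorphs, in the sense sketched above, exactly when their patterns coincide. (The helper names `pattern` and `isomorphic` are ours, and this is only the string-repetition instance of the much broader notion of isomorphism.)

```python
def pattern(s):
    """Map each character to the index of its first occurrence,
    capturing the string's repetition structure."""
    first = {}
    return tuple(first.setdefault(ch, len(first)) for ch in s)

def isomorphic(s, t):
    """Two strings are isomorphs when their repetition patterns match."""
    return pattern(s) == pattern(t)

print(pattern("madam"))              # (0, 1, 2, 1, 0)
print(pattern("rotor"))              # (0, 1, 2, 1, 0)
print(isomorphic("madam", "rotor"))  # True
print(isomorphic("paper", "title"))  # True
print(isomorphic("egg", "add"))      # True
print(isomorphic("madam", "paper"))  # False
```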
Footnotes
1. The symbol ⍝ is called a lamp, and is used to denote a comment in the APL programming language. We use it here as a way to distinguish operations. ↩