Graph Theory
The following materials are notes on graph theory.
Graph theory is a relatively young field, and it comes with a large amount of terminology. We lay the key terms out now for expediency.
Terminology
Visually, graphs are circles connected with lines:
Each circle is called a vertex (or node), and each line is called an edge (or link). Below is a general definition of a graph. The numbers are simply objects associated with each node (what we can think of as labels or data). The choice of objects is a matter of application. They might be computers, people, chemical compounds; it's entirely up to us. Graph theory, in general, isn't particularly interested in what the vertices or edges represent. Instead, we're more interested in the general properties of graphs as objects in themselves.
graph. A graph is an ordered pair of sets $G = (V, E)$. The set $V$ is a set of objects called vertices. The set $E$ is a set of relations called edges. For all edges $\{a, b\} \in E$, we may denote an edge with a variable, e.g., $e = \{a, b\}$. Given a graph $G$ and absent ambiguity, we denote the set of all vertices of $G$ with the notation $V(G)$ and the set of all edges of $G$ with the notation $E(G)$. Given the edge $e = \{a, b\}$, we say that $a$ and $b$ are adjacent to one another, and that $a$ and $b$ are incident to $e$.
order and size. Given a graph $G = (V, E)$, the cardinality of $V$ is called the order of $G$, denoted $|V|$, and the cardinality of $E$ is called the size of $G$, denoted $|E|$.
neighborhood. The neighborhood of a vertex $v$, denoted $N(v)$, is the set of all vertices in $G$ adjacent to $v$.
degree. Given a vertex $v$, the number of edges incident to $v$ is called the degree of $v$, denoted $\deg(v)$.
density. Given a graph $G = (V, E)$, the density of $G$, denoted $D(G)$, is defined as
$$D(G) = \frac{2m}{n(n-1)},$$
where $n$ is the number of vertices and $m$ is the number of edges. If $m$ is proportional to $n^2$, then we say that $G$ is dense. Otherwise, we say that $G$ is sparse.
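To make these measures concrete, here is a minimal Python sketch, not from the notes themselves; the vertex labels, edge list, and helper names are illustrative. It computes the order, size, a neighborhood, a degree, and the density defined above for a small graph:

```python
# A small example graph: vertices and undirected edges (illustrative data).
V = {"a", "b", "c", "d"}
E = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")]

order = len(V)   # |V|, the order of the graph
size = len(E)    # |E|, the size of the graph

# Neighborhood N(v): all vertices adjacent to v.
def neighborhood(v):
    return {b for (a, b) in E if a == v} | {a for (a, b) in E if b == v}

# Degree deg(v): number of edges incident to v.
def degree(v):
    return sum(1 for (a, b) in E if v in (a, b))

# Density as defined above: D(G) = 2m / (n(n - 1)).
density = 2 * size / (order * (order - 1))

print(order, size)           # 4 4
print(neighborhood("c"))     # {'a', 'b', 'd'}
print(degree("c"))           # 3
print(round(density, 2))     # 0.67
```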
The definitions above are broad. In short, they don't preclude a graph from having vertices that connect to themselves, or "duplicate edges."
An edge $\{v, v\}$ joining a vertex to itself is an example of a loop, and two distinct edges joining the same pair of vertices are examples of parallel edges.
loop. Given a graph $G = (V, E)$ and an edge $e = \{a, b\} \in E$, the edge $e$ is a loop if, and only if, $a = b$.
parallel edges. Given a graph $G = (V, E)$, if $e_1 = \{a, b\}$ and $e_2 = \{a, b\}$ with $e_1 \neq e_2$, then $e_1$ and $e_2$ are parallel edges.
Simple Graphs
A graph with no loops or parallel edges is called a simple graph.
simple graph. A graph $G = (V, E)$ is simple if, and only if, $G$ consists of only $V$ and $E$, and $E$ contains no loops and no parallel edges. That is, for all $\{a, b\} \in E$, we have $a \neq b$, and no two distinct edges of $E$ join the same pair of vertices.
In the next section we discuss digraphs, where we introduce the notion of directed edges. Per the definition above, simple graphs have no direction. We can also see that, given a simple graph of order $n$, there are, at most, $\frac{n(n-1)}{2}$ edges. This is because the Cartesian product $V \times V$ contains $n^2$ pairs: it includes the $n$ self-loops and counts each edge between distinct vertices twice. Thus, we subtract $n$ and divide by $2$, giving $\frac{n^2 - n}{2} = \frac{n(n-1)}{2}$.
theorem. Given a simple graph $G$ with $n$ vertices, there are, at most, $\frac{n(n-1)}{2}$ edges.
This bound is partly why simple graphs are the most widely-used type of graph. The bound no longer exists once we allow parallel edges.
density of a simple graph. Given a simple graph $G = (V, E)$, the density of $G$ is defined as $$D(G) = \frac{m}{n(n-1)/2} = \frac{2m}{n(n-1)},$$ where $m$ is the number of edges, and $n$ is the number of vertices.
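As a quick sanity check of the bound and the density formula, the sketch below (my own illustrative example, built with itertools) confirms that a complete simple graph on $n$ vertices has exactly $\frac{n(n-1)}{2}$ edges and density $1$:

```python
from itertools import combinations

n = 6
V = list(range(n))

# Every pair of distinct vertices gets exactly one edge: the complete graph K_n.
E = list(combinations(V, 2))

max_edges = n * (n - 1) // 2
density = 2 * len(E) / (n * (n - 1))

print(len(E), max_edges)   # 15 15
print(density)             # 1.0
```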
Subgraphs
In some cases, we're interested in only a few parts of a graph $G$ rather than the entirety of $G$. Those parts are called subgraphs.
subgraph. Given a graph $G = (V, E)$, a subgraph of $G$ is a graph $G' = (V', E')$ such that $V' \subseteq V$ and $E' \subseteq E$.
The definition above is for a simple graph. Given an edge $e = \{u, v\}$, we say that $u$ and $v$ are neighbors or adjacent, and we say that $e$ is incident to $u$ and $v$.
Walks
A sequence of vertices, each joined to the next by an edge, is called a walk.
walk. Let $G = (V, E)$ be a graph of $n$ vertices. A walk $W$ is a sequence $(v_0, v_1, \ldots, v_k)$ where each $v_i \in V$ and $\{v_i, v_{i+1}\} \in E$ for all $0 \le i < k$. We define the number of edges in $W$, here $k$, as the length of the walk.
Paths
A sequence of vertices and edges with no repeated vertices is called a path.
path. Let $G = (V, E)$ be a graph of $n$ vertices. $G$ contains a path $P$ if, and only if, there exists a walk $(v_0, v_1, \ldots, v_k)$ where each $v_i$ is distinct, for all $0 \le i \le k$. We denote the set of all vertices comprising $P$ with $V(P)$, and the set of all edges as $E(P)$.
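One way to make the walk and path definitions concrete is a small membership test. The sketch below is illustrative (the edge list and helper names are my own): a vertex sequence is a walk when consecutive vertices are joined by an edge, and a path when, additionally, no vertex repeats.

```python
# Undirected edges stored as frozensets so {u, v} == {v, u}.
E = {frozenset(e) for e in [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]}

def is_walk(seq):
    # Every consecutive pair of vertices must be an edge of the graph.
    return all(frozenset((seq[i], seq[i + 1])) in E for i in range(len(seq) - 1))

def is_path(seq):
    # A path is a walk with no repeated vertices.
    return is_walk(seq) and len(set(seq)) == len(seq)

print(is_walk(["a", "b", "c", "b"]))   # True  (vertex b repeats, still a walk)
print(is_path(["a", "b", "c", "d"]))   # True
print(is_path(["a", "b", "c", "b"]))   # False (b repeats)
```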
vertex disjoint paths. Two paths $P_1$ and $P_2$ are vertex disjoint if, and only if, $V(P_1) \cap V(P_2) = \varnothing$.
edge disjoint paths. Two paths $P_1$ and $P_2$ are edge disjoint if, and only if, $E(P_1) \cap E(P_2) = \varnothing$.
disjoint paths. Two paths $P_1$ and $P_2$ are disjoint if, and only if, they are both vertex disjoint and edge disjoint.
A Hamiltonian path on a graph $G$ is a path on $G$ containing every vertex of $G$.
hamiltonian path. A path $P$ on a graph $G$ is a Hamiltonian path if, and only if, $V(P) = V(G)$.
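Because a Hamiltonian path must visit every vertex exactly once, a brute-force search over all vertex orderings suffices for very small graphs. The sketch below is illustrative only (exponential time; the labels and helper name are my own):

```python
from itertools import permutations

V = ["a", "b", "c", "d"]
E = {frozenset(e) for e in [("a", "b"), ("b", "c"), ("c", "d"), ("a", "c")]}

def hamiltonian_path(vertices, edges):
    # Try every ordering of the vertices; return the first ordering that is a path.
    for order in permutations(vertices):
        if all(frozenset((order[i], order[i + 1])) in edges
               for i in range(len(order) - 1)):
            return list(order)
    return None

print(hamiltonian_path(V, E))   # e.g. ['a', 'b', 'c', 'd']
```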
Trails
A walk with no repeated edges is called a trail. If the first and the last vertices are equal, we call the walk a closed trail.
trail. Let $G = (V, E)$ be a graph of $n$ vertices. $G$ contains a trail $T$ if, and only if, there exists a walk $(v_0, v_1, \ldots, v_k)$ where each edge $\{v_i, v_{i+1}\}$ is distinct, for all $0 \le i < k$. If $v_0 = v_k$ (that is, the walk starts and ends at the same vertex), we say that $T$ is a closed trail.
Circuits
A circuit is a trail that starts and finishes at the same node.
circuit. Given a graph $G$ with $n$ vertices, we say that $G$ contains a circuit if, and only if, there exists a trail $(v_0, v_1, \ldots, v_k)$ with $v_0 = v_k$, where each edge $\{v_i, v_{i+1}\}$ is distinct for all $0 \le i < k$.
Cycles
A cycle is a circuit where the only repeated vertices are the first and the last.
cycle. Let $G = (V, E)$ be a graph of $n$ vertices. $G$ contains a cycle $C$ if, and only if, there exists a circuit $(v_0, v_1, \ldots, v_k)$ where $v_0 = v_k$ and $v_0$ is the only repeated vertex, for all $0 \le i \le k$. If $G$ contains a cycle, then we say that $G$ is cyclic. Otherwise, we say that $G$ is acyclic.
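A common way to test whether an undirected graph is cyclic is a depth-first search that remembers each vertex's parent: reaching an already-visited neighbor other than the parent closes a cycle. A minimal sketch, with illustrative labels and names:

```python
def is_cyclic(vertices, edges):
    # Adjacency lists for an undirected graph.
    adj = {v: set() for v in vertices}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    visited = set()

    def dfs(v, parent):
        visited.add(v)
        for w in adj[v]:
            if w not in visited:
                if dfs(w, v):
                    return True
            elif w != parent:
                # A visited neighbor other than the parent closes a cycle.
                return True
        return False

    # Run the search from every component.
    return any(v not in visited and dfs(v, None) for v in vertices)

print(is_cyclic("abcd", [("a", "b"), ("b", "c"), ("c", "d")]))              # False (a tree)
print(is_cyclic("abcd", [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]))  # True  (a 4-cycle)
```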
Some cycles have the special property of containing every vertex of $G$. We call these cycles Hamiltonian cycles.
hamiltonian cycle. A cycle $C$ on a graph $G$ is a Hamiltonian cycle if, and only if, $V(C) = V(G)$.
Connectedness
A graph is called connected if it contains a path from every vertex to every other vertex. That is, starting from any vertex, we can reach any other vertex by following edges.
connectedness. A graph $G$ is connected if, and only if, for any two vertices $u, v \in V(G)$, there exists a path whose endpoints are $u$ and $v$. Otherwise, we say that $G$ is disconnected.
For a digraph $D$, if we can visit every vertex from every other vertex by following edges in the correct direction, we call $D$ a strongly-connected graph.
strong-connectedness. A digraph $D$ is strongly-connected if, and only if, for any two vertices $u, v$ of $D$, there exists a walk from $u$ to $v$ such that, if $e$ is an edge of the walk from $a$ to $b$, then $f(e) = (a, b)$, where $f$ is the adjacency function of $D$. Otherwise, we say that $D$ is weakly-connected.
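Both notions can be checked with breadth-first search. The sketch below is illustrative (labels and helper names are my own): an undirected graph is connected when a search from any vertex reaches every vertex, and a digraph is strongly-connected exactly when every vertex is reachable from some start vertex by following edges forward and also by following them in reverse.

```python
from collections import deque

def reachable(start, adj):
    # Standard breadth-first search; returns the set of vertices reachable from start.
    seen, queue = {start}, deque([start])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return seen

def is_connected(vertices, edges):
    adj = {v: set() for v in vertices}
    for a, b in edges:                      # undirected: add both directions
        adj[a].add(b)
        adj[b].add(a)
    start = next(iter(vertices))
    return reachable(start, adj) == set(vertices)

def is_strongly_connected(vertices, edges):
    fwd = {v: set() for v in vertices}
    rev = {v: set() for v in vertices}
    for a, b in edges:                      # directed: (a, b) means a -> b
        fwd[a].add(b)
        rev[b].add(a)
    start = next(iter(vertices))
    return (reachable(start, fwd) == set(vertices)
            and reachable(start, rev) == set(vertices))

print(is_connected("abc", [("a", "b"), ("b", "c")]))                        # True
print(is_strongly_connected("abc", [("a", "b"), ("b", "c")]))               # False (no way back to a)
print(is_strongly_connected("abc", [("a", "b"), ("b", "c"), ("c", "a")]))   # True
```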
Trees
tree. A graph $G$ is called a tree if, and only if, $G$ is acyclic and $G$ is connected.
A spanning tree of a graph $G$ is a tree that contains all of $G$'s vertices, but not necessarily all of its edges.
spanning tree. Given a connected graph $G$, a spanning tree of $G$, denoted $T$, is a tree where $V(T) = V(G)$ and $E(T) \subseteq E(G)$.
forest. A forest is a set $F = \{T_1, T_2, \ldots, T_k\}$ where each $T_i$ is a tree, for all $i$ with $1 \le i \le k$.
spanning forest. Given a graph $G$ with connected components $G_1, G_2, \ldots, G_k$, the spanning forest of $G$, denoted $F$, is a set $\{T_1, T_2, \ldots, T_k\}$ where each $T_i$ is a spanning tree of $G_i$, for all $i$ with $1 \le i \le k$.
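A spanning forest can be extracted with breadth-first search: starting from each unvisited vertex, keep exactly the edges used to discover new vertices. The sketch below is illustrative (labels and names are my own); for a connected graph, the result is a single spanning tree.

```python
from collections import deque

def spanning_forest(vertices, edges):
    adj = {v: set() for v in vertices}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    visited = set()
    forest = []                            # one list of tree edges per component

    for root in vertices:
        if root in visited:
            continue
        tree_edges = []
        visited.add(root)
        queue = deque([root])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in visited:
                    visited.add(w)
                    tree_edges.append((v, w))   # the edge that discovered w
                    queue.append(w)
        forest.append(tree_edges)
    return forest

# Two components: a 3-cycle on {a, b, c} and a single edge {d, e}.
print(spanning_forest("abcde", [("a", "b"), ("b", "c"), ("c", "a"), ("d", "e")]))
# e.g. [[('a', 'b'), ('a', 'c')], [('d', 'e')]]  (edge order may vary)
```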
Graph Operations
complete graph. A graph $G = (V, E)$ is a complete graph if, and only if, for any $a \in V$ and any $b \in V$ with $a \neq b$, there exists an edge $\{a, b\} \in E$. We denote a complete graph on $n$ vertices with the notation $K_n$.
graph complement. Given a graph $G = (V, E)$, the complement of $G$, denoted $\overline{G}$, is defined as $\overline{G} = (V, \overline{E})$, where $\overline{E} = \{\{a, b\} : a, b \in V,\ a \neq b,\ \{a, b\} \notin E\}$.
graph union. Given the graphs $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$, the union of $G_1$ and $G_2$ is defined as $G_1 \cup G_2 = (V_1 \cup V_2,\ E_1 \cup E_2)$.
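A small sketch of these three constructions, using an illustrative representation I chose (edges as frozensets so that $\{a, b\}$ and $\{b, a\}$ coincide):

```python
from itertools import combinations

def complete_graph(vertices):
    # K_n: one edge between every pair of distinct vertices.
    return set(vertices), {frozenset(p) for p in combinations(vertices, 2)}

def complement(V, E):
    # Same vertex set; exactly the pairs that are *not* edges of G.
    all_pairs = {frozenset(p) for p in combinations(V, 2)}
    return set(V), all_pairs - set(E)

def union(G1, G2):
    (V1, E1), (V2, E2) = G1, G2
    return V1 | V2, E1 | E2

V, E = complete_graph("abcd")
print(len(E))                  # 6 edges in K_4
print(complement(V, E)[1])     # set() -- the complement of K_4 has no edges

G1 = ({"a", "b"}, {frozenset({"a", "b"})})
G2 = ({"b", "c"}, {frozenset({"b", "c"})})
print(union(G1, G2))           # vertices {'a', 'b', 'c'} together with both edges
```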
Digraphs
Graphs can be directed or undirected. Directed graphs are also called digraphs. For example:
Here, the edges are directed a particular way. In an undirected graph, by definition, an edge $\{a, b\}$ exists if and only if an edge $\{b, a\}$ exists. In a digraph, the "arrowhead" is determined by a function called the adjacency function. This function can tell us that a directed edge $(a, b)$ exists, but not $(b, a)$.
digraph. A digraph is an ordered triple $D = (V, E, f)$ where $V$ is a set of vertices and $E$ is a set of directed edges. The function $f$ is a mapping $f : E \to V \times V$ called the arrow function of $D$, such that $f$ associates each directed edge with an ordered pair of vertices. Given a directed edge $e$ with $f(e) = (a, b)$, where $a$ and $b$ are vertices of $D$, we call $a$ the edge's source and $b$ the edge's destination. Given a vertex $v$, the number of edges such that $v$ is a source is called the outdegree of $v$, denoted $\deg^+(v)$. The number of edges such that $v$ is a destination is called the indegree of $v$, denoted $\deg^-(v)$.
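A short illustrative sketch (the edge list and labels are my own) that computes outdegree and indegree from a list of directed edges, each written as an ordered pair (source, destination):

```python
from collections import Counter

# Directed edges: (source, destination).
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "a")]

outdeg = Counter(src for src, _ in edges)   # edges leaving each vertex
indeg = Counter(dst for _, dst in edges)    # edges entering each vertex

print(outdeg["a"], indeg["a"])   # 2 1
print(outdeg["c"], indeg["c"])   # 1 2
```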
Adjacency Matrices
While graph diagrams look pretty, they're not the easiest to draw, nor are they the best vehicles for reasoning. Another way to model graphs is with an adjacency table or adjacency matrix. For example, below is a graph's corresponding adjacency table.
|       | $v_1$ | $v_2$ | $v_3$ | $v_4$ | $v_5$ |
|-------|-------|-------|-------|-------|-------|
| $v_1$ | 0     | 1     | 0     | 1     | 1     |
| $v_2$ | 0     | 0     | 1     | 0     | 0     |
| $v_3$ | 0     | 0     | 0     | 1     | 0     |
| $v_4$ | 0     | 0     | 0     | 0     | 1     |
| $v_5$ | 0     | 0     | 0     | 0     | 0     |
The left-most column indicates the source (the starting vertex), and the header row indicates the destination (the ending vertex). A more concise form of the adjacency table is an adjacency matrix:
$$A = \begin{bmatrix} 0 & 1 & 0 & 1 & 1 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}$$
This should ring some bells: There is, in fact, a connection between graph theory and linear algebra. We will be seeing some linear algebra applications in these materials.
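As a sketch of how the table above translates into code (the labels $v_1, \ldots, v_5$ are placeholders I chose, since the original labels aren't shown), we can build the matrix from a list of directed edges. One well-known instance of the linear-algebra connection: the $(i, j)$ entry of $A^k$ counts the walks of length $k$ from vertex $i$ to vertex $j$.

```python
# Placeholder vertex labels and the directed edges read off the table above.
V = ["v1", "v2", "v3", "v4", "v5"]
E = [("v1", "v2"), ("v1", "v4"), ("v1", "v5"),
     ("v2", "v3"), ("v3", "v4"), ("v4", "v5")]

index = {v: i for i, v in enumerate(V)}

# A[i][j] = 1 when there is a directed edge from V[i] to V[j].
A = [[0] * len(V) for _ in V]
for src, dst in E:
    A[index[src]][index[dst]] = 1

for row in A:
    print(row)
# [0, 1, 0, 1, 1]
# [0, 0, 1, 0, 0]
# [0, 0, 0, 1, 0]
# [0, 0, 0, 0, 1]
# [0, 0, 0, 0, 0]
```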