# Minimum spanning tree

Given a connected, undirected graph, a spanning tree of that graph is a subgraph which is a tree and connects all the vertices together. A single graph can have many different spanning trees. We can also assign a weight to each edge, which is a number representing how unfavorable it is, and use this to assign a weight to a spanning tree by computing the sum of the weights of the edges in that spanning tree. A minimum spanning tree or minimum weight spanning tree is then a spanning tree with weight less than or equal to the weight of every other spanning tree. More generally, any undirected graph (not necessarily connected) has a minimum spanning forest, which is a union of minimum spanning trees for its connected components.

One example would be a cable TV company laying cable to a new neighborhood. If it is constrained to bury the cable only along certain paths, then there would be a graph representing which points are connected by those paths. Some of those paths might be more expensive, because they are longer, or require the cable to be buried deeper; these paths would be represented by edges with larger weights. A spanning tree for that graph would be a subset of those paths that has no cycles but still connects to every house. There might be several spanning trees possible. A minimum spanning tree would be one with the lowest total cost.

## Properties

### Possible multiplicity

There may be several minimum spanning trees of the same weight; in particular, if all weights are the same, every spanning tree is minimum.

### Uniqueness

If each edge has a distinct weight, then there is exactly one minimum spanning tree; this fact can be proved by induction. Distinct weights occur in many realistic situations, such as the cable TV company example above, where it is unlikely that any two paths have exactly the same cost. The statement generalizes to spanning forests as well.

### Minimum-cost subgraph

If the weights are non-negative, then a minimum spanning tree is in fact the minimum-cost subgraph connecting all vertices, since subgraphs containing cycles necessarily have more total weight.

### Cycle property

For any cycle C in the graph, if the weight of an edge e of C is larger than the weights of all other edges of C, then e cannot belong to an MST. To see this, assume the contrary: e belongs to an MST T1. Deleting e breaks T1 into two subtrees, with the two endpoints of e in different subtrees. The remainder of C reconnects the subtrees, so there is an edge f of C with endpoints in different subtrees; replacing e with f yields a spanning tree T2 with weight less than that of T1, because the weight of f is less than the weight of e.

### Cut property

For any cut C of the graph, if the weight of an edge e of C is smaller than the weights of all other edges of C, then e belongs to every MST of the graph. To see this, assume the contrary: e does not belong to an MST T1. Then adding e to T1 produces a cycle, which must contain another edge e2 of T1 crossing the cut C. Replacing e2 with e yields a spanning tree of smaller weight, contradicting the minimality of T1.
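Both properties can be checked by brute force on a small example: enumerate every spanning tree of a tiny graph with distinct weights and inspect the minimum. The graph and weights below are an illustrative assumption:

```python
from itertools import combinations

# A small graph on 4 vertices with distinct edge weights (illustrative).
edges = {(0, 1): 1, (1, 2): 2, (0, 2): 5, (2, 3): 3, (1, 3): 4}
vertices = {0, 1, 2, 3}

def connects_all(tree):
    """Check that a set of edges reaches every vertex from vertex 0."""
    reached, frontier = {0}, [0]
    while frontier:
        u = frontier.pop()
        for a, b in tree:
            for x, y in ((a, b), (b, a)):
                if x == u and y not in reached:
                    reached.add(y)
                    frontier.append(y)
    return reached == vertices

# Enumerate all spanning trees: |V|-1 edges + connected implies acyclic.
trees = [t for t in combinations(edges, len(vertices) - 1) if connects_all(t)]
mst = min(trees, key=lambda t: sum(edges[e] for e in t))

# Cycle property: (0,2) is the heaviest edge on cycle 0-1-2, so it is excluded.
assert (0, 2) not in mst
# Cut property: (0,1) is the lightest edge crossing the cut {0} vs {1,2,3}.
assert (0, 1) in mst
```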

## Pseudocode

For the most general case:

```
function MST(G, W):
    T = {}
    while T does not form a spanning tree:
        find an edge (u, v) in E that is safe for T
        T = T ∪ {(u, v)}
    return T
```

where a "safe" edge is one that can be added to T while keeping T a subset of some minimum spanning tree.
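As a concrete instance of this scheme, Kruskal's rule — add the lightest edge that joins two different components — always picks a safe edge. A minimal sketch in Python (the graph representation and function names are illustrative assumptions):

```python
def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v) tuples; returns (total_weight, mst_edges)."""
    parent = list(range(num_vertices))

    def find(x):
        # Union-find root lookup with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, mst = 0, []
    for w, u, v in sorted(edges):          # consider edges in weight order
        ru, rv = find(u), find(v)
        if ru != rv:                       # joins two components, hence safe
            parent[ru] = rv
            total += w
            mst.append((u, v, w))
    return total, mst
```

For example, on a 4-vertex graph with edges `(1,0,1), (4,0,2), (3,1,2), (2,1,3), (5,2,3)` (weight first), the routine returns a tree of total weight 6.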

## Algorithms

The first algorithm for finding a minimum spanning tree was developed by the Czech scientist Otakar Borůvka in 1926 (see Borůvka's algorithm); its original purpose was the efficient design of an electricity network for Moravia. Two algorithms are now commonly used: Prim's algorithm and Kruskal's algorithm. All three are greedy algorithms that run in polynomial time, so the problem of finding such trees is in FP, and related decision problems, such as determining whether a particular edge is in the MST or whether the minimum total weight exceeds a certain value, are in P. Another, less commonly used greedy algorithm is the reverse-delete algorithm, which is the reverse of Kruskal's algorithm.
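Prim's algorithm grows a single tree from an arbitrary start vertex, repeatedly adding the lightest edge leaving the tree. A minimal sketch using a binary heap (the adjacency-list format is an assumption, and the graph is assumed connected):

```python
import heapq

def prim(adj):
    """adj: {vertex: [(weight, neighbor), ...]}; returns total MST weight."""
    start = next(iter(adj))
    visited = {start}
    heap = list(adj[start])
    heapq.heapify(heap)                    # lightest candidate edge on top
    total = 0
    while heap and len(visited) < len(adj):
        w, v = heapq.heappop(heap)
        if v in visited:                   # stale entry: endpoint already in tree
            continue
        visited.add(v)
        total += w
        for edge in adj[v]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total
```

With a binary heap this runs in O(E log V) time; the O(e α(e,v)) bound mentioned below requires much more elaborate data structures.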

The fastest minimum spanning tree algorithm to date was developed by Bernard Chazelle, which is based on the Soft Heap, an approximate priority queue. Its running time is O(e α(e,v)), where e is the number of edges, v is the number of vertices and α is the classical functional inverse of the Ackermann function. The function α grows extremely slowly, so that for all practical purposes it may be considered a constant no greater than 4; thus Chazelle's algorithm takes very close to linear time.

What is the fastest possible algorithm for this problem? That is one of the oldest open questions in computer science. There is clearly a linear lower bound, since we must at least examine all the weights. If the edge weights are integers with a bounded bit length, then deterministic algorithms are known with linear running time. For general weights, there are randomized algorithms whose expected running time is linear.

Whether there exists a deterministic algorithm with linear running time for general weights is still an open question. However, Seth Pettie and Vijaya Ramachandran have found a provably optimal deterministic minimum spanning tree algorithm, the computational complexity of which is unknown.

More recently, research has focused on solving the minimum spanning tree problem in a highly parallelized manner. With a linear number of processors it is possible to solve the problem in $O(\log n)$ time. A 2003 paper "Fast Shared-Memory Algorithms for Computing the Minimum Spanning Forest of Sparse Graphs" by David A. Bader and Guojing Cong demonstrates a pragmatic algorithm that can compute MSTs 5 times faster on 8 processors than an optimized sequential algorithm. Typically, parallel algorithms are based on Borůvka's algorithm — Prim's and especially Kruskal's algorithm do not scale as well to additional processors.

Other specialized algorithms have been designed for computing minimum spanning trees of a graph so large that most of it must be stored on disk at all times. These external storage algorithms, for example as described in "Engineering an External Memory Minimum Spanning Tree Algorithm" by Roman Dementiev et al., can operate as little as 2 to 5 times slower than a traditional in-memory algorithm; they claim that "massive minimum spanning tree problems filling several hard disks can be solved overnight on a PC." They rely on efficient external storage sorting algorithms and on graph contraction techniques for reducing the graph's size efficiently.

## MST on complete graphs

It has been shown by J. Michael Steele, based on work by Alan M. Frieze, that given a complete graph on n vertices with edge weights chosen from a continuous random distribution $f$ such that $f'(0) > 0$, as n approaches infinity the size of the MST approaches $\zeta(3)/f'(0)$, where $\zeta$ is the Riemann zeta function.

For uniform random weights in $[0,1]$, the exact expected size of the minimum spanning tree has been computed for small complete graphs.

| Vertices | Expected size |
| --- | --- |
| 2 | 1/2 |
| 3 | 3/4 |
| 4 | 31/35 |
| 5 | 893/924 |
| 6 | 278/273 |
| 7 | 30739/29172 |
| 8 | 199462271/184848378 |
| 9 | 126510063932/115228853025 |
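The n = 3 entry can be sanity-checked by simulation: in the complete graph on three vertices with uniform $[0,1]$ weights, the MST keeps the two lightest of the three edges, so its expected weight should be 3/4. A Monte Carlo sketch (the trial count and seed are arbitrary choices):

```python
import random

random.seed(0)
trials = 200_000
total = 0.0
for _ in range(trials):
    w = sorted(random.random() for _ in range(3))
    total += w[0] + w[1]          # drop the heaviest edge of the triangle
estimate = total / trials
assert abs(estimate - 3 / 4) < 0.01   # should be close to the exact value 3/4
```

For larger n the analogous estimate approaches the $\zeta(3)/f'(0)$ limit quoted above.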

## Related problems

A related problem is the k-minimum spanning tree (k-MST): finding the tree of minimum weight that spans some subset of k vertices in the graph.

A set of k-smallest spanning trees is a subset of k spanning trees (out of all possible spanning trees) such that no spanning tree outside the subset has smaller weight. (Note that this problem is unrelated to the k-minimum spanning tree.)

The Euclidean minimum spanning tree is a spanning tree of a graph with edge weights corresponding to the Euclidean distance between vertices.

In the distributed model, where each node is considered a computer and no node knows anything except its own connected links, one can consider the distributed minimum spanning tree problem. The mathematical definition of the problem is the same, but the solution approaches differ.

For directed graphs, the analogous problem (finding a minimum spanning arborescence) can be solved in quadratic time using the Chu–Liu/Edmonds algorithm.