Each byte value is encoded by its index in a list, which changes over the course of the algorithm. The list is initially in order by byte value (0, 1, 2, 3, ..., 255). Therefore, the first byte is always encoded by its own value. However, after encoding a byte, that value is moved to the front of the list before continuing to the next byte.
An example will shed some light on how the transform works. Imagine that, instead of bytes, we are encoding values 0-7. We wish to transform the following sequence:
524700717

The list is initially (0,1,2,3,4,5,6,7). The first number in the sequence is 5, which appears at index 5. We add a 5 to the output stream:

5

The 5 moves to the front of the list, producing (5,0,1,2,3,4,6,7). The next number is 2, which now appears at index 3. We have:

53

and the list is now (2,5,0,1,3,4,6,7). Continuing in this way, we find that the sequence is encoded by:

535740151
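The steps above can be sketched in Python. This is a minimal illustration that stores the list in a plain array; the function name and parameters are chosen for this example, not taken from any particular library:

```python
def mtf_encode(seq, alphabet_size=8):
    # The list is initially in order by value: 0, 1, ..., alphabet_size - 1.
    table = list(range(alphabet_size))
    out = []
    for value in seq:
        index = table.index(value)   # find the value's current index
        out.append(index)            # the index is what gets emitted
        table.pop(index)
        table.insert(0, value)       # move the value to the front
    return out

print(mtf_encode([5, 2, 4, 7, 0, 0, 7, 1, 7]))  # [5, 3, 5, 7, 4, 0, 1, 5, 1]
```

Note how the two identical 0s in a row encode as a 0 at index 4 followed by a 0 at index 0: repeated values always encode as 0 after their first occurrence.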
It is easy to see that the transform is reversible. Simply maintain the same list and decode by replacing each index in the encoded stream with the value at that index. Note the difference between this and the encoding method: The index in the list is used directly instead of looking up each value for its index.
That is, we start again with (0,1,2,3,4,5,6,7). We take the 5 from the encoded block and look it up in the list, which yields 5; moving the 5 to the front gives (5,0,1,2,3,4,6,7). We then take the 3, look it up in the list to obtain 2, move the 2 to the front, and so on.
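The decoding procedure can be sketched the same way; as the text notes, the only difference from encoding is that each index is used directly to pull a value out of the list (again, names here are illustrative only):

```python
def mtf_decode(indices, alphabet_size=8):
    # Maintain the same list as the encoder, starting in order by value.
    table = list(range(alphabet_size))
    out = []
    for index in indices:
        value = table.pop(index)  # the index is used directly as a list position
        out.append(value)
        table.insert(0, value)    # then the value moves to the front, as before
    return out

print(mtf_decode([5, 3, 5, 7, 4, 0, 1, 5, 1]))  # [5, 2, 4, 7, 0, 0, 7, 1, 7]
```

Because both sides perform the identical move-to-front update, their lists stay in lockstep, which is what makes the transform reversible.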
Implementation details are important for performance, particularly for decoding. For encoding, no clear advantage is gained by using a linked list, so storing the list in an array is acceptable, with worst-case performance O(nk), where n is the length of the data to be encoded and k is the number of values (generally a constant for a given implementation).
However, for decoding, we can use specialized data structures to greatly improve performance.
The MTF transform takes advantage of local correlation of frequencies to reduce the entropy of a message. Not all data exhibits this type of local correlation, and for some messages, the MTF transform may actually increase the entropy.
An important use of the MTF transform is in Burrows-Wheeler transform based compression. The Burrows-Wheeler transform is very good at producing, from text and certain other special classes of data, a sequence that exhibits local frequency correlation. Compression benefits greatly from following up the Burrows-Wheeler transform with an MTF transform before the final entropy-encoding step.
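To see why the pairing works, here is a naive sketch of the Burrows-Wheeler transform, built by sorting all rotations of the input with an appended sentinel (production implementations use suffix arrays instead; the sentinel character and function name are assumptions of this sketch):

```python
def bwt(text):
    # Naive Burrows-Wheeler transform: append a sentinel, sort every
    # rotation of the string, and read off the last column.
    # O(n^2 log n) time -- fine for a demonstration, not for real use.
    s = text + "$"  # '$' sorts before lowercase letters in ASCII
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rotation[-1] for rotation in rotations)

print(bwt("banana"))  # annb$aa
```

The output "annb$aa" clusters the repeated characters of "banana" into runs, and runs of identical symbols are exactly what the MTF transform turns into runs of small indices (mostly zeros).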
As an example, imagine we wish to compress Hamlet's soliloquy (To be, or not to be...). We can calculate the entropy of this message to be 7033 bits. Naively, we might try to apply the MTF transform directly. The result is a message with 7807 bits of entropy (higher than the original!). The reason is that English text does not in general exhibit a high level of local frequency correlation. However, if we first apply the Burrows-Wheeler transform, and then the MTF transform, we get a message with 6187 bits of entropy. Note that the Burrows-Wheeler transform does not decrease the entropy of the message; it only reorders the bytes in a way that makes the MTF transform more effective.
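Entropy figures like those above can be computed as the zeroth-order (symbol-frequency) Shannon entropy of the message; assuming that is the measure intended here, a sketch of the calculation:

```python
import math
from collections import Counter

def entropy_bits(message):
    # Zeroth-order Shannon entropy of a message, in total bits:
    # each symbol occurring c times out of n contributes -log2(c/n) bits
    # per occurrence.
    counts = Counter(message)
    n = len(message)
    return sum(c * -math.log2(c / n) for c in counts.values())
```

For instance, entropy_bits("aabb") is 4.0 (two equiprobable symbols cost 1 bit each over 4 positions), while a message of one repeated symbol has entropy 0. Note this measure ignores symbol order, which is why the Burrows-Wheeler transform alone cannot change it: only the MTF step, by changing which symbols appear, alters the frequency distribution.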