Permutations
What is a permutation?
I have 5 people arriving at a blind social. They registered in advance, so I know their names: Alice, Bob, Colby, Dallin, and Eden. But I don't know their faces. So, I make name tags for each of them and hand out the name tags randomly. How many ways could I match names to faces?
The answer to this question comes with a knowledge of permutations.
 Permute
 To shuffle the ordering, such as by pairs of objects trading places in series
 Permutation
 A way to sequence a set of distinguishable items
 A way to shuffle a set of items by permuting the order
I can work out how many ways to assign name tags by counting in an organized way. I can imagine an organized process that allows exactly the same final results as the proposed disorganized (random) process. For that, I could say the people must arrive in order, while the tags are handed out in any chosen order.
I could first ask, how many ways could I give Alice a name tag? That's easy: five ways, because she arrived first.
Now, the tricky part. For each of those choices, I need to know how many ways there are to give Bob a name tag after having given Alice a name tag. If I dive into the details, I can get lost rather quickly as I try to list the beginning of each permutation. Instead, I can count more easily if I ignore those details. So, ignoring what choices remain, I can simply ask how many. Namely, after Alice has a name tag, how many ways can I give Bob a name tag? That's easy again. Four ways. Then, I can think about Colby, and after Alice and Bob have name tags, I have three choices remaining for the next tag. And then 2 choices. And then only 1 choice for how to give out the last name tag.
So, on the one hand, I can see that each choice made limits each following choice in a unique way. And on the other hand, I can see that each choice has no impact on the number of available choices to follow.
But still I have not directly answered: how many ways to give out name tags! Imagine a tree with a single root and a branch created for each choice. If no 2 branches can lead to the same result, then I just need to count all the leaves on the tree. I reach a leaf when all decisions are made and a final state emerges.

I would draw 5 branches from the root, for Alice. On each branch, Alice is wearing a different name tag, so I can see there is no overlap in the resulting leaves between these 5 branches. Then from each of the 5 Alice branches, I would create 4 Bob branches. Within each Alice branch, each Bob branch gives Bob a different name tag, so the leaves coming from this level do not overlap either. I note I now have 20 branches at this level of the tree, which is 20 ways I may have given Alice and Bob their name tags. Proceeding further, I find 3 branches for Colby on each of these 20 branches, so I will have 60 branches at the third level, which is 60 ways to give Alice, Bob, and Colby a name tag. On each of these 60 branches, I create 2 branches for Dallin, so now 120 branches. Lastly, I have Eden. Eden is special because for each Dallin branch, I only have 1 name tag left to give, so Eden's level also has 120 branches.

At this 5th level of branching in my tree, I have exhausted all choices, and a little thought confirms that the leaves under each level's branches remain distinct. And so, my answer is 120 = 5*4*3*2*1 = 5!, or five factorial. But what if I wanted to examine each particular way that this could happen? Must I build such a large tree to examine each permutation?
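The branch counting can be sketched in a few lines of code (a toy sketch; the function name is mine):

```python
from math import factorial

def count_assignments(tags_remaining):
    """Count the leaves of the decision tree: each arriving person
    opens one branch per name tag still available."""
    if tags_remaining == 0:
        return 1  # every decision is made: exactly one leaf
    # one branch per available tag, times the leaves below each branch
    return tags_remaining * count_assignments(tags_remaining - 1)

print(count_assignments(5))                  # → 120
print(count_assignments(5) == factorial(5))  # → True
```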
Methods for iterating permutations
With a suitable algorithm, I can create a permutation list of a given size. Here are the results of a few such algorithms.
 1 object

1
1
 2 objects

1 2 2 1
1 2 2 1
 3 objects

1 2 3 2 1 3 1 3 2 2 3 1 3 1 2 3 2 1
1 2 3 1 3 2 2 1 3 2 3 1 3 1 2 3 2 1
 4 objects

1 2 3 4 2 1 3 4 1 3 2 4 2 3 1 4 3 1 2 4 3 2 1 4 1 2 4 3 2 1 4 3 1 3 4 2 2 3 4 1 3 1 4 2 3 2 4 1 1 4 2 3 2 4 1 3 1 4 3 2 2 4 3 1 3 4 1 2 3 4 2 1 4 1 2 3 4 2 1 3 4 1 3 2 4 2 3 1 4 3 1 2 4 3 2 1
1 2 3 4 1 2 4 3 1 3 2 4 1 3 4 2 1 4 2 3 1 4 3 2 2 1 3 4 2 1 4 3 2 3 1 4 2 3 4 1 2 4 1 3 2 4 3 1 3 1 2 4 3 1 4 2 3 2 1 4 3 2 4 1 3 4 1 2 3 4 2 1 4 1 2 3 4 1 3 2 4 2 1 3 4 2 3 1 4 3 1 2 4 3 2 1
Did you detect how each set of permutations was created? I marked a possible clue to the algorithm with each permutation list. In the first list, the position of the new object forms a regular stair step. In the second list, the value of the first element likewise is a regular stair step. Here is a brief description of the algorithms I used to generate these lists.
 New object insertion (copy recursive)
 Make n+1 copies of the permutation list for n objects. For each copy, insert the new object at a different position in the array.
 First element cycling (loop recursive)
 For each object in turn, assign it to first position. At each turn, repeat the process for the remaining objects: for each remaining object in turn, assign it to the first available position.
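Both descriptions translate almost line for line into code; here is a sketch (function names are mine) that reproduces the two lists above:

```python
def insertion_perms(n):
    """New object insertion (copy recursive): copy the list for n-1
    objects n times and insert n at a different position in each copy."""
    if n == 1:
        return [[1]]
    smaller = insertion_perms(n - 1)
    # moving the insertion point from the back to the front reproduces
    # the stair-step position of the newest object
    return [p[:pos] + [n] + p[pos:]
            for pos in range(n - 1, -1, -1)
            for p in smaller]

def cycling_perms(items):
    """First element cycling (loop recursive): each object takes the
    first position in turn; the rest are permuted the same way."""
    if len(items) <= 1:
        return [list(items)]
    return [[items[i]] + rest
            for i in range(len(items))
            for rest in cycling_perms(items[:i] + items[i + 1:])]

print(insertion_perms(3))        # 1 2 3, 2 1 3, 1 3 2, 2 3 1, 3 1 2, 3 2 1
print(cycling_perms([1, 2, 3]))  # 1 2 3, 1 3 2, 2 1 3, 2 3 1, 3 1 2, 3 2 1
```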
Simplified permutation iterator
To the programmer, these descriptions may sound a bit expensive in memory. If I want to create the \( n^\text{th} \) permutation list, I will need \( n \) levels of recursion. One can imagine a stack overflow from such a process. Might there be a more memory-efficient means to compute a list of permutations? Ideally, a simple computation on the current permutation should give us the next permutation in our list. To discuss this succinctly, I define a few terms for clarity.
 earlier/earliest
 Given some quality, the item nearer/nearest to the front/start of the permutation that matches
 later/latest
 Given some quality, the item nearer/nearest to the back/end of the permutation that matches
 successor
 The next item in the permutation (one step closer to the end)
 ascending
 A lesser item is found earlier than a greater item
 ascending order
 Each item in the permutation is earlier than any item of greater value; so the least item is at the start, the greatest item is at the end, and each item's successor is the least greater item.
 ascended value
 ascender
 If two adjacent elements are found to be ascending, the earlier item is the ascended item, and the later item is the ascender item.
Now here is a new algorithm, which creates one of the very same permutation lists without any recursion.
 Swap the latest ascended item with the least greater item of all later items, and then reverse the tail

The first permutation is in ascending order.
At each permutation \( A \), build the next permutation \( A' \) as follows:
 Find the latest item \( A[p] \) in the permutation \( A \) that is less than its successor \( A[p+1] \).
 So find the largest \( p \) such that \( A[p] < A[p+1] \)
 Find the smallest \( A[q] \) such that \( p < q \) and \( A[p] < A[q] \).
 Swap the values between positions \( p \) and \( q \).
 This is the first part of the transformation to the next permutation.
 Reverse the order of all elements after position \( p \).
 Swap \( A[p+1] \) with \( A[n-1] \), swap \( A[p+2] \) with \( A[n-2] \), and so forth.
 All items before the latest ascended item stay the same.
 If \( r < p \), then \( A'[r] = A[r] \)
In summary, the following transformation gives me the new permutation.
\[ \text{Given } A[i], i \in [0, n-1] \]\[ A'[i] = \begin{cases} A[i] & 0 \le i \lt p \\ A[q] & i = p \\ A[p] & i = n+p-q \\ A[n+p-i] & p \lt i \lt n \text{ and } i \neq n+p-q \end{cases} \]\[ p = \max \left \{ i \mid A[i] \lt A[i+1] \right \} \]\[ A[q] = \min \left \{ A[r] \mid r \gt p \text{ and } A[r] \gt A[p] \right \} \]
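The transformation reads off directly as code. Here is a sketch in Python (0-indexed and mutating in place, unlike the 1-based math above):

```python
def next_permutation(a):
    """Advance list `a` to the next permutation in dictionary order.
    Returns False (leaving `a` in ascending order) after the last one."""
    n = len(a)
    # find the latest ascended item: the largest p with a[p] < a[p+1]
    p = n - 2
    while p >= 0 and a[p] >= a[p + 1]:
        p -= 1
    if p < 0:  # already the last permutation
        a.reverse()
        return False
    # the tail is descending, so the last item greater than a[p]
    # is the least greater item among the later items
    q = n - 1
    while a[q] <= a[p]:
        q -= 1
    a[p], a[q] = a[q], a[p]
    # reverse the descending tail back into ascending order
    a[p + 1:] = reversed(a[p + 1:])
    return True
```

Starting from ascending order and calling this repeatedly yields the full list in dictionary order.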
I made a simple input and button which implements this algorithm to determine the next permutation in the series.
Not only does this algorithm produce the same permutation sequence as first element cycling, it also produces the permutation list in dictionary (lexicographic) order.
To implement the previous button, I do the same transform, except for how I pick \( p \) and \( q \).
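A matching sketch for stepping backward, with the comparisons mirrored; here I take \( p \) as the latest descended item and \( A[q] \) as the greatest later item below \( A[p] \) (my reading of the symmetric rule):

```python
def prev_permutation(a):
    """Step list `a` back to the previous permutation in dictionary
    order. Returns False (leaving `a` descending) before the first one."""
    n = len(a)
    # latest descended item: the largest p with a[p] > a[p+1]
    p = n - 2
    while p >= 0 and a[p] <= a[p + 1]:
        p -= 1
    if p < 0:  # already the first permutation
        a.reverse()
        return False
    # greatest later item that is still less than a[p]
    q = n - 1
    while a[q] >= a[p]:
        q -= 1
    a[p], a[q] = a[q], a[p]
    a[p + 1:] = reversed(a[p + 1:])
    return True
```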
Complexity of the simplified iterator
I was tempted to think this simplified algorithm was poor. I need a loop to find \( p \), a loop to find \( q \), and a third loop to generate the new permutation. The good part is these loops are not nested. But what seems the bad part is that having \( n-p \) iterations per loop gives me my worst case runtime of \( \sim \mathcal{O}(n) \) at each step. Surely, iterating all permutations would be \( \sim \mathcal{O}(n! \, n) \). Having something so expensive, and then multiplying it further by \( n \), just seems so expensive computationally. But then a new idea hit me. The worst case is actually pretty rare: only \( n \) times out of \( n! \) cases. In fact, I have the best case in half of the cases: I only need to read and swap the last 2 items. In a third of the cases, I only need to read and modify the last 3 items. In an eighth of the cases, I only deal with the last 4 items. And continuing up, except for the worst case, the cases where I read and modify the last \( r \) items account for exactly \( \frac{r-1}{r!} \) of the cases. From this I make a weighted average by adding up each loop size times its frequency of occurrence, and then simplify.
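Carrying out that simplification (my own working): counting a step that touches the last \( r \) items as \( r \) units of work, the average work per step stays below a constant:

\[ \sum_{r=2}^{n} r \cdot \frac{r-1}{r!} \lt \sum_{r=2}^{\infty} \frac{r(r-1)}{r!} = \sum_{r=2}^{\infty} \frac{1}{(r-2)!} = e \approx 2.72 \]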
So, quite excitingly, the looping required to find the next permutation contributes only a constant multiplier to the runtime on average; thus its worst-case complexity has no effect on the complexity of listing all permutations. Looping through a full permutation list of \( n \) items requires only \( \sim \mathcal{O}(n!) \), which is in fact of smaller order than the space it takes to output that list in serial fashion.
Permutations as group elements
I can think of each permutation as the function (a bijection) which takes the items in ascending order to that particular permutation of items. In this sense, I can treat a permutation as an operation which can be performed on a permutation. Mathematically, what results is called a group. In fact, the group consisting of all permutations of \( N \) items is called the symmetric group of size \( N \). The identity permutation is the items in ascending order.
I can construct the function \( f \) for a permutation by considering the position as \( x \) and the item value at the position as \( y \). Then \( f(x) = y \) is the function which creates the permutation once the function is applied to the identity permutation (the permutation where position equals item value). This notion of identity is consistent, for the function constructed from it is \( f(x) = x \) which is the identity map.
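A short sketch of this correspondence (names mine): the permutation stored as a list serves as a lookup table for \( f \), and applying \( f \) to the identity returns the permutation itself.

```python
def apply(f, seq):
    """Apply the permutation-function f (a 1-based table, f[x-1] = f(x))
    to a sequence by sending each item y to f(y)."""
    return [f[y - 1] for y in seq]

identity = [1, 2, 3, 4]
f = [2, 4, 1, 3]           # the permutation 2 4 1 3, read as f(x) = y
print(apply(f, identity))  # → [2, 4, 1, 3], the permutation itself
print(apply(f, f))         # → [4, 3, 2, 1], another permutation
```

Closure under this application, together with the identity and inverses, is what makes the set of permutations a group.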
Cycle notation
The permutation as a function gives rise to cycle notation. This notation may be formed as in the following example. Suppose the permutation \( 1\;4\;5\;3\;2\;6\;8\;7 \) is to be written in cycle notation. I always start by writing an open parenthesis followed by the first position.
Next, I check the value at position 1. If the position equals the value at that position, I close the parenthesis to form a 1-cycle.
After each closing parenthesis, I begin another cycle with the smallest number not yet recorded.
If the value at that position does not equal the position, I write down the value at that position. In this case, 4 is in position 2, so I write down 4 after the 2.
What is in position 4? 3.
What is in position 3? 5. What is in position 5? 2. I stop once another complete cycle is formed, and put a closing parenthesis.
Again, I find the smallest value not yet written. The value at that position is equal to the position, so another 1-cycle results.
I find the smallest value not yet written, one last time. If none existed, I would be done. But because one exists, I write another open parenthesis followed by the position, then the value, and so on until all cycles have been written.
I have reached the end of my list, so I have found my permutation in cycle notation. Now, I want to have more fun with this notation, so I'll recreate my pair of permutation lists in that notation. To save typing, I'll eliminate the 1-cycles, and for the identity element, which consists of all 1-cycles, I'll write the letter \( e \).
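The walkthrough above mechanizes nicely; a sketch (the function name is mine) that follows positions until each cycle closes and drops the 1-cycles:

```python
def cycle_notation(perm):
    """Write a 1-based permutation (perm[i-1] is the value at position i)
    in cycle notation, omitting 1-cycles; the identity becomes 'e'."""
    seen = set()
    cycles = []
    for start in range(1, len(perm) + 1):
        if start in seen:
            continue
        cycle, x = [], start
        while x not in seen:       # follow positions until the cycle closes
            seen.add(x)
            cycle.append(x)
            x = perm[x - 1]
        if len(cycle) > 1:          # drop 1-cycles
            cycles.append(cycle)
    if not cycles:                  # all 1-cycles: the identity
        return "e"
    return "".join("(" + " ".join(map(str, c)) + ")" for c in cycles)

print(cycle_notation([1, 4, 5, 3, 2, 6, 8, 7]))  # → (2 4 3 5)(7 8)
```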
 1 object

e
e
 2 objects

e (1 2)
e (1 2)
 3 objects

e (1 2) (2 3) (1 2 3) (1 3 2) (1 3)
e (2 3) (1 2) (1 2 3) (1 3 2) (1 3)
 4 objects

e (1 2) (2 3) (1 2 3) (1 3 2) (1 3) (3 4) (1 2)(3 4) (2 3 4) (1 2 3 4) (1 3 4 2) (1 3 4) (2 4 3) (1 2 4 3) (2 4) (1 2 4) (1 3)(2 4) (1 3 2 4) (1 4 3 2) (1 4 3) (1 4 2) (1 4) (1 4 2 3) (1 4)(2 3)
e (3 4) (2 3) (2 3 4) (2 4 3) (2 4) (1 2) (1 2)(3 4) (1 2 3) (1 2 3 4) (1 2 4 3) (1 2 4) (1 3 2) (1 3 4 2) (1 3) (1 3 4) (1 3)(2 4) (1 3 2 4) (1 4 3 2) (1 4 2) (1 4 3) (1 4) (1 4 2 3) (1 4)(2 3)
I see that, for the first list of each pair, each list is the beginning of the next larger list. This reflects the fact that the permutations of each size form a subgroup within the permutations of the next larger size.
Generators
I may ask then: if I have a set of generators for the group of permutations of \( n \) elements, how many generators do I need to add to create the group of permutations of \( n+1 \) elements? The answer is simply one generator. In fact, a little reasoning will show that adding any permutation as a generator, as long as it involves the last element, is sufficient to have a complete set of generators, given the previous set of generators.
But how many generators are required for a symmetric group of size \( n \)? I can infer that I need no more than \( n-1 \) generators, but can I do better?
Theorem: Every finite symmetric group of size greater than 2 requires only 2 generators. First, consider the permutations of 3 elements. I can generate all such permutations from two generators.
Now suppose that the permutations of \( n \) elements can be composed from just two generators.
Then the permutations of \( n+1 \) elements may be composed from three generators.
But \( c_n \) may be composed of the other two generators. I discovered this by observing that I can construct the same permutation two ways.
To solve for \( c_n \), I need the inverse of \( c_{n+1} \), which I then right-multiply onto both sides.
Finally, I may employ this inverse to construct \( c_n \) as follows.
So the permutation of \( n{+}1 \) elements may be composed from two generators.
So any finite symmetric group may be created from just two generators.
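The theorem is easy to spot-check by brute force. A sketch using the pair \( \{ c_n, c_{n-1} \} \) that appears later in this article (the theorem itself does not depend on this particular pair):

```python
from math import factorial

def generated_size(n):
    """Count the elements reachable from the identity using only the
    generators c_n = (1 2 ... n) and c_{n-1} = (1 2 ... n-1)."""
    def apply(g, p):                             # send each item y to g(y)
        return tuple(g[y - 1] for y in p)
    c_n   = tuple(list(range(2, n + 1)) + [1])   # x -> x+1, with n -> 1
    c_nm1 = tuple(list(range(2, n)) + [1, n])    # same on 1..n-1, fixing n
    identity = tuple(range(1, n + 1))
    frontier, seen = [identity], {identity}
    while frontier:                              # closure under both generators
        p = frontier.pop()
        for g in (c_n, c_nm1):
            q = apply(g, p)
            if q not in seen:
                seen.add(q)
                frontier.append(q)
    return len(seen)

for n in (3, 4, 5):
    print(n, generated_size(n) == factorial(n))  # the whole group each time
```

In a finite group, closure under multiplication already supplies the inverses, so the reachable set is the generated subgroup.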
Using generators to iterate permutations
How few permutations are required to iterate through all permutations of a given size? I am looking for a list of generators of the symmetric group which, if properly sequenced, will trace a valid permutation list.
I first try my minimal generator set for the symmetric group of size 3. Can I compose them one at a time and achieve a full cycle? If I want to create a list from generators without repeats, it requires that no proper contiguous run of my sequence composes to the identity element. I list out the number of repeats of small units that create the identity element.
Avoiding these subsequences does not leave many alternatives. But one such sequence gives desirable results.
Success! The symmetric group of size 4 may prove more difficult. I try my minimal generator set once again. This time, I map it. I list each cycle of permutations created from the \( c_4 \) generator.
A
e
(1 2 3 4)
(1 3)(2 4)
(1 4 3 2)
B
(1 2)
(1 3 4)
(1 4 2 3)
(2 4 3)
C
(1 4)
(2 3 4)
(1 2 4 3)
(1 3 2)
D
(1 2)(3 4)
(1 3)
(1 4)(2 3)
(2 4)
E
(3 4)
(1 2 3)
(1 3 2 4)
(1 4 2)
F
(2 3)
(1 2 4)
(1 3 4 2)
(1 4 3)
So applying \( c_4 = (1\;2\;3\;4) \) in series cycles the elements in each box from top to bottom with wrap around. To show the transform of \( s = (1\;2) \), I add arrows. I trace the arrow ⇔ and go to the box with that label. Then within that box, I find the line labeled like the box I just came from.
A
B ⇔ e
C ⇔ (1 2 3 4)
E ⇔ (1 3)(2 4)
F ⇔ (1 4 3 2)
B
A ⇔ (1 2)
F ⇔ (1 3 4)
D ⇔ (1 4 2 3)
C ⇔ (2 4 3)
C
E ⇔ (1 4)
A ⇔ (2 3 4)
B ⇔ (1 2 4 3)
D ⇔ (1 3 2)
D
E ⇔ (1 2)(3 4)
C ⇔ (1 3)
B ⇔ (1 4)(2 3)
F ⇔ (2 4)
E
D ⇔ (3 4)
F ⇔ (1 2 3)
A ⇔ (1 3 2 4)
C ⇔ (1 4 2)
F
E ⇔ (2 3)
D ⇔ (1 2 4)
B ⇔ (1 3 4 2)
A ⇔ (1 4 3)
Now I navigate the boxes beginning at the identity. Amazingly, I find a path to trace every permutation exactly once.
This is pretty cool, but it seems messy. Although I can trace all the permutations in 23 multiplications, this path doesn't loop back to the start. Also, this path seems more like an antipattern than a pattern.
Gleaning from the earlier proof that I need only 2 generators for the symmetric group, I see I can choose a different pair of generators. The intra-box paths give the 4-cycles formed by the same generator, and the inter-box paths give the 3-cycles formed by the other generator.
A
C ⇒ e ⇒ E
E ⇒ (1 2 3 4) ⇒ F
F ⇒ (1 3)(2 4) ⇒ B
B ⇒ (1 4 3 2) ⇒ C
B
F ⇒ (1 2) ⇒ D
D ⇒ (1 3 4) ⇒ C
C ⇒ (1 4 2 3) ⇒ A
A ⇒ (2 4 3) ⇒ F
C
A ⇒ (1 4) ⇒ B
B ⇒ (2 3 4) ⇒ D
D ⇒ (1 2 4 3) ⇒ E
E ⇒ (1 3 2) ⇒ A
D
C ⇒ (1 2)(3 4) ⇒ B
B ⇒ (1 3) ⇒ F
F ⇒ (1 4)(2 3) ⇒ E
E ⇒ (2 4) ⇒ C
E
F ⇒ (3 4) ⇒ A
A ⇒ (1 2 3) ⇒ C
C ⇒ (1 3 2 4) ⇒ D
D ⇒ (1 4 2) ⇒ F
F
D ⇒ (2 3) ⇒ B
B ⇒ (1 2 4) ⇒ A
A ⇒ (1 3 4 2) ⇒ E
E ⇒ (1 4 3) ⇒ D
Now, I can actually trace a path through every permutation, back to the start using just the two generators! And, actually, finding that path was quite fun. I highly recommend it.
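The hunt can also be delegated to a backtracking search. A brute-force sketch over the pair \( \{ c_4, c_3 \} \) (the composition convention is my choice; existence of a circuit does not depend on it):

```python
from math import factorial

def hamiltonian_cycle(n=4):
    """Search for a circuit through all n! permutations where every
    step applies one of the generators c_n or c_{n-1}."""
    def apply(g, p):
        return tuple(g[y - 1] for y in p)
    gens = (tuple(list(range(2, n + 1)) + [1]),   # c_n = (1 2 ... n)
            tuple(list(range(2, n)) + [1, n]))    # c_{n-1}, fixing n
    start = tuple(range(1, n + 1))
    path, seen = [start], {start}
    def extend():
        if len(path) == factorial(n):             # all visited: must close
            return start in (apply(g, path[-1]) for g in gens)
        for g in gens:                            # try each generator, backtrack
            q = apply(g, path[-1])
            if q not in seen:
                seen.add(q); path.append(q)
                if extend():
                    return True
                seen.discard(q); path.pop()
        return False
    return path if extend() else None

cycle = hamiltonian_cycle(4)
print(cycle is not None, len(cycle or []))  # a full circuit of 24 permutations
```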
Generator mapping
Before moving on to the case of permutations of 5 objects, I need a calculator to walk through permutations more quickly. Should not be too hard to create one, I think.
Mapping a permutation of 5 items
With this calculator, navigating through the permutations of 5 items should be easy, although a bit tedious. But I do want to be organized about it. I structure the boxes in layers (alternating upper and lower case), so I can see the permutations grouped by how many applications of \( c_4 \) must be employed from the identity (irrespective of the count of \( c_5 \)).
A
f ⇒ e ⇒ b
b ⇒ ( 1 2 3 4 5 ) ⇒ c
c ⇒ ( 1 3 5 2 4 ) ⇒ d
d ⇒ ( 1 4 2 5 3 ) ⇒ e
e ⇒ ( 1 5 4 3 2 ) ⇒ f
b
A ⇒ ( 1 2 3 4 ) ⇒ G
G ⇒ ( 1 3 5 ) ( 2 4 ) ⇒ L
L ⇒ ( 1 4 3 ) ( 2 5 ) ⇒ O
O ⇒ ( 1 5 3 2 ) ⇒ H
H ⇒ ( 4 5 ) ⇒ A
c
A ⇒ ( 1 3 ) ( 2 4 5 ) ⇒ H
H ⇒ ( 1 4 ) ( 2 5 3 ) ⇒ M
M ⇒ ( 1 5 4 2 ) ⇒ P
P ⇒ ( 3 4 ) ⇒ I
I ⇒ ( 1 2 3 5 ) ⇒ A
d
A ⇒ ( 1 4 2 ) ( 3 5 ) ⇒ I
I ⇒ ( 1 5 4 3 ) ⇒ N
N ⇒ ( 2 3 ) ⇒ L
L ⇒ ( 1 2 4 5 ) ⇒ J
J ⇒ ( 1 3 4 ) ( 2 5 ) ⇒ A
e
A ⇒ ( 2 5 4 3 ) ⇒ J
J ⇒ ( 1 2 ) ⇒ O
O ⇒ ( 1 3 4 5 ) ⇒ M
M ⇒ ( 1 4 ) ( 2 3 5 ) ⇒ K
K ⇒ ( 1 5 3 ) ( 2 4 ) ⇒ A
f
A ⇒ ( 1 5 ) ⇒ K
K ⇒ ( 2 3 4 5 ) ⇒ P
P ⇒ ( 1 2 4 ) ( 3 5 ) ⇒ N
N ⇒ ( 1 3 ) ( 2 5 4 ) ⇒ G
G ⇒ ( 1 4 3 2 ) ⇒ A
G
b ⇒ ( 1 3 ) ( 2 4 ) ⇒ f
f ⇒ ( 1 4 3 2 5 ) ⇒ r
r ⇒ ( 1 5 2 ) ⇒ q
q ⇒ ( 3 4 5 ) ⇒ t
t ⇒ ( 1 2 3 5 4 ) ⇒ b
H
b ⇒ ( 1 5 4 ) ⇒ s
s ⇒ ( 2 3 4 ) ⇒ q
q ⇒ ( 1 2 4 3 5 ) ⇒ u
u ⇒ ( 1 3 ) ( 2 5 ) ⇒ c
c ⇒ ( 1 4 5 3 2 ) ⇒ b
I
c ⇒ ( 1 2 3 ) ⇒ t
t ⇒ ( 1 3 2 4 5 ) ⇒ q
q ⇒ ( 1 4 ) ( 2 5 ) ⇒ v
v ⇒ ( 1 5 3 4 2 ) ⇒ d
d ⇒ ( 3 5 4 ) ⇒ c
J
d ⇒ ( 1 3 4 5 2 ) ⇒ u
u ⇒ ( 1 4 ) ( 3 5 ) ⇒ q
q ⇒ ( 1 5 4 2 3 ) ⇒ r
r ⇒ ( 2 4 3 ) ⇒ e
e ⇒ ( 1 2 5 ) ⇒ d
K
e ⇒ ( 2 4 ) ( 3 5 ) ⇒ v
v ⇒ ( 1 2 5 4 3 ) ⇒ q
q ⇒ ( 1 3 2 ) ⇒ s
s ⇒ ( 1 4 5 ) ⇒ f
f ⇒ ( 1 5 2 3 4 ) ⇒ e
L
b ⇒ ( 1 4 3 5 2 ) ⇒ t
t ⇒ ( 1 5 3 ) ⇒ u
u ⇒ ( 2 3 ) ( 4 5 ) ⇒ d
d ⇒ ( 1 2 4 ) ⇒ w
w ⇒ ( 1 3 4 2 5 ) ⇒ b
M
c ⇒ ( 2 5 4 ) ⇒ u
u ⇒ ( 1 2 ) ( 3 4 ) ⇒ v
v ⇒ ( 1 3 5 ) ⇒ e
e ⇒ ( 1 4 5 2 3 ) ⇒ w
w ⇒ ( 1 5 3 2 4 ) ⇒ c
N
d ⇒ ( 1 5 ) ( 2 3 ) ⇒ v
v ⇒ ( 2 4 5 ) ⇒ r
r ⇒ ( 1 2 5 3 4 ) ⇒ f
f ⇒ ( 1 3 5 4 2 ) ⇒ w
w ⇒ ( 1 4 3 ) ⇒ d
O
b ⇒ ( 2 5 3 ) ⇒ w
w ⇒ ( 1 2 ) ( 4 5 ) ⇒ e
e ⇒ ( 1 3 4 ) ⇒ r
r ⇒ ( 1 4 2 3 5 ) ⇒ s
s ⇒ ( 1 5 2 4 3 ) ⇒ b
P
c ⇒ ( 1 5 ) ( 3 4 ) ⇒ w
w ⇒ ( 2 3 5 ) ⇒ f
f ⇒ ( 1 2 4 5 3 ) ⇒ s
s ⇒ ( 1 3 2 5 4 ) ⇒ t
t ⇒ ( 1 4 2 ) ⇒ c
q
G ⇒ ( 1 5 3 4 ) ⇒ J
J ⇒ ( 2 3 5 4 ) ⇒ H
H ⇒ ( 1 2 4 3 ) ⇒ K
K ⇒ ( 1 3 2 5 ) ⇒ I
I ⇒ ( 1 4 5 2 ) ⇒ G
r
G ⇒ ( 2 5 ) ⇒ N
N ⇒ ( 1 2 ) ( 3 4 5 ) ⇒ X
X ⇒ ( 1 3 5 4 ) ⇒ O
O ⇒ ( 1 4 2 3 ) ⇒ J
J ⇒ ( 1 5 ) ( 2 4 3 ) ⇒ G
s
H ⇒ ( 1 5 ) ( 2 3 4 ) ⇒ O
O ⇒ ( 2 4 3 5 ) ⇒ X
X ⇒ ( 1 2 5 3 ) ⇒ P
P ⇒ ( 1 3 2 ) ( 4 5 ) ⇒ K
K ⇒ ( 1 4 ) ⇒ H
t
G ⇒ ( 1 2 3 ) ( 4 5 ) ⇒ I
I ⇒ ( 1 3 2 4 ) ⇒ P
P ⇒ ( 1 4 2 5 ) ⇒ X
X ⇒ ( 1 5 2 ) ( 3 4 ) ⇒ L
L ⇒ ( 3 5 ) ⇒ G
u
H ⇒ ( 1 3 5 2 ) ⇒ J
J ⇒ ( 1 4 5 3 ) ⇒ L
L ⇒ ( 1 5 4 ) ( 2 3 ) ⇒ X
X ⇒ ( 2 4 ) ⇒ M
M ⇒ ( 1 2 5 ) ( 3 4 ) ⇒ H
v
I ⇒ ( 2 5 3 4 ) ⇒ K
K ⇒ ( 1 2 ) ( 3 5 4 ) ⇒ M
M ⇒ ( 1 3 ) ⇒ X
X ⇒ ( 1 4 5 ) ( 2 3 ) ⇒ N
N ⇒ ( 1 5 2 4 ) ⇒ I
w
L ⇒ ( 1 3 4 2 ) ⇒ N
N ⇒ ( 1 4 3 5 ) ⇒ P
P ⇒ ( 1 5 2 3 ) ⇒ M
M ⇒ ( 2 4 5 3 ) ⇒ O
O ⇒ ( 1 2 5 4 ) ⇒ L
X
r ⇒ ( 1 3 ) ( 4 5 ) ⇒ v
v ⇒ ( 1 4 ) ( 2 3 ) ⇒ u
u ⇒ ( 1 5 ) ( 2 4 ) ⇒ t
t ⇒ ( 2 5 ) ( 3 4 ) ⇒ s
s ⇒ ( 1 2 ) ( 3 5 ) ⇒ r
Now that we have a map for the permutations of 5 objects, is there a generator path which cycles through every permutation just once? Yes!! Many paths exist. Here is one such:
When I come back to this one day, I might look at that sequence and question how I got there. So I will explain.
First, I create the path to traverse all nodes of box A and then return to the start. This is a simple repetition of the generator which defined the boxes in the first place.
Next, I try to include at least some nodes of a box connected to A as an added detour in this path. Perhaps as a remarkably lucky choice of generators, the path that traverses A and includes a round trip to b must traverse b completely.
Next, I can do the same with every box connected to A.
And in this manner, I continue, doing for boxes b, c, d, e, and f what I did for A. If at any point in traversing one of these boxes I have a step that jumps to a box I haven't yet traversed, then I split that step into a detour that traverses the new box. To see why this works, I note the following identity.
Stated in words, jumping out to a connected box (\( c_4 \)), traversing all other nodes of that box (\( c_5 c_5 c_5 c_5 \)), and jumping back (\( c_4 \)) brings me to the same node as a single step within the box (\( = c_5 \)) would have.
A similar pattern of traversal also holds for permutations of fewer objects.
Traveling through permutations of n objects
Having mapped out a circuit built from a pair of generators for the permutations of 5 objects (and fewer), I believe that for any \( n \), a circuit exists through the permutations of \( n \) objects.
Much of the proof comes freely. Here are the key ideas and reasoning.
 \( n! \) nodes
 The permutations of \( n \) objects give \( n! \) nodes.
 \( (n-1)! \) boxes via \( c_n \)
 By applying \( c_n \), I may group the nodes into \( (n-1)! \) boxes of \( n \) nodes.
 \( n \) edges per box via \( c_{n-1} \)
 By applying \( c_{n-1} \) to the nodes within each box, I may draw edges between the boxes, with \( n \) edges per box, or \( n! \) directed inter-box edges in all.
 The graph of these boxes is connected and therefore has a spanning tree.
 A complete set of generators created the graph.
 If \( c_{n-1} {c_n}^{n-1} c_{n-1} = c_{n} \), then each spanning tree of the graph corresponds to a circuit that visits all nodes exactly once.

Essentially, the idea here is as discussed in the previous section.
Traversing a single box in a simple circuit may be augmented with additional connected boxes.
 The sequence jumps from an origin box to an adjacent box by \( c_{n-1} \)
 The remaining nodes of the adjacent box are traversed by the sequence \( {c_n}^{n-1} \)
 Assuming equality, the final \( c_{n-1} \) returns to the box of origination, AND to the very next node in that box.
 \( c_{n-1} {c_n}^{n-1} c_{n-1} = c_{n} \)

The reason why this is true may not be immediately apparent, but it becomes clearer by thinking of each generator as a function.
 \( c_{n-1} \) acts as a function that adds 1, except that \( n-1 \) becomes 1 and \( n \) remains fixed.
 \( {c_n}^{n-1} \) acts as a function that subtracts 1, except that 1 becomes \( n \).
 \( c_{n} \) acts as a function that adds 1, except that n becomes 1.
\[ e = 1\;2\;3\;\dots\;n{-}2\;n{-}1\;n \]\[ \times c_{n-1} = 2\;3\;4\;\dots\;n{-}1\;1\;n \]\[ \times {c_n}^{n-1} = 1\;2\;3\;\dots\;n{-}2\;n\;n{-}1 \]\[ \times c_{n-1} = 2\;3\;4\;\dots\;n{-}1\;n\;1 = c_{n} \]
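The same row-by-row check can be scripted for several sizes at once (a sketch, treating each generator as the function described above):

```python
def compose_check(n):
    """Verify c_{n-1} * c_n^{n-1} * c_{n-1} = c_n by applying each
    factor in turn as a function on values, as in the rows above."""
    c_n   = lambda x: x % n + 1                          # adds 1; n -> 1
    c_nm1 = lambda x: x if x == n else x % (n - 1) + 1   # adds 1; fixes n
    def c_n_power(x, k):                                 # c_n applied k times
        for _ in range(k):
            x = c_n(x)
        return x
    lhs = [c_nm1(c_n_power(c_nm1(x), n - 1)) for x in range(1, n + 1)]
    rhs = [c_n(x) for x in range(1, n + 1)]
    return lhs == rhs

print(all(compose_check(n) for n in range(3, 8)))  # → True
```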
In conclusion, I have many possible circuits traversing all permutations of \( n \) items just once, using a minimal set of two generators \( \{ c_n, c_{n-1} \} \). This solution is constructible, but not uniquely determined.