Having derived a new probability mass function, one strikingly similar to the Poisson distribution, I should check that my new distribution is not a mistake.
Restated from the previous post, I have:

\[ Monkey(x,R) = \frac{ e^{- R x} (R x)^{x-1}}{x!} \]

\( R = \lambda t \) is the ratio of the time to complete a task to the mean arrival time between tasks. This equates to the positive percentage values found in my original post posing the problem.

\( x \) is the observed total number of tasks in a work period.
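Before doing any analysis, here is the formula as a quick Python sketch. The function name and the log-space evaluation are my own choices (assumed here for numerical stability at large \( x \)):

```python
from math import exp, lgamma, log

def monkey_pmf(x: int, R: float) -> float:
    """Monkey(x, R) = e^{-Rx} (Rx)^{x-1} / x!

    Evaluated in log space so that large x does not overflow.
    """
    if x < 1:
        return 0.0
    return exp(-R * x + (x - 1) * log(R * x) - lgamma(x + 1))

# For x = 1 the formula collapses to e^{-R}:
print(monkey_pmf(1, 0.5))  # e^{-0.5} ≈ 0.6065
```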

I want some assurances for my new distribution. For now, let me just see if it is a distribution at all!

I know I have a distribution (though not necessarily the right one) when the sum of the probabilities is unity.

Restated, I want to prove that \( \sum_{x=1}^{\infty} Monkey(x, R) = 1 \)

## Does \( \sum_{x=1}^{\infty} Monkey(x, R) \) converge?

For this, I can perform the ratio test on the sequence.

\[ a_{x}(R) = \frac{(R x)^{x-1}}{e^{R x} x!} \]

Applying some elbow grease and the well-known limit \( \lim_{x \to \infty} (1 + 1/x)^x = e \), I can show that

\[ \lim_{x \to \infty } \frac{a_{x+1}(R)}{a_{x}(R)} = \lim_{x \to \infty } R e^{-R} \left(1 + \frac{1}{x}\right)^{x-1} = R e^{1-R} \]

The function \( R e^{1-R} \) attains its maximum value of 1 at \( R = 1 \), so the ratio limit is strictly less than 1 for every other positive R. That finishes the proof of convergence for \( R > 0 \) with one exception — the ratio test is inconclusive at \( R = 1 \).
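As a sanity check on that limit, here is a short numeric sketch (Python; the helper name is mine) comparing the actual term ratio against the claimed \( R e^{1-R} \):

```python
from math import exp, lgamma, log

def log_a(x: int, R: float) -> float:
    # log of a_x(R) = (Rx)^{x-1} / (e^{Rx} x!)
    return (x - 1) * log(R * x) - R * x - lgamma(x + 1)

R = 0.5
claimed_limit = R * exp(1 - R)
for x in (10, 100, 1000):
    ratio = exp(log_a(x + 1, R) - log_a(x, R))
    print(x, ratio, claimed_limit)  # the ratio approaches R e^{1-R}
```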

## Convergence when \( R = 1 \)

\[ \sum_{x=1}^{\infty} Monkey(x, 1) = \sum_{x=1}^{\infty} \frac{ e^{- x} (x)^{x-1}}{x!} \]
The limit comparison test is another way to prove convergence.
I just need a convergent series whose terms eventually dominate mine.
I will use \( \frac{1}{x^{5/4}} \), since the integral test shows this p-series (\( p = 5/4 > 1 \)) converges.

\[ \lim_{x \to \infty} \frac{Monkey(x, 1)}{1/x^{5/4}} = \lim_{x \to \infty} \frac{e^{- x} (x)^{x+1/4}}{x!} \]

I can use Stirling's approximation, \( x! = \sqrt{2 \pi}\, e^{-x} x^{x+1/2} (1 + O(1/x)) \), in the denominator:

\[ \lim_{x \to \infty} \frac{e^{-x}x^{x+1/4}}{\sqrt{2 \pi} e^{-x} x^{x+1/2}(1+O(1/x))} = \lim_{x \to \infty} \frac{1}{\sqrt{2 \pi}\, x^{1/4}(1+O(1/x))} = 0 \]

Since the limit is 0 and the comparison series converges, my series converges too.
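A numeric spot check of that comparison ratio (a small Python sketch; the function name is mine) shows it shrinking toward 0, consistent with the Stirling estimate of roughly \( 1/(\sqrt{2\pi}\, x^{1/4}) \):

```python
from math import exp, lgamma, log

def comparison_ratio(x: int) -> float:
    # Monkey(x, 1) / (1 / x^{5/4}) = e^{-x} x^{x + 1/4} / x!, in log space
    return exp(-x + (x + 0.25) * log(x) - lgamma(x + 1))

for x in (10, 1000, 100000):
    print(x, comparison_ratio(x))  # decays toward 0, roughly like x**(-0.25)
```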

Combining the two tests shows I have convergence for all positive R. So I have convergence, but to what?

## Uniform Convergence for \( R > 0 \)

I could now employ the Weierstrass M-test to show a stronger condition, that of uniform convergence.

\[ M_{x} := \frac{e}{x\sqrt{2 \pi x}} \]

It is easy to show (by the integral test) that the series \( \sum_{x=1}^{\infty} M_{x} \) converges.
I can also show that \( | a_{x}(R) | < M_{x} \) if \( R \in (0,\infty) \).
Thus I can demonstrate uniform convergence of \( a_{x}(R) \) per the following reductions.

\[ a_{x}(R)
= \frac{(R x)^{x-1} e^{-R x}}{x!} \]
\[ = \frac{(R x)^{x-1} e^{-R x}}{\sqrt{2 \pi} e^{-x} x^{x+1/2}(1+O(1/x))} \]
\[ \leq \frac{(R x)^{x-1} e^{-R x}}{\sqrt{2 \pi} e^{-x} x^{x+1/2}} \]
\[ = \frac{e^{1-R}(R e^{1-R})^{x-1}}{ x^{3/2}\sqrt{2 \pi }} \]
\[ \leq M_{x} \]
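The bound \( a_{x}(R) \leq M_{x} \) is easy to spot-check numerically. Here is a small Python sketch (names mine), sweeping a grid of R values including the tricky \( R = 1 \):

```python
from math import exp, lgamma, log, pi, sqrt

def a(x: int, R: float) -> float:
    # a_x(R) = (Rx)^{x-1} e^{-Rx} / x!, in log space
    return exp((x - 1) * log(R * x) - R * x - lgamma(x + 1))

def M(x: int) -> float:
    # the Weierstrass majorant M_x = e / (x * sqrt(2 * pi * x))
    return exp(1) / (x * sqrt(2 * pi * x))

for R in (0.1, 0.5, 1.0, 2.0, 10.0):
    assert all(a(x, R) <= M(x) for x in range(1, 500))
print("a_x(R) <= M_x held on the whole grid")
```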

## Complete probability when \( R < 1 \)

I went to Wolfram Alpha to see if it could calculate the value the series converges to. For several test values of R in the interval \( (0,1) \), the sum is unity (or else very close to unity). That's a good indicator! It did not suggest any answer when R was very close to, but greater than, 1.
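Partial sums make the same check reproducible without a search engine. A Python sketch (function names mine; the 20,000-term cutoff is an arbitrary choice that is more than enough here, since the terms decay geometrically for \( R < 1 \)):

```python
from math import exp, lgamma, log

def monkey_pmf(x: int, R: float) -> float:
    # Monkey(x, R) = e^{-Rx} (Rx)^{x-1} / x!, in log space
    return exp((x - 1) * log(R * x) - R * x - lgamma(x + 1))

def partial_sum(R: float, terms: int = 20000) -> float:
    return sum(monkey_pmf(x, R) for x in range(1, terms + 1))

for R in (0.25, 0.5, 0.9):
    print(R, partial_sum(R))  # each value comes out at (numerically) 1
```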

## Incomplete probability when \( R > 1 \)

If \( R > 1 \), my math search engine hacking suggests that the summation converges to a number less than 1.
Why would this be?
Was my probability analysis incomplete or incorrect?
Assuming this limit-value hacking is correct, I interpret the missing probability mass to mean that \( P(M(\infty)) > 0 \): when I am queueing work at an average rate faster than I can process it, there is a positive (non-zero) probability that the work period will never end.
Stated another way, I have to add a new value to satisfy the law of total probability, as follows:

\[ Monkey(\infty, R) = P(M(\infty)) = 1 - \sum_{x=1}^{\infty} Monkey(x, R) \]
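To put rough numbers on this interpretation, here is a Python sketch (the names and the 5,000-term cutoff are my own) estimating the never-ending-work probability for a few values of \( R > 1 \):

```python
from math import exp, lgamma, log

def monkey_pmf(x: int, R: float) -> float:
    # Monkey(x, R) = e^{-Rx} (Rx)^{x-1} / x!, in log space
    return exp((x - 1) * log(R * x) - R * x - lgamma(x + 1))

def p_never_ending(R: float, terms: int = 5000) -> float:
    # P(M(infinity)) = 1 - sum over all finite work-period lengths
    return 1.0 - sum(monkey_pmf(x, R) for x in range(1, terms + 1))

for R in (1.5, 2.0, 3.0):
    print(R, p_never_ending(R))  # grows toward 1 as R increases
```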

## Take a deep breath...

Alright, I'm ready to put the monkey to bed for now. For those who've followed this discussion the whole way through, I took a tangent discussion and have yet to get back on track!!! Hopefully, I will have a breakthrough in my next post (and not a breakdown :) ), where I can satisfactorily describe D and Q based upon R and s. I have a hunch that this Monkey distribution will help... if nothing else I gained the experience solving a similar problem.