Thursday 29 November 2012

Use any dice to calculate the outcome of any probability




While looking at this question, I had a gut feeling that you can use any fair, single die with any number of sides to calculate the outcome of any probability.



Assuming we express the probability as a range of numbers, this is easy for coins and d10s: coins can be flipped to generate a binary number, while d10s can be rolled to produce each digit of the outcome. If the result falls outside the range, ignore it and reroll.



This is really just a matter of number base: the coin generates a result in base 2, while the d10 generates results in base 10. More generally, a die with $n$ sides can be used to produce a result in base $n$.
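As an illustration, here is a minimal Python sketch of that digit-by-digit idea; the function name `uniform_with_die` is hypothetical, and `random.randrange` merely stands in for a physical die:

```python
import random

def uniform_with_die(max_value, sides, roll=None):
    """Draw a uniform integer in [0, max_value] using a fair die with `sides` faces.

    Each roll supplies one base-`sides` digit; if the assembled number exceeds
    max_value, the whole thing is thrown out and redrawn.
    """
    if roll is None:
        # Stand-in for a physical die, returning a digit 0 .. sides-1.
        roll = lambda: random.randrange(sides)

    # How many base-`sides` digits are needed to cover 0 .. max_value.
    digits = 1
    while sides ** digits <= max_value:
        digits += 1

    while True:
        value = 0
        for _ in range(digits):
            value = value * sides + roll()   # append one digit
        if value <= max_value:               # otherwise reject and reroll
            return value

# A d10 generating an outcome between 0 and 993:
print(uniform_with_die(993, 10))
```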



Now consider that we have an arbitrary number of dice, each with an arbitrary number of sides. We could generate an outcome by expressing the result of each die roll in base 2 and tacking them together* (see example). This would, however, result in a lot of rerolling, and is otherwise time-consuming once you factor in converting to base 2 and so forth.



So here's what amounts to a somewhat silly puzzle question:





  • For an arbitrary set of dice, each with an arbitrary number of sides, is there a general method for determining the outcome of any probability while minimizing the number of rerolls?

  • Is there a method which is easy to remember and could reasonably be used during a gaming session (i.e. one that takes less than, say, 30 seconds to determine which dice to roll and calculate the result)?



Example of presented method:
Outcome between 0 and 993, with (hypothetical) 1d7 and 1d21.





  • 993 in base 2 is 1111100001, meaning we need 10 binary digits to express the full range of possible outcomes.

  • 1d21 can provide 4 binary digits (0 through 15 in base 2), and 1d7 provides 2 digits (0 through 3).



Solution:
Roll 1d21 twice and 1d7 once. If the d21 lands higher than 16 or the d7 higher than 4, reroll. Subtract 1 from each roll so the range starts at 0. Convert to base 2. Append results to create one 10-digit binary value. If result > 993, toss it out and reroll.



There is a ~24% chance ($\frac{21-16}{21}$) of needing to reroll the d21 each time, a ~43% chance ($\frac{7-4}{7}$) for the d7, and a ~3% chance ($\frac{1024-994}{1024}$) of needing to reroll the final value.



*Ignoring rolls that are higher than the maximum rollable power of 2, i.e. if you had a d12, you would ignore rolls higher than 8 ($2^3$). This ensures an equal probability for each digit in the result.
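For completeness, here is a rough Python sketch of the method used in the example above (the helper names are mine, and `random.randint` stands in for real dice):

```python
import random

def bits_from_die(sides):
    """Largest b with 2**b <= sides: how many binary digits one roll can supply."""
    b = 0
    while 2 ** (b + 1) <= sides:
        b += 1
    return b

def roll_binary_digits(sides):
    """Roll a d(sides), rerolling anything above the usable power of 2, and return
    (roll - 1) together with the number of binary digits it represents."""
    b = bits_from_die(sides)
    while True:
        r = random.randint(1, sides)
        if r <= 2 ** b:            # e.g. a d21 is kept only on 1..16
            return r - 1, b        # value in 0 .. 2**b - 1

def outcome(max_value, dice):
    """Concatenate the binary digits supplied by each die, rerolling the whole
    value whenever it exceeds max_value."""
    total_bits = sum(bits_from_die(s) for s in dice)
    assert max_value < 2 ** total_bits, "these dice cannot cover the range"
    while True:
        value = 0
        for sides in dice:
            v, b = roll_binary_digits(sides)
            value = (value << b) | v
        if value <= max_value:
            return value

# The worked example: 0..993 from two d21 rolls and one d7 roll (4 + 4 + 2 = 10 bits).
print(outcome(993, [21, 21, 7]))
```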




Edit:



In light of Thomas Andrews' answer, multiple dice can be used to generate a higher $X$ value than one die alone. For a set of dice with numbers of sides $\{k_1,k_2,\dots,k_n\}$ and rolls $\{r_1,r_2,\dots,r_n\}$, the maximum $X$ value will be $k_1k_2\cdots k_n$ and a given roll will produce the value: $$r_1 + (r_2 - 1)k_1 + (r_3 - 1)k_1k_2 + \cdots + (r_n - 1)k_1k_2\cdots k_{n-1}$$
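A short Python sketch of this combination rule (the function name is illustrative only):

```python
import random

def combined_roll(sides):
    """Combine one roll of each die into a single uniform value in 1 .. k1*k2*...*kn,
    following r1 + (r2 - 1)*k1 + (r3 - 1)*k1*k2 + ... from the formula above."""
    value, place = 1, 1
    for k in sides:
        r = random.randint(1, k)       # fair roll of a d(k)
        value += (r - 1) * place       # each die fills one mixed-radix "digit"
        place *= k
    return value

# A d7 and a d21 together behave like a single d147 (7 * 21 equally likely outcomes).
print(combined_roll([7, 21]))
```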


Answer



Yes, given a single random number $X$ which generates elements $\{1,\dots,k\}$, with the probability of rolling a $1$ being $p$, where $0<p<1$, you can determine the outcome of an event of any probability $q\in[0,1]$.

Basically, we're going to pick a real number in $[0,1]$ by successively reducing our range. To pick the real number exactly, we'd have to roll the die infinitely many times, but luckily, with probability $1$, in a finite amount of time the current interval will no longer contain $q$, and we can stop, because then we know whether $q$ is less than or greater than the real number.



We start with the entire interval $[a_0,b_0]=[0,1]$.




At step $n$, we have interval $[a_n,b_n]$. If $b_n<q$ then we halt with success; if $a_n>q$ then we halt with failure.



Otherwise, we roll the die.



If it comes up $1$, we take the next interval as $[a_n,(1-p)a_n+pb_n]$.
If it comes up something other than $1$, we take the next interval to be $[(1-p)a_n+pb_n,b_n]$.



The interval at step $n$ has length at most $\left(\max(p,1-p)\right)^n$. There is no guarantee that the process will stop within a known number of die rolls, but it will stop with probability $1$ in a finite number of rolls.
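Here is a minimal Python sketch of this procedure, assuming the die has already been reduced to a "1 vs. not 1" coin that shows 1 with probability $p$; the function name is mine and `random.random` stands in for the physical roll:

```python
import random

def event_with_probability(q, p, flip=None):
    """Return True with probability q, using only a "coin" that comes up 1 with
    probability p, via the interval-shrinking procedure described above."""
    if flip is None:
        flip = lambda: random.random() < p    # stand-in for rolling the die and checking for a 1

    a, b = 0.0, 1.0                           # current interval [a_n, b_n]
    while a <= q <= b:                        # stop once q falls outside the interval
        cut = (1 - p) * a + p * b             # split point
        if flip():                            # came up 1: keep the left piece
            b = cut
        else:                                 # anything else: keep the right piece
            a = cut
    return b < q                              # whole interval below q => success

# Simulating a 30% chance with a fair d6 read as "1 vs. not 1" (p = 1/6):
print(event_with_probability(0.3, 1/6))
```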




Edit: Ian asked for the expected number of rolls to know where you are.



This is rather complex, and depends on $q$ and $p$ as follows. Given an infinite sequence $\{a_i\}_{i=1}^{\infty}$ with each $a_i\in\{0,1\}$, we can define $R_n=\sum_{i=1}^n a_i$ and $L_n=n-R_n$. We treat the $a_i$ as "left-right" choices in a binary tree.



Then for almost all[*] $q\in[0,1]$ there exists exactly one sequence $\{a_i\}$ such that:



$$q=\sum_{n=1}^\infty a_np^{L_n}(1-p)^{R_n}$$



This has the advantage that if $\{a_i\}$ corresponds to $q_1$ and $\{b_i\}$ corresponds to $q_2$, then the ordering of $q_1$ and $q_2$ can be read off from the lexicographic ordering of $\{a_i\}$ and $\{b_i\}$.


The expected number of rolls is going to depend on the $a_i$ corresponding to $q$.



That said, let $e_p(q)$ be the expected number of rolls. We can define the expected number recursively as follows:



$$e_p(q)=\begin{cases}
1 + (1-p)\,e_p\!\left(\dfrac{q-p}{1-p}\right) & p<q\\[1ex]
1 + p\,e_p\!\left(\dfrac{q}{p}\right) & p>q
\end{cases}$$
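As a rough numerical check, here is a Python sketch (mine, not part of the answer) that unwinds this recursion to a fixed depth:

```python
def expected_rolls(q, p, depth=200):
    """Approximate e_p(q) by unwinding the recursion above to a fixed depth.

    Truncating after `depth` levels changes the result by at most
    max(p, 1-p)**depth / (1 - max(p, 1-p)), so a few hundred levels is plenty.
    """
    if depth == 0:
        return 0.0
    if p < q:
        return 1 + (1 - p) * expected_rolls((q - p) / (1 - p), p, depth - 1)
    if p > q:
        return 1 + p * expected_rolls(q / p, p, depth - 1)
    # p == q is a measure-zero boundary case the recursion doesn't cover;
    # the first split lands exactly on q, so treat it as resolved in one roll.
    return 1.0

# A fair coin (p = 1/2) should average 2 rolls regardless of q:
print(expected_rolls(0.3, 0.5))    # ~2.0
print(expected_rolls(0.3, 1 / 6))  # a heavily biased "coin" takes longer on average
```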



But whether $p<q$ or $q<p$ holds is encoded in the sequence itself: $a_1=1$ exactly when $q>p$, almost certainly.




Finally, if $a_1=0$, then $\frac{q}{p}$ corresponds to the sequence $\{a_2,a_3,\dots\}$ and if $a_1=1$ then $\frac{q-p}{1-p}$ corresponds to $\{a_2,a_3,\dots\}$.



So we see that this expected value is determined by the sequence $\{a_i\}$, but it is a mess to compute.



The value is:



$$\sum_{i=0}^\infty p^{L_i}(1-p)^{R_i}$$



which we can also see because $p^{L_i}(1-p)^{R_i}$ is the probability that $q$ is still in our interval after trial $i$.




This is no more than $$\frac{1}{1-\max(p,1-p)},$$ for any $q$.



If you want more efficient use of the die (I'm using it as a coin toss here) and it has $N$ sides with equal probabilities, then the expected number of rolls is:



$$\sum_{k=0}^\infty \frac{1}{N^k}=\frac{N}{N-1}$$
and is independent of $q$. (That is also true of the original approach when $p=\frac{1}{2}$.)
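For instance, a fair d6 ($N=6$) averages $\frac{6}{5}=1.2$ rolls, while a coin ($N=2$) averages $2$, matching the $\frac{1}{1-\max(p,1-p)}$ bound at $p=\frac{1}{2}$.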

