I was working on a simulation of the St. Petersburg paradox when I realized my coin-flipping code never recorded a streak longer than 15 heads in a row. I ran the simulation 100,000,000 times, which should have produced, on average, about 1,526 streaks of at least 16 heads:

0.5^16 x 100,000,000 ≈ 1526

Clearly, something is wrong.
#include <stdlib.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    int i, lim = 100000000, streak = 0, maxstreak = 0;

    srand((unsigned)time(NULL));    /* seed with the current time */

    for (i = 0; i < lim; ++i) {
        if (rand() % 2) {           /* heads: extend the current streak */
            streak++;
            if (streak > maxstreak)
                maxstreak = streak;
        } else {                    /* tails: streak is broken */
            streak = 0;
        }
    }

    printf("Ran %d times, longest streak of %d\n", lim, maxstreak);
    return 0;
}
Returns the following every time:
Ran 100000000 times, longest streak of 15
Thanks for your help!
Edit: I'm running GCC version 4.6.2 on Windows 7 x64. I'm a bit new to programming in general.
Edit 2: Thanks for everyone's help! For anyone sticking around: what about the current implementation would impose a limit of 15 heads? How could the rand() function be broken in such an interesting way as to produce this problem?