I wasn't able to find anything about the existence of errors in "fair" coin tosses, but what you said makes intuitive sense to me. I think my brain is going to melt, though; I haven't looked at this stuff in ages.
Infinities are really hard, even for people who are good at math. It's easier to use N=1, N=5, N=10, and N=20 and then extrapolate from the trend.

If you flip a coin just once, you expect to get 0.5 heads. But you also expect your error to be 0.5 (or 100% of your expectation). That's because you know you're either going to get 0 heads or 1 head; you can't possibly get 0.5 heads, so you're going to be off by 0.5.

If you flip a coin five times, you expect 2.5 heads. You'll be off by either 0.5, 1.5, or 2.5, but not with equal probability: the chances are 20/32, 10/32, and 2/32, which works out to an expected error of 0.9375 (or 37.5% of your expectation).

So as we flipped more coins, the absolute value of the expected error went up, but the error as a percentage of expectation went down.

If you flip a coin 10 times, you expect 5 heads, and the expected error comes out to about 1.23 (roughly 25%); at 20 flips it's about 1.76 (roughly 18%). Again the expected error goes up in absolute terms but down in percentage terms, and that keeps happening as we raise N. The bigger N gets, the bigger our expected error gets in absolute terms (it grows like the square root of N), even as it shrinks toward zero as a percentage of expectation.
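If you want to check those numbers yourself (or push N higher), here's a quick Python sketch that computes the exact expected error for a fair coin; the function name expected_abs_error is just something I made up for this:

```python
from math import comb

def expected_abs_error(n: int) -> float:
    """Exact E|X - n/2| for X ~ Binomial(n, 1/2), i.e. n fair coin flips."""
    mean = n / 2
    # Sum |k - n/2| weighted by the number of ways to get k heads,
    # then divide by the 2**n equally likely outcomes.
    return sum(comb(n, k) * abs(k - mean) for k in range(n + 1)) / 2**n

for n in (1, 5, 10, 20, 100, 1000):
    err = expected_abs_error(n)
    pct = 100 * err / (n / 2)  # error as a percentage of the expected heads
    print(f"N={n:5d}  expected error={err:9.4f}  ({pct:6.2f}% of N/2)")
```

Running it reproduces the trend above: N=1 gives 0.5 (100%), N=5 gives 0.9375 (37.5%), N=10 gives about 1.23, N=20 about 1.76, and by N=1000 the error has grown to about 12.6 in absolute terms while falling to about 2.5% of the expectation.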