Why do most probability graphs show a bell curve? I've been wondering why... Is it just something natural, like the Fibonacci sequence?

Although it's not the be-all and end-all, the key words here are probably "Central Limit Theorem". – Chappers Aug 01 '17 at 20:41

Just a small note: it's plausible that the more prevalent distribution is actually the log-normal distribution, not the bell curve (normal distribution), even for things like heights and weights ([Limpert et al., BioScience, 2001](https://stat.ethz.ch/~stahel/lognormal/bioscience.pdf)). – Sparhawk Aug 01 '17 at 22:00

Mostly, they are not bell curves. They are *approximately* bell curves. – Steven Alexis Gregory Aug 02 '17 at 07:11

[Nassim Nicholas Taleb](https://en.wikipedia.org/wiki/The_Black_Swan_(Taleb_book)) would like to have a word with you. – Glorfindel Aug 02 '17 at 08:04

I once read that everyone believes the normal distribution is pervasive... for different reasons. Mathematicians believe it is an empirical fact and experimentalists believe it is a theorem :) I will post an answer if I find the source. – Miguel Aug 02 '17 at 08:49

Also worth pointing out: normal distribution is the one [maximizing entropy for specified variance](https://en.wikipedia.org/wiki/Maximum_entropy_probability_distribution#Specified_variance:_the_normal_distribution). – dtldarek Aug 02 '17 at 10:20

Related post on Cross Validated Stack Exchange: [If my histogram shows a bell-shaped curve, can I say my data is normally distributed?](https://stats.stackexchange.com/q/129417/22228) – Silverfish Aug 02 '17 at 14:20

In Tukey and Mosteller's textbook "Data Analysis and Regression" (1977), they point out that many (even most) of the popular examples of 'bell curves' deviate substantially from a normal distribution. That is, the popular idea that most distributions are normal is arguably a misconception, at least to a certain extent. Naturally, reality is more complicated than that. – Jacob Maibach Aug 02 '17 at 15:05

Is the probability that a probability graph will show a bell curve itself a bell curve? – Strawberry Aug 02 '17 at 16:09

Not perfectly related, but worth pointing out because it's a pet peeve of mine. Bell curves occur frequently, but that doesn't mean you should "force" one to appear. For example, I once had a professor try to explain that his final exam was going to be "very hard" so as to force the class's grades into a curve. It was a 400-level course with 5 people. As we were all very knowledgeable and the sample size was small, expecting a bell curve here would be a mistake. – SouthShoreAK Aug 02 '17 at 16:19

@Miguel "Everybody believes in the exponential law of errors: the experimenters, because they think it can be proved by mathematics; and the mathematicians, because they believe it has been established by observation." – Gabriel Lippmann – Acumen Simulator Aug 02 '17 at 16:37

@SouthShoreAK most class grades are bimodal anyway... grading on a curve is lazy imo. Either you know the material or not. They do this because they're not sure what material is useful and what is not. – Acumen Simulator Aug 02 '17 at 16:42

@Miguel it was Nassim Taleb in *The Black Swan* (see Glorfindel's comment for the link). His point was that the distribution fits many *physical* processes (e.g. distribution of coin flips) but is mistakenly applied to *social* ones (e.g. distribution of book sales). – Jared Smith Aug 02 '17 at 17:00

They just seem to resonate with some people. – Daniel R Hicks Aug 02 '17 at 21:37

@user38826 Curve grading is a cover for the teacher not knowing how hard his test is. – Loren Pechtel Aug 02 '17 at 21:42

There are bells of different shapes. – Carsten S Aug 03 '17 at 10:44

Because a lot of complex systems can be modeled as the sum of less complex processes. Regardless of how these processes are distributed, the sum is then distributed normally (yeah, I know this is putting it a bit boldly). A psychology professor once said: "IQ is distributed normally, because intelligence is the sum of a lot of independent factors". – Willem Van Onsem Aug 04 '17 at 13:04

God only knows. – Michael Kay Aug 05 '17 at 09:31

Try this: let your samples x be 1/u, where u is a uniform random number. Easy to try: in Excel just put =1/RAND() into a bunch of cells. I don't think this will look like a bell curve. Now, differentiate the cumulative to get the PDF and try to calculate what the mean and standard deviation might be. Good luck. – richard1941 Aug 09 '17 at 19:34
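For anyone who wants to try richard1941's experiment without Excel, here is a small Python sketch (my addition, not part of the original comment). The point it illustrates: $1/U$ has a heavy right tail and in fact no finite mean, so it looks nothing like a bell.

```python
import random

random.seed(0)
samples = sorted(1 / random.random() for _ in range(100_000))

median = samples[len(samples) // 2]   # theoretical median is exactly 2
mean = sum(samples) / len(samples)    # theoretical mean is infinite

# The median sits near 2 (since P(1/U <= 2) = P(U >= 1/2) = 1/2), but the
# sample mean is dragged far above it by the heavy right tail, and keeps
# growing as you take more samples.
print(median, mean)
```

The growing gap between median and mean is the signature of a fat-tailed distribution, exactly the kind of thing the comment invites you to discover.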

@dtldarek Your comment made me think about the fact that Gaussians are the minimum-uncertainty wave functions in quantum mechanics. Do you think this may be connected with them being maximum entropy distributions for a specified variance? – Meep Apr 22 '18 at 20:23

@Glorfindel Does [he](https://math.stackexchange.com/users/88078/nero) ? – BCLC May 24 '18 at 18:42

@BCLC now I'm curious where you found that (there's no indication on the user's profile), and if it's true, he deserves to be mentioned [here](https://meta.stackexchange.com/q/93774/295232) – Glorfindel May 24 '18 at 21:35

@Glorfindel https://mobile.twitter.com/clipperhouse/status/409805959750377472 NNT posts but later deletes tweets to his SE questions. He's just that humble. – BCLC May 25 '18 at 05:00

I think that all normal distributions are bell curves but not all bell curves are normal distributions. Did you mean "Why does the normal distribution occur everywhere?" – Timothy Nov 26 '18 at 05:20
12 Answers
Why do most probability graphs show a bell curve?
As you suspect, there is a natural tendency for distributions to be bell-shaped.
There are some distributions that are not bell-shaped at all. For example, the outcome of a roll of one fair die is a discrete uniform distribution:
By IkamusumeFan (own work). This drawing was created with LibreOffice Draw, CC BY-SA 3.0, Link
The roll of one die is a pretty simple process. What about the sum of two dice? The Wizard of Odds illustrates:
Starting to look a little like a bell, right? What about the totals of three, or four dice? Wolfram MathWorld provides a nice illustration:
You can see where this is headed. Nature is full of complex processes. How tall are you? Well, it depends on genetics, nutrition, exercise, injuries, bone loss, and so many more things. The central limit theorem shows (see symplectomorphic's comment below) that when a large number of such things are added together, the resulting distribution is not just any bell-looking curve, but specifically the normal distribution. Or, for things that combine multiplicatively, the log-normal distribution.
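This convergence is easy to watch in a quick simulation (a sketch of my own, not part of the original answer; the trial count is arbitrary): the empirical distribution of dice totals grows a peak in the middle as dice are added.

```python
import random
from collections import Counter

random.seed(42)

def dice_sum_distribution(n_dice, trials=50_000):
    """Empirical distribution of the total of n_dice fair six-sided dice."""
    totals = (sum(random.randint(1, 6) for _ in range(n_dice))
              for _ in range(trials))
    return Counter(totals)

for n in (1, 2, 4):
    counts = dice_sum_distribution(n)
    mode, freq = counts.most_common(1)[0]
    # One die is flat; with more dice the middle totals dominate.
    print(f"{n} dice: mode={mode}, P(mode) ~ {freq / 50_000:.3f}")
```

With one die every total is about equally likely; with two, the total 7 dominates; with four, the histogram already looks like a bell.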
Why does this happen? mathreadler's answer hints that it has to do with convolving distributions. The probability density function of a single die is a rectangular function (technically discrete, but let's pretend it's continuous). The density of the sum of two rolls is then the convolution of two rectangular functions.
By Convolution_of_box_signal_with_itself.gif: Brian Amberg; derivative work: Tinos (talk). Convolution_of_box_signal_with_itself.gif, CC BY-SA 3.0, Link
Notice how the result (the black triangle) looks like the case of two dice above. If you then convolve this triangle with another rectangle, you get three dice. The more times you do this, the closer the result gets to a normal distribution.
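The convolution picture can be checked directly. Here is a small sketch (my own illustration, not from the original answer) that convolves the uniform pmf of one die with itself: once gives the triangle, twice gives an already bell-like curve.

```python
def convolve(f, g):
    """Discrete linear convolution of two probability mass sequences."""
    out = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out

die = [1 / 6] * 6            # pmf of one fair die (faces 1..6)
two = convolve(die, die)     # the triangle: pmf of the total of two dice
three = convolve(two, die)   # pmf of three dice, visibly bell-shaped

# Index 5 of `two` corresponds to a total of 7, the peak: 6/36.
assert abs(two[5] - 6 / 36) < 1e-12
# For three dice the peak probability is 27/216, at totals 10 and 11.
print(max(three))
```

Each extra convolution smooths the curve further, which is the mechanism behind the central limit theorem in this discrete setting.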
The probability density function of the normal distribution is a Gaussian function, which has some elegant properties:

- A Gaussian convolved with a Gaussian is another Gaussian.
- The product of two Gaussians is a Gaussian.
- The Fourier transform of a Gaussian is a Gaussian.
From this you might intuitively see that as things converge toward normal distributions, they "want" to stay normal, since their "Gaussianness" is preserved under many operations.
Of course, not everything is as simple as a single die roll, nor as complex as the determination of the height of a human. So there are a large number of distributions that look like a bell but, on careful examination, aren't the normal distribution. Some of them exist in nature, and some find application as mathematical tools for some purpose. Looking through Wikipedia's list of probability distributions, you can see that bell-like shapes are quite common, even when they aren't exactly the normal distribution.
But if you combine these two things:

- the central limit theorem makes the normal distribution itself common, and
- many other distributions look like bells without being the normal distribution,

you might conclude that most probability graphs show a bell curve.

"The central limit theorem shows that when adding the sum of a large number of things together..." It is worth pointing out that "the" central limit theorem (the classical statement) doesn't say this; it requires the "things" to be *identically distributed* (and indeed it doesn't say the sum is normal; it implies it is only approximately normal). The Lyapunov generalization allows us to drop the assumption that the variables are identically distributed. – symplectomorphic Aug 02 '17 at 12:47

While the answer is fine and accepted, IMO it could be further improved by reducing the number of examples/"motivation" (which the OP has already stated) and focusing more on "why" it actually looks like a bell shape. One could also mention that there are two interpretations of "bell" here: one, the simple fact that it visually looks like the physical object "bell"; the other, that "bell curve" (the term) *is* usually just a name for the "normal distribution". It is unclear whether the OP is aware of that. – AnoE Aug 04 '17 at 16:09

@AnoE I'm a physicist, and I found the examples in the answer very helpful. Maybe it's just my different way of thinking. – user3653831 Aug 04 '17 at 16:52

@AnoE Not a bad idea. I added a little bit, while trying to keep it accessible. – Phil Frost Aug 04 '17 at 17:44

"Probably because most distributions look like a bell of some kind." I don't like this reasoning. One could define a lot of horse-like distributions and give names to them, and this would not imply by any means that horse-like distributions appear everywhere. The fact that most distributions listed on Wikipedia are bell-shaped is not an explanation at all, IMHO. – Pedro A Aug 04 '17 at 19:31

@Hamsteriffic Well yes, though the difference is that the large number of bell-shaped distributions exist because they are useful, whereas your arbitrary horse-like distributions have no application whatsoever. Though perhaps applications of horse distributions are an active area of research? – Phil Frost Aug 04 '17 at 19:40

But from that you fall into a tautology. Yeah, Wikipedia wouldn't list useless distributions. It lists only the useful ones. OK. Now, it is no surprise that the useful ones are mostly bell shaped, since the bellshaped ones are the ones that appear everywhere. The fact that Wikipedia lists mostly those is an evidence that indeed bellshaped curves appear everywhere, but not an explanation. The actual explanation is the rest of your answer, about convolutions. – Pedro A Aug 04 '17 at 19:49

(by the way, I +1ed and liked the rest of your answer a lot. I'm saying that the first paragraph could be improved, or maybe moved to the end of the answer. It is not really an explanation, but is indeed a nice addendum) – Pedro A Aug 04 '17 at 19:52



I really liked this edit. Thank you very much for taking the time to do it! Would +1 again if I could! :) – Pedro A Aug 04 '17 at 20:17



CAUTION: The central limit theorem only works when the underlying distributions have a variance. Some distributions, such as Pareto, DON'T have a variance, so the central limit theorem does not work for them, nor do other theorems on the probability of extreme values. – richard1941 Aug 09 '17 at 19:39
The convolution of two functions is at least as nice as the nicest of the two (often even nicer), and the sum of two independent random variables has a density which is the convolution of their density functions. So as they convolve more and more when we add them up, they become nicer and nicer, and the Gaussian function is the nicest in the world!


This is much better than the answers that just say "central limit theorem", in that it much more nearly gets to the heart of the issue of *why* sums (and consequently, means) of several independent random variables (for various values of "several", as the situation demands) would look bell-shaped, but it needs a bit more clarification of what is being claimed and justification of why it's true. – Glen_b Aug 04 '17 at 00:34

If "The convolution of two functions are at least as nice as the nicest of the two", then the nicest function is $0$. You could argue that Gaussians are the nicest nonzero functions. – Daniel Fischer Aug 04 '17 at 18:20

This answer is cute, but it makes no sense unless you already understand the CLT. – jds Jan 24 '19 at 22:30
I think it is important to distinguish between general bell-shaped curves, which need not be normal, and the normal distribution itself. For the latter, the key notion, as already mentioned and elaborated, is the Central Limit Theorem: if some slightly technical conditions are satisfied, then sample averages and sums converge (weakly) to the normal distribution. However, $(1)$ not every bell-shaped curve is normal, and $(2)$ not everything that we assume is normal is indeed normal. As already mentioned in the comments, a lot of biological variables are definitely not normal (like heights and weights of humans), yet they are bell-shaped and can be approximated with very high precision by the Gaussian distribution. You can encounter the same relation with the exponential distribution as a model for the life duration of a machine or the like: lifetimes are definitely not really exponential, as the machine cannot "live" forever.
As such, maybe a better model would be truncated distributions. E.g., heights may be well described by a two-sided truncated normal distribution. But what are the problems in this case? $(1)$ If the truncation values (parameters) are unknown, you have to estimate them, and $(2)$ the truncation may introduce much more complexity into your calculations. So the basic question in statistical modelling (IMHO) is not "whether this or that variable follows the normal distribution" but whether a Gaussian random variable can give us a good approximation of its distribution. Take the heights of healthy adult males in Scandinavia, for example. A good model for their height distribution will probably be $N(187, 10)$; a more accurate model would be the same normal distribution truncated at $5$ standard deviations above and below the mean, i.e., with support $[137, 237]$. Your gain in precision in estimating probabilities is absolutely negligible, as this truncation adds less than $0.001$ to the "mass" of the bell on $[137, 237]$. The same logic applies to the machine's life-duration example. A truncation at a $\tau$ far above the expected value gives you a correction factor of $1/P(X<\tau) = 1/(1-e^{-\lambda \tau})$, which is almost exactly $1$, so you have no practical gain from it.
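The claim that the truncation correction is negligible is easy to verify with the standard normal CDF; here is a quick sketch (my own, using the $N(187,10)$ height example from the answer):

```python
import math

def std_normal_cdf(z):
    """CDF of the standard normal distribution, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Mass of any normal distribution lying outside +/- 5 standard deviations,
# e.g. outside [137, 237] for the N(187, 10) height model above:
tail_mass = 2 * (1 - std_normal_cdf(5.0))

# Renormalizing the truncated model rescales every probability by this
# factor, which is practically indistinguishable from 1:
correction = 1 / (1 - tail_mass)
print(tail_mass, correction)
```

The mass beyond five standard deviations is well under $0.001$ (it is below $10^{-6}$), so the untruncated normal model loses essentially nothing.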
Another issue, already mentioned, is that asymptotic normality is not a universal truth. N. N. Taleb called the Gaussian distribution "the great intellectual fraud", by which, to the best of my understanding, he meant that in finance the exponential decay ("thin" tails) is a very uncommon feature. Take Cauchy's distribution, for example: it also has a bell-shaped form, yet due to its relatively "fat" tails it does not even have a finite mean. This distribution would serve as a bad approximation of the heights of humans, because negative heights or "gigantically" large values (say, over $4$ meters) are biologically impossible; hence a fat-tailed distribution that puts non-negligible weight on extreme values is improper mathematical modelling of such a variable (height). In revenues it may be vice versa: assuming normality is the same as assuming negligible probabilities for very large gains or losses, which, as we all know, may simply be wrong.
To sum it up, it seems that symmetry and general "bell" shape is indeed prevalent in the real world distributions. However, strict normality as described by
$$
f_{X}(x) = \frac{1}{\sqrt{2 \pi \sigma^2}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\},
$$
is mostly just an approximation of the actual distribution. More accurate models that are still "bell"-shaped may be better for calculating various parameters, but the small gain from the more precise model is usually not worth the higher complexity it introduces. Hence the regular normal distribution remains, in many cases, not only a good approximation but also a very convenient one. Finally, it is worth mentioning that there are some "domains", like finance, where the bell-shaped distributions are mostly non-normal ones, and assuming normality there may be wrong.

This is a really nice answer, and with a finite sample, [it can be very hard to tell the different "bells" apart](https://stats.stackexchange.com/q/129417/22228) – Silverfish Aug 02 '17 at 14:23
As the comments suggest, the answer is the Central Limit Theorem:
In probability theory, the central limit theorem (CLT) establishes that, in most situations, when independent random variables are added, their properly normalized sum tends toward a normal distribution (a bell curve) even if the original variables themselves are not normally distributed. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions.
That suggests that natural independent variables on reasonable sample sizes tend to follow what we call normal distributions, or Gaussians.

Could you provide a little more 'digestible' evidence, please? Great answer, though! – Shreyas Shridharan Aug 01 '17 at 20:56

this is the only answer; to understand it you'll have to do some research of your own – Matt Aug 02 '17 at 08:33
I'll try to answer this with some intuition.
The "bell curve" (actually called the normal distribution) is one probability distribution that can be seen in the natural world. These are things like the distribution of heights and weights, where most people lie in the middle, with fewer occurrences at the higher and lower ends. How many 5-foot-6 people do you know? How about 6-foot-9? 4-foot-11?
As mentioned by @Chappers, the "Central Limit Theorem" is one key explanation. Loosely speaking (very loosely speaking), this theorem says that if you repeat an experiment over and over, the distribution of the averages will follow this normal distribution.
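A small sketch of that statement (my own, with the uniform distribution picked arbitrarily as the underlying "experiment"): averages of repeated draws pile up around the true mean, with a spread that shrinks like $1/\sqrt{n}$.

```python
import random
import statistics

random.seed(1)

def sample_means(n, reps=20_000):
    """Means of n uniform(0,1) draws, repeated reps times."""
    return [statistics.fmean(random.random() for _ in range(n))
            for _ in range(reps)]

for n in (1, 4, 16):
    spread = statistics.stdev(sample_means(n))
    # A single uniform(0,1) draw has sd 1/sqrt(12) ~ 0.289; averages of
    # n draws have sd 1/sqrt(12 n), the CLT scaling.
    print(f"n={n}: sd of the averages ~ {spread:.3f}")
```

A histogram of the $n=16$ averages would already look convincingly bell-shaped even though each individual draw is flat.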

I think you should be more explicit about why the normal distribution is so common: it shows up in any quantity that has multiple independent inputs. Which means any reasonably complicated process (and nature sure is full of those) will generally have a normally-distributed output. – eyeballfrog Aug 01 '17 at 21:06

Also, the normal distribution does **not** show up as "distribution of heights and weights" (have you ever seen someone weigh 5 kg?). But it very well **approximates** them. – Clement C. Aug 01 '17 at 22:14

The normal distribution is an example of a bell curve, not an alternative name for it. While it is true that you'll sometimes get people saying bell curve when they mean normal distribution, this is not strictly correct. There are many other examples of a bell curve that are not the normal distribution (e.g. Student's t, Cauchy, etc.). – Jack Aidley Aug 02 '17 at 14:49

@JackAidley I'm not sure there is a standard usage for the term "bell curve". I have personally never seen it used for anything but the normal distribution (or at least the family of gaussian distributions). – Jacob Maibach Aug 02 '17 at 15:37

@JacobMaibach In first courses it's usually used for the normal distribution. I mean, that's really the only other distribution talked about in those classes, along with the binomial. – Sean Roberson Aug 02 '17 at 16:54

I don't think this answers the question. It doesn't actually answer why the normal distribution occurs in nature. – Timothy Nov 26 '18 at 05:40

@eyeballfrog That's not really right. It would be more correct to say "any quantity that has **many** (not just multiple) independent inputs whose contributions **add**". The assumption that they simply add with no interaction effects (which is often reasonable but rarely exact) illustrates both why *approximate* normal distributions are common in nature and why subtle deviations from normality are also quite common. – tparker Dec 20 '20 at 05:02
This is a question I've had myself, and from my experience talking with physicists, mathematicians, biologists, and engineers, their comment on a question like this was: a bell curve often represents a distribution of averages, which is to say, mathematically, that the distributions of averages of measurements always tend to form a bell-curve shape.
With this in mind, this shape naturally follows from the fact that most of our measurements are actually averages of many convoluted interactions on a smaller scale. "What is the blood pressure of a person?" Well, blood pressure is an average of many averages of interactions among even more cells and chemicals. "What is the intelligence of a person?" Well, IQ is an average of many different culminations of cognitive interactions in the brain. "What is the temperature in this room?" Temperature is an average of the distribution of kinetic energy of many fast-moving particles in an object or the air. Even though all of these things have many small fluctuations, they average out to the most common value with a bell distribution.

I don't think this answers the question. It's pretty much saying you should assume the normal distribution occurs in nature because you were told so but you didn't give a reason. – Timothy Nov 26 '18 at 05:42

@Timothy I disagree; I think that this is probably the best answer to this question. To me, simply invoking the CLT, as most of the other answers do, misses the heart of the question, which is to justify why so many real-world phenomena would be described by a sum of many iid random variables. – tparker Dec 20 '20 at 05:08
I found the bean machine to be the most intuitive example to justify the normal distribution. As rows are added to the bean machine, the resulting distribution becomes an increasingly good approximation of the normal distribution. This is ultimately a consequence of the central limit theorem.
Notice the relationship with, say, coin flipping. If at each peg we go left on tails and right on heads, then to reach the far-left bin I'd need a long streak of tails. By comparison, the central bins are more common because many different paths consisting of heads and tails land you in the central bins, unlike the outermost bins. Taking this binomial distribution to the limit as we increase the number of flips (or rows in the bean machine), we again have the normal distribution. This is the most intuitive justification I've found.
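The bin counts are easy to reproduce in code (a sketch of my own; 10 rows and 20,000 balls are arbitrary choices):

```python
import random
from collections import Counter

random.seed(7)
ROWS, BALLS = 10, 20_000

# Each ball hits ROWS pegs, going right on heads (1) and left on tails (0);
# its final bin is just the number of heads, a Binomial(ROWS, 1/2) draw.
bins = Counter(sum(random.randint(0, 1) for _ in range(ROWS))
               for _ in range(BALLS))

# Far more head/tail paths lead to the central bin than to the edges:
# 252 paths reach bin 5, but only 1 path reaches bin 0.
print(bins[5], bins[0])
```

The path-counting argument in the answer is exactly the binomial coefficient $\binom{10}{k}$, and the simulated bin counts track it closely.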
Adding to the above answers, Gaussian functions also have nice computational properties: their Fourier transforms are Gaussian too, and Fourier transforms can be used to compute fast convolutions (like in one answer related to convolutions).
They can be used both as filters and windows with positive weights, and possess some optimal concentration in both domains, as famous from the Weyl–Heisenberg–Pauli so-called "uncertainty principle" (also SE.physics on Heisenberg/Gaussian waves).
They can also be found in the Azuma–Hoeffding inequality for martingales with bounded differences, and many others.
In vision, they form optimal scale-space structures, in relation with the heat equation.
However, they tend to be a little less efficient on (very) discrete data and outside least-squares optimization.

I am happy that you mention the heat equation (nobody else has mentioned it), whose so important 1D solution is an evolving Gaussian with $\sigma = \sqrt{t}$. – Jean Marie May 08 '19 at 20:07
The bell curve, currently known as the normal distribution and formerly known as the exponential law of errors, basically says that there is a nominal value, and errors from this nominal value decrease in frequency the further you get from it.
Now the reason it is seen in nature comes from the variety of forces acting to generate an outcome. The distribution becomes normal when you have several different forces of varying magnitude acting together. Generally, the more forces then the more normal the distribution will become.
This occurs a lot in nature, which is why the normal distribution is so prevalent. Say, for example, you are shooting a basketball; most likely you will at least hit the rim, since you are the dominant force acting on the ball. However, there are other forces as well: footing, wind, fatigue, elbow position, etc. So the main force dominates where the ball will go, but many of the other forces have a small impact, which generates variability.
Now in practice, the normal distribution is used more than it is observed in nature because it has several desirable qualities, is simple to understand, and its shortcomings are often minute; so it serves as a good approximation.
It also plays a large role since most statistical estimators are ideally normal; so most methods strive for this, since a normal estimate is unbiased and centered, with errors decreasing in frequency the further they are from nominal.
Heat is random motion. The rate of flow of random motion behaves like the flow of heat.
The rate of change of heat at a position is the rate of heat going in minus the heat going out. The rate of heat transfer is proportional to the difference in temperature. In one dimension,
$$\frac{\partial u}{\partial t} = k \frac{\partial^2 u}{\partial x^2}$$
Suppose you have a spike of heat, all the heat in one place: $$u(x,0)=\delta(x)$$ where $\delta$ is the Dirac delta function.
With $\sigma^2 = 2kt$ and $\mu=0$, the normal (Gaussian) distribution is a solution to this initial value problem. $$\Phi(x,t)=\frac{1}{\sqrt{4\pi kt}}\exp\left(-\frac{x^2}{4kt}\right)$$
Over time the heat will fall down into a bell curve. This gives an intuitive idea of how randomness flows away from an initial exact position towards the two tails of the bell curve.
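The claimed solution is easy to sanity-check numerically with central differences (a sketch of my own; the sample point and constants are arbitrary):

```python
import math

K = 0.5  # the diffusion constant k in the heat equation u_t = K * u_xx

def phi(x, t):
    """Fundamental solution of the heat equation: a spreading Gaussian."""
    return math.exp(-x * x / (4 * K * t)) / math.sqrt(4 * math.pi * K * t)

# Verify u_t = K * u_xx at one sample point using central differences.
x, t, h = 0.7, 1.3, 1e-4
du_dt = (phi(x, t + h) - phi(x, t - h)) / (2 * h)
d2u_dx2 = (phi(x + h, t) - 2 * phi(x, t) + phi(x - h, t)) / h**2
print(du_dt, K * d2u_dx2)
```

The two printed numbers agree to several decimal places, confirming that the Gaussian with variance growing linearly in time solves the equation.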
While many of the explanations above are quite excellent and bring to light the CLT and the convergence of distributions to normal-shaped things, there is a point that I think needs to be emphasized (and perhaps adds a different viewing angle).
When first reading this question, the answer I came up with did not land immediately on the CLT. You can break the question down into two parts:
- Why do the functions taper off at the ends?
- Why are the functions unimodal (only a single hump in the middle)? Why are they symmetric?
To answer the first question, I draw on properties of distributions in general, specifically the normalization requirement of the PDF, $\int_{-\infty}^{\infty} f(x)\,dx = 1$. For this requirement to hold, a sufficient (but not necessary) condition is that $\lim_{x\rightarrow\infty}f(x)=0$ and $\lim_{x\rightarrow-\infty}f(x)=0$. This means you will very often end up with curves that are "fat" in the middle and taper off at the ends. It is a natural consequence of the normalization condition that most of the "interesting" stuff in a distribution happens in the middle (but of course there are degenerate counterexamples to this sweeping rule).
The answer to the second question is exactly the CLT, as others have pointed out. But let's take it for a different spin. Suppose you have a uniform distribution on the interval $(0,1)$. Begin "drawing" from this distribution, noting the values you get when you do. One big observation about this distribution is that, necessarily, $\frac{1}{2}$ of the probability lies below the half-way mark ($0.5$ in this case). Said another way, you are equally likely to get a value above the half-way mark as below it. So as you "draw" from the distribution, the values of the samples will naturally tend to balance themselves out. For example, note that $P(X<.25)=P(.75<X)$ for uniformly distributed $X$. Thus you'll end up with a hump in the middle (that's the key behind the other posts about Galton's beans).
So why bring up the uniform distribution? In a lot of cases, when you want to model a starting assumption of zero information (an uninformed model), you start with some "reasonable" bounds and assume that the values are uniformly distributed. Then you sample and update the model. The updated model will have some symmetry because of the property I just mentioned. If you want to cut out a "large bin" of probability this symmetry will force you to do it from the middle. Hence, the unimodal behavior.
Is it "Gaussian"? Not necessarily, as some other posters have mentioned. Long tails can occur under the right conditions (e.g. if you start off with something that is not completely uniform but more $\beta$-shaped, with a flat middle and high but short "ends"). But is it "bell-shaped"? Yes.

"So why bring up the uniform distribution? In a lot of cases, when you want to model a starting assumption of zero information (an uninformed model), you start with some "reasonable" bounds and assume that the values are uniformly distributed." Actually, when you assume the uniform distribution, you might be injecting information through the back door. The information measure (self entropy) might be less than for a uniform. Generally, for a bounded distribution, the minimum entropy occurs with a truncated exponential distribution. – richard1941 Aug 09 '17 at 19:41

Uniform distribution is least information when the mean is known to be 1/2. The accursed edit time limit cut me off! – richard1941 Aug 09 '17 at 19:48

@richard1941 You make a fair point. I'm mostly just banking on the symmetry of the starting distribution. Auto-convolution will preserve symmetry and thus the resulting distributions on the sum will also be symmetric. As long as the starting distributions are bounded in the $L_2$ norm, the measure will concentrate in the middle because of the convolution. This will get you the hump-in-the-middle bell shape (though it may not be Gaussian). If we lose either of those conditions, the guarantee on convergence goes with them. – James S. Aug 11 '17 at 16:32
A bell curve is a Gaussian distribution curve. It occurs very frequently. Like a pile of sand you slowly let fall from your hand on the beach: the shape of the pile, in one plane, is such a curve. It's everywhere from the beach to the atom.
But with all this theorising above, don't lose sight of the fact that mathematics is merely a human construct that reflects the way that matter and spacetime interact with each other and themselves. Mathematics does not dictate the way the Universe behaves; it merely reflects it.

A pile of sand is better approximated by a cone. Or a triangle, in cross-section. https://en.wikipedia.org/wiki/Angle_of_repose – Phil Frost Feb 17 '18 at 12:58