A quick look at the Wikipedia entry on mathematical constants suggests that the most important fundamental constants all live in the immediate neighborhood of the first few positive integers. Is there some kind of normalization going on, or some other reasonable explanation for why we have only identified interesting *small* constants?

**EDIT:** I may have been too strong in some of my language, or unclear in my examples. Which constants are the most "important" or "interesting" is certainly debatable, and there are many important and interesting very large numbers. Therefore I would like to make two revisions.

First, to give a clearer idea of the numbers I had in mind, please consider such examples as $\pi$, $e$, the golden ratio, the Euler–Mascheroni constant, the Feigenbaum constants, the twin prime constant, etc. Obviously numbers like $0$, $1$, $\sqrt2$, $...$, while on the Wikipedia list, are in some sense "too fundamental" for consideration.
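To make the claim concrete, here is a small Python sketch listing the constants above. Note that $\pi$, $e$, and the golden ratio are taken from (or computed with) the standard library, while the last three values are quoted well-known decimal approximations rather than computed here:

```python
import math

# Approximate values of the constants in question, illustrating that they
# all fall within a few units of zero. The first three come from the math
# module; the last three are quoted decimal approximations, not computed.
constants = {
    "pi": math.pi,                           # 3.14159...
    "e": math.e,                             # 2.71828...
    "golden ratio": (1 + math.sqrt(5)) / 2,  # 1.61803...
    "Euler-Mascheroni gamma": 0.5772156649,  # quoted approximation
    "Feigenbaum delta": 4.6692016091,        # quoted approximation
    "twin prime constant": 0.6601618158,     # quoted approximation
}

for name, value in constants.items():
    print(f"{name}: {value:.6f}")
```

Every entry lies between $0$ and $5$, which is the pattern the question is asking about.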

This leads me to my second revision: the constants I am trying to describe are (or appear to be) irrational. Perhaps this is a clue to what makes them interesting. At the very least, it leads me to believe that **large integer counterexamples do not satisfy the question as I had intended**.

Finally, if I could choose a better word to describe such numbers, it might be "auspicious" rather than interesting or important. But I don't really know if that's any better or worse.