Most of our current mathematical knowledge was developed to explain something already observed empirically. Going way back, many early civilizations had no concept of "zero" as a numerical quantity; the concept of "nothing" or "none" existed, however, and eventually the Babylonians began using a placeholder symbol for an empty position alongside their numerals, while treating "nothing" as a number in its own right came later still. Newton laid the foundations of what we know today as calculus (also developed independently by Leibniz) in order to mathematically explain and calculate the motion of celestial bodies (and of projectiles here on earth). Einstein adopted the tensor calculus of Ricci and Levi-Civita in order to give general relativity its mathematical backing.

It can also, however, happen in reverse. Usually this is when "pure" math exhibits some "oddity", such as a divergence or discontinuity in an "ideal" formula that otherwise models real-world behavior very closely, or a result originally dismissed as a practical impossibility. Then we find that real-world behavior actually follows the math even in these "edge cases", and that it was our understanding of the way things worked that was wrong. Here's one from physics which touches on some of the most basic grade-school math and yet challenges those very foundations of thought: **negative absolute temperature**.

Temperature, classically, is a measure of the thermal energy in a system. By that definition, a system can never have less than no energy; hence the concept of "absolute zero". Most "normal" people hold to this concept and think of zero kelvin as a true absolute: you can't go lower than that.

However, the theoretical, more rigorous definition of temperature has as its defining characteristic the ratio between the change in a system's energy and the change in its entropy. As you add total energy to a system, some remains "useful" as energy, while some is lost to entropy (natural disorder). The lost portion is still there (First Law of Thermodynamics), but can no longer do work (Second Law of Thermodynamics).
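In symbols, this is the standard statistical-mechanics form of the definition (not spelled out in the prose above):

```latex
\frac{1}{T} = \frac{\partial S}{\partial E}
\qquad\text{or equivalently, for small changes,}\qquad
T = \frac{\Delta E}{\Delta S}
```

where S is the entropy and E the total energy. The sign of T is therefore simply the sign of the entropy change produced by a small positive energy change.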

The graph of temperature under this definition has computable negative values: if entropy and energy are ever inversely related (entropy falls as energy rises, or vice-versa), then this fraction, and thus the temperature, is negative. Even more interesting is what happens where the change in entropy reaches zero: the fraction involves a division by zero, and the temperature diverges, running off to positive infinity on one side of that point and returning from negative infinity on the other. The graph therefore predicts that this boundary is a state not of any particular amount of energy, but of zero change in entropy, regardless of the amount of energy in the system, and that beyond it lie negative temperatures: states with enormous amounts of energy in which adding still more energy actually *reduces* the entropy.
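The sign flip is easy to see in a toy model. The sketch below is my own illustration, not something from any experiment discussed here: N two-level particles, each either in its ground state or an excited state, where the entropy is the log of the number of ways to choose which particles are excited.

```python
import math

# Toy model (an assumption for illustration): N two-level particles, each in
# its ground state or an excited state of energy 1 (arbitrary units).
# With n particles excited, total energy is E = n and the entropy is the
# log of the number of microstates: S = ln C(N, n)  (Boltzmann's k = 1).
N = 100

def entropy(n):
    return math.log(math.comb(N, n))

def temperature(n):
    # T = dE/dS, estimated by a centered finite difference (dE = 2).
    dS = entropy(n + 1) - entropy(n - 1)
    return 2.0 / dS

print(temperature(25))  # entropy rising with energy: T is positive
print(temperature(75))  # entropy FALLING as energy rises: T is negative
```

Below half-filling (n < 50) entropy grows with energy and the temperature is positive; above half-filling entropy shrinks as energy grows, and the very same formula returns a negative temperature.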

This used to be discounted out-of-hand; until recently, every thermal system known to man exhibited a direct relationship between energy and entropy. You could keep adding all the energy you wanted, to infinity, and entropy would continue to increase as well. You could keep cooling a system, removing all the energy you possibly could, and entropy would keep decreasing. Again, this is borne out by our everyday observations of the world: solid, crystalline ice, when heated, becomes more chaotic but still generally predictable water, which when further heated becomes less predictable gas, which eventually dissociates into its even less predictable component atoms, and at higher energies still, ionizes into plasma.

However, work with lasers, and the theoretical behavior of same, gave us a thermal system with an "upper bound" on the amount of energy it can contain, and moreover, that limit is fairly easy to reach. This lets us observe a system that actually becomes *less* chaotic as *more* energy is added to it: the more energy in the system, the closer it gets to its upper limit, the fewer particles remain below the highest state, and the more accurately the energy state of any arbitrary particle can be predicted.
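That "more energy, more predictable" claim can be made concrete with a small sketch. The two-level setup and the numbers here are my own assumptions, not details of any actual laser system:

```python
import math

# Hedged sketch: per-particle (Shannon) entropy of a two-level system in
# which a fraction p of the particles occupies the top (highest-energy)
# state. Adding energy raises p.
#   H(p) = -p ln p - (1 - p) ln(1 - p)
def per_particle_entropy(p):
    if p in (0.0, 1.0):
        return 0.0  # completely empty or completely full: perfectly ordered
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

# Up to half-filling, adding energy (raising p) adds disorder...
assert per_particle_entropy(0.25) < per_particle_entropy(0.5)
# ...but past half-filling, adding energy REMOVES disorder: the closer
# the system sits to its energy ceiling, the better your odds of
# guessing any arbitrary particle's state.
assert per_particle_entropy(0.9) < per_particle_entropy(0.5)
assert per_particle_entropy(0.99) < per_particle_entropy(0.9)
```

Entropy peaks at half-filling (p = 0.5) and falls back to zero as the system approaches its energy ceiling, which is exactly the entropy-down-as-energy-up regime described above.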

On the other side of the spectrum, recent news has reported that scientists have produced the opposite: a system in which entropy *increases* when energy is *removed*. Working with an ultracold quantum gas held in an optical lattice (a grid of laser light), researchers reportedly engineered a system whose particle energies are bounded above, then abruptly flipped the confining fields so that most of the particles ended up stacked near the upper energy bound rather than the lower one. In that inverted state, the relationship between energy and entropy runs backwards: adding energy crowds even more particles against the ceiling, making the system *more* ordered, while removing energy lets it relax back toward disorder. At that point, by the rigorous definition, we have reached "negative absolute temperature".

Thus, temperature seems to exhibit a "wraparound": in a bounded system, as energy increases toward the upper limit, entropy eventually begins to decrease, and the temperature runs up through positive infinity and re-enters from negative infinity. A negative-temperature system is, counterintuitively, *hotter* than any positive-temperature one; bring the two into contact and heat flows from the negative-temperature system into the positive one (no laws of thermodynamics are broken, and you never get back more energy than was ever introduced to the system, so the First Law still holds). Because an unbounded system would only reach that threshold at infinite energy, we'll never see it in most of our everyday thermal systems, but we can see it in a bounded one. The picture is cleaner still if we look at the reciprocal of temperature, the thermodynamic beta (sometimes called "coldness"). This fraction, by placing the vanishing entropy delta in the numerator rather than the denominator, is perfectly continuous for all real values of the domain, including zero; the dramatic jump in temperature is an artifact of taking a reciprocal.
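A quick numerical sketch shows the contrast; the toy two-level system here (N particles, n excited, entropy as the log of the binomial coefficient) is my own illustration, not taken from any of the experiments above:

```python
import math

# Toy bounded system (an assumption for illustration): N two-level
# particles, n excited, energy E = n, entropy S = ln C(N, n).
# The thermodynamic beta = dS/dE passes smoothly through zero at the
# entropy peak, while T = 1/beta blows up there.
N = 100

def beta(n):
    # Centered finite difference for dS/dE (dE = 2).
    return (math.log(math.comb(N, n + 1)) - math.log(math.comb(N, n - 1))) / 2.0

for n in (45, 49, 50, 51, 55):
    b = beta(n)
    T = math.inf if b == 0 else 1.0 / b
    print(n, b, T)  # beta glides through zero; T jumps between huge +/- values
```

Beta decreases gently and continuously as energy rises through the entropy peak at half-filling, while the temperature computed from it swings from enormous positive values to enormous negative ones across that same point.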