One of my friends was asking me about tertiary level mathematics as opposed to high school mathematics, and naturally the topic of rigour came up.

To provide him with a brief glimpse as to the difference, I said the following.

In high school, you were taught that the area of a rectangle is $ab$ where $a$ is the breadth and $b$ is the height. You can physically see this by constructing an $a \times b$ grid and counting the squares it forms, provided $a$ and $b$ are integers.

He agreed and said that it was "obvious" that the area of a rectangle was $ab$. I then responded with:

What is the area of a rectangle with dimensions $\pi$ by $\sqrt 2$?

He immediately said $\pi \sqrt 2$, and then I responded with one of the most common questions in mathematics:

How do you know that for sure?

I said that it intuitively works for integer values of $a$ and $b$, but how do we KNOW for sure that it works for irrational values of $a$ and $b$? I then used that as a gateway to explain that in tertiary-level mathematics we don't assume such things. There is no "it is clearly true for these easy-to-understand integers, so therefore it is true for all real values"; everything must be proven.

He then asked me something that I had no answer to:

I get that we cannot assume these kinds of things, but has there ever been an occasion where an assumption or a lack of rigour has killed someone before?

I am sure that there may exist an example floating somewhere in history, but I cannot think of any.

Do you know of one?

EDIT: Cheers to starsplusplus

A lot of really great responses! However, the majority of them don't quite fit the definition of 'rigour' in the mathematical sense, which is vastly different from the common English term. See this. Many of the answers provided so far describe accidents/deaths caused by a lack of what I feel is more like procedural rigour, as opposed to mathematical rigour.


It seems further clarification is needed regarding what I'm looking for in a response. I am looking for an example where an individual (or individuals) did something mathematically incorrect (not a trivial computational error, though) with a consequence that led to the death of one or more people. So, what do I mean by something mathematically incorrect that isn't a trivial computational error?

Example of something I'm looking for:

Say somebody is programming, or working out the mathematics behind, a missile firing mechanism. In part of their computations, they did one of the following, which yielded an incorrect value. This incorrect value caused the missile to fly out of control and caused the death of one or more people.

  • Exchanged a summation with an integral unjustifiably
  • Needed a sequence of pairs of numbers that always yielded relatively prime values. A computer search found no counterexamples, so the programmer assumed the formula always yields relatively prime integers. However, the first counterexample lies at $n=99999999999999999999999999$, beyond reasonable computational time.
  • The limit of a series was to be used at some point in the computations. To calculate it, the person rearranged terms however they liked and then found a limit. But the series didn't converge absolutely, so the rearrangement could have produced any value.
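The last bullet is easy to demonstrate concretely. Here is a short sketch (my own illustration, not tied to any real incident) of Riemann's rearrangement phenomenon: the alternating harmonic series converges to $\ln 2$, but reordering its terms as "two positives, then one negative" makes the same terms converge to $\tfrac{3}{2}\ln 2$.

```python
import math

def alt_harmonic(n):
    """Partial sum of 1 - 1/2 + 1/3 - ... (n terms); converges to ln 2."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

def rearranged(blocks):
    """Same terms in a different order: two positives, then one negative.

    1 + 1/3 - 1/2 + 1/5 + 1/7 - 1/4 + ...  converges to (3/2) ln 2.
    """
    total = 0.0
    pos, neg = 1, 2  # next odd and even denominators to use
    for _ in range(blocks):
        total += 1.0 / pos + 1.0 / (pos + 2) - 1.0 / neg
        pos += 4
        neg += 2
    return total

print(round(alt_harmonic(100000), 4))  # ~0.6931 = ln 2
print(round(rearranged(100000), 4))    # ~1.0397 = (3/2) ln 2
```

Both sums use exactly the same terms; only the order differs, which is precisely why a rearrangement is only legitimate for absolutely convergent series.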
  • Wouldn't any fatal mistake due to failure to stick to a procedure count as "death by lack of rigour"? – Raskolnikov Apr 05 '15 at 10:14
  • From a literal standpoint, yes! But I am referring to rigour specifically in the context of Mathematics. I've edited the title to clear things up. – Trogdor Apr 05 '15 at 10:15
  • While not about fatal mistakes, things like the Monty Hall problem or "after 5 reds in roulette, is black more likely next?" can be good examples to use to explain the importance of mathematical rigor. – Kimball Apr 05 '15 at 16:56
  • Should this question be on [HSM.SE]? – gerrit Apr 05 '15 at 18:33
  • Looking at David Richerby's claims that none of these are mathematical rigor issues leads to a philosophical question: if someone assumes a mathematically rigorous proof exists when one does not, and acts on that assumption and dies, do we blame the lack of mathematical rigor, or the assumption? – Cort Ammon Apr 06 '15 at 00:17
  • One thing to keep in mind is that before rigour, nothing at all worked! So there are a lot of people who wouldn't exist right now if it weren't for mathematical rigour, but it is hard to directly pinpoint which part. – PyRulez Apr 06 '15 at 00:21
  • @PyRulez: Your comment sounds absurd on the face of it, which makes me curious to know what you really meant. You *can't*, surely, have meant that math was completely non-functional before the development of rigorous analytical methods, largely in the last few centuries? – Nathan Tuggy Apr 06 '15 at 02:51
  • I think your example is very bad. If you want to make the point that intuition is fallible, you should give an example where intuition is actually wrong (there are plenty), and then show that it is wrong (by at least giving a counter example), instead of framing rigor as some weird quirk that you personally demand. – Superbest Apr 06 '15 at 04:16
  • I can't find a source, but my electronics engineering teacher told us this story: a circuit board controlling a roller coaster was not adequately bypassed (the power rail did not have enough capacitors to smooth out ripples), which ultimately resulted in the death of a child when the "on deck" carts entered the loading zone prematurely and rammed the carts which had people getting out of them. The kid was knocked over and then run over. Bypassing the power circuit on a board is a known problem and can easily be solved with mathematical rigour. – benathon Apr 06 '15 at 08:11
  • @portforwardpodcast that would be a lack of engineering rigour, not mathematical. – smci Apr 06 '15 at 11:40
  • Hmm, it seems like most if not all of the examples given below are "someone didn't check things carefully enough", which is an example of rigor in the conventional sense, not the mathematical sense. OF COURSE there have been many times when mistakes in calculations have caused deaths. But I think a "right" answer to this question would be something more on the order of: "Someone assumed that a formula applied to real numbers just because it applied to integers, and this turned out to be wrong because in fact ..." – Jay Apr 06 '15 at 19:58
  • @Jay What about my example? The Pythagoreans clearly did not get it right. To use your phrasing: "The Pythagoreans assumed magnitudes were commensurable, but this turned out to be wrong because, in fact, ..." – Daniel W. Farlow Apr 06 '15 at 23:37
  • @NathanTuggy Okay, I guess that is a bit of an exaggeration. Progress is definitely faster now. – PyRulez Apr 06 '15 at 23:57
  • You're not quite right that we have to prove the area formula for a rectangle. Area is something that's tricky to define properly, but we often start by *defining* the area of a rectangle to be the product of its side lengths, at least for certain rectangles - this is then used as a basis to define area more generally. You have to start somewhere. – Jair Taylor Apr 07 '15 at 00:45
  • It's going to be very hard to find examples because in most cases where someone's life rides on math the system will be tested unmanned first. The error will become apparent before anyone is in danger. However, Alan's answer about armoring airplanes I think qualifies. – Loren Pechtel Apr 07 '15 at 04:25
  • Maybe mathematical rigor can sometimes even be more harmful than the lack of it. You all know the joke: A biologist, a physicist, and a mathematician travel by train and see a black sheep. The biologist says "Oh, apparently the sheep in this country are all black". The physicist: "No, we can only say that at least *one* sheep in this country is black". The mathematician slowly shakes her head and says "At least one sheep is black on at least one side". - Such rigor may in fact be too much for many applications – Hagen von Eitzen Apr 07 '15 at 08:39
  • Since almost every answer has a "That's not mathematical rigour, that's [something else]" comment on it, I think you should edit your question to give a rigorous definition of rigour. Or maybe [link to this question](http://math.stackexchange.com/questions/170221/what-exactly-is-mathematical-rigor). – starsplusplus Apr 07 '15 at 11:16
  • I'm stuck on the first part; the second doesn't interest me. How could you assume a product of irrational dimensions can't be applied in geometry? I'm not really understanding your assumption. Of course, if we treat $\pi$ as the decimal number 3.14, it is an erroneous calculation indeed; but if we leave it abstract, without concretising it on paper, it remains workable!! – Abdou Abdou Apr 07 '15 at 11:50
  • @starsplusplus I agree--that's why I just voted to close as "unclear what you're asking." I thought my answer sufficiently answered the question, but everyone else who answered feels the same way of course. There needs to be some clarification. – Daniel W. Farlow Apr 07 '15 at 13:20
  • We're not assuming that the product of irrationals can't be applied in geometry. It can! What I was saying though, to my friend, is that it's not completely obvious that the formula for the area of a rectangle is breadth x height when the dimensions are no longer integers. For integer dimensions, it is completely obvious but for irrational dimensions (which tend to be naughty sometimes), can we still trust our intuition? – Trogdor Apr 07 '15 at 13:20
  • I've clarified it in my main post. Hopefully that should provide people with a better picture of a 'nail on the head' answer. The closest so far would be joe johnson's answer, which unfortunately remains hidden far at the bottom of all the answers. – Trogdor Apr 07 '15 at 13:22
  • @Trogdor The fact that you think Joe Johnson's answer is closest to what you want simply illustrates how unclear your question is. I think many people would argue that his answer is not best, but you are the OP. I'd try to tighten it up a bit. – Daniel W. Farlow Apr 07 '15 at 13:29
  • @Trogdor I don't know what you are trying to prove, but consider an $a \times b$ rectangle of irrational dimensions: if you straighten the perimeter of a $radius=1$ circle into a single segment, then take the trigonometric tangent of an interior angle, $\tan(\theta=\pi/3)=\sqrt{3}$, as a second dimension at a 90° angle to the first segment, you definitely have an irrational area value of $\sqrt{3}\pi$. If you convert it to a rounded decimal floating value of 5.44 on paper, that will have bad consequences, because the further we keep working with that approximate value, the bigger the error gap gets as the calculation advances. – Abdou Abdou Apr 07 '15 at 13:44
  • @magicman Well, I think you're at least thinking in the right direction. As I understood the story, Hippasus was murdered for letting the secret out. Which I guess technically fits the question but is something of a special case. It wasn't a natural result of a math error, it was a human sanction. And actually a sanction for getting it right, not for getting it wrong. – Jay Apr 07 '15 at 13:45
  • @Jay Your comment just now *further* shows how unclear the question is as it currently stands. "Has lack of mathematical rigour killed anybody before?"--directly? Indirectly? One person? Etc. There are *so* many ways to view this question, but I think it's fair to say my answer is a little unique in that there's a twist compared to how most other answers have been provided so far. Like you said, and as I point out in my answer, it was really the Pythagoreans who dropped the ball. Basic point: question needs to be made airtight; otherwise, non-legitimate answers will continue to be submitted. – Daniel W. Farlow Apr 07 '15 at 13:49
  • I think this qualifies: http://en.wikipedia.org/wiki/HMS_Captain_%281869%29 – Loren Pechtel Apr 17 '15 at 02:01
  • Ah, a so-called rigour mortis. – Nonoffensive name Jun 14 '15 at 00:06
  • Look up the [Sleipner A Oil Platform](http://www.ima.umn.edu/~arnold//disasters/sleipner.html) – Michael Burr Nov 12 '15 at 21:29
  • This reminded me of an anecdote my professor told me (not sure if true). The story goes that while reusing old components to build a new spaceship, the engineers tested an O-ring (or was it a pipe, no idea) for having the correct shape (circular) by measuring its thickness in all orientations. Since they measured a constant width, they concluded it must be circular. Their mistake: there are many more [curves of constant width](https://en.wikipedia.org/wiki/Curve_of_constant_width). This misclassification caused welded joints to tear apart during launch. – M. Winter Jan 22 '18 at 10:55

15 Answers


In particular, lack of knowledge of Bayes' theorem, and intuitive use of probability, lead to misdiagnosed patients all the time. In fact, some studies suggest that as many as 85%(!) of medical professionals get this type of question wrong.

A famous example is the following. Given that:

  • 1% of women have breast cancer.
  • 80% of mammograms detect breast cancer when it is there.
  • 10% of mammograms detect breast cancer when it’s not there.

Now say that a woman gets a positive mammogram result. What are the chances she actually has cancer?

Ask your friends (including medical students) what their intuition regarding the answer is, and I'm willing to bet most will say around 80%. The mathematical reasoning people give for this answer is simple: since the test is right 80% of the time, and the test was positive, the patient has an 80% chance of being sick. Sound correct?

Would you be surprised to learn that the actual probability is only about 7.5%?

This perhaps surprising result is a consequence of Bayes' theorem: The overall probability of the event (breast cancer in all women) has a crucial role in determining the conditional probability (breast cancer given a mammography).

I hope it's obvious why such a misdiagnosis can be fatal, especially if treatment increases the risk of other forms of cancer, or in the reverse scenario, where patients are not given care after a negative test result.
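For the curious, the arithmetic behind that figure is a direct application of Bayes' theorem (a sketch, using exactly the numbers from the example above):

```python
# The numbers from the example above.
p_cancer = 0.01             # prevalence: 1% of women have breast cancer
p_pos_given_cancer = 0.80   # sensitivity: test detects cancer when present
p_pos_given_healthy = 0.10  # false-positive rate

# Bayes: P(cancer | positive) = P(positive | cancer) * P(cancer) / P(positive)
p_positive = (p_pos_given_cancer * p_cancer
              + p_pos_given_healthy * (1 - p_cancer))
posterior = p_pos_given_cancer * p_cancer / p_positive
print(round(posterior, 3))  # 0.075 -- about 7.5%, nowhere near 80%
```

The low prevalence (1%) means healthy false positives vastly outnumber true positives, which is exactly the point the intuitive "80%" answer misses.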

  • Do the people who made the study understand Bayes' theorem? :-) – Asaf Karagila Apr 05 '15 at 17:07
  • @Asaf Karaglia: there is an obvious experiment you could try. I think there is reasonable evidence that they do :-) – Rob Arthan Apr 05 '15 at 17:57
  • This is not about "mathematical rigour" but about wrong thinking. – Christian Blatter Apr 05 '15 at 18:03
  • @ChristianBlatter: I thought mathematical rigour was, by definition, lack of wrong thinking. – jwodder Apr 05 '15 at 18:23
  • This is known as the [false positive paradox](http://en.wikipedia.org/wiki/False_positive_paradox) – BlueRaja - Danny Pflughoeft Apr 05 '15 at 18:35
  • There's a good link about this on BBC www.bbc.com/news/magazine-28166019. Another example is Sally Clark's case, where a faulty understanding of independent events by a famous paediatrician led to her conviction for the murder of her two sons. She spent over 3 years in jail until she won her case. – user3371583 Apr 05 '15 at 20:38
  • Of course, although it is wrong, it probably isn't bad that they overestimate it, to keep practitioners from going "well, you only have a *10%* chance of cancer, so...". It is the reverse case that is of the more fatal variety. – PyRulez Apr 06 '15 at 00:08
  • This is very interesting can you please elaborate a little bit more. – Neil Apr 06 '15 at 03:32
  • @Neil - added a link in the spoiler section explaining the math. – nbubis Apr 06 '15 at 06:51
  • Thanks nbubis :) edit: it isn't showing :( – Neil Apr 06 '15 at 07:19
  • Does it work like this? Out of 1,000 women, 10 will have breast cancer and 990 won't. So out of those 990, 99 will be false positives, and out of the 10 that do have cancer, 8 will show up as having it and 2 will be missed. So the chance of getting it right is 8/107, converted to a percent. So about 7% of women with breast cancer will actually be detected from the mammogram test. – Neil Apr 06 '15 at 07:26
  • It is interesting in these problems about false positives to consider the limit $P(B) \to 0$. The Bayes rule exactly says that $P(B|A) = \frac{P(A|B)}{P(A|B) P(B) + P(A|B^c) P(B^c)} P(B)$. As $P(B) \to 0$ the fraction tends to $\frac{P(A|B)}{P(A|B^c)}$. So the posterior probability is approximately $\frac{P(A|B)}{P(A|B^c)}$ times the prior probability. This estimate works when $\frac{P(A|B) P(B)}{P(A|B^c) P(B^c)} \ll 1$, as can be justified using the geometric series. I think this is a nice quick-and-dirty heuristic for the problems where our intuition leads us astray. – Ian Apr 06 '15 at 12:04
  • @Ian - Exactly. Or in other words, the chances of contracting a rare disease trump those of the doctor not being wrong. – nbubis Apr 06 '15 at 12:44
  • @nbubis Speaking of _computational_ rigor, the answer appears to be $P(C \mid T) = 0.0747$. – Alecos Papadopoulos Apr 06 '15 at 18:46
  • I had to re-read your statements of the statistics when you said the right answer was not 80%. Yes, there's a big difference between "80% of patients who have cancer show positive on this test" and "80% of patients who show positive on this test have cancer". Even to someone such as myself who is, shall we say, reasonably mathematically literate, you have to read the statements carefully. I can readily see that someone trying to make a point -- whether for some ideological purpose or whatever -- might well throw such a statement past me and if I'm not reading carefully I could miss it. ... – Jay Apr 06 '15 at 19:48
  • ... I'm sure there are plenty of people in the world with less mathematical background who would miss the distinction even if they tried to study the text. Of such things is ignorance or even propaganda made. – Jay Apr 06 '15 at 19:49
  • I have no quibble with your math here but I don't see the dead person the OP is looking for. – Loren Pechtel Apr 07 '15 at 04:09
  • @Neil *So about 7% of women with breast cancer will actually be detected from the mammogram test.* **No.** 80% of women with breast cancer will be detected from the mammogram test. 7% of women *who get a positive result on the test* will actually have breast cancer (about 93% will be false positives). – starsplusplus Apr 07 '15 at 11:19
  • This sounds like the Monty Hall paradox, where some mathematicians tried, in vain, to correct Marilyn vos Savant. She was right, of course; still, it's odd to see people who do this stuff for a living get schooled by simple yet unintuitive logic. – Zach466920 Apr 28 '15 at 16:08

Another example is the mistaken assumption made by the designers of the Enigma machine that it was good for a (character-by-character) encryption algorithm never to encrypt any character as itself, whereas in fact this turned out to be an exploitable weakness. Whether that saved lives or cost lives depends on your point of view, but it is clearly a good advertisement for mathematical rigour.
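To illustrate how that property was exploited (a toy sketch with made-up strings, not real wartime traffic): a guessed plaintext fragment (a "crib") can only align with the ciphertext at positions where no crib letter coincides with the ciphertext letter above it, since Enigma never maps a letter to itself. That lets a cryptanalyst discard most candidate alignments immediately.

```python
def possible_positions(ciphertext, crib):
    """Alignments of a crib (guessed plaintext) not ruled out by the
    'no letter ever encrypts to itself' property of Enigma."""
    hits = []
    for i in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[i:i + len(crib)]
        # Any position where a crib letter equals the ciphertext letter
        # is impossible, so the whole alignment can be rejected.
        if all(c != p for c, p in zip(window, crib)):
            hits.append(i)
    return hits

# Toy example: the crib "AB" cannot sit at position 0, where 'A' would
# have to encrypt to itself.
print(possible_positions("ABCDE", "AB"))  # [1, 2, 3]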

Rob Arthan
  • I don't see what this has to do with mathematical rigour. The fact that a letter couldn't encode as itself was a by-product of the design, not a design goal. It follows because any setting of the rotors implements a permutation of the alphabet (say, $\pi$) and the reflector implements another permutation (say, $\rho$) in which every letter has order two. This implements the permutation $f(x)=\pi^{-1}(\rho(\pi(x)))$ and $f(x)\neq x$ for all $x$ because $\rho(x)\neq x$ for all $x$. – David Richerby Apr 05 '15 at 23:24
  • A rigorous mathematical analysis of the design would have concluded that excluding the possibility that $f(x) = x$ was a weakness. – Rob Arthan Apr 05 '15 at 23:53
  • No. A *thorough* mathematical analysis of the design would have concluded that. In ordinary English, "rigorous" and "thorough" mean the same thing; in mathematics, they do not. A rigorous mathematical analysis is one in which every deduction is fully justified. – David Richerby Apr 06 '15 at 00:13
  • So what does "thorough" mean in mathematics? – Rob Arthan Apr 06 '15 at 00:21
  • A thorough analysis is one that considers all the possibilities: you're suggesting that they didn't analyze the possibility that letters never enciphering to themselves could be a weakness. Actually, as I recall, the German military was perfectly well aware that Enigma had weaknesses such as this, but they didn't consider them important. Their attitude was along the lines of, "Enigma would take 100 years to break. So what if there are weaknesses that mean it'll only take 90 years? The enemy doesn't have 90 years." – David Richerby Apr 06 '15 at 00:30
  • Interest only: the Poles had worked on Enigma for ≈10 years and had a postgraduate group dedicated to it, which was pivotal to the later successes, and then you add the Turing factor as a multiplier :-) - so "tens of years" was probably not horrendously bad as an estimate of the required cracking effort. – Russell McMahon Apr 07 '15 at 07:21
  • I'm curious how it was a weakness. There are 2.7 times fewer derangements than permutations, so the search space for a key is a little bit smaller. But searching through derangements may be somewhat harder computationally, and could also be more complex with a physical machine. Derangements are not closed under composition, so there is slightly less algebraic structure available. – zyx Dec 15 '15 at 20:51
  • It let them detect blind alleys in the search much more quickly. See https://en.wikipedia.org/wiki/Cryptanalysis_of_the_Enigma for more information and extensive references. – Rob Arthan Dec 15 '15 at 21:15
  • @DavidRicherby: I see that comments of mine have been removed and that you have 30 upvotes for your attempts to defend a cryptographic system that failed - very specifically because the argument that "the enemy doesn't have 90 years" grossly underestimated the ability of the enemy to exploit weaknesses in the system. I despair for anyone who is beguiled by your bizarre attempt to distinguish between "rigour" and "thoroughness". – Rob Arthan Nov 27 '16 at 00:01

The tricky thing about mathematical rigour is that it's particularly hard to blame it.

Mathematics is very abstract. It has to be applied, often several times in succession, before its products become concrete enough to kill someone. Thus there are often several layers to diffuse the blame. There's a famous phrase I hold near my heart, "All models are wrong; some are useful." Generally speaking, we try to make sure we aren't reliant on our models for life and death.

However, the models can lead people to overlook requirements, making a product that is too cheap. Consider the Tacoma Narrows Bridge. The original plan called for 25 ft deep trusses below the bridge for stability. It was very expensive. Leon Moisseiff and his associates petitioned Washington State to build it for less using their design with 8 ft deep girders instead of the expensive trusses. His price was 3/4 that of the larger plan, and the price tag alone was enough to make the decision to go with Moisseiff's plan.

To defend his cheap and thin 8 ft girders, he referenced the latest and greatest elastic models to show that the design could deal with the wind load. The mathematical analysis showed that it could withstand a static wind load, which was sufficient for the area.

Unfortunately for mathematical rigor, "static wind load" was not a good model of what such a thin bridge actually faced. The wind loads were actually dynamic, and it was the resulting oscillations that eventually sent the bridge to the bottom of the Narrows.

There were technically no human deaths (one Cocker Spaniel perished after being left in the car; it bit at the hand of the gentleman who tried to rescue it), but I think that gets close enough that it might meet your friend's requirements for a response. After all, there was nothing in the design that prevented the deaths. People just managed to scramble off the bridge before it went down.
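As a toy illustration of that modelling gap (illustrative parameters only, not a model of the actual bridge): the same damped oscillator that settles quietly under a static load swings enormously when the same force amplitude is applied periodically at its natural frequency.

```python
import math

def peak_response(force, steps=200_000, dt=0.001, k=1.0, c=0.02):
    """Largest |displacement| of a unit-mass damped oscillator,
    integrated with semi-implicit Euler:  a = F(t) - k*x - c*v."""
    x = v = 0.0
    peak = 0.0
    for i in range(steps):
        a = force(i * dt) - k * x - c * v
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

static_peak = peak_response(lambda t: 1.0)            # constant "wind"
resonant_peak = peak_response(lambda t: math.cos(t))  # same amplitude,
# but oscillating at the natural frequency sqrt(k) = 1
```

A static analysis sees only the first case; the second, with an identical force magnitude, drives a response an order of magnitude larger.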

Cort Ammon
  • Tacoma Narrows is in Washington, not New York. Did you mean Washington State? – cpast Apr 05 '15 at 23:09
  • That's not a lack of mathematical rigour: that's using the wrong model. – David Richerby Apr 05 '15 at 23:28
  • I would argue that the two are tightly intertwined. After all, nobody ever bled out directly because of an irrational square root. They bleed out because someone assumes the world works one way, and it actually works another. I would argue that, in this case, the issue was a mathematically non-rigorous claim that the transients in the system would damp out, leaving only the static portion. This seems like a reasonably valid transform from the OP's position on mathematical rigor regarding irrational numbers acting the same as integers did. – Cort Ammon Apr 06 '15 at 00:14
  • @cpast: WHOOPS! (fixed, thank you for catching that slip) – Cort Ammon Apr 06 '15 at 00:14
  • @CortAmmon Your comment and your answer make different claims. Your answer says that they chose the wrong model (static, rather than dynamic load); your comment says that they did some non-rigorous mathematics to show that the dynamic contribution was negligible. – David Richerby Apr 06 '15 at 00:49
  • @DavidRicherby I'm arguing they are tightly intertwined. This may be a place where background leads people to different wordings and different answers. I'm an engineer, and in my world, choosing the wrong model is nearly 100% caused due to an assumption that "case a" behaves mathematically similar to "case b." We then call it "mathematical rigor" when we take the time to mathematically prove our assumption was valid (this could be thought of along the lines of transforms in category theory). Do you have a different definition which leads to "choice of model" being unrelated to rigor? – Cort Ammon Apr 06 '15 at 01:05
  • In mathematics, rigour means that every step of the (mathematical) argument is fully justified. It has nothing to do with things that are external to mathematics, such as claiming that those results mean anything in the real world. For example, one could rigorously prove that $x+x=2x$. One could then claim that $x$ is "the weight of a person" and come to the nonsense conclusion that you and I together weigh exactly twice as much as either one of us. But that conclusion has nothing to do with rigour because it is external to mathematics: we performed a rigorous analysis of the wrong model. – David Richerby Apr 06 '15 at 09:50
  • I heard the Tacoma Narrows story differently, although it was told to prove a point, so may have been misrepresented: the problem was that unlike other engineers who relied on experience and rule of thumb, this engineer used a mathematical model which allowed him to build the bridge much thinner than ever before. This caused the bridge itself to act as an airfoil (essentially a wing) and generate lift, which is something no bridge had ever done before and thus was not included in the model. So, one might twist the story, this was caused not by *lack* but by *application* of mathematical rigor! – Jörg W Mittag Apr 07 '15 at 08:20

A classic example of bad assumptions leading to really bad results, including quite possibly unnecessary deaths, is the concept of "survivor's bias", in which statistics are skewed by only looking at the survivors of some process. In World War II, a statistician identified this as a hole in the army's research methods into how to armor their planes.

Now, how to "prove" that in other cases, when it wasn't noticed, this actually led to fatalities... that might be harder, but it's pretty clear that this phenomenon was around LONG before it was identified to be watched out for.
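A quick simulation (with made-up sections and hit probabilities, purely for illustration) shows how the bias arises: if hits are uniform over the airframe but only planes hit in non-critical places make it home, the returning fleet's damage map points at exactly the wrong places to armour.

```python
import random

random.seed(1)  # deterministic toy run
sections = ["engine", "cockpit", "fuselage", "wings"]
critical = {"engine", "cockpit"}  # a hit here downs the plane

returned_hits = {s: 0 for s in sections}
for _ in range(10_000):
    hit = random.choice(sections)  # damage is actually uniform...
    if hit not in critical:        # ...but only survivable hits come home
        returned_hits[hit] += 1

# The returning fleet shows damage only on fuselage and wings -- naively,
# the places that "need" armour; in fact they are the places that don't.
print(returned_hits)
```

The engines and cockpits show zero recorded damage not because they were never hit, but because those planes never returned, which is precisely the inference Wald corrected.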

Edit: A bad assumption (in science, not purely math), but still a statistical assumption, also led to lots of proven deaths: the use of radiation to shrink the thymus of healthy babies to prevent Sudden Infant Death Syndrome, because it was thought that a large thymus was causing breathing problems. In fact, researchers had bad data on the size of a healthy thymus due to sampling bias: for many years, only poor people had been used for autopsies by medical students and researchers, and stress and poor diet shrink the thymus.

The extra radiation caused a lot of thyroid cancer, which definitely killed a lot of people. Here's a link: http://www.orderofthegooddeath.com/poverty-the-thymus-and-sudden-infant-death-syndrome#.VSENw_nF8d8

  • In fact, statistics in general is probably a good place to look for examples of this sort of thing. – Jack M Apr 05 '15 at 16:40
  • In this vein, perhaps one could also recall the [Sally Clark case](http://en.wikipedia.org/wiki/Prosecutor%27s_fallacy#The_Sally_Clark_case) - but, again, I guess that, instead of lack of mathematical rigour this would be rather a case of plain lack of math (or statistics). – leonbloy Apr 05 '15 at 17:15
  • Relevant: https://www.youtube.com/watch?v=w-CK8VxMz9g (yay vsauce) – bjb568 Apr 05 '15 at 19:51
  • This doesn't seem to have a lot to do with mathematical rigour. Rigour in mathematics means that every step of the argument is fully justified; you're talking about situations in which people used the wrong data and got garbage results. To use the example in the question, one can give a fully justified (i.e., rigorous) argument that the area of a rectangle is its width times its height. But if you get the wrong area because you measured something other than the width and the height, that's not a failure of mathematical rigour: it's just using the wrong data. – David Richerby Apr 06 '15 at 00:53
  • The WW II era statistician was Abraham Wald. Prior to his 1943 work, people assumed that armor should be added to sections of returning airplanes showing the greatest damage. Wald showed, with math, that the places that need reinforcement are those portions of the returning airplanes that exhibited the least amount of damage. – David Hammen Apr 06 '15 at 04:00
  • An error in the sign of a square root gives a different solution, especially when dealing with differential equations. A sign error in control engineering can send a plane or rocket to another destination. – Narasimham Apr 06 '15 at 09:39
  • @DavidHammen you made my day with that trivia... so unobvious on first thought, so obvious on the second, all in all brilliantly genius! –  Apr 07 '15 at 02:15
  • Not in maths though, can be translated with AND NOT ; Instruction from jailor to hangman: " Leave him, not hang him!" with comma misplaced " Leave him not, hang him! " – Narasimham Apr 22 '15 at 15:01

Perhaps you might think this answer won't be appropriate for the mathematics Stack Exchange site, but rather for Stack Overflow or Programmers. But really, the deaths in this instance may have been influenced by a lack of rigour in programming of a mathematical system, so I thought it might apply.

In 1991, during the Gulf War, 28 soldiers were killed by an Iraqi Scud missile at an army barracks in Dhahran, Saudi Arabia. A Patriot missile system was programmed incorrectly, producing a floating-point truncation error in the system's internal clock that amounted to a time error of approximately 0.34 seconds at the time of the incident.

Combined with the Scud velocity of ~1676 m/s, the Patriot missile system radar incorrectly placed the missile over half a kilometer from its true position, which was incidentally outside of its "range gate". From what I have gathered, the Patriot system essentially looked in the wrong place of the sky because of this and failed to shoot the missile down.

It may not necessarily be 100% true that this error caused the deaths. Who knows if the Patriot missile would have hit the Scud even if it weren't for the error. Thus, this may not really answer your question perfectly, but it certainly highlights the importance of rigour when applying math to real systems.


From the GAO report:

On February 11, 1991, the Patriot Project Office received Israeli data identifying a 20 percent shift in the Patriot system’s radar range gate after the system had been running for 8 consecutive hours. (Figure 4 depicts the location of a Scud within the range gate after the Patriot has been in operation for over 8 hours.) This shift is significant because it meant that the target (in this case, the Scud) was no longer in the center of the range gate. The target needs to be in the center of the range gate to ensure the highest probability of tracking the target. As previously mentioned, the range gate is calculated by an algorithm that determines if the detected target is a Scud, and if the Scud is in the Patriot’s firing range. If these conditions are met, the Patriot fires its missiles. … Patriot Project Office officials said that the Patriot system will not track a Scud when there is a range gate shift of 50 percent or more. Because the shift is directly proportional to time, extrapolating the Israeli data (which indicated a 20 percent shift after 8 hours) determined that the range gate would shift 50 percent after about 20 hours of continuous use. Specifically, after about 20 hours, the inaccurate time calculation becomes sufficiently large to cause the radar to look in the wrong place for the target. Consequently, the system fails to track and intercept the Scud.

[Figure 4 from the GAO report: a Scud shown within the shifted range gate after the Patriot has been in operation for over 8 hours]




Full GAO Report


    That's not my understanding of the failure. Rather, the computer tried to plot an intercept course against the missile and it was plotting the inbound and the Patriot at different times, causing the tracks to not converge. I do not think this case should be counted, though, as the Patriot had an abysmal record against Scuds--had it fired it almost certainly would have failed anyway. – Loren Pechtel Apr 07 '15 at 04:12
  • I am not entirely sure, as I have only read the small excerpts I have found. But your comment prompted me to search a bit more and I have linked the full GAO report in my answer. One of the opening pages, under "Results in Brief", states...*At the time of the incident, the battery had been operating continuously for over 100 hours. By then, the inaccuracy was serious enough to cause the system to look in the wrong place for the incoming Scud.* I am not really sure if there is a difference between what you are saying and what I'm saying. – krb686 Apr 07 '15 at 05:33
    In other words, perhaps "looking in the wrong place of the sky" really means plotting an intercept course at the wrong place of the sky. – krb686 Apr 07 '15 at 05:37
  • Oh you are highlighting the patriot battery plotting the courses at different times as opposed to thinking the missile was elsewhere. Everything I have read/seen seems to indicate that the battery thought the missile was in the wrong place. See the quote/picture I have added to my answer. – krb686 Apr 07 '15 at 05:46

Look no further than the case of the THERAC-25.

Sloppy programming and unverified assumptions about software components in a machine for radiation therapy almost single-handedly killed a number of patients.

It's a common example in software engineering as an answer to the question "why do we need to bother with rigorous practices?", as well as in discussions of formal methods and program correctness verification; when CS freshmen start groaning, they are often redirected to that example.
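One of the documented flaws (described in Leveson and Turner's investigation) can be sketched in a few lines: a one-byte shared variable used by a safety check was incremented rather than assigned, so it overflowed to zero on every 256th pass, and zero meant "checks passed":

```python
# Sketch of the Therac-25 "Class3" counter flaw: an 8-bit flag incremented
# instead of assigned, so it wraps to 0 (meaning "no fault") every 256th pass.
class3 = 0
skipped_checks = 0
for _ in range(1000):
    class3 = (class3 + 1) % 256    # 8-bit increment instead of class3 = 1
    if class3 == 0:                # overflow reads as "checks passed"
        skipped_checks += 1
print(skipped_checks)              # 3 -- the safety interlock is silently skipped
```

A proof-obligation as simple as "this flag is only ever 0 or 1" would have caught it.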

Tobia Tesan
    A more rigorous treatment (pun intended) is in *[An Investigation of the Therac-25 Accidents](http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_2.html)* (published in *IEEE Computer*). It is long (four long pages), but interesting, and it should be required reading for anyone involved in [functional safety](https://en.wikipedia.org/wiki/Functional_safety). – Peter Mortensen Apr 06 '15 at 10:13
    Thanks, Peter, great link. Feel free to incorporate that into my answer. – Tobia Tesan Apr 06 '15 at 12:12
    While this was a case of sloppy programming it was not a math error. The Therac-25 fired at the intensity it was commanded to--what turned it into a death ray is when it fired its electron beam at the intensity level needed for x-ray therapy but it didn't have the target in place to convert the electron beam to x-rays. – Loren Pechtel Apr 07 '15 at 04:15
    @LorenPechtel - granted, it isn't the best example (others have posted much better ones), but arguably, sloppy programming and absence of formal program verification *is* by all means a lack of mathematical rigour, since proving the correctness of a program is done with the instruments of mathematical logic. Granted, the problem was more the *lack* of it. Questionable engineering practices was another, complementary problem, of course. – Tobia Tesan Apr 07 '15 at 05:11

TL;DR: A Pythagorean by the name of Hippasus allegedly perished at sea because he disclosed the secret of irrational magnitudes to outsiders, a realization that invalidated the Pythagorean general theory of similar figures.

See The Scandal of the Irrational, a resource made freely available by MIT Press, for more info.

Hippasus of Metapontum and Irrational Magnitudes

How lack of mathematical rigor killed him: As legend has it, lack of mathematical rigor by the Pythagoreans as a whole is what ultimately killed Hippasus of Metapontum. His demise had to do with his disclosing the "secret" of incommensurable line segments (i.e., line segments having no common unit of measure). From Howard Eves' An Introduction to the History of Mathematics:

The discovery of the irrationality of $\sqrt{2}$ (in this context, a geometrical proof of the irrationality of $\sqrt{2}$ is obtained by showing that a side and diagonal of a square are incommensurable) caused some consternation in the Pythagorean ranks. Not only did it appear to upset the basic assumption that everything depends on the whole numbers, but because the Pythagorean definition of proportion assumed any two like magnitudes to be commensurable, all the propositions in the Pythagorean theory of proportion had to be limited to commensurable magnitudes, and their general theory of similar figures became invalid. So great was the "logical scandal" that efforts were made for a while to keep the matter a secret. One legend has it that the Pythagorean Hippasus (or perhaps some other) perished at sea for his impiety in disclosing the secret to outsiders, or (according to another version) was banished from the Pythagorean community and a tomb was erected for him as though he was dead.

Why Discovery of Irrational Magnitudes was so Disturbing

Most of what follows is adapted from pp. 82-84 of the aforementioned book by Howard Eves:

The integers are abstractions arising from the process of counting finite collections of objects. The needs of daily life require us, in addition to counting individual objects, to measure various quantities, such as length, weight, and time. To satisfy these simple measuring needs, fractions are required, for seldom will a length, as an example, appear to contain an exact integral number of linear units. Thus, if we define a rational number as the quotient of two integers $p/q, q\neq 0$, this system of rational numbers, since it contains all the integers and fractions, is sufficient for practical measuring purposes.

The rational numbers have a simple geometric interpretation. Mark two distinct points $O$ and $I$ on a horizontal straight line ($I$ to the right of $O$) and choose the segment $OI$ as a unit of length:

[figure: a horizontal number line with points $O$ and $I$ representing $0$ and $1$]

If we let $O$ and $I$ represent the numbers $0$ and $1$, respectively, then the positive and negative integers can be represented by a set of points on the line spaced at unit intervals apart, the positive integers being represented to the right of $O$ and the negative integers to the left of $O$. The fractions with denominator $q$ may then be represented by the points that divide each of the unit intervals into $q$ equal parts. Then, for each rational number, there is a point on the line.

To the early mathematicians, it seemed evident that all the points on the line would in this way be used up. It must have been something of a shock to learn that there are points on the line not corresponding to any rational number. This discovery was one of the greatest achievements of the Pythagoreans. In particular, the Pythagoreans showed that there is no rational number corresponding to the point $P$ on the line where the distance $OP$ is equal to the diagonal of a square having a unit side (see the above figure). Their discovery marks one of the great milestones in the history of mathematics.

To prove that the length of the diagonal of a square of unit side cannot be represented by a rational number, it suffices to show that $\sqrt{2}$ is irrational. Many algebraic proofs of this fact exist, and a geometric one is provided in the article linked to at the beginning of this post.
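For completeness, here is the classic parity argument (my addition, not from Eves' text, but entirely standard): suppose $\sqrt{2} = p/q$ with $p, q$ integers in lowest terms. Then

$$p^2 = 2q^2,$$

so $p^2$ is even, hence $p$ is even; write $p = 2k$. Then $4k^2 = 2q^2$, so $q^2 = 2k^2$ and $q$ is even as well, contradicting the assumption that $p/q$ was in lowest terms. Hence $\sqrt{2}$ is irrational.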

The discovery of the existence of irrational numbers was surprising and disturbing to the Pythagoreans. First of all, it seemed to deal a mortal blow to the Pythagorean philosophy that all depends upon the whole numbers. Next, it seemed contrary to common sense, for it was felt intuitively that any magnitude could be expressed by some rational number. The geometrical counterpart was equally startling, for who could doubt that for any two given line segments one is able to find some third line segment, perhaps very very small, that can be marked off a whole number of times into each of the two given segments? But take as the two segments a side $s$ and a diagonal $d$ of a square. Now if there exists a third segment $t$ that can be marked off a whole number of times into $s$ and $d$, we would have $s=bt$ and $d=at$, where $a$ and $b$ are positive integers. But $d=s\sqrt{2}$, whence $at=bt\sqrt{2}$; that is, $a=b\sqrt{2}$, or $\sqrt{2}=a/b$, a rational number. Contrary to intuition, then, there exist incommensurable line segments.


Now maybe you can see why your example with your friend actually glosses over something rather important: how would you even construct a rectangle with sides of measured lengths $\pi$ and $\sqrt{2}$?

I had said that it intuitively works for integer values of $a$ and $b$, but how do we KNOW for sure that it works for irrational $a$ and $b$?

The early Pythagoreans would have said that it does not work. Fortunately for us, we know that it does work, and this is perhaps due to Hippasus' disclosure of the existence of irrational magnitudes. Unfortunately for him, he gave a different meaning to "under the sea"!
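As a sketch of how modern rigour settles the question (my addition, not part of the quoted material): for any rationals $p/q \le \pi$ and $r/s \le \sqrt{2}$, the $\pi \times \sqrt{2}$ rectangle contains a $p/q \times r/s$ rectangle, so its area $A$ satisfies $A \ge \frac{pr}{qs}$; bounding from above the same way gives

$$\sup_{\frac{p}{q}\le\pi,\ \frac{r}{s}\le\sqrt{2}}\frac{pr}{qs} \;=\; \pi\sqrt{2} \;=\; \inf_{\frac{p}{q}\ge\pi,\ \frac{r}{s}\ge\sqrt{2}}\frac{pr}{qs},$$

so the only value consistent with monotonicity of area is $A = \pi\sqrt{2}$.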

Daniel W. Farlow
    Did you intend this as an example that _too much_ mathematical rigour (maybe) killed somebody? – Marc van Leeuwen Apr 06 '15 at 14:25
    @MarcvanLeeuwen I meant it as an example where someone was (maybe) killed (i.e., Hippasus) because of the Pythagoreans' initial lack of rigour (in regards to their theory of proportions). Oddly enough, as I presume is your point, their actual rigor resulted in them eventually realizing much of their mathematical work was based on a faulty assumption. Nonetheless, their initial lack of rigor resulted in Hippasus' demise because of his disclosure. I don't know if that clears it up, but it's a pretty fascinating story! – Daniel W. Farlow Apr 06 '15 at 14:33

A lack of mathematical rigor has cost thousands of lives.

A lack of mathematical rigor is largely the cause of the belief that trading credit default options was safer than it really was (widely held as the cause of the financial crisis of 2008). Regardless of whether or not the investors were intentionally misled, the consequence was that many people were financially ruined, costing thousands of lives. Moreover, those who lost their retirement as a result may have had their lifespans shortened unnecessarily due to having more limited retirement care options.
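To illustrate the kind of assumption at issue (a generic sketch, not the actual 2008 models): a risk model that treats returns as normally distributed assigns essentially zero probability to extreme moves that markets demonstrably produce.

```python
# Tail probabilities under a normal model. A "10-sigma" daily loss is
# assigned odds of roughly 1 in 10^23 -- i.e., "cannot happen" -- yet
# moves of that nominal size have been observed in real markets.
from math import erfc, sqrt

def normal_tail(z: float) -> float:
    """P(Z <= -z) for a standard normal variable (erfc avoids underflow)."""
    return 0.5 * erfc(z / sqrt(2))

for z in (5, 10):
    print(z, normal_tail(z))
```

The rigour failure is not in the model's internal derivation, which is sound, but in applying it without checking whether its distributional assumptions hold.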

    This is an example of applying models in situations where they shouldn't have been applied, and of mis-using statistics badly (along the lines of the earlier case of LTCM's chief forex guy (assistant professor at Stanford before) repeatedly doubling up by a stubborn belief into mean reversion of a time series that didn't reflect their current situation). The underlying models (assumptions given; conclusions derived) were fully rigorous. – gnometorule Apr 07 '15 at 15:46
    @gnometorule: Using induction to prove a statement without considering the base case is a lack of mathematical rigor. This was a case of applying the [Black-Scholes equation](http://en.wikipedia.org/wiki/Black–Scholes_equation) without checking if its underlying assumptions were valid. – jxh Apr 07 '15 at 18:41
    @gnometorule: The point is, while induction is fully rigorous, applying it correctly also requires mathematical rigor. The same is true of applying any law or theorem. – jxh Apr 07 '15 at 19:21
  • Perhaps you meant credit default swap? https://en.wikipedia.org/wiki/Credit_default_swap. – Matt Samuel Sep 13 '16 at 00:25

Consider the case of Sally Clark who was convicted of killing her sons after both died in infancy. The pathologist used poor mathematical reasoning to 'prove' that it was extremely unlikely to be accidental. For example, he calculated a chance of 1 in 73 million that it was accidental because he assumed that the events were necessarily independent if it was not murder. This reasoning was accepted by judges and juries at the original trial and the first appeal before she was finally released on a second appeal.
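The numerical error is easy to reproduce. The 1-in-8543 single-death figure is the one presented at trial; the tenfold "dependence factor" below is purely hypothetical, for illustration:

```python
# Reconstructing the "1 in 73 million" trial figure.
p_one = 1 / 8543                  # claimed chance of one SIDS death in such a family
p_independent = p_one ** 2        # squaring assumes the two deaths are independent
print(f"1 in {round(1 / p_independent):,}")   # ~1 in 73 million

# If shared genetic or environmental factors made a second death, say,
# 10x more likely (a hypothetical factor), the number collapses:
p_dependent = p_one * (10 * p_one)
print(f"1 in {round(1 / p_dependent):,}")     # ~1 in 7.3 million
```

Squaring a probability is only valid under independence, which is precisely what was never established.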

Sally Clark died some time later but it is believed that the stress of the situation was a major contributory factor so lack of mathematical rigour was at least a factor in her death.

    Misuse of statistics in court cases. There is an interesting article by Mary Gray that includes many of these... "Statistics and the Law". *Math. Mag.* 56 (1983), no. 2, 67–81. – GEdgar Apr 07 '15 at 14:26

The Gimli Glider aviation incident involved a Boeing 767 plane that ran out of fuel at an altitude of 41,000 feet (12,000 m) over Canada. Investigations revealed that fuel loading was miscalculated due to a misunderstanding of the recently adopted metric system, which had replaced the imperial system.

None of the 61 passengers on board were seriously hurt, but this was thanks to one of the pilots being an experienced glider pilot, familiar with flying techniques almost never used by commercial pilots, which enabled him to land the plane without power.

The investigation revealed that:

Instead of 22,300 kg of fuel, they had 22,300 pounds on board — 10,100 kg, about half the amount required to reach their destination.
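The unit mix-up itself is a one-line calculation (the conversion factor is standard; the fuel figure is from the quote above):

```python
LB_TO_KG = 0.453592                    # kilograms per pound
fuel_required_kg = 22_300              # what the flight plan called for
fuel_loaded_kg = 22_300 * LB_TO_KG     # what was actually loaded: 22,300 *pounds*
print(round(fuel_loaded_kg), "kg")     # ~10,115 kg -- roughly half the required fuel
```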

Dan Dascalescu
  • A classic example in aviation is de Havilland Comet's square windows, where a lack of understanding of the regularity of solutions of pde's on the boundary led to a few crashes. – user3371583 Apr 05 '15 at 20:49
    This has nothing whatsoever to do with mathematical rigour. – David Richerby Apr 05 '15 at 23:26
    @DavidRicherby: are we even clear about what mathematical rigor means? When I have taught, I have made very clear to all of my students that units and dimensions are of the utmost importance to physical problems. – Ron Gordon Apr 07 '15 at 14:24
    @RonGordon Mathematical rigour means using arguments that are fully justified. There is no "argument" here: it's just a simple error whose closest connection to mathematics is that a number is involved. – David Richerby Apr 07 '15 at 19:23
  • The Havilland Comet example honestly sounds to me like the sort of thing you had in mind. Learning to take these sorts of things into account is precisely the kind of training one gets in a first real analysis course. – jdc Dec 15 '15 at 19:54

The most plain way to answer this is: Yes.

I'm not going to waste time giving examples: others here have done this quite adequately. Instead, I'm going to give you some examples of how mathematical rigour saves lives - thus leaving the ways it does not save lives self-evident.

The first step is to examine your assumptions. Are you (or your friend) assuming that 'mathematical rigor' affects only the numbers on the paper? This is a fallacious assumption, after all. Mathematics is used in so many different areas of our life, and we never even realize it.

When engineers design automobiles, or planes, or anything of the sort, not only is the design done through mathematics, but the testing boundaries for safety concerns are also determined through mathematics. If you've ever ridden in a car, plane, train, motorcycle, or even a normal pedal cycle, or if you've ridden on an elevator or roller coaster, you've bet your life on the mathematical rigour having been done in full.

Then there's medical testing. If you've ever been in a hospital or taken headache medicine, you're betting your life on the mathematical rigour for the safety testing on that medicine and equipment having been done in full.

Have you eaten processed food? The safety margins for production and distribution, and for food testing, are based on math. The methods used for preserving food are based on math. Even as such, we still get food recalls when things go wrong - but if you ever eat food prepared and distributed en masse, you're depending on the mathematical rigour having been done right.

Then, on a broader scale, there's the concept behind this. Mathematical rigour hides behind a LOT of assumptions we make in day-to-day life, or in politics. We assume, based on past experience, pattern recognition, and lack of knowledge, that something 'just makes sense' and should work. Ignorance, it has been found, breeds confidence. That is to say, the people who know least about a subject are the most confident that they know all about it, because they don't know enough to know the dangers.

But how often has someone 'run the numbers'? If you want to replace a light switch, do you do it yourself, or do you call an electrician? Someone could go by their gut, or they could run the numbers. What are the statistics? What kind of electrical work is this, and what are the numbers on the flow of electricity through that line? What is the probability that there's a vital step in the process you don't know? Doing the mathematical rigour on a decision like that is very much like getting proper electrician training. Or chalking up a big 'unknown' field and deciding to call an electrician, because the unknown value of the unknown mathematically represents too much of a risk.

But if you choose the other way and get yourself electrocuted - you've basically failed to do the math. Death by assumption.

Wolfman Joe
    I can see how the downvoters dislike this answer because it does not contain a specific example, but you *have* to love "*death by assumption*" :) – Tobia Tesan Apr 06 '15 at 16:49
  • Not saving lives isn't the same as killing. #logicalrigor – Dan Dascalescu Dec 15 '15 at 21:03
    @DanDascalescu - If someone dies because of this source, they have died. Everyone dies of being killed by something - whether it is by a person's action, killed by a chance mishap, killed by illness, killed by old age, or whatever. I would like to know your premise upon what kind of death does NOT involve being killed by something. – Wolfman Joe Jan 19 '16 at 16:16
  • upvoted to compensate for at least one downvote – timur Mar 05 '19 at 19:28

[…] has there ever been an occasion where an assumption or a lack of rigour has killed someone before?

As most of the existing answers seem to have interpreted this question differently than I did, I first want to specify my understanding:

I do not think that anybody would doubt that people have been killed by mathematical mistakes, lack of mathematical knowledge, lack of scientific, procedural or engineering rigour in the past (see most of the existing answers). And of course, many of these mistakes would have been prevented by mathematical rigour, but they would have also been prevented by mathematical or scientific knowledge, doing more experiments or tests and so on.

Thus I assume that you are asking for cases where scientific, engineering, procedural rigour and similar were present, but mathematical rigour wasn’t, and this then killed somebody. This would include cases where somebody tested a false mathematical statement with a number of experiments appropriate for an application on which lives depend.

Now, there are false mathematical statements that are rather robust to experiment (see, e.g., here), but all of those are very far from actual application. Moreover, even the mathematics that actually is close to application is usually wrapped into some layers of science and engineering that act as a failsafe with respect to this application. Also, the fewer the cases for which a statement does not hold, the less likely such a case is to occur in application. Thus I consider it very unlikely that there is a case in which lack of mathematical rigour directly killed somebody.

That being said, mathematical rigour is not without value. The alternative to mathematical rigour is empiricism¹, and empiricism can at times be quite tedious. Moreover, if a mathematician fails with respect to rigour in aspects that are covered by layers of application, this may cost more time (than it saves the mathematician) of people on the next layers, who in turn have less time for what they actually want to do. So eventually mathematical rigour saves the time of people close to application, who may use this time to save actual lives.

¹ Remember that all applications of mathematics are eventually based on the very well substantiated, but yet empirical fact that mathematical axioms apply to certain real-life observables, constructs and similar.


Interchanging two limits, including differentiating a Fourier series term by term, is a topic where rigour is important. Rigorous theorems describe when you are allowed to do this. Legend has it one of the rockets at Cape Canaveral in the early '60s crashed due to an electrical engineer's having interchanged two limits illegitimately. But no one was killed.

I think that the topic of Fourier series and integrals is probably the most practically relevant part of maths where rigour actually makes a difference, especially when noise or square waves are involved.
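As a concrete illustration (my example, not necessarily the one from the legend): the sawtooth $f(x) = x$ on $(-\pi, \pi)$ has the Fourier series

$$f(x) = \sum_{n=1}^{\infty} \frac{2(-1)^{n+1}}{n} \sin(nx),$$

which converges pointwise on the open interval. Differentiating term by term yields

$$\sum_{n=1}^{\infty} 2(-1)^{n+1} \cos(nx),$$

whose terms do not even tend to zero, so the series diverges for every $x$, even though $f'(x) = 1$ exists everywhere on the interval. The rigorous theorems (e.g., uniform convergence of the differentiated series) specify exactly when the interchange of limit and derivative is allowed.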

Although no one was killed, there was a great financial loss and a spectacular crash.

    This is one that could use more detail. Could you explain the story further? – Jay Apr 07 '15 at 13:50

"Challenger, go with throttle up".

NASA launch managers ignored the recommendations from Morton Thiokol engineers to delay the launch. There was undoubtedly an engineering assessment of those risks, with failure probabilities, along with physical evidence of risks from prior launches that indicated such a cold weather launch was a bad idea. With an illustrious scientific and engineering organization such as NASA turning its back on the risk assessment for all the wrong reasons, this seemed to me like an example where "an assumption or a lack of rigour has killed someone before". I'm sure there was plenty of math involved. Feynman's demonstration of an O-ring deforming at cold temperatures was pretty damning.

    More detail would help. IIRC the cause of the Challenger explosion was attributed to inadequate O-rings (as in, O-rings that were not rated for their purpose), but I don't know if there is a mathematical side to the story. – Tobia Tesan Apr 06 '15 at 12:11
    NASA launch managers ignored the recommendations from Morton Thiokol engineers to delay the launch. There was undoubtedly an engineering assessment, of those risks,with failure probabilities, along with physical evidence of risks from prior launches that indicated such a cold weather launch was a bad idea. With an illustrious scientific and engineering organization such as NASA turning its back on the risk assessment for all the wrong reasons, this seemed to me like an example where "an assumption or a lack of rigour has killed someone before". I'm sure there was plenty of math involved. – SWilliams Apr 06 '15 at 12:36
  • Yes! And his demonstration of an O-ring coming apart at cold temperatures was pretty damning. – SWilliams Apr 06 '15 at 13:04
  • You could incorporate that into the answer (and perhaps this would change the mind of the downvoter). – Tobia Tesan Apr 06 '15 at 13:48
  • @TobiaTesan - Whether or not an O-ring is adequate for the purpose is determined primarily through mathematics. Even physical testing of O-rings under controlled conditions ... the applicability of those controlled conditions is done through mathematics. – Wolfman Joe Apr 06 '15 at 15:59
    I don't see what this has to do with mathematics, let alone mathematical rigour. – David Richerby Apr 06 '15 at 17:55
  • Had they plotted the "no damage found" O rings with temperature on the same plot of "damage severity" vs. temperature, they would have found it immediately. Failure to understand nothing observed != no data. – Joshua Apr 06 '15 at 18:14
  • @DavidRicherby "I'm sure there was plenty of math involved." So what does this have to do with mathematics? The answer appears to be, "something, probably." – KSmarts Apr 06 '15 at 21:08
  • The O-ring demonstration Feynman did wasn't that it "came apart", but that when compressed while cold, it didn't "spring back" afterward, but stayed compressed -- meaning it wouldn't seal properly, causing blow-by, and ultimately leading to failure. – Glen_b Apr 07 '15 at 02:09
  • Thanks for the correction. I saw his demo a long time ago. I have edited my OP. – SWilliams Apr 07 '15 at 02:12
  • One discussion of what's arguably a failure of *statistical* rigor in relation to the Challenger disaster is [here](https://tamino.wordpress.com/2011/12/14/the-value-of-data/) - a lack of awareness of the problem of *nonignorable nonresponse*. – Glen_b Apr 07 '15 at 02:18
  • I don't see that this was a matter of mathematics. The engineers understood the situation--cold O-rings -> leaks -> danger. The launch decision was made by managers, not by engineers--no math involved. – Loren Pechtel Apr 07 '15 at 04:18
  • Actually this is an example of when POLITICS cost lives. The engineers told the brass it was not safe to fly that morning but they gave the order anyway. – Chris Apr 07 '15 at 14:20
  • This reminded me of an anecdote my professor told me (not sure if true, nor if related to your story). The story goes that while reusing old components for building a new spaceship, the engineers tested an O-ring (or was it a pipe, no idea) for having the correct shape (circular) by measuring its thickness for all orientations. Since they measured a constant width, they concluded it must be circular. Their mistake: there are many more [curves of constant width](https://en.wikipedia.org/wiki/Curve_of_constant_width). This misclassification caused welded joints to tear apart during the start. – M. Winter Jan 22 '18 at 10:54
I think I might be deviating from the original question, but let me offer my opinion: if maths were somehow to kill somebody, it may be because of excessive rigour, not a lack of it. Like someone who wastes more than half of his days and sleepless nights on scientific research without any result; that would distract him from serious matters in his life and set him far back in age. He would miss many things and wake up to a horrible, deadly truth, causing him to commit suicide or self-destruction. By the way, this question seems too silly for me to care about; I prefer the first one being debated.

OK, I was joking :D Time for serious answers.

Consider an isometry of the infinite Euclidean plane.

This graph shows how a product of two irrational quantities is abstractly feasible (though not perfectly calculable, since human computation is bounded to finite decimal values).

[figure: the region of width $\pi$ and height $\sqrt{3}$ shaded in blue on graph paper]

The value shaded in blue equals $\int_{0}^{\pi} \sqrt{3}\,dx = \pi\sqrt{3} \approx 5.44$.

The rounded value is not exact: if we draw a $3.14$ segment perpendicular to a $1.73$-long segment, we end up with a result near $5.43$. This difference of $0.01$ would gradually and progressively lead to a killing mistake.

This is an example of a deadly, inaccurately rounded error.

I will now show you the steps needed to construct an irrational product on your graph paper. Take a compass, a pen, and a ruler with measurements.

First, draw a circle of radius $r = 1\,\text{dm} = 10\,\text{cm}$, and draw the bisector of $\pi/2$ with the same compass...

[figure: circle of radius 10 cm with the bisector of $\pi/2$ drawn]

Take the same compass again and draw the bisector of the angle $\pi/4$; now you have an angle $\theta = \pi/8$.

[figure: the bisector of $\pi/4$, giving the angle $\theta = \pi/8$]

Project the point onto the $x$ axis, then with the same compass draw a half circle whose radius is the segment from the center to the projection of the angle $\theta$.

[figure: half circle of radius $\cos(\pi/8)$]

The length of the new radius is $\cos\theta = \cos(\pi/8)$.

$\cos(\pi/4) = \cos(2 \cdot \pi/8) = 2\cos^2(\pi/8) - 1$

$\cos(\pi/8) = \sqrt{\dfrac{\cos(\pi/4) + 1}{2}} = \dfrac{\sqrt{\sqrt{2}+2}}{2}$
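A quick numerical sanity check of this half-angle value, using only the standard library:

```python
# Verify the derived closed form for cos(pi/8) numerically.
from math import cos, pi, sqrt

derived = sqrt(sqrt(2) + 2) / 2          # the claimed closed form
print(abs(cos(pi / 8) - derived) < 1e-12)   # True
```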

The area of the blue shaded sector (central angle $\pi/8$, radius $r = \cos(\pi/8)$) is $\frac{1}{2}r^2 \cdot \frac{\pi}{8} = \frac{\sqrt{2}+2}{64}\pi$.

Now be happy :D you have just drawn an irrational product of the class $\pi\sqrt{\mathbb{N}}$ with your own bare hands!!! And it never equals the approximate decimal calculation result, which is nearby $44.7\,\text{mm}$.

Abdou Abdou
    In this example I shed light on an 11.4-centimetre range mistake that caused the death of 25 workers! Have I missed something, mr downvoter? – Abdou Abdou Apr 07 '15 at 15:24
    Your answer is poorly written and not much of an answer, hence the downvote. – Daniel W. Farlow Apr 07 '15 at 15:42
  • @MagicMan in your answer you did cite "there exist incommensurable line segments", which fits my answer, so what was downvoting and underestimating this post for? – Abdou Abdou Apr 07 '15 at 15:47
    Ibidem $\color{white}{\text{Your answer blows}}$ – Daniel W. Farlow Apr 07 '15 at 15:49
  • I didn't copy-paste from you! – Abdou Abdou Apr 07 '15 at 15:51
    I'm sorry - who died in this example? I'm puzzled. – Tobia Tesan Apr 16 '15 at 07:04
    Two years into construction of the bridge, at 11.50 am on 15 October 1970, the 112 m (367.5 ft) span between piers 10 and 11 collapsed and fell 50 m (164 ft) to the ground and water below. Thirty-five construction workers were killed. Many of those who perished were on .... http://en.wikipedia.org/w/index.php?title=West_Gate_Bridge – Abdou Abdou Apr 16 '15 at 09:39