
Why mathematics is more human than we think. A plea for the ethical use of formulas and models.
One plus one is two. In a world full of “fake news” and subjective opinions, mathematics is often seen as the last rock in the storm: objective, neutral, incorruptible. But this appearance is deceptive.
In a joint lecture at the University of Teacher Education Vorarlberg, I, together with my colleague Univ.-Prof. Gregor Nickel, a philosopher of mathematics from Siegen, took a look behind the scenes of apparent mathematical truth.
The conclusion: Mathematics is a powerful tool, but it is never free from human values and human errors.
When bridges miss each other
Sometimes mathematics fails at the level of simple craftsmanship. An almost legendary example of miscalculation is the High Rhine Bridge (Hochrheinbrücke) between Germany and Switzerland. As is well known, the two neighboring countries define sea level differently: Germany refers to the North Sea, Switzerland to the Mediterranean. The difference is 27 centimeters, and the engineers knew it. But a classic sign error crept in: the value was added instead of subtracted. The result: the two halves of the bridge met with a height difference of 54 centimeters.
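The arithmetic of the mishap fits in a few lines. A minimal sketch, assuming for illustration that the correct datum conversion subtracts the 27 cm offset while the faulty one added it; the deck height and variable names are hypothetical:

```python
OFFSET_CM = 27.0  # documented gap between the German and Swiss height datums

# Hypothetical deck height of the German half, in its own datum (cm):
german_half = 1000.0

# Suppose the correct conversion subtracts the offset...
swiss_half_correct = german_half - OFFSET_CM
# ...but the sign error added it instead:
swiss_half_wrong = german_half + OFFSET_CM

# The two halves then miss each other by exactly twice the offset:
print(swiss_half_wrong - swiss_half_correct)  # 54.0
```

Whatever the true sign convention on site, the structure of the error is the same: flipping a sign does not merely miss the target by the offset, it misses by double.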

Such mistakes are expensive and embarrassing, but usually ethically unproblematic as long as “only” money and concrete are affected. It becomes more dramatic when blind faith in technology meets human lives. The Swedish warship Vasa sank in 1628 shortly after setting sail because two teams of carpenters used different units of measurement—Swedish and Amsterdam feet. The ship was simply asymmetrical.
The model is not reality
The real ethical stakes, however, usually lie not in calculation errors but in what mathematicians call “modeling.” Before we can calculate, we must simplify the world. We must decide: What is important? What do we leave out?

Let’s take our world maps. Most of us have the image of the Earth in our heads that Gerardus Mercator designed in the 16th century. His map was brilliant for navigation because it is conformal (preserves angles)—anyone who draws a course on the map will arrive at their destination. The price for this: The areas are massively distorted. Greenland appears almost as large as Africa on the map, although Africa is actually 14 times larger.
Is that a lie? No, it is a decision for navigation and against equal-area representation. It only becomes problematic when we forget that every map, every statistic, and every algorithm is based on such decisions.
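The distortion can even be quantified. Mercator's projection stretches both the east-west and north-south directions by a factor of sec(latitude), so areas are inflated by its square. A minimal sketch (the function name is mine, not standard terminology):

```python
import math

def mercator_area_inflation(lat_deg: float) -> float:
    """Factor by which the Mercator projection inflates areas at a latitude.

    The local scale factor is sec(latitude) in both directions,
    so areas grow by sec(latitude) squared.
    """
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

print(round(mercator_area_inflation(0), 2))   # 1.0  (true to size at the equator)
print(round(mercator_area_inflation(70), 2))  # 8.55 (Greenland's latitudes)
```

Since Greenland lies far north while Africa straddles the equator, an inflation factor of this size is enough to make a landmass one-fourteenth the area look comparable on the map.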
When probability becomes a verdict
How dangerous unreflective mathematics can be is shown by the tragic case of Sally Clark. The British woman lost two babies to sudden infant death syndrome in the late 1990s. An expert witness calculated in court that the probability of this happening twice in one family was vanishingly small: 1 in 73 million. The court's conclusion: it had to be murder, and Clark was convicted.
The mistake? The expert treated the deaths as independent events, like rolling a die twice. But in biology there are genetic and environmental factors that can raise the risk for a second child as well. Moreover, the wrong question was asked: not how unlikely two natural deaths are, but how likely a double murder by the mother is by comparison. Mathematically speaking, murder was even more unlikely. Sally Clark was later acquitted, but she never recovered from the injustice and died a few years later of acute alcohol poisoning. Here, flawed mathematics without ethical intuition became a weapon.
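Both errors can be made concrete in a short sketch. The 1-in-8,543 single-death figure is the one reportedly squared in court to reach 1 in 73 million; the dependence factor and the murder-to-SIDS ratio below are purely illustrative assumptions, not case data:

```python
# Flawed court reasoning: treat the two deaths as independent events.
p_single_sids = 1 / 8543                 # single-death figure reportedly used in court
p_double_naive = p_single_sids ** 2      # squared -> about 1 in 73 million

# Error 1: dependence. If shared genetic or environmental factors raise the
# risk for a second child (the factor 10 is purely illustrative), the naive
# figure is too small by exactly that factor.
risk_multiplier = 10
p_double_sids = p_single_sids * (risk_multiplier * p_single_sids)

# Error 2: the wrong question. Among families that do suffer two infant
# deaths, compare double SIDS against double murder. Assume, again purely
# for illustration, that a double murder is 9 times rarer still:
p_double_murder = p_double_sids / 9

p_sids_given_two_deaths = p_double_sids / (p_double_sids + p_double_murder)
print(round(1 / p_double_naive))          # 72982849 ("1 in 73 million")
print(round(p_sids_given_two_deaths, 2))  # 0.9
```

Under these illustrative numbers, two natural deaths are the far more probable explanation: the headline figure answers the wrong question entirely, a pattern known as the prosecutor's fallacy.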
The compulsion for flawlessness
What do we learn from this for our digitalized society, in which algorithms decide on creditworthiness, job opportunities, and university rankings?
First: We must say goodbye to the illusion of absolute freedom from error. The sociologist Ulrich Beck warned as early as 1986 against creating technologies that force humans to be infallible—because we are not.
Second: Every model involves value judgments. When we rank universities, we decide whether citations are more important to us than student satisfaction. When we train an AI, we decide which data we feed it.
Mathematics is not an island of pure reason, decoupled from our values. It is a cultural asset, much like language. And just as we must learn to speak fairly and thoughtfully, we must learn to calculate responsibly.
Numbers may not lie, but they always tell only the story we let them tell.
The German original of this article appeared in Thema Vorarlberg.
