What are the most mindblowing things in mathematics?
What concepts or facts do you know from math that is mind blowing, awesome, or simply fascinating?
Here are some I would like to share:
Gödel's incompleteness theorems: There are some problems in math so difficult that they can never be solved, no matter how much time you put into them.
Halting problem: It is impossible to write a program that can figure out whether an arbitrary input program loops forever or eventually finishes running. (Undecidability)
The Busy Beaver function
Now this is the mind blowing one. What is the largest non-infinite number you know? Graham's Number? TREE(3)? TREE(TREE(3))? This one will beat it easily.
The Busy Beaver function grows faster than any function you could ever compute. The values get so large, so fast, that no program can compute them in general, not even on an infinitely powerful PC.
In fact, just the mere act of being able to compute the value would mean solving the hardest problems in mathematics.
Σ(1) = 1
Σ(4) = 13
Σ(6) > 10^10^10^…^10 (a power tower of fifteen 10s stacked on each other)
Σ(17) > Graham's Number
Σ(27): knowing this value would settle the Goldbach conjecture, because a specific 27-state Turing machine halts if and only if the conjecture is false.
Σ(744): knowing this value would likewise settle the Riemann hypothesis.
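To make the definition concrete, here's a minimal Python sketch that simulates the known 2-state, 2-symbol champion machine; it halts after 6 steps leaving 4 ones on the tape, which is where Σ(2) = 4 comes from (the dictionary-as-tape representation is just my own convenience):

```python
# Transition table of the 2-state busy beaver champion:
# (state, symbol read) -> (symbol to write, head move, next state)
TABLE = {
    ('A', 0): (1, +1, 'B'), ('A', 1): (1, -1, 'B'),
    ('B', 0): (1, -1, 'A'), ('B', 1): (1, +1, 'HALT'),
}

tape, head, state, steps = {}, 0, 'A', 0   # blank tape, head at cell 0
while state != 'HALT':
    write, move, state = TABLE[(state, tape.get(head, 0))]
    tape[head] = write
    head += move
    steps += 1

print(steps, sum(tape.values()))           # 6 steps, 4 ones written
```

Σ(n) asks for the most ones any halting n-state machine can leave behind; the catch is that for larger n you can't tell, in general, which machines ever halt at all.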
For the uninitiated, the Monty Hall problem is a good one.
Start with 3 closed doors and an announcer who knows what's behind each. The announcer says that behind 2 of the doors is a goat, and behind the third door is a car, but doesn't tell you which door leads to which. They then let you pick a door, and you will get whatever is behind it. Before you open it, they open a different door than your choice and reveal a goat. Then the announcer says you are allowed to change your choice.
So should you switch?
The answer turns out to be yes. 2/3rds of the time you are better off switching. But even famous mathematicians didn't believe it at first.
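If the 2/3 still feels wrong, a quick Monte Carlo simulation makes it hard to argue with (a sketch in Python; the helper name `play` is my own):

```python
# Simulate the Monty Hall game: the host always opens a goat door you didn't pick,
# then you either stay with your first choice or switch to the remaining door.
import random

def play(switch: bool) -> bool:
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Host opens a door that is neither your pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
print(sum(play(True) for _ in range(trials)) / trials)    # ~0.667 if you switch
print(sum(play(False) for _ in range(trials)) / trials)   # ~0.333 if you stay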
Such a simple statement, right? Notice the word "conjecture". It has been verified for every number up to 4×10^18, BUT no one has been able to prove it mathematically to date! It's one of the best-known unsolved problems in mathematics.
You can take any map of anything and color it in using only four colors so that no adjacent “countries” are the same color. Often it can be done with three!
I came here to find some cool, mind-blowing facts about math and have instead confirmed that I'm not smart enough to have my mind blown. I am familiar with some of the words used by others in this thread, but not enough of them to understand, lol.
For me, personally, it's the divisible-by-three check. You know, the little shortcut you can do where you add up the individual digits of a number and if the resulting sum is divisible by three, then so is the original number.
That, to me, is black magic fuckery. Much like everything else in this thread I have no idea how it works, but unlike everything else in this thread it's actually a handy trick that I use semifrequently
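For anyone curious why the digit-sum trick works: 10 leaves remainder 1 when divided by 3, so every power of 10 does too, which means a number and its digit sum leave the same remainder mod 3. A tiny check on a few arbitrary example numbers:

```python
# A number and its digit sum leave the same remainder mod 3
# (because 10 ≡ 1 mod 3, so 10^k ≡ 1 mod 3 for every k).
for n in (123, 3731, 988_027):                 # arbitrary example numbers
    digit_sum = sum(int(d) for d in str(n))
    assert n % 3 == digit_sum % 3
    print(n, digit_sum, n % 3 == 0)
```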
The utility of Laplace transforms for differential systems.
In engineering school you learn to analyze passive DC circuits early on using not much more than Ohm's law and Thevenin's theorem. This shit can be taught to elementary schoolers.
Then a little while later, you learn how to set up differential equations to work complex systems, whether electrical, mechanical, thermal, hydraulic, etc. This shit is no walk in the park.
Then Laplace transforms/identities come along and let you turn linear differential equations in the time domain into much simpler algebraic problems in the frequency domain. Shit blows your mind.
THEN a mafacka comes along and teaches you that these tools can be used to turn complex differential system problems (electrical, mechanical, thermal, hydraulic, etc.) into simple DC-style circuits you can analyze/solve in the frequency domain, then convert back into the time domain for the answers.
I know this is super applied calculus shit, but I always love that sweet spot where all the high-concept math finally hits the pavement.
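For anyone who wants to poke at this, here's a minimal SymPy sketch (assuming SymPy is available; the RC-style example equation is my own) showing the round trip between the time domain and the s-domain:

```python
import sympy as sp

t, s = sp.symbols('t s', positive=True)
a = sp.symbols('a', positive=True)

# The transform turns calculus into algebra: L{e^(-a*t)} = 1/(s + a)
print(sp.laplace_transform(sp.exp(-a * t), t, s, noconds=True))

# Solve y' + a*y = 1 with y(0) = 0 entirely in the s-domain:
#   s*Y(s) + a*Y(s) = 1/s   =>   Y(s) = 1/(s*(s + a))
Y = 1 / (s * (s + a))
# Transform back: the familiar step response (1 - e^(-a*t))/a (times a unit step)
print(sp.inverse_laplace_transform(Y, s, t))
```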
Euler's identity, which elegantly unites some of the most fundamental constants in a single equation:
e^(iπ)+1=0
Euler's identity is often cited as an example of deep mathematical beauty. Three of the basic arithmetic operations occur exactly once each: addition, multiplication, and exponentiation. The identity also links five fundamental mathematical constants:
The number 0, the additive identity.
The number 1, the multiplicative identity.
The number π (π = 3.1415...), the fundamental circle constant.
The number e (e = 2.718...), also known as Euler's number, which occurs widely in mathematical analysis.
The number i, the imaginary unit of the complex numbers.
Furthermore, the equation is given in the form of an expression set equal to zero, which is common practice in several areas of mathematics.
Stanford University mathematics professor Keith Devlin has said, "like a Shakespearean sonnet that captures the very essence of love, or a painting that brings out the beauty of the human form that is far more than just skin deep, Euler's equation reaches down into the very depths of existence". And Paul Nahin, a professor emeritus at the University of New Hampshire, who has written a book dedicated to Euler's formula and its applications in Fourier analysis, describes Euler's identity as being "of exquisite beauty".
Mathematics writer Constance Reid has opined that Euler's identity is "the most famous formula in all mathematics". And Benjamin Peirce, a 19th-century American philosopher, mathematician, and professor at Harvard University, after proving Euler's identity during a lecture, stated that the identity "is absolutely paradoxical; we cannot understand it, and we don't know what it means, but we have proved it, and therefore we know it must be the truth".
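And for the satisfaction of seeing it with your own eyes, a one-line check with Python's built-in cmath (floating point leaves a residue of about 1.2e-16 instead of an exact zero):

```python
import cmath
print(cmath.exp(1j * cmath.pi) + 1)   # (0+1.2246467991473532e-16j), i.e. 0 up to rounding
```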
"The coastline paradox is the counterintuitive observation that the coastline of a landmass does not have a well-defined length. This results from the fractal curve–like properties of coastlines; i.e., the fact that a coastline typically has a fractal dimension."
This is my silly contribution: 70% of 30 is equal to 30% of 70. This works for any pair of numbers, since a% of b is just a×b/100, which doesn't care which number plays which role, and it can be really helpful when doing percentages in your head. 15% of 77 is equal to 77% of 15.
I am a huge fan of 3blue1brown and his videos are just amazing. My favorite is linear algebra. It was like an out of body experience. All of a sudden the world made so much more sense.
Imagine a soccer ball. The most traditional design consists of white hexagons and black pentagons. If you count them, you will find that there are 12 pentagons and 20 hexagons.
Now imagine you tried to cover the entire Earth in the same way, using similar-sized hexagons and pentagons (hopefully the rules are intuitive). How many pentagons would there be? Intuitively, you would think that the mix of shapes would look like it does on the soccer ball, so there would be a lot of hexagons and a lot of pentagons. But actually, alongside the many hexagons you would still have exactly 12 pentagons, not one less, not one more. This comes from Euler's polyhedron formula, V − E + F = 2; a quick derivation is sketched below.
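A short sketch of that counting argument, assuming (as on the soccer ball) that exactly three faces meet at every vertex, with P pentagons and H hexagons:

```latex
\[
F = P + H, \qquad E = \tfrac{5P + 6H}{2}, \qquad V = \tfrac{5P + 6H}{3},
\]
\[
V - E + F = 2
\;\Longrightarrow\;
\tfrac{5P + 6H}{3} - \tfrac{5P + 6H}{2} + P + H = 2
\;\Longrightarrow\;
P = 12 .
\]
```

The hexagon count cancels out entirely, which is why the number of hexagons is flexible while the number of pentagons is pinned at exactly 12.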
Borsuk-Ulam is a great one! In essence it says that flattening a sphere into a disk will always make some pair of antipodal points land on the same spot. This holds in arbitrary dimensions and leads to statements such as "there are two points on opposite sides of the equator with the same temperature". Similarly, there are two antipodal points on the Earth's surface that have both the same temperature and the same pressure.
Gödel's incompleteness theorem is actually even more subtle and mind-blowing than how you describe it. It states that in any consistent mathematical system powerful enough to express basic arithmetic, there are truths in that system that cannot be proven using just the rules of that system. It requires adding additional rules to the system to prove those truths. And when you do that, there are new truths that cannot be proven using the expanded rules of that system.
This is a common one, but the cardinality of infinite sets. Some infinities are larger than others.
The natural numbers are countably infinite, and any set that has a one-to-one mapping to the natural numbers is also countably infinite. So that means the set of all even natural numbers is the same size as the natural numbers, because we can map 0 → 0, 1 → 2, 2 → 4, 3 → 6, etc.
But that suggests we can also map a set that seems larger than the natural numbers to the natural numbers, such as the integers: 0 → 0, 1 → 1, 2 → –1, 3 → 2, 4 → –2, etc. In fact, we can even map pairs of integers to natural numbers, and because rational numbers can be represented in terms of pairs of numbers, their cardinality is that of the natural numbers. Even though the cardinality of the rationals is identical to that of the integers, the rationals are still dense, which means that between any two rational numbers we can find another one. The integers do not have this property.
But if we try to do this with real numbers, even a limited subset such as the real numbers between 0 and 1, it is impossible to perform this mapping. If you attempted to enumerate all of the real numbers between 0 and 1 as infinitely long decimals, you could always construct a number that was not present in the original enumeration by going through each element in order and appending a digit that did not match a decimal digit in the referenced element. This is Cantor's diagonal argument, which implies that the cardinality of the real numbers is strictly greater than that of the rationals.
The best part of this is that it is possible to construct a set that has the same cardinality as the real numbers but is not dense, such as the Cantor set.
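The "pairs map to naturals" step is easy to see in code: the classic Cantor pairing function walks the diagonals, hitting every pair exactly once (the function names here are my own):

```python
# Cantor pairing: each pair (a, b) of naturals gets a distinct natural number,
# and the inverse walks the diagonals back. Hence pairs are countable.
def cantor_pair(a: int, b: int) -> int:
    return (a + b) * (a + b + 1) // 2 + b

def cantor_unpair(n: int) -> tuple[int, int]:
    # Find the diagonal d with d(d+1)/2 <= n < (d+1)(d+2)/2
    d = int(((8 * n + 1) ** 0.5 - 1) // 2)
    b = n - d * (d + 1) // 2
    return d - b, b

# Every natural number decodes to exactly one pair:
for n in range(6):
    print(n, cantor_unpair(n))   # 0 (0,0), 1 (1,0), 2 (0,1), 3 (2,0), ...
```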
That you can have 5 apples, divide them zero times, and somehow end up with math shitting itself inside-out at you even though the apples are still just sitting there.
A simple one: Let's say you want to sum the numbers from 1 to 100. You could do the sum normally (1+2+3...) or you can rearrange the numbers into pairs: 1+100, 2+99, 3+98... until 50+51 (50 pairs). So you will have 50 pairs and all of them sum to 101 → 101×50 = 5050. There's a story that says this method was discovered by Gauss when he was still a child in elementary school, when the teacher asked the students to sum the numbers.
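A one-line sanity check of the pairing trick, which generalizes to 1 + 2 + ... + n = n(n+1)/2:

```python
n = 100
assert sum(range(1, n + 1)) == n * (n + 1) // 2 == 5050   # 50 pairs, each summing to 101
print("checks out")
```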
The fact that an equation like Euler's identity sits at the heart of maths - it feels almost like it was left there deliberately.
x^n + y^n = z^n has no solutions where n > 2 and x, y and z are all natural numbers. It's hard to believe that, knowing that it has an infinite number of solutions where n = 2.
Pierre de Fermat, after whom this theorem is named, famously claimed to have had a proof, leaving a remark in the margin of a book he owned to the effect of: "I have a proof of this theorem, but there is not enough space in this margin". It took mathematicians more than 350 years to actually find a proof (Andrew Wiles, 1994).
Here's a fun one - you know the concept of regular polyhedra/Platonic solids, right? 3D shapes where every edge, angle, and face is the same? How many of them are there?
Did you guess 48?
There are way more regular solids out there than the bog-standard set of DnD dice! Some of them are easy to understand, like the Kepler-Poinsot solids, which basically use a pentagram in various orientations for the face shape (hey, the rules don't say the edges can't intersect!). To uh... this thing. And more! This video is a fun breakdown (both mathematically and mentally) of all of them.
Unfortunately they only add like 4 new potential dice to your collection and all of them are very painful.
I find the logistic map to be fascinating. The logistic map is a simple mathematical equation that surprisingly appears everywhere in nature and social systems. It is a great representation of how complex behavior can emerge from a straightforward rule. Imagine a population of creatures with limited resources that reproduce and compete for those resources. The logistic map describes how the population size changes over time as a function of its current size, and it reveals fascinating patterns. When the population is small, it grows rapidly due to ample resources. However, as it approaches a critical point, the growth slows and competition intensifies, and depending on the growth rate the population either settles down, oscillates, or becomes chaotic. This concept echoes in various real-world scenarios, from describing the spread of epidemics to predicting traffic jams and even modeling economic behaviors. It has even been used to build pseudo-random number generators, since a computer can't actually generate truly random numbers and the map's chaotic regime looks statistically random. Veritasium did a good video on it: https://www.youtube.com/watch?v=ovJcsL7vyrk
I find it fascinating how it permeates nature in so many places. The period-doubling ratio it produces (the Feigenbaum constant, about 4.669) is a universal constant, but not one we can easily observe directly.
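A tiny sketch of the map itself, x_{n+1} = r·x_n·(1 − x_n): at low growth rates the population settles down, while past the chaotic threshold it never does (the specific r values below are just illustrative):

```python
# Iterate the logistic map and look at where the orbit ends up.
def logistic_orbit(r: float, x0: float = 0.2, n: int = 50) -> list[float]:
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

print(logistic_orbit(2.8)[-3:])   # converges toward ~0.643: a stable population
print(logistic_orbit(3.9)[-3:])   # bounces around with no pattern: the chaotic regime
```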
The Banach-Tarski theorem is up there. Basically, a solid ball can be split into a handful of (non-measurable) pieces and reassembled, using only rotations and translations, into two copies of the original ball. Duplication is mathematically sound! But physically impossible.
How Gauss was able to solve 1+2+3...+99+100 in the span of minutes. It really shows you can solve math problems by thinking in different ways and approaches.
To me, personally, it has to be Bézier curves. They're not one of those things that only real mathematicians can understand, and that's exactly why I'm fascinated by them. You don't need to understand the underlying equations to make use of them, since they make a lot of sense visually. The cherry on top is their real-world usefulness in computer graphics.
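The visual intuition is de Casteljau's algorithm: keep linearly interpolating between neighbouring control points until only one point is left. A small sketch with made-up control points:

```python
# Evaluate a Bezier curve at parameter t by repeated linear interpolation.
def de_casteljau(points: list[tuple[float, float]], t: float) -> tuple[float, float]:
    pts = list(points)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# A cubic Bezier with hypothetical control points; t sweeps 0..1 along the curve.
ctrl = [(0, 0), (0, 1), (1, 1), (1, 0)]
print(de_casteljau(ctrl, 0.5))   # the curve's midpoint, (0.5, 0.75) here
```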
The Julia and Mandelbrot sets always get me. That such a complex structure could arise from such simple rules. Here's a brilliant explanation I found years back: https://www.karlsims.com/julia.html
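The "simple rule" really is just a couple of lines: iterate z → z² + c and ask whether the orbit stays bounded. A rough membership test (the escape radius 2 and the iteration cap are the usual conventions):

```python
# c is (roughly) in the Mandelbrot set if the orbit of 0 under z -> z*z + c stays bounded.
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:          # once |z| > 2 the orbit escapes to infinity
            return False
    return True

print(in_mandelbrot(-1))        # True: the orbit cycles 0, -1, 0, -1, ...
print(in_mandelbrot(1))         # False: the orbit 0, 1, 2, 5, 26, ... blows up
```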
The monster group is fascinating and mind-boggling. It's the largest of the sporadic simple groups, with about 8×10^53 elements (808,017,424,794,512,875,886,459,904,961,710,757,005,754,368,000,000,000 exactly), and its smallest faithful matrix representation lives in 196,883 dimensions. It's all about symmetry groups.
The fact that complex numbers allow you to get a much more accurate approximation of the derivative than classical finite difference at almost no extra cost under suitable conditions while also suffering way less from roundoff errors when implemented in finite precision:
f'(x) ≈ Im(f(x + iε)) / ε (x and ε are real numbers, ε is tiny, and f is assumed to be an analytic extension of some real function)
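A minimal sketch of the idea, usually called the complex-step derivative; sin is used here only as a convenient analytic test function:

```python
import cmath, math

def complex_step(f, x: float, eps: float = 1e-30) -> float:
    # Derivative from the imaginary part: no subtraction, so no cancellation error.
    return f(complex(x, eps)).imag / eps

def central_diff(f, x: float, h: float = 1e-8) -> float:
    # Classical finite difference: the subtraction limits the achievable accuracy.
    return (f(x + h) - f(x - h)) / (2 * h)

print(complex_step(cmath.sin, 1.0))  # ~0.5403023058681398 = cos(1), essentially machine precision
print(central_diff(math.sin, 1.0))   # also ~0.54030..., but a few digits are lost to roundoff
```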
Incompleteness is great.. internal consistency is incompatible with universality.. goes hand in hand with Relativity.. they both are trying to lift us toward higher dimensional understanding..
Saving this thread! I love math, even if I'm not great at it.
Something I learned recently is that there are as many real numbers between 0 and 1 as there are between 0 and 2, because you can always match a number x between 0 and 1 with the number 2x between 0 and 2. Someone please correct me if I mixed this up somehow.
Integrals. I can have an area function, integrate it, and then have a volume.
And if you look at it from the Riemann sum angle, you are pretty much adding up an infinite number of tiny volumes (the cross-sectional area × the width of each slice) to get the full volume.
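As a concrete sketch: slicing a sphere into thin disks, multiplying each cross-sectional area by its thickness, and adding them up recovers 4/3·π·r³ (the midpoint slicing and the slice count here are arbitrary choices):

```python
import math

# Approximate a sphere's volume by summing circular cross-sections A(x) = pi*(r^2 - x^2).
def sphere_volume_riemann(r: float, n: int = 100_000) -> float:
    dx = 2 * r / n
    total = 0.0
    for i in range(n):
        x = -r + (i + 0.5) * dx                    # midpoint of each slice
        total += math.pi * (r * r - x * x) * dx    # area * thickness
    return total

print(sphere_volume_riemann(1.0))   # ~4.18879...
print(4 / 3 * math.pi)              # the exact 4/3 * pi for comparison
```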
Szemerédi's regularity lemma is really cool. Basically, if you desire a certain structure in your graph, you just have to make it really really (really) big and then you're sure to find it. Or in other words, any sufficiently large graph can be split into a bounded number of parts so that the edges between almost every pair of parts look essentially random, up to any positive error percentage you choose.
Let's define a sequence. We will start with 1 and 1.
To get the next number, square the last, add 1, and divide by the second to last.
a(n+1) = (a(n)^2 + 1) / a(n-1)
So the fourth number is (2*2+1)/1 =5, while the next is (25+1)/2 = 13. The sequence is thus:
1, 1, 2, 5, 13, 34, ...
If you keep computing (the numbers get large) you'll see that every time we get an integer. But every step involves a division! Usually dividing things gives fractions.
This is known as a Somos-type sequence, and it shows up in fairly deep algebra.
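A quick check with exact rational arithmetic, so there's no floating-point fudging: every division really does come out to a whole number, at least as far as you care to run it:

```python
from fractions import Fraction

# Iterate a(n+1) = (a(n)^2 + 1) / a(n-1) with exact fractions and check integrality.
a, b = Fraction(1), Fraction(1)
terms = [a, b]
for _ in range(10):
    a, b = b, (b * b + 1) / a
    terms.append(b)

print([int(t) for t in terms])                   # 1, 1, 2, 5, 13, 34, 89, 233, ...
print(all(t.denominator == 1 for t in terms))    # True: the divisions always cancel
```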
The infinite sum of all the natural numbers, 1+2+3+..., is a divergent series. But through analytic continuation of the Riemann zeta function, it can be rigorously assigned the value -1/12. This result is actually used in quantum field theory.
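To be clear, the partial sums themselves blow up; -1/12 is simply ζ(-1), the analytically continued value, which SymPy will happily confirm (assuming SymPy is available):

```python
import sympy as sp
print(sp.zeta(-1))   # -1/12, the value behind the famous "sum"
```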