What are some of your unpopular opinions in math? (not about math)
-
For example: I don't believe in the axiom of choice nor in the continuum hypothesis.
Not stuff like "math is useless" or "people hate math because it's not well taught", those are opinions about math.
I'll start: exponentiation should be left-associative, which means a^b should mean b×b×...×b (a times).
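(So, for instance, under that convention 2^3 would come out as 3×3 = 9 rather than 2×2×2 = 8.)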
-
For example: I don't believe in the axiom of choice nor in the continuum hypothesis.
Not stuff like "math is useless" or "people hate math because it's not well taught", those are opinions about math.
I'll start: exponentiation should be left-associative, which means a^b should mean b×b×...×b (a times).
The exceptions involving the number 1. Like it not being a prime number, or any number to the 0 power being 1. Or 0! equals 1.
I know 1 is a very special number, and I know these things are demonstrable, but something always feels off to me with these rules that include 1.
-
The exceptions involving the number 1. Like it not being a prime number, or any number to the 0 power being 1. Or 0! equals 1.
I know 1 is a very special number, and I know these things are demonstrable, but something always feels off to me with these rules that include 1.
X^0 and 0! aren't actually special cases though, you can reach them logically from things which are obvious.
For X^0: you can get from X^(n) to X^(n-1) by dividing by X. That works for all n, so we can say for example that 2³ is 2⁴/2, which is 16/2 which is 8.
Similarly, 2¹/2 is 2⁰, but it's also obviously 1. The argument for 0! is basically the same: 3! is 1×2×3, and to go to 2! you divide it by 3.
You can go from 1! to 0! by dividing 1 by 1. In both cases the only thing which is special about 1 is that any number divided by itself is 1, just like any number subtracted from itself is 0.
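A quick Python sketch of that walk-down, if it helps (illustrative only; the particular numbers and loop are mine):

```python
# Walking 2^n downward by division: the step from 2^1 to 2^0 lands on 1.
power = 2 ** 4              # 2^4 = 16
for n in range(4, 0, -1):
    power = power / 2       # now 2^(n-1)
print(power)                # 1.0, i.e. 2^0

# Same walk for factorials: the step from 1! to 0! divides 1 by 1.
fact = 1 * 2 * 3            # 3! = 6
for n in range(3, 0, -1):
    fact = fact / n         # now (n-1)!
print(fact)                 # 1.0, i.e. 0!
```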
-
X^0 and 0! aren't actually special cases though, you can reach them logically from things which are obvious.
For X^0: you can get from X^(n) to X^(n-1) by dividing by X. That works for all n, so we can say for example that 2³ is 2⁴/2, which is 16/2 which is 8.
Similarly, 2¹/2 is 2⁰, but it's also obviously 1. The argument for 0! is basically the same: 3! is 1×2×3, and to go to 2! you divide it by 3.
You can go from 1! to 0! by dividing 1 by 1. In both cases the only thing which is special about 1 is that any number divided by itself is 1, just like any number subtracted from itself is 0.
The numbers shouldn't change to make nice patterns, though; rather, the patterns that don't fit the numbers don't fit them. Sure, the pattern with division of powers wouldn't be nice, but also 1 multiplied by itself 0 times is not 1, or at least, not only 1.
-
For example: I don't believe in the axiom of choice nor in the continuum hypothesis.
Not stuff like "math is useless" or "people hate math because it's not well taught", those are opinions about math.
I'll start: exponentiation should be left-associative, which means a^b should mean b×b×...×b (a times).
“Terryology may have some merits and deserves consideration.”
I don’t hold this opinion, but I can guarantee you it’s unpopular.
-
The exceptions involving the number 1. Like it not being a prime number, or any number to the 0 power being 1. Or 0! equals 1.
I know 1 is a very special number, and I know these things are demonstrable, but something always feels off to me with these rules that include 1.
Or 0! equals 1.
x factorial is the number of ways you can arrange x different things. There's only one way to arrange zero things.
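For what it's worth, a throwaway Python check with itertools says the same thing (my own snippet, nothing canonical):

```python
from itertools import permutations

# permutations(range(n)) yields every arrangement of n distinct things.
for n in range(4):
    print(n, len(list(permutations(range(n)))))
# 0 1   <- exactly one way to arrange zero things: the empty arrangement
# 1 1
# 2 2
# 3 6
```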
-
The numbers shouldn't change to make nice patterns, though; rather, the patterns that don't fit the numbers don't fit them. Sure, the pattern with division of powers wouldn't be nice, but also 1 multiplied by itself 0 times is not 1, or at least, not only 1.
We make mathematical definitions to do math. We can define 0! any way we want but we defined it to be equal to 1 because it fits in nicely with the way the factorial function works on other numbers.
Literally the only reason why mathematicians define stuff is because it’s easier to work with definitions than to do everything from elementary tools. What the elementary tools are is also subjective. Mathematics isn’t some objective truth, it’s just human made structures that we can expand and better understand through applying logic in the form of proofs. Sometimes we can even apply them to real world situations!
-
For example: I don't believe in the axiom of choice nor in the continuum hypothesis.
Not stuff like "math is useless" or "people hate math because it's not well taught", those are opinions about math.
I'll start: exponentiation should be left-associative, which means a^b should mean b×b×...×b (a times).
Mixed-number fraction syntax [1] is the dumbest fucking thing ever. Juxtaposition of a number in front of any expression implies multiplication! Addition? Fucking addition? What the fuck is wrong with you?
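(Concretely: a mixed number like 1 3/4 is read as 1 + 3/4 = 7/4, even though 3x or 3(4) anywhere else means multiplication.)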
-
For example: I don't believe in the axiom of choice nor in the continuum hypothesis.
Not stuff like "math is useless" or "people hate math because it's not well taught", those are opinions about math.
I'll start: exponentiation should be left-associative, which means a^b should mean b×b×...×b (a times).
I have this odd, perhaps part-troll, feeling that there are two, and only two, non-trivial zeros of the Riemann zeta function that aren't on the critical line, and that they are instead mirrors of each other on either side of it, like some weird pair of complex conjugates. Further, while I really want their real parts to be 1/4 and 3/4, the actual deviation from 1/2 will be some inexplicable irrational number.
-
Multiplication order in current mathematics standards should happen the other way around when it's in a non-commutative algebra. I think this because transfinite multiplication apparently requires the transfinite part to go before any finite part to prevent collapse of meaning. For example, we can't write 2ω for the next transfinite ordinal because 2ω is just ω again on account of transfinite and backwards multiplication weirdness, and we have to write ω·2 or ω×2 instead like we're back at primary school.
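For anyone rusty on ordinal arithmetic, the standard computation behind that collapse, sketched briefly: 2·ω = sup{2·n : n < ω} = ω, whereas ω·2 = ω + ω, which is strictly bigger than ω. The finite factor only does something when it sits to the right of the transfinite one.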
-
Or 0! equals 1.
x factorial is the number of ways you can arrange x different things. There's only one way to arrange zero things.
I could still debate the proposition that zero things can be arranged in any way.
-
I could still debate the proposition that zero things can be arranged in any way.
That sounds like a philosophical position, not a mathematical one.
-
Multiplication order in current mathematics standards should happen the other way around when it's in a non-commutative algebra. I think this because transfinite multiplication apparently requires the transfinite part to go before any finite part to prevent collapse of meaning. For example, we can't write 2ω for the next transfinite ordinal because 2ω is just ω again on account of transfinite and backwards multiplication weirdness, and we have to write ω·2 or ω×2 instead like we're back at primary school.
Multiplication order in current mathematics standards should happen the other way around when it’s in a non-commutative algebra.
The good thing about multiplication being commutative and associative is that you can think about it either way (e.g. 3×2 can be thought of as "add two three times"). The "benefit" of carrying this idea to higher-order operations is that they become left-associative (meaning they can be evaluated from left to right), which is slightly more intuitive. For instance, in lambda calculus, a sequence of Church numerals n₁ n₂ ... nₖ means nₖ ^ nₖ₋₁ ^ ... ^ n₁ in traditional notation (there's a small sketch of this at the end of this comment).
For example, we can’t write 2ω for the next transfinite ordinal because 2ω is just ω again on account of transfinite and backwards multiplication weirdness, and we have to write ω·2 or ω×2 instead like we’re back at primary school.
I'd say the deeper issue with ordinal arithmetic is that Knuth's up-arrow notation, with its recursive definition, becomes useless for defining ordinals bigger than ε₀, because something like ω^(ω↑↑ω) = ω^ε₀ = ε₀. I don't understand the exact notion deeply yet, but I suspect the fact that hyperoperations are fundamentally right-associative is partly to blame.
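Here's the Church-numeral sketch mentioned above, written as plain Python functions rather than raw lambda terms (my own toy encoding, just to make the left-to-right reading concrete):

```python
# church(n)(f)(x) applies f to x exactly n times.
def church(n):
    return lambda f: (lambda x: x if n == 0 else f(church(n - 1)(f)(x)))

def to_int(numeral):
    return numeral(lambda k: k + 1)(0)  # count how many times f gets applied

two, three = church(2), church(3)

# Applying numerals left to right reads the power tower from the bottom up:
print(to_int(three(two)))       # 8   = 2^3
print(to_int(two(three)))       # 9   = 3^2
print(to_int(three(two)(two)))  # 256 = 2^(2^3)
```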
-
“Terryology may have some merits and deserves consideration.”
I don’t hold this opinion, but I can guarantee you it’s unpopular.
Who ever said that? Not a single article I've read about Terryology has praised it. I guess the Joe Rogan podcast helped it gather some followers?
-
The exceptions involving the number 1. Like it not being a prime number, or any number to the 0 power being 1. Or 0! equals 1.
I know 1 is a very special number, and I know these things are demonstrable, but something always feels off to me with these rules that include 1.
0! = 1 isn't an exception.
Factorial is one of the solutions of the recurrence relation f(x+1) = (x+1) * f(x). If one states that f(1) = 1, then it follows from the recurrence that f(0) = 1 too, and in fact f(x) is undefined for negative integers, as it is for any function with that property.
It would be more of an exception to say f(0) != 1, since that would explicitly break the rule and would need some special case just to be defined at 0.
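To spell out the two boundary cases from that recurrence: setting x = 0 gives f(1) = 1·f(0), so f(0) = 1; setting x = −1 gives f(0) = 0·f(−1), which no value of f(−1) can satisfy.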
-
Mixed-number fraction syntax [1] is the dumbest fucking thing ever. Juxtaposition of a number in front of any expression implies multiplication! Addition? Fucking addition? What the fuck is wrong with you?
Juxtaposition of a number in front of any expression implies multiplication!
You think 10 means 1×0?
Fucking addition?
No, place value. It's not juxtaposition unless you have brackets and/or pronumerals in the term.
-
We make mathematical definitions to do math. We can define 0! any way we want but we defined it to be equal to 1 because it fits in nicely with the way the factorial function works on other numbers.
Literally the only reason why mathematicians define stuff is because it’s easier to work with definitions than to do everything from elementary tools. What the elementary tools are is also subjective. Mathematics isn’t some objective truth, it’s just human made structures that we can expand and better understand through applying logic in the form of proofs. Sometimes we can even apply them to real world situations!
Mathematics isn’t some objective truth
Yes it is. That's why it's such a huge part of Physics.
it’s just human made structures
The notation is. The rest is underlying laws of Nature.
Sometimes we can even apply them to real world situations!
Maths' whole reason for being is to model real world situations.
-
Mathematics isn’t some objective truth
Yes it is. That's why it's such a huge part of Physics.
it’s just human made structures
The notation is. The rest is underlying laws of Nature.
Sometimes we can even apply them to real world situations!
Maths' whole reason for being is to model real world situations.
Wow, you just disproved all of academic mathematical foundations and philosophy! Congrats!