Kim recalls Yoichi Miyaoka’s proposed proof of Fermat’s Last Theorem in 1988, which garnered a lot of media attention before serious flaws were discovered. “He became very embarrassed,” says Kim.
People who understand a thing should be able to explain it, IMHO. To say others aren’t smart enough is lame.
Embarrassment is a powerful force, especially for some. It sounds an awful lot like he has become inscrutable to avoid future embarrassment (regardless of whether the proof is actually valid).
I guess Japan is one of those cultures in which shame or embarrassment or loss of face is a Very Big Deal.
On the other hand, my understanding of the culture of mathematics is that people produce flawed proofs all the time (cue the “Simpsons” ha-ha tape), but sometimes the flaws get corrected and you have a good result, and sometimes a “negative result” is respected as a new way of thinking about a problem.
Next, there are things that can be proven in mathematics that are not simple to explain. Really.
I don’t think the problem is that this man thinks “I’m smart and you’re not!” The problem is that he flopped this putative proof out there, but who wants to devote their eyesight and remaining brain cells, before Alzheimer’s sets in, to checking this, um, “totality” of work?
I mean, a lot of us like to write computer code, either for coin or for recreation, but who here really likes to read anyone else’s computer code? And to the extent that math proofs are kinda, sorta like computer software source code, I get the impression from the remarks that this proof is probably somewhere between “spaghetti code” and the insufferable weenie who puts every obscure computer language feature into their code, writes “clever” macros and obscure templates, and expects you to follow it.
Yeah, from the sound of it, it’s not just spaghetti code; it’s that he started out by writing a compiler for a whole new language of his own design and then wrote an interesting new program in it… and, surprise, nobody is too excited about checking that program for bugs.
Then again, maybe the new language is interesting on its own merits; so far it seems like nobody is sure, and his unwillingness to market it is a bit much even among mathematicians.
“who here really likes to read anyone else’s computer code?”
Depends on who wrote it and what language. Elegant code has the quality of art.
Thanks for the link to that cool article.
Bob Clark
This “inter-universal geometer,” this possible genius, may have found the key that would redefine number theory as we know it.
Or maybe after spending years figuring it out, he discovered that, while valid, the conjecture says absolutely nothing about number theory. That it’s all just a coincidence resulting, not from a relationship between addition and multiplication, but from the severe constraints on the numbers one can use to test it.
The article makes a point about how odd it is that additions have strange effects when they’re thought of as multiplications.
But logarithms are still pretty darn cool even though the slide rule is dead.
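And the slide rule was nothing but this trick made of wood: log(a*b) = log(a) + log(b), so multiplying is just adding lengths on a log scale. A minimal sketch in Python (my own illustration, not anything from the article):

    import math

    def slide_rule_multiply(a: float, b: float) -> float:
        # Multiply two positive numbers by adding their logarithms,
        # which is mechanically what a slide rule does.
        return math.exp(math.log(a) + math.log(b))

    print(slide_rule_multiply(3.0, 4.0))  # ~12.0, up to floating-point rounding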
Multiplication is addition, especially in binary where it’s shifting and adding.
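A quick sketch of that shift-and-add idea, again in Python as my own illustration: every set bit of one operand contributes a shifted copy of the other, so the product is built entirely out of shifts and additions.

    def shift_and_add_multiply(a: int, b: int) -> int:
        # Multiply non-negative integers the way binary hardware does:
        # each set bit of b adds a copy of a shifted to that bit's place value.
        result = 0
        shift = 0
        while b:
            if b & 1:                 # low bit of b is set
                result += a << shift  # add a at this bit's place value
            b >>= 1                   # examine the next bit
            shift += 1
        return result

    print(shift_and_add_multiply(6, 7))  # 42, same as 6 * 7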