Mathematics has been called the language of the universe. Scientists and engineers often speak of the elegance of mathematics when describing physical reality, citing examples such as π, E = mc^2, and even something as simple as using abstract integers to count real-world objects. Yet while these examples demonstrate how useful math can be for us, does it mean that the physical world naturally follows the rules of mathematics as its "mother tongue," and that this mathematics has its own existence that is out there waiting to be discovered?

I don't understand why the author seems to discount anything remotely complex as not mathematical. For example,

> Today's submicrometer transistors involve complicated effects that the earlier models neglected, so engineers have turned to computer simulation software to model smaller transistors. A more effective formula would describe transistors at all scales, but such a compact formula does not exist.

What the heck does she think the code of the simulation software is? People wrote that code so that engineers at Intel can actually get work done. Furthermore, aren't those "complicated effects" at smaller scales due to quantum mechanics, which is described extremely well by math?

Why is there the assumption that describing something in the most general form (like "transistors at all scales") is still going to be simple? Is everything supposed to be as simple as E = mc^2?

I took transistor theory years ago. I remember almost none of the formulas, but I do remember the presentation. The engineering professor started with a quantum mechanical model and expanded the formula until it filled a chalkboard. Then he said, "Notice that the dominant term is this one," and circled one of the terms. Then he erased everything but that term and said we'd treat the two as equal. He did this three times, with one of the formula expansions taking up three chalkboards. In the end, he said,

> and that is why we treat transistors as linear amplifiers, and it only applies across a certain range.

Then the next day we had the lab, and the challenge was to build a two-transistor amplifier and show that it was linear across the desired range. You get the parts, put it all together, and get it up on a scope. It was a mess: there were odd sidebands and crazy noise spikes all through the spectrum, and I spent hours tuning the resistors to lock in the range. Wow, it was eye-opening. Yes, one can model a transistor as a linear amplifier, but it's good to know that the full model is a huge, complicated, noisy quantum mechanical problem that would probably take up 20 chalkboards fully expanded.
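The erase-everything-but-the-dominant-term trick is just linearization around an operating point, and you can watch it break down numerically. Here's a rough sketch (my own illustration, not from the article) using the textbook exponential collector-current model for a BJT, with typical but made-up device constants:

```python
import math

# Idealized exponential model for BJT collector current in forward-active mode:
#   Ic = Is * (exp(Vbe / Vt) - 1)
# Real submicrometer devices add many more terms -- this is the "one circled term."
IS = 1e-15    # saturation current in amps (typical small-signal value, assumed)
VT = 0.02585  # thermal voltage at ~300 K, in volts

def ic_exponential(vbe):
    """The 'full' (still idealized) exponential model."""
    return IS * (math.exp(vbe / VT) - 1.0)

def ic_linearized(vbe, bias=0.65):
    """First-order Taylor expansion around a bias point -- the linear amplifier."""
    i0 = ic_exponential(bias)
    gm = i0 / VT  # transconductance: d(Ic)/d(Vbe) at the bias point
    return i0 + gm * (vbe - bias)

# Near the bias point the two models agree; step away and they diverge badly.
for dv in (0.001, 0.01, 0.05):
    exact = ic_exponential(0.65 + dv)
    linear = ic_linearized(0.65 + dv)
    print(f"dV = {dv:5.3f} V  relative error = {abs(exact - linear) / exact:.1%}")
```

A millivolt off the bias point the linear model is essentially exact; 50 mV off it's wrong by more than half, which is roughly the "certain range" the professor was talking about.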

It's like Box said,

> all models are wrong, some are useful

The linear model of a transistor has proven very useful, but one always needs to be aware of the boundaries where it doesn't apply--and that goes for any model.

I don't even see how telepathy/twin-thought could be explained by quantum mechanics. I mean... I'm no physicist, but I don't see an explanation there. Electrons in two people's brains are entangled, so they think the same thoughts? That's not how the brain works. Everyone's neural patterns are different. Flipping one electron's spin in one atom of one molecule of one neuron isn't going to make you suddenly read someone's mind. I think people aren't appreciating how complex this is.

And what about the concept of integers in the first place? That is, where does one banana end and the next begin? While we think we know visually, we do not have a formal mathematical definition.

I think the author is unaware of all the formal definitions of integers that exist. One can start with Hilbert's program ( https://en.wikipedia.org/wiki/Hilbert%27s_program ) as a first attempt at this problem. Then the work of Church and Curry gave formal definitions via lambda constructions. But my personal favorite is the recent work in Homotopy Type Theory ( https://en.wikipedia.org/wiki/Homotopy_type_theory ), which formally defines the integers with only 2 axioms, homotopy and univalence. This is in contrast to early formulations in ZF theory, which (if I remember correctly) required 18 axioms.

To make a blanket statement that we don't have a formal definition of the integers ignores the last 100 years of work on the foundations of mathematics. While the article poses some interesting questions, this demonstrated ignorance on the author's part has me categorizing it as a fluff piece. The introduction and first chapter of Penrose's "The Road to Reality" tackle the article's questions directly and far more elegantly.

> I argue that there are many more cases where math is ineffective (non-compact) than when it is effective (compact). Math only has the illusion of being effective when we focus on the successful examples. But our successful examples perhaps only apply to a tiny portion of all the possible questions we could ask about the universe.

These are limitations of our models rather than of mathematics itself. The world is complex and necessitates complex mathematics, but that doesn't mean math isn't the best approach, or isn't the "mother tongue" of the physical world. Ultimately we can describe the entire physical universe with math.

Speaking of which, if anyone is aware of how to define variance in a multivariate Gaussian using Clifford Algebra, I've been searching for such a definition. I've tried deriving it from scratch and can't quite seem to make it fit. I'd like to reformulate some multivariate statistical models in Clifford Algebra for some research I'm doing.

The problem I have with this article is that the author tries to poke holes in mathematics using "real-world" examples but fails to note that mathematics is an idealization of the real world. We use mathematics to describe the big picture of what's going on, not to say exactly what's happening. That's the job of simulations, which (surprise!) use mathematics describing a smaller scale to build up the larger scale. The real world is very much described by math.