Wednesday, July 29, 2015

When I saw A Mind for Numbers: How to Excel at Math and Science (Even if You Flunked Algebra) by Barbara Oakley, PhD, I was intrigued. The main title was in the form of a cutesy equation, which I supposed was the editor's conceit. Having read the book, I am not so sure.

Dr. Oakley knows what she is talking about: she began as a math-phobe but learned to love the subject. Throughout the book she includes one-page testimonials by people with similar backgrounds who are now comfortable with mathematics, at one level or another.

The book is a breezily-written compendium of learning techniques and tips gathered into 18 chapters. It is not intended to be swallowed whole. Different people learn in different ways, and in many cases a single chapter, or even one point in a chapter, can unlock someone's math potential. But I wonder…

To be human is to be mathematically adept at some level. Very young children, asked to choose one of two piles of coins, will pick the one that is spread out rather than a neat stack of the same number of coins. They equate spatial area with quantity, and don't realize that the two piles are equivalent. But I suspect they have not yet learned to count, and it takes that further level of sophistication before they have the mental equipment to evaluate the two piles fairly.

I think that is analogous to an experience I had at about age 12. Someone had shown me a few Algebra equations. I saw something like 10x=5 and wondered, "How can that be?" I thought the "x" was supposed to represent a digit, so the expression on the left would be a number from 100 to 109. I figured something else had to be going on. This caused quite a delay in my getting the point when I began Algebra class later that year. But I think that a month or so into the school year, when it all began to "click", my brain had simply grown up enough to have the right tools for doing algebra.

We all do a certain amount of calculation. Most of us can quickly evaluate the change we're given at the store (if we used cash). People who bowl soon learn to keep score without writing down their calculations. When we drive (without a GPS), and we see "Chicago, 95 miles", we check the odometer, note what it shows, and can then glance at it later and know how many miles we still have to go. Many times, we can even be told a "problem" like this:

John and Mary ride bicycles toward each other at 5 mph, from 1 mile apart. Their pet bird, which flies 20 mph, flies back and forth from one to the other until they meet. How far does the bird fly?

Most of us, by age 10 or so, can figure that John and Mary each ride half a mile, because they are going the same speed. The bird flies four times as fast as either of them, so its total flight is two miles. This kind of "figuring" is actually algebra, without the equations. In fact, it would take me longer to write down the equations for solving this using "traditional" algebra, than it did to write the two sentences above. What is funny is when someone tries to tackle the problem as a series of flights of decreasing length by the bird. The equations to get that to work are gnarly!
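For the skeptical, both the shortcut and the "gnarly" series approach can be checked in a few lines of Python. This is a sketch: the speeds and the 1-mile separation come straight from the problem, and the leg-by-leg loop is my own illustration of the flights-of-decreasing-length method.

```python
# Shortcut: the riders close a 1-mile gap at a combined 10 mph,
# so they meet after 0.1 hour; the bird flies at 20 mph the whole time.
shortcut = 20 * (1.0 / (5 + 5))   # miles flown: 2.0

# Series: sum the bird's back-and-forth legs explicitly.
sep, total = 1.0, 0.0             # current separation, bird's distance so far
for _ in range(60):               # legs shrink geometrically; 60 is plenty
    t = sep / (20 + 5)            # leg time: bird closes on the oncoming rider
    total += 20 * t               # bird's distance on this leg
    sep -= 10 * t                 # meanwhile the riders keep closing at 10 mph
print(shortcut, total)            # both come out to 2.0 miles
```

Both routes agree, which is exactly the point: the series is a lot of machinery for an answer the one-liner already gives.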

So every one of us has some amount of math built in. Standard equipment. But that "some amount" varies a great deal from person to person. Not everyone can learn algebra, no matter how it is taught or how hard they try. But most can, and by "most" I mean "more than half but not a great deal more". Some kids who had no trouble with algebra never, ever get the point of Trigonometry. My senior year of high school, we got done with the ordinary curriculum for the year a few days early, so the teacher did an experiment. He got out a few copies of a basic text in Calculus and taught it to us. In just a couple of weeks, I learned enough Calc so that I pretty much breezed through the Calc 101 course the next year in college, which used that same textbook!

I was a working mathematician, at a certain level, for decades. But there are branches of math that have never made sense to me, and others that I can puzzle out with desperate levels of effort. I had to take Differential Equations three times to pass it. I'm still not comfortable with it, but the explanation of how to use it takes only two pages in my old CRC Handbook of Chemistry & Physics, and actually contains most of what one needs to do nearly any Diff Eq problem!

And so it goes. Each human brain has a certain mathematical limit. With luck, we might grow mathematically to our full potential, but it is time consuming. Most of us never need all that stuff. But we also grow into certain abilities over time. Just as the brain doesn't finish emotional maturation until about age 25, it must be true that certain math circuits only get set up at certain ages. It may be that the ten years between my second and my third try at learning Diff Eq made more difference in my ability, than the exposure I'd had during the first two attempts.

With all that in mind, I won't delve point-by-point into what Dr. Oakley has to say. The book is a fantastic resource. Someone who needs encouragement and help in learning math and science will do well to read the book quickly, then return to read certain sections with more care: those that seemed to make the most sense the first time through. The first two chapters will be helpful to everyone. As for the others, while the author attempts to make them generally applicable, each will actually be best suited to people with a certain kind of mind, one way or another.

Thursday, July 23, 2015

A friend gave me an old paperback copy of Travels in Hyperreality by Umberto Eco. He is one of those authors about whom I have heard a little, but really knew nothing more than his name. That made the reading a bit of an adventure.

Chapters 1, 7, and 8 are long essays in the form of extended travelogues. The other five chapters are collections of shorter essays, republished from newspaper columns, dating from the late 1960s to 1980. All were translated by William Weaver.

There was a certain sameness about the second and later chapters, once I could stand back and view them all. Though they are diverse in subject, they are all the views of a skeptical intellectual who revels in digging into less-traveled corners of this or that concept. Chapter one, from which the book gets its title, stands alone as a travelogue in two senses: physical travel experienced as cultural and conceptual travel. It soon becomes clear that "hyperreality" refers to the United States of America, and most specifically to the breadth of cultural milieus that form a different kind of national map. None of them can be found "across the pond". This is probably as true today as it was 40-50 years ago, even though Europe is getting populated with McDonald's joints, Disneylands, and so forth.

There is just something about certain places that you can say, as certain advertisements in the US have it, "Often imitated, never duplicated." Such places need but one name. In Europe, they are usually great capitals: Rome, Paris, London, Oslo, Berlin, and so forth. In the U.S., some are great cities: Chicago, Las Vegas, New York, New Orleans; but some are more regional: SoCal, The Valley, 'Bama, The Rockies, or Down East. (To be fair, Europe also has regional memes: The Loire Valley, Tuscany, the Alps…)

Eco delved into certain of these, but his interest was conceptual landscapes, which frequently transcended the geographic. Thus, he begins by exploring "Fortresses of Solitude", modeled in his telling on Superman's lair, because of the American penchant for greatly expanding the meaning of "Museum" far beyond the way the word is used elsewhere. Starting with the Lyndon B. Johnson Library with its hyper-eclectic gathering of artifacts, including a full-size, detailed replica of the Oval Office (but brighter and shinier), he passes through several similar establishments sporting full-size replicas of this or that building or collection thereof, and winds up at a replica Colonial farm, complete with livestock…or as close as the proprietors could come to a replica, what with changes in farm animals over the past 3-4 centuries. I found myself wondering what he'd have thought of the Winterthur Museum in Delaware, the 175-room mansion of Henry F. du Pont, which is primarily composed of entire salons—walls, ceilings, floors, windows, furniture, art and all—bought from other mansion-builders who'd fallen on hard times all across America.

Then he dwells upon wax museums that go beyond the "statuary" genre of Madame Tussaud's, and this country has a great many of them. Wax replicas of people are one thing. Some of the items are replicas of works of art, such as one of Michelangelo's David, but colored as the statue might once have been, and perhaps as David was in life. Is that more than a replica, or less?

In another turn, he starts with Wm. R. Hearst's "Castle" in San Simeon, remarking at length upon its confused mix of artifact and artifice, the real and the fake, the old and the new and the new-but-looks-old. He touches on other such American castles, and again, I was wondering what he'd have thought of the "summer cottages" in Newport, Rhode Island, where the Vanderbilts and others escaped the oppressive summers of their "real" homes in the Carolinas. After contrasting these various kinds of "museums" with the Forest Lawn Cemetery's exhibits, he gets to the real meat of his essay, the real homes of hyperreality, the amusement parks. Quite simply, American entrepreneurs have turned the notion of a "house of amusement" inside-out, first with Disneyland, in California, and on a scale 150 times larger in Orlando, Florida, home of Disney World (and Mr. Disney meant it), yet not only the "Disneys", but also Knott's Berry Farm, the various Universal Studios properties, and sundry other places, all fitting the moniker "theme park". Why, my former favorite Mojave Desert destination, Calico Ghost Town, is well on its way to becoming a theme park, though its current admission fee is only 1/10 what it costs to visit a Disney for a day (but then, you can experience all that Calico has to offer in a day. Disney World? Not even close).

Nothing is off limits to the American amusement machine, it seems. Eco samples the religious fare; one of the "church celebrities" he saw must have been Kathryn Kuhlman, by the description: she was a walking, talking, one-woman circus if ever there was one. And his conclusion of it all? That Americans must love fakery better than the genuine, for we certainly consume enough of it. And that's what hyperreality means, after all, not just a fake but an enhanced fake, a fake "on steroids", a fake that is a great deal more enjoyable than the original.

After all that, the rest of the book is "merely" brilliant. I gradually realized that Eco is a European intellectual with a capital I. The titles of the chapters and essays are elliptical, on purpose. If I get started commenting in detail, the foregoing will be a tenth of what comes after. I think instead I'll give it a rest, and say that Eco peers here, there and everywhere, and always has something to say that is at least interesting and thought-provoking, and is often useful.

Friday, July 17, 2015

From a distance, one simply sees a misshapen, though symmetric, skull. Once the book is in hand, the details resolve into a pair of men in tall hats flanking a pair of trees, a row of fenced tombstones, and a small, solitary female figure in the background. This cover art, coupled with the placement of Unseemly Science by Rod Duncan in the "Sci-Fi/Fantasy" shelving, promised a thorough mix of genres, and indeed, it proved a delightful mix.

Another detail in the cover art is more subtle. Upon a lengthy look, the scene is found to be snowy, with mountain shadows behind. The author's writing is similarly subtle. It took a good while for me to realize that the ice itself was the core around which the mystery resolved. Yet if I reveal more than that, it will be an "unseemly spoiler".

The milieu is of more immediate interest. The book is set in an early 21st Century England with a distinctly 19th Century flavor. An armistice in 1819, following a civil war, split the country along a line through the Midlands that divides Leicester into North and South halves. To the south is the Kingdom, centered on London, and to the north is the Republic, centered on Carlisle. The former British Empire is now popularly called the Gas-Lit Empire. Most nations of the world have invested great power in the International Patent Office.

Unlike the familiar patent authorities of modern nations, which exist to facilitate technology, the Patent Office enforces the Great Accord, which primarily limits technology to innovations that can be shown to "protect and insure the wellbeing of the common man." One area considered practically exempt from their oversight is medical innovation, based on the risky notion that any medical advance must be beneficial. I'd guess they forgot Dr. Mengele.

The protagonist is Elizabeth Barnabus, a fugitive from the Kingdom living in the Republic. Her backstory is told in an earlier book by Duncan, The Bullet-Catcher's Daughter. Upon becoming pubescent and lovely, she'd been "acquired" by a certain nobleman as a plaything (mistress), but escaped and ran northward. She lives by her wits, a kind of female Sherlock Holmes, aided by her skills in disguise. Being tall and less shapely than one might expect, she is adept at taking on a male persona and doing business as her brother when a man's work is needed. Her "brother" has been asked to look into the apparent theft of ice, which is produced in large amounts in the Welsh mountains by poor families of ice farmers, and transported southward, where it is kept frozen by large, inefficient cooling machinery.

Chapters in this book are headed by quotes from two as-yet unwritten books, The Bullet-Catcher's Handbook and From Revolution. The latter is described in a glossary as a mix of writings reaching back to the Federalist Papers. I am intrigued by the titling of carnival illusionists as bullet-catchers. Having seen on a Mythbusters episode that catching a bullet with one's teeth is quite impossible, no matter how much powder you remove from the shell, I understand that the carnival illusion is one of the most skillful.

Male writers can never totally pull off writing in a female voice. Mr. Duncan does as well as any I've read, but the very familiar sense I had reading it indicated that the character of Ms Barnabus is more male-like than a female writer would have made her. I suppose the author might protest that her frequent forays into a male world, disguised as a man, make her rather mannish in general. Perhaps. She is, nonetheless, a very engaging doubly-secret detective: doubly so in that she must do her work in secret, females being forbidden from doing business in the Republic.

Dramatic tension is amplified when the Republican government takes up a bill that would enact an extradition treaty with the Kingdom. Though the Republic has tolerated numerous fugitives from the Kingdom, "proper folk" (meaning mainly those with "jobs" few of us would call "gainful employment") look askance at such immigrants, and they intend to legislate them out of existence. Most will be forcibly returned to the Kingdom, finding themselves on a rapid course to the tight end of a noose. The author has done a delightful job of rendering the above ingredients into a gripping tale of multiple betrayals and surprising heroics.

I'm in the process of scaring up a copy of the earlier book. This one ends with sufficient closure that, while one knows the author plans another volume (or more), the book is a unit unto itself. One thing is clear. If she can, Ms Barnabus means to bring about the downfall of the Patent Office. My wager'd be on that being the subject of a successor volume.

Friday, July 10, 2015

When I was about ten, I was disappointed in a picture I'd taken. I had been too far from the person I was "shooting", so he looked like no more than a couple of dots. Having recently learned about enlargements, I suggested getting the middle of the picture enlarged. My father remarked that the photo shop charges a lot for enlargements. Then I suggested putting it under my microscope and taking another picture, then getting that printed—I'd already been setting up a clumsy rig with a tripod holding Dad's camera at the eyepiece and making photos of the cells in thin-sliced carrots and leaves. He said I could try, but it would be very blurry, then explained about the grain in the print and in the negative. I looked, and sure enough, even at 25X the film grain made the picture look like it was printed on sand.

The next year he and I made a small telescope (I still use it), and I learned about diffraction and the magnification limit of an optical system. I realized, even if the film and print grain were a hundred times smaller, and even if the optics of the camera were flawless, diffraction would limit how much I could enlarge the final image.

This is an illustration of the Rayleigh criterion for resolving star images in a telescope. I downloaded it from the Angular Resolution article in Wikipedia. The upper section shows that the Airy Disks of the two stars are fully separated. The Airy Disk is everything inside the first dark ring (first null). The lowest section shows serious overlap, and the middle section shows the Rayleigh criterion, at which point the first null of one Airy Disk passes through the center of the other. This is the accepted resolution limit of a telescope system, or indeed, any optical system, including the eye.

What causes this pattern? It results from the interaction of light from a distant point source (or multiple sources) passing through a circular aperture. Just by the way, if you should get the notion to make a telescope with a rectangular aperture, under high magnification you'll get a diffraction pattern more like this:

Such diffraction patterns, I realized one day, are a visible manifestation of quantum-mechanical effects. If you could solve the Schrödinger Wave Equation for this system, the squared magnitude of its solution would look like this image. The solution of the SWE lives in complex space and represents probability amplitudes; the squared magnitude of the amplitude at any point gives the intensity of, for example, a beam of light or electrons as it is spread through space by diffraction. One characteristic of the SWE is that, while there will frequently be numerous nulls, or zeroes, in the solution, there is no greatest angle or maximum distance beyond which the solution is always zero. This is why even huge telescopes such as the 10 m diameter Keck telescopes in Hawaii still have a diffraction pattern once all other aberrations are accounted for (the atmosphere is a much bigger light scatterer "down here", though).
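To put a number on that limit, the Rayleigh criterion can be computed directly. This is a sketch: the 10 m aperture and 550 nm wavelength match the Keck example, and the 1.22 factor comes from the first zero of the Bessel function that describes the Airy pattern.

```python
import math

def rayleigh_limit_rad(wavelength_m, aperture_m):
    # Rayleigh criterion: the first null of one Airy disk falls on the
    # center of the other at an angular separation of 1.22 * lambda / D
    return 1.22 * wavelength_m / aperture_m

theta = rayleigh_limit_rad(550e-9, 10.0)   # a 10 m mirror in yellow-green light
arcsec = math.degrees(theta) * 3600
print(theta, arcsec)   # about 6.7e-8 radian, or 0.014 arcsecond
```

In practice the atmosphere limits ground-based seeing to something far coarser than 0.014 arcsecond unless adaptive optics are used.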

So, think of it. The yellow-green light that our eyes are most sensitive to has a wavelength of 0.55 µm, or 550 nm. That's pretty small, about 1/1800 mm. And, even if we are comfortable with photons, the minimal packets of light, we think of them as having a similar "size". But diffraction patterns show us that a photon can somehow "sense" the entire aperture as it "chooses" by how much to change its direction of travel. A certain experiment that has been done with both photons and electrons proves it:

Set up a very, very light-tight box with a dimmable light source at one end, a sheet with a hole in it about midway, and either a sheet of film or an array of sensitive detectors (e.g. a digital camera sensor) at the opposite end.

Let's assume the light source is accompanied by a lens system that makes a uniform beam larger in diameter than the hole in the sheet.

Set the "brightness" of the light source such that there will very seldom be more than one photon inside the box at any one time. That's pretty dim!

A 550 nm photon has an energy of 2.254 eV.

A 1 mW yellow-green laser tuned to that wavelength (you can do that with dye lasers) emits 2.77 quadrillion photons per second.

Light traverses a 1-meter box in about 3 ns.

The 1 mW laser thus emits 8.3 million photons in those 3 ns.

Thus you must dim the beam by a factor of more than 8 million. That is 23 f/stops, or an ND of 6.9. Two pieces of #9 welding glass are about right.

Close the box, turn on the light, and wait about 3 hours.

Develop or download the resulting image. It will have the same diffraction pattern as if you'd left off the filters and shot a picture in 1/1000 sec.
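The arithmetic behind those steps can be verified in a few lines of Python. A sketch using rounded physical constants; with the exact 3.3 ns transit time the in-flight count comes out nearer nine million, still comfortably "more than 8 million":

```python
import math

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electron-volt

wavelength = 550e-9                 # meters
e_photon = H * C / wavelength       # energy per photon, ~3.61e-19 J
print(e_photon / EV)                # ~2.254 eV

rate = 1e-3 / e_photon              # photons/s from 1 mW: ~2.77e15
transit = 1.0 / C                   # time to cross a 1 m box: ~3.3 ns
in_flight = rate * transit          # photons in the box at once: ~9 million

stops = math.log2(in_flight)        # ~23 f-stops of dimming needed
nd = math.log10(in_flight)          # ~7 in neutral-density units
print(rate, in_flight, stops, nd)
```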

The experiment has been done many times, usually using a two-slit setup. Either way, it shows that both a photon and an electron somehow "self-interfere" as they are influenced by everything along the way from emitter to "final resting place."

All the above serves to get my mind in gear to write about The Quantum Moment: How Planck, Bohr, Einstein, and Heisenberg Taught Us to Love Uncertainty By Robert P. Crease and Alfred Scharff Goldhaber. The authors, professors at Stony Brook University, aim to demonstrate that "quantum stuff" keeps things from either collapsing or flying apart. That we owe our lives to it. Dr. Goldhaber, in particular, draws upon classroom experience, for he teaches a course that uses optics to introduce quantum mechanics.

The book is filled with mini-histories and mini-biographies of the "physics greats" of a century ago who wrestled with phenomena revealing that Newtonian mechanics is not up to the task of explaining all the little stuff that underlies our everyday experience. Optical diffraction is just one such phenomenon. If there were no diffraction, you could put a really powerful eyepiece on an ordinary pair of binoculars and see to the end of the universe...if your eyes were sensitive to really, really dim light (telescopes are big mainly to collect more light; high resolution is also good, but is secondary in many cases).

Einstein imagined riding a beam of light from emitter to absorber. Nowhere have I read an explanation that, from the photon's point of view, nothing happens at all. The special theory of relativity, with length compression by Lorentz contraction, and time dilation, only applies to non-photons, and in particular, particles with mass. If you take Lorentz contraction and time dilation to their limits at v=c, the photon travels no distance at all, and does so in zero time. So there is nothing to experience! From a photon's point of view, the entire universe has zero size and time has no meaning; the big bang may as well never have happened!

What if we step back a tiny bit, and imagine the neutrinos that arrived in 1987, heralding the core collapse of an immense star in the Large Magellanic Cloud, Supernova 1987a (SN1987a). I haven't read any analysis of their apparent velocity, but it must have been only the tiniest whisker slower than c. Neutrinos do have some mass, less than about a millionth of the mass of an electron, so they tend to have near-c velocities. It is likely that the "clock" of those neutrinos registered only a few minutes during their journey of about 168,000 light years, and the distance seemed at most a few hundreds or thousands of kilometers. Now, that is relativistic.

What did Einstein and Planck and Heisenberg do that got everyone (among physicists) all in a dither for the first half of the Twentieth Century? First, Planck applied a minimum limit to the "packets" of energy radiating from a heated object, in order to combine two competing, and incompatible, mathematical models of "black body radiation" into a single formula. Einstein later showed a simpler derivation of that formula. But at first, physicists just thought of it all as a mathematical trick. In between, Einstein had described a good theory of the photoelectric effect, which seemed to require that light be in finite packets, that we now call photons.

Photons are usually small in terms of the energy they convey. As mentioned above, the yellow-green color seen at 550 nm wavelength is carried by photons with an energy of 2.254 eV (electron-Volts). An eV is about a sixth of a billionth of a billionth of a joule (1.6×10⁻¹⁹ J), and one watt is defined as one joule per second. But molecules are also small, and the energies that underlie their structure are similarly small. UVb radiation from the sun, at just half the wavelength, and thus twice the energy, of "550 nm yellow-green", breaks chemical bonds in your skin, causing damage that can lead to cancer. So use sunscreen! (Half of 550 nm is 275 nm, with a photon energy near 4.5 eV; more than enough to knock a carbon-carbon bond for a loop.)
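The wavelength-to-energy conversion used here is a one-liner. A sketch: the 1239.84 eV·nm constant is just h times c expressed in those units.

```python
HC_EV_NM = 1239.84   # h*c expressed in eV*nm

def photon_energy_ev(wavelength_nm):
    # photon energy is inversely proportional to wavelength
    return HC_EV_NM / wavelength_nm

print(photon_energy_ev(550))   # ~2.254 eV, yellow-green
print(photon_energy_ev(275))   # ~4.51 eV: half the wavelength, twice the energy
```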

Book after book is filled with the stories of the founders and discoverers of quantum physics. This book puts it all into a context that the authors call the Quantum Moment. They use the word "moment" the way a historian uses "era". From 1687 until 1927, the Newtonian Moment dominated about 240 years of physics discovery. Once a critical mass of physicists had to accept that quantum phenomena were real, not just mathematical tricks, the Quantum Moment arrived. The story of the epic battle between Bohr, who formulated the Copenhagen Interpretation, and Einstein, whose work stimulated Bohr and others but who then recoiled from the consequences, is told here with more feeling and clarity than in any other account I've read.

Scientists have an emotional bond with their science. For many of them, it is their church, which they defend as keenly as any ardent fundamentalist Christian defends his church's theology. In the Newtonian Moment, phenomena whose initial state could be perfectly described were thought to be perfectly predictable. The math might be gnarly, but it could, in principle, be done. Quantum theory, and then quantum mechanics, cracked this notion open, blow by blow, and showed it to be a fantasy.

This is not just a problem of imperfect knowledge, rounding errors, or the need to simplify your equations to make them solvable. Heisenberg's Uncertainty Principle is not just a description of the way a measurement apparatus "kicks" a particle when you are measuring its location or velocity. What is Uncertain is not your measurement, but the actual location and velocity of the particle itself, at least according to Bohr. One implication of this with more recent application is the "no-cloning" principle, which makes certain applications of quantum computing impossible. However, it also makes it very possible to create unbreakable cryptographic codes, which has the governments of the world (or their equivalents of our NSA and CIA) all a-quiver.

Then there's the cat. The authors give us the luscious details of the Schrödinger's Cat satire, which he proposed as a slap against the notion of an "observer". Bohr and others needed some instruction from optics: every quantum particle is sensitive to, very literally, everything in the universe. All at once, and with no apparent limitation set by c. Heck, half the time, the cat is the only observer that matters. The other half, the cat is dead, and it ceases to matter to him. But, the authors point out, the air in the box is an "observer": the exchange of oxygen, water, and carbon dioxide around a breathing cat is quite different from that near a dead one. So all we can say from outside the box with the cat in it is that we can't decide the status of the cat without looking inside. We just need to remember that the term "observer" is very squishy.

I recall reading that even a pitched baseball has a "wavelength", according to the de Broglie formula. It is really tiny: only several times larger than the Planck length of about 1.6×10⁻³⁵ m, in fact. That means the de Broglie wavelength of a jet aircraft is much, much smaller than the Planck length, which is why "real world" phenomena are easily treated as continuous for practical matters.
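The comparison is easy to check. This sketch assumes a 145 g baseball at about 90 mph (40 m/s) and a 100-tonne airliner at 250 m/s, my own round numbers:

```python
H = 6.626e-34            # Planck constant, J*s
PLANCK_LENGTH = 1.6e-35  # meters

def de_broglie_m(mass_kg, speed_m_s):
    # de Broglie wavelength: lambda = h / (m * v)
    return H / (mass_kg * speed_m_s)

baseball = de_broglie_m(0.145, 40.0)   # ~1.1e-34 m
jet = de_broglie_m(1.0e5, 250.0)       # ~2.7e-41 m
print(baseball / PLANCK_LENGTH)        # a handful of Planck lengths
print(jet / PLANCK_LENGTH)             # far below one Planck length
```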

But the Cat, and the Uncertainty limit, show that the boundary between quantum and "classical" worlds is hard to pin down. Since that boundary is the core of the Copenhagen Interpretation, the Interpretation is seen to be weak at best, and in the eyes of some physicists, simply wrong. But there is no well-attested competing theory.

We must remember that the theories and mathematics of quantum "stuff" describe lots of "what" and a little bit of "how". They tell us nothing about "why". We don't know why there is a Pauli Exclusion Principle: why two electrons, and two only, can coexist in an atomic "s" shell, and then only if they have opposite spins (and that "spin" is oddly different from the way a top spins). But we do know that if it were not so, atoms would collapse in a blast of brightness, almost immediately, and the universe would collapse back into a reverse of the big bang, all at once and everywhere.

One scientist's work is not mentioned in this book, probably because he wasn't directly involved in the quantum revolution. But his work is pertinent in another way. Kurt Gödel formulated his Incompleteness Theorems in 1931, early in the Quantum Moment. Together, they show that no mathematical system can "solve" every problem that can be stated using its postulates, and that no mathematical system can be used to describe its own limitations. For example, there are rather simple polynomials that can be formulated using Algebra, but can only be solved using Complex Analysis. Even weirder if you know only Algebra, the simple formula X²=1 has two answers (1 and -1), but we tend to think that Xⁿ=-1 has only the answer -1 when n is odd, and is "imaginary" when n is even. But in Complex analysis, when n=3, for example, there are three answers, two of them involving an "imaginary" part.
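The X³ = -1 case is easy to verify with Python's built-in complex arithmetic (a sketch):

```python
import cmath

# The three cube roots of -1 lie at angles (pi + 2*pi*k)/3 on the unit circle.
roots = [cmath.exp(1j * (cmath.pi + 2 * cmath.pi * k) / 3) for k in range(3)]
for r in roots:
    print(r, r**3)   # each cubes back to -1, within rounding
```

One root is the familiar -1; the other two are a complex-conjugate pair, each with an "imaginary" part.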

At present, then, science has three boundaries to infinite exploration:

Heisenberg Uncertainty: You can't pin down both the position and the momentum of a particle beyond a fixed limit of joint precision, no matter how good your instruments are.

Schrödinger Undecidability: You can't predict quantum phenomena on a particle-by-particle basis. Even if you could escape the Uncertainty Principle, you couldn't do anything of great use with the results (which would fill all the computers in the known universe, just describing a helium atom to sufficient precision).

Gödel Incompleteness: You can't solve most of the questions being asked in the framework of quantum mechanics, not now, not ever, using the methods of quantum mechanics. QM appears to be the most Gödelian of mathematical systems, in that it asks so few questions that can be answered!

For scientists who grew up in the Newtonian Moment, it is like finding out that your church has no roof, and the rain and raccoons are getting in and taking over the place. No wonder Einstein was upset! We are in the Quantum Moment, nearly 90 years into it, and it may be another century or two before a new Moment supersedes it. Get used to it.