Category Archives: Science News


Sunday, April 22, 2018

First Contact — Charles Q. Choi in the Washington Post looks at the possibility that there were other civilizations here on Earth long before we came along.

Reptilian menaces called Silurians evolved on Earth before humankind — at least in the “Doctor Who” rendition of the universe. But, science fiction aside, how would we know if some advanced civilization existed on our home planet millions of years before brainy humans showed up?

This is a serious question, and serious scientists are speculating about what traces these potential predecessors might have left behind. And they’re calling this possibility the Silurian hypothesis.

When it comes to the hunt for advanced extraterrestrial civilizations that might exist across the cosmos, one must reckon with the knowledge that the universe is about 13.8 billion years old. In contrast, complex life has existed on Earth’s surface for only about 400 million years, and humans have developed industrial civilizations in only the past 300 years. This raises the possibility that industrial civilizations might have been around long before human ones ever existed — not just around other stars, but even on Earth itself.

“Now, I don’t believe an industrial civilization existed on Earth before our own — I don’t think there was a dinosaur civilization or a giant tree sloth civilization,” said Adam Frank, an astrophysicist at the University of Rochester and a co-author of a new study on the topic. “But the question of what one would look like if it did [exist] is important. How do you know there hasn’t been one? The whole point of science is to ask a question and see where it leads. That’s the essence of what makes science so exciting.”

Artifacts of human or other industrial civilizations are unlikely to be found on a planet’s surface after about 4 million years, wrote Frank and study co-author Gavin Schmidt, director of NASA’s Goddard Institute for Space Studies. For instance, they noted that urban areas currently take up less than 1 percent of Earth’s surface and that complex items, even from early human technology, are very rarely found. A machine as complex as the Antikythera mechanism — the geared astronomical calculator of the ancient Greeks, often considered the world’s first computer — had no known equal until elaborate clocks were developed in Renaissance Europe.

One may also find it difficult to unearth fossils of any beings who might have lived in industrial civilizations, the scientists added. The fraction of life that gets fossilized is always extremely small: Of all the many dinosaurs that ever lived, for example, only a few thousand nearly complete fossil specimens have been discovered. Given that the oldest known fossils of Homo sapiens are only about 300,000 years old, there is no certainty that our species will even appear in the fossil record in the long run, they added.

Instead, the researchers suggested looking for more-subtle evidence of industrial civilizations in the geological records of Earth or other planets. The scientists focused on the signs of civilization that humans might create during the Anthropocene, our current geological age, characterized by humans’ influence on the planet.

“After a few million years, any physical reminder of your civilization may be gone, so you have to look for sedimentary anomalies, things like different chemical balances that just look wacky,” Frank said.

For instance, humans living in industrial civilizations have burned an extraordinary amount of fossil fuels, releasing more than 500 billion tons of carbon from coal, oil and natural gas into the atmosphere. Fossil fuels ultimately derive from plant life, which preferentially absorbs more of the lighter isotope carbon-12 than the heavier isotope carbon-13. When fossil fuels get burned, they alter the ratio of carbon-12 to carbon-13 normally found in the atmosphere, ocean and soils — an effect that could later be detected in sediments as hints of an industrial civilization.
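(An editor’s aside, not from the Post piece: geochemists express this shift in standard delta notation, and the rough figures below are textbook values rather than numbers from the study.)

\delta^{13}\mathrm{C} \;=\; \left(\frac{({}^{13}\mathrm{C}/{}^{12}\mathrm{C})_{\text{sample}}}{({}^{13}\mathrm{C}/{}^{12}\mathrm{C})_{\text{standard}}} - 1\right) \times 1000 \quad \text{(in parts per thousand, ‰)}

Plant-derived fossil carbon is strongly depleted in carbon-13 (roughly −25‰, against a pre-industrial atmosphere near −7‰), so burning it drags the atmospheric value downward, a negative excursion that sediments can record long after every smokestack is gone.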

In addition, industrial civilizations have discovered ways to artificially “fix” nitrogen — that is, to break the powerful chemical bonds that hold nitrogen atoms together in pairs in the atmosphere, using the resulting single nitrogen atoms to create biologically useful molecules. The large-scale application of nitrogenous fertilizers generated via nitrogen fixing is already detectable in sediments remote from civilization, the scientists noted.

The Anthropocene is also triggering a mass extinction of a wide variety of species that is probably visible in the fossil record. Human industrial activity may also prove to be visible in the geological record in the form of long-lived synthetic molecules from plastics and other products, or radioactive fallout from nuclear weapons.

One wild idea the Silurian hypothesis raises is that the end of one civilization could sow the seeds for another. Industrial civilizations may trigger dead zones in oceans, causing the burial of organic material (from the corpses of organisms in the zones) that could, down the line, become fossil fuels that could support a new industrial civilization. “You could end up seeing these cycles in the geological record,” Frank said.

All in all, thinking about the impact a previous civilization might have had on Earth “could help us think about what effects one might see on other planets, or about what is happening now on Earth,” Frank said.

Sunday, December 31, 2017

On Thursday, El Caudillo del Mar-A-Lago sat down with Michael Schmidt of The New York Times for what apparently was an open-ended, one-on-one interview. Since then, the electric Twitter machine — and most of the rest of the Intertoobz — has been alive with criticism of Schmidt for not having pushed back sufficiently against some of the more obvious barefaced non-facts presented by the president* in their chat. Some critics have been unkind enough to point out that Schmidt was the conveyor belt for some of the worst attacks on Hillary Rodham Clinton emanating from both the New York FBI office and the various congressional committees staffed by people in kangaroo suits. For example, Schmidt’s name was on a shabby story the Times ran on July 23, 2015, in which it was alleged that a criminal investigation into HRC’s famous use of a private email server was being discussed within the Department of Justice. It wasn’t, and the Times’ public editor at the time, the great Margaret Sullivan, later torched the story in a brutal column.

Other people were unkind enough to point out that the interview was brokered by one Christopher Ruddy, a Trump intimate and the CEO of NewsMax, and that Ruddy made his bones as a political “journalist” by peddling the fiction that Clinton White House counsel Vince Foster had been murdered, one of the more distasteful slanders that got a shameful public airing during the Clinton frenzy of the 1990s. Neither of those will concern us here. What Schmidt actually got out of this interview is a far more serious problem for the country. In my view, the interview is a clinical study of a man in severe cognitive decline, if not the early stages of outright dementia.

Over the past 30 years, I’ve seen my father and all of his siblings slide into the shadows and fog of Alzheimer’s Disease. (The president*’s father developed Alzheimer’s in his 80s.) In 1984, Ronald Reagan debated Walter Mondale in Louisville and plainly had no idea where he was. (If someone on the panel had asked him, he’d have been stumped.) Not long afterwards, I was interviewing a prominent Alzheimer’s researcher for a book I was doing, and he said, “I saw the look on his face that I see every day in my clinic.” In the transcript of this interview, I hear in the president*’s words my late aunt’s story about how we all walked home from church in the snow one Christmas morning, an event I don’t recall, but that she remembered so vividly that she told the story every time I saw her for the last three years of her life.

In this interview, the president* is only intermittently coherent. He talks in semi-sentences and is always groping for something that sounds familiar, even if it makes no sense whatsoever and even if it blatantly contradicts something he said two minutes earlier. To my ears, anyway, this is more than the president*’s well-known allergy to the truth. This is a classic coping mechanism employed when language skills are coming apart. (My father used to give a thumbs up when someone asked him a question. That was one of the strategies he used to make sense of a world that was becoming quite foreign to him.) My guess? That’s part of the reason why it’s always “the failing New York Times,” and his 2016 opponent is “Crooked Hillary.”

In addition, the president* exhibits the kind of stubbornness you see in patients when you try to relieve them of their car keys—or, as one social worker in rural North Carolina told me, their shotguns. For example, a discussion on healthcare goes completely off the rails when the president* suddenly recalls that there is a widely held opinion that he knows very little about the issues confronting the nation. So we get this.

But Michael, I know the details of taxes better than anybody. Better than the greatest C.P.A. I know the details of health care better than most, better than most. And if I didn’t, I couldn’t have talked all these people into doing ultimately only to be rejected.

This is more than simple grandiosity. This is someone fighting something happening to him that he is losing the capacity to understand. So is this.

We’re going to win another four years for a lot of reasons, most importantly because our country is starting to do well again and we’re being respected again. But another reason that I’m going to win another four years is because newspapers, television, all forms of media will tank if I’m not there because without me, their ratings are going down the tubes. Without me, The New York Times will indeed be not the failing New York Times, but the failed New York Times. So they basically have to let me win. And eventually, probably six months before the election, they’ll be loving me because they’re saying, “Please, please, don’t lose Donald Trump.” O.K.

In Ronald Reagan’s second term, we ducked a bullet. I’ve always suspected he was propped up by a lot of people who a) didn’t trust vice-president George H.W. Bush, b) found it convenient to have a forgetful president when the subpoenas began to fly, and c) found it helpful to have a “detached” president when they started running their own agendas—like, say, selling missiles to mullahs. You’re seeing much the same thing with the congressional Republicans. They’re operating an ongoing smash-and-grab on all the policy wishes they’ve fondly cultivated since 1981. Having a president* who may not be all there and, as such, is susceptible to flattery because it reassures him that he actually is makes the heist that much easier.

I’m always moving. I’m moving in both directions. We have to get rid of chainlike immigration, we have to get rid of the chain. The chain is the last guy that killed. … [Talking with guests.] … The last guy that killed the eight people. … [Inaudible.] … So badly wounded people. … Twenty-two people came in through chain migration. Chain migration and the lottery system. They have a lottery in these countries. They take the worst people in the country, they put ‘em into the lottery, then they have a handful of bad, worse ones, and they put them out. “Oh, these are the people the United States. …” … We’re gonna get rid of the lottery, and by the way, the Democrats agree with me on that. On chain migration, they pretty much agree with me.

We’ve got bigger problems.

He Believed — Dan Barry in The New York Times on his father’s belief in U.F.O.s.

The year now ending has been so laden with tumultuous news that one astounding report in the exhausted final days of 2017 seemed almost routine: that for years, an intelligence official burrowed within the Pentagon warren was running a secret program to investigate reports of unidentified flying objects.

Beg your pardon?

That scoop, by Helene Cooper, Ralph Blumenthal and Leslie Kean for The New York Times, was underscored by a companion article that detailed how in 2004 an oval object played a game of aeronautic hide-and-seek off Southern California with two Navy fighter jets assigned to the aircraft carrier Nimitz. The object then zipped away at a speed so otherworldly that it left one of the Navy pilots later saying he felt “pretty weirded out” — as you might if you watch the video of the encounter that the Department of Defense has made public.

In considering these reports, my mind turned to all those reasonable people who were dismissed and ridiculed over the years because they believed that something was out there. I thought in particular of believers who had died without savoring these official revelations.

Believers like my late father.

I can hear what he would have said, there at the veterans’ home, his broken vessel of a body in a wheelchair but his mind as quick and bright as a shooting star. “I’ve been saying it for years,” he’d assert, followed by a choice epithet he reserved for government officials, followed by, “I knew it.”

Then, a satisfying drag on a cigarette.

My father, Gene, finished high school at night and served three years in the Army; he did not attend college. But he had a fearsome intellect, read voraciously and developed a command of such subjects as American history, numismatics — and U.F.O. investigations. Through the 1960s and 1970s, he joined many others in monitoring reports of aerial anomalies, tracking down reams of redacted official reports and swapping theories about credible sightings and government cover-ups.

They bandied about the names of well-known U.F.O. researchers — J. Allen Hynek, Donald Keyhoe, Stanton Friedman — and read the latest newsletters from an organization called the National Investigations Committee on Aerial Phenomena, or Nicap. They remained resolute, even when many others gave up the cause after an Air Force-funded report in 1969 concluded that further study of U.F.O.s was unlikely to be of much scientific value, leading to the termination of the official Air Force program investigating the subject.

To the likes of Gene Barry, the report was merely part of the cover-up.

He was no astronomer or physicist. Just a working stiff who endured the anonymous drudgery of a daily commute but then, at night, often felt connected to something larger than himself, larger than all of us. While his neighbors focused on the fortunes of the New York Jets, he was contemplating whether the “wheel in the middle of a wheel” mentioned in the Book of Ezekiel referred to a flying object of some kind. If so, just consider the implications!

In our family, the horizontal line separating earth and sky often blurred. My father’s supernaturally patient wife and four impressionable children carried small blue membership cards for a research and investigative organization called the Mutual U.F.O. Network, or Mufon. We applauded my father when he spoke at a U.F.O. symposium at a local university. At his behest, my sister Brenda even brought a blueprint for a spacecraft that he had received in the mail — mysterious packages often arrived in the mail — to Sts. Cyril and Methodius parochial school to ask her science teacher what he made of it.

The teacher handed it back without a word.

In other households of the 1960s, Barney and Betty referred only to the Rubbles of Bedrock, loyal neighbors of Fred and Wilma Flintstone. But in our home, those names might also refer to Betty and Barney Hill, a New Hampshire couple who claimed to have been abducted and examined by aliens in 1961.

Then there were the family outings. Every so often our parents would pack us into the Chevy station wagon for a nighttime drive to that rare Long Island hill with an unimpeded view of the sky, or to Wanaque, N.J., 70 miles away, where strange lights were said to have been hovering over a local reservoir.

Gradually, we children would doze off, our necks stiff from craning. My mother, the tolerant sidekick and chauffeur, would light another cigarette, while my father continued to train his cheap binoculars on the celestial infinity, confident in the certainty of the still uncertain.

Over the years, life on terra firma intruded. Career setbacks, sickness, that daily anonymous grind. My father’s unofficial cell of believers quietly disbanded — exhausted, perhaps, by government silence and the false reports caused by weather balloons, satellites and people just seeing things. Then, when my mother died in 1999, he lost the person who grounded him, the Betty to his Barney.

He died in 2008, still believing without having seen, still questioning the government, still marveling at the arrogance of those who insisted we were the only intelligent life in the universe.

A decade passed, and then came this month’s report of a secret Pentagon program with the delightful name of the Advanced Aerospace Threat Identification Program. Funded by the government between 2007 and 2012, the program investigated aerial threats that included “unidentified aerial phenomena,” or U.A.P.s — which is just a less-polarizing way of saying U.F.O.s.

To hardened veterans of the U.F.O. wars, the news of the government program was less surprising than it was validating. And the video of the encounter between Navy fighter jets and an unidentified object moving at extraordinary velocity provided a helpful visual to the cause of those U.F.O. groups with long acronyms.

“Very interesting, very interesting,” said Fran Ridge, the archivist of the research accumulated by Nicap, now defunct. “But the very first thing that entered my mind was — why now? Is this a distraction? Is this something to get the people’s attention off politics?”

Mr. Ridge’s skeptical words reminded me of my father, who half-joked that he believed in a conspiracy — about everything.

“Finally, the kimono is being opened a little,” said Jan Harzan, the executive director of Mufon. “Personally, I don’t need verification from the government. But for the mass public, it’s important to know that there is advanced technology in our skies.”

The news of the Pentagon’s program received a stunning amount of attention that included the usual dismissive commentary.

“Call me when you have a dinner invite from an alien,” the celebrated astrophysicist Neil deGrasse Tyson said on CNN, a comment that would have driven my father to distraction. Classic redirection, he would have railed, the tip of his cigarette reddening with rage.

But my father would also have nodded in agreement to what the good astrophysicist had to say about that almost playful aerial anomaly captured on government video. “It’s a flying object and we don’t know what it is,” Dr. Tyson said. “I would hope somebody’s checking it out.”

This past year, reporters on The Atlantic’s science, technology, and health desks worked tirelessly, writing hundreds of stories. Each of those stories is packed with facts that surprised us, delighted us, and in some cases, unsettled us. Instead of picking our favorite stories, we decided to round up a small selection of the most astonishing things we learned in 2017. We hope you enjoy them as much as we did, and we hope you’ll be back for more in 2018:

The record for the longest top spin is over 51 minutes. Your fidget spinner probably won’t make it past 60 seconds.

Flamingos have self-locking legs, which makes them more stable on one leg than on two.

If your home furnace emits some methane pollution on the last day of 2017, it’ll almost certainly leave the atmosphere by 2030—but it could still be raising global sea levels in 2817.

Somewhere around 10,000 U.S. companies—including the majority of the Fortune 500—still assess employees based on the Myers-Briggs test.

Humans have inadvertently created an artificial bubble around Earth, formed when radio communications from the ground interact with high-energy particles in space. This bubble is capable of shielding the planet from potentially dangerous space weather like solar flares.

Languages worldwide have more words for describing warm colors than cool colors.

Turkeys are twice as big as they were in 1960, and most of that change is genetic.

Two Chinese organizations control over half of the global Bitcoin-mining operations—and by now, they might control more. If they collaborate (or collude), the blockchain technology that supposedly secures Bitcoin could be compromised.

U.S. physicians prescribe 3,150 percent of the necessary amount of opioids.

Physicists discovered a new “void” in the Great Pyramid of Giza using cosmic rays.

Daily and seasonal temperature variations can trigger rockfalls, even if the temperature is always above freezing, by expanding and contracting rocks until they crack.

The decline of sales in luxury timepieces has less to do with the rise of smartwatches and more to do with the rising cost of gold, the decline of the British pound, and a crackdown on Chinese corruption.

Spider silk is self-strengthening; it can suck up chemicals from the insects it touches to make itself stronger.

Intelligence doesn’t make someone more likely to change their mind. People with higher IQs are better at crafting arguments to support a position—but only if they already agree with it.

The NASA spacecraft orbiting Jupiter can never take the same picture of the gas planet because the clouds of its atmosphere are always moving, swirling into new shapes and patterns.

During sex, male cabbage white butterflies inject females with packets of nutrients. The females chew their way into these with a literal vagina dentata, and genitals that double as a souped-up stomach.

If all people want from apps is to see new stuff scroll onto the screen, it might not matter if that content is real or fake.

Cardiac stents are extremely expensive and popular, and yet they don’t appear to have any definite benefits outside of acute heart attacks.

Animal-tracking technology is just showing off at this point: Researchers can glue tiny barcodes to the backs of carpenter ants in a lab and scan them repeatedly to study the insects’ movements.

One recommendation from a happiness expert is to build a “pride shrine,” which is a place in your house that you pass a lot where you put pictures that trigger pleasant memories, or diplomas or awards that remind you of accomplishments.

Thanks to the internet, American parents are seeking out more unique names for their children, trying to keep them from fading into the noise of Google. The median boy’s name in 2015 (Luca) was given to one out of every 782 babies, whereas the median boy’s name in 1955 (Edward) was given to one out of every 100 babies.

The reason that dentistry is a separate discipline from medicine can be traced back to an event in 1840 known as the “historic rebuff”—when two self-trained dentists asked the University of Maryland at Baltimore if they could add dental training to the curriculum at the college of medicine. The physicians said no.

Tuesday, August 22, 2017

Here’s what we saw of the eclipse from the roof of the parking garage of my office in downtown Miami.

Picture by my co-worker Jeff.

We got about 78% of the sun blocked out at our latitude. The light got a little dimmer, like the first hint of approaching evening, and we had clouds passing over at times. While others went with the glasses and the cereal boxes for viewing, I reverted to the ancient method (learned during the eclipse of July 20, 1963) of the camera obscura: a pinhole image projected onto a flat surface using two (CFC-free) Styrofoam plates from the cafeteria salad bar.

Picture by my co-worker Carmen.

We got to maximum obscuration at 2:58 p.m. and then went back to work. And in true Miami form, it rained heavily an hour later.

Monday, August 21, 2017

1. 1st Total Solar Eclipse in 38 Years…

…for those in the contiguous United States (excluding Alaska and Hawaii). The last time anyone in the mainland US saw a total eclipse of the Sun was on February 26, 1979. If you live in the US and miss this event, you’ll have to wait 7 more years, until April 8, 2024, to see a total solar eclipse from a location in the contiguous United States.

2. Most North Americans Will Be Able To See Totality…

…if they are willing to drive, that is. The total eclipse will be visible only along the path of the Moon’s central shadow, which at its widest will be about 115 kilometers (71.5 miles) across, according to some sources. Its path will span from the country’s West Coast to the East Coast. The rest of North America, as well as Central America and northern parts of South America, will experience a partial solar eclipse. NASA has estimated that a majority of the American population lives less than a 2-day drive away from the path of totality.

3. A Once-In-A-Lifetime Event

While total solar eclipses are not rare—they occur twice every 3 years on average and can be seen from some part of the Earth—a total eclipse of the Sun that can be seen from the American West Coast to the American East Coast occurs less frequently. In fact, the last time a total solar eclipse was visible from coast to coast was almost 100 years ago, on June 8, 1918!

What makes this eclipse extra special is that it is the first time since the total solar eclipse of January 11, 1880 that a total solar eclipse will occur exclusively over the continental United States—no other country will see totality, though many countries will see a partial eclipse of the Sun.

Because of these reasons, the eclipse is also being called the Great American Eclipse.

4. Parts of 14 American States Will Go Dark…

…for the roughly 2 minutes of totality. The Yaquina Head Lighthouse in Newport, Oregon will be the first location on continental US soil to see totality. The partial phase of the eclipse will begin here at 9:04 am local time. Lincoln City, Oregon will also be one of the first locations in the country to experience totality.

Oregonians will also be the first to see totality as the Moon’s shadow moves east at an average speed of about 3600 km/h (2237 mph). After Oregon, the eclipse will move through Idaho, Wyoming, Nebraska, Kansas, Missouri, Illinois, Kentucky, Tennessee, North Carolina, Georgia, and South Carolina. Montana and Iowa are also grazed by the path of totality, but only in unpopulated areas. People in Charleston, South Carolina will be some of the last people in the US to see totality.

Nashville, Tennessee is one of the few large cities in the United States to fall completely within the eclipse’s path of totality. Interestingly, only some parts of Kansas City, Kansas and Kansas City, Missouri will be able to see a total solar eclipse.

5. Totality Will Be Spectacular

If you are lucky enough to be in the path of totality, you are in for an astronomical treat, weather permitting, of course. When the eclipse begins, at 1st contact, it will appear as if the Moon is taking a bite out of the Sun. As the eclipse progresses, the sky will get darker, the temperature will drop, and if you pay attention, animals and birds will become quieter.

At 2nd contact, which is when totality begins, Baily’s Beads become visible. As the Moon completely covers the Sun’s surface, the diamond ring can be seen. You might also see pink spots called prominences near the diamond. These are loops of glowing gas rising from the Sun’s surface.

Totality is the only time when one can see the corona, the Sun’s atmosphere. At 3rd contact, Baily’s Beads will once again become visible and a second diamond ring may appear.

6. You Will Need Eye Protection

Looking at the Sun with your naked eyes is highly dangerous and can even cause blindness. The safest way to see a solar eclipse is to wear protective eclipse glasses or use a pinhole projector you can easily make yourself.
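(A back-of-the-envelope aside, not from the original article: the Sun subtends about half a degree of sky, which fixes the size of a pinhole image.)

d_{\text{image}} \;\approx\; L \tan\theta_{\odot} \;\approx\; L \times 0.0093, \qquad \theta_{\odot} \approx 0.53^{\circ}

Hold the pinhole about a meter from the projection surface and you get a solar disk just under a centimeter across, small but plenty to watch the Moon take its “bite.”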

7. Part of Saros Series 145

Solar eclipses occur in cycles. The Saros cycle, one of the most studied eclipse cycles, repeats about every 18 years (more precisely, every 18 years, 11 days, and 8 hours). Two solar eclipses separated by a Saros cycle have similar features: they occur at the same lunar node, with the Moon roughly at the same distance from the Earth, and at about the same time of the year. Eclipses that are separated by a Saros cycle are part of a Saros series.
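(The arithmetic behind that period, using standard eclipse-cycle figures rather than anything from the article above:)

223 \text{ synodic months} \times 29.5306 \text{ days} \;\approx\; 6585.3 \text{ days} \;\approx\; 18 \text{ years},\ 11 \text{ days},\ 8 \text{ hours}

That leftover third of a day means the Earth rotates an extra ~120° between successive eclipses in a series, so each one’s path falls about a third of the way around the globe to the west of its predecessor’s.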

The August 21, 2017 total solar eclipse belongs to Saros series 145. It is the 22nd eclipse in a series of 77 eclipses. The series began with a partial solar eclipse visible from the Northern Hemisphere on January 4, 1639 and will end with a partial solar eclipse visible from the Southern Hemisphere on April 17, 3009.

The next eclipse in the series—a total solar eclipse—will take place on September 2, 2035.

And remember, let’s be careful out there: don’t look directly at the sun.

The first solar eclipse I remember was the one on July 20, 1963. We were up in Michigan and my dad and I were painting our little Puddleduck sailboat and named it “Swallow” after the boat in the Arthur Ransome stories. We were in the backyard and I had fashioned a pinhole projector out of two pieces of cardboard. The sun was 78% blocked, but it didn’t seem much darker than your average twilight.

Sunday, August 14, 2016

In case, like all sensible Americans, you’ve been watching the Olympics and haven’t noticed what’s been up with El Caudillo de Mar-A-Lago, he spent a couple of days telling his audiences of screaming geeks that the president was “the founder of ISIS.” Then he spent yesterday saying that, when he said “founder,” he meant “founder,” dammit. Then this morning, he took to the electric Twitter machine and declared that he was only engaging in “SARCASM” and that all the dim bulbs in the dishonest press don’t get the vast sweep of his subtle wit.

For his next trick, he’s going to swallow his own head.

Just for information’s sake, as any middle-school teacher knows from long experience, this is an example of sarcasm:

“Yeah, sure, like we’re going to hand the nuclear codes to a vulgar talking yam who stiffs his subcontractors and doesn’t know enough about any major issue to throw to a cat. Yeah, we’re gonna do that. Surrrrre.”

Sarcasm.

Accusing the president of “founding” a barbaric terrorist group and then insisting you were serious?

Not sarcasm.

You know what else isn’t sarcasm? Suggesting that you intend to turn the American system of criminal justice unilaterally into a Peronist nightmare. That’s not sarcasm. The Miami Herald was there.

“Would you try to get the military commissions—the trial court there—to try U.S. citizens?” a reporter asked. “Well, I know that they want to try them in our regular court systems, and I don’t like that at all. I don’t like that at all,” he said. “I would say they could be tried there, that would be fine.”

Actually, that would not be fine. The Constitution says it would not be fine. I suspect more than a few lawyers in the Pentagon would say that it would not be fine. Actually, I suspect Alexander Hamilton, an actual Founder of something, would say that it would not be fine, because he wrote this in Federalist 28:

Independent of all other reasonings upon the subject, it is a full answer to those who require a more peremptory provision against military establishments in times of peace to say that the whole power of the proposed government is to be in the hands of the representatives of the people. This is the essential, and, after all, the only efficacious security for the rights and privileges of the people which is attainable in civil society.

Actually, I suspect that Thomas Jefferson, another actual Founder of something, would say that it would not be fine, because this is one of the particulars on which he arraigned George III in a little document you may have heard of called “The Declaration of Independence.”

(That last part, by the way, was sarcasm.)

He has affected to render the Military independent of and superior to the Civil Power.

So, no, I don’t think he’s being sarcastic about any of this. I think there’s only one joke out there, and it’s the one over which obvious anagram Reince Priebus presides.

Fidel Castro is turning ninety on Saturday. It has been a long life, and a most eventful one. He was born on August 13, 1926, three years before the Great Crash and the start of the global depression. Feature films were still silent; commercial air travel was in its infancy; most people who moved around the globe did so by ship; many navies still used sailing ships. The telephone existed, but for instant global communication and news, the telegram was still the thing. Most cars still had to be started with a hand crank.

Calvin Coolidge was the President of the United States, which at the time had a population of a hundred and seventeen million—a third of its present size—and there were forty-eight states. The United States was not a superpower. The country had few paved roads, and less than ten per cent of the rural population had access to electricity. A Sharia-style ban on the consumption of alcohol, known as Prohibition, had been in force since 1920 (and would last until 1933). Cuba had been an independent republic for a mere twenty-four years. It was the last of Spain’s colonies in the New World to be relinquished, but only after intervention by American forces, in 1898, had ended decades of bloody warfare with Cuban nationalists. Cuba had then fallen under U.S. military administration; it gained its independence in 1902, but only after it had agreed to have the so-called Platt Amendment embedded in its new constitution. This provision granted the U.S. control in perpetuity over Guantánamo Bay, as well as the right to intervene in Cuba whenever it saw fit. For decades thereafter, Cuba remained a virtual American colony, a period that Fidel has always referred to as the “pseudo-republica.” The U.S. Marines intervened repeatedly, and the Presidents were of the pliant variety.

Fidel, and his younger brother Raúl, grew up in Birán, then, as now, a provincial backwater of eastern Cuba, an area dominated in those days by carpetbagging U.S. agribusinesses like United Fruit, which had swooped in and bought up most of the productive land in the halcyon days that followed the Spanish-American War. Fidel’s father, Ángel Castro, had emigrated from a godforsaken corner of Galicia, in Spain, as a teen-ager, and stayed, eventually becoming a kind of peasant overlord with a large and prosperous finca, where Haitian laborers harvested sugarcane that was sold to the United Fruit Company.

By the time Fidel was sent to Havana for a private Jesuit education, and from there to Havana University, to study law, he had become an ardent nationalist, a fervent admirer of the country’s nineteenth-century national-independence hero, José Martí — a poet and journalist who had joined the war against the Spaniards and died heroically when he charged the enemy on horseback on his first day on the battlefield. He was an admirer of other historic men of action as well, including Robespierre, Julius Caesar, and Napoleon Bonaparte.

By the age of twenty-one, Fidel had begun to entertain political ambitions of his own, and was becoming known to Cuban authorities as a hothead with political aspirations and a penchant for the dramatic gesture. In 1947, he joined a boat expedition with other would-be revolutionaries planning to violently unseat the neighboring Dominican Republic’s dictator, Rafael Trujillo. The expedition was intercepted by Cuban troops before it ever made it off a remote Cuban cay, but the next year, while Fidel was in Bogotá, Colombia, for an anti-imperialist youth congress, the popular Liberal politician Jorge Eliécer Gaitán was assassinated, sparking massive rioting; Fidel participated. Back in Cuba, in 1949, Fidel helped organize a protest in front of the U.S. Embassy after an incident in which American sailors clambered onto a statue of José Martí in a prominent plaza in Old Havana and urinated on it; Fidel got a police beating for his troubles.

By 1953, when he was twenty-seven, Fidel’s ambition was nothing less than the seizure of power in Cuba, which by then was in the hands of an especially corrupt dictator, Fulgencio Batista. In July, he led a full-frontal assault, with a band of armed young followers, against the Moncada army barracks in Cuba’s second city of Santiago. It was an unmitigated disaster. A number of rebels died in the fighting, and dozens more were executed, some after being brutally tortured. Fidel survived, and when he was put on trial he defended himself with an impassioned piece of oratory that took him four hours to read out, in which he declared, “History will absolve me.” He was convicted and sentenced to fifteen years in prison, but the proceedings solidified his position as a national figure.

Nearly two years into Fidel’s imprisonment, in an ill-advised act of magnanimity, Batista signed an amnesty that freed Fidel from prison. He immediately went into exile in Mexico, where, with his brother Raúl, who had recruited a young Argentine named Ernesto (Che) Guevara to their cause, he began planning for a guerrilla war against Batista. Within a year and a half, he and his followers had begun that war, and by New Year’s Day, 1959, Batista had fled, and Fidel and his rebels had won.

Then followed the big stuff of history: the U.S.-backed Bay of Pigs invasion; the Cuban Missile Crisis; the creation of a one-party state presided over by the Cuban Communist Party; myriad attempts by the C.I.A. to kill or oust Fidel, and his remarkable ability to survive, and to stay in power; his support for guerrilla struggles in dozens of other countries; the great exodus of Cubans who fled the island, mostly to Florida, some for economic reasons and others in search of political freedom. The Soviet Union collapsed, but Fidel remained in power until 2006, when he fell ill and handed the job over to Raúl.

When Fidel came to power, Dwight D. Eisenhower was President. Today it is Barack Obama, an African-American, who visited the island last March at the invitation of Raúl, after the two leaders restored diplomatic relations, in 2014. Fidel was not part of the official visit, nor did he appear in public, but his presence was felt. Over the past decade, as Fidel has adapted to his role as Cuba’s elder statesman, he has expressed his opinions in occasional columns published in the official Communist daily, Granma. In the past year and a half, since the restoration of relations with the Americans, he has made it abundantly clear that he remains deeply skeptical of American intentions, while emphasizing that he supports his younger brother’s decisions. But, coming as it does in the twilight of his life, the fact that the Americans are back—initially in the form of a growing flood of eager tourists, but also as prospective investors—must be deeply poignant for Fidel, whose opposition to el imperialismo yanqui was the mainstay of his political career. What did Fidel think of the fact that American personalities of the likes of Kim Kardashian and Kanye West and Jerry Springer were touring Havana last spring, taking selfies and tweeting about what they did and saw and ate and drank?

In his last public appearance, at the seventh Cuban Communist Party Congress, in April, a frail-looking Fidel gave a speech in which he did not once mention the Americans. He spoke instead of his preoccupation with the challenges confronting humankind, including the risks posed by arms proliferation, global warming, and food scarcities. And Fidel reaffirmed his faith in Communism, in the future of Cuba, and the legacy that he believed Cuba’s Communists had forged. He also mentioned his looming birthday. It was a milestone, he said, that he had “never expected to reach.”

Mosquitoes may be small, but they pack a mean punch. Weighing in at a measly 2.5 milligrams, these buzzing arthropods are responsible for more deaths than snake bites, shark attacks, and murders combined. A whopping 725,000 people die each year from diseases transmitted by this common pest. Researchers have spent decades and millions of dollars fighting dengue, yellow fever, and chikungunya—dangerous viruses that female mosquitoes can spread in a single bite. Now—as scientists rev up efforts to tackle the worsening mosquito-borne Zika epidemic that’s rocked the Americas—some scientists are tapping into Earth’s oldest organic armies as they seek to wipe out these diseases.

In this week’s episode of the Inquiring Minds podcast, journalist and author Ed Yong explores the emerging science of the microbiome—the trillions of tiny organisms that inhabit the bodies of humans and other animals. Along the way, he tells host Kishore Hari about Wolbachia—one of nature’s most successful land-based bacteria—and its potential to aid the fight against Zika and other mosquito-borne illnesses. Wolbachia, says Yong, has “tremendous promise in bringing tropical diseases to heel.”

Wolbachia is extremely versatile; it can infect more than 40 percent of all arthropod species, including spiders, insects, and mites. Research has shown that female Aedes aegypti mosquitoes infected with the bacteria are unable to transmit common viruses, including Zika and dengue. And because Wolbachia passes from a female mosquito to her offspring, it could spread easily through a wild population. That means releasing a small batch of mosquitoes infected with the bacteria could help eradicate mosquito-borne diseases in a potentially short amount of time, says Yong. For a mosquito whose global range spans six continents—and includes a large chunk of the United States—the impact on global public health could be substantial.

Despite years of research, treatments for many mosquito-borne illnesses remain limited. Clinical trials for a Zika vaccine are underway, but researchers don’t expect one to be available to the public for at least 18 months. “There are no vaccines,” Yong says. “There are no good treatments for dengue. We need better ways of controlling these diseases.” Field trials of Wolbachia-carrying mosquitoes have been underway in Australia since 2011, and in Brazil, Indonesia, and Vietnam since 2014. The results have shown great promise, with no ill effect on people or the environment.

Yong argues that Wolbachia is safer and more cost-effective than traditional vector control methods, such as spraying with insecticides. And unlike insecticides, the bacteria are self-perpetuating. And Wolbachia doesn’t appear to affect mosquito populations, so other insects and animals that feed on these pests won’t miss a meal. “It’s not about killing mosquitoes,” Yong says, “it’s about turning them into dead ends for viruses.”

Sunday, June 12, 2016

Not The Only One — Rinku Sen points out that Donald Trump isn’t the only racist in the GOP.

Donald Trump’s latest attack on US District Judge Gonzalo Curiel, who is presiding over the suit brought against Trump University by its former students, led Paul Ryan to distance himself from the candidate he’s endorsed. He called Trump’s statement “textbook” racism, wishing aloud that Trump would stick to the GOP’s focus on revitalizing the economy. Ryan didn’t go so far as to withdraw his support from a Trump nomination, noting that the GOP policy agenda has a far greater chance of succeeding with Trump in the White House than with Hillary Clinton. But it is that policy agenda itself that advances racism, even if it stops short of racial slurs about the people who will suffer the most from its implementation.

Ryan’s clumsy attempts to navigate around Trump only throw into relief the GOP’s broader refusal to acknowledge that racism takes many, many forms—most of them unconscious, hidden, and systemic. Trump’s overt racism actually obscures the party’s covert racism. In some cases, that covert racism may even come with good, if paternalistic, intentions of saving communities of color from the “evils” of dependence. But for the GOP—and too many Democrats, for that matter—if racist intention isn’t obvious (and sometimes, even if it is), there is no need to bring up race at all. Indeed, doing so points to the moral weakness of racial justice advocates.

New York Representative Lee Zeldin, for instance, is particularly opposed to the notion that we might develop a policy agenda that actually directs resources to the racial groups with highest need. “So being a little racist or very racist is not OK,” he said on CNN in countering attacks on Trump, “but, quite frankly, the agenda that I see and all the microtargeting to blacks and Hispanics from a policy standpoint, you know, that’s more offensive to me.” Trump surrogates like Zeldin don’t want us to consider the possibility that a supposedly universal agenda might have different impacts on different racial and ethnic groups. Rather, they want us to believe proponents of “identity politics,” as another Trump surrogate argued, are the real racists, pushing “political correctness” down the collective American throat.

This isn’t new. Resistance to the idea that “textbook” racism can be found in our political and economic systems as much as in our individual hearts and minds was also fully present in the policy debates of the 1960s. And that resistance deeply shaped the Civil Rights Act. For example, in the employment section, employers were vulnerable under the law only if a complainant could prove a “pattern or practice of resistance” to civil-rights measures, or could establish racist intent. That was written to punish only the most explicit kind of Southern racism, while leaving the subtler Northern version intact. While updates to the act gave lip service to the notion that racist impact constitutes discrimination, even if intention is not obvious, civil-rights plaintiffs still bear the burden of establishing intention if they want any recourse.

Or take voting today. Perhaps GOP leaders are sincere when they say their ongoing attack on (nonexistent) voter fraud with new voter-ID laws is not designed to suppress the votes of people of color. But they will not acknowledge evidence that these laws’ impact is nonetheless voter suppression. Or take the question of equal opportunity for children. A bill adopted by the House Education and Workforce Committee this year, which Ryan has endorsed, forces 11,000 high-poverty schools out of eligibility for free-lunch programs, lowers nutritional standards, and makes it much tougher for schools to enroll kids who need help. The children who are adversely affected will be far disproportionately of color. But they aren’t named as targets, so we cannot pin down racist intention; Ryan and many others denouncing Trump this week would strongly object if someone called the bill “textbook” racism. As long as lawmakers and politicians don’t say “black” or “Mexican” or “Arab,” they are cleared of racist intention, as though the impacts of their actions don’t matter.

But they do matter, to a growing electorate of color.

The GOP (and, again, lots of Democrats, too) willfully ignore the fact that all politics are identity based. White men are simply not required to own their identities in politics because they constitute the default universal. When Zeldin calls out “microtargeting” as racist, he attempts to shut down any discussion of the racialized effects of supposedly race-neutral policies. The result is a debate in which only the most obvious, intentional brand of racism is to be condemned.

The Next Step — Frank Bruni profiles Pete Buttigieg, the openly gay mayor of South Bend, Indiana, who could be the next rising star.

If you went into some laboratory to concoct a perfect Democratic candidate, you’d be hard pressed to improve on Pete Buttigieg, the 34-year-old second-term mayor of this Rust Belt city, where he grew up and now lives just two blocks from his parents.

Education? He has a bachelor’s from Harvard and a master’s from Oxford, where he was a Rhodes Scholar.

Public service? He’s a lieutenant in the Navy Reserve. For seven months in 2014, he was deployed to Afghanistan — and took an unpaid leave from work in order to go.

He regularly attends Sunday services at his Episcopal church. He runs half-marathons. His TEDx talk on urban innovation in South Bend is so polished and persuasive that by the end of it, you’ve hopped online to price real estate in the city.

And though elective office was in his sights from early on, he picked up some experience in the private sector, including two years as a consultant with McKinsey. He describes that job in politically pitch-perfect terms, as an effort to learn how money moves and how data is mined most effectively.

Two years ago, The Washington Post called him “the most interesting mayor you’ve never heard of.”

And that was before he came out. He told his constituents that he was gay in an op-ed that he wrote for the local newspaper last June, during his re-election campaign. Then he proceeded, in November, to win 80 percent of the vote — more than the first time around.

But what happens if he aims higher than this primarily Democratic city of roughly 100,000 people — which he’s almost sure to? Is there now a smudge on that résumé, or could he become yet another thrilling symbol of our country’s progress?

The breaking of barriers was the story of last week, as Hillary Clinton clinched the Democratic presidential nomination. There are more milestones to come: for women, for blacks, for Hispanics, for other minorities.

Although voters in Wisconsin elevated an openly lesbian candidate, Tammy Baldwin, to the United States Senate, and Oregon’s governor has described herself as bisexual, no openly gay, lesbian or bisexual person has ever emerged as a plausible presidential candidate.

How soon might that change? Could we look up a dozen or more years from now and see a same-sex couple in the White House?

I’d wondered in the abstract, and after a veteran Democratic strategist pointed me toward Buttigieg as one of the party’s brightest young stars, I wondered in the concrete.

He probably winced when he read that: At no point during my visit with him last week did he express such a grand political ambition or define himself in terms of his sexual orientation.

“I’m not interested in being a poster boy,” he told me. He has not, since his op-ed, spoken frequently or expansively about being gay.

He doesn’t hide it, though. His partner, Chasten Glezman, a middle-school teacher, moved in with him this year and sometimes accompanies him to public events.

One day Buttigieg popped into Glezman’s classroom with an offering from Starbucks. That night, he got an email fuming that the children had been unnecessarily exposed to certain ideas.

He wrote back “explaining how what I was doing was the same kind of thing a straight couple would do,” he told me. “I didn’t go in there to discuss L.G.B.T. issues. I went in there to bring a cup of coffee to somebody that I love.”

“But it was one of those moments,” he added, “when I realized we can’t quite go around as if it were the same.”

South Bend is Indiana’s fourth-largest city and abuts the University of Notre Dame, where both of Buttigieg’s parents have taught. It was once famous for its Studebaker auto assembly plant, but that closed more than half a century ago, prompting a painful decline.

Buttigieg has worked to reverse it. His “1,000 houses in 1,000 days” campaign demolished or repaired that many abandoned homes. New construction and the dazzling River Lights public art installation, which bathes a cascading stretch of South Bend’s principal waterway in a rainbow of hues, are reinvigorating the city center. And the old Studebaker plant is at long last being renovated — into a mix of office, commercial, residential and storage space.

All of that could set Buttigieg up for a Senate or gubernatorial bid down the line. So could his sharp political antenna. He saw the future: In 2000, he won the nationwide J.F.K. Profile in Courage Essay Contest for high school students with a tribute to a certain congressman named Bernie Sanders.

“Politicians are rushing for the center, careful not to stick their necks out on issues,” he wrote, exempting Sanders and crediting him with the power “to win back the faith of a voting public weary and wary of political opportunism.”

He seems always to say just the right thing, in just the right tone. When I asked why he signed up for the Navy Reserve, he cited his experience canvassing for Barack Obama in Iowa in 2008.

“So many times, I would knock and a child would come to the door — in my eyes, a child — and we’d get to talking and this kid would be on his way to basic training,” he remembered. “It was like this whole town was emptying itself out into the military.” But very few of the people he knew from Harvard or Oxford signed up.

When I asked where the Democratic Party errs, he said that too many Democrats “are not yet comfortable working in a vocabulary of ‘freedom.’ Conservatives talk about freedom. They mean it. But they’re often negligent about the extent to which things other than government make people unfree.”

“And that is exactly why the things we talk about as Democrats matter,” he continued. “You’re not free if you have crushing medical debt. You’re not free if you’re being treated differently because of who you are. What has really affected my personal freedom more: the fact that I don’t have the freedom to pollute a certain river, or the fact that for part of my adult life, I didn’t have the freedom to marry somebody I was in love with? We’re talking about deep, personal freedom.”

He also challenged the degree to which some Democrats “participate in the fiction that if we just turn back the clock and get rid of trade, everybody can get their manufacturing jobs back. There are a lot of people who think they lost their jobs because of globalization when they actually lost their jobs because of technology.”

The solution, he said, isn’t isolationism, protectionism and nostalgia. It’s new skills and a next generation of products and services.

Did I mention that he speaks passable Arabic? Or that he’s an accomplished musician who played piano with the South Bend Symphony Orchestra in 2013 for a special performance of “Rhapsody in Blue”?

Or that he recently won a J.F.K. New Frontier Award, given annually to a few Americans under 40 whose commitment to public service is changing the country?

The daunting scope of his distinctions may be his greatest liability. (How many accolades named after J.F.K. can one man collect?)

That and his precociousness. Before his mayoralty, he ran an unsuccessful campaign for state treasurer of Indiana. He was 28.

So he’s not the most relatable pol in the pack. The laboratory would fix that.

Or maybe he’s fixing it himself. I last saw him at South Bend’s minor league baseball park, where he was chowing down on an all-American supper of nachos smothered in strips of fatty beef and a pale yellow goo. It looked like training for the Iowa State Fair.

Give him some Tums. And keep an eye on him.

Turn on the Dark — Rebecca Boyle in The Atlantic on the disappearance of the night sky.

The Pawnee people took literally the idea that we are all star stuff. In their cosmology, which dates back at least 700 years, the first woman was born from the marriage of stars, and the first man from the union of the sun and moon. The stars themselves were sent by the creator god, Tirawa, who tasked them with holding up the sky.

The brightest stars were entrusted with Earth’s climate, which was thought to be the key to its fertility. But this arrangement made some lesser stars jealous, so they stole a sack of violent storms that belonged to the brighter stars and emptied them on the Earth, and this is how death came to the world.

Today, the clouds, wind, and rain are still the principal ways that humans experience the sky, and that experience is changing. The Pawnee lived through thunderstorms and tornadoes, but ours are likely to become more violent as climate change worsens. And our night sky is changing too. As light pollution intensifies, it’s emptying out of stars, and life on Earth is paying a price.

One-third of humanity—and 80 percent of North Americans—can’t see the bright smear of the Milky Way, our home in the cosmos. For the first time in the history of our species, entire generations of people have never seen our galaxy.

A full 99 percent of the people in North America and Europe sleep under a bright haze at night, caused by light pollution. A new dark sky atlas describes just how widespread this problem is, and gives scientists a starting point for studying the impact artificial light is having on humans and the other creatures that share this planet.

“The light that we detected is not even seen by people, because they are asleep; it is only seen by astronomers,” says Fabio Falchi of the Light Pollution Science and Technology Institute in Thiene, Italy. “But I am convinced that light pollution is no longer only a problem for astronomers. It is a global problem for everyone. All life on Earth evolved with the dark, with 12 hours of dark and 12 hours of sun. But now we are enveloping our planet in a perpetual glow. And life is affected by that.”

Falchi and Chris Elvidge, a scientist at the National Oceanic and Atmospheric Administration, have been studying satellite images of the Earth at night since the 1990s. Their first atlas, produced in 2001, used older satellite data that was taken around 8 p.m. local time, while the updated atlas comes courtesy of a new satellite that captured sky glow around 1 or 2 a.m. Because of these and other differences, the new atlas can’t be directly compared with the old one. But the scientists think light pollution is more widespread now, even as some communities are trying to bring back the night. This is partly because of LEDs.

“Awareness is rising, but not as much, I think, as the new lights,” Falchi says.

Many cities are replacing their older high-pressure sodium or metal halide street lamps with LEDs, which use less energy but shine more brightly, especially in the part of the visible-light spectrum that scatters the most. (This is the same effect that makes the sky blue.) This means cities are both getting brighter and spreading their light across greater distances. Standing in Death Valley National Park, for example, a visitor can see gumdrop-shaped domes of light hovering over Las Vegas to the east and Los Angeles to the west, both of which are hundreds of miles away.
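
A quick back-of-the-envelope check on that scattering claim (my sketch, not the article’s): Rayleigh scattering scales with the inverse fourth power of wavelength, so bluer light scatters disproportionately. The wavelengths below are illustrative assumptions — a blue LED peak near 450 nm and the roughly 589 nm emission line of a high-pressure sodium lamp:

```python
# Rough Rayleigh-scattering comparison: blue-rich LED light vs. the amber
# light of older high-pressure sodium street lamps. Wavelengths are
# illustrative assumptions, not figures from the article.

def rayleigh_relative(wavelength_nm: float, reference_nm: float = 589.0) -> float:
    """Scattering intensity relative to a reference wavelength,
    using the Rayleigh 1/wavelength**4 scaling."""
    return (reference_nm / wavelength_nm) ** 4

blue_led_peak_nm = 450.0  # typical blue peak of a white LED (assumed)
print(f"Blue LED light scatters ~{rayleigh_relative(blue_led_peak_nm):.1f}x "
      "as much as sodium-lamp light")
# -> ~2.9x more scattering
```

By that crude measure, the blue component of LED light scatters roughly three times as much as sodium light, consistent with the article’s point that LED-lit cities both brighten and spread their glow.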

[…]

Falchi and Elvidge say they have also sought out dark skies, but they have to travel to get them. A few years ago Falchi visited Chile’s Atacama Desert, one of the driest and darkest places on the planet, where the view of the Milky Way presented him with what he called one of “the great natural wonders.”

Elvidge tries to escape the haze by visiting the mountains, an easy and obvious choice for someone in Boulder, Colorado. But sometimes he drives east instead, passing the exits for Denver and the smaller agricultural city of Greeley. After a two-hour ride, he arrives at the windswept Pawnee National Grassland.

Nobody lives out there now; the Pawnee themselves are long gone, and white farmers mostly abandoned the area after the Dust Bowl. Apart from the sandstone Pawnee Buttes, improbably rising from the plains like a pair of prairie ziggurats, the grassland’s most compelling feature is its sky. Here, you can see the same stars the Pawnee people did, centuries ago. Taking them in just as the Pawnee would have, you can wonder, as they did, where we came from.

Sunday, May 29, 2016

Donald Trump is a wildly promiscuous liar. He also has disturbing authoritarian tendencies. Trump’s many critics have seized upon both traits as his two major disqualifications for the presidency, yet both of them frustratingly defy easy quantification. All politicians lie some, and many of them lie a lot, and most presidents also push the limits of their authority in ways that can frighten their opponents. So what is so uniquely dangerous about Trump? Perhaps the answer is that both of these qualities are, in a sense, the same thing. His contempt for objective truth is the rejection of democratic accountability, an implicit demand that his supporters place undying faith in him. Because the only measure of truth he accepts is what he claims at any given moment, the power his supporters vest in him is unlimited.

Trump lies routinely, about everything. Various journalists have tried to tally up his lies, inevitably giving up and settling for incomplete summaries. Some of these lies are merely standard, or perhaps somewhat exaggerated, versions of the way members of his party talk about policy. (The “real” unemployment rate is as high as 42 percent, or his gargantuan tax-cut plan “will be revenue-neutral.”) At times he engages in especially brazen rewriting of his own positions, such as insisting he opposed the Iraq War when he did not, or denying his past support for universal health insurance. Some of his lies are conspiracy theories that run toward the edges of respectable Republican thought (Barack Obama was actually born abroad) or even well beyond it (Ted Cruz’s father may have conspired to kill John F. Kennedy). In all these areas, Trump has merely improved upon the methods used by the professionals in his field.

Where he has broken truly unique ground is in his lies about relatively small, routine matters. As I’ve pointed out before — it’s become a small personal fixation — after Mitt Romney mocked the failure of Trump Steaks, Trump held a press conference in which he insisted Trump Steaks remained a going concern, despite the undeniable fact that the business no longer exists. (His campaign displayed store-bought steaks for the media, not even bothering to fully remove the labels of the store at which they purchased them.) The New York Times actually reported this week that Trump had displayed his steaks, without mentioning the blatant deception. Another such example is Trump’s prior habit of impersonating an imaginary p.r. representative while speaking to reporters. Obviously, the practice itself is strange enough, but the truly Trumpian touch is that he admitted to the ruse publicly, and then subsequently went back to denying it.

The normal rules of political lying hold that when the lie has been exposed, or certainly when it has been confessed, the jig is up. You have to stop lying about it and tell the truth, or at least retreat to a different lie. Trump bends the rules of the universe to his own will, at no apparent cost. His brazenness is another singular characteristic. His confidence that he can make the truth whatever he wishes at any moment, and toggle back and forth between incompatible realities at will, without any cost to himself, is a display of dominance. Possibly Trump’s most important statement of the campaign was his idle boast that he could shoot somebody on Fifth Avenue without losing any votes.

Finally, there is Trump’s habit of settling all disputes with his own peculiar form of ad hominem. He dismisses all criticisms of his statements and his record with an array of put-downs, and likewise confirms all endorsements with praise. Anybody who disagrees with Trump is ugly, short, corrupt, a loser, a habitual liar, a total joke, and so forth. People who support him are smart, beautiful, fair, esteemed, etc. But politics being as it is — and, especially, Trump’s positions being as fluid as they are — the composition of the two categories is in constant flux. One day, you are a failing, ridiculous, deranged liar, and the next day a citizen of the highest regard. Trump literally called Ben Carson a “violent criminal” and a “pathological liar,” akin to a “child molester.” When later accepting Carson’s endorsement, Trump praised his “dignity.” Once Trump mocked Rick Perry as a moron who wore glasses to look smart and who should be required to take an IQ test to participate in presidential debates. Now he is a “good guy, good governor.” This is the pattern Trump uses to dismiss all media criticism, or to amplify friendly coverage. Every reporter or publication is either pathetic and failing or fair and wonderful, and the same reporters and publications can be reclassified as one or the other as Trump sees fit.

“1984” is a cliché for invoking totalitarianism, and in any case, Trump is merely an authoritarian and a bully, not a totalitarian. (A totalitarian government, like North Korea, exerts control over every aspect of its citizens’ lives; an authoritarian one, like Putin’s Russia, merely uses enough fear and violence to maintain control.) Nonetheless, the novel does capture the relationship between dictatorial authority and the power to manipulate any fact into a binary but permeable scheme:

The past was alterable. The past never had been altered. Oceania was at war with Eastasia. Oceania had always been at war with Eastasia. Jones, Aaronson, and Rutherford were guilty of the crimes they were charged with. He had never seen the photograph that disproved their guilt. It had never existed, he had invented it. He remembered remembering contrary things, but those were false memories, products of self-deception.

Truth and reason are weapons of the powerless against the powerful. There is no external doctrine against which Trump can be measured, not even conservative dogma, which he embraces or discards at will and with no recognition of having done so. Trump’s version of truth is multiple truths, the only consistent element of which is that Trump himself is always, by definition, correct. Trump’s mind is so difficult to grapple with because it is an authoritarian epistemology that lies outside the democratic norms that have shaped all of our collective experiences.

Every three years, the Royal Institute of Navigation organizes a conference focussed solely on animals. This April, the event was held southwest of London, at Royal Holloway College, whose ornate Victorian-era campus has appeared in “Downton Abbey.” For several days, the world’s foremost animal-navigation researchers presented their data and findings in a small amphitheatre. Most of the talks dealt with magnetoreception—the ability to sense Earth’s weak but ever-present magnetic field—in organisms as varied as mice, salmon, pigeons, frogs, and cockroaches. This marked a change from previous years, Richard Nissen, a member of the Institute, told me, when a range of other navigation aids were part of the discussion: landmarks, olfactory cues, memory, genetics, polarized light, celestial objects. “Everyone now seems completely sold on the idea that animal navigation is based on magnetism,” Nissen said. Human-centric as it sounds, most of the conference’s attendees believe that animals possess a kind of compass.

Scientists have sought for centuries to explain how animals, particularly migratory species, find their way with awesome precision across the globe. Examples of these powers abound. Bar-tailed godwits depart from the coastal mudflats of northern Alaska in autumn and set out across the Pacific Ocean, flying for eight days and nights over featureless water before arriving in New Zealand, seven thousand miles away. If the birds misjudge their direction by even a few degrees, they can miss their target. Arctic terns travel about forty thousand miles each year, from the Arctic to the Antarctic and back again. And odysseys of this sort are not limited to the feathered tribes. Some leatherback turtles leave the coast of Indonesia and swim to California, more than eight thousand miles away, then return to the very beaches where they hatched. Dragonflies and monarch butterflies follow routes so long that they die along the way; their great-grandchildren complete the journey.

Although the notion of a biocompass was widely disparaged in the first half of the twentieth century, the evidence in favor of it has since become quite strong. In the early nineteen-sixties, a German graduate student named Wolfgang Wiltschko began conducting experiments with European robins, which he thought might find their way by picking up radio waves that emanated from the stars. Instead, Wiltschko discovered that if he put the robins in cages equipped with a Helmholtz coil—a device for creating a uniform magnetic field—the birds would change their orientation when he switched the direction of north. By the start of this century, seventeen other species of migratory bird, as well as honeybees, sharks, skates, rays, snails, and cave salamanders, had been shown to possess a magnetic sense. In fact, practically every animal studied by scientists today demonstrates some capacity to read the geomagnetic field. Red foxes almost always pounce on mice from the northeast. Carp floating in tubs at fish markets in Prague spontaneously align themselves along a north-south axis. So do dogs when they crouch to relieve themselves, and horses, cattle, and deer when they graze—except if they are under high-voltage power lines, which have a disruptive influence.

The only problem is that no one can seem to locate the compass. “We are still crying out for how do they do this,” Joseph Kirschvink, a geobiologist at the California Institute of Technology, said. “It’s a needle in the haystack.” Kirschvink meant this almost literally. In 1981, as a Ph.D. student at Princeton University, he proposed that magnetite, a naturally occurring oxide of iron that he had found in honeybees and homing pigeons, was the basis of the biocompass. Even a handful of magnetite crystals, he wrote at the time, could do the trick. “One equivalent of a magnetic bacteria can give a whale a compass—one cell,” he told me. “Good luck finding it.” Even in animals smaller than a whale, this is no easy task. Throughout the two-thousands, researchers pointed to the presence of iron particles in the olfactory cells of rainbow trout, the brains of mole rats, and the upper beaks of homing pigeons. But when scientists at the Research Institute of Molecular Pathology, in Vienna, took a closer look, slicing and examining the beaks of hundreds of pigeons, they found that the iron-rich cells were likely the product of an immune response—nothing to do with the biocompass. The study’s lead researcher, David Keays, has since turned his focus to iron-containing neurons inside the pigeons’ ears.

The search for the biocompass has extended to even smaller scales, too. In 1978, the German biophysicist Klaus Schulten proposed that birds’ innate sense of direction was chemical in nature. According to his theory, incoming light would hit some sort of sensory mechanism, which Schulten hadn’t yet pinpointed, and induce a transfer of electrons, triggering the creation of a radical pair—two molecules, each with an extra electron. The electrons, though slightly separated, would spin in synchrony. As the bird moved through the magnetic field, the orientation of the spinning electrons would be modulated, providing the animal with feedback about its direction. For the next twenty years, it remained unclear which molecules could be responsible for such a reaction. Then, in 2000, Schulten suggested an answer—cryptochromes, a newly discovered class of proteins that respond to blue light. Cryptochromes have since been found in the retinas of monarch butterflies, fruit flies, frogs, birds, and even humans. They are the only candidate so far with the right properties to satisfy Schulten’s theory. But the weakest magnetic field that affects cryptochromes in the laboratory is still twenty times stronger than Earth’s magnetic field. Peter Hore, a chemist at Oxford University, told me that establishing cryptochromes as the biocompass will require at least another five years of research.
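
A rough aside on why a chemical compass is so surprising (my arithmetic, not the article’s): the magnetic energy available to an electron spin in Earth’s roughly 50-microtesla field is minuscule next to thermal energy at body temperature, so a radical-pair mechanism has to work through coherent spin dynamics rather than any simple energy difference. A minimal order-of-magnitude sketch, using assumed but representative values for field strength and temperature:

```python
# Compare the Zeeman (magnetic) energy of an electron spin in Earth's field
# with thermal energy at body temperature. Constants are standard physical
# values; the field strength and temperature are representative assumptions.

MU_B = 9.274e-24   # Bohr magneton, J/T
K_B = 1.381e-23    # Boltzmann constant, J/K

earth_field_T = 50e-6   # Earth's magnetic field, ~50 microtesla (assumed)
temperature_K = 310.0   # roughly body temperature

zeeman_energy = MU_B * earth_field_T
thermal_energy = K_B * temperature_K

print(f"Zeeman energy:  {zeeman_energy:.2e} J")
print(f"Thermal energy: {thermal_energy:.2e} J")
print(f"Thermal energy is ~{thermal_energy / zeeman_energy:,.0f}x larger")
# -> roughly ten million times larger, so no equilibrium effect could work
```

That mismatch of about seven orders of magnitude is the standard argument for why proposals like Schulten’s lean on spin coherence and reaction kinetics rather than equilibrium thermodynamics.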

At the conference, the magnetite and cryptochrome researchers made up distinct camps, each one quick to point out the opposing theory’s deficiencies. One person stood alone: Xie Can, a biophysicist at Peking University. Xie spent six years developing a kind of unified model of magnetic animal navigation. Last year, he published a paper in Nature Materials describing a protein complex that he dubbed MagR, which consists of iron crystals enveloped in a double helix of cryptochrome—the two main theories rolled into one. Xie has yet to win over other researchers, some of whom believe that his findings are the result of iron oxide contaminating his lab experiments. (Keays has said that he will eat his hat if MagR is proved to be the real magnetoreceptor.) But at the end of the conference, with one mystery of animal navigation after another left unanswered, Xie told me that he felt more confident than ever of his and his colleagues’ model. “If we are right, we can explain everything,” he said. Michael Walker, a biologist at the University of Auckland, was more circumspect. If history is any indication, he said, many of the current hypotheses about how the biocompass works will turn out to be wrong.

When Constance Wu landed the part of Jessica Huang, the Chinese-American matriarch on the ABC sitcom “Fresh Off the Boat,” she didn’t realize just how significant the role would turn out to be. As she developed her part, Ms. Wu heard the same dismal fact repeated over and over again: It had been 20 years since a show featuring a predominantly Asian-American cast had aired on television. ABC’s previous offering, the 1994 Margaret Cho vehicle “All-American Girl,” was canceled after one season.

“I wasn’t really conscious of it until I booked the role,” Ms. Wu said. “I was focused on the task at hand, which was paying my rent.”

The show, which was just renewed for a third season, has granted Ms. Wu a steady job and a new perspective. “It changed me,” Ms. Wu said. After doing a lot of research, she shifted her focus “from self-interest to Asian-American interests.”

In the past year, Ms. Wu and a number of other Asian-American actors have emerged as fierce advocates for their own visibility — and frank critics of their industry. The issue has crystallized in a word — “whitewashing” — that calls out Hollywood for taking Asian roles and stories and filling them with white actors.

On Facebook, Ms. Wu ticked off a list of recent films guilty of the practice and said, “I could go on, and that’s a crying shame, y’all.” On Twitter, she hit back at Hollywood producers who believe their “lead must be white” and advised the creators of lily-white content to “CARE MORE.” Another tip: “An easy way to avoid tokenism? Have more than one” character of color, she tweeted in March. “Not so hard.”

It’s never been easy for an Asian-American actor to get work in Hollywood, let alone take a stand against the people who run the place. But the recent expansion of Asian-American roles on television has paradoxically ushered in a new generation of actors with just enough star power and job security to speak more freely about Hollywood’s larger failures.

And their heightened profile, along with an imaginative, on-the-ground social media army, has managed to push the issue of Asian-American representation — long relegated to the back burner — into the current heated debate about Hollywood’s monotone vision of the world.

“The harsh reality of being an actor is that it’s hard to make a living, and that puts actors of color in a very difficult position,” said Daniel Dae Kim, who stars in “Hawaii Five-0” on CBS and is currently appearing in “The King and I” on Broadway.

Mr. Kim has wielded his Twitter account to point to dire statistics and boost Asian-American creators. Last year, he posted a cheeky tribute to “the only Asian face” he could find in the entire “Lord of the Rings” series, a woman who “appears for a glorious three seconds.”

Other actors lending their voices include Kumail Nanjiani of “Silicon Valley,” Ming-Na Wen of “Agents of S.H.I.E.L.D.” and Aziz Ansari, who in his show, “Master of None,” plays an Indian-American actor trying to make his mark.

They join longtime actors and activists like BD Wong of “Gotham”; Margaret Cho, who has taken her tart comedic commentary to Twitter; and George Takei, who has leveraged his “Star Trek” fame into a social media juggernaut.

This past year has proved to be a particularly fraught period for Asian-American representation in movies. Last May, Sony released “Aloha,” a film set in Hawaii that was packed with white actors, including the green-eyed, blond-haired Emma Stone as a quarter-Chinese, quarter-Native Hawaiian fighter pilot named Allison Ng.

In September, it was revealed that in the planned adaptation of the Japanese manga series “Death Note,” the hero, a boy with dark powers named Light Yagami, would be renamed simply Light and played by the white actor Nat Wolff. In “The Martian,” released in October, the white actress Mackenzie Davis stepped into the role of the NASA employee Mindy Park, who was conceived in the novel as Korean-American.

The list goes on. In December, set photographs from the coming “Absolutely Fabulous” film showed the Scottish actress Janette Tough dressed as an over-the-top Asian character. Last month, Marvel Studios released a trailer for “Doctor Strange,” in which a character that had originated in comic books as a Tibetan monk was reimagined as a Celtic mystic played by Tilda Swinton.

And in the live-action American film adaptation of the manga series “Ghost in the Shell,” scheduled for next year, the lead character, Major Motoko Kusanagi, will be called Major and played by Scarlett Johansson in a black bob.

Studios say that their films are diverse. “Like other Marvel films, several characters in ‘Doctor Strange’ are significant departures from the source material, not limited by race, gender or ethnicity,” the Marvel Studios president Kevin Feige said in a statement. Ms. Swinton will play a character that was originally male, and Chiwetel Ejiofor a character that was originally white. Paramount and DreamWorks, the studios behind “Ghost in the Shell,” said that the film reflects “a diverse array of cultures and countries.”

But many Asian-American actors aren’t convinced. “It’s all so plainly outlandish,” Mr. Takei said. “It’s getting to the point where it’s almost laughable.”
