*President George W. Bush – traitor: conspiracy in the election fraud of 2000, started illegal wars against Iraq and Afghanistan, allowed 9/11 to happen

Knobel discovered that negative attitudes toward the Irish existed in America, were expressed throughout everyday life, and strengthened as the Civil War approached, the period where his research concludes.

Negative stereotypes depicting the Irish as beasts or apes prevailed in antebellum Anglo-America, and anti-Irish sentiment permeated the lives of ordinary Americans, not just nativists.

Aaron Burr, vice-president who killed Hamilton, had children of color

Philadelphia ceremony honors John Pierre Burr, prominent member of black society now recognised as son of founding father

Amanda Holpuch, The Guardian. Last modified on Sat 24 Aug 2019 17.15 EDT

​John Pierre Burr, one of two children the former vice-president Aaron Burr is said to have fathered with a servant from India, was officially memorialized as a descendant of the founding father at a ceremony in Philadelphia on Saturday.

The elder Burr was the vice-president to Thomas Jefferson between 1801 and 1805 but is perhaps best known for killing Alexander Hamilton in a duel, an act which made him the villain in a hit Broadway musical.

​The younger Burr, who lived from 1792 to 1864, was a prominent member of black society in Philadelphia. Rumored to be the vice-president’s son for several years, he was officially recognized by the Aaron Burr Association in 2018.

At Eden Cemetery in Philadelphia on Saturday, the not-for-profit association unveiled a headstone for John Pierre in a ceremony which featured a procession of men in tricorn hats, carrying flags.

A descendant of John Pierre Burr, Sherri Burr, spoke at the ceremony, which came about largely because of her own work to determine whether Aaron Burr was John Pierre’s father.

​Burr, an emeritus professor of law at the University of New Mexico and the third vice-president of the Aaron Burr Association, told the gathered crowd: “From henceforth I hope John Pierre Burr is never again referred to as ‘the natural son’ or ‘the illegitimate son’, but is simply referred to as ‘the son’,” the Washington Post reported.

Along with other evidence Burr found, a DNA test showed she was related to Stuart Johnson, another Burr descendant.

At the Aaron Burr Association’s annual meeting last year, members voted unanimously to recognize that Aaron Burr had two children – the other was Louisa Charlotte – by Mary Eugenie Beauharnais Emmons, who was from Kolkata, India, and was a servant in the Burr home.

The association was founded in 1946 and had known of rumors about John Pierre for more than a decade. In 2005, a black woman named Louella Burr Mitchell Allen came forward, claiming she had traced her lineage to John Pierre.

It was the work of Sherri Burr which swayed members of the association, a group of roughly 75 Burr descendants and history fans, to formally recognize the lineage. Historians told the Post the evidence Burr had found was convincing.

Sherri Burr is working on a book, Aaron Burr’s Family of Color. A historical fiction book about Burr’s “secret wife”, by Susan Holloway Scott, is due for release next month.

​Close ties between the founding fathers of the United States and people of color, including the people they enslaved, have become a more prominent thread in public history.

In 2018, Jefferson’s home at Monticello launched an exhibit about Sally Hemings, an enslaved woman who had Jefferson’s children. The relationship was an open secret while Jefferson was alive but for two centuries it was largely avoided at historical sites and in school textbooks.

In 2017, the first comprehensive history of George Washington’s runaway slave, Ona Judge, was published by the historian Erica Armstrong Dunbar. Washington’s home in Virginia, Mount Vernon, hosted an exhibit about Judge.

The Aaron Burr Association timed its headstone installation to coincide with the 400th anniversary of enslaved Africans being brought to what would become the US.

Point Comfort: where slavery in America began 400 years ago

In 1619, a ship carrying some 20 captives landed in Virginia, ushering in the era of slavery in what would become the United States

David Smith in Washington, The Guardian. Wed 14 Aug 2019 00.00 EDT

Young girls walk past a sign denoting the 400th anniversary of the landing of the first enslaved Africans in English-occupied North America at Point Comfort in 1619. Photograph: Evelyn Hockstein/The Guardian

The blue waters of the Chesapeake lap against the shore. Sunbathers lounge in deckchairs as black children and white children run and play on the beach. And close by stands a magnificent oak tree, its trunk stretching three great arms and canopies of leaves high into the tranquil sky.

Over half a millennium, the Algernoune Oak has witnessed war and peace and the fall of empires, but never a day like the one in late August 1619. It was here that the White Lion, a 160-ton English privateer ship, landed at what was then known as Point Comfort. On board were more than 20 captives seized from the Kingdom of Ndongo in Angola and transported across the Atlantic. This dislocated, unwilling, violated group were the first enslaved Africans to set foot in English North America – ushering in the era of slavery in what would become the United States.

This site, now Fort Monroe in Hampton, southern Virginia, will host a weekend of 400th anniversary commemorations on 23-25 August, culminating in a symbolic release of butterflies and nationwide ringing of bells. Americans of all races will reflect on a historical pivot point that illuminates pain and suffering but also resilience and reinvention. Some see an opportunity for a national reckoning and debate on reparations.

For a people robbed of an origins story, it is also an invitation to go in search of roots – the African in African American.

​“Once I learned that I was from there it changed something in me,” said Terry E Brown, 50, who has traced his ancestry to Cameroon and enslaved people in Virginia and North Carolina. “I have a fire in me to just learn about why and who I am. There’s something deep down and spiritual about it and I want to connect to it. I’m American, and I believe in this structure that we have, but I’m emotionally and spiritually tied to Africa now that I know where I came from.”

By the early 17th century the transatlantic slave trade – the biggest forced migration of people in world history – was already well under way in the Caribbean and Latin America. In 1619 it came to the English colony of Virginia. The San Juan Bautista, a Spanish ship transporting enslaved Africans, was bound for Mexico when it was attacked by the White Lion and another privateer, the Treasurer, and forced to surrender its African prisoners.

The White Lion continued on to land at Point Comfort. John Rolfe, a colonist, reported that its cargo was “not anything but 20 and odd Negroes, which the Governor and Cape Merchant bought for victualls”. They were given names by Portuguese missionaries: Antony, Isabela, William, Angela, Anthony, Frances, Margaret, Anthony, John, Edward, Anthony and others, according to research by the Hampton History Museum.

The captain of the White Lion, John Jope, traded the captives to Virginians in return for food and supplies. They were taken into servitude in nearby homes and plantations, their skills as farmers and artisans critical in the daily struggle to survive. Slavery in America was born.

​Yet it all requires a leap of imagination in the serenity of today’s 565-acre Fort Monroe national monument, run by the National Park Service, or in the low-key city of Hampton, home to Nasa’s Langley Research Center.

Brown, the first black superintendent at Fort Monroe, said: “The early colonists are trying to survive and they’re not doing it. They’re resorting to cannibalism because they just can’t figure this thing out. When the Africans show up, the game changes a little bit because they knew how to cultivate rice, sugar and cotton, all those things were perfect for this environment and for what they were trying to do.”

It would be another century until the formation of the United States. By 1725, some 42,200 enslaved Africans had been transported to the Chesapeake; by 1775, the total was 127,200. Thomas Jefferson, the author of the declaration of independence, which contains the words “all men are created equal”, was a Virginia slave owner and, by 1860, the US was home to about 3.9 million enslaved African Americans.

The events of 1619 are at once both remote and immediate in a state where white nationalists caused deadly violence in Charlottesville two years ago and in a nation where their enabler occupies the White House.

Brown reflected: “African Americans make up about 13% of the population and our young black men account for about 49% of America’s murders. People who look like me, about 41% of them are sitting in a jail cell. Now I can easily blame that on one thing but I can easily tie it to the very beginning of this country. It’s so easy to treat other people like they’re less than human if you don’t know them. So what I’m hoping this 400th will do is raise the awareness level.

“We’re not going to change people’s behaviour overnight but maybe if you sit back and think, ‘man, 400 years’, they were enslaved for 246 years so they lived under the most oppressive conditions imaginable but they managed to reinvent themselves … They created new music and new art forms and new families. It’s one of the greatest stories and it’s amazing that they survived it.”

​Last month, Donald Trump travelled to nearby Jamestown to celebrate the 400th anniversary of the first representative legislative assembly. The US president made reference to the first enslaved Africans’ arrival in Virginia, “the beginning of a barbaric trade in human lives”, but there are currently no plans for him to attend the commemoration at Fort Monroe.

Gaylene Kanoyton, the president of the Hampton branch of the National Association for the Advancement of Colored People (NAACP), said: “He’s not welcome because of everything that we’re commemorating, the arrival of slavery. He’s for white supremacy, he’s for nationalism, he’s for everything that we are against.”

​Built by enslaved labour, the fort has a multilayered history, full of contradiction and paradox, like America itself. It witnessed the beginning of slavery but also the end: early in the civil war, three enslaved men seeking freedom escaped to Fort Monroe and were deemed by the commander as “contraband of war”, spurring thousands to seek sanctuary behind Union lines and ultimately a shift in government policy towards emancipation.

There are other threads from past to present. Among the Africans who arrived on the White Lion were Anthony and Isabela who, in 1624 or 1625, had a son, William, who was baptised. In a census they are identified as “Antoney Negro: Isabell Negro: and William theire Child Baptised.” They were living on the property of Captain William Tucker, so are now known by this surname, and William is often described as the first African child born in English North America.

A local family in Hampton believe they are his direct descendants. Walter Jones, 63, whose mother is the oldest living Tucker, said: “We traced as far as we could and then we had word-of-mouth records. We heard this years and years ago and so a lot of us have been through family history and we just never realized how significant it was. From what we’re able to dig up, everything still points to that.”

Jones and his relatives maintain a two-acre cemetery in the historic African American neighborhood of Aberdeen Gardens in Hampton, where many of their ancestors are buried. A simple grey monument is inscribed with the words: “Tucker’s cemetery. First black family. 1619.” A short distance away, a headstone says, “African American female. Approx age 60. Discovered July 2017.” Dozens of white crosses dot patches of grass and soil representing unmarked graves.

Can Jones, a retired software engineer, forgive the enslavers? “The way we were raised and the way I was raised is that we forgive all for some of the things that were done because it wasn’t just them. It was going on everywhere so it was unfortunate and in some cases Africans were also involved in some of the slave trade.

“There’s more discord to not being recognised as being such a vital part of our history and our nation’s history here and what was contributed. We didn’t come here by choice but we chose to excel and to build a country which wasn’t our own. So sometimes I think not having that type of recognition makes you a little bitter. If it hasn’t come by now, when will it? And now that it’s 400 years coming up, how many people truly will even recognise that?”

The Tuckers are not alone. The anniversary coincides with a boom in online and TV genealogy. Donnie Tuck, the mayor of Hampton, a majority African American city, took a DNA test earlier this year and found lineage in Nigeria and other countries.

“Now we look at progress and, with so many documentaries and programs where you’re exploring what slaves went through and the civil war and the period afterwards, I think there’s a whole new emphasis and we have more resources available to us. There’s a real hunger among African Americans to try and know our roots and our experience, our journey here to America and even that whole journey for the last 400 years.”

Some have taken the curiosity further and travelled to Africa. Last month, the congressman James Clyburn was part of a congressional delegation to Ghana, led by the House speaker, Nancy Pelosi, that visited Cape Coast and Elmina castles to observe the 400th anniversary. It was his second trip to the “door of no return”. “All I remember the first time I went there was walking through that door and looking out at the ocean and the impact that was,” he said in a phone interview.

Clyburn believes that America has still not fully confronted the issue of slavery. “It’s an issue that’s been avoided in this country as much as possible. If it were an ongoing process I think that we would be much further down the road on that. We continue to treat this whole issue with what I like to call benign neglect. We tend to feel that if we ignore it, pretend it didn’t happen, then it didn’t happen or if we don’t need to do anything with it then we won’t.”[...]

The real reason the Second Amendment was ratified, and why it says “State” instead of “Country” (the Framers knew the difference – see the 10th Amendment), was to preserve the slave patrol militias in the southern states, which was necessary to get Virginia’s vote.

Founders Patrick Henry, George Mason, and James Madison were totally clear on that . . . and we all should be too.

In the beginning, there were the militias. In the South, they were also called the “slave patrols,” and they were regulated by the states.

​In Georgia, for example, a generation before the American Revolution, laws were passed in 1755 and 1757 that required all plantation owners or their male white employees to be members of the Georgia Militia, and for those armed militia members to make monthly inspections of the quarters of all slaves in the state. The law defined which counties had which armed militias and even required armed militia members to keep a keen eye out for slaves who may be planning uprisings.

As Dr. Carl T. Bogus wrote in the UC Davis Law Review in 1998, “The Georgia statutes required patrols, under the direction of commissioned militia officers, to examine every plantation each month and authorized them to search ‘all Negro Houses for offensive Weapons and Ammunition’ and to apprehend and give twenty lashes to any slave found outside plantation grounds.”

It’s the answer to the question raised by the character played by Leonardo DiCaprio in Django Unchained when he asks, “Why don’t they just rise up and kill the whites?” If the movie were real, it would have been a purely rhetorical question, because every southerner of the era knew the simple answer: Well regulated militias kept the slaves in chains.

Sally E. Hadden, in her book Slave Patrols: Law and Violence in Virginia and the Carolinas, notes that, “Although eligibility for the Militia seemed all-encompassing, not every middle-aged white male Virginian or Carolinian became a slave patroller.” There were exemptions so “men in critical professions” like judges, legislators and students could stay at their work. Generally, though, she documents how most southern men between ages 18 and 45 – including physicians and ministers – had to serve on slave patrol in the militia at one time or another in their lives.

And slave rebellions were keeping the slave patrols busy.

By the time the Constitution was ratified, hundreds of substantial slave uprisings had occurred across the South. Blacks outnumbered whites in large areas, and the state militias were used both to prevent and to put down slave uprisings. As Dr. Bogus points out, slavery can only exist in the context of a police state, and the enforcement of that police state was the explicit job of the militias.

If the anti-slavery folks in the North had figured out a way to disband – or even move out of the state – those southern militias, the police state of the South would collapse. And, similarly, if the North were to invite into military service the slaves of the South, then they could be emancipated, which would collapse the institution of slavery, and the southern economic and social systems, altogether.

These two possibilities worried southerners like James Monroe, George Mason (who owned over 300 slaves) and the southern Christian evangelical, Patrick Henry (who opposed slavery on principle, but also opposed freeing slaves).

Their main concern was that Article 1, Section 8 of the newly-proposed Constitution, which gave the federal government the power to raise and supervise a militia, could also allow that federal militia to subsume their state militias and change them from slavery-enforcing institutions into something that could even, one day, free the slaves.

This was not an imagined threat. Famously, 12 years earlier, during the lead-up to the Revolutionary War, Lord Dunmore offered freedom to slaves who could escape and join his forces. “Liberty to Slaves” was stitched onto their jacket pocket flaps. During the war, British General Henry Clinton extended the practice in 1779. And numerous freed slaves served in General Washington’s army.

Thus, southern legislators and plantation owners lived not just in fear of their own slaves rebelling, but also in fear that their slaves could be emancipated through military service.

At the ratifying convention in Virginia in 1788, Henry laid it out:

“Let me here call your attention to that part [Article 1, Section 8 of the proposed Constitution] which gives the Congress power to provide for organizing, arming, and disciplining the militia, and for governing such part of them as may be employed in the service of the United States. . . .

“By this, sir, you see that their control over our last and best defence is unlimited. If they neglect or refuse to discipline or arm our militia, they will be useless: the states can do neither . . . this power being exclusively given to Congress. The power of appointing officers over men not disciplined or armed is ridiculous; so that this pretended little remains of power left to the states may, at the pleasure of Congress, be rendered nugatory.”

​George Mason expressed a similar fear:

“The militia may be here destroyed by that method which has been practised in other parts of the world before; that is, by rendering them useless, by disarming them. Under various pretences, Congress may neglect to provide for arming and disciplining the militia; and the state governments cannot do it, for Congress has an exclusive right to arm them [under this proposed Constitution] . . . “

Henry then bluntly laid it out:

“If the country be invaded, a state may go to war, but cannot suppress [slave] insurrections [under this new Constitution]. If there should happen an insurrection of slaves, the country cannot be said to be invaded. They cannot, therefore, suppress it without the interposition of Congress . . . . Congress, and Congress only [under this new Constitution], can call forth the militia.”

​“In this state,” he said, “there are two hundred and thirty-six thousand blacks, and there are many in several other states. But there are few or none in the Northern States. . . . May Congress not say, that every black man must fight? Did we not see a little of this last war? We were not so hard pushed as to make emancipation general; but acts of Assembly passed that every slave who would go to the army should be free.”

Patrick Henry was also convinced that the power over the various state militias given the federal government in the new Constitution could be used to strip the slave states of their slave-patrol militias. He knew the majority attitude in the North opposed slavery, and he worried they’d use the Constitution to free the South’s slaves (a process then called “Manumission”).

The abolitionists would, he was certain, use that power (and, ironically, this is pretty much what Abraham Lincoln ended up doing):

“[T]hey will search that paper [the Constitution], and see if they have power of manumission,” said Henry. “And have they not, sir? Have they not power to provide for the general defence and welfare? May they not think that these call for the abolition of slavery? May they not pronounce all slaves free, and will they not be warranted by that power?

“This is no ambiguous implication or logical deduction. The paper speaks to the point: they have the power in clear, unequivocal terms, and will clearly and certainly exercise it.”

He added: “This is a local matter, and I can see no propriety in subjecting it to Congress.”

James Madison, the “Father of the Constitution” and a slaveholder himself, basically called Patrick Henry paranoid.

“I was struck with surprise,” Madison said, “when I heard him express himself alarmed with respect to the emancipation of slaves. . . . There is no power to warrant it, in that paper [the Constitution]. If there be, I know it not.”

But the southern fears wouldn’t go away.

Patrick Henry even argued that southerners’ “property” (slaves) would be lost under the new Constitution, and the resulting slave uprising would be less than peaceful or tranquil:

“In this situation,” Henry said to Madison, “I see a great deal of the property of the people of Virginia in jeopardy, and their peace and tranquility gone.”

So Madison, who had (at Jefferson’s insistence) already begun to prepare proposed amendments to the Constitution, changed his first draft of one that addressed the militia issue to make sure it was unambiguous that the southern states could maintain their slave patrol militias.

His first draft for what became the Second Amendment had said: “The right of the people to keep and bear arms shall not be infringed; a well armed, and well regulated militia being the best security of a free country [emphasis mine]: but no person religiously scrupulous of bearing arms, shall be compelled to render military service in person.”

But Henry, Mason and others wanted southern states to preserve their slave-patrol militias independent of the federal government. So Madison changed the word “country” to the word “state,” and redrafted the Second Amendment into today’s form:

“A well regulated Militia, being necessary to the security of a free State [emphasis mine], the right of the people to keep and bear Arms, shall not be infringed.”

Little did Madison realize that one day in the future weapons-manufacturing corporations, newly defined as “persons” by a Supreme Court some have called dysfunctional, would use his slave patrol militia amendment to protect their “right” to manufacture and sell assault weapons used to murder schoolchildren.

The U.S. Government Turned Away Thousands of Jewish Refugees, Fearing That They Were Nazi Spies

In a long tradition of “persecuting the refugee,” the State Department and FDR claimed that Jewish immigrants could threaten national security

In the summer of 1942, the SS Drottningholm set sail carrying hundreds of desperate Jewish refugees, en route to New York City from Sweden. Among them was Herbert Karl Friedrich Bahr, a 28-year-old from Germany, who was also seeking entry to the United States. When he arrived, he told the same story as his fellow passengers: As a victim of persecution, he wanted asylum from Nazi violence.

​But during a meticulous interview process that involved five separate government agencies, Bahr's story began to unravel. Days later, the FBI accused Bahr of being a Nazi spy. They said the Gestapo had given him $7,000 to steal American industrial secrets—and that he'd posed as a refugee in order to sneak into the country unnoticed. His case was rushed to trial, and the prosecution called for the death penalty.

What Bahr didn’t know, or perhaps didn’t mind, was that his story would be used as an excuse to deny visas to thousands of Jews fleeing the horrors of the Nazi regime.

World War II prompted the largest displacement of human beings the world has ever seen—although today's refugee crisis is starting to approach its unprecedented scale. But even with millions of European Jews displaced from their homes, the United States had a poor track record offering asylum. Most notoriously, in June 1939, the German ocean liner St. Louis and its 937 passengers, almost all Jewish, were turned away from the port of Miami, forcing the ship to return to Europe; more than a quarter died in the Holocaust.

Government officials from the State Department to the FBI to President Franklin Roosevelt himself argued that refugees posed a serious threat to national security. Yet today, historians believe that Bahr's case was practically unique—and the concern about refugee spies was blown far out of proportion.

In the court of public opinion, the story of a spy disguised as a refugee was too scandalous to resist. America was months into the largest war the world had ever seen, and in February 1942, Roosevelt had ordered the internment of tens of thousands of Japanese-Americans. Every day the headlines announced new Nazi conquests.

Bahr was “scholarly” and “broad-shouldered,” a man Newsweek called “the latest fish in the spy net.” Bahr was definitely not a refugee; he had been born in Germany, but immigrated to the U.S. in his teens and became a naturalized citizen. He returned to Germany in 1938 as an engineering exchange student in Hanover, where he was contacted by the Gestapo.

At his preliminary hearing, the Associated Press reported that Bahr was “nattily clad in gray and smiling pleasantly.” By the time his trial began, he had little reason to smile; in a hefty 37-page statement, he admitted to attending spy school in Germany. His defense was that he’d planned to reveal everything to the U.S. government. But he said he’d stalled because he was afraid. “Everywhere, no matter where, there are German agents,” he claimed.

​Comments like these only fed widespread fears of a supposed “fifth column” of spies and saboteurs that had infiltrated America. U.S. Attorney General Francis Biddle said in 1942 that “every precaution must be taken...to prevent enemy agents slipping across our borders. We already have had experience with them and we know them to be well trained and clever.” The FBI, meanwhile, released propaganda films that bragged about German spies who had been caught. “We have guarded the secrets, given the Army and Navy its striking force in the field,” one film said.

These suspicions were not only directed at ethnic Germans. “All foreigners became suspect. Jews were not considered immune,” says Richard Breitman, a scholar of Jewish history.

The American ambassador to France, William Bullitt, made the unsubstantiated statement that France fell in 1940 partly because of a vast network of spying refugees. “More than one-half the spies captured doing actual military spy work against the French Army were refugees from Germany,” he said. “Do you believe there are no Nazi and Communist agents of this sort in America?”

These kinds of anxieties weren't new, says Philip Orchard, a historian of international refugee policy. When religious persecution in the 17th century led to the flight of thousands of French Huguenots—the first group ever referred to as “refugees”—European nations worried that accepting them would lead to war with France. Later, asylum seekers themselves became objects of suspicion. “With the rise of anarchism at the turn of the 20th century, there were unfounded fears that anarchists would pose as refugees to enter countries to engage in violence,” Orchard says.

These suspicions seeped into American immigration policy. In late 1938, American consulates were flooded with 125,000 applicants for visas, many coming from Germany and the annexed territories of Austria. But national quotas for German and Austrian immigrants had been set firmly at 27,000.

Read more: SMITHSONIAN.COM

The party of traitors in 1933 is still the party of traitors today!!!

What the GOP learned when the wealthy tried to overthrow FDR and install a fascist dictator

by DailyKos / AlterNet, March 24, 2019

In 1933, a group of very wealthy bankers on Wall Street were incensed by FDR. He gave government jobs to many of the unemployed, ended the gold standard, and was planning to enact a slew of government programs that would constitute the “New Deal”. Workers would be given the right to unionize, infrastructure projects would be established, and he pitched a pension program for workers called Social Security. For that, he was called a “socialist.” Up to that point, however, his gravest sin was going after the wealthy to pay their fair share. The Revenue Act, which imposed a “wealth tax” on those at the top, raised the top tax rate to 75%.

Roosevelt was accused of being a “traitor to his class”.

Unlike what the GOP says today, Roosevelt wasn’t trying to do any great social experiment. He was trying to pull our nation out of the Great Depression. Unemployment was between 80-90% in several major cities, and those few who were working were being exploited with zero labor protections. FDR took action. He was hated by the rich, and he didn’t care:

The forces of ‘organized money’ are unanimous in their hate for me – and I welcome their hatred.

The bankers felt that FDR was going after their money, but they went far beyond just trying to sway the election. Some decided that this whole democracy thing just wasn’t working, and they had to take matters into their own hands. This was a few years before the extent of the horror inflicted by the Nazis had surfaced, and fascism had several believers within the American right.

The bankers plotted a coup against FDR, which would later be called the Wall Street Putsch. The conspirators included a bond salesman named Gerald MacGuire, the commander of the Massachusetts American Legion Bill Doyle, and investment banker Prescott Bush. Yes, that Bush—father of George H.W. and grandfather of W.

They tried to recruit a general named Smedley Butler to their cause. They arranged several meetings and were quite blunt in what they wanted:

The conspirators would provide the financial backing and recruit an army of 500,000 soldiers, which Butler was to lead.

The pretext for the coup would be that FDR’s health was failing. FDR would remain in a ceremonial position, in which, as MacGuire allegedly described, “The President will go around and christen babies and dedicate bridges and kiss children.”

The real power of the government would be held in the hands of a Secretary of General Affairs, who would be in effect a dictator: “somebody to take over the details of the office — take them off the President’s shoulders. […] A sort of a super secretary.”

Unfortunately for the conspirators, the general didn’t bite. Instead, he exposed them as traitors at a congressional hearing. Initially, Congress and the press treated the whole thing as a joke, but with Butler’s testimony, corroborated by a reporter who had attended one of the meetings, Congress opened an investigation. The committee found the plot was real, but decided not to act on it because it seemed so far-fetched. No one was prosecuted.

Sadly, this wouldn’t be the last time a wealthy person got away with treason.

The wealthy plutocrats determined they had a problem. As much as the American people were suffering, they believed in democracy. They believed in America, and held dictatorship in contempt. No coup could have worked back then, even if Butler had agreed to march on Washington; people wouldn’t have stood for it. So the strategy changed. The Ayn Rand-style changes the plutocrats wanted could never happen in a democracy: the majority doesn’t want welfare for the rich, denial of healthcare, repeal of the estate tax, or destruction of the social safety net. So democracy itself had to go.

For years, the plutocrats have spent a fortune on campaigns, rightwing media, and astroturf groups to cultivate a bizarre authoritarian ideology. I’ve written about the phenomenon of conservative submission before, and I argue that it didn’t happen overnight. It has taken a while, but a large segment of our population has now been utterly convinced that they need to align their beliefs with whatever benefits those at the top. Simultaneously, through constant bombardment from rightwing media, they are fed a steady diet of fear and paranoia about other people. They are told repeatedly that they need a strong leader to save and protect them. These are the ingredients of authoritarianism. For the rich, it doesn’t really matter who that person is, as long as he takes control and gives them what they want.

Has it worked?

According to a study by the University of Massachusetts, an inclination to support authoritarianism is now the single most statistically significant trait identifying someone as a modern, Trump-supporting Republican.

Now, when Trump threatens to send the feds after television hosts who satirize him, or threatens to use the military, police, or “biker gangs” against those who speak out against him, these people cheer.

His supporters openly call for dictatorship. They don’t even bother to hide it anymore:

After World War II, the thing people hated most was a fascist dictator. Ironically, that is exactly what the modern GOP now demands. This time, there won’t need to be any ridiculous military coup attempt, just a couple million manipulated morons willing to surrender their rights.

This Is How the Age of Plastics Began

In 1939, the future arrived at the World’s Fair in New York with the slogan, “The World of Tomorrow.”

David A. Taylor, Mother Jones, March 3, 2019

In the closing months of World War II, Americans talked nonstop about how and when the war would end, and about how life was about to change. Germany would fall soon, people agreed on that. Opinions varied on how much longer the war in the Pacific would go on.

Amid the geopolitical turmoil, a small number of people and newspapers chattered about the dawn of another new age. A subtle shift was about to change the fabric of people’s lives: cork was about to lose its dominance as a cornerstone of consumer manufacturing to a little-known synthetic substance called plastic.

In 1939, the future arrived at the World’s Fair in New York with the slogan, “The World of Tomorrow.” The fairground in Queens attracted 44 million people over two seasons, and two contenders laid claim to being the most modern industrial material: cork and plastic.

For decades, cork had been rising as the most flexible of materials; plastic was just an intriguing possibility. The manifold forms of cork products were featured everywhere, from an international Paris Exhibition to the fair in Queens, where the material was embedded in the Ford Motors roadway of the future.

Meanwhile, plastic made a promising debut, with visitors getting their first glimpse of nylon, Plexiglas, and Lucite. Souvenirs included colorful plastic (phenolic resin) pencil sharpeners molded in the form of the fair’s emblematic, obelisk-shaped Trylon building. Visitors also picked up celluloid badges and pen knives, and a Remington electric razor made of Bakelite, along with plastic ashtrays, pens, and coasters.

In the months after the fair, as US entry into the war became inevitable, the government grew concerned by American dependence on cork, which was obtained entirely from forests in Europe. The United States imported nearly half of the world’s production.

People in their 50s today remember when a bottle cap included a cork sliver insert to seal it. But in 1940, cork was in far more than bottle caps. It was the go-to industrial sealant used in car windshield glazing, insulation, refrigerated containers, engine gaskets, and airplanes. In defense, cork was crucial to tanks, trucks, bomber planes, and weapon systems. As the vulnerability of the supply of this all-purpose item became clear with the Nazi blockade of the Atlantic, the government put cork under “allotment,” or restricted use prioritized for defense. Information about cork supplies became subject to censorship.

In October 1941, the Commerce Department released a hefty report detailing the situation titled “Cork Goes to War.” Besides outlining the growing industrial use of cork, the report highlighted Hitler’s efforts to scoop up Europe’s cork harvests and the need for a systemic American response.

Part of that response was an intense research and development machine that ramped up the nascent synthetic industry to fill gaps in defense pipelines. Some were synthetics first developed by America’s enemies: chemists at Armstrong Cork, an industry leader, crafted new products using materials research from Germany. Many synthetics were developed during the mad scramble to replace organic items that the blockade made expensive. To pay for the research and offset rising materials costs, Armstrong trimmed employees’ use of items like carbon paper and paper clips; the company’s accountants noted 95,000 clips used per month in 1944, a 40 percent decline since the war’s start.

In 1944, a book titled “Plastic Horizons,” by B.H. Weil and Victor Anhorn, documented the promise of plastic. A chapter titled “Plastics in a World at War” opens with a grim accounting of the blood toll of war. The authors then trace how war bends science to its needs, producing new items both deadly and life-saving: physicists turn to aircraft detection, chemists to explosives. “Nylon for stockings has become nylon for parachutes. Rubber for tires has almost vanished, and desperate measures are required to replace it with man-made elastics.” The section concludes, “Plastics have indeed gone to war.”

In one dramatic example, the authors describe how plastics came to neutralize Germany’s secret weapon: a magnetic mine designed to be laid on the ocean floor and detonated by the magnetic field surrounding any vessel that passed over it. To counteract that, Allied scientists created plastic-coated electric cables that wrapped around the ships’ hulls and “degaussed” them, rendering the mines ineffective. Thanks, polyvinyl chloride!

The book got a glowing review in the New York Times, which noted that America was experiencing a chemical revolution.

Early plastics, as the book explained, covered a wide range of natural or semi-synthetics like celluloid and synthetic resins that could be molded with heat and pressure.

After the war, chronic shortages of common materials like rubber, cork, linseed oil, and paints forced chemists to scramble for substitutes, further speeding the embrace of plastics. Profitable bottling innovations included the LDPE squeeze bottle introduced by Monsanto in 1945, which paved the way for plastic bottles for soaps and shampoos, and the “Crowntainer,” a seamless metal cone-topped beer can.

There was also a shortage of tinplate for metal caps, and industry scrambled to find substitutes. Giles Cooke, the in-house chemist at one manufacturing leader, Crown Cork & Seal, dabbled in research on synthetic resins for container sealants through the 1940s. In beverage bottling, cork’s quality remained unmatched. You could taste the difference between a cork-sealed bottle and one sealed with plastic. Recognizing that it would take decades to replace cork as a sealant, Cooke and his colleagues hedged their bets with patents on both silicone film container liners and rubber hydrochloride.

In the end, “Plastic Horizons” undersold its subject. Its closing chapter hardly seems to anticipate the ubiquity of plastics we see today, along with its formidable waste problem. “In the future, plastics will supplement rather than supplant such traditional structural materials as metals, wood, and glass,” the authors wrote.

“There may be no Plastics Age, but that should discourage no one; applications will multiply with the years,” they continued. “Plastics are indeed versatile materials, and industry, with the help of science, will continue to add to their number and to improve their properties. Justifiable optimism is the order of the day, and the return of peace will enable the plastics industry to fulfill its promise of things to come.”

By 1946, the transition to plastics had reached a new threshold. That year, New York hosted a National Plastics Exposition, where for the first time, a range of strong, new materials and consumer products headed for American homes were on display. One observer noted, “the public are certainly steamed up on plastics.”

The World of Tomorrow indeed.

How Islam spread through the Christian world via the bedroom

Aeon, via AlterNet, February 2, 2019

There are few transformations in world history more profound than the conversion of the peoples of the Middle East to Islam. Starting in the early Middle Ages, the process stretched across centuries and was influenced by factors as varied as conquest, diplomacy, conviction, self-interest and coercion. There is one factor, however, that is largely forgotten but which played a fundamental role in the emergence of a distinctively Islamic society: mixed unions between Muslims and non-Muslims.

For much of the early Islamic period, the mingling of Muslims and non-Muslims was largely predicated on a basic imbalance of power: Muslims formed an elite ruling minority, which tended to exploit the resources of the conquered peoples – reproductive and otherwise – to grow in size and put down roots within local populations. Seen in this light, forced conversion was far less a factor in long-term religious change than practices such as intermarriage and concubinage.

The rules governing religiously mixed families crystallised fairly early, at least on the Muslim side. The Quran allows Muslim men to marry up to four women, including ‘People of the Book’, that is, Jews and Christians. Muslim women, however, were not permitted to marry non-Muslim men and, judging from the historical evidence, this prohibition seems to have stuck. Underlying the injunction was the understanding that marriage was a form of female enslavement: if a woman was bound to her husband as a slave is to her master, she could not be subordinate to an infidel.

Outside of marriage, the conquests of the seventh and eighth centuries saw massive numbers of slaves captured across North Africa, the Middle East and Central Asia. Female slaves of non-Muslim origin, at least, were often pressed into the sexual service of their Muslim masters, and many of these relationships produced children.

Since Muslim men were free to keep as many slaves as they wished, sex with Jewish and Christian women was considered licit, while sex with Zoroastrians and others outside the ‘People of the Book’ was technically forbidden. After all, they were regarded as pagans, lacking a valid divine scripture that was equivalent to the Torah or the Gospel. But since so many slaves in the early period came from these ‘forbidden’ communities, Muslim jurists developed convenient workarounds. Some writers of the ninth century, for example, argued that Zoroastrian women could be induced or even forced to convert, and thus become available for sex.

Whether issued via marriage or slavery, the children of religiously mixed unions were automatically considered Muslims. Sometimes Jewish or Christian men converted after already having started families: if their conversions occurred before their children attained the age of legal majority – seven or 10, depending on the school of Islamic law – they had to follow their fathers’ faith. If the conversions occurred after, the children were free to choose. Even as fathers and children changed religion, mothers could continue as Jews and Christians, as was their right under Sharia law.

​Mixed marriage and concubinage allowed Muslims – who constituted a tiny percentage of the population at the start of Islamic history – to quickly integrate with their subjects, legitimising their rule over newly conquered territories, and helping them grow in number. It also ensured that non-Muslim religions would quickly disappear from family trees. Indeed, given the rules governing the religious identity of children, mixed kinship groups probably lasted no longer than a generation or two. It was precisely this prospect of disappearing that prompted non-Muslim leaders – Jewish rabbis, Christian bishops and Zoroastrian priests – to inveigh against mixed marriage and codify laws aimed at discouraging it. Because Muslims were members of the elite, who enjoyed greater access to economic resources than non-Muslims, their fertility rates were probably higher.

Of course, theory and reality did not always line up, and religiously mixed families sometimes flouted the rules set by jurists. One of the richest bodies of evidence for such families is the biographies of Christian martyrs from the early Islamic period, a little-known group who constitute the subject of my book, Christian Martyrs under Islam (2018). Many of these martyrs were executed for crimes such as apostasy and blasphemy, and no small number of them came from religiously mixed unions.

A good example is Bacchus, a martyr killed in Palestine in 786 – about 150 years after the death of the Prophet Muhammad. Bacchus, whose biography was recorded in Greek, was born into a Christian family, but his father at some point converted to Islam, thereby changing his children’s status, too. This greatly distressed Bacchus’s mother, who prayed for her husband’s return, and in the meantime, seems to have exposed her Muslim children to Christian practices. Eventually, the father died, freeing Bacchus to become a Christian. He was then baptised and tonsured as a monk, enraging certain Muslim relatives who had him arrested and killed.

Similar examples come from Córdoba, the capital of Islamic Spain, where a group of 48 Christians were martyred between 850 and 859, and commemorated in a corpus of Latin texts. Several of the Córdoba martyrs were born into religiously mixed families, but with an interesting twist: a number of them lived publicly as Muslims but practised Christianity in secret. In most instances, this seems to have been done without the knowledge of their Muslim fathers, but in one unique case of two sisters, it allegedly occurred with the father’s consent. The idea that one would have a public legal identity as a Muslim but a private spiritual identity as a Christian produced a unique subculture of ‘crypto-Christianity’ in Córdoba. This seems to have spanned generations, fuelled by the tendency of some ‘crypto-Christians’ to seek out and marry others like them.

In the modern Middle East, intermarriage has become uncommon. One reason for this is the long-term success of Islamisation, such that there are simply fewer Jews and Christians around to marry. Another reason is that those Jewish and Christian communities that do exist today have survived partly by living in homogeneous environments without Muslims, or by establishing communal norms that strongly penalise marrying out. In contrast to today’s world, where the frontiers between communities can be sealed, the medieval Middle East was a world of surprisingly porous borders, especially when it came to the bedroom.

Trump, Benjamin Franklin and the long history of calling immigrants ‘snakes’

History News Network, via AlterNet, January 28, 2019

In the midst of a national (non-)dialogue about immigration, one major sticking point has been the belief promoted by Donald Trump that immigrants crossing the southern border are criminals, slinking northward like reptiles to spread their venom.

Speaking at the Conservative Political Action Conference in February 2018, Trump read a poem that he had frequently recited during his campaign to discuss immigration. The poem (first written as a song in the 1960s) tells the story of a woman who rescues a freezing snake but is bitten after she revives it. The last stanza is a dialogue between the woman and the snake: “I saved you, cried the woman, and you’ve bitten me. Heavens why?/ You know your bite is poisonous, and now I’m going to die./ Oh, shut up, silly woman said the reptile with a grin./ You knew damn well I was a snake before you took me in.”

Trump’s unsupported allegations that immigrants are “animals, not people” may find a popular reception among many Americans because the association between immigrants, criminality, and reptiles goes back to a period well before the founding of the nation. The sticking point in the national dialogue might be removed if citizens had more information about the cultural origins of the belief that immigrants are felons—the legal equivalent of rattlesnakes.

After the establishment of the British colonies in America in the early seventeenth century, it became a common practice in England for authorities to round up “beggars, Gypsies, prostitutes, the poor, the orphaned,” and other “lewd and dangerous persons” and ship them to the colonies. To pay for their passage, the ship’s captain sold the immigrants into servitude upon arrival. Dennis Todd, author of Defoe’s America, estimates that about 130,000 British immigrants were brought to the Chesapeake Bay region between 1670 and 1718, far outnumbering the property-owning colonists (Todd, 7). Of this number, about half were indentured servants. Many of the remainder were felons who were granted transportation to America as an alternative to capital punishment. After passage of the Transportation Act of 1718, a large class of felonies was made punishable by transportation, rather than death. Between 1718 and 1775, some 40,000 convicts were transported to America (Todd, 8).

In the years prior to the importation of enslaved people from Africa, indentured servants, some of whom came voluntarily, were the principal source of labor in America. They were bound to work for a limited period of time (generally four to seven years), after which they were to be freed. They were often paid “freedom dues” of food, clothing, tools, and land upon completing their terms of service. After 1718, indentured servants were gradually supplanted by enslaved Africans and by transported felons. The felons were often sentenced to fourteen years of labor and did not enjoy the rights accorded to indentured servants. The longer terms of service and the lower social status of transported felons, together with a tighter market for tobacco, meant that felons after 1718 were much less likely to obtain the rewards of transportation that they might once have expected.

Benjamin Franklin, a patriot committed to ideals of human liberty, decried the policies of the British government that sent ships loaded with convicts to America. In 1749, the Assembly in Pennsylvania passed a bill forbidding the importation of convicts, but the measure was rejected by the British Parliament on the grounds “That such Laws are against the Publick Utility, as they tend to prevent the Improvement and Well Peopling of the Colonies” (Franklin, 358). In a satirical letter worthy of Jonathan Swift, Franklin expressed the gratitude of the colonies to “our Mother Country for the Welfare of her Children,” and proposed a fair return for the shipments of convicts. His proposal was that, in the Spring, the colonists should round up thousands of the “venomous Reptiles we call Rattle-Snakes, Felons-convict from the Beginning of the World,” and transport them to Britain, where they may be released in St. James’s Park, in the pleasure gardens about London, “but particularly in the Gardens of the Prime Ministers, the Lords of Trade and Members of Parliament; for to them we are most particularly obliged” (Franklin, 360).

Franklin, of course, did not consider all immigrants to be comparable to rattlesnakes, but many colonists made no distinction between the two groups. One pamphleteer, whom Franklin quoted in his letter to the Pennsylvania Gazette, exclaimed “In what can Britain show a more Sovereign Contempt for us, than by emptying their Jails into our Settlements; unless they would likewise empty their Jakes on our Tables?” (Franklin, 358). Eventually, says Todd, “all servants, free or criminal, came to be seen as socially inferior and unfit” (Todd, 144). Franklin was not brazen enough to propose a wall to keep out both servants and slaves, but in some ways his proposal to send rattlesnakes to Britain went further. His letter was really a cry to Americans to stand up for their own sovereignty (to which they did not yet have a claim, though Franklin thought they should exert it). Presumably, Franklin thought that sovereignty for America would lead to a more humanitarian policy on immigration.

America should be able to control its borders and determine the composition of its citizenry without recourse to rhetoric that demeans both immigrants and the office of the Presidency. The starting point, rather than sticking point, may be the moment when we stop regarding immigrants as rattlesnakes.

Recommended reading: White Trash: The 400-Year Untold History of Class in America, by Nancy Isenberg

THE BORDER PATROL HAS BEEN A CULT OF BRUTALITY SINCE 1924
Greg Grandin, The Intercept, January 12, 2019

Since its founding in the early 20th century, the U.S. Border Patrol has operated with near-complete impunity, arguably serving as the most politicized and abusive branch of federal law enforcement — even more so than the FBI during J. Edgar Hoover’s directorship.

The 1924 Immigration Act tapped into a xenophobia with deep roots in U.S. history. The law effectively eliminated immigration from Asia and sharply reduced arrivals from southern and eastern Europe. Most countries were now subject to a set quota system, with the highest numbers assigned to western Europe. As a result, new arrivals to the United States were mostly white Protestants. Nativists were largely happy with this new arrangement, but not with the fact that Mexico, due to the influence of U.S. business interests that wanted to maintain access to low-wage workers, remained exempt from the quota system. “Texas needs these Mexican immigrants,” said the state’s Chamber of Commerce.

Having lost the national debate when it came to restricting Mexicans, white supremacists — fearing that the country’s open-border policy with Mexico was hastening the “mongrelization” of the United States — took control of the U.S. Border Patrol, also established in 1924, and turned it into a frontline instrument of race vigilantism. As the historian Kelly Lytle Hernández has shown, the patrol’s first recruits were white men one or two generations removed from farm life. Some had a military or county sheriff background, while others transferred from border-town police departments or the Texas Rangers — all agencies with their own long tradition of unaccountable brutality. Their politics stood in opposition to the big borderland farmers and ranchers. They didn’t think that Texas — or Arizona, New Mexico, and California — needed Mexican migrants.

​Earlier, in the mid-1800s, the Mexican-American War had unleashed a broad, generalized racism against Mexicans throughout the nation. That racism slowly concentrated along an ever-more focused line: the border. While the 1924 immigration law spared Mexico a quota, a series of secondary laws — including one that made it a crime to enter the country outside official ports of entry — gave border and customs agents on-the-spot discretion to decide who could enter the country legally. They had the power to turn what had been a routine daily or seasonal event — crossing the border to go to work — into a ritual of abuse. Hygienic inspections became more widespread and even more degrading. Migrants had their heads shaved, and they were subjected to an increasingly arbitrary set of requirements and the discretion of patrollers, including literacy tests and entrance fees.

The patrol wasn’t a large agency at first — just a few hundred men during its early years — and its reach along a 2,000-mile line was limited. But over the years, its reported brutality grew as the number of agents it deployed increased. Border agents beat, shot, and hanged migrants with regularity. Two patrollers, former Texas Rangers, tied the feet of one migrant and dragged him in and out of a river until he confessed to having entered the country illegally. Other patrollers were members of the resurgent Ku Klux Klan, active in border towns from Texas to California. “Practically every other member” of El Paso’s National Guard “was in the Klan,” one military officer recalled, and many had joined the Border Patrol upon its establishment.

For more than a decade, the Border Patrol operated under the authority of the Department of Labor, which in the early years of the Great Depression, before the election of Franklin D. Roosevelt and his appointment of Frances Perkins as secretary of labor, was a major driver pushing deportation. Perkins, even before she entered FDR’s cabinet, had already criticized Border Patrol brutality. In office, she tried to limit the abuses of immigration officials as much as she could, curtailing warrantless arrests, allowing detained migrants phone calls, and working to extend the protections the New Deal offered citizens to migrant workers, including an effort to make abusive migrant labor contracts more equitable.

Reform was short-lived. The White House, bowing to pressure from agriculturalists, placed the Border Patrol, and migration policy more broadly, under the authority of the Department of Justice. More laws further criminalizing migration reinforced the Border Patrol’s power. For example, the end of the Bracero guest-worker program, along with the 1965 Hart-Celler Act, which for the first time assigned quotas to Mexico and other countries in the Western Hemisphere, meant that thousands of seasonal Mexican workers were now officially “illegal.”

Exporting Paramilitary Policing

At the same time, experience gained in migrant interdiction began to be exported internationally. The Border Patrol is often thought of, even by critics of its brutality, as a sleepy backwater federal agency, far removed from the Cold War’s ideological frontlines. But the Patrol played a role in expanding the radius of Washington’s national security doctrine — the tutoring of allied security forces in counterinsurgency tactics — and accelerating the tempo of paramilitary action.

​The career of John P. Longan, who worked as an Oklahoma sheriff before joining the Border Patrol, is illustrative. Following stints in New Mexico and Texas, Longan was tapped to help run Operation Wetback, a mass deportation drive focused mostly on California that, as the Los Angeles Times put it, transformed the patrol into an “army” committed to an “all-out war to hurl tens of thousands of Mexican wetbacks back into Mexico.” Modern armies need a modern intelligence service, and Longan, operating out of an unmarked location in an old Alameda Navy installation, updated the Patrol’s ability to gather and analyze information — including information extracted during interrogations — and then act on that information quickly. A few years later, Longan transferred to the State Department’s Public Safety Program, doing tours in a number of third-world hotspots, including Venezuela, Thailand, the Dominican Republic, and Guatemala. According to Stuart Schrader, in his forthcoming “Badges Without Borders: How Global Counterinsurgency Transformed American Policing,” Longan was one of a number of Border Patrol agents recruited to train foreign police through CIA-linked “public safety” programs, since they were likely to speak Spanish. And having worked the southwestern borderlands, these patrollers-turned-covert operators were familiar with societies built around peonage-like labor relations; they seamlessly extended the kind of free-range immunity they enjoyed at home to poorer, oligarch-ruled nations like Guatemala.

In Guatemala, Longan used intelligence techniques similar to those he had developed in Operation Wetback to train local police and military officers, creating an “action unit” that could gather information — also mostly from interrogations, many of them involving torture — and act on it rapidly. Within the first three months of 1966, “Operación Limpieza,” or Operation Clean-up, as Longan called his project, conducted over 80 raids and scores of extrajudicial assassinations, including the murder, during one four-day period in early March, of over 30 political activists (I describe Longan’s time in Guatemala in detail here). Likewise, through the early 1970s, the U.S. trained Latin American security forces, the majority from countries run by military governments, at the Border Patrol Academy in Los Fresnos, Texas, where, according to the Los Angeles Times, “CIA instructors” trained them “in the design, manufacture, and potential use of bombs and incendiary devices.”

In This Place, You Have No Rights

Starting in the 1970s, investigative journalists began to report on Border Patrol abuse. Such exposés were damning, but largely ignored. John Crewdson, for instance, won a Pulitzer in 1980 for a series of articles published in the New York Times, including one titled “Border Sweeps of Illegal Aliens Leave Scores of Children in Jails,” yet his 1983 book based on the series, “The Tarnished Door,” is out of print. Crewdson’s reporting on the Border Patrol and the immigration system deserves a revival, for it provides an important back-history to the horrors we are witnessing today.

Patrollers, he reported, regularly engaged in beatings, murder, torture, and rape, including the rape of girls as young as 12. Some patrollers ran their own in-house “outlaw” vigilante groups. Others maintained ties with groups like the Klan. Border Patrol agents also used the children of migrants, either as bait or as a pressure tactic to force confessions. When coming upon a family, agents usually tried to apprehend the youngest member first, with the idea that relatives would give themselves up so as not to be separated. “It may sound cruel,” one patroller said, but it often worked.

Separating migrant families was not official government policy in the years Crewdson was reporting on abuses. But left to their own devices, Border Patrol agents regularly took children from parents, threatening that they would be separated “forever” unless one of them confessed that they had entered the country illegally. Mothers especially, an agent said, “would always break.” Once a confession was extracted, children might be placed in foster care or left to languish in federal jails. Others were released into Mexico alone, far from their homes — forced to survive, according to public defenders, by “garbage-can scrounging, living on rooftops and whatever.” Ten-year-old Sylvia Alvarado, separated from her grandmother as they crossed into Texas, was kept in a small cinderblock cell for more than three months. In California, 13-year-old Julia Pérez, threatened with being arrested and denied food, broke down and told her interrogator that she was Mexican, even though she was a U.S. citizen. The Border Patrol released Pérez into Mexico with no money or way to contact her U.S. family. Such cruelties weren’t one-offs, but part of a pattern, encouraged and committed by officers up the chain of command. The violence was both gratuitous and systemic, including “stress” techniques later associated with the war in Iraq.

The practice, for instance, as recently reported, of placing migrants in extremely cold rooms — called hieleras, or “ice boxes” — goes back decades, at least to the early 1980s, with Crewdson writing that it was a common procedure. Agents reminded captives that they were subject to their will: “In this place, you have no rights.”

Some migrants, being sent back to Mexico, were handcuffed to cars and made to run alongside them to the border. Patrollers pushed “illegals off cliffs,” a patrol agent told Crewdson, “so it would look like an accident.” Officers in the patrol’s parent agency, the Immigration and Naturalization Service, traded young Mexican women they caught at the border to the Los Angeles Rams in exchange for season tickets, and supplied Mexican prostitutes to U.S. congressmen and judges, paying for them out of funds the service used to compensate informants. Agents also worked closely with Texas agriculturalists, delivering workers to their ranches (including to one owned by Lyndon B. Johnson when he was in the White House), then raiding the ranches just before payday and deporting the workers. “The ranchers got their crops harvested for free, the INS men got fishing and hunting privileges on the ranches, and the Mexicans got nothing,” Crewdson reported.

Subsequent reporting confirms that the violence Crewdson documented continued down the years, largely unabated. The remoteness of much of the border region and the harshness of its terrain, the work that straddled the line between foreign and domestic power, and the fact that many of the patrollers were themselves veterans of foreign wars (or hailed from regions with fraught racial relations, including the borderlands themselves) all contributed to a “fortress mentality,” as one officer put it. Patrollers easily imagined their isolated substations to be frontier forts in hostile territory, holding off barbarians. They wielded awesome power over desperate people with little effective recourse. Based on information provided by local migrant advocacy groups, Human Rights Watch wrote in 1993 that in one such substation, in Harlingen, Texas, “physical abuse is often coupled with due process abuses meant to terrorize victims of brutality.” Most captured migrants, beaten or threatened with a beating, signed “voluntary departure agreements” and were “quickly repatriated.”

Between 1982 and 1990, Mexico City sent at least 24 protests to the U.S. State Department on behalf of Mexicans injured or murdered by Border Patrol agents. Just as soldiers use racial epithets for the people they are fighting overseas, Border Patrol agents have a word for their adversaries: “tonks.” It’s “the sound,” one patroller told a journalist, “a flashlight makes when you hit someone over the head.” In neighborhoods filled with undocumented residents, the Patrol operated with the latitude of an occupying army. “Mind your own fucking business, lady, and go back into your house,” one patroller ordered a resident in Stockton, California, who came out on her balcony to see him “kicking a Mexican male who was handcuffed and lying facedown on the ground.”

It wasn’t just the federal Border Patrol that engaged in such sadism, but local law enforcement as well. In 1980, a Texas lawyer affiliated with the United Farm Workers obtained videos of 72 interrogations of migrants that took place over the course of the previous seven years, recorded by the police department in McAllen, Texas. The images were disturbing: Police took turns beating one handcuffed Mexican man, bashing his head on the concrete floor, punching, kicking, and cursing as he pleaded for mercy. The tapes were made for enjoyment, as a kind of bonding ritual that would later be associated with the abuse committed against Iraqi prisoners in Abu Ghraib: As the officers gathered “night after night,” they drank beer and watched “playbacks” of their interrogation sessions. It was, said one of the men involved, a way of initiating new recruits into the cult of border brutalism.

There have been contradictory judicial rulings, but historically, agents’ power has been limited by no constitutional clause. There are few places patrollers can’t search, no property belonging to migrants they can’t seize. And there is hardly anybody they can’t kill, provided that the victims are poor Mexican or Central American migrants. Between 1985 and 1990, federal agents shot 40 migrants around San Diego alone, killing 22 of them. On April 18, 1986, for instance, patroller Edward Cole was beating 14-year-old Eduardo Carrillo Estrada on the U.S. side of the border’s chain-link fence when he stopped and shot Eduardo’s younger brother, Humberto, in the back. Humberto was standing on the other side of the fence, on Mexican soil. A court ruled that Cole, who had previously shot through the fence at Mexicans, had reason to fear for his life from Humberto and had used justifiable force.

Such abuses persisted through the 1990s and 2000s. In 1993, the House Subcommittee on International Law, Immigration, and Refugees held hearings on Border Patrol abuse, and its transcript is a catalogue of horrors. One former guard, Tony Hefner, at the INS detention center in Port Isabel, Texas, reported that “a young Salvadoran girl” was forced to “perform personal duties, like dancing the Lambada, for INS officials.” (In 2011, Hefner published a memoir with more accusations of sexual abuse by, as Hefner writes, the INS “brass”). Roberto Martinez, who worked with the San Diego-based U.S.-Mexico Border Program for the American Friends Service Committee, testified that “human and civil rights violations” by the Border Patrol “run the gamut of abuses imaginable” — from rape to murder. Agents regularly seized “original birth certificates and green cards” from Latino citizens, “leaving the victim with the financial burden of having to go through a lengthy process of applying for a new document.” “Rapes and sexual abuse in INS detention centers around the United States,” Martinez said, “seem to be escalating throughout the border region.”

Brutality continued as Washington further militarized both the border and broader immigration policy — first after the 1993 signing of the North American Free Trade Agreement, and then years later with the creation of Immigration and Customs Enforcement and the establishment of the Department of Homeland Security after the 9/11 attacks. Since 2003, Border Patrol agents have killed at least 97 people, including six children; few of those agents have been prosecuted. Last year, a 19-year-old Guatemalan Maya woman, Claudia Patricia Gómez Gonzáles, was killed, shot in the head by a still-unnamed Texas Border Patrol agent shortly after she entered the United States. According to a recent report by the American Civil Liberties Union, young girls apprehended by the Patrol have been physically abused and threatened with rape, while unaccompanied children have experienced “physical and psychological abuse, unsanitary and inhumane living conditions, isolation from family members, extended period of detention, and denial of access to legal medical service.”

The viciousness we are witnessing today at the border, directed at children and adults, has a long history, a fact that should in no way mitigate the extraordinary cruelty of Donald Trump. But it does suggest that if the U.S. is to climb out of the moral abyss it has fallen into, it has to think well beyond Trump’s malice. It needs a historical reckoning with the true cause of the border crisis: the long, brutal history of border enforcement itself.

The new Congress includes its first two Muslim women members. One of them, Rashida Tlaib of Michigan, considered getting sworn in privately using a copy of the Qur’an from the library of one of America’s Founding Fathers.

Muslims arrived in North America as early as the 17th century, eventually composing 15 to 30 percent of the enslaved West African population of British America. (Muslims from the Middle East did not begin to immigrate here as free citizens until the late 19th century.) Even key American Founding Fathers demonstrated a marked interest in the faith and its practitioners, most notably Thomas Jefferson.

As a 22-year-old law student in Williamsburg, Virginia, Jefferson bought a Qur’an – 11 years before drafting the Declaration of Independence.

The purchase is symbolic of a longer historical connection between the American and Islamic worlds, and of the nation’s early, robust embrace of religious pluralism.

Although Jefferson did not leave any notes on his immediate reaction to the Qur’an, he did criticize Islam as “stifling free enquiry” in his early political debates in Virginia, a charge he also leveled against Catholicism. He thought both religions fused religion and the state at a time he wished to separate them in his commonwealth.

Despite his criticism of Islam, Jefferson supported the rights of its adherents. Evidence exists that Jefferson had been thinking privately about Muslim inclusion in his new country since 1776. A few months after penning the Declaration of Independence, he returned to Virginia to draft legislation about religion for his native state, writing in his private notes a paraphrase of the English philosopher John Locke’s 1689 “Letter on Toleration”:

“[he] says neither Pagan nor Mahometan [Muslim] nor Jew ought to be excluded from the civil rights of the commonwealth because of his religion.”

The precedents Jefferson copied from Locke echo strongly in his Virginia Statute for Religious Freedom, which proclaims:

“(O)ur civil rights have no dependence on our religious opinions.”

The statute, drafted in 1777, which became law in 1786, inspired the Constitution’s “no religious test” clause and the First Amendment.

Jefferson’s pluralistic vision

Was Jefferson thinking about Muslims when he drafted his famed Virginia legislation?

Indeed, we find evidence for this in the Founding Father’s 1821 autobiography, where he happily recorded that a final attempt to add the words “Jesus Christ” to the preamble of his legislation failed. And this failure led Jefferson to affirm that he had intended the application of the Statute to be “universal.”

By this he meant that religious liberty and political equality would not be exclusively Christian. For Jefferson asserted in his autobiography that his original legislative intent had been “to comprehend, within the mantle of its protection, the Jew and the Gentile, the Christian and Mahometan [Muslim], the Hindoo, and Infidel of every denomination.”

By defining Muslims as future citizens in the 18th century, in conjunction with a resident Jewish minority, Jefferson expanded his “universal” legislative scope to include everyone of every faith.

Ideas about the nation’s religiously plural character were tested also in Jefferson’s presidential foreign policy with the Islamic powers of North Africa. President Jefferson welcomed the first Muslim ambassador, who hailed from Tunis, to the White House in 1805. Because it was Ramadan, the president moved the state dinner from 3:30 p.m. to be “precisely at sunset,” a recognition of the Tunisian ambassador’s religious beliefs, if not quite America’s first official celebration of Ramadan.

Muslims once again provide a litmus test for the civil rights of all U.S. believers. Today, Muslims are fellow citizens and members of Congress, and their legal rights embody a founding American ideal still besieged by fear mongering that runs counter to the best of our ideals of universal religious freedom.

‘The sunrise city’: Florida community reconciles with history of 1920s race riot

Politicians and activists in Ocoee won recognition for victims of the ‘single bloodiest day’ in modern US political history

Richard Luscombe in Miami, The Guardian, Thu 3 Jan 2019 06.00 EST

It has been almost a century since Gladys Franks Bell’s father fled an election day race riot in Florida, clutching his little brothers and sisters and wading through swamps and woodland to safety while the Ku Klux Klan razed the family’s home town of Ocoee.

By the end of the night his uncle July Perry was dead, lynched by a white mob and left hanging from a lamp-post next to a sign reading: “This is what we do to niggers who vote.” The murderous rampage, meanwhile, continued unchecked, claiming dozens of other black lives, according to many accounts, while hundreds of survivors were run out of what then became an all-white town for decades.

Until recently, one of the most shameful episodes of the deep south’s racist past looked destined to be forgotten forever.

But now, thanks to the efforts of local politicians, activists and the Alabama-based Equal Justice Initiative, there is permanent recognition for the victims and their legacy, and an official expression of “regret and horror” from the city of Ocoee, near Orlando.

“It’s been so long, I never thought I’d live to see an acknowledgment that this even happened,” said Bell, who lives in the neighbouring city of Apopka.

“It stayed with me over the years, what my daddy shared with us when we were children, when we used to go into Ocoee and he showed us everything that used to be our property, and told us about his life there and everything that went on.

“Some of it makes you laugh, some makes you cry, and other parts make you downright mad. But they are the facts. It does bring all the memories back.”

A giant step towards healing came in November 2018, when the city of Ocoee – where the census returns between the time of the massacre and 1980 recorded only white residents – adopted a proclamation steeped in symbolism. Ocoee was no longer a so-called sundown city, named for an era when the safety of any black resident could not be guaranteed after dark, the proclamation read. It was henceforth to be “the sunrise city, with the bright light of harmony, justice and prosperity shining upon all our citizens”.

The ball was set rolling toward reconciliation at the start of the decade, when the civil rights historian Paul Ortiz, associate professor of history at the University of Florida, published an essay looking into what he called “the single bloodiest day in modern American political history”.

Ortiz chronicled the events in Ocoee surrounding the presidential election of 2 November 1920. Perry and his friend Mose Norman, two prosperous black businessmen, had tried to register African Americans to vote, in the face of fierce opposition from city leaders, and when Norman attempted to vote himself he was turned away.

Events degenerated quickly after he returned with a shotgun and was beaten and chased off by a mob who had gathered at the polling station. They raced to Perry’s home, where they thought Norman was hiding, and radioed for reinforcements. Klan members from Orlando and Tampa rushed to the scene where Perry, fearful for his family’s safety, fired at the crowd with a shotgun, killing two men.

The mob then overran the house, wounding Perry and pursuing his fleeing family through nearby woods, and expanded their rampage to Ocoee’s northern quarter, burning dozens of homes and two churches, killing an unknown number of people, perhaps as many as 50, according to Ortiz.

Perry’s fate was sealed when he was pulled by Klan members from the county jail in Orlando, shot and strung up. In the following hours, the rioting spread to Ocoee’s southern districts, where hundreds of black residents were forced to leave permanently, with no compensation for their lost property.

The renewed interest in Ocoee’s grim history sparked a new push for reconciliation, bolstered this April when the election day riot was incorporated into the National Memorial for Peace and Justice in Montgomery, Alabama, a museum dedicated to victims of racial terror and more than 4,400 black people lynched in the south between 1877 and 1950.

And in May, Ocoee voters elected George Oliver as the city’s first African American commissioner; he joined William Maxwell, the longtime chair of the city’s human relations diversity board, as a driving force for the adoption of the proclamation.

“It’s not so much righting a wrong as an opportunity to look at ourselves, each person as an individual,” Oliver said. “You’ve got to understand where July Perry and Mose Norman were coming from. They dared to prosper in an era of white privilege, dared to leave their home in North Carolina to seek out prosperity … one generation away from slavery.”

“That part became their undoing. They wanted to live the American dream.”

For Bell, the healing process began decades ago when her father Richard, as a teenager, carried his siblings to safety and helped them build their new life in Plymouth, Florida, 10 miles north of Ocoee, memories she records in her book Visions Through My Father’s Eyes.

“He went through all of that, he never shared any hatred against any white person and he taught us to do the same,” she said. “He’d tell us all about it and we just knew of it not holding any grudges. That’s just the type of man my father was.”

How George H.W. Bush Rode a Fake National Security Scandal to the Top of the CIA

James Risen, The Intercept, December 8 2018, 6:00 a.m.

ON DECEMBER 15, 1975, a Senate committee opened hearings on whether George H.W. Bush should be confirmed as director of the Central Intelligence Agency. It wasn’t going to be a slam dunk.

The Democrats had a huge majority in the Senate, and many were still angry over Bush’s role as a partisan apologist for former President Richard Nixon, who had resigned the year before as a result of the Watergate scandal. What’s more, in the wake of disclosures in the press of pervasive domestic spying by the CIA, the Senate had launched its first aggressive investigation into alleged abuses by the U.S. intelligence community.

Beginning in January 1975, the Church Committee, named for its chair, Idaho Democratic Sen. Frank Church, unearthed one scandal after another at the CIA, the FBI, and the National Security Agency. Long-hidden covert programs, including a series of plots to kill foreign leaders like Cuba’s Fidel Castro and the Congo’s Patrice Lumumba, had been exposed, rocking the CIA. By late 1975, the agency’s public standing was at a low ebb, and the CIA and White House officials in the administration of President Gerald Ford were increasingly worried about the political impact of the disclosures.

For Bush, the CIA job was a major opportunity at a time when his political career was in flux. Until then, his greatest accomplishment in the Republican Party had been to win a House seat in Texas that had always been held by a Democrat. But he had lost a subsequent Senate bid in 1970 and had been bouncing around Republican establishment circles ever since. He had the ignominy of serving as chair of the Republican National Committee during Watergate, which forced him to make repeated public excuses for Nixon.

Bush had also served as United Nations ambassador under Nixon and as head of the U.S. Liaison Office in China under Ford, and now the Washington rumor mill was reporting that Bush, the loyal soldier, was under consideration for a major political prize — to be Ford’s vice presidential running mate in 1976. If he didn’t get the vice president’s slot in 1976, it seemed likely that he might run for the presidency on his own later.

But first he had to get confirmed to the CIA post.

For the Ford White House and the CIA, Bush’s confirmation hearings set the stage for an all-out battle with congressional leaders. At a critical moment, the Ford administration, its allies in Congress, and the intelligence community collaborated to gin up outrage over a fake national security scandal that ultimately helped pull Bush across the finish line. That polarizing strategy has provided a winning model for Republican efforts to discredit and distract ever since, all the way down to Donald Trump, Devin Nunes, and the attempted sliming of the FBI and special counsel Robert Mueller’s Trump-Russia investigation.

The story of how Bush became CIA director is brilliantly told in “A Season of Inquiry Revisited” by Loch K. Johnson, a renowned historian of intelligence at the University of Georgia and former Church Committee staffer.

To get confirmed, Bush had to run a gauntlet through the Senate, where Democrats held 60 seats thanks to a post-Watergate Democratic landslide in the 1974 midterms. If he got the nod, he would be the first partisan political figure ever to run the CIA. Until then, the agency had been led by gray-flannel establishment figures from Wall Street, former senior military officers, or longtime agency professionals.

Standing directly in Bush’s way was Church, who had emerged as the spokesperson and public face of congressional efforts to probe and reform the intelligence community. Church immediately opposed Bush’s nomination, which he saw as an effort by Ford to install a partisan hack at the CIA who would do the bidding of the White House just as Congress was seeking to curb the agency’s abuses. Church viewed the Bush nomination as a direct White House attack on his committee’s investigation.

“We need a CIA that can resist all the partisan pressures which can be brought to bear by various groups inside and outside the government — especially pressures from the White House itself,” Church said in a speech on the Senate floor. “This is why the appointment of Ambassador George Bush is so ill-advised. It is one thing to choose an individual who may have had political experience, and quite another to choose someone whose principal political role has been that of chairman of the Republican National Committee. There is no need to eliminate from consideration an individual simply because he or she may have held public office. But the line must be drawn somewhere, and a man of Mr. Bush’s prolonged involvement in partisan activities at the highest party level surely passes over that line.”

At his confirmation hearing, Bush did little to allay Church’s concerns. Instead, he warned that “we must not see the CIA dismantled,” an obvious attack on the Senate’s investigative efforts.

AS THE HOLIDAYS approached, Bush’s confirmation hung in limbo. Then, on December 23, 1975 — eight days after his confirmation hearing — Richard Welch, the CIA’s station chief in Greece, was returning home from a Christmas party at the U.S. ambassador’s residence in Athens when he was assassinated.

Welch had been a relatively easy target for a local militant group known as 17 November. He had been living in the same house used by several previous CIA station chiefs and had been publicly identified in publications in Greece. The group later claimed that its members had been watching him for months.

But the CIA and the Ford White House quickly saw Welch’s murder as a political windfall. At a time when the CIA was under assault from Congress and Bush’s nomination was in peril in the Senate, there was now a dead CIA hero to mourn.

Ford, waiving restrictions, announced that Welch could be buried at Arlington National Cemetery. The plane carrying his body back home in early January “circled Andrews Air Force Base for three quarters of an hour in order to land live during the Today Show,” according to Johnson’s book.

The CIA and the White House began to exploit Welch’s death to discredit Church and his committee’s work. William Colby, the outgoing CIA director, lashed out at Congress, blaming Welch’s killing on the “sensational and hysterical way the CIA investigations had been handled and trumpeted around the world,” Johnson writes.

There was not a shred of evidence that anything the Church Committee had done had led to Welch’s murder. But the truth didn’t matter to the CIA and the Ford White House, and the campaign to discredit Church and his committee’s investigation worked. After Welch’s murder, public support for the Church Committee waned.

The changed climate proved helpful to Bush. On January 27, 1976, South Carolina Sen. Strom Thurmond argued for his confirmation by claiming that the public was more concerned by disclosures that “are tearing down the CIA” than by the “selection of this highly competent man to repair the damage of this over-exposure,” according to Johnson’s book. Later that day, Bush was confirmed by a vote of 64-27.

Bush only lasted a year as CIA director. Ford — who ended up choosing Bob Dole as his running mate — was defeated by Jimmy Carter in the 1976 election. Bush tried to convince Carter to keep him on as CIA director, but Carter’s vice president was Walter Mondale, who had been a leading member of the Church Committee and had already won a commitment from Carter to try to implement many of the committee’s recommendations for reforming the intelligence community.

So Bush ran for president instead. He lost in the primaries to Ronald Reagan, then rode Reagan’s coattails as his running mate in the 1980 election.

Bush’s political career owes much to the misuse of Welch’s murder. Above all, it helped start a Republican tradition of generating fake national security scandals to discredit Democrats and win political battles. In the wake of Bush’s death, many in the mainstream press and political elite have pinned him to a bygone era of civility, when partisanship was held in check out of concern for some greater good. But playing dirty didn’t start yesterday. There is a straight line from Welch to pre-war intelligence on Iraq’s weapons of mass destruction, Benghazi, and Nunes’s farcical midnight search for evidence that Trump was wiretapped.

Let’s Talk About George H.W. Bush’s Role in the Iran-Contra Scandal

Arun Gupta, The Intercept, December 7 2018, 10:06 a.m.

THE EFFUSIVE PRAISE being heaped on former President George H.W. Bush — “a calm and vital statesman” who exuded “decency, moderation, compromise” — risks burying his skeletons with him. One of the most notable skeletons that has gotten scant attention in recent days is his role in the Iran-Contra scandal.

As CIA director in the mid-1970s and as Ronald Reagan’s vice president, Bush helped forge a world of strongmen, wars, cartels, and refugees that continues today. In particular, he was deeply involved in the events that became known as the Iran-Contra scandal, a series of illegal operations that began with a secret effort to arm Contra fighters in Nicaragua in the hopes of toppling the leftist Sandinista government; this effort became connected to drug trafficking, trading weapons for hostages with Iran, and banking scandals.

In 1987, Arthur Liman, chief counsel for the Senate Select Committee on Secret Military Assistance to Iran and the Nicaraguan Opposition, described it as a “secret government-within-a-government … with its own army, air force, diplomatic agents, intelligence operatives and appropriations capacity.” Independent counsel Lawrence Walsh, tasked with investigating Iran-Contra, concluded that the White House cover-up “possibly forestalled timely impeachment proceedings against President Reagan and other officials.” Bush was a central figure in this.

Bush’s spy history is murky. According to Russ Baker, author of “Family of Secrets,” a history of the Bush family, in the late 1950s, Bush allegedly allowed the CIA to use an offshore oil rig he owned near Cuba as a staging ground for anti-Castro Cubans to raid their homeland. In 1967, Bush visited Vietnam as a freshman member of Congress, and Baker claims that Bush was accompanied by his business partner, a CIA agent, to investigate the Phoenix Program, the CIA torture and assassination operation that killed more than 20,000 Vietnamese by 1971.

As CIA director, Bush misled an FBI investigation into Chile’s responsibility for the 1976 assassination of exiled Chilean diplomat Orlando Letelier in Washington, D.C. Also as spy chief, Bush met his Panamanian counterpart, Manuel Noriega, who was already suspected at the time of drug trafficking. (As president, Bush ordered the 1989 invasion of Panama to remove Noriega, by then the country’s ruler, from power.)

As vice president, Bush became an architect of the “secret government” that came into being for the Iran-Contra operations. Official investigations of Iran-Contra are limited to the period after October 1984, when Congress banned military and intelligence services from providing direct or indirect support to the Contras. But Gary Webb’s exposé on CIA and Contra links to cocaine smuggling, “The Dark Alliance,” dates covert U.S. support for the Contras to 1981. Cobbled together from remnants of Nicaragua’s defeated National Guard, the Contras were notorious for torture, assassination, and other atrocities. The Phoenix-Condor link reached Central America, as the CIA recruited veterans of Argentina’s Dirty War to train the Contras, who ignited a decade-long war that killed an estimated 50,000 Nicaraguans.

Rolling Stone dates Bush’s involvement in the Contra war to 1982, when he reportedly conspired with CIA chief William Casey in an operation they code-named “Black Eagle.” Working under Bush, Donald Gregg managed finances and operations for the Contras, according to Rolling Stone, while Felix Rodriguez, a former CIA officer, handled arms flights to Central America and negotiated with military commanders there. Historian Douglas Valentine has claimed that in 1981, Bush authorized these veterans of the Phoenix Program to initiate a “Pink Plan” terror war against Central American insurgents.

Black Eagle masked its operation by relying on the Israeli Mossad to acquire and ship weapons to Central America, employing Panamanian airfields and companies as fronts, according to the Rolling Stone story. But the planes, once emptied of their arms cargo in Central America, were repurposed by Noriega and the Medellín cartel to ship drugs back to the United States. The CIA allegedly struck a deal with the Medellín cartel’s primary contact, Barry Seal: in return for Seal hauling weapons to the Contras, the CIA protected him as his operations smuggled an estimated $3 billion to $5 billion in drugs into the United States.

The White House also leaned on Gulf State monarchies to cough up more than $40 million for the Contras, violating the 1984 congressional ban known as the Boland Amendment. In 1985, Lt. Col. Oliver North coordinated with Israel to ship more than 2,000 anti-tank missiles to Iran through Israel in exchange for Iran’s assistance in freeing American hostages held in the region — and the profits were used to fund the Contras.

The maneuver, which violated the Arms Export Control Act, was extraordinarily cynical. Iran was mired in a brutal war with Iraq, which was backed by Bush and other senior Reagan administration officials beginning in 1982. Through the BNL bank that would later collapse in scandal, Iraq received more than $4 billion of U.S. Department of Agriculture credits. Most of that money reportedly went to buy weaponry even as Iraq waged chemical warfare against Iran and its own Kurdish citizens.

Both the Contra weapons shipments and the arms-for-hostages deals were exposed in 1986.

Much is still not known about Iran-Contra because of document shredding, deceit, and cover-ups by Reagan-era officials. Congress handcuffed its inquiry by failing to subpoena Oval Office recordings or call knowledgeable witnesses. Robert Parry, an Associated Press reporter who uncovered the arms-for-drugs trade years before Webb, criticized the media for failing to dig into the story and for succumbing to White House pressure and perception management.

On Christmas Eve 1992, then-President Bush decapitated Walsh’s investigation. Bush pardoned six figures, including Secretary of Defense Caspar Weinberger, whose trial was about to begin and at which Bush would likely have been called to testify. Walsh was livid. Saying “the Iran-Contra cover-up … has now been completed,” he called Bush a “president who has such a contempt for honesty [and] arrogant disregard for the rule of law.” Bush’s pardons are newly relevant because Bush consulted his attorney general at the time, William Barr, who reportedly did not oppose the pardons. Barr has just been named by President Donald Trump as his nominee for attorney general, where he may once again confront the issue of presidential pardons of senior government officials caught in an illegal conspiracy.

Bush’s role in the Iran-Contra scandal shows that his legacy is far darker than what is being reported amid his death and funeral. The truth is that he coddled dictators and death squads, undermined democratic institutions, and trashed the Constitution. He created the conditions that helped give rise to Donald Trump.

The word “Hanukkah” means dedication. It commemorates the rededicating of the ancient Temple in Jerusalem in 165 B.C. when Jews – led by a band of brothers called the Maccabees – tossed out statues of Hellenic gods that had been placed there by King Antiochus IV when he conquered Judea. Antiochus aimed to plant Hellenic culture throughout his kingdom, and that included worshipping its gods.

Legend has it that during the dedication, as people prepared to light the Temple’s large oil lamps to signify the presence of God, only a tiny bit of holy oil could be found. Yet that little bit of oil stayed alight for eight days, until more could be prepared. Thus, on each of Hanukkah’s eight nights, Jews light candles, adding one more as the festival progresses.

Hanukkah’s American story

Today, America is home to almost 7 million Jews. But Jews did not always find it easy to be Jewish in America. Until the late 19th century, America’s Jewish population was very small, growing to only about 250,000 by 1880. The basic goods of Jewish religious life – such as kosher meat and candles, Torah scrolls, and Jewish calendars – were often hard to find.

In those early days, major Jewish religious events took special planning and effort, and minor festivals like Hanukkah often slipped by unnoticed.

Hanukkah’s American rise began with a simple holiday hymn written in 1840 by Penina Moise, a Jewish Sunday school teacher in Charleston, South Carolina. Her evangelical Christian neighbors worked hard to bring the local Jews into the Christian fold. They urged Jews to agree that only by becoming Christian could they attain God’s love and ultimately reach Heaven.

Moise, a famed poet, saw the holiday celebrating dedication to Judaism as an occasion to inspire Jewish dedication despite Christian challenges. Her congregation, Beth Elohim, publicized the hymn by including it in their hymnbook.

This English-language hymn expressed a feeling common to many American Jews living as a tiny minority. “Great Arbiter of human fate whose glory ne’er decays,” Moise began the hymn, “To Thee alone we dedicate the song and soul of praise.”

It became a favorite among American Jews and could be heard in congregations around the country for another century.

Shortly after the Civil War, Cincinnati Rabbi Max Lilienthal learned about special Christmas events for children held in some local churches. To adapt them for children in his own congregation, he created a Hanukkah assembly where the holiday’s story was told, blessings and hymns were sung, candles were lighted and sweets were distributed to the children.

His friend, Rabbi Isaac M. Wise, created a similar event for his own congregation. Wise and Lilienthal edited national Jewish magazines where they publicized these innovative Hanukkah assemblies, encouraging other congregations to establish their own.

Lilienthal and Wise also aimed to reform Judaism, streamlining it and emphasizing the rabbi’s role as teacher. Because they felt their changes would help Judaism survive in the modern age, they called themselves “Modern Maccabees.” Through their efforts, special Hanukkah events for children became standard in American synagogues.

20th-century expansion

By 1900, industrial America produced the abundance of goods exchanged each Dec. 25. Christmas’ domestic celebrations and gifts to children provided a shared religious experience to American Christians otherwise separated by denominational divisions. As a home celebration, it sidestepped the theological and institutional loyalties voiced in churches.

Jewish families, too, began giving gifts to their children each December. But by giving those gifts at Hanukkah, instead of adopting Christmas, they also expressed their own ideals of American religious freedom, as well as their own dedication to Judaism.

After World War II, many Jews relocated from urban centers. Suburban Jewish children often comprised small minorities in public schools and found themselves coerced to participate in Christmas assemblies. Teachers, administrators and peers often pressured them to sing Christian hymns and assert statements of Christian faith.

From the 1950s through the 1980s, as Jewish parents argued for their children’s right to freedom from religious coercion, they also embellished Hanukkah. Suburban synagogues expanded their Hanukkah programming.

As I detail in my book, Jewish families embellished domestic Hanukkah celebrations with decorations, nightly gifts and holiday parties to enhance Hanukkah’s impact. In suburbia, Hanukkah’s theme of dedication to Judaism shone with special meaning. Rabbinical associations, national Jewish clubs and advertisers of Hanukkah goods carried the ideas for expanded Hanukkah festivities nationwide.

In the 21st century, Hanukkah accomplishes many tasks. Amid Christmas, it reminds Jews of Jewish dedication. Its domestic celebration enhances Jewish family life. In its similarity to Christmas domestic gift-giving, Hanukkah makes Judaism attractive to children and – according to my college students – relatable to Jews’ Christian neighbors. In many interfaith families, this shared festivity furthers domestic tranquility.

In America, this minor festival has attained major significance.

Coard: Know Thanksgiving instead of celebrating it

Michael Coard, Philadelphia Tribune, 11/16/18

I wouldn’t go so far as to say the white man is the devil. But I will say, as I have always said, “A devil is what a devil does.” And, historically speaking, the white man has done a whole lotta devilment, especially here in the land called America.

Since white folks, and sadly Black folks too, in this country will celebrate Thanksgiving next week, I’ll use this week’s Freedom’s Journal column to expose some irrefutable proof of that racist devilment.

Let’s begin at the beginning, which was white invasion resulting in Red genocide. Howard N. Simpson, M.D., in Invisible Armies: The Impact of Disease on American History, writes, “The Europeans were able to conquer America not because of their military genius or their religious motivation or their ambition or [even] their greed. They conquered it by waging... biological warfare.” And J. Leitch Wright Jr., in The Only Land They Knew, notes, “In 1623, the British indulged in the first use of chemical warfare in the colonies when negotiating a treaty with tribes, headed by Chief Chiskiac, near the Potomac River. The British offered a toast symbolizing ‘eternal friendship,’ whereupon the chief, his family, advisors, and two hundred followers dropped dead of poison.”

And in a 1763 letter to a colleague, Sir Jeffrey Amherst, a high-ranking British military officer, not only suggested using vicious wild dogs to hunt down Red men, women, and children, which was brutally done, but also suggested using diseased blankets on Red men, women, and children when he wrote, “Could it not be contrived to send Small Pox among those disaffected tribes of Indians? We must on this occasion use every stratagem in our power to reduce them.” And that was satanically done.

However, you might say those aren’t examples of Thanksgiving. You might also say Thanksgiving was invented by Europeans as an expression of unity and appreciation between the two races. But you’d be wrong, dead wrong, as dead as the murdered so-called Indians.

Thanksgiving, as an American holiday, is a celebration of racist genocide. But don’t take my word for it. Listen to what Wamsutta (also known as Frank B. James), the official representative of the Wampanoag Nation, wrote in 1970 in response to an invitation from the Massachusetts Department of Commerce for his “tribe” to participate in the 350th anniversary of the Pilgrims’ landing:

“This is a time of celebration for you- celebrating an anniversary of a beginning for the white man in America.... It is with a heavy heart that I look back upon what happened to my people. Even before the Pilgrims landed [here], it was a common practice for explorers to capture Indians, take them to Europe, and sell them as slaves.... The Pilgrims had hardly explored the shores of Cape Cod for four days before they had robbed the graves of my ancestors and stolen their corn and beans.... Massasoit, the great Sachem of the Wampanoag, knew these facts. Yet he and his people welcomed and befriended the settlers.... This action by Massasoit was perhaps our biggest mistake. We, the Wampanoag, welcomed you, the white man, with open arms, little knowing that it was the beginning of the end, that before 50 years were to pass, the Wampanoag would no longer be a free people....

History gives us facts and there were atrocities. There were broken promises and most of these centered around land ownership.... Never before had we had to deal with fences and stone walls. But the white man needed to prove his worth by the amount of land that he owned. Only ten years later, when the Puritans came, they treated the Wampanoag with even less kindness in converting the souls of the so-called ‘savages....’ [And the Indians who rejected the Puritans’ Christianity were] pressed between stone slabs and [also] hanged.... And... down through the years, there is record after record of Indian lands taken and... reservations set up....

Although time has drained our culture and our language is almost extinct, we the Wampanoags still walk the lands of Massachusetts. [And] our spirit refuses to die.... We still have the spirit. We still have the unique culture. We still have the will and, most important of all, the determination to remain as Indians.

We are determined, and our presence here this evening is living testimony that this is only the beginning of the American Indian... to regain the position in this country that is rightfully ours.”

But Brother Wamsutta’s September 10, 1970 speech was never heard publicly at the anniversary event because Massachusetts’ white government officials banned him from reading it aloud after they had requested and received a copy of it beforehand.

Here are three facts you must know about white folks’ Thanksgiving so you won’t make the mistake of celebrating and thereby whitewashing the horrific physical and biological slaughter of our brave Red sisters and brothers.

1. The Red nations (and there were five hundred of them on this land they called Turtle Island) were inhabited by people accurately and generally known as the Onkwehonwe, whose ancestors had been in the so-called New World for approximately 14,000 years. White Thanksgiving was founded thousands of years later, in 1621 in Plymouth, Massachusetts, by Pilgrims a year after they arrived from England to promote European religious traditions.

2. As further explained by Professor James W. Loewen in Lies My Teacher Told Me, “The Pilgrims did not introduce the tradition.... Indians had observed autumnal harvest celebrations for centuries. Although George Washington... [in 1789 did issue a proclamation setting aside November 26] as a national day of thanksgiving, our modern celebration dates back only to 1863. During the Civil War, when the Union needed all the patriotism that such an observance might muster, Abraham Lincoln proclaimed Thanksgiving a national holiday.” By the way, as Francis Russell Stoddard notes in The Truth About the Pilgrims, the term “Pilgrims” wasn’t even used until the 1870s.

3. Shortly after the Pilgrims (and later the Puritans) arrived in/invaded this land, and throughout the history of the United States, most notably following congressional passage of Senate Bill 102, signed by President Andrew Jackson in 1830 and known as the Indian Removal Act, which resulted in the gruesome “Trail of Tears,” Red people by the millions decreased in number as the genocidal terrorism, biological warfare, torture, rape, murder, land theft, and colonization increased.

Despite the hellish tradition of white Thanksgiving, I’m certainly not suggesting that Black folks not chill out on November 22 by hanging out, socializing, eating, and drinking with your family. In fact, you should do all that because it’s important for families, especially Black families, to come together as often as possible. Furthermore, that chilling out could also include watching professional football (unless you’re still boycotting like me). But if you do watch the Washington game on that day, don’t use the racist slur by calling that team the “Redskins” unless you call their Dallas opponents (and all other NFL teams) the “Crackers.”

Think about it.

A threat to democracy: Republicans' war on minority voters

For more than 150 years, the Republican party has harassed, obstructed, frustrated and purged American citizens to keep them from having a say in their own democracy

Carol Anderson, The Guardian, Wed 31 Oct 2018 06.00 EDT

Selma, Alabama, 1965: ‘Vote’ written across the forehead of a young man marching for black voting rights. Photograph: Alamy Stock Photo

It was a mystery worthy of crime novelist Raymond Chandler. On 8 November 2016, African Americans did not show up. It was like a day of absence. African Americans had virtually boycotted the election because they “simply saw no affirmative reason to vote for Hillary”, as one reporter explained, before adding, with a hint of an old refrain, that “some saw her as corrupt”. As proof of blacks’ coolness toward her, journalists pointed to the much greater turnout for Obama in 2008 and 2012.

It is true that, nationwide, black voter turnout had dropped by 7% overall. Moreover, less than half of Hispanic and Asian American voters came to the polls.

This was, without question, a sea change. The tide of African American, Hispanic and Asian voters that had previously carried Barack Obama into the White House and kept him there had now visibly ebbed. Journalist Ari Berman called it the most underreported story of the 2016 campaign. But it’s more than that.

The disappearing minority voter is the campaign’s most misunderstood story.

Minority voters did not just refuse to show up; Republican legislatures and governors systematically blocked African Americans, Hispanics and Asian Americans from the polls. Pushed by both the impending demographic collapse of the Republican party, whose overwhelmingly white constituency is becoming a smaller share of the electorate, and the GOP’s extremist inability to craft policies that speak to an increasingly diverse nation, the Republicans opted to disfranchise rather than reform. The GOP enacted a range of undemocratic and desperate measures to block the access of African American, Latino and other minority voters to the ballot box.

Using a series of voter suppression tactics, the GOP harassed, obstructed, frustrated and purged American citizens to keep them from having a say in their own democracy.

The devices the Republicans used are variations on a theme going back more than 150 years. They target the socioeconomic characteristics of a people (poverty, lack of mobility, illiteracy, etc) and then soak the new laws in “racially neutral” justifications – such as “administrative efficiency” or “fiscal responsibility” – to cover the discriminatory intent. Republican lawmakers then act aggrieved, shocked and wounded that anyone would question their stated purpose for excluding millions of American citizens from the ballot.

The disappearance of millions of votes and voters behind a firewall of hate and partisan politics was a long time in the making. The decisions to purposely disenfranchise African Americans, in particular, can be best understood by going back to the close of the civil war.

After Reconstruction, the plan was to take years of state-sponsored “trickery and fraud” and transform those schemes into laws that would keep blacks away from the voting booth, disfranchise as many as possible, and, most important, ensure that no African American would ever assume real political power again.

The last point resonated. Reconstruction had brought a number of blacks into government. And despite their helping to craft “the laws relative to finance, the building of penal and charitable institutions, and, greatest of all, the establishment of the public school system”, the myth of incompetent, disastrous “black rule” dominated. Or, as one newspaper editor summarized it: “No negro is fit to make laws for white people.”

Of course, the white lawmakers couldn’t be that blatant about their plans to disfranchise; there was, after all, that pesky constitution to contend with, not to mention the 15th amendment covering the right to vote with its language barring discrimination “on account of race”. Undaunted, they devised ways to meet the letter of the law while doing an absolute slash-and-burn through its spirit.

That became most apparent in 1890 when the Magnolia State passed the Mississippi Plan, a dizzying array of poll taxes, literacy tests, understanding clauses, newfangled voter registration rules, and “good character” clauses – all intentionally racially discriminatory but dressed up in the genteel garb of bringing “integrity” to the voting booth. This feigned legal innocence was legislative evil genius.

As the historian C Vann Woodward concluded, “The restrictions imposed by these devices [in the Mississippi Plan] were enormously effective in decimating the Negro vote.” Indeed, by 1940, shortly before the United States entered the war against the Nazis, only 3% of age-eligible blacks were registered to vote in the south.

Senator Theodore Bilbo, one of the most virulent racists to grace the halls of Congress, boasted of the chicanery (of the Mississippi Plan) nearly half a century later. “What keeps ’em [blacks] from voting is section 244 of the [Mississippi] Constitution of 1890 … It says that a man to register must be able to read and explain the Constitution or explain the Constitution when read to him.”

While the Civil Rights Movement and the subsequent Voting Rights Act of 1965 seemed to disrupt and overturn disfranchisement, the forces of voter suppression refused to rest. The election of Barack Obama to the presidency in 2008 and 2012 sent tremors through the right wing in American politics. They seized their opportunity in 2013 after the US supreme court gutted the Voting Rights Act and doubled down on some vestiges of the Jim Crow era, such as felony disfranchisement.

In 2016, one in 13 African Americans had lost their right to vote because of a felony conviction – compared with one in 56 non-black voters. The felony disfranchisement rate in the United States has grown by 500% since 1980. In America, mass incarceration equals mass felony disfranchisement. With the launch of the war on drugs, millions of African Americans were swept into the criminal justice system, many never to exercise their voting rights again.

Generally, the incarcerated cannot vote, but once they have served their time, which sometimes includes parole or probation, there is a process – often arcane and opaque – that allows for the restoration of voting rights. Overall, 6.1 million Americans have lost their voting rights. Currently, because of the byzantine rules, “approximately 2.6 million individuals who have completed their sentences remain disenfranchised due to restrictive state laws”, according to The Sentencing Project.

The majority are in Florida. The Sunshine State is actually an electorally dark place for 1.7 million citizens because “Florida is the national champion of voter disenfranchisement”, according to the Florida Center for Investigative Reporting. The state leads the way in racializing felony disfranchisement as well. “Nearly one-third of those who have lost the right to vote for life in Florida are black, although African Americans make up just 16% of the state’s population,” according to Conor Friedersdorf’s reporting for the Atlantic.

Florida is one of only four states – along with Kentucky, Iowa and Virginia – that “permanently” disfranchises felons.

The term “permanent” means that there is no automatic restoration of voting rights. Instead, there is a process to plead for dispensation, which usually requires petitioning all the way up to the governor after a specified waiting period. Republican Governor Rick Scott has made that task doubly difficult. The Florida Office of Executive Clemency, which he leads, meets only four times a year and has more than 10,000 applications waiting to be heard. An ex-offender cannot even apply to have his or her voting rights restored until 14 years after all the sentencing requirements have been met. The process is daunting enough as it is, but Scott has slowed it down considerably.

His predecessor, a moderate Republican turned Democrat, “restored rights to 155,315 ex-offenders” over a four-year span. Since 2011, however, Scott has approved only 2,340 cases.

Republicans in Georgia have brought their own distinct twist to voter suppression. Secretary of state Brian Kemp has developed a pattern of going after and intimidating organizations that register minorities to vote. In 2012, when the Asian American Legal Advocacy Center (AALAC) realized that a number of its clients, who were newly naturalized citizens, were not on the voter rolls although they had been registered, its staff made an inquiry with the secretary of state’s office.

After waiting and waiting and still receiving no response, AALAC issued an open letter expressing concern that the early voting period would close before they had an answer. Two days later, in a show of raw intimidation, Kemp launched an investigation questioning the methods the organization had used to register new voters. One of the group’s attorneys was “aghast … ‘I’m not going to lie: I was shocked, I was scared.’” AALAC remained under this ominous cloud for more than two years before Kemp’s office finally concluded there was no wrongdoing.

Kemp then went after the New Georgia Project when in 2014 the organization decided to whittle away at the bloc of 700,000 unregistered African American voters in the state and, in its initial run, registered nearly 130,000 mostly minority voters. Kemp didn’t applaud and see democracy in action. Instead, he exclaimed in a TV interview, “We’re just not going to put up with fraud.”

Later, when talking with a group of fellow Republicans behind closed doors, he didn’t claim “fraud”. It was something much baser. “Democrats are working hard … registering all these minority voters that are out there and others that are sitting on the sidelines,” he warned. “If they can do that, they can win these elections in November.” Not surprisingly, within two months of that discussion, he “announced his criminal investigation into the New Georgia Project”. And, just as before, Kemp’s hunt for fraud dragged on and on with aspersions and allegations filling the airwaves and print media while no evidence of a crime could be found.

Voter suppression has become far too commonplace. In 2017, “99 bills to limit access to the ballot have been introduced in 31 states … and more states have enacted new voting restrictions in 2017 than in 2016 and 2015 combined”, according to Ari Berman.

Yet, while there are far too many states that are eager to reduce “one person, one vote” to a meaningless phrase, others, such as Oregon, are determined to “make voting convenient” and “registration simple” because these “policies are good for civic engagement and voter participation”. In 2015, Oregon pioneered automatic voter registration (AVR). Under AVR, Oregon added 68,583 new voters in just six months. By the end of July 2016, the state’s “torrid pace” had swelled the rolls by 222,197 new voters.

California took one look at its neighbor to the north and is “hard on Oregon’s heels”. Secretary of state Alex Padilla, dissatisfied with his own state’s abysmal 42% voter turnout rate, had been scouring the nation looking for best practices. “We want to serve as a contrast to what we see happening in other states, where they are making it more difficult to register or actually cast a ballot,” he said. California thus adopted and then adapted Oregon’s AVR program to include preregistration of 16- and 17-year-olds, who are then automatically registered to vote when they turn 18.

These state initiatives to remove the barriers to the ballot box, including the use of mail-in ballots – which has had tremendous success in Colorado – are beginning to ricochet around the nation.

To date, ten states have implemented AVR and “15 states have introduced automatic voter registration proposals in 2018”.

Democrats in Congress have also pushed for legislation to enact a federal AVR program, because the United States consistently ranks toward the bottom of developed democracies in terms of voter turnout. In July 2016, Senators Patrick Leahy (D-VT), Dick Durbin (D-IL), and Amy Klobuchar (D-MN) co-sponsored legislation that would take AVR nationwide. Leahy remarked, “There is no reason why every eligible citizen cannot have the option of automatic registration when they visit the DMV, sign up for healthcare or sign up for classes in college.”

No reason at all, except not one Republican in Congress has stepped up to support the bill.

Thus, when 31 states are vying to develop new and more ruthless ways to disfranchise their population, and when the others are searching desperately for ways to bring millions of citizens into the electorate, we have created a nation where democracy is simultaneously atrophying and growing – depending solely on where one lives. History makes clear, however, that this is simply not sustainable. It wasn’t sustainable in the antebellum era. It wasn’t sustainable when the poll tax and literacy test gave disproportionate power in Congress to Southern Democrats. And it’s certainly not sustainable now. Or, as Abraham Lincoln soberly observed, “I believe this government cannot endure, permanently half slave and half free.”

Here are 7 things the United Daughters of the Confederacy might not want you to know about them

Kali Holloway, Independent Media Institute, Raw Story, 07 Oct 2018 at 01:34 ET

It’s helpful, in the midst of any conversation about this country’s Confederate monuments, to understand who put these things up, which also offers a clue as to why. In large part, the answer to the first question is the United Daughters of the Confederacy, a white Southern women’s “heritage” group founded in 1894. Starting 30 years after the Civil War, as historian Karen Cox notes in her 2003 book “Dixie’s Daughters,” “UDC members aspired to transform military defeat into a political and cultural victory, where states’ rights and white supremacy remained intact.” In other words, when the Civil War gave them lemons, the UDC made lemonade. Horribly bitter, super racist lemonade.

Though the UDC didn’t invent the Lost Cause ideology, they were deeply involved in spreading the myth, which simultaneously contends the Confederacy wasn’t fighting to keep black people enslaved while also suggesting slavery was pretty good for everyone involved. Lost Causers — plenty of whom exist today, their sheer numbers a reflection of the UDC’s effectiveness — argue that Confederate monuments are just innocent statues; that taking them down erases history; that we cannot retroactively apply today’s ideas about the morality of slavery to the past. The response to those ridiculous cop-outs is that Confederate monuments honor and glorify people who fought to maintain black chattel slavery; that they were erected for the explicit purpose of obfuscating history; and that the immorality of slavery was always understood by the enslaved. Excuses, excuses: get better at them.

“In their earliest days, the United Daughters of the Confederacy definitely did some good work on behalf of veterans and in their communities,” says Heidi Christensen, former president of the Seattle, Washington, chapter of the UDC, who left the organization in 2012. “But it’s also true that since the UDC was founded in 1894, it has maintained a covert connection with the Ku Klux Klan. In fact, in many ways, the group was the de facto women’s auxiliary of the KKK at the turn of the century. It’s a connection the group downplays now, but evidence of it is easily discoverable — you don’t even have to look very hard to find it.”

In 2017, after the white nationalist Unite the Right rally in Charlottesville, UDC President Patricia M. Bryson posted an open letter claiming the UDC’s members “have spent 123 years honoring [Confederate soldiers] by various activities in the fields of education, history and charity, promoting patriotism and good citizenship,” and that members, “like our statues, have stayed quietly in the background, never engaging in public controversy.” But that isn’t true, not by a stretch. The UDC’s monuments, books, education and political agenda have always spoken loudly – in absolutely deafening shouts – on issues from anti-black racism to the historical memory of the Civil War across the South. Today, a shameful number of Americans don’t think slavery was the primary cause of the Civil War – even though the seceding states literally spelled this out in document form – in part because of the UDC’s campaign of misinformation. The most minor of gains made by blacks during the Reconstruction were obliterated nearly as soon as they were obtained, and the UDC backed that disenfranchisement full stop. Even the current UDC has mostly steadfastly refused – with rare exceptions – to take down Confederate monuments. They know the power of those symbols, both politically and socially, and they aren’t giving an inch, if they can help it.

The UDC have had a huge impact on this country, and to pretend they’ve stood “quietly in the background” would be laughable if it weren’t so insulting. The UDC both trained and became the white women of 1950s massive resistance, who author Elizabeth Gillespie McRae writes did “the daily work on multiple levels . . . needed to sustain racial segregation and to shape resistance to racial equality.” They set a precedent for a huge swath of today’s white women voters whose main political agenda is white supremacy – women who in a 2017 Alabama Senate race backed the alleged pedophile who wistfully longed for slavery and supported the presidency of a man who brags about grabbing women’s genitals when he’s not shouting his racism from the rafters. They have contributed to the construction of a “white womanhood” that has historically been and currently remains incredibly problematic, rendering “white feminism” eternally suspect. With their impact considered, and signs of their handiwork all over society – even carved indelibly into mountainsides – it seems worth understanding a few things about the UDC both then and now. Here are seven things you should know about the United Daughters of the Confederacy.

1. They published a very pro-KKK book. For children.

In 1914, the in-house historian of the UDC Mississippi chapter, Laura Martin Rose, published “The Ku Klux Klan, or Invisible Empire.” It’s essentially a love letter to the original Klan for its handiwork in the field of domestic terror in the years following the Civil War, when blacks achieved a modicum of political power.

2. Actually, they published at least two very pro-KKK books …

… and probably many more. Another UDC ode to the KKK was written by Annie Cooper Burton, then-president of the Los Angeles chapter of the UDC, and published in 1916. Titled “The Ku Klux Klan,” much like Rose’s aforementioned book, it argues that the Klan has gotten a bad rap just because they terrorized and intimidated black people, not infrequently assaulting and raping black women, murdering black citizens, and burning down black townships. For these reasons, she suggests, the UDC should do even more to show reverence to the Klan:

“Every clubhouse of the United Daughters of the Confederacy should have a memorial tablet dedicated to the Ku Klux Klan; that would be a monument not to one man, but to five hundred and fifty thousand men, to whom all Southerners owe a debt of gratitude.”

By “all Southerners,” Burton clearly means “only white people,” which is also what she means whenever she uses the word “people.”

3. They built a monument to the KKK.

The UDC was busiest during the 1910s and 1920s, two decades during which the group erected hundreds of Confederate monuments that made tangible the racial terror of Jim Crow. This, apparently, the group still considered insufficient to convey its message of white power and to reassert the threat of white violence. So in 1926, the UDC put up a monument to the KKK. In a piece for Facing South, writer Greg Huffman describes a record of the memorial in the UDC’s own 1941 book “North Carolina’s Confederate Monuments and Memorials.”

4. Their most intense efforts focused on the “education” of white children.

Historian Karen Cox, author of 2003’s “Dixie’s Daughters,” has written that the UDC’s biggest goal was to indoctrinate white Southern children in the Lost Cause, thus creating “living monuments.”

---

5. They’re big fans of black chattel slavery from way back.

The UDC were perhaps the most efficient agents making the ahistorical Lost Cause myth go viral. They did this through a number of methods, the most visually apparent being the 700 monuments exalting people who fought for black chattel slavery that still stand. But also, in the rare cases the UDC has “honored” black people with statuary and monuments, it has been in the form of “loyal slave” markers — an actual subgenre of Confederate monuments — which perpetuate the image of content enslaved blacks and benevolent white enslavers.

---

6. They get tax breaks that help keep their workings financially solvent.

The UDC is a nonprofit, which means it’s a tax-exempt organization. A recent article about the UDC by AP reporter Allen Breed notes that the annual budget of Virginia, where the UDC is headquartered, “awards the state [division of the] UDC tens of thousands of dollars for the maintenance of Confederate graves — more than $1.6 million since 1996.”

7. They continue to exert political and social influence.

For the most part, the UDC has publicly kept pretty mum on the subject of Confederate monument removal, which has led some to conclude that the group is largely inactive, and even obsolete. Their numbers have dwindled since their heyday, but they remain tenacious about keeping Confederate monuments standing, thus continuing their cultural and political influence.

The UDC does this mostly through lawsuits. (The number of Confederate markers on courthouses has always shown the group’s keen interest in the power of the legal system.) When the San Antonio City Council voted in the weeks after the racist violence in Charlottesville to remove a Confederate monument from public property, the UDC filed suit against city officials. The Shreveport, Louisiana, chapter of the UDC has announced it will appeal a federal judge’s 2017 dismissal of the group’s lawsuit to keep up a Confederate monument at a local courthouse. The UDC threatened legal action against officials in Franklin, Tennessee, when the city announced plans — not to take down a UDC monument to the Confederacy, but to add markers recognizing African-American historical figures to the park, which the UDC claims it owns. The city of Franklin, with pretty much no other option, responded by filing a lawsuit against the UDC.

And then there’s the case of the UDC vs. Vanderbilt University, in which the group’s Tennessee division filed suit after school administrators announced plans to remove the word “Confederate” from one of its dorms. A state appeals court ruled Vanderbilt could only implement the plan if it repaid the $50,000 the UDC had contributed to the building’s construction in 1933, adjusted to 2016 dollars. Rather than keep “Confederate” in the dorm name, Vanderbilt opted to pay the UDC $1.2 million, raised from anonymous donors who contributed to a fund explicitly dedicated to the cause.

In 1914, Charles Daniels bought a pair of tickets to see King Lear at Calgary’s Sherman Grand Theatre, but when he attempted to take his orchestra-level seat, he was told by ushers to move up to the balcony level, where other black patrons were seated.

Theatre staff told Daniels that his presence made the white patrons uncomfortable. Daniels protested, refused offers of a refund, and left.

“The fact that this happened in 1914, in Calgary, Alberta, blew my mind. It broke the whole narrative that these kind of things only happen in the United States,” said Bashir Mohamed, a civil servant who has been scouring the provincial archives in Edmonton for the last two years, and who wrote about Daniels’s case in an essay for the Sprawl, a Calgary journalism site.

Daniels’s story has re-emerged amid a belated recognition across Canada of past injustices that have been largely absent from the national conversation on race.

At the time, the incident at the theatre was widely covered in local papers, with one running the headline: “CALGARY ‘NIGGER’ KICKS UP FUSS — Wants to Attend Theatre With ‘White Folks’ But Management Says No.”

Daniels retained a lawyer, sued the theatre over its policy of segregation – and won the case.

​He was awarded $1,000, worth more than $17,000 USD today. During the trial he had said: “I think the humiliation is worth that amount.”

Mohamed said he had started researching the history of black Canadians after seeing an online commentator claim that Canada does not have the same history of racial discrimination as the US. “When I was going through school, I never learned about this black history. But I always assumed there was something there,” he said.

Since then he has documented Canada’s racist place-names, the extensive presence of the Ku Klux Klan in western Canada, the effects of segregation and the fights of early – but largely forgotten – civil rights activists.

“I think it’s very important work… because there isn’t as much written about people of African descent within Alberta,” Jennifer Kelly, a University of Alberta education professor, told the CBC.

Daniels’s story has been told before, over the years and in different publications. But few Canadians are aware of the pioneering theatregoer, whose successful fight against discrimination predated the broader civil rights movement by decades.

“We have photos of Martin Luther King being arrested. We have mugshots. We have photos of Rosa Parks sitting on a bus. We have photos of her mugshot too,” Mohamed said.

But in Canada, the victims of racism were often seen as secondary to the story, he said: in Daniels’s case, newspaper reporters spoke with theatre management, but never Daniels. No images of Daniels were ever published.

“We don’t see photos of the black-only sections of the theatre. We don’t see photos of black patients being denied care, even though we know all those things happened,” said Mohamed. “Because these photos don’t exist, it’s hard to see these as real things that affected real people.”

Other civil rights activists are only recently gaining mainstream recognition in Canada. Viola Desmond, a black woman who refused to leave a whites-only area of a movie theatre in 1946, is to become the first Canadian-born woman to appear on the country’s $10 bill.

Amid a broader debate about Canada’s colonial heritage, some cities have removed statues of now-controversial historical figures, such as John A Macdonald, the country’s first prime minister and a notorious racist.

“There’s been this resurgence across Canada and the United States of people reclaiming their history,” said Mohamed, pointing out that Edmonton’s Oliver neighbourhood is named after Frank Oliver, the former federal minister who championed policies barring black immigration to Canada and successfully lobbied for the forced removal of First Nations from their treaty lands.

“It’s led to really critical discussions of who we celebrate. And who we’ve forgotten.”

Teachers Were the Real Heroes of School Desegregation

Often overlooked in histories of school desegregation are the teachers.

“There was not a manual, and there was not anything other than let’s try this, and with the overriding principle that these young people should not have to pay too big a price both in terms of their academic learning, in terms of their safety, by going through this process, because they didn’t volunteer for it either. And we’re all in it, in that sense. And that was the beauty of it, I mean, there were so many beautiful moments, but a lot of ugly stuff.” — Shelton Boyles, a former English teacher, Gainesville, Florida

Most were products of segregated colleges. Nothing in their training prepared them to teach integrated classes. After the lawyers and the courts had their say, the teachers had to make desegregation work. Failure was not an option. We have public schools in the South today because of their courage and tenacity.

School desegregation in the South proceeded in two stages. The Supreme Court in Brown v. Board of Education (two opinions, 1954 and 1955), decreed that each segregated school district must change to a “racially nondiscriminatory school system.” The lower courts quickly decided that test was satisfied if black students could choose to transfer from their historically black schools to white schools. The South’s dual system of white and black schools remained in place.

I graduated from the white Gainesville High School in Florida in 1962, two years before the first blacks entered white schools there. My research of the Gainesville experience with desegregation has informed this article.

During the “freedom of choice” era of desegregation, only a minority of black students chose to transfer to white schools. They were expected to conform to the white schools’ expectations. Many of those students were middle class and college bound. Curricula, clubs, student government, and other activities in the white schools did not change.

The NAACP Legal Defense Fund in 1968 persuaded the Supreme Court that the South’s dual school system, even with freedom of choice, was not a “racially nondiscriminatory school system.” The black schools would have to be integrated or closed. In Florida, virtually all of the black high schools were closed. In Gainesville, the entire student body of Lincoln High School struck for 11 days in 1969 to protest its closing.

With full desegregation, teachers had to engage disaffected students of both races. African American teachers were expected not only to continue their normal instructional roles but also to help black students accommodate to desegregation. Frequently, that role included calming student disturbances. White teachers had to begin working with their new black students from the point at which segregation had left those students academically.

Unencumbered by reforms such as No Child Left Behind, teachers in the newly integrated classrooms could innovate. At Gainesville High, two white teachers and one black teacher developed a three-hour block course called “Man and His Environment.” The course satisfied English, science and social studies requirements. The class included a representative group of students of both races and differing academic abilities. In other classes, many teachers took time out for students to tell something of their personal lives. Coaches fielded integrated sports teams. Principals found ways to get students of both races to work together on projects.

Masked by segregation, reading deficits were a major problem. Reading skills were needed not just for English classes but for all other subjects, even math. At the high school level, teachers were not expected to teach reading. However, in 1970, ten Gainesville High teachers enrolled in a junior college course for problem readers, to learn how to address their students’ reading issues. Curricula had to be revised to accommodate varying reading skills. Every teacher and every school was affected by student reading deficits.

Principal and football coach were the two most important positions in the high schools. In Gainesville, two new high schools opened in 1970. At the east side school, the last principal of Lincoln High School, who was black, became principal, and a white coach took over football. On the more affluent, white west side, the new school’s principal was white, and Lincoln’s last football coach took over the athletic program. At Lincoln, he had fed his players Kentucky Fried Chicken before games. Speaking to a friend about his posting to the west side, he commented, “These kids, they ain’t going to be able to play football off of salads.” But he molded a successful program. A white player says the teams played all the harder for him, to protect him from the occasional racist sideline taunts.

Although the schools increased the numbers of deans who enforced discipline, at times the teachers had to be first responders. One white GHS English teacher recalls breaking up a fight in class between a white boy and a black boy. “I got right down on the floor in my little dress and high heels and I rolled around with them. I got them apart.”

Public education in Florida had been tarnished by the antics of Republican Governor Claude Kirk. Recruiting black teachers, already difficult, was made more so because of Kirk. The school district sent out a recruiting team of a white teacher and a black teacher to visit out-of-town colleges. When passing through Taylor County, Florida, the teacher who was not driving would get down on the floor of the car to avoid potentially hostile attention.

After the 1969-70 and 1970-71 school years, each of the principals of Gainesville High resigned. For 1971-72, 30-year-old Dan Boyd took over. The school board told him that if he resigned, there would be no other position for him in the county school system. Although without special training in racial matters, Boyd remained principal for 24 years. He did not lead from his desk. He worked out with the athletes in the weight room, traveled with the football team, and was visible on campus. A female student, remembering Boyd’s tight jeans, says all the girls had crushes on him.

Despite serious outbreaks of violence in 1970, the Gainesville schools slowly but steadily regained educational equilibrium. However, student engagement continues to be a key problem facing public schools. We can still learn from both white and black educators who taught through the desegregation years.​

The Colonial Roots of Gun Culture

The origins of the U.S. gun obsession lie in the violent dispossession of Native Americans.

By Roxanne Dunbar-Ortiz, In These Times

In the summer of 1970, while I was living and organizing in New Orleans with a women’s study-action group, we became caught up in a current of repression and paranoia. After a week of heavy police surveillance, we began receiving telephone calls from a man claiming to be a member of the Ku Klux Klan. The man threatened to burn down our building, and, of course, we didn’t trust the police, so we did not report it. Instead, we decided to arm ourselves. We saw it as a practical step, not a political act: something we needed for self-defense in order to continue working, not at all an embrace of armed struggle. But once armed, our mindset changed to match the new reality.

Gun-love can be akin to non-chemical addictions like gambling or hoarding, either of which can have devastating effects, but murder, suicide, accidental death and mass shootings result only from guns. While nearly anything may be used to kill, only the gun is created for the specific purpose of killing a living creature. The sheer numbers of guns in circulation, and the loosening of regulations on handguns especially, facilitate deadly spur-of-the-moment reflex acts.

Seventy-four percent of gun owners in the United States are male, and 82 percent of gun owners are white. The top reason U.S. Americans give for owning a gun is for protection. What are the majority of white men so afraid of?

Instead of dismissing the Second Amendment as antiquated and irrelevant, or as not actually meaning what it says, understanding the original purpose of the Second Amendment is key to understanding gun culture, and possibly the key to a new consciousness about the continuing effects of settler-colonialism and white nationalism.

One argument that runs through historical accounts of the thinking behind the Second Amendment idealizes Anglo settler-farmers as fiercely independent, rightly fearful of Big Brother government, and insistent on settlers’ right to overthrow oppressive regimes. But what colonists considered oppressive was any restriction put on them in regard to obtaining land. In the instances of Bacon’s Rebellion in 1676, the War of Independence itself, and many cases in between, the settlers’ complaint was the refusal of the British colonial authorities to allow them to seize Native land peripheral to the colonies.

Taking land by force was not an accidental or spontaneous project or the work of a few rogue characters. Male colonial settlers had long formed militias for the purpose of raiding and razing Indigenous communities and seizing their lands and resources, and the Native communities fought back. Virginia, the first colony, forbade any man to travel unless he was “well armed.” In 1658, the colony ordered every settler home to have a functioning firearm, and later even provided government loans for those who could not afford to buy a weapon.

These types of laws stayed on the books of the earliest colonies and were created in new colonies as they were founded. The Second Amendment, ratified in 1791, enshrined these rights and obligations as constitutional law: “A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.” The Second Amendment thus reflects this dependence on individual armed men to take and retain land.

The continuing significance of that “freedom” specified in the Bill of Rights reveals the settler-colonialist cultural roots of the United States that appear even in the present as a sacred right. Settler-militias and armed households were institutionalized for the destruction and control of Native peoples, communities and nations. With the expansion of plantation agriculture, by the late 1600s they were also used as “slave patrols,” forming the basis of the U.S. police culture after emancipation.

That is the way of settler-colonialism, and that is the way of the gun — to kill off and control enemies. Violence perpetrated by armed settlers, even genocide, was not absent in the other British settler-colonies — Australia, Canada and New Zealand — but the people of those polities never declared the gun a God-given right. Nor did the other Anglo settler-colonies have economies, governments and social orders based on the enslavement of other human beings. The United States is indeed “exceptional.”

Germans are the largest ancestry group in the U.S., but their identity has been largely disappeared. Here’s why

Erika Schelby, Salon, July 27, 2018, 10:00PM (UTC)

"The howl of the cave man.” This is how a 1918 Los Angeles Times article described the music of Brahms and Bach. A year earlier the U.S. had declared war against Germany and waded into the tragedy of the First World War. The propaganda machine was in full swing. Germans were brutes—close cousins of the barbaric Huns—and detesting all things German became a badge of patriotic pride. The time was also ripe for the contributions of German-Americans to be scrubbed from the history books. Figures like Alexander von Humboldt, Carl Blümner and Heinrich Balduin Möllhausen, whose contributions to the U.S. are vast, were eclipsed by caricatures of the brutish German lusting for American blood. German-Americans learned to keep a low profile and the collective demonization induced a historical amnesia from which we have yet to awaken.

Today, German-Americans are the largest ancestry group in the U.S., with some 50 million citizens, but their history and their identity have been largely disappeared. This may seem irrelevant, but the history of how it happened tells us a great deal about our present and how ethnic groups can come in and out of favor depending on the geopolitics of the day. As Art Silverman and Robert Siegel noted in an “All Things Considered” segment on how U.S. propaganda erased German culture during World War I: “today … what happened a century ago has special relevance … World War I inspired an outbreak of nativism and xenophobia that targeted German immigrants, Americans of German descent, and even the German language.”

I grew up in Germany and was educated in the post-World War II model, which, for obvious reasons, stressed a respect for pluralism and cultivated a global view of politics and culture. As a result, I’ve always been sensitive to the ways in which propaganda shapes our opinions of different communities. The daily scandals of how Latin American immigrants, many of them fleeing horrors the U.S. had a hand in producing, are brutalized at our borders, and the harassment of and attacks on Muslims, grow out of and occur alongside propaganda campaigns that dehumanize these groups as criminal, devious and irrational. The ethnicities change, but the message stays the same.

Unfortunately, this model has proven reliable in the effort to tar an entire population, and I’ve spent a good deal of time studying how it was applied to Germans and German-Americans. It begins with a well-crafted propaganda campaign initiated only days after the U.S. declared war on Germany. On April 13, 1917, President Wilson formed the Committee on Public Information (CPI), which recruited 150,000 academics, business people, and talent from the media and arts to promote the war to voters. Up to 20,000 newspaper pieces per week were printed based on “voluntary censorship” and CPI handouts. That was probably the birth of “embedded.” One of the CPI’s primary goals was to shape public opinion of the new enemies—Germans. Professor Vernon Kellogg, a member of the intelligentsia, who served this effort, eloquently expressed himself in a CPI publication, writing, “Will it be any wonder, if, after the war, the people of the world, when they recognize any human being as German, will shrink aside so that they may not touch him as he passes, or stoop for stones to drive him from his path?”

Hollywood did its part by producing films like “The Kaiser, the Beast of Berlin” and “Wolves of Kultur,” which cemented the idea of Germans as a public menace in the minds of movie-goers. Super-patriotic volunteer groups joined in, whipping up a tidal wave of war hysteria, hunting for spies and saboteurs who did not exist, and trampling civil rights along the way. Books were burned, citizens were tarred and feathered, and German names of places, streets, foods, and objects were erased. Bach went from being a seminal composer to a villain whose work exemplified the monstrous German aesthetic. Teaching the German language was outlawed in more than a dozen states.

In that same year an anti-German mob demanded that the Pabst Theater in Milwaukee (once called the “German Athens”) cancel a scheduled performance of “William Tell” by Friedrich Schiller. They underscored their demand by placing a machine gun in front of the theater. When the theater offered to show a comedy instead the crowd threatened to “break up the Hun show.” The mob got its way and nothing showed on the night “William Tell” was due to run.

It is interesting to note that Schiller drew his inspiration from the American and French Revolutions. “William Tell” was the same play W.E.B. Du Bois read in German at Fisk University before he studied in Berlin, where his “first awakening to social reform began.”

On April 5, 1918, almost exactly a year after Wilson inaugurated the CPI, Robert Prager, a German miner living in Illinois, was lynched. Not only had he committed the sin of being a German immigrant, but he was suspected of being a socialist. Ironically, one of the reasons Prager’s application for membership in the United Mine Workers Union was declined was precisely because he was German. Eleven men were arrested, tried, and ultimately acquitted of the crime. On April 11, 1918, The Washington Post editorialized, “The more one ponders Senator Overman’s estimate of 400,000 German spies the harder it is to grow righteously indignant over the Illinois lynching.” Lee Slater Overman was a senator from North Carolina who chaired a committee zealously dedicated to rooting out real or perceived spying and other treasonous activities on the part of Germans and others.

​If the propaganda served the war effort, it also served the economic interests of the elite. Not unlike today, discontent among workers was profound. It ran so deep that the United States Commission on Industrial Relations, an arm of Congress, admitted: “The workers of the nation, through compulsory and oppressive methods, legal and illegal, are denied the full product of their toil … Citizens numbering millions smart under a sense of injustice and oppression. The extent and depth of industrial unrest can hardly be exaggerated.”

When miners in Bisbee, Arizona, went on strike in June 1917, Walter Douglas, president of Phelps Dodge Corporation, a mining company affected by the strike, declared, “I believe the government will be able to show that there is German influence behind this movement.”

In a stroke of propagandistic genius, cartoonist H.T. Webster managed to conflate the Kaiser, the top German aristocrat, with the Industrial Workers of the World. His cartoon, which ran in the New York Globe in 1917, shows the acronym IWW printed over the Kaiser’s face.

By the time the war was over ordinary German-Americans had learned to lay low, speak English exclusively, and release any attachment to their cultural heritage. Perhaps the best measure of the success of the propaganda effort against Germans is how quickly and effectively they learned to de-identify as German. We may be tempted to mistake this for assimilation, but history shows the hand of coercion was placed firmly around the German-American community.

So, why care about this? After all, it’s not as if any American is being attacked or oppressed today based on their German heritage. For me, as a German-American and a student of history, the anti-German propaganda effort is no mere historical footnote or anomalous descent into xenophobia. It is a template — hardly the first, but a vivid one no less — for what we are seeing today. When I look at the faces of immigrant children forcibly separated from their families or hear anti-Muslim sentiment, I know that the demonization of communities does not happen accidentally, and that an intellectual as well as legal and political infrastructure is required to turn them into victims in the larger game of geopolitics. The material loss for those affected is incalculable, and the loss of cultural heritage for them and everyone else is often a quiet casualty of the war waged against them. History gives us the vantage point to see how and why enemies are created. It also gives us an unmistakable warning that we would be wise to heed.

Here are some of the ways that Mexicans made America great

History News Network, via Raw Story, 18 Jul 2018 at 10:25 ET

Mexicans have contributed to the making of the United States in pivotal and enduring ways. In 1776, more of the territory of the current United States was under Spanish sovereignty than lay within the 13 colonies that rejected British rule. Florida, the Gulf coast to New Orleans, the Mississippi to St. Louis, and the lands from Texas through New Mexico and California all lived under Spanish rule, leaving Hispanic-Mexican legacies. Millions of pesos minted in Mexico City, the American center of global finance, funded the war for U.S. independence, leading the new nation to adopt the peso (renamed the dollar) as its currency.

The U.S. repaid the debt by claiming Spanish/Mexican lands—buying vast Louisiana territories (via France) in 1803; gaining Florida by treaty in 1819; sending settlers into Texas (many undocumented) to expand cotton and slavery in the 1820s; enabling Texas secession in 1836; provoking war in 1846 to incorporate Texas’s cotton and slave economy—and California’s gold fields, too. The U.S. took in land and peoples long Spanish and recently Mexican—often mixing European, indigenous, and African ancestries. The 1848 Treaty of Guadalupe Hidalgo recognized those who remained in the U.S. as citizens. And the U.S. incorporated the dynamic mining-grazing-irrigation economy that had marked Spanish North America for centuries and would long define the U.S. West.

Debates over slavery and freedom in lands taken from Mexico led to the U.S. Civil War, while Mexicans locked in shrunken territories fought over liberal reforms and then faced a French occupation — all in the 1860s. With Union victory, the U.S. drove to continental hegemony. Simultaneously, Mexican liberals led by Benito Juárez consolidated power and welcomed U.S. capital. U.S. investors built Mexican railroads, developed mines, and promoted export industries — including petroleum. The U.S. and Mexican economies merged; U.S. capital and technology shaped Mexico while Mexican workers built the U.S. west. The economies were so integrated that a U.S. downturn, the panic of 1907, was pivotal to setting off Mexico’s 1910 revolution, a sociopolitical conflagration that consumed Mexico while the U.S. joined World War I.

Afterwards, the U.S. roared in the ’20s while Mexicans faced reconstruction. The U.S. blocked immigration from Europe, yet still welcomed Mexicans to cross a little-patrolled border to build dams and irrigation systems, cities and farms across the west. When depression hit in 1929 (it began in New York, spread across the U.S., and was exported to Mexico), Mexicans became expendable. Denied relief, they were handed one-way tickets to the border, forcing thousands south — including children born as U.S. citizens.

Mexico absorbed the refugees thanks to new industries and land distributions—reforms culminating in the nationalization of the oil industry in 1938. U.S. corporations screamed foul, and FDR arranged a settlement (access to Mexican oil mattered as World War II loomed). When war came, the U.S. needed more than oil. It needed cloth and copper, livestock and leather—and workers, too. Remembering the expulsions of the early ’30s, many resisted going north. So the governments negotiated a labor program, recruiting braceros in Mexico, paying for travel, promising decent wages and treatment. Some 500,000 Mexican citizens also fought in the U.S. military; sent to deadly fronts, they suffered high casualty rates.

To support the war, Mexican exporters accepted promises of post-war payment. With peace, accumulated credits allowed Mexico to import machinery for national development. But when credits ran out, the U.S. subsidized the reconstruction of Europe and Japan, leaving Mexico to compete for scarce and expensive bank credit. Life came in cycles of boom and bust, debt crises and devaluations. Meanwhile, U.S. pharmaceutical sellers delivered the antibiotics that had saved soldiers in World War II to families across Mexico. Children lived—and Mexico’s population soared: from 20 million in 1940, to 50 million by 1970, to 100 million in 2000. To feed growing numbers, Mexico turned to U.S. funding and scientists to pioneer a “green revolution.” Harvests of wheat and maize rose to feed growing cities. Reliance on machinery and chemical fertilizers, pesticides, and herbicides, however, cut rural employment. National industries also adopted labor-saving ways, keeping employment scarce everywhere. So people trekked north, some to labor seasonally in a bracero program that lasted until 1964; others to settle families in once-Mexican regions like Texas and California and places north and east.

Documentation and legality were uncertain. U.S. employers’ readiness to hire Mexicans for low wages was not. People kept coming. U.S. financing, corporations, and models of production shaped lives across the border; Mexican workers labored everywhere, too. With integrated economies, the nations faced linked challenges. In the 1980s the U.S. suffered from “stagflation” while Mexico faced a collapse called the “lost decade.” In 1986, Republican President Ronald Reagan authorized a path to legality for thousands of Mexicans in the U.S. tied to sanctions on employers aimed to end new arrivals. Legal status kept workers here; failed sanctions enabled employers to keep hiring Mexicans—who kept coming. They provided cheap and insecure workers to U.S. producers—subsidizing profits in times of challenge.

The late 1980s brought the end of the Cold War, followed in 1991 by the demise of the Soviet Union and the presumed triumph of capitalism. What would that mean for people in Mexico and the U.S.? Reagan corroded union rights, leading to declining incomes, disappearing pensions, and enduring insecurities among U.S. workers. President Carlos Salinas of Mexico’s dominant PRI attacked union power—and in 1992 ended rural Mexicans’ right to land. A transnational political consensus saw the erosion of popular rights as key to post-Cold War times.

Salinas proposed NAFTA to Reagan’s Republican successor, George H.W. Bush. The goal was to liberate capital and goods to move freely across borders—while holding people within nations. U.S. business would profit; Mexicans would continue to labor as a reservoir of low wage workers—at home. The treaty was ratified in Mexico by Salinas and the PRI, in the U.S. by Democratic President Clinton and an allied Congress.

As NAFTA took effect in 1994, Mexico faced the Zapatista rising in the south, then a financial collapse—before NAFTA could bring investment and jobs. With recovery, the Clinton-era high-tech boom saw production flow to China. Mexico gained where transport costs mattered—as in auto assembly. But old textiles and new electronics went to Asia. Mexico returned to growth in the late 1990s, with jobs still scarce for a population nearing 100 million. Meanwhile, much of Mexican agriculture collapsed. NAFTA ended tariffs on goods crossing borders, but the U.S. still subsidizes corporate farmers: internal payments enable agribusiness to sell below cost. NAFTA left Mexican producers to face U.S.-subsidized staples. Mexican growers could not compete, and migration to the U.S. accelerated.

NAFTA brought new concentrations of wealth and power across North America. In Mexico, cities grew as a powerful few and favored middle sectors prospered; millions more struggled with informality and marginality. The vacuum created by agricultural collapse and urban marginality made space for a dynamic violent drug economy. Historically, cocaine was an Andean specialty, heroin an Asian product. But as the U.S. pressed against drug economies elsewhere, Mexicans—some enticed by profit; many searching for sustenance—turned to supply U.S. consumers.

U.S. politicians and ideologues blame Mexico for the “drug problem”—a noisy “supply side” understanding that is historically untenable. U.S. demand drives the drug economy. The U.S. has done nothing effective to curtail consumption—or to limit the flow of weapons to drug cartels in Mexico. Laying blame helps block any national discussion of the underlying social insecurities brought by globalization—deindustrialization, scarce employment, low wages, lowered benefits, vanishing pensions—insecurities that close observers know fuel drug dependency. Drug consumption in the U.S. has expanded as migration from Mexico now slows (mostly due to slowing population growth)—a conversation steadfastly avoided.

People across North America struggle with shared challenges—common insecurities spread by globalizing capitalism. Too many U.S. politicians see advantages in polarization, blaming Mexicans for all that ails life north of the border. Better that we work to understand our inseparable histories. Then we might work toward a prosperity shared by diverse peoples facing common challenges in an integrated North America.

Coard: “Operation Wetback”: America's Worst Mass Deportation

Michael Coard - philly tribune - 7/13/18

It was 64 years ago, on July 15, 1954, that the U.S. Border Patrol began its widespread and notoriously racist deportation program — officially named “Operation Wetback” by the Dwight Eisenhower administration — kicking out nearly 1.5 million Mexicans through despicable tactics that included, for example, demanding birth certificate identification from all so-called “Mexican-looking” people via apartheid-styled stop-and-frisk harassment.

I should mention that this “wetback” racial slur originated in the 1920s to describe Mexicans who swam the Rio Grande to reach America.

Donald Trump, born in 1946, was a budding 8-year-old racist in 1954. Sixty-one years later, at age 69, he was a full-fledged racist Republican presidential candidate who said in 2015 that Mexican immigrants are “bringing in crime. They’re rapists.” The year before, he had described Mexicans as “our enemies.”

As president, he declared that during his first 100 days in office, he would deport up to three million undocumented immigrants, meaning the Brown and Black ones. Not the white ones. As he stated earlier this year, “Why do we want all these people from Africa here? They’re s***hole countries. ... We should have more [white] people from Norway.”

Also as president, he initiated and implemented in 2018 a policy that had never — I repeat, never — been implemented before: separating thousands of immigrant children, especially Mexican ones, including infants, from parents fleeing imminent violence and/or dire poverty in their native lands.

Let’s get back to 1954’s “Operation Wetback,” which is a perfect example of what Trump means by his “Make America Great Again” nonsense. By the way, when he and his 63 million racist supporters tell Mexicans to go back to their own country, those so-called immigrants should just plop down in Texas and explain that they and their ancestors are natives of that land — land America stole in 1845, when it annexed Mexico’s northeastern province and declared it the 28th state, then began the Mexican-American War a year later and claimed victory in 1848.

And while our Mexican comrades are at it, they should mention that the American states of Arizona, California, Colorado, New Mexico and Utah are also part of their ancestral homeland. As a result, maybe Trump should go back to Germany and his stooges back to their various European countries.

As indicated by Erin Blakemore of The History Channel, “[This] short-lived operation used military-style tactics to remove Mexican immigrants — some of them American citizens — from the United States. Though millions of Mexicans had legally entered the country through joint immigration programs in the first half of the 20th century, Operation Wetback was designed to send them back to Mexico. ... During [this operation] ..., tens of thousands of immigrants were shoved into buses, boats and planes and sent to often-unfamiliar parts of Mexico, where they struggled to rebuild their lives. In Chicago, three planes a week were filled with immigrants and flown to Mexico. In Texas, 25 percent of all of the immigrants deported were crammed onto boats later compared to slave ships, while others died of sunstroke, disease and other causes while in custody.”

Columbia University professor Dr. Mae Ngai similarly likened the boats used to “eighteenth century slave ships.”

UCLA professor Kelly Lytle Hernandez pointed out that the operation was “lawless ..., arbitrary ... [and] based on a lot of xenophobia ... and ... resulted in sizable large-scale violations of people’s rights, including the forced deportation of U.S. citizens.”

There were so many immigrant kidnappings, as I describe them, that The Conversation (which is a global network of newsrooms founded by British-Australian journalist Andrew Jaspan) reported that the Border Patrol was “converting public parks into concentration camps to detain at least 1,000 people at a time.”

These types of mass deportations actually began much earlier. During the 1930s, as uncovered by historian Francisco Balderrama, “The United States deported over one million Mexicans ..., 60 percent of whom were U.S. citizens of Mexican descent.”

Yesterday’s 1954 immigration policy oozed from the same sewer that today’s 2018 immigration policy oozes: the racism sewer. It was a political response to white brainless KKK-type voters who illogically claimed that the backbreaking and labor-intensive farming jobs they didn’t want were being taken by Brown people. Wait ... What?!

I should note that in 2015, then-candidate Trump endorsed Operation Wetback in a campaign speech wherein he proclaimed, “I like Ike ... [who] moved a million-and-a-half illegal immigrants out of this country. ... Moved them way south. They never came back.”

I should also note that the Trump-like cretin who served as Southwest Border Patrol Chief during that operation was Harlon Carter. He’s the guy who had been convicted at age 17 of hunting down and murdering Mexican-American Ramon Casiano in Laredo, Texas in 1931 and later, years after his conviction was overturned on a (racist) technicality, became chief executive officer of the National Rifle Association. Anyone surprised? Nope.

You’ve just read about the old problem and the new problem. But what about the solution? It’s very simple: Abolish not only I.C.E. (Immigration and Customs Enforcement) but also any federal policy that denies people fleeing violence, persecution, or deadly poverty the right to due process proceedings — proceedings that include court-appointed lawyers and a “preponderance of the evidence” burden placed on the government to disprove such claims of violence, persecution, or deadly poverty.

Oh, and by the way, when is any member of Congress gonna ask the indigenous Red people here what they think about the so-called (and misnamed) New World’s immigration policy since 1492 and America’s immigration policy since 1776? Red land reparations anyone?

A SHORT HISTORY OF AMERICANS PRAISING THEIR OWN “GENEROSITY,” FROM THE GENOCIDE OF NATIVE AMERICANS TO TRUMP’S CHILD SNATCHERS

Jon Schwarz - the intercept - July 12, 2018, 8:14 a.m.

EARLIER THIS WEEK, Health and Human Services Secretary Alex Azar celebrated the Trump administration for its treatment of immigrant children it has separated from their parents. “We have nothing to hide about how we operate these facilities,” said Azar on CNN. “It is one of the great acts of American generosity and charity, what we are doing for these unaccompanied kids.”

This magnanimous claim raises an obvious question: What are the other great acts of American generosity?

Of course, we know that the U.S. treatment of Native Americans has been extraordinarily generous. They were literally asking for it since before there was an America: The seal of the Massachusetts Bay Colony for most of the 17th century was an Indian saying, “Come over and help us.”

So as President Andrew Jackson explained to Congress in 1829 when making the case for the Indian Removal Act, “Toward the aborigines of the country no one can indulge a more friendly feeling than myself. … Rightly considered, the policy of the General Government toward the red man is not only liberal, but generous.” The passage of the act the next year generously allowed the Cherokee to experience the Trail of Tears.

Sixty years later, in “The Winning of the West,” future President Teddy Roosevelt remained impressed by this example of American generosity. “In [our] treaties,” he wrote, “we have been more than just to the Indians; we have been abundantly generous. … No other conquering and colonizing nation has ever treated the original savage owners of the soil with such generosity as has the United States.”

Slavery, too, was an act of generosity. As Thomas Roderick Dew, who went on to become president of William & Mary College, put it in the famed 1832 treatise, “The Pro-Slavery Argument,” slaveowners were among the “most generous” Americans. Moreover, a slaveholder’s son, precisely because he witnessed his father enslaving others, “acquires a greater generosity and elevation of soul, and embraces for the sphere of his generous actions a much wider field.”

The Vietnam War was another high point in U.S. generosity. David Lawrence, then editor of U.S. News & World Report, proclaimed in 1966 that “what the United States is doing in Vietnam is the most significant example of philanthropy extended by one people to another that we have witnessed in our times.”

More recently, we generously helped three prisoners at Guantánamo Bay kill themselves. “The manipulative detainees,” wrote Michelle Malkin soon afterward, “reportedly used the generous civil liberties protections we gave them to plot their suicide pact.”

So we clearly have a long history of volubly praising ourselves. But we should be modest enough to realize that we have never matched the moral heights of the most generous man in history: Adolf Hitler.

Just before the German invasion of Poland in 1939, the British ambassador to Germany wrote home to explain how frustrated Hitler was that he was not receiving credit for his generosity:

Herr Hitler replied that he would be willing to negotiate, if there was a Polish Government which was prepared to be reasonable. … He expatiated on misdoings of the Poles, referred to his generous offer of March last, said that it could not be repeated.

We shouldn’t feel too bad about the Poles’ ingratitude, however. Hitler had an advantage, because he was leading the Germans, who are naturally generous to a fault. Joseph Goebbels explained this in a generous 1941 article, titled “The Jews Are Guilty!”:

If we Germans have a fateful flaw in our national character, it is forgetfulness. This failing speaks well of our human decency and generosity, but not always for our political wisdom or intelligence. We think everyone else is as good natured as we are.

At this point, all we can do is pray that someday we will find it in our hearts to be as generous as 1940s-era Germans. Overall, we’re still a long way from that achievement, but certainly there seem to be some pioneers among us right now who are getting close.

american corporations have never been PATRIOTIC!!!

American supporters of the European Fascists

A number of prominent and wealthy American businessmen helped to support fascist regimes in Europe from the 1920s through the 1940s. These men supported Francisco Franco during the Spanish Civil War, which began in 1936, as well as Benito Mussolini and Adolf Hitler.

Some of the primary and more famous Americans and companies that were involved with the fascist regimes of Europe are: William Randolph Hearst, Joseph Kennedy (JFK's father), Charles Lindbergh, John Rockefeller, Andrew Mellon (head of Alcoa, banker, and Secretary of Treasury), DuPont, General Motors, Standard Oil (now Exxon), Ford, ITT, Allen Dulles (later head of the CIA), Prescott Bush, National City Bank, and General Electric.

It should be noted that businessmen from many countries, including England and Australia, also worked with the fascist regimes of Europe prior to WWII. The fascist governments were involved in a high level of construction, production, and international business.

I.G. Farben, a German company, was the largest chemical manufacturing enterprise in the world during the early part of the 20th century. As such the company had many holdings in a variety of countries, including America. The American holdings of I.G. Farben included Bayer Co., General Aniline Works, Agfa Ansco, and Winthrop Chemical Company.

I.G. Farben was critical in the development of the German economy and war machine leading up to WWII. During this time I.G. Farben's international holdings, along with its international business contracts with companies like Standard Oil, DuPont, Alcoa, and Dow Chemical, were crucial in supplying the Nazi regime with the materials needed for war as well as financial support.

The Spanish Civil War was the precursor to World War II. Fascist Francisco Franco was aided by Hitler and Mussolini during the Spanish Civil War. At this time GM, Ford, DuPont, and Standard Oil were working with Franco and supplying the fascist powers of Europe. At the same time, many Americans were protesting the goings-on in Europe as well as the involvement of American companies in helping the fascist powers. A group of American volunteer soldiers known as the Abraham Lincoln Brigade went to Spain during this time to fight against Franco in defense of the Spanish Republic. This group was made up primarily of leftist Americans, such as members of American socialist and communist parties.

The success of the fascists in Spain was an important first step in the building of fascist power in Europe and the stepping-stone for the Italian and German powers.

The support of American corporations, and lack of American intervention by the government, was crucial in the success of this first step.

American banks and businesses continued to support the fascist regimes of Europe legally up until the day Germany declared war on America, when the activities were stopped under the Trading with the Enemy Act. Despite this, some companies and individuals maintained business relationships with the Third Reich. Ford and GM supplied European fascists with trucks and equipment and invested money in I.G. Farben plants. Standard Oil supplied the fascists with fuel. US Steel and Alcoa supplied them with critically needed metals. American banks gave them billions of dollars' worth of loans.

The following is excerpted from a report printed by the United States Senate Committee on the Judiciary in 1974:

The activities of General Motors, Ford and Chrysler prior to and during World War II...are instructive. At that time, these three firms dominated motor vehicle production in both the United States and Germany. Due to its mass production capabilities, automobile manufacturing is one of the most crucial industries with respect to national defense. As a result, these firms retained the economic and political power to affect the shape of governmental relations both within and between these nations in a manner which maximized corporate global profits. In short, they were private governments unaccountable to the citizens of any country yet possessing tremendous influence over the course of war and peace in the world. The substantial contribution of these firms to the American war effort in terms of tanks, aircraft components, and other military equipment is widely acknowledged. Less well known are the simultaneous contributions of their foreign subsidiaries to the Axis Powers. In sum, they maximized profits by supplying both sides with the materiel needed to conduct the war.

During the 1920's and 1930's, the Big Three automakers undertook an extensive program of multinational expansion...By the mid-1930's, these three American companies owned automotive subsidiaries throughout Europe and the Far East; many of their largest facilities were located in the politically sensitive nations of Germany, Poland, Rumania, Austria, Hungary, Latvia, and Japan...Due to their concentrated economic power over motor vehicle production in both Allied and Axis territories, the Big Three inevitably became major factors in the preparations and progress of the war. In Germany, for example, General Motors and Ford became an integral part of the Nazi war efforts. GM's plants in Germany built thousands of bomber and jet fighter propulsion systems for the Luftwaffe at the same time that its American plants produced aircraft engines for the U.S. Army Air Corps....

Ford was also active in Nazi Germany's prewar preparations. In 1938, for instance, it opened a truck assembly plant in Berlin whose "real purpose," according to U.S. Army Intelligence, was producing "troop transport-type" vehicles for the Wehrmacht. That year Ford's chief executive received the Nazi German Eagle (first class)....

The outbreak of war in September 1939 resulted inevitably in the full conversion by GM and Ford of their Axis plants to the production of military aircraft and trucks.... On the ground, GM and Ford subsidiaries built nearly 90 percent of the armored "mule" 3-ton half-tracks and more than 70 percent of the Reich's medium and heavy-duty trucks. These vehicles, according to American intelligence reports, served as "the backbone of the German Army transportation system."....

After the cessation of hostilities, GM and Ford demanded reparations from the U.S. Government for wartime damages sustained by their Axis facilities as a result of Allied bombing... Ford received a little less than $1 million, primarily as a result of damages sustained by its military truck complex at Cologne...

Due to their multinational dominance of motor vehicle production, GM and Ford became principal suppliers for the forces of fascism as well as for the forces of democracy. It may, of course, be argued that participating in both sides of an international conflict, like the common corporate practice of investing in both political parties before an election, is an appropriate corporate activity. Had the Nazis won, General Motors and Ford would have appeared impeccably Nazi; as Hitler lost, these companies were able to re-emerge impeccably American. In either case, the viability of these corporations and the interests of their respective stockholders would have been preserved.

Behind the Criminal Immigration Law: Eugenics and White Supremacy

The history of the statute that can make it a felony to illegally enter the country involves some dark corners of U.S. history.

by Ian MacDougall - pro publica - June 19, 2018, 8:15 p.m. EDT

...The federal law they say they are enforcing makes it a crime for foreign citizens to cross (or attempt to cross) the border into the U.S. anywhere other than an official port of entry. A first offense is a misdemeanor; a second unlawful entry is a felony.

The result was floods of immigrants: Between 1901 and 1910, for example, close to 9 million came to the U.S. As that happened, anti-immigrant attitudes mounted, with mass influxes from parts of Europe associated in the popular imagination with a litany of social problems, like urban poverty and squalor.

In May 1918, after the U.S. had entered World War I, Congress passed a statute called the Passport Act that gave the president the power to restrict the comings and goings of foreign citizens during wartime. A few months later, however, the war ended — and with it, the restrictions on border crossings.

Federal officials saw the criminal provisions of the Passport Act — a maximum 20-year sentence — as a tool for deterring immigration. So prosecutors ignored the expiration of the law and continued to indict migrants under the Passport Act for unlawful entry into the U.S.

Anti-immigration sentiment continued to climb and the rhetoric of the era has resonance today. One anti-immigration group at the time claimed that immigrants tended to be “vicious and criminal” — the “bootleggers, gangsters, and racketeers of large cities.” The war, Columbia University historian Mae Ngai has written, “raised nationalism and anti-foreign sentiment to a high pitch.”

In response, Congress began clamping down. With the Immigration Act of 1924, it capped the flow at about 165,000 people a year, a small fraction of previous levels. The statute’s quotas severely curtailed migration from southern and eastern Europe. Another 1924 law — the Oriental Exclusion Act — banned most immigration from Asia. At the same time, Congress made it easier to deport non-citizens for immigration violations.

In 1925, a federal appeals court put a halt to the practice of indicting migrants under the Passport Act outside wartime. But immigration officials liked what they’d seen, and by 1927, they were working on a replacement.

Two men spearheaded the effort that would lead Congress to criminalize unlawful entry into the United States. They were motivated by eugenics and white supremacy.

The first was James Davis, who was Secretary of Labor from 1921 to 1930. A Republican originally appointed by President Warren Harding, Davis was himself an immigrant from Wales who went by “Puddler Jim,” a reference to his job as a youthful worker in the steel mills of western Pennsylvania. At the time, the Department of Labor oversaw immigration, and Davis had grown disturbed by what he’d seen.

Davis was a committed eugenicist, and he believed principles of eugenics should guide immigration policy, according to The Bully Pulpit and the Melting Pot by the historian Hans Vought. It was necessary to draw a distinction, Davis had written in 1923, between “bad stock and good stock, weak blood and strong blood, sound heredity and sickly human stuff.”

In November 1927, Davis proposed a set of immigration reforms in the pages of The New York Times. Among his goals: “the definite lessening and possibly, in time, the complete checking of the degenerate and the bearer of degenerates.” One “phase of the immigration problem,” Davis wrote, was the “surreptitious entry of aliens” into the United States in numbers that “cannot even be approximately estimated.”

Deportation alone wasn’t enough to deter illegal immigration, Davis wrote. There was nothing disincentivizing the migrant from turning around and trying again. “Endeavoring to stop this law violation” by deportation only, he wrote, “is like trying to prevent burglary with a penalty no severer than opening the front door of the burglarized residence, should the burglar be found within, escorting him to it, and saying ‘You have no right here; see that you don’t come in again.’”

An immigrant who enters the country unlawfully, he concluded, “should be treated as a law violator and punished effectively.”

To bring his vision to fruition, Davis teamed up with a senator from South Carolina. Coleman Livingston Blease, a Democrat, was “a proud and unreconstructed white supremacist,” UCLA history professor Kelly Lytle Hernández wrote in her 2017 book City of Inmates.

Migrants from Mexico were one group whose numbers the increasingly powerful nativist elements in Congress hadn’t managed to restrict. Mexican workers were key to the booming economy of the southwest. Regional employers, particularly in the agricultural sector, had successfully lobbied Congress to block any bill that would choke off their primary source of inexpensive labor. As a result, migration from Mexico soared, with many Mexicans making illegal border crossings to avoid the cost and inconvenience of customs stations.

Blease saw in Davis’s proposal for criminal penalties a way to advance his vision of a white America, and he believed it would bridge the gap between the nativists clamoring for quotas and southwestern congressmen resisting them. Large-scale farmers didn’t mind criminal penalties, Hernández writes, so long as the law was enforced once the harvest was over.

The legislation wasn’t without its opponents, as the UCLA law professor Ingrid Eagly documented in a 2010 study of immigration prosecutions. Groups like the American Civil Liberties Union opposed the bill. The ACLU felt it was unfair and unlikely to deter migration. An immigrant “may be quite ignorant of this law before he starts on his journey,” the group told Congress.

Despite the ACLU’s objections, a Republican-controlled Congress passed Davis and Blease’s bill in 1929. A Republican president, Herbert Hoover, signed it into law.

The law made it a crime to enter the United States unlawfully and, in so doing, “created the criminalization of the border,” Eagly said.

The statute was swiftly put to use. Between July 1929 and June 1930, according to a Department of Labor report, prosecutors brought more than 6,000 unlawful entry cases. “It is believed that it will prove an effective deterrent,” the report’s author wrote. (In his recent memo, Sessions made similar claims about the Trump administration’s zero-tolerance policy.)

But the law didn’t reduce migration. By 1933, the Labor Department concluded that its rosy outlook had been wrong. The 1929 law “does not seem to have the deterrent effect expected,” noted a Labor Department report published that year. It blamed budget limitations and judges wary of meting out serious sentences if a defendant was going to be deported anyway.

In the 1930s, the Great Depression achieved what prosecutions and deportations had not. Immigration plunged as the labor market in the United States dried up. Prosecutions for unlawful entry dropped to about 5,000 a year, according to a 2012 examination of the law by Doug Keller in the Loyola University Chicago Law Journal.

A shortage of labor during World War II prompted the U.S. to reverse course and encourage migration of temporary workers from Mexico through what it called the Bracero program. (The word refers to manual laborers in Spanish.)

Despite the earlier lessons, federal prosecutors began to focus their attention on bringing unlawful entry cases against Mexican migrants to deter workers from going around the Bracero program. By 1951, there were 15,000 illegal entry and re-entry prosecutions a year.

At the same time, Congress was working to overhaul American immigration law. The effort was spearheaded by two Democrats: Sen. Patrick McCarran and Rep. Francis Walter. Both were staunch anti-Communists who saw immigration — particularly from Eastern Europe and Asia — as posing a risk that Soviet or Maoist agents would infiltrate the country.

Their law is best known for preserving a quota system that meant about 85 percent of immigration visas annually went to people from northern and western Europe. But it also made a crucial change in the unlawful entry law.

In a counterintuitive move, Congress decided to reduce the penalties for unlawful entry — to a maximum of six months in prison. (It also added a felony provision for any additional illegal entry convictions.)

The change wasn’t driven by compassion or a shift away from criminalizing unlawful immigration. Rather, it anticipated the creation of federal magistrate courts that would handle the cases, according to Eagly, the UCLA law professor. A defendant facing a misdemeanor charge punishable by six months or less generally doesn’t have a right to a grand jury indictment or a jury trial. Once Congress established federal magistrate courts, prosecutors could bring criminal charges against far larger numbers of defendants.

A Democratic-controlled Congress passed the law in 1952, but it was vetoed by President Harry Truman. His veto message decried “carrying over into this year of 1952 the isolationist limitations of our 1924 law.” Congress was unmoved and overrode his veto. (In this sense, Trump is correct that Democrats bear some responsibility for the unlawful entry law that underlies his administration’s new immigration policy.)

The unlawful entry statute has remained largely unchanged since 1952. In 1968, however, Congress finally passed a law establishing federal magistrate courts, allowing for a major expansion of charges under the unlawful entry law. Without the need to go through the grand jury process or deal with potential jury trials, immigration prosecutions — almost all for unlawful entry — shot up, Eagly found in her 2010 study: from 2,536 cases nationwide in 1968 to 17,858 in 1974.

The trend culminated in programs like Operation Streamline during the George W. Bush administration, in which magistrate judges along the border took simultaneous mass guilty pleas for unlawful entry. (An appeals court ended the practice in 2009.)

The use of the law hasn’t been a partisan matter. The number of such cases spiked to nearly 50,000 in the last year of the Bush administration, and it stayed in that range for most of the Obama administration, according to federal government data maintained by the Transactional Records Access Clearinghouse at Syracuse University. By 2016, the number had fallen to about 35,000 — still higher than all but the last year of the Bush administration.

But the number of unlawful entry cases fell, the TRAC data shows, during Trump’s first year in office, to 27,000. (It had begun to rise again in recent months, however, even before Sessions announced the administration’s “zero-tolerance” policy.)

Convictions for immigration crimes now account for more than half of all federal criminal convictions.

Hawaii’s fight against Trump’s Muslim travel ban has long roots of resistance

As SCOTUS prepares to rule on Trump v. Hawaii, a reminder that Hawaii stood up for Japanese Americans in WWII

Rahna Reiko Rizzuto, Salon, June 16, 2018, 5:00PM (UTC)

In 1945, my great uncle died for his country, one of 400,000 American soldiers who gave their lives during World War II. He was a member of the most decorated unit in the history of American warfare, one of the “little brown soldiers” that saved the Lost Battalion from Texas in the Vosges Mountains. Four casualties for every man saved. My great uncle, Robert Ozaki, shows up in written accounts of that battle, leading a bayonet charge when his lieutenant disappeared and was thought captured. He arrived at the hospital in Colorado with shrapnel in his back, and our family story has it that he kept shaking down his thermometer so that the doctors could attend to other soldiers.

Your hero? He served fiercely and with honor, and died in that hospital: a recipient of a Silver Star and a Purple Heart.

Your enemy? His own government branded him class 4-C, an “enemy alien,” and would not let his family attend his funeral.

Unless you are in the Marvel Comic universe, it’s hard to be both. But my Silver Star great uncle was Japanese-American. The decades leading up to the war were a time of virulent hatred for the Japanese, with terms like “inscrutable,” “repulsive” and “the yellow peril” thrown around freely. Racism was codified and supported by the president, Congress, the courts and local government, and urged on in headlines in the media. Robert Ozaki would have remained a “menace,” if it were not for Hawaii. And this month, as we await the ruling of the Supreme Court on Hawaii’s challenge to Donald Trump’s travel ban, it is worth remembering that this is not the first time that Hawaii stood up to the overt racist policies of the U.S. government and said no.

In what world are infants and old ladies a threat? In a world where anti-Japanese (and “Asiatic”) sentiment already had a long and ugly history. The Japanese were first imported as cheap labor; then immigration from Japan was banned completely. Laws had already been passed to ensure that the Japanese could never become citizens or own property. But their children were American citizens, and as they began owning farms and businesses, hysteria grew. The propaganda machine (the fake news of the 1940s) taught Americans that “Japs” were snakes, beasts, who would marry your daughters, rape the world and steal your stuff; they were to be slapped, smacked, banished and exterminated. American citizenship, hard work, community service and a clean record did not help those individuals then, just as law-abiding immigrants are not safe now. The messages were violent and they were everywhere.

It was a racism rooted in greed: Within a few decades, Japanese-American farms on the West Coast were seven times more profitable than the average. Japanese Americans controlled two-thirds of the Los Angeles flower market, and were projected to produce 40 percent of the produce needed for the war effort. In giving them a week to dispose of everything they owned, and holding them prisoner in camps where they could not make enough money to pay the taxes on any properties they still owned, the evacuation effectively stripped them of everything.

The mission that was accomplished by Roosevelt’s Executive Order was not safety for America. Despite the excuse of national security, there was not one single case of espionage during the war. The result was the successful cleansing of the West Coast of all persons of Japanese ancestry, and the transfer of between $150 million and $400 million of assets back into Caucasian hands.

In the territory of Hawaii, however, events spun out differently, with history-making results. There, martial law was also declared, with similar exclusion orders. However, the commanding general, Lt. Gen. Delos Emmons, refused to evacuate the Japanese Americans, who made up 37 percent of the population and a significant portion of the economy. Emmons flipped the script, arguing that it was better for the overall economy to leave them free. He refuted the rumors, false claims of espionage and the violently anti-Japanese sentiment that was fueling calls for exclusion. Instead, he chose to do something radical: to treat the Japanese Americans as lawful, loyal citizens, and trust them. He even gave them back their guns.

After Pearl Harbor was bombed, Japanese Americans serving in the Hawaii Territorial Guard were discharged at first, but petitioned to continue to serve. Emmons eventually placed them into a lone battalion, the 100th, or One-Puka-Puka. Some 10,000 Japanese American men living in Hawaii volunteered to enlist. Their fierce dedication altered the face of the war for the Japanese Americans. Pressured to find a home for the battalion, the U.S. government began to reconsider their status. The War Department asked for volunteers from behind barbed wire, and eventually began drafting men out of the camps to create the all-Japanese American 442nd Regimental Combat Team that would include the 100th Battalion. The 442nd proved that they were not snakes by earning more than 18,000 military awards among 14,000 soldiers, including 9,486 Purple Hearts, 560 Silver Stars and 21 Medals of Honor.

Today, 75 years later, racism is still rampant, and still a smokescreen for greed. All the horrifying treatment of humans that is playing out in our daily newsfeeds — within our own country and at our borders — is based on the same triggers, and the same arguments. Today’s monsters are still people of color, immigrants, people who don’t speak our language. They are still born from our worries about our safety and our fears that there is a lack of jobs and money and that there is not enough for us. As we twist ourselves in knots to erase or justify our actions (turning off body cameras, claiming to be protecting child refugees while we build new for-profit prisons for their parents), it is worth remembering that our safety does not come from threatening the safety of others. Quite the opposite. Our fears imprison us all. Racism is taught; it is deliberate. And until we can see through the lie that we are each other’s enemies, we cannot follow the money and the power to understand who our teachers are.

In 18 long months, the Trump administration has distinguished itself by its many racist and discriminatory policies and executive orders. The actions of its agencies are routinely being challenged in court. Though racism is hardly new in our country (the Japanese American incarceration being just one small example), it is clearly blossoming, thanks to the propaganda that is, once again, infusing the media and every branch of government, and coming from the top. The argument that this revised travel ban is not racist is bogus. It is worth remembering that Executive Order 9066 did not mention ethnicity or race but was to apply to “any or all persons.” Both were justified on grounds of national defense. Just as Roosevelt’s order was a tool for racism, this administration’s actions and words make it clear that the travel ban will be another tool in our growing arsenal against people who are “other,” who we are being told are threatening our safety and our stuff.

In the 1940s, the Supreme Court rejected the first three challenges to the incarceration, before finally ruling that the government had no legal right to imprison a loyal citizen. The damage was already done. In 1988, when Ronald Reagan signed the Civil Liberties Act apologizing for the incarceration, he repeatedly mentioned the bravery of the Japanese American soldiers as proof that the incarceration was a “mistake” and one “based solely on race.”

These reversals may not have been possible if Hawaii had buckled under and followed a different path. Our safety will not be gained in lawsuits. Justice may not be supreme. We must all find a way to question the propaganda and the policies that have been designed to separate us and to see each other as human. If we need assurance that our enemy may indeed be our hero, the all-Japanese American 442nd Regimental Combat Team is a potent reminder that beneath the different skin and eyes of “the other” may beat 9,486 Purple Hearts.

America’s segregated shores: beaches' long history as a racial battleground

For decades officials imposed regulations to restrict African Americans’ use of public beaches – and the fight for equal access is far from over

Andrew W. Kahrl, The Guardian, Tue 12 Jun 2018 06.00 EDT

Children from Hartford’s North End play at the private Madison Beach Club, exercising their legal right to the wet sand portion of the shore. Photograph: Bob Adelman

Summer has arrived, which, for many Americans, means day camps for children, afternoons lounging by a pool, and weekend trips to the beach. But for many others, summer brings new burdens, frustrations and fears: the end of free or reduced meals for children at schools, the added cost of childcare and the search – often in vain – for safe, affordable and accessible places to play and cool off on a hot day.

Summers have long been America’s most segregated season. Nowhere is this more evident than along the nation’s beaches and coasts, one of the chief destinations for vacationers and pleasure seekers, and a perennial site of racial conflict and violence. The infamous 1919 Chicago race riot, which lasted seven days and claimed 38 lives, began on the shores of Lake Michigan, when white youth gang members stoned to death a black teenager named Eugene Williams after he had accidentally drifted across a color line in the water. In its aftermath, African Americans learned to avoid the city’s lakefront. As a child, black Chicagoan Dempsey Travis remembers, “I was never permitted to learn to swim. For six years, we lived within two blocks of the lake, but that did not change [my parents’] attitude. To Dad and Mama, the blue lake always had a tinge of red from the blood of that young black boy.”

In the decades that followed, local governments across the US enacted a host of policies and practices designed to segregate places of outdoor leisure by race and effectively exclude people of color from public beaches. In the south, those methods were quite explicit. Coastal cities such as Norfolk, Virginia, Charleston, South Carolina, and Miami, Florida, prohibited African Americans from setting foot on any of their public beaches, and for years ignored blacks’ demands for public beaches of their own. Whites’ indifference to the health and humanity of black communities often had deadly consequences. Throughout the Jim Crow era, shockingly high numbers of black youth drowned each summer while playing in dangerous, and unsupervised, bodies of water. When white officials did respond to black demands for beaches and parks of their own, they invariably selected remote, polluted, often hazardous, locations. In Washington DC, officials designated Buzzard’s Point, a former dumping ground located downstream from a sewage plant, as an appropriate location for the city’s “colored” bathing beach. In New Orleans, it was a remote site on Lake Pontchartrain, 14 miles from downtown, surrounded on both sides by fishing camps that dumped raw sewage into the lake. One health official described the waters offshore as “grossly contaminated” and wholly unfit for bathing.

In the north, whites employed more subtle, but no less effective, methods of segregation. Predominantly white suburbs and towns in the north-east, for example, designated their public beaches for residents only, or charged exorbitant access fees for non-residents, or barred non-residents from parking near the shore, all designed to keep minority populations in neighboring cities out. City officials, meanwhile, failed to provide black neighborhoods with safe and decent places of public recreation and deliberately made beaches and pools frequented by middle-class whites inaccessible to the poor and people of color.

Here, too, whites’ determined efforts to keep black people out of their pools and off their beaches cost black children their lives. On a hot summer day in June 1968, teenagers Howard Martin and Lemark Hicks left their families’ low-income public housing units in Port Chester, New York, and went in search of a place to cool off. Hours later, scuba divers pulled their lifeless bodies from the depths of the Byram river, where the two African American boys had drowned while swimming, unsupervised, in the river’s dangerous currents. While the boys screamed for help, less than a mile away lifeguards kept a watchful eye over children playing in the surf at Byram Beach, one of three public beaches in the neighboring town of Greenwich, Connecticut. But despite their close proximity, these beaches, and the safety they afforded bathers, were not an option for Martin, Hicks, and all other black children living in Port Chester. They were for Greenwich residents only.

Such senseless tragedies fueled black unrest and played no small role in sparking urban uprisings during the long, hot summers of the 1960s. In 1968, public housing residents in Hartford, Connecticut, staged a series of protests following the drowning deaths of several children along a dangerous section of a river that snaked through their housing project. City officials had repeatedly ignored parents’ demands to fence off the area or, better yet, provide the neighborhood with a public swimming pool, and instead scolded parents for not keeping a better watch over their children. “This is what causes riots,” protest leader Barbara Henderson said in response. The Kerner commission concurred. In its 1968 report on the “riots” that had engulfed urban black America in previous summers, it listed “poor recreation facilities and programs” as the fifth most intense grievance of black populations in riot-torn cities, just behind policing practices, unemployment and underemployment, housing and education. In response, cities hastily built above-ground swimming pools and placed sprinklers on fire hydrants in black neighborhoods.

But aside from these modest gestures, little was done to address the underlying causes of summertime segregation and recreational inequality. In recent decades, fiscally distressed cities have slashed funding for outdoor recreation programs for disadvantaged children, and closed or sold off public parks, beaches and swimming pools in poorer neighborhoods, while affluent communities continue to employ the same tactics for keeping “undesirables” out of their parks and off their shores. Earlier this spring, officials in Westport, Connecticut, dramatically increased parking fees and slashed the number of passes sold to non-residents at its public beach. The move came after locals complained about the growing numbers of outsiders there the previous summer. In the exclusive community of Palos Verdes Estates, California, a gang of wealthy local whites (known as the “Lunada Bay Boys”) has been waging a decades-long campaign of terror against non-residents, especially African Americans, who seek access to the town’s public beach. Local residents have subjected visitors to beatings and assaults, racist epithets, sexual harassment, dog attacks, death threats, property destruction and vandalism, all with the tacit approval of local law enforcement. Officials in this and other affluent beachfront communities in Los Angeles, meanwhile, have for years thwarted attempts by the city’s regional transit authority to offer direct bus routes from black and brown inner-city neighborhoods to the beach. As a result, it is common to find black children living in Los Angeles who have never even seen the Pacific Ocean, much less spent a day on its shores.

Like schools and neighborhoods, the persistence of racial separatism in places of play didn’t just happen by chance. Nor does it, as some might claim, simply reflect people’s personal preferences. It is the result of public policies and private actions that, by design, aimed to segregate bodies of water by race and allow whites to claim the most desirable outdoor spaces to themselves. Many of these policies and practices remain in effect today. Undoing them is critical to making public space in America truly public, and to ensuring that all Americans enjoy the basic human right to leisure and recreation.

Colin Kaepernick Is Righter Than You Know: The National Anthem Is a Celebration of Slavery

Jon Schwarz, The Intercept, August 28, 2016, 12:08 p.m.

Before a preseason game on Friday, San Francisco 49ers quarterback Colin Kaepernick refused to stand for the playing of “The Star-Spangled Banner.” When he explained why, he only spoke about the present: “I am not going to stand up to show pride in a flag for a country that oppresses black people and people of color. … There are bodies in the street and people getting paid leave and getting away with murder.”

Almost no one seems to be aware that even if the U.S. were a perfect country today, it would be bizarre to expect African-American players to stand for “The Star-Spangled Banner.” Why? Because it literally celebrates the murder of African-Americans.

Few people know this because we only ever sing the first verse. But read the end of the third verse and you’ll see why “The Star-Spangled Banner” is not just a musical atrocity, it’s an intellectual and moral one, too:

No refuge could save the hireling and slave
From the terror of flight or the gloom of the grave,
And the star-spangled banner in triumph doth wave
O’er the land of the free and the home of the brave.

“The Star-Spangled Banner,” Americans hazily remember, was written by Francis Scott Key about the Battle of Fort McHenry in Baltimore during the War of 1812. But we don’t ever talk about how the War of 1812 was a war of aggression that began with an attempt by the U.S. to grab Canada from the British Empire.

However, we’d wildly overestimated the strength of the U.S. military. By the time of the Battle of Fort McHenry in 1814, the British had counterattacked and overrun Washington, D.C., setting fire to the White House.

And one of the key tactics behind the British military’s success was its active recruitment of American slaves. As a detailed 2014 article in Harper’s explains, the orders given to the Royal Navy’s Admiral Sir George Cockburn read:

Let the landings you make be more for the protection of the desertion of the Black Population than with a view to any other advantage. … The great point to be attained is the cordial Support of the Black population. With them properly armed & backed with 20,000 British Troops, Mr. Madison will be hurled from his throne.

Whole families found their way to the ships of the British, who accepted everyone and pledged no one would be given back to their “owners.” Adult men were trained to create a regiment called the Colonial Marines, who participated in many of the most important battles, including the August 1814 raid on Washington.

Then on the night of September 13, 1814, the British bombarded Fort McHenry. Key, seeing the fort’s flag the next morning, was inspired to write the lyrics for “The Star-Spangled Banner.”

So when Key penned “No refuge could save the hireling and slave / From the terror of flight or the gloom of the grave,” he was taking great satisfaction in the death of slaves who’d freed themselves. His perspective may have been affected by the fact he owned several slaves himself.

With that in mind, think again about the next two lines: “And the star-spangled banner in triumph doth wave / O’er the land of the free and the home of the brave.”

The reality is that there were human beings fighting for freedom with incredible bravery during the War of 1812. However, “The Star-Spangled Banner” glorifies America’s “triumph” over them — and then turns that reality completely upside down, transforming their killers into the courageous freedom fighters.

After the U.S. and the British signed a peace treaty at the end of 1814, the U.S. government demanded the return of American “property,” which by that point numbered about 6,000 people. The British refused. Most of the 6,000 eventually settled in Canada, with some going to Trinidad, where their descendants are still known as “Merikins.”

Furthermore, if those leading the backlash against Kaepernick need more inspiration, they can get it from Francis Scott Key’s later life.

By 1833, Key was a district attorney for Washington, D.C. As described in a book called Snowstorm in August by former Washington Post reporter Jefferson Morley, the police were notorious thieves, frequently stealing free blacks’ possessions with impunity. One night, one of the constables tried to attack a woman who escaped and ran away — until she fell off a bridge across the Potomac and drowned.

“There is neither mercy nor justice for colored people in this district,” an abolitionist paper wrote. “No fuss or stir was made about it. She was got out of the river, and was buried, and there the matter ended.”

Key was furious and indicted the newspaper for intending “to injure, oppress, aggrieve & vilify the good name, fame, credit & reputation of the Magistrates & constables of Washington County.”

You can decide for yourself whether there’s some connection between what happened 200 years ago and what Colin Kaepernick is angry about today. Maybe it’s all ancient, meaningless history. Or maybe it’s not, and Kaepernick is right, and we really need a new national anthem.

The U.S. Supreme Court will soon hand down its decision in Janus v. AFSCME Council 31, which challenges the ability of public sector unions to collect “fair share” fees from workers who are covered by a negotiated union contract but don’t want to join the union. While the case may seem technocratic, its argument is one thread of a well-worn tapestry by conservatives: attacking union rights to thwart working-class solidarity, especially across racial and ethnic lines.

At the heart of the case is what are deceptively known as “right-to-work” laws, which were conceived with the sole intention of maintaining racial wage hierarchies in the Jim Crow South as part of a larger conservative backlash to the success of union organizing in the years immediately following the passage of the National Labor Relations Act (also known as the Wagner Act) in 1935. We have to go that far back because what Congress did in that year, as long ago as it seems, greatly constrains working-class power today.

The Wagner Act essentially legalized the rights of employees to organize in unions and developed the process of union certification through a new agency created under the law, the National Labor Relations Board. It’s hard to overstate the radical nature of this law at the time it was enacted — and how surprising it was that the law was upheld by the Supreme Court. It seemed destined to be overturned, given the Court’s longstanding opposition to government involvement in the economy.

Indeed, most of the big corporations at the time — DuPont, General Motors and Republic Steel — ignored the law under this assumption, carrying on their normal business of fighting union attempts by firing activists, hiring spies, and stocking up on guns and tear gas. They funded the legal challenge to the National Labor Relations Act and a major public relations effort to smear the law in the court of public opinion. But in a 1937 decision in National Labor Relations Board v. Jones & Laughlin Steel Corporation, the Supreme Court declared the Wagner Act constitutional by sustaining Congress’s power to regulate employers under the commerce clause.

Within a year-and-a-half, 3 million new workers voted to be represented by a union in the Congress of Industrial Organizations (CIO). As America entered World War II and demand for machinery, ammunitions and aircraft soared, another 5 million workers voted for a union in just three years, including many women and African-Americans, who had gained new protections under federal contracts related to the war effort.

By the end of the 1940s, nearly one-third of American workers were unionized, winning contracts for better wages, job security and benefits. With unprecedented gains in the North and Midwest, the CIO set its sights on organizing the Jim Crow South. Termed Operation Dixie, the CIO aimed to organize one million Southern white and black workers, provoking the ire of Southern segregationists who rightly worried that working-class solidarity between blacks and whites would uproot the political power structure of the South.

It turned out that the Wagner Act did exactly what it was supposed to do — unleash a great spurt of workplace democracy — and segregationists and business leaders were less than pleased. So Republicans and Southern Democrats in Congress drafted a bill that would amend the Wagner Act by gutting many of the hard-fought labor rights it guaranteed. The bill, passed over President Truman’s veto, is known as Taft-Hartley for its co-sponsors, Senator Robert Taft and Representative Fred Hartley.

Taft-Hartley granted so-called free-speech rights to employers during an NLRB election, giving companies ample time and leeway to spread false and anti-union information to their workers. Employers could now hold mandatory meetings with workers to detail the perils of welcoming a union into the workplace, intimating that their jobs, or indeed the entire factory, might up and disappear. The Taft-Hartley amendments also outlawed industry-wide strikes, secondary boycotts, and sympathy strikes and gave the president more expansive authority to obtain injunctions against strikes if they jeopardized national interests.

Its most ideological mandate was to require all union officers to sign an affidavit saying that they were not members of the Communist Party. At the time, some of the most effective organizers in the labor movement were Communists. After Taft-Hartley was passed, some unions collapsed and others were purged from the CIO or left rather than signing the pledge. As intended, this provision of the law neutered the most radical and effective elements in the labor movement and washed the labor movement free of its most ardent supporters of women’s and civil rights.

Taft-Hartley also allowed states to pass “right-to-work” laws, which gave workers, even in a unionized workplace, the right to refuse to pay fair-share fees. Under these laws, workers can free-ride — enjoy the benefits of representation without having to pay for it — which makes the establishment and sustenance of a new union a much riskier proposition. In states without right-to-work laws, all workers who benefit from a union contract must pay a fair-share fee, even if they decline union membership. This fee basically represents the worker’s portion of the costs of the union’s providing collective bargaining and other benefits. This is a smaller fee than the overall union dues, which cover broader costs, such as political funding and lobbying.

That loss of revenue makes organizing in right-to-work states much more financially precarious and is a major reason that union density in those states is so much lower. In the aftermath of Taft-Hartley, a number of states quickly passed so-called “right-to-work” laws. Ten states, mostly in the South, passed them immediately in 1947, followed by another half dozen or so in the early 1950s.

It’s a technocratic policy with profound consequences, dreamed up by a rather infamous segregationist.

The development and promotion of “right-to-work” laws is largely credited to Vance Muse, a Texas oil industry lobbyist and known white supremacist who warned that without such legislation to impede union organizing, “white women and white men will be forced into organizations with black African apes whom they will have to call ‘brother’ or lose their jobs.” When it came to squashing working-class solidarity, big business and segregationists forged common cause — an alliance in the history books with significant impacts on workers’ rights today.

Right-to-work laws and other anti-union efforts are aimed at consolidating economic and political power for businesses and capital by preventing any whiff of working-class solidarity. Through union dues, the labor movement can amass significant resources to engage in voter turnout, agenda setting and issue advocacy, all on behalf of ordinary Americans. It is that amassing of political power — in addition to the fairer distribution of profits — that is so threatening to conservatives and corporate America. After all, big labor has been responsible for advances in our day-to-day lives that still make conservatives livid: Medicare, Medicaid and, yes, Obamacare too; unemployment insurance; Social Security; the 40-hour workweek; pensions (what’s left of them, anyway); the minimum wage. These are just the greatest hits; many other humane advances in our lives owe their existence to labor unions.

Big business saw the weakening of union rights as the first step in a campaign to bring down the entire New Deal order, and wasn’t shy about saying so. At the end of the war, Alfred Sloan, CEO of General Motors, spoke honestly about his disdain for the New Deal, saying, “It took 14 years to rid this country of Prohibition. It is going to take a good while to rid the country of the New Deal, but sooner or later the ax falls and we get a change.”

Power in America might be thought of as a balance scale: labor on one side, capital on the other. When one side loses political clout, the other side gains it. Today the capital side of the scale overpowers the labor side. And that was no accident.

If the court sides with the plaintiff in Janus, it would essentially nationalize “right-to-work” laws in the public sector. All of us will pay the price — in the form of lower wages, job insecurity, miserly benefits and soaring inequality — so America’s plutocrats can tighten the political and economic vise that leaves most of us struggling.

These same plutocrats will point the finger for our struggles at new immigrants, black people and the poor, in a well-worn narrative that allows some politicians and their corporate donors to tilt the rules in their favor while the rest of us fight among ourselves for the leftovers. But we can change. We can come together across racial and ethnic lines to elect new leaders who will represent all of us, not just the wealthy few. We can start by refusing to fall for the divide-and-conquer politics at issue in the Janus case.

just like trump!!!

(NYT) Evidence Shows That Nixon Betrayed The U.S. In Order To Become President

The Intellectualist, 5/28/18

Evidence shows that President Richard Nixon colluded with the South Vietnamese for the purpose of winning the 1968 election over his Democratic opponent, Hubert Humphrey. By doing so, Mr. Nixon betrayed the United States for his own personal ambitions.

At the time, President Lyndon B. Johnson was negotiating a peace settlement with the North Vietnamese.

Nixon, through a trusted intermediary, contacted the South Vietnamese and promised them a better deal if they refused to work with Johnson. By 1968, 30,000 Americans had already died in America’s war in Vietnam.

The South Vietnamese, apparently believing Nixon’s promises, chose not to cooperate in American peace negotiations, dooming the process. Nearly 60,000 Americans had been killed in Vietnam by the time the U.S. fled in 1975.

Following his resignation, Nixon denied harming peace talks between the Johnson administration and North Vietnam; however, evidence discovered after Mr. Nixon’s death eviscerates these claims of innocence. According to the New York Times:

“Now we know Nixon lied. A newfound cache of notes left by H. R. Haldeman, his closest aide, shows that Nixon directed his campaign’s efforts to scuttle the peace talks, which he feared could give his opponent, Vice President Hubert H. Humphrey, an edge in the 1968 election. On Oct. 22, 1968, he ordered Haldeman to “monkey wrench” the initiative.

Haldeman’s notes return us to the dark side. Amid the reappraisals, we must now weigh apparently criminal behavior that, given the human lives at stake and the decade of carnage that followed in Southeast Asia, may be more reprehensible than anything Nixon did in Watergate.”

In a conversation with the Republican Senator Everett Dirksen, the minority leader, Johnson lashed out at Nixon. “I’m reading their hand, Everett,” Johnson told his old friend. “This is treason.”

“I know,” Dirksen said mournfully.

Johnson’s closest aides urged him to unmask Nixon’s actions. But on a Nov. 4 conference call, they concluded that they could not go public because, among other factors, they lacked the “absolute proof,” as Defense Secretary Clark Clifford put it, of Nixon’s direct involvement.”

The Retired General Who Stopped a Wall Street Coup

General Smedley Butler blew the whistle on a millionaire-led effort to oust FDR and the New Deal.

By Jim Hightower, OtherWords, May 23, 2018

Many Americans would be shocked to learn that political coups are part of our country’s history. Consider the Wall Street Putsch of 1933.

Never heard of it? It was a corporate conspiracy to oust Franklin D. Roosevelt, who had just been elected president.

With the Great Depression raging and millions of families financially devastated, FDR had launched several economic recovery programs to help people get back on their feet. To pay for this crucial effort, he had the audacity to raise taxes on the wealthy, and this enraged a group of Wall Street multimillionaires.

Wailing that their “liberty” to grab as much wealth as possible was being shackled, they accused the president of mounting a “class war.” To pull off their coup, they plotted to enlist a private military force made up of destitute World War I vets who were upset at not receiving promised federal bonus payments.

One of the multimillionaires’ lackeys reached out to a well-respected advocate for veterans: retired Marine general Smedley Darlington Butler. They wanted him to lead 500,000 veterans in a march on Washington to force FDR from the White House.

They chose the wrong general. Butler was a patriot and lifelong soldier for democracy, who, in his later years, became a famous critic of corporate war profiteering.

Butler was repulsed by the hubris and treachery of these Wall Street aristocrats. He reached out to a reporter, and together they gathered proof to take to Congress. A special congressional committee investigated and found Butler’s story “alarmingly true,” leading to public hearings, with Butler giving detailed testimony.

By exposing the traitors, this courageous patriot nipped their coup in the bud. But their sense of entitlement reveals that we must be aware of the concentrated wealth of the imperious rich, for it poses an ever-present danger to majority rule.

The Myth of the Roosevelt “Trustbusters”

Teddy and FDR weren't the anti-corporate crusaders that populists portray them as today.

By Robert D. Atkinson and Michael Lind, The New Republic, May 4, 2018


In the aftermath of the Great Recession, amid growing concerns about income inequality and wage stagnation, politicians and pundits on the left and right have blamed the problems of twenty-first-century America on a familiar populist scapegoat: big business. The solution, they say, can be found in the nation’s past—in particular, the reign of two twentieth-century presidents.

In the early 1900s, the narrative goes, Theodore Roosevelt waged war on corporate concentration as a crusading “trustbuster.” A generation later, during the Great Depression, his cousin Franklin D. Roosevelt stood up for small banks against Wall Street’s big bullies. The Roosevelts saved America from plutocracy and created a golden age for the middle class. Thus, many argue, we need a new generation of trustbusters to save us from the robber barons of tech and banking.

It makes for a compelling case. But it’s based on a false history.

Teddy Roosevelt was far from the business-bashing “trustbuster” of popular memory. The Republican president distinguished between “good” and “bad” trusts, telling Congress in 1905, “I am in no sense hostile to corporations. This is an age of combination, and any effort to prevent combination will not only be useless, but in the end, vicious…”

​It is true that his administration brought 44 antitrust actions against corporations and business combinations, including the Northern Securities railroad company and the “beef trust” in meatpacking, which were ultimately broken up by the Supreme Court. But Roosevelt had profound doubts about antitrust, observing that “a succession of lawsuits is hopeless from the standpoint of working out a permanently satisfactory solution” to the problems posed by big business. Indeed, he wanted to replace antitrust policy with federal regulation of firms by a powerful Bureau of Corporations, whose decisions would be shielded from judicial review.

His Republican successor in the White House, William Howard Taft, initiated twice as many antitrust lawsuits in four years as Roosevelt had done in his seven and a half years in office. Privately, Roosevelt raged when the Supreme Court ordered the break-up of Standard Oil, in an antitrust lawsuit begun under his administration and completed under Taft: “I do not see what good can come from dissolving the Standard Oil Company into 40 separate companies, all of which will still remain really under the same control. What we should have is a much stricter government supervision of these great companies, but accompanying this supervision should be a recognition of the fact that great combinations have come to stay and we must do them scrupulous justice just as we exact scrupulous justice from them.”

Anger at Taft was one of the factors that motivated Roosevelt to run for president again in 1912 as the candidate of the Progressive Party. The party’s platform reflected his view that big business overall was a positive force, but needed federal regulation: “The corporation is an essential part of modern business. The concentration of modern business, in some degree, is both inevitable and necessary for national and international business efficiency.” The remedy for abuse was not mindlessly breaking up big firms, but preventing specific abuses by means of a strong national regulation of interstate corporations.

Like Roosevelt, FDR is falsely remembered as an enemy of big business. When running for office in 1932, the Democrat mocked the populists who supported antitrust: “The cry was raised against the great corporations. Theodore Roosevelt, the first great Republican Progressive, fought a Presidential campaign on the issue of ‘trust busting’ and talked freely about malefactors of great wealth. If the government had a policy it was rather to turn the clock back, to destroy the large combinations and to return to the time when every man owned his individual small business. This was impossible.” FDR agreed with his cousin that the answer was regulation, not breaking up big corporations: “Nor today should we abandon the principle of strong economic units called corporations, merely because their power is susceptible of easy abuse.”

In his first term, FDR attempted to restructure the U.S. economy under the National Industrial Recovery Act (NIRA), a system of industry-wide minimum wages and labor codes, which small businesses claimed gave an unfair advantage to big firms. In his second term, after the Supreme Court struck down the NIRA in 1935, Roosevelt briefly fell under the influence of Robert Jackson, Thurman Arnold, and other champions of an aggressive approach to antitrust in the Justice Department. But when World War II broke out, such an approach became an impediment to enlisting major industrial firms for war production, and FDR sidelined the antitrust advocates.

​Surely FDR wanted to “break up big banks,” though, given his support of the Glass-Steagall Act of 1933? That’s a myth, too.

FDR and Senator Carter Glass of Virginia shared the goal of separating commercial and investment banking, ending what FDR called “speculation with other people’s money.” But they were also hostile to what American populists loved—the fragmented system of small, unstable local “unit banks” protected from competition with big eastern banks by laws against interstate branch banking. To prop up local banks, Representative Henry B. Steagall of Alabama pushed an old populist idea: federal deposit insurance. Shortly before his election in 1932, FDR explained why he opposed the policy in a letter to the New York Sun: “It would lead to laxity in bank management and carelessness on the part of both banker and depositor. I believe that it would be an impossible drain on the Federal Treasury to make good any such guarantee. For a number of reasons of sound government finance, such a plan would be quite dangerous.”

FDR was so opposed that he threatened to veto the bank reform legislation if it included deposit insurance. In the end, in order to enact other reforms he favored, he reluctantly signed the Glass-Steagall bill. If FDR had prevailed, there would be no Federal Deposit Insurance Corporation (FDIC).

Today, the growth and consolidation of multinational corporations presents American democracy with genuine policy challenges. But the answer need not come from bogus history; real history will suffice. Teddy Roosevelt argued that “big trusts” must be “taught that they are under the rule of law,” yet added that “breaking up all big corporations, whether they have behaved well or ill,” is “an extremely insufficient and fragmentary measure.”

And FDR said: “Nor today should we abandon the principle of strong economic units called corporations, merely because their power is susceptible of easy abuse.” The answer to the problems caused by corporate concentration, the Roosevelts agreed, is prudent government oversight and using antitrust laws to police abuses—not to break up every big company simply because it’s big.

Bigotry stopped Americans from intervening before the Holocaust. Not much has changed

By James Grossman, Los Angeles Times, Apr. 29, 2018

Children looking at the Statue of Liberty, June 4, 1939. (U.S. Holocaust Memorial Museum)

A ruthless dictator unleashes terror on his own citizens. Those fleeing elicit sympathy — but encounter obstacles to entering the United States. Americans learn of mass killings, but their moral revulsion doesn’t easily turn into policy or military intervention. One thing remains consistent: America doesn’t want refugees, at least not of this ilk; those people aren’t welcome here.

Historians like me are wary of the adage that “history repeats itself.” But comparisons and analogies help us learn from the past, showing us how context matters and conventional wisdom deceives. To most Americans in 1945, “those people” meant “European Jews.” Today, they are Syrians, Congolese, Hondurans.

No visitor to the new exhibition “Americans and the Holocaust” at the U.S. Holocaust Memorial Museum in Washington, D.C., will walk away with conventional wisdom about World War II intact. In the 1930s, anti-Semitism rested comfortably within American ideologies of race, but this context, not widely acknowledged at the time, has now virtually disappeared from mainstream collective memory. Instead, America’s pre-Pearl Harbor isolationism is viewed as a mistaken but understandable disinclination to intervene in another European war, further tempered by the suggestion that Americans had only slight knowledge of Nazi depravity.

Museum visitors enter the new exhibit’s galleries in 1933 and walk through 12 years without the benefit of 80 years of hindsight. They see what Americans knew about events in Nazi Germany as they learned it. Public opinion (as documented by polls) and U.S. policy are revealed within that context.

It is a sobering journey. Americans knew that something was dreadfully wrong in Germany. As early as 1932, and even more in 1933, popular magazines including Cosmopolitan, Time and Newsweek included major stories on the persecution of Jews in Germany and on Nazi governance. Hitler and Goebbels appeared on covers of Time in 1933, with Goebbels accompanied by a clear message: “Say it in your dreams — THE JEWS ARE TO BLAME.”

An imaginative crowdsourcing effort carried out by the museum uncovered no fewer than 15,000 U.S. newspaper articles documenting persecution published between 1933 and 1945. Newsreels told the same story.

Commentators who have the benefit of hindsight have criticized President Franklin D. Roosevelt for his refusal to intervene. In 1933, the U.S. ambassador to Germany recorded in his diary Roosevelt’s instructions: “The German authorities are treating the Jews shamefully and the Jews in this country are greatly excited. But this is also not a governmental affair.” It comes across as cold-hearted in retrospect, but Roosevelt understood his fellow Americans; they would not march to war — or even expend substantial public resources — to save Jews.

If this feels in any way familiar, consider what comes next. Even when 94% of polled Americans claimed to “disapprove of the Nazi treatment of Jews in Germany,” 71% of them opposed permitting any more than a trickle of German Jews to enter the United States — two weeks after Kristallnacht. Two-thirds of Americans opposed admitting refugee children in 1939.

America kept its doors closed to the people for whom it professed sympathy. This sentiment, shaped by racism, was nothing new, nor was it confined to immigrants. One need only cross the Mall to the National Museum of African American History and Culture to be reminded that in the 1850s white Northerners were as repulsed by the suggestion that emancipation would result in black migration northward as they were by the cruelty of slavery.

Anti-Semitism would remain central to American foreign policy even as the nation stared down Nazi Germany. The United States entered the war in Europe, of course, but Roosevelt was shrewd enough to cast the move as fighting fascism on behalf of democracy. The war was about preserving American values, not saving European Jews.

At war’s end, Americans encountered graphic, overwhelming evidence of what they had been hearing about regularly since the first news reports about the death camps in 1942. Films, photographs, articles and official documents laid out the horrific details of ghettos, concentration camps and gas chambers. Aside from the Jewish media, however, few of these accounts named the victims as Jews.

Terrible people those Nazis, those fascists. The survivors of their terror, however, the “displaced persons,” still could not be trusted to be our neighbors even if there was an orderly bureaucracy of refugee screening, documented here by a wall of letters and official forms.

The ring of familiarity impels us to ask chilling questions about our current moment.

The Americans who helped Hitler

History News Network, via Raw Story, Mar. 26, 2018

Why were so many “great” Americans tarred with a pro-Nazi brush? Henry Ford. Connecticut banker and senator Prescott Bush, father and grandfather of the Bush presidents. Charles Lindbergh. Even the first, albeit short-lived, America First Committee (1940-1941), with its origins at Yale University, allowed itself to become infiltrated by dangerous agents of the Third Reich in America.

Granted, at the outset, there was some considerable sympathy for Hitler in Europe and America. The real enemy, Communism, had swept away the Russian Empire, and was making headway in Europe. Spain had gone communist. Fascism was seen as an antidote to the hammer and sickle. But Hitler’s personal interest in infiltrating America, as early as 1925, was purely economic. Germany needed foreign exchange to stay afloat. It was drowning in hyperinflation. A loaf of bread cost a trillion Reichsmarks. Wheelbarrows and muscles were needed to transport the cash to the bakery. Yesterday’s marks became tomorrow’s kites or paper toys for children—that’s how quickly they were devalued. But few Americans recognized that America’s bankers were behind the bankrupt currency. The names Morgan, Bush, Chase, Union Banking Corporation, First National City are just some that spring to mind.

Germany needed foreign exchange more than anything else to get back on its feet. From 1933, Hitler cunningly lassoed other Americans to help him in his task. Where else could he turn for money coupled with an unwillingness to stop his European expansion plan? As a failed artist, art seemed as good a place to start as any. Alfred H. Barr, Founding Director of the Museum of Modern Art, knew he was buying art taken from German museums that Hitler deemed to be “Entartete” or “degenerate.” Barr befriended one of Hitler’s art dealers, Karl Buchholz, and was a close personal friend of the Hamburg-born art aficionado Curt Valentin from the time Valentin’s feet hit terra firma in 1937 New York. So Hitler got his foreign exchange, initially from looted museum art, later from desperate, mostly Jewish families, and Barr filled his new museum under the guise of “saving modern art.”

Barr was far from alone. American banks, like Chase National, were involved in a scheme to bring dollars to Germany in something called the Rückwanderer Mark Scheme. And why not? They had to do something to stop the rot on their poor Mark investments of the 1920s. Literally meaning “returning home,” the Rückwanderer Mark was designed to allow Germans living in the U.S.A. who wanted to return to Germany—on a temporary or permanent basis—to buy Rückwanderer Marks at an advantageous exchange rate. The Reichsbank allowed any returnees to Germany to exchange half of their dollars at the favorable RM 4.10 rate, even though the real exchange rate was only RM 2.48.

How could a bankrupt country afford such largesse? The surplus was paid from blocked accounts and assets once owned by refugees fleeing Germany, mostly Jews. The refugees lost an additional 25 percent minimum through a mechanism called a “flight tax,” which was often as elastic as a rubber band. The elasticity stemmed from the official practice of restricting refugees to one small suitcase to take with them and valuing any nonmonetary assets for two or three cents (pfennigs) on the Reichsmark. “The German government,” the FBI noted, “thereby netted a profit in dollars of nearly 90 percent.”
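The exchange arithmetic behind the scheme can be sketched in a few lines. This is a back-of-the-envelope illustration only: the two rates (RM 4.10 favorable, RM 2.48 actual) come from the article, while the three-pfennigs-on-the-mark valuation of the confiscated assets that funded the payout is an assumption based on the “two or three cents” figure above.

```python
# Back-of-the-envelope sketch of the Rückwanderer Mark arithmetic.
# Rates are from the article; the asset-valuation figure is an assumption.
dollars_in = 1_000.0      # dollars a hypothetical returnee converts
favorable_rate = 4.10     # RM per dollar, applied to half the sum
actual_rate = 2.48        # RM per dollar, the real exchange rate

# Marks the Reichsbank owed the returnee: half the dollars at the
# favorable rate, the other half at the actual rate.
marks_owed = (dollars_in / 2) * favorable_rate + (dollars_in / 2) * actual_rate

# Those marks were funded from blocked refugee accounts and assets valued
# at roughly two or three pfennigs on the mark, so the Reich's real cost
# in dollar terms was tiny.
pfennigs_on_the_mark = 0.03   # assumed valuation of confiscated assets
cost_in_dollars = marks_owed * pfennigs_on_the_mark / actual_rate

profit_share = (dollars_in - cost_in_dollars) / dollars_in
print(f"marks owed: RM {marks_owed:.0f}")
print(f"profit as a share of dollars taken in: {profit_share:.0%}")
```

Under these assumptions the dollar profit lands above 90 percent of the sum taken in, broadly consistent with the FBI’s “nearly 90 percent” figure; the exact share depends on how cheaply the confiscated assets are valued.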

American companies trading Rückwanderer marks had to pay wholesalers, among them American Express, the Hamburg-Amerika Line, and the Swiss import-export firm Interkommerz in America, run by Henri Guisan, son of the commander-in-chief of the Swiss army. Jean Guisan, a close family relation, got the idea to introduce the seductive American, Florence Lacaze Gould (wife of the youngest son of American robber baron Jay Gould and the subject of my biography), to act as their “clean skin” banker in Monaco and France. The man who vetted Mrs. Gould was August T. Gausebeck, a German national working in New York since 1933. Gausebeck had the backing of the wealthiest supporters of Hitler, including Fritz Thyssen, Prescott Bush’s main banking client. Gausebeck’s New York company, Robert C. Mayer & Co., and his German-inspired investment company, the New York Overseas Corporation, were the primary vehicles complicit in the theft of millions from Jews fleeing Germany. They should have received an acknowledgement somewhere for helping Hitler and Göring build the Luftwaffe.

But have a heart. Uncle Sam did get around to stopping them. In 1943. The Neutrality Act in the United States prohibited loans and gifts to belligerent nations. J. Edgar Hoover, FBI director, was told in October 1939: “Representatives approach investors and indicate to them that Germany will undoubtedly win the war… and that marks will undoubtedly increase many times in value.” Hoover was onto the scam like any mollusk clinging to a juicy rock. What attracted Hoover’s attention was Gausebeck, that German resident alien who was secretly funding the anti-Semitic campaign of Father Charles Coughlin on the radio. Coughlin? Just a minute….

Surely the Canadian-American Catholic priest who had been on the airwaves since June 1930 could not have been in direct Nazi pay? Wrong. The National Archives are littered with documents proving that the priest was on the take. And why not? The America he broadcast to in 1930 was bust, just like Germany after 1918. Investors in the stock market were looking at profits down some 45.9 percent in the leading two hundred industrial companies. Steel production was down 60 percent; automobile production a staggering 60 percent. Farmers selling wheat in the autumn of 1930 were getting half of what they had been offered in 1929. Office workers, if they still had jobs, watched the breadlines form and wondered if the lines might not be for banks about to close. (By December, there were 328 banks closing each month.)

So when Father Charles E. Coughlin took to the airwaves on Station WJR Detroit, his richly mellow, reassuring voice and ingratiating charm all but begged listeners to gather round the radio for his own brand of “Fireside Chat.” Cloaking his fascist message in the words of the times, Father Coughlin had discovered his pulpit. His listeners were, like the Germans in 1918, angry. Really angry at bankers. They feared the Communists more than the Fascists, and like other demagogues, Coughlin built his Church of the Little Flower on the wretchedness of others. By early 1933, it was estimated that Coughlin had an audience of ten million people in the U.S.A., and only a handful of critics. But that December, CBS refused to renew his contract unless his sermons were submitted to censorship prior to his broadcasts. Why?

Incensed Americans—Jews, Protestants, Catholics and others—ran to Father Coughlin’s defense. No one else stood up for America’s poor. They became members of his People’s Lobby, partly funding his programs on another station and increasing his hook-up from twenty-nine to thirty-five stations. Enter August T. Gausebeck, Göring’s banker in America. If Coughlin took off the gloves and plainly said what he meant, Gausebeck would fund any shortfalls the good father might experience—in five- to ten-dollar untraceable donations. So Coughlin, freed from tedious financial burdens, spoke out against the C.I.O. and organized labor and against the League of Nations, swaying millions to vote as he saw it. Coughlin threw his considerable weight behind Franklin D. Roosevelt and, in the good father’s opinion, brought about Roosevelt’s first presidential victory. Angered that Roosevelt did not recognize his contribution, he turned on the president-elect. Coughlin publicly called Roosevelt a “liar.” That was his first big mistake.

He also spoke out vociferously against the “money lenders”—meaning Jews—and adopted the platform of a man eager to install the first American Reichstag. Coughlin leaned further to the right, republishing the disreputable forgery The Protocols of Zion in his magazine Social Justice and attacking American unionism as having its headquarters in Moscow. (Much of the commentary in Social Justice regarding Jews was taken verbatim from the speeches of Joseph Goebbels, literally line by line.)

Roosevelt was in the pocket of the “money lenders,” Coughlin endlessly jeered. Cheered at the German-American Bund meeting at Madison Square Garden in 1939, Coughlin and his platoons of Christian Front followers were revealed as nothing more than criminal thugs, out to terrify the neighborhoods they lived in. As hundreds were arrested for their violence and pointed racial hatred, claiming Coughlin as their spiritual father, the radio priest ran hot and cold in reply, depending on his audience.

By the time Pearl Harbor came, Coughlin had three-quarters of the United States clamoring for his scalp and demanding to lock up his lunatic fringe. Like Hitler, his little empire lasted a scant twelve years. The Catholic Church had cut him loose, clearly recognizing Nazi ideals, Nazi methods and the un-Christian message Coughlin preached. Yet his fascist worldview remains a danger today. Preaching hatred is not freedom of expression. It is dangerous, deadly propaganda—intent on destroying our souls through fear. We would all do well to learn the lessons of history, and understand how forces that use our democracy against us work.

The Conversation, via Raw Story, Mar. 22, 2018

In 1942, 18-year-old Iris Lopez, a Mexican-American woman, started working at the Calship Yards in Los Angeles. Working on the home front building Victory Ships not only added to the war effort, but allowed Iris to support her family.

However, before joining the shipyards, Iris was entangled in another lesser-known history. At the age of 16, Iris was committed to a California institution and sterilized.

Iris wasn’t alone. In the first half of the 20th century, approximately 60,000 people were sterilized under U.S. eugenics programs. Eugenic laws in 32 states empowered government officials in public health, social work and state institutions to render people they deemed “unfit” infertile.

California led the nation in this effort at social engineering. Between the early 1920s and the 1950s, Iris and approximately 20,000 other people – one-third of the national total – were sterilized in California state institutions for the mentally ill and disabled.

To better understand the nation’s most aggressive eugenic sterilization program, our research team tracked sterilization requests of over 20,000 people. We wanted to know about the role patients’ race played in sterilization decisions. What made young women like Iris a target? How and why was she cast as “unfit”?

Racial biases affected Iris’ life and the lives of thousands of others. Their experiences serve as an important historical backdrop to ongoing issues in the U.S. today.

‘Race science’ and sterilization

Eugenics was seen as a “science” in the early 20th century, and its ideas remained popular into the midcentury. Advocating for the “science of better breeding,” eugenicists endorsed sterilizing people considered unfit to reproduce.

Under California’s eugenic law, first passed in 1909, anyone committed to a state institution could be sterilized. Many of those committed were sent by a court order. Others were committed by family members who wouldn’t or couldn’t care for them. Once a patient was admitted, medical superintendents held the legal power to recommend and authorize the operation.

Eugenics policies were shaped by entrenched hierarchies of race, class, gender and ability. Working-class youth, especially youth of color, were targeted for commitment and sterilization during the peak years.

Eugenic thinking was also used to support racist policies like anti-miscegenation laws and the Immigration Act of 1924. Anti-Mexican sentiment in particular was spurred by theories that Mexican immigrants and Mexican-Americans were at a “lower racial level.” Contemporary politicians and state officials often described Mexicans as inherently less intelligent, immoral, “hyperfertile” and criminally inclined.

These stereotypes appeared in reports written by state authorities. Mexicans and their descendants were described as “immigrants of an undesirable type.” If their existence in the U.S. was undesirable, then so was their reproduction.

Targeting Latinos and Latinas

In a study published March 22, we looked at the California program’s disproportionately high impact on the Latino population, primarily women and men from Mexico.

Previous research has examined racial bias in California’s sterilization program. But the extent of anti-Latino bias hadn’t been formally quantified. Latinas like Iris were certainly targeted for sterilization, but to what extent?

We used sterilization forms found by historian Alexandra Minna Stern to build a data set on over 20,000 people recommended for sterilization in California between 1919 and 1953. The racial categories used to classify Californians of Mexican origin were in flux during this time period, so we used Spanish surname criteria as a proxy. In 1950, 88 percent of Californians with a Spanish surname were of Mexican descent.

We compared patients recommended for sterilization to the patient population of each institution, which we reconstructed with data from census forms. We then measured sterilization rates between Latino and non-Latino patients, adjusting for age. (Both Latino patients and people recommended for sterilization tended to be younger.)

Latino men were 23 percent more likely to be sterilized than non-Latino men. The difference was even greater among women, with Latinas sterilized at 59 percent higher rates than non-Latinas.
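The age adjustment described above can be illustrated with a tiny direct-standardization sketch. Every count below is invented; only the method (compute sterilization rates within age strata, then weight both groups by the same standard age distribution before comparing) reflects the standard approach such an adjustment implies.

```python
# Illustrative direct age standardization with made-up numbers.
# (sterilized, total patients) per age stratum, for two hypothetical groups.
latino     = {"under 20": (30, 200), "20-29": (25, 300), "30+": (10, 500)}
non_latino = {"under 20": (40, 400), "20-29": (30, 600), "30+": (20, 1000)}

# A standard age distribution shared by both groups, so the comparison
# is not distorted by one group simply being younger.
standard = {"under 20": 0.3, "20-29": 0.4, "30+": 0.3}

def adjusted_rate(group):
    # Weight each stratum-specific rate by the standard age distribution.
    return sum(standard[age] * s / n for age, (s, n) in group.items())

ratio = adjusted_rate(latino) / adjusted_rate(non_latino)
print(f"age-adjusted rate ratio: {ratio:.2f}")
```

With these invented counts the adjusted ratio comes out around 1.5, i.e., a roughly 50 percent higher rate in one group; the study’s reported figures (23 percent for men, 59 percent for women) come from the real institutional data, not from anything like these numbers.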


In their records, doctors repeatedly cast young Latino men as biologically prone to crime, while young Latinas like Iris were described as “sex delinquents.” Their sterilizations were described as necessary to protect the state from increased crime, poverty and racial degeneracy.

Lasting impact

The legacy of these infringements on reproductive rights is still visible today.

Recent incidents in Tennessee, California and Oklahoma echo this past. In each case, people in contact with the criminal justice system – often people of color – were sterilized under coercive pressure from the state.

Contemporary justifications for this practice rely on core tenets of eugenics. Proponents argued that preventing the reproduction of some will help solve larger social issues like poverty. The doctor who sterilized incarcerated women in California without proper consent stated that doing so would save the state money in future welfare costs for “unwanted children.”

The eugenics era also echoes in the broader cultural and political landscape of the U.S. today. Latina women’s reproduction is repeatedly portrayed as a threat to the nation. Latina immigrants in particular are seen as hyperfertile. Their children are sometimes derogatorily referred to as “anchor babies” and described as a burden on the nation.

As the fight for contemporary reproductive justice continues, it’s important to acknowledge the wrongs of the past. The nonprofit California Latinas for Reproductive Justice has co-sponsored a forthcoming bill that offers financial redress to living survivors of California’s eugenic sterilization program. “As reproductive justice advocates, we recognize the insidious impact state-sponsored policies have on the dignity and rights of poor women of color who are often stripped of their ability to form the families they want,” CLRJ Executive Director Laura Jiménez said in a statement.

This bill was introduced on Feb. 15 by Sen. Nancy Skinner, along with Assemblymember Monique Limón and Sen. Jim Beall.

If this bill passes, California would follow in the footsteps of North Carolina and Virginia, which began sterilization redress programs in 2013 and 2015.

In the words of Jimenez, “This bill is a step in the right direction in remedying the violence inflicted on these survivors.” In our view, financial compensation will never make up for the violation of survivors’ fundamental human rights. But it’s an opportunity to reaffirm the dignity and self-determination of all people.

U.S. Guilty of More Election Meddling Than Russia, Has Done To Other Countries For Over A Century

By David Love, Atlanta Black Star, March 18, 2018

With news of Russian meddling in the 2016 U.S. presidential election seizing the spotlight each day, America is now faced with the prospect of being victimized by the same practices it has promoted throughout the world. While Russia is receiving attention for its interference in the internal politics of the United States, Britain and other European nations, Uncle Sam has a long history of disrupting foreign governments, engaging in regime change, deposing elected leaders and even assassinating them.

Comparing the United States and Russia and their respective histories of overt and covert election influence in other countries, the former wins decisively. Carnegie Mellon University researcher Dov H. Levin has created a data set in which he found between 1946 and 2000, the U.S. interfered in foreign elections 81 times, while the Soviet Union and later Russia meddled on 36 occasions. “I’m not in any way justifying what the Russians did in 2016,” Levin told The New York Times. “It was completely wrong of Vladimir Putin to intervene in this way. That said, the methods they used in this election were the digital version of methods used both by the United States and Russia for decades: breaking into party headquarters, recruiting secretaries, placing informants in a party, giving information or disinformation to newspapers.”

Africa and the Caribbean provide ample evidence of a history of U.S. meddling in the elections of other nations. For example, Patrice Lumumba, the first democratically elected prime minister of the Democratic Republic of the Congo, was overthrown and assassinated in 1961 by the Belgians, who were reportedly aided and abetted by the CIA. The U.S. also had its own plan, never implemented, to assassinate Lumumba by lacing his toothpaste with poison, among other schemes to remove him from power.

In 1966, the CIA was involved in the overthrow of Ghanaian President Kwame Nkrumah by way of a military coup. According to CIA intelligence officer John Stockwell in the book “In Search of Enemies,” the Accra office of the CIA had a “generous budget” and was encouraged by headquarters to maintain close contact with the coup plotters. According to a declassified U.S. government document, “The coup in Ghana is another example of a fortuitous windfall. Nkrumah was doing more to undermine our interests than any other black African. In reaction to his strongly pro-Communist leanings, the new military regime is almost pathetically pro-Western.” CIA participation in the coup was reportedly undertaken without approval from an interagency group that monitors clandestine CIA operations.

The U.S. Marines were on hand in 1912 to assist the Cuban government in destroying the Partido Independiente de Color (PIC), or Independent Party of Color, which was formed by descendants of slaves and became the first 20th century Black political party in the Western Hemisphere outside of Haiti. PIC — which believed in racial pride and equal rights for Black people — engaged in protest after the Cuban government banned the race-based party from participating in elections. In putting down the PIC, the United States invoked the Platt Amendment, which allowed American intervention in Cuban affairs. The military action from U.S. and Cuban forces resulted in the massacre of 6,000 Black people.

The U.S. occupied the Dominican Republic twice — from 1916 until 1924, controlling the government and who became president, and again in 1965, opposing elected president Juan Bosch, supporting a military coup and installing Joaquin Balaguer. President Reagan took advantage of the assassination of Prime Minister Maurice Bishop of Grenada and orchestrated the long-planned invasion of the Caribbean nation, justifying it on the grounds that the regime was anti-American and supported by Cuba.

America is known for a high degree of intervention and coup sponsorship in Haiti, with military occupations and support for brutal dictators, and the ousting of President Jean-Bertrand Aristide under both Presidents George H.W. and George W. Bush. In 2009, during the Obama administration, Honduran President Manuel Zelaya was overthrown in a military coup and forced to fly to a U.S. military base at gunpoint and in his pajamas in an act of American-endorsed regime change. Although there was no evidence the Obama administration was involved in the coup, it did nothing to stop it. The United States called for new elections rather than declaring that a coup had taken place, contributing to the subsequent deterioration and violence in Honduras. Elsewhere in Latin America, the U.S. was involved in the 1954 overthrow of Jacobo Arbenz Guzman, the democratically elected president of Guatemala, to protect the profits of the United Fruit Company.

In 1953, the CIA, with help from Great Britain, engineered a coup in Iran, overthrowing the democratically elected Prime Minister Mohammed Mossadegh and installing a puppet regime under the Shah. This came after Mossadegh nationalized the British Anglo-Iranian Oil Company, later known as BP. In 1965, the U.S. Embassy supported the rise to power of Indonesia’s brutal dictator General Suharto, and enabled his massacre of over half a million Indonesians. Ten years later, the U.S. helped Suharto with political and military support in his invasion of East Timor, which had declared independence from Portugal. The Indonesian occupation killed more than 200,000 Timorese, one third of the population.

With the 2016 presidential election, America now has experienced having another country meddle in its internal affairs — something which the U.S. has perpetrated against other nations for more than a century.

Coard: America's 12 Slaveholding Presidents

Michael Coard, The Philadelphia Tribune, March 10, 2018

In my Freedom’s Journal columns on February 24 and March 3 here in The Philadelphia Tribune, I exposed the lies about President George Washington’s supposed wooden teeth and Thomas Jefferson’s supposedly innocent romantic love affair with Sally Hemings.

Washington’s teeth were actually yanked from the mouths of our enslaved ancestors, and Jefferson actually raped Sally repeatedly while she was just a child.

In response to both columns, white racists went certifiably crazy (I mean crazier) and denied and yelled and screamed and hollered and insulted. They also trolled on social media. Unfortunately for them, they’re gonna need a straitjacket after reading this.

This week’s topic is about the twelve United States presidents who enslaved Black men, women, boys, and girls. And before you crazy racists start talking nonsense about those so-called “great” patriots simply being “men of their times,” you need to know that the anti-slavery movement amongst good white folks began in the 1730s and spread throughout the Thirteen Colonies as a result of the abolitionist activities during the First Great Awakening, which was early America’s Christian revival movement. Furthermore, the anti-slavery gospel of the Second Great Awakening was all over the nation from around 1790 through the 1850s.

America is and always has been a Christian country, right? Therefore, if the Christian revivalists weren’t men (and women) of that slaveholding time, why weren’t those twelve presidents who led this Christian country?

Beyond the religious abolitionist movement, the secular abolitionist movement was in full effect in the 1830s, thanks to the likes of the great newspaper publisher William Lloyd Garrison. Presidents knew how to read, right?

By the way, John Adams, the second president (from 1797-1801) and his son John Quincy Adams, the sixth president (from 1825-1829), never enslaved anybody. And they certainly were men of their times. Maybe they knew slavery was, is, and forever will be evil and inhumane.

Here are the evil and inhumane 12 slaveholding presidents listed from bad to worse to worst:

12. Martin Van Buren, the eighth president, enslaved 1 but not during his presidency. By the way, that 1 escaped.

11. Ulysses S. Grant, the eighteenth president, enslaved 5 but not during his presidency. In office from 1869-1877, he was the last slaveholding president.

10. Andrew Johnson, the seventeenth president, enslaved 8 but not during his presidency. However, when he was Military Governor of Tennessee, he persuaded President Abraham Lincoln to remove that state from those subject to “Honest Abe’s” Emancipation Proclamation.

9. William Henry Harrison, the ninth president, enslaved 11 but not during his presidency. However, as Governor of the Indiana Territory, he petitioned Congress to make slavery legal there. Fortunately, he was unsuccessful.

8. James K. Polk, the eleventh president, enslaved 25 and held many of them during his presidency. He also stole much of Mexico from the Mexicans during the 1846-1848 war in which those Brown people were robbed of California and almost all of today’s Southwest.

7. John Tyler, the tenth president, enslaved 70 and held many of them during his presidency. He was a states’ rights bigot and a jingoist flag-waver who robbed Mexico of Texas in 1845.

6. James Monroe, the fifth president, enslaved 75 and held many of them during his presidency. He hated Blacks so much that he wanted them sent back to Africa. That’s why he supported the racist American Colonization Society, robbed West Africans of a large piece of coastal land in 1821, and created a colony that later became Liberia. Liberia’s capital city, Monrovia, is named after that racist thug.

5. James Madison, the fourth president, enslaved approximately 100-125 and did so during his presidency. He’s the very same guy who proposed the Constitution’s Three-Fifths Clause.

4. Zachary Taylor, the twelfth president, enslaved approximately 150 and held many of them during his presidency. During his 1848 run for president, he campaigned on and bragged about his wholesale slaughter of Brown people when he was a Major General in the Mexican-American War. And white folks in America elected him.

3. Andrew Jackson, the seventh president, enslaved 150-200 and held many of them during his presidency. By the way, Jackson, nicknamed “Indian Killer” and described by fake President Donald Trump as his all-time favorite, wasn’t just a brutal slaveholder. He was also a genocidal monster who was responsible for the slaughter of approximately 30,000-50,000 Red men, women, and children. Moreover, he signed the horrific Indian Removal Act of 1830 that robbed the indigenous people of 25 million acres of fertile land and doomed them and their descendants to reservation ghettos.

2. Thomas Jefferson, the third president, enslaved 267 and held many of them during his presidency. For more info about this child rapist, read my March 3 column.

1. George Washington, the first president, enslaved 316 and held many of them during his presidency. For more info about the man whose teeth were “yanked from the heads of his slaves,” read my February 24 column.