If the measure passes, Maine would become the first state to adopt a statewide RCV system.

Eleven cities nationwide now use RCV, or instant-runoff voting, for citywide elections; four of them are in the Bay Area: Oakland, San Leandro and Berkeley (all since 2010), and San Francisco (citywide since 2011).

How does RCV work?

RCV ballots vary somewhat by city, but generally follow a similar format. Assuming there are more than two candidates running for a given position, RCV races on the ballot include first-choice, second-choice and third-choice columns, with every candidate listed in each column.

A sample RCV ballot created by the Alameda County Registrar of Voters. (Alameda County Registrar of Voters)

Voters pick their first, second and third choices, regardless of party. (Note: You don’t have to pick three; you can still just pick your first choice, or your first two choices, etc. It also doesn’t do much good to select your first choice three times — it’ll be counted only once.)

If any candidate receives more than 50 percent of first-choice votes, that candidate is automatically elected. Game over. But if no one receives a majority, a second round of counting begins. The candidate who received the fewest votes is eliminated (and if you voted for that candidate, your vote goes to your second-choice pick). This process is repeated until one candidate has a clean majority.
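Here’s a minimal sketch of that counting loop in Python, just to make the mechanics concrete. It’s an illustration of the general procedure described above, not any county’s actual tallying software; the ballot format (each ballot as a list of names, ranked first to last) and the tie-handling are assumptions made for the example.

```python
from collections import Counter

def instant_runoff(ballots):
    """Eliminate last-place candidates until someone has a majority.

    Each ballot is a list of candidate names, ranked first to last.
    Returns the winner and the final-round tally.
    """
    ballots = [list(b) for b in ballots]  # copy, since we prune names below
    while True:
        # Each ballot counts toward its highest-ranked surviving candidate.
        tally = Counter(b[0] for b in ballots if b)
        total = sum(tally.values())
        leader, leader_votes = tally.most_common(1)[0]
        if leader_votes * 2 > total:  # strictly more than 50 percent
            return leader, dict(tally)
        # Fewest first-choice votes is eliminated (ties broken arbitrarily here).
        loser = min(tally, key=tally.get)
        for b in ballots:
            if loser in b:
                b.remove(loser)  # that voter's next choice now counts
```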

In its endorsement of the state’s RCV measure, the League of Women Voters of Maine notes that in nine of Maine’s last 11 gubernatorial elections, the winner failed to receive a majority of votes. RCV, it concludes, is “the best way to ensure a majority vote in competitive, single-seat, multi-candidate elections.”

Keep in mind that RCV is different from California’s top-two primary system, in which all candidates are listed on the same ballot and the top two vote-getters, regardless of party affiliations, advance to the general election.

Round 1: The Sesame Street Power Grab

So just for kicks (and because puppets are more fun than politicians), let’s pretend we’re observing a heated street council race on “Sesame Street.” There are four candidates running, and a total of 24 neighborhood voters casting ballots.

Cookie Monster (the clear front-runner, of course, well-loved for his oratorical gifts and promises of free pastries to the electorate) gets 10 first-place votes.

Oscar the Grouch gets eight first-place votes (with strong support from the waste management industry and a large contingent of the generally disgruntled).

Big Bird gets four first-place votes, and Grover brings up the rear with just two.

Because no candidate received the 13 votes needed for a majority (more than half of the 24 ballots cast), there’s no winner after the initial round. But we do have our first loser — better luck next time, Grover — so we move on to Round 2.

Round 2: The First Elimination

Grover, the candidate with the fewest first-choice votes, is outta here! But for the two voters who picked Grover as a first choice, their second-choice votes still count. Here’s how:

One of the voters who chose Grover picked Oscar as a second choice. So that vote goes to Oscar (who now has a total of nine votes). The other voter in Grover’s fan club picked Cookie Monster as a second choice. So, that vote goes to Cookie Monster (who now has 11 votes).

At the end of Round 2, here’s the tally:

C. Monster: 11 votes
Oscar: 9 votes
B. Bird: 4 votes

Still no clear winner (Cookie Monster’s 11 votes still fall short of the 13 needed for a majority), so on to Round 3 we go!

Round 3: The Deciding Moment

Three candidates left, and Big Bird’s got the fewest first-choice votes (only four), so that oversized avian is cooked!

Now, we look at the second-choice votes of those four voters who picked Big Bird as their first choice. Remarkably, as it turns out, all four of Big Bird’s second-choice votes were for Oscar! That means Oscar picks up four more votes, giving him (or it?) a final tally of 13 votes to Cookie Monster’s 11.

And thus, that grumpy, trash-dwelling green goon is the new boss in town.
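Running the sketch from earlier on a set of ballots matching this story reproduces the result. (The second choices of Cookie Monster’s and Oscar’s own voters never come into play, so they’re omitted here.)

```python
ballots = (
    [["Cookie Monster"]] * 10 +        # 10 first-choice votes
    [["Oscar"]] * 8 +                  # 8 first-choice votes
    [["Big Bird", "Oscar"]] * 4 +      # all four rank Oscar second
    [["Grover", "Oscar"]] +            # one Grover voter backs Oscar next
    [["Grover", "Cookie Monster"]]     # the other backs Cookie Monster
)

winner, final_round = instant_runoff(ballots)
print(winner)       # Oscar
print(final_round)  # {'Cookie Monster': 11, 'Oscar': 13}
```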

In your city’s races, things might not be quite that simple (and all the candidates will likely have noses). But hopefully you’re beginning to get the idea of how a candidate can receive the most first-choice votes and still lose the election.

In 2010, for instance, Oakland first used RCV for its mayoral race and witnessed a similar outcome: there were 10 candidates, and Don Perata, the clear front-runner (who vastly outspent his opponents during the campaign), got 35 percent of the first-choice votes. That left Jean Quan in a distant second with only 24 percent of first-choice votes. But Quan — who anticipated this outcome and allied herself with other underdog candidates and their supporters — received many more second-choice votes than did Perata. And after all the elimination rounds, with second- and third-choice votes factored in, Quan ended up with 51 percent of the vote to Perata’s 49 percent.

The Pundits

So is RCV a good thing? It really depends on who you ask. (Jean Quan, I’m guessing, would say yes; Don Perata … not so much. And Oscar the Grouch … definitely a big fan.)

Like pretty much everything in politics, the system’s got its strong supporters and staunch enemies.

RCV supporters say:

It could save taxpayers millions by eliminating the need for local primaries and separate runoff elections.

It lessens the influence of campaign spending. Because RCV eliminates primaries, candidates only need to raise money for one election per cycle, not two or three.

It gives underdog and third-party candidates a fighting chance and produces a winner who is supported by a clear majority.

It discourages mudslinging and negative campaigning; candidates are now more likely to ally with each other.

RCV opponents say:

It’s too confusing for voters and unnecessarily adds to the complexity of an already complicated ballot.

There’s lots of room for technical error as election computers tally results through the use of a complicated algorithm.

It encourages less popular candidates to game the system by teaming up against the front-runner.

It’s discriminatory against less educated or knowledgeable members of the voting public who haven’t received sufficient instruction on how the system works.

VIDEO: Should San Francisco Lower Its Voting Age?
They pay taxes. They have to abide by the same laws as everyone else. And many are old enough to work and get behind the wheel.

And that’s not fair, say a number of youth rights groups, who for years have pushed to lower America’s voting age to 16. In a nation with notoriously low voter turnout — particularly among 18- to 24-year-olds — allowing more young people to vote, advocates claim, would boost civic participation and give students a voice in local public affairs.

This year, San Francisco supervisors approved Proposition F for the November 2016 ballot. The measure would lower the city’s voting age for local elections, allowing 16- and 17-year-olds to vote for mayor and other city officials, as well as school board and citywide initiatives. It follows a multi-year organizing effort by Vote16 SF and the San Francisco Youth Commission. If the measure passes, San Francisco would become the first major city in the country to extend voting rights to 16- and 17-year-olds (effective for the next municipal election).

A similar initiative in Berkeley — Measure Y1 — would allow 16- and 17-year-olds to vote, but only for school board members. There are also efforts to get similar measures up for a vote in states across the country.

Nationwide, only two municipalities — the Maryland cities of Hyattsville and Takoma Park — have passed ordinances lowering their voting ages to 16 for local elections.

Although the voting age is still 18 in a majority of the world’s democracies, several nations, including Austria, Argentina, Brazil and Nicaragua, have already lowered the voting age for national elections to 16.

Not everyone agrees …

Skeptics, however, argue that too many young people simply lack the life experience and knowledge to make informed decisions in the voting booth.

“I think it’s a dumb idea,” argued Curtis Gans, former director of the Committee for the Study of the American Electorate at American University. “The voting age was set at 18 because that’s the age at which people could be drafted and die for their country. [Those under 18] don’t have enough life experience or history and don’t know the issues in enough detail.”

Additionally, opponents argue, the nation’s minimum voting age often sets the precedent for other legal age limits. Sexual consent and criminal responsibility age limits, for instance, vary state by state but never exceed 18. If the voting age were lowered to 16, some fear, states could start treating 16-year-olds as adults in matters of consent and criminal prosecution.

The slow path to the ballot

Throughout U.S. history, voting has gradually grown more inclusive, a result of hard-fought political battles waged by disenfranchised populations.

Upon adoption of the U.S. Constitution, voting in most states was reserved for white male property owners. In fact, the nation’s founding document, as originally drafted, never explicitly guaranteed anyone the right to vote.

By the mid-19th century, most states had dropped property requirements. And with the ratification of the 15th Amendment in 1870, voting rights were granted to all male citizens, 21 and older, regardless of color. It took another half-century before the passage of the 19th Amendment, extending the vote to women.

Young people in Seattle march in support of lowering the national voting age to 18 (1969). (Post-Intelligencer Collection, Museum of History & Industry)

But it wasn’t until 1971 that America lowered its voting age to 18. The 26th Amendment was ratified largely as a result of heated student opposition to the Vietnam War and the contention that if an 18-year-old was old enough to be drafted into the military and fight for his country, he should also be considered mature enough to influence political outcomes.

“Old enough to fight, old enough to vote,” became the movement’s campaign slogan.

The effort extended suffrage to millions of 18- to 20-year-olds.

Debate Bingo and Other Essentials for the Third Presidential Debate

Missed the first two presidential debates?

Don’t fret. There’s still one more to go, and plenty of mud left to sling.

Donald Trump and Hillary Clinton face off again on Wednesday night at the University of Nevada, Las Vegas (UNLV). Their third and final debate runs 90 minutes, commercial-free (starting at 6 p.m. PT), and will be split into six 15-minute segments. Both candidates will be given two minutes to respond to questions, with a follow-up opportunity to respond to their opponent.

The debate will be hosted by Chris Wallace, the anchor of Fox News Sunday, whose questions are expected to address topics including immigration, the Supreme Court, the economy and foreign policy.

A record 84 million viewers tuned into the first debate on Sept. 26. Viewership decreased but remained impressively high for the second debate on Oct. 9, a town hall-style format that included questions asked by undecided voters in the audience. Typically, the third presidential debate draws less attention than the first two. But this election is anything but typical, and the final debate will likely attract an impressive number of eyes.

Some debate history

The first-ever televised presidential debate didn’t happen until 1960. Candidates Richard Nixon and John F. Kennedy squared off — just once — in front of the camera, an event that proved extremely beneficial to the smoother and more youthful Kennedy, who went on to win the election against his stodgier opponent.

It took another 16 years for the next presidential debate to happen. In the 1976 election, when President Gerald Ford faced off against his Democratic challenger, Jimmy Carter, he made a notorious gaffe about the Soviet Union, an oversight that proved detrimental to his campaign.

Ever since, debates have become a fixture of U.S. presidential elections. It’s now standard protocol for candidates to face off three times in the grueling months leading up to Election Day, providing Americans with a rare, unscripted glimpse of their personalities and how they handle themselves under pressure.

VIDEO: Should California Abolish the Death Penalty or Make It Easier to Execute?
Among the heap of statewide propositions California voters weigh in on next month, two are literally life and death decisions.

Proposition 62 would abolish capital punishment in California, making life without the possibility of parole the maximum punishment for murder. The Yes on 62 campaign argues that the death penalty in California is a failed, immoral and incredibly expensive system, costing taxpayers upwards of $150 million a year.

On the other hand, Proposition 66 would not only keep the state’s death penalty intact, it would speed up the notoriously long appeals process for those cases, potentially accelerating the rate of executions. The Yes on 66 campaign advocates for reforming the system by making it more efficient. “Mend, don’t end, California’s death penalty” is its slogan.

In the unlikely event that both measures receive more than 50 percent approval, only the one with the most votes would be enacted.

An Indecisive History

California has had a tough time making up its mind about the death penalty. In 1872, the state authorized capital punishment in its penal code (until then, executions were generally conducted by county sheriffs). Twenty-three years later, a guy named Jose Gabriel, convicted of murdering an elderly couple, was hanged inside San Quentin Prison, marking California’s first official execution at the hands of the state.

For the next 75-odd years, California executed nearly 500 inmates, four of them women.

And then things got really confusing.

In early 1972, the California Supreme Court ruled that the state’s death penalty law constituted cruel and unusual punishment. But just nine months later, California voters approved a ballot initiative that amended the state constitution to make capital punishment permissible. A year later, the state passed legislation that actually made the death penalty mandatory for certain crimes. But once again, the state Supreme Court struck back, ruling that law unconstitutional as well.

Fast forward six years. In 1978, California voters approved Proposition 7 by a whopping 70 percent. The initiative not only reinstated the state’s death penalty, but also broadened the list of circumstances under which a convicted prisoner could receive a death sentence. It also increased prison terms for first- and second-degree murder.

And it’s this law that currently stands in California. The last attempt to abolish capital punishment in California came in 2012, when voters narrowly defeated Proposition 34.

Slow and expensive

Since 1978, the state has executed only 13 prisoners (a 14th was convicted in California but executed in Missouri). More death row inmates in California have died from natural causes than have been executed. The last execution – of Clarence Ray Allen – took place in January 2006. There are nearly 750 prisoners currently residing on California’s death row, according to the California Department of Corrections and Rehabilitation. The vast majority of them are men housed at San Quentin State Prison in Marin County, less than 20 miles from downtown San Francisco.

Interestingly, many capital punishment opponents in California argue for repealing the death penalty largely on economic grounds. They contend that the current system is horribly inefficient and a financial burden to the already cash-strapped state. Due to the number of legal appeals and required long-term special supervision for death row inmates, the financial costs of executing a prisoner far outweigh those of life imprisonment. Repealing the death penalty would save the state an estimated $100 million a year, according to the Legislative Analyst’s Office.

But supporters of the death penalty argue that criminals convicted of the most heinous crimes deserve to be put to death. Some believe the death penalty deters criminal behavior, and for the families and friends of victims, is the only way that justice can be truly served.

The U.S. stands alone

Among western democracies, the U.S. stands alone in its continued use of capital punishment. Since 1976, when the Supreme Court ended a brief moratorium, more than 1,400 inmates have been executed. Texas has led the way, executing nearly 540 of those inmates over the last four decades.

The death penalty is legal in 31 states, including California, where a 2012 voter initiative to abolish it was narrowly defeated. Seventeen states have had executions in the last five years. However, executions have been put on hold in a growing number of states with large death row populations, including California and Pennsylvania, due to ongoing appeals and other legal constraints.

The practice also remains legal in the federal justice system, as evidenced by the recent death sentence of Boston Marathon bomber Dzhokhar Tsarnaev. But he now joins more than 60 other death row inmates in a federal system that has conducted only three executions in the last half century.

A series of factors, including recent high-profile botched executions, lethal injection drug shortages, last-minute exonerations, evidence of racial discrimination in sentencing, huge legal costs and dropping crime rates, has contributed to a growing uneasiness with capital punishment among both liberals and a growing number of conservatives.

Although a solid majority of Americans still believe that convicted murderers should be executed, support has waned considerably in recent decades, according to recent polls. Of the 19 states (and Washington D.C.) that have abolished the death penalty, six have done so in the last ten years.

About the filmmaker

Jazmin Jones is a filmmaker and graduate of the Bay Area Video Coalition’s Digital Pathways Program. She studied at the City College of San Francisco.

What Is the Electoral College, and Is It Time to Get Rid of It?

When we head to the polls on Election Day to select the next president, we’re not actually voting for any one person. Instead, we’re throwing our support behind a group of “electors” who belong to a strange institution called the Electoral College. And it’s this mysterious group of 538 members that directly casts the actual votes to determine who the next president will be.

What is the Electoral College?

During presidential elections, state political parties select a group of “electors.” These are usually committed party activists who have pledged to support their party’s presidential candidate should he or she win the state’s popular vote.

How many electors does each state get?

It’s based on a simple equation: each state’s total number of congressional representatives plus its two senators. Every state (and Washington, D.C.) is guaranteed at least three electoral votes. A sparsely populated state like North Dakota – which has two senators but only one congressional representative – gets just three electoral votes.

On the other end of the spectrum is crowded California, which gets 55 electoral votes (its 53 congressional representatives plus two senators).
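In code form, the allocation rule is a one-liner; here’s a tiny sketch using the seat counts cited above.

```python
def electoral_votes(house_seats):
    # Every state's electors = its House delegation plus its two senators.
    return house_seats + 2

print(electoral_votes(1))   # North Dakota: 3
print(electoral_votes(53))  # California: 55
```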

Interestingly — and controversially — the more than four million people living in U.S. territories like Puerto Rico and Guam get no electors. And although most residents of these territories are citizens and can participate in their party’s presidential primaries, they have no influence in the general election.

How does a presidential candidate win electors?

The presidential election is a grueling state-by-state battle, and in almost every one of those states, it’s a winner-take-all scenario. That means the candidate who receives the most popular votes — the plurality — in each state gets all of that state’s electors.

That’s bad news for the other candidates in the race: even if they lose the popular vote by a single vote, they walk away from that state empty-handed.

So looking again at California: if Hillary Clinton wins the popular vote, she gets all 55 electors, leaving Donald Trump with none.

And that’s why California and other very populous states like New York, Texas, and Florida are political jackpots: they have so many delicious electors for the taking.

As if this wasn’t complicated enough, there are actually two states that follow different rules. Maine and Nebraska use a district-based system, in which two electors are chosen by statewide popular vote and the remaining electors are decided by popular vote within each congressional district.

Why is 270 the magic number to win the race?

There are 538 electors nationwide, and to win the presidency, a candidate needs just over half – or 270 of them. So if you win a state like California (even if you win it by a single measly vote), you’ve just secured about 20 percent of the votes you need to be sitting pretty in the White House come January.
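To see how winner-take-all and the 270 threshold fit together, here’s a bare-bones sketch. The candidates and popular-vote counts are invented for illustration (the elector counts are the real 2016 figures for these three states), and Maine’s and Nebraska’s district-level rules are ignored for simplicity.

```python
# Hypothetical statewide popular-vote results: state -> {candidate: votes}.
results = {
    "California": {"A": 8_000_000, "B": 4_500_000},
    "Texas":      {"A": 3_900_000, "B": 4_700_000},
    "Florida":    {"A": 4_200_000, "B": 4_100_000},  # a slim win takes all 29
}
electors = {"California": 55, "Texas": 38, "Florida": 29}

totals = {}
for state, votes in results.items():
    winner = max(votes, key=votes.get)  # plurality winner takes every elector
    totals[winner] = totals.get(winner, 0) + electors[state]

print(totals)            # {'A': 84, 'B': 38}
needed = 538 // 2 + 1    # 270 electoral votes to win
for candidate, ev in sorted(totals.items()):
    print(candidate, "reaches 270" if ev >= needed else "falls short (so far)")
```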

Conversely, presidential candidates on the campaign trail generally don’t spend too much time in relatively underpopulated states like the Dakotas, where electors are scarce. You also probably won’t find them campaigning too much in big but generally politically predictable states like Democratic-leaning California or Republican-leaning Texas. It’s the big swing states (a.k.a. the battleground states) – like Florida, Ohio, Pennsylvania, Wisconsin and Virginia – where they’ll likely be spending most of their time as the election nears. These are the states that are still up for grabs and chock full of electors; the ones that usually decide the election.

270 To Win provides good interactive maps allowing users to simulate different outcomes. It also shows state-by-state breakdowns and results from previous presidential elections.

And what if neither candidate gets to 270 electoral votes?

The chances of this happening are incredibly slim, but if it did, the House of Representatives would elect the next president from a pool of the three candidates who received the most electoral votes. Each state delegation has one vote. The Senate would then elect the vice president, with each senator casting a vote. This has happened only once under the current system: in the 1824 presidential election, Andrew Jackson won the most popular votes and led the pack in electoral votes. But because it was a competitive race among four candidates, Jackson fell short of winning the requisite electoral majority. The House decided the outcome, and ultimately elected Jackson’s rival John Quincy Adams.

When do electors cast their official votes?

Oddly, it’s not until about a month after Election Day. On the Monday following the second Wednesday of December (stay with me here), each state’s electors meet in their respective state capitals and cast their votes — one for president and one for vice president. This event never really gets a whole lot of attention because everyone already knows that those electors are almost certainly going to vote for the candidate in their own party. The results are announced the first week in January and the president is sworn in two weeks later.

Technically, electors can change their minds, but it’s only happened a handful of times (these electors are labeled “faithless”).

This is really confusing! How about a real example?

Sure. Let’s look back at the historic 2008 election, when Democrat Barack Obama handily defeated Republican John McCain. First off, in terms of electoral votes, Obama pretty much killed it – he ended up with more than twice what John McCain had: 365 compared to 173.

But Obama won the popular vote by fewer than 10 million votes. Why the lopsided electoral result? Because he was able to squeak out wins in the big critical swing states (namely Ohio, Pennsylvania and Florida), amassing all of those electoral votes.

What happened in Florida is actually a great example of just how peculiar our electoral system can be:

The Sunshine State is the quintessential mother-lode swing state; always unpredictable and worth a big chunk of electoral votes. In 2008, Obama won it by a margin of less than three percent (he got about 51 percent to McCain’s 48 percent). We’re talking about a victory of less than 300,000 votes. But because of Florida’s winner-take-all rule, Obama’s slim victory secured him all 27 of the state’s electoral votes (leaving McCain with squat). So depending on how you look at it, you could technically argue that the votes cast by the more than 4 million Floridians who voted for McCain didn’t really end up counting for much at all.

Can a candidate win the presidency without winning the popular vote?

Indeed! This has actually occurred four different times: in 1876 and 1888, Rutherford B. Hayes and Benjamin Harrison, respectively, won the White House even though they lost the popular vote (but won the electoral vote). And then there was that strange 1824 election, decided by the U.S. House of Representatives, which handed the presidency to John Quincy Adams over Andrew Jackson.

And finally, there was the infamous 2000 election, ultimately decided by the Supreme Court, in which Al Gore won more popular votes than George W. Bush, but came up short on electoral votes (following a controversial Florida recount). Guess who then became a staunch advocate for getting rid of the Electoral College?

Why did the Founding Fathers come up with such a zany system?

Two main reasons:

a) They wanted to steer clear of the British parliamentary model, in which the chief executive (prime minister) is chosen by elected representatives of the majority party. The founders thought that it was more democratic to appoint electors from each state than to have a system in which the president was elected by Congress.

b) It came down to an issue of old-school logistics: Back in the day, long distance communication and travel was, to put it mildly, a challenge. Voting for delegates at a local level was easier and less susceptible to tampering and corruption than was counting every last person’s vote across the whole country.

What are arguments for keeping the Electoral College?

It’s intended to make candidates pay at least some attention to less-populated states and rural regions (whose electors can add up) rather than focusing entirely on voter-rich urban centers.

It avoids the need for a nationwide recount in the event of a very close race.

It’s consistent with America’s representative system of government and it’s in our Constitution, so just leave it be!

And how about against?

Under our current electoral system, not all votes are equal; voters in swing states and less populous states have disproportionate power. In a direct popular vote, everyone’s vote would have the same weight regardless of geography.

It encourages candidates to focus their campaigns largely in swing states while often ignoring the millions of voters in more populous states that tend to predictably favor one party.

It’s a super outdated system that makes it possible for a candidate to win more votes but still lose the election.

VIDEO: What’s the Actual Job Description of The President of the United States?

For what is arguably the most powerful job in the world, the position of President of the United States comes with a surprisingly brief description and strikingly few prerequisites.

And amid all the hoopla of the presidential election, with so much attention focused on the latest soundbites, gaffes and most recent polling numbers, it’s easy to lose sight of what the president is actually supposed to do.

The basics of the job are spelled out in Article II of the U.S. Constitution. To start, you must be at least 35 years old, a natural-born citizen and have lived in the United States for a minimum of 14 years. The term is four years, with a chance for another go-around if things work out. And yes, it comes with pretty decent compensation (current salary: $400,000/year).

The Constitution goes on to identify the president’s two main roles: as chief executive of the federal government and commander-in-chief of the armed forces, with the authority to send troops into combat.

As head of the Executive Branch of the federal government, he (or she) enforces U.S. laws, treaties and court rulings; has the power to make treaties (with two-thirds support of the Senate); can grant reprieves and pardons; and appoints federal officials, including Supreme Court justices. The president is also charged with signing legislation from Congress into law, or vetoing it.

And that’s about all that is set in stone.

Of course, the modern presidency has evolved just a bit from how it was in the early days of the republic. As Joel Achenbach of the Washington Post notes, early presidents had almost no staff (let alone indoor plumbing). It wasn’t until 1857 that Congress appropriated funding for the president to even hire his own secretary.

But the nature of the presidency changed dramatically in the 1930s under Franklin D. Roosevelt and the implementation of the New Deal programs. The vastly expanded bureaucracy turned the position into a far more complicated job, requiring a formidable support staff. Today, about 3,000 staffers work in the president’s office, not including the 15 departments run by cabinet secretaries and other agencies (CIA, NASA, etc.) that are part of the Executive Branch.

So sure, the job comes with a huge amount of power and responsibility. But can a president really do all the myriad things that the current candidates are pledging to make happen if elected?

Probably not.

But as these two Crash Course videos explain, the presidency comes with both formal and informal powers, and that leaves a good part of the job open to interpretation.

A History of Arguing: The Greatest Hits (and misses) of U.S. Presidential Debates

The first presidential debate between Donald Trump and Hillary Clinton takes place Monday night at Hofstra University in New York. It’s a highly anticipated face-off between two very different candidates who have talked a lot of trash about each other but have yet to engage in direct combat.

Some observers predict this debate will shatter viewership records, as millions of Americans tune in to see how these two strong but strikingly divergent personalities measure up against each other when questioned on some of the most pressing national issues. It also marks the first U.S. presidential debate in history to include a woman.

The first of three debates (the other two are on Oct. 9 and Oct. 19), the showdown will be moderated by NBC News anchor Lester Holt, who is expected to divide the 90-minute session into three different topical segments: the Direction of America, Achieving Prosperity and Securing America.

The first ever televised presidential debate didn’t happen until 1960. Candidates Richard Nixon and John F. Kennedy squared off — just once — in front of the camera, an event that proved extremely beneficial to the smoother and more youthful Kennedy, who went on to win the election against his stodgier opponent. It took another 16 years for the next presidential debate to happen. In the 1976 election, when President Gerald Ford faced off against his Democratic challenger Jimmy Carter, he made a notorious gaffe about the Soviet Union, an oversight that proved detrimental to his campaign.

Ever since, debates have become a fixture of U.S. presidential elections. It’s now standard protocol for candidates to face off three times in the grueling months leading up to Election Day, providing Americans with a rare, unscripted glimpse of their personalities and how they handle themselves under pressure.

How 9/11 Changed America: Four Major Lasting Impacts (with Lesson Plan)
Fifteen years ago, the United States wasn’t officially engaged in any wars. Few of us had ever heard of al-Qaeda or Osama bin Laden, and ISIS didn’t even exist.

We deported half the number of people we do today. Our surveillance state was a fraction of its current size. And — perhaps hardest to believe — we didn’t have to take off our shoes to get through airport security.

America’s involvement in the War on Terror — prompted by the 9/11 terrorist attacks — resulted in a dramatic change in our nation’s attitudes and concerns about safety and vigilance.

It ushered in a new generation of policies like the USA Patriot Act, prioritizing national security and defense, often at the expense of civil liberties.

These changes continue to have ripple effects across the globe, particularly in the Middle East, where American-led military operations helped foment rebellions and ongoing warfare throughout the region.

Below are four of the many dramatic impacts — nationwide and in California — resulting from the events of that tragic day.

I. Ongoing wars

Less than a month after 9/11, U.S. troops invaded Afghanistan in an attempt to dismantle al-Qaeda — the terrorist group that claimed responsibility for the attacks — and remove the Taliban government harboring the group. Two years later, in March 2003, the United States invaded Iraq and deposed President Saddam Hussein. Although not directly linked to the terrorist attacks, Hussein was suspected of producing weapons of mass destruction (none were ever found), and the invasion was a key part of America’s newly launched War on Terror, under the leadership of President George W. Bush.

Our military involvement in Afghanistan turned into the longest-running war in U.S. history. And although formal U.S. combat operations ended in late 2014, more than 8,000 U.S. troops are still there in an effort to help stem the ongoing Taliban insurgency.

In December 2011, remaining U.S. troops were pulled out of Iraq, leaving that nation in a far more volatile state than when military operations first began in 2003. And as the Islamic extremist group ISIS — which sprouted from the chaos of war — continues to terrorize the region, the U.S. has resumed intermittent air strikes.

In 2002, the Bush Administration also opened the Guantanamo Bay detention center in Cuba, where it began sending suspected enemy combatants. Held indefinitely, prisoners were denied access to trials or legal representation, and subject to brutal interrogation techniques. By 2003, there were more than 650 foreign inmates at the facility. Critics have long pushed to shut Guantanamo down, calling it a gross violation of basic human rights and a stain on America’s image abroad. And although Obama vowed early in his first term to close it — and has significantly reduced the population (there were only 61 inmates as of August) — Guantanamo still remains open, a result of intense partisan gridlock.

After 9/11, budgets for defense-related agencies skyrocketed: Homeland Security’s discretionary budget jumped from about $16 billion in 2002 to more than $43 billion in 2011. Meanwhile, the budgets of the Coast Guard, Transportation Security Administration and Border Patrol have all more than doubled since 2001.

In the last 15 years, millions of young U.S. soldiers have been deployed overseas, thousands have been killed, and many have returned home with debilitating physical and mental injuries.

According to the U.S. Substance Abuse and Mental Health Services Administration, roughly 3.1 million Americans entered military service between 2001 and 2011, and nearly 2 million were deployed to Afghanistan or Iraq. In that time, more than 6,000 American troops have been killed, and roughly 44,000 wounded. Of returning service members, more than 18 percent have post-traumatic stress disorder (PTSD) or depression, and almost 20 percent have reported suffering from the effects of traumatic brain injury (TBI).

California impact

California is second only to Texas in its contribution of recruits to the U.S. military. As of 2009, the U.S. Census reported roughly 118,000 active California service members. When you multiply that by the number of families and friends those soldiers left at home, the significance of the statewide impact becomes clear. In 2010 alone, nearly 6,000 military recruits were from California, according to the National Priorities Project.

The LA Times reports that as of August 25, 2014, 749 California service members from every corner of the state had been killed in Iraq and Afghanistan.

II. Immigration and deportation

The Bush Administration created the Department of Homeland Security in 2002, a cabinet-level office that merged 22 government agencies. The Immigration and Naturalization Service and the U.S. Customs Service — both formerly part of the Department of Justice — were consolidated into the newly formed U.S. Immigration and Customs Enforcement (ICE). The agency has overseen a massive increase in deportations, which have nearly doubled since 9/11.

According to the Department of Homeland Security’s Yearbook of Immigration Statistics, there were roughly 200,000 annual deportations a year between 1999 and 2001. While that number dropped slightly in 2002, it began to steadily climb the following year. In the first two years of the Obama Administration (2009 – 2010), deportations hit a record high: nearly 400,000 annually. About half of those deported in that period were convicted of a criminal offense, although mostly for low-level, non-violent crimes.

The Secure Communities program, established in 2008 and officially phased out in 2014, allowed local law enforcement to check the immigration status of every person booked in a county or local jail — even if not ultimately convicted of a crime — by comparing fingerprints against federal immigration records. The program resulted in numerous cases of undocumented immigrants entering deportation proceedings after being stopped for minor infractions (like not using a turn signal while driving).

By 2014, when Obama announced plans to phase the program out, ICE had established Secure Communities partnerships with every single law enforcement jurisdiction in the nation (all 3,181 of them).

California impact

In 2009, Jerry Brown — then California’s Attorney General — agreed to implement the Secure Communities program throughout the state. As of 2012, ICE reported it had taken nearly 48,000 “convicted criminal aliens” in California into custody. Almost half of them were deported, even though less than a quarter had been convicted of offenses considered “serious or violent.”

California is the primary destination for foreign nationals entering the country, and home to a quarter of America’s immigrant population. Of the nearly 10 million immigrants (both naturalized and undocumented) residing in the state, an estimated 4.3 million are Mexican, 28 percent of whom are naturalized, according to the Public Policy Institute of California.

III. The friendly-ish skies

Long airport lines, full body scans, the occasional pat-down (for the lucky ones). It’s all par for the course, nowadays, when you fly. But not so long ago, it wasn’t unusual to show up at the airport a half-hour before a domestic flight, keep your shoes tied tight, and skip through the metal detector while sipping a Big Gulp, all without ever having to show an ID.

Before the advent of color-coded security threat warnings, pat-downs were very uncommon, liquids were allowed, and the notion of having to go through full-body scanners was the stuff of science fiction. Heck, prior to 9/11, some airport security teams even allowed passengers to take box cutters aboard (the supposed weapon used by the 9/11 hijackers). Any knife with a blade up to four inches long was permitted. And cigarette lighters? No problem!

In the wake of the terrorist attacks, airport security underwent a series of major overhauls. And a service that was once largely provided by private companies is now primarily overseen by the Transportation Security Administration.

Created in the wake of the 9/11 attacks, the TSA is tasked with instituting new security procedures and managing screenings at every commercial airport checkpoint in the country (although private contractors still operate at some airports). It marks the single largest federal start-up since the days of World War II. The agency is also authorized to check passengers against watch lists of individuals who could pose flight safety risks.

Although advocates argue that the changes have made air travel vastly safer, the additional security steps have also tacked on a significant amount of travel time for the average passenger, while infringing on privacy rights and, in many instances, increasing scrutiny of minority travelers, particularly those of Middle Eastern descent.

IV. Big surveillance

The U.S. intelligence state boomed in the wake of 9/11. The growth resulted in a marked increase in government surveillance, primarily through a vast, clandestine network of phone and web monitoring.

Classified documents that were leaked last year by former government contractor Edward Snowden detail the expansion of a colossal surveillance state that has seeped into the lives of millions of Americans. The exponential growth of this apparatus — armed with a $52.6 billion budget in 2013 — was brought to light last year when the Washington Post obtained a “black budget” report from Snowden, detailing the bureaucratic and operational landscape of the 16 spy agencies and more than 107,000 employees that now make up the U.S. intelligence community.

Further audits reveal that the National Security Agency alone has annually scooped up as many as 56,000 emails and other communications by Americans with no connection to terrorism, and violated privacy laws thousands of times per year since 2008.

In fact, the last glorious taste of summertime that Labor Day’s come to represent masks the often overlooked turbulent history that led to its establishment in the first place.

In the late 19th Century, America was undergoing a period of rapid industrialization. Many of the nation’s urban centers were bursting at the seams, attracting a flood of poor immigrants desperate for work and vulnerable to exploitation. Growing labor unrest led to a string of major strikes and protests, with workers demanding higher pay, safer work conditions and the right to unionize.

These demonstrations prompted violent clashes with police and private security forces. The unrest led to major improvements for millions of workers, prompting an era of new labor regulations that included the establishment of an 8-hour workday and child labor laws. The reforms also gave rise to a prolonged period of burgeoning union membership, increased wages and a notable rise in the ranks of America’s middle class. It’s a trend that continued until the 1970s, when the power of unions and the size of the middle class both began to decline.

Labor Day became an official national holiday in 1894 in the aftermath of the notorious Pullman Strike, among the largest in U.S. history.

Soon after ordering federal authorities to quell the unrest (which left a number of strikers dead), President Grover Cleveland made Labor Day an official holiday, a conciliatory nod to the nation’s working class. But eager to distinguish the holiday from the more radical roots of May Day — an internationally recognized workers’ day — Cleveland pushed for an apolitical September date.

These two short videos (above and below) provide a good, brief overview of those origins.

So brace yourself for that last inescapable flood of campaign commercials.

But it wasn’t always like this.

In 1948, less than one percent of U.S. homes had TVs. By 1954 – a mere six years later – more than half of all American homes had a boob tube in the house. By 1958, that rate had soared to over 80 percent, and today it hovers at about 97 percent.

That’s according to University of Wisconsin Journalism Professor James L. Baughman, who documents the rapid rise of TV in American life. “No other household technology,” he writes, “not the telephone or indoor plumbing, had ever spread so rapidly into so many homes.”

It didn’t take political campaigns long to catch on to the enormous potential this new technology offered: a green light to instantly infiltrate the living rooms of millions of Americans, more directly, personally, and visually than ever before.

The very first televised campaign ads were launched in the 1952 presidential race. Leading the charge was Republican candidate Dwight D. Eisenhower (and his running mate Richard Nixon). The campaign spent roughly $1.5 million on ads, twice that of Democratic opponent Adlai Stevenson. The first series of spot ads, called “Eisenhower Answers America,” featured a seemingly average citizen asking a laughably scripted, leading question, to which Eisenhower frankly responded, staring directly into the camera, utterly devoid of emotion or charisma. The campaign soon followed up with the now legendary “I Like Ike” animation, which gave the candidate a major edge in the race.

The Living Room Candidate, a project of the Museum of the Moving Image, is an impressively thorough and well curated repository of presidential campaign ads in every election since 1952. Here are 10 of the heaviest hitters (note the wide variations between negative/fear-inducing and euphorically positive):

Dwight D. Eisenhower’s “Ike for President” (1952)

This seemingly quaint commercial helped Eisenhower trounce his Democratic opponent Adlai Stevenson. The first Republican to win the White House in 20 years, Eisenhower got 83 percent of the electoral vote.

John F. Kennedy’s “Kennedy For Me” (1960)

At 43, John F. Kennedy was to become the youngest person ever elected president. Attacked by his opponent Richard Nixon as inexperienced, this jingle ad helped turn Kennedy’s youth into an asset, casting him as someone “old enough to know and young enough to do.”

Kennedy won with 56 percent of the electoral vote.

Lyndon B. Johnson’s “Daisy Girl” (1964)

Part of Lyndon B. Johnson’s 1964 re-election bid, in the midst of the Cold War, this ad is among the most famous campaign commercials of all time. It ran only once, during an NBC broadcast of Monday Night at the Movies on September 7, 1964. But that was enough to scare the pants off an already skittish electorate, by painting his Republican opponent, Barry Goldwater, as a dangerous right-wing extremist who’d bring the world to the brink of disaster.

Johnson won the election with 90 percent of the electoral vote.

Hubert H. Humphrey’s “Laughter” (1968)

Although Democrat Hubert H. Humphrey ended up losing the 1968 presidential race to Richard Nixon, this ad still packed a punch by portraying Spiro Agnew, Nixon’s relatively unknown running mate, as a political neophyte, so inexperienced as to be, literally, laughable. The ad was created by Tony Schwartz, who also made Johnson’s “Daisy” ad.

Nixon still beat Humphrey, with nearly 56 percent of the electoral vote.

Richard Nixon’s “McGovern Defense” (1972)

This ad aired during Richard Nixon’s re-election bid against Democratic challenger George McGovern in 1972. With the U.S. military still deeply engaged in the Vietnam War, Nixon’s campaign sought to portray Democrats as weak on national defense, with policies that would place the nation in peril. It was sponsored by a group called “Democrats for Nixon.”

Nixon won with a whopping 97 percent of the electoral vote.

Ronald Reagan’s “Morning In America” (1984)

Part of a series of 1984 Reagan campaign ads collectively known as “Morning in America,” this commercial effectively highlights idyllic scenes of productivity and suburban life to suggest that President Reagan had successfully restored American optimism and revived the economy from the prolonged period of high inflation and unemployment that had persisted under his Democratic predecessor Jimmy Carter. The ads helped Reagan handily defeat his Democratic opponent Walter Mondale, with 98 percent of the electoral vote.

George H.W. Bush’s “Revolving Door” (1988)

This crushing ad attacked a program that Democratic presidential challenger Michael Dukakis had supported while he was governor of Massachusetts, allowing prisoners to be released on weekend furloughs. The ad capitalized on the case of Willie Horton, a Massachusetts state prison inmate and one of the program’s participants, who in 1987 raped a woman while on weekend furlough. The black-and-white ad successfully cast doubt on Dukakis’ ability to govern and protect public safety, striking a major blow to his campaign.

Bush won with 80 percent of the electoral vote.

Bill Clinton’s “Man From Hope” (1992)

An edited-down version of a much longer biographical film shown at the 1992 Democratic Convention, this commercial is widely considered among the most compelling biographical ads ever made. Emphasizing Clinton’s small-town roots, it conveys the candidate’s strong work ethic, wisdom and sense of humanity.

George W. Bush’s “Windsurfing” (2004)

The most effective and memorable ad of the 2004 election, this spot drove home the Bush campaign’s consistent allegation that Democratic challenger John Kerry was a “flip-flopper” who merely tailed the political winds.

Bush won with 53 percent of the electoral vote.

Barack Obama’s “Yes We Can” Web Ad (2008)

Among the most unconventional campaign ads to date, this was only available on the web. Produced by Will.i.am of The Black Eyed Peas and Jesse Dylan (Bob Dylan’s son), the ad put music to Obama’s New Hampshire primary concession speech (after he lost the state to Hillary Clinton). It features a succession of over 30 celebrity performers singing his words. First posted on YouTube, the video quickly went viral, with over 26 million views in just a few days. It led to an online fundraising boom and a new wave of momentum for Obama’s campaign.

Obama beat his Republican challenger John McCain with 68 percent of the electoral vote.

Lessons from the Police Shootings and Urban Race Riots of the 1960s

In the first week of July alone, two black men were killed by police officers — in Baton Rouge, La., and suburban St. Paul, Minn. — after being stopped for minor infractions. Then came the killings of five police officers in Dallas and three in Baton Rouge. The violence reignited intense, racially charged protests and public debate, underscoring the degree of mistrust between many local police forces and the communities they’re intended to serve.

It’s a major issue that’s come up frequently in the two years since a white police officer in Ferguson, Mo. killed Michael Brown, an unarmed black man, an incident that helped spark the nationwide Black Lives Matter movement.

But this certainly isn’t the only time America has grappled with the specters of racism, violence and anger surrounding the role of law enforcement in communities of color. We’ve unfortunately been here many times before.

The summer of 1967 is one of the most notorious examples.

More than 100 inner-city, largely black communities were rocked by violent uprisings in what became known as the “long, hot summer.” The incidents, which mostly occurred in East Coast and Midwestern cities — from Milwaukee to Buffalo, Tampa to Cincinnati — resulted in more than 100 deaths, millions in property damage and scores of burned-out neighborhoods that never fully recovered.

These riots were a symptom of a larger problem: a deep-seated anger and hopelessness simmering in many poor, black urban communities suffering from disproportionately high rates of poverty, joblessness and crime and a notable lack of power and representation in largely white local governments.

But each incident of unrest was generally sparked by a single police action: a local incident involving a black man beaten or killed by white police officers over a seemingly minor infraction.

Two of the most devastating riots happened nearly back-to-back that July.

Summer of Rage: Newark and Detroit

In Newark, N.J., two white police officers severely beat a black cab driver after stopping him for a minor traffic violation. As word of the incident spread, thousands rioted in the streets, looting businesses and prompting the deployment of several thousand police officers and National Guardsmen. The violence raged for six days, leaving 26 people dead, scores more injured and tens of millions of dollars in property damage.

Just two weeks later in Detroit, a police raid on an unlicensed bar sparked an even more devastating riot. Looters raided shops and set buildings on fire, and panic ensued amid reports of snipers on rooftops. President Lyndon Johnson sent in thousands of U.S. Army paratroopers to quell the unrest. At the end of five chaotic days, 43 people were dead and some 7,000 arrested. Large swaths of Detroit’s inner city were left in ruins, with hundreds of buildings damaged or destroyed. As in Newark, most of the dead were black men killed by white police officers and soldiers.

Newark and Detroit were not isolated incidents. Two years before, a confrontation between a young black man and a police officer in the Watts neighborhood of Los Angeles resulted in days of rioting that left 34 people dead. Violent unrest continued in 1966 in poor sections of cities like Chicago, Cleveland, New York and San Francisco.

As wealthier, largely white communities increasingly flocked to the suburbs, the remaining inner-city communities were often thrust into a deeper state of prolonged economic isolation. Over the following decades, jobs and home values in these areas continued to drop sharply.

The Kerner Commission

In the wake of these riots, President Johnson established a task force: the National Advisory Commission on Civil Disorders, known as the Kerner Commission after its chair, Illinois Governor Otto Kerner. The group was tasked with addressing three major questions:

“What happened? Why did it happen? What can be done to prevent it happening again?”

Rioting in Detroit. (Courtesy of Detroit Free Press)

Over the next six months, members of the commission visited inner city neighborhoods throughout the country, interviewing residents, police officers, and local officials. They drew on the research of social scientists and analyzed media coverage of the recent violence.

The 11-member bipartisan commission was not politically radical in any sense of the word: It included four members of Congress, the mayor of New York, Atlanta’s police chief, and union and industry representatives. Only two members were black.

Nevertheless, the commission’s final report was blunt, and to many Americans, shocking:

“This is our basic conclusion: Our nation is moving toward two societies, one black, one white — separate and unequal. White racism is essentially responsible for the explosive mixture which has been accumulating in our cities.”

The report’s direct reference to white racism as a root cause of the riots was particularly controversial and groundbreaking.

“We used the word racism. And on the commission, we had two or three people say, ‘Should we use that word, racism?'” former Senator Fred Harris (D-Okla.), who served on the commission, told Bill Moyers in 2008.

“We felt that was very important, I did, and I think it was, to say it. Because what we know is that oppressed people often come to believe about themselves the same bad stereotypes that the dominant society has. Our saying racism, I think, was very important to a lot of black people who said, ‘Well, maybe it’s not just me. Maybe I’m not, by myself, at fault here. Maybe there’s something else going on.’”

The report elaborated on the often explosive relationship between local police forces and the black communities they patrolled:

The police are not merely a “spark” factor. To some Negroes police have come to symbolize white power, white racism and white repression. And the fact is that many police do reflect and express these white attitudes. The atmosphere of hostility and cynicism is reinforced by a widespread belief among Negroes in the existence of police brutality and in a “double standard” of justice and protection—one for Negroes and one for whites.

At the time, many observers believed that the unrest was the work of “outside agitators,” radical groups intent on sowing chaos and disorder. But the Kerner Commission found no evidence of conspiracy or premeditated plans. Although it stopped short of labeling the riots a flat-out rebellion against racial oppression, it underscored that the conflicts were an indication of the deep frustration stemming from a host of social problems afflicting inner-city communities of color.

Topping that list were police brutality, unemployment, and inadequate affordable housing. The commission stated, in no uncertain terms, that white America was directly implicated in creating these problems:

“What white Americans have never fully understood—but what the Negro can never forget—is that white society is deeply implicated in the ghetto. White institutions created it, white institutions maintain it, and white society condones it.”

Its long list of sweeping policy recommendations included creating two million new jobs and six million new affordable housing units.

The 426-page report, published in March 1968, sold over two million copies and earned a spot on the nonfiction bestseller list of the New York Times, which called it a “stinging indictment of white society.”

And then, it all but disappeared.

The Johnson Administration thought the commission hadn’t given it enough credit for past civil rights legislation, and President Johnson later refused to meet with commissioners or support further research.

The report noted that in order to improve conditions, “hard choices must be made, and, if necessary, new taxes enacted.” But there was little political will to do so, particularly as the nation sank deeper into the incredibly costly conflict in Vietnam.

Less than a month after its publication, Martin Luther King, Jr.’s assassination sparked more violent riots in poor, urban communities nationwide.

From Kerner to Ferguson

After the Michael Brown shooting in summer 2014 and the unrest that followed, a new commission, appointed by Missouri Gov. Jay Nixon, was formed to study a similar issue. The group was tasked with identifying the underlying causes of the unrest. Its final report, while much smaller in scope, bears some resemblance to the Kerner findings. Its series of fairly modest recommendations included:

Reducing the use of force by police officers

Reforming sentencing laws

Improving the health and education of children and young people

Increasing access to affordable housing and public transit

Expanding Medicaid

Like the Kerner report, the Ferguson analysis identifies racial inequality as the primary problem. But the language and tone are strikingly different: far less piercing, accusatory and urgent.

“We are not pointing fingers and calling individual people racist,” the report states. “We are not even suggesting that institutions or existing systems intend to be racist.”

The original members of the Kerner Commission may have foreseen this. They concluded their report by quoting the testimony of psychologist Kenneth Clark, whose famous doll tests were cited in Brown v. Board of Education. Clark reminded his audience of the many previous commissions assembled to study incidents of racial unrest: Chicago in 1919, Harlem in 1935 and 1943, Los Angeles in 1965. Testifying before the Kerner Commission, he said, was a kind of Alice in Wonderland experience: the same images flickering past, the same analysis, the same recommendations, and, in the end, the same inaction. The commissioners quoted his words:

“It is time now to end the destruction and the violence.”

The Chilling Effect: Why San Francisco Gets So Foggy in the Summer [Interactive]

“The coldest winter I ever spent was a summer in San Francisco.”

OK, so Mark Twain may never have actually said it himself. But the statement stands regardless.

As any naive tourist shivering miserably in a tank top and Bermuda shorts might attest, San Francisco in the summertime can be one chilly town, especially in early summer.

Welcome to the infamous “June Gloom.”

Even on days when the temperature in nearby cities climbs into the glorious 90s (F), it’s not uncommon to find much of San Francisco shrouded in a thick blanket of bone-chilling fog.

If you’re still a bit foggy (sorry, couldn’t resist) about why that is, scroll through this interactive explainer created by Newsbound.

For a bit more clarity (about the fog, that is), watch this beautiful time-lapse film by Simon Christen, and below that, a short video by KQED’s Quest that digs deeper into the science of coastal fog.

History of May Day, Explained

In about 80 countries around the world, May 1, or May Day, is an official labor holiday, marked by worker demonstrations and rallies.

But you wouldn’t know it in much of the United States, where union membership has fallen to its lowest point in nearly 70 years and May Day’s significance is all but forgotten (although in recent years, it’s become a day of immigrant rights rallies).

And that’s a bit odd, given that International Workers Day, as it’s alternately known, is a major milestone in our nation’s turbulent labor history.

Gilded Age tensions

During the Gilded Age, which stretched from the end of the Civil War to the turn of the 20th century, America went through a period of dramatic economic growth and industrialization. It resulted in a huge concentration of wealth and a rapidly growing gap between capital (broadly defined as the stockholders, executives and managers who controlled the means of production) and the wage-earning labor force who worked the production lines.

Industrial capitalism yielded larger workplaces, greater use of technology, and a division of the manufacturing process that required less skill and training (sound familiar?). It also posed a direct threat to the individual laborer, who risked becoming an increasingly cheap and replaceable cog in a vastly expanding machine.

The labor movement

This was a period of boom and bust. Intermittent economic slowdowns led to waves of widespread unemployment and growing discontent, particularly among the new wave of European immigrants who poured into cities in desperate search of work.

When work was available, it was often less than desirable. In the absence of strong federal labor laws, immigrant laborers commonly worked excessively long hours in wretched, dangerous conditions, typically for meager wages.

In response, a convention of the Federation of Organized Trades and Labor Unions called for a national strike on May 1, 1886. The primary demand: an eight-hour workday.

The convention declared:

“Eight hours shall constitute a legal day’s labour from and after May 1, 1886, and that we recommend to labour organizations throughout this jurisdiction that they so direct their laws as to conform to this resolution by the time named.”

Hundreds of thousands of workers in cities across the country participated in the strike, including roughly 80,000 workers in Chicago.

With a booming population fueled by an influx of German immigrants (the city grew from about 300,000 in 1870 to 1.7 million in 1900), Chicago became a hotbed of radical labor activism.

The Haymarket Affair

Two days after the demonstrations, police and strikers clashed outside Chicago’s McCormick Reaper Works, leaving two workers dead. In response, a group of anarchist labor leaders organized a rally the following evening in Chicago’s Haymarket Square.

The event attracted a large crowd and proceeded peacefully until police arrived and ordered the remaining workers to disperse. As the officers advanced on the crowd, a homemade bomb was thrown, and in the melee that ensued, seven policemen were killed (mostly as a result of friendly fire). Police fired on the crowd, killing at least four demonstrators and injuring scores more.

A number of subsequent organizing efforts were violently suppressed by authorities. In a desperate attempt to identify the perpetrators of the Haymarket incident, Chicago authorities arrested and convicted eight local labor leaders, despite lacking any concrete evidence of their involvement. Four were hanged, one committed suicide, and three were pardoned six years later by the governor of Illinois.

The real bomber was never revealed.

The seven anarchists initially sentenced to death for the murder of a police officer during the Haymarket incident (Wikimedia Commons)

Although the Haymarket Affair, as the incident became known, marked a temporary setback for the labor movement, it also spurred a fresh wave of activism around the world, particularly among younger workers, and membership in labor organizations grew rapidly.

The first May Day

Responding to ongoing pressure for an eight-hour day, the American Federation of Labor (AFL) resumed its campaign, planning a general strike for May 1, 1890. AFL president Samuel Gompers enlisted the support of European socialist labor leaders in planning an international day of action to demand a universal eight-hour day. Workers in countries throughout Europe and America rallied in the streets.

The New York World’s front page the next day was devoted entirely to the event. The headlines proclaimed:

“Parade of Jubilant Workingmen in All the Trade Centers of the Civilized World … Everywhere the Workmen Join in Demands for a Normal Day”

The Times of London listed 24 European cities where demonstrations had occurred. It also noted events in Cuba, Peru and Chile. Commemoration of May Day became an annual event, as workers in a growing number of nations participated each year. In many nations — especially those with socialist or former-socialist governments — it still retains strong political significance.

May Day’s decline in America

In 1894, riots erupted during the Pullman Strike near Chicago, in which workers were killed by federal authorities sent in to quell the strike. The incident drew national attention, and under pressure to appease the increasingly powerful labor movement, Congress unanimously approved rush legislation making Labor Day a national holiday.

But eager to distance the new holiday from May Day’s more radical roots, President Grover Cleveland pushed for a September date. With the official Labor Day celebration intentionally divorced from those origins, America’s observance of May Day gradually faded.

And finally, the eight-hour day

The fight for the eight-hour day in America persisted through the turn of the century, with ongoing, and sometimes violent, strikes and labor demonstrations. Incrementally, though, a number of key industries agreed to shorten hours for their workers. In 1916, Congress passed the Adamson Act, the first federal law to regulate the hours of workers in private companies.

Two decades later, Congress passed the Fair Labor Standards Act, setting the maximum workweek for a wide range of industries at 40 hours. It also required employers in covered professions to pay overtime for additional hours.

So, when you clock out of work at 5 p.m. this week, consider tipping your hat to those May Day labor activists from way back when.

When Rivers Caught Fire: A Brief History of Earth Day

Donning a gas mask, a demonstrator participates in the first Earth Day celebration in 1970. (AP)

Quick quiz:

1. Which labor organization helped fund and organize the first Earth Day celebration?

2. Who made the following statement:

“Restoring nature to its natural state is a cause beyond party and beyond factions … It is a cause of particular concern to young Americans, because they, more than we, will reap the grim consequences of our failure to act on programs which are needed now if we are to prevent disaster later.”

Keep reading for answers.

Rivers on fire

Today, our planet needs all the love it can get. From the increasingly severe impacts of climate change and overpopulation to rapid deforestation and mass species extinction, we’re up against a mounting series of potentially catastrophic environmental crises. But environmental regulation remains staunchly divisive in Washington: passing new rules is often considered too politically risky, efforts to address even the most pressing challenges are consistently thwarted, and the once-celebrated Environmental Protection Agency, created under a Republican administration, is now one of the most reviled agencies in the capital.

For what it’s worth, though, the environmental outlook in the late 1960s wasn’t too rosy either.

After decades of largely unregulated industrial and economic growth in the wake of World War II, the U.S. had managed to majorly muck up its air and water resources. Toxic effluent from factories frequently spilled into streams and rivers. Open spaces were used as dumping grounds. DDT and other synthetic chemicals contaminated natural habitats and water supplies. And air pollution from factories and belching cars left many industrial areas shrouded in thick blankets of smog.

Here’s a handful of the environmental catastrophes that occurred in a span of less than three years:

November 1966: In New York City, 168 people die of respiratory-related illnesses over a 3-day period due primarily to horrendously poor air quality.

March 1967: Interior Department Secretary Stewart L. Udall announces the first official list of endangered wildlife species. Among the 78 species is the Bald Eagle, America’s national bird.

January 1969: A blowout at an offshore oil rig near Santa Barbara spills an estimated 3 million gallons of crude oil into the Santa Barbara Channel and onto nearby beaches. The spill lasts for 10 straight days, becoming (at that point) the largest oil spill in American history. Today it ranks third, behind the 1989 Exxon Valdez and 2010 Deepwater Horizon spills.

June 1969: A particularly fetid industrial stretch of the Cuyahoga River running through Cleveland bursts into flames (seriously) when oil-soaked debris in the water is ignited by sparks from a passing train.

A movement begins

As urban unrest and the anti-war movement ignited across the nation, environmental activism had yet to gain a strong foothold.

“If the people really understood that in the lifetime of their children, they’re going to have destroyed the quality of the air and the water all over the world and perhaps made the globe unlivable in a half century, they’d do something about it. But this is not well understood.”

That’s a quote from Senator Gaylord Nelson, a Democrat from Wisconsin, who spearheaded a national day of awareness in the aftermath of these environmental disasters. He later explained the strategy:

“If we could tap into the environmental concerns of the general public and infuse the student anti-war energy into the environmental cause, we could generate a demonstration that would force the issue onto the national political agenda.”

Denis Hayes in the Earth Day campaign office (Courtesy of AP)

In late 1969, Nelson formed a bipartisan congressional steering committee and enlisted Denis Hayes, a 25-year-old Harvard Law School dropout, to coordinate the initiative. Influenced by anti-war campus activism, Hayes sought to organize environmental teach-ins throughout the country that would occur simultaneously on April 22, 1970.

[Interestingly, an independent Earth Day effort had earlier been proposed by peace activist John McConnell during a 1969 UNESCO conference in San Francisco. McConnell reserved the date of March 21, 1970 — the first day of spring — a month prior to Hayes’ event.]

With a limited budget and no email or Internet access, Hayes and a small group of organizers mailed out thousands of appeals, recruiting an army of young volunteers to organize local events in communities and campuses across the country.

On November 30, 1969, the New York Times reported:

“Rising concern about the ‘environmental crisis’ is sweeping the nation’s campuses with an intensity that may be on its way to eclipsing student discontent over the war in Vietnam.”

The first Earth Day

“Lord knows what we thought we were doing,” Hayes later recalled. “It was wild and exciting and out of control and the sort of thing that lets you know you’ve really got something big happening … What we were trying to do was create a brand new public consciousness that would cause the rules of the game to change.”

In the end, an estimated 20 million people participated in that first Earth Day, a name coined by advertising guru Julian Koenig (father of Sarah Koenig of Serial podcast fame).

It marked the single largest demonstration in U.S. history, helped along by its catchy name.

“It was a huge high adrenaline effort that in the end genuinely changed things,” Hayes said. “Before (that), there were people that opposed freeways, people that opposed clear-cutting, or people worried about pesticides, (but) they didn’t think of themselves as having anything in common. After Earth Day they were all part of an environmental movement.”

Hayes’ assertions were affirmed by several national polls showing a rapid rise in the public’s concern about air and water resources. In the Gallup Opinion Index, the percentage of respondents who considered air and water pollution a top national problem rose from 17 percent in 1969 to 53 percent by 1970.

On Earth Day the following year, an independent group launched an anti-litter public service announcement known as the “Crying Indian,” which featured a white actor in a headdress rowing a birch bark canoe and shedding a tear after seeing garbage strewn everywhere. Despite the ad’s culturally questionable premise, it proved enormously popular and is still considered one of the most successful public service announcements in history.

Unexpected allegiances

That brings us back to the first question of the quiz. The group most supportive of the first Earth Day organizing effort — financially and otherwise — was none other than the United Auto Workers.

A labor union not generally associated with championing environmental causes, the UAW donated funds for the event and turned out volunteers across the country.

UAW President Walter Reuther pledged his union’s full support for Earth Day and for subsequent air quality legislation that the auto industry staunchly opposed.

“What good is a dollar an hour more in wages if your neighborhood is burning down?” he said. “What good is another week’s vacation if the lake you used to go to is polluted and you can’t swim in it and the kids can’t play in it?”

Nixon and the golden era of environmental regulation

That statement, the answer to the second quiz question, was made by President Richard Nixon during his 1970 State of the Union address.

Yes, that Nixon, the conservative Republican most commonly remembered for prolonging America’s involvement in Vietnam and resigning in disgrace over the Watergate scandal.

But Nixon also oversaw the most sweeping environmental regulations in America’s history.

Even before the first Earth Day, Congress passed the National Environmental Policy Act, which among other things, required environmental impact statements for major new building projects and developments. Nixon signed it into law on January 1, 1970.

Environmentalism had never been one of Nixon’s major political priorities, but his administration — like the UAW — recognized the shifting political tide, as public outcry and media attention to environmental issues increased.

By the end of 1972, the Clean Water Act (which became law over Nixon’s veto), the Pesticide Control Act (which banned DDT) and the Marine Mammal Protection Act were all on the books. A year later, Nixon also signed the Endangered Species Act, and the Safe Drinking Water Act followed in 1974.

Most of these bills were approved with bipartisan support in Congress, in some instances nearly unanimously.

In a televised speech in 1972, Nixon said:

“We are taking these actions not in some distant future, but now, because we know that it is now or never.”

Environmental conditions in the United States began to slowly improve. That’s not to say there wasn’t strong political opposition or major lingering problems. But for a time, stretching through the Ford and Carter administrations, environmental protection maintained strong bipartisan support. In the last year of his presidency, Carter even installed solar panels on the roof of the White House to promote renewable energy initiatives.

Green honeymoon ends

The economic slowdown in the late 1970s swept in a tide of political change. In 1981, during his first year as president, Ronald Reagan appointed two aggressive defenders of industry to head the EPA and the Department of the Interior. As part of the “Reagan Revolution,” the administration moved rapidly to slash federal budgets, cutting the EPA’s funding by nearly half. Environmental enforcement weakened considerably as large swaths of public land were opened up for mining, drilling, grazing and other private uses. In a famous symbolic act, the solar panels on the White House roof were dismantled during Reagan’s second term.

To be fair, a number of significant environmental policies were advanced during Reagan’s Administration, including the Superfund program to clean up hazardous waste sites, creation of wilderness areas, and the Montreal Protocol, an international agreement to protect the ozone layer by phasing out the production of substances responsible for its depletion.

But the anti-regulatory sentiment established during Reagan’s presidency took root. Efforts to strengthen the nation’s environmental protection laws grew increasingly partisan, a trend that continues today. The stream of regulatory measures approved by Nixon four decades ago would have scant chance of passing today’s Congress.

The benefit of tangible problems

Organizers of the first Earth Day had a key advantage: they were trying to tackle visible, tangible problems that affected people’s daily lives. Rivers and lakes were too polluted for kids to swim in; parks were strewn with trash; people were getting sick from pollution in the air. The evidence was incontrovertible, and it made it a whole lot easier to draw clear connections between quality of life and the urgent need for strong environmental protections.

In contrast, many of today’s major environmental threats, like climate change, while perhaps even more catastrophic in nature, are also more abstract; it’s far more challenging to convey the sense of urgency necessary to mobilize the masses and pressure lawmakers to act. The United States, one of the world’s largest greenhouse gas emitters, never ratified the Kyoto Protocol, an international treaty requiring rapid cuts in emissions that took effect in 2005 and was joined by more than 180 nations. In 2010, Congress failed to pass comprehensive national climate change legislation. And although the U.S. signed on to last year’s landmark international climate accord, pledging to reduce its carbon emissions, many environmental advocates say the deal doesn’t go nearly far enough to prevent the worst impacts of catastrophic climate change.

All of which raises an ominous question:

What degree of disaster will need to happen to incite a new era of environmental change?
