Monday, April 28, 2014

Populist
political leaders disparage the few in order to win over the many. It is a
“divide to conquer” strategy that relies on emotional appeal, rather than
rational debate. Such leaders surround themselves with sycophants, rather than
a “team of rivals.” They make promises without regard to how they might be
fulfilled, and blatantly lie about their opponents. While they claim to speak
for the masses, their concern is for themselves. They are interested in the
here and now. The past and the future have no relevance.

Populists
are always the most dangerous politicians. By giving things away rather than
guiding legislation, they insinuate themselves into the hearts and minds of susceptible
voters. Every dictator, whether from the Left or the Right, has had his or her
roots in populism. We can look at Lenin and Stalin, who strove for equality in
a classless society but along the way murdered perhaps 40 million
people. We can consider Hitler who, in the name of creating a perfect society,
murdered six million Jews. Mao Tse-tung did the same thing in China. A few bankers
may be greedy and some corporate leaders may be corrupt, but every major
campaign against human life has been led by government and almost always under
the pretense of fairness and equality.

Inequality
has become the banner for today’s populists and “fairness” is their goal.
Gillian Tett, writing in last weekend’s Financial Times, noted that
media references to inequality are six times higher this month than in 2005 or
2010. The term “inequality” is expressed simplistically, with little thought as
to its causes, or to history. It is generally thought of in terms of financial
outcomes. Too little attention is paid to the far more important issue of
opportunities. Outcomes are a function of intelligence, diligence, hard work, aspiration
and luck. No matter how we measure it, life is not fair, nor can it ever be.
Why did my sister die of cancer at the age of 58 and not me? Why have some of
my friends become enormously wealthy and not me? Why have my children proved
such a blessing, yet those of some of my friends been such a burden? Why was I
born in this great country when billions of less fortunate were born
impoverished in places like Somalia,
Haiti or Afghanistan? Innumerable
questions, such as these, can be asked with no satisfactory answers.

Yet,
the fact that there are no good answers does not mean the questions should not
be asked. As with Stuart Little, the quest itself is important. We should always seek
ways, individually, of improving our lives, as well as helping those around us.
But we should not be blinded by the expectation that Nirvana will be found.
It is the promise of Utopia that drives the Populist, even as we know from
history that Utopia is likely to become Dystopia. Ask those who lived in Hitler’s
Germany, in Eastern Europe before the Wall came down, in China during the
Cultural Revolution, or lovers of freedom today in North Korea, Cuba, Syria, or
myriad other countries.

The
answer to resolving inequality lies not in promises of Arcadia, but in education, a focus on
equality of opportunity, and less government in our lives. It requires a state
that provides basic freedoms, a state that recognizes that its powers are
limited and a society that functions under the rule of law. It demands respect
for private property. These are all elements essential for civil society and
economic success. It means forgoing dependency, especially on government, and
becoming personally responsible for one’s actions. It means understanding that
effort and results are related, and that there is nothing so valuable as
freedom.

Dependency
on government is an inhibitor to equality, as it is destructive to aspiration,
a vital ingredient to success. It is through success that the bulwarks of
inequality are breached. Similarly, affirmative action, at this time and in
this country, deprives the recipient of a personal sense of accomplishment, a
trait fundamental to self-confidence.

Advocates
for equality of outcomes got a boost recently from French economist Thomas
Piketty’s book, Capital in the Twenty-First Century. He suggests that we
are living in a new Gilded Age where a few people have recently accumulated
enormous wealth. Professor Piketty predicts that this concentration of wealth
will slow economic growth for a century. His answer to what he sees as the
obstruction of concentrated wealth is confiscatory – a global wealth tax. Paul
Krugman, writing in Friday’s New York Times, suggests that conservatives
are “at a loss for coherent arguments” against Professor Piketty’s thesis.
Krugman’s allegation is absurd. His dismissive attitude toward those on the
Right reminds me of advice my father once gave: never argue with a fool, for a
passerby will be unable to tell who the fool is.

David
Brooks, in a dueling column on the same day in the same paper as Krugman, noted
that historically the best way to reduce inequality is to help lift those at
the bottom, rather than squashing those at the top. We should also not forget
that we already have a very progressive tax system. The top 1% of earners in
the U.S.
pay 35% of all taxes, compared to 19% of all taxes paid by the same group in
1980. The top 10% pay 71% of all taxes. Almost half of all American workers pay
no federal income tax. How much more money should the federal government
squeeze out of our most productive citizens, and for what purpose?

But
the biggest problem with a wealth tax is that wealth would move. The concept of
a universal global wealth tax is a dream – some country, or more likely dozens
of countries, would offer safe harbor to the very wealthy. Wealth is fungible.
It can be easily moved. Buildings cannot, but borrowings against assets could.
Regardless, real estate values would decline; investments would not be made. Corporations
would suffer. Unemployment would rise. A global wealth tax would ensure a
world-wide economic slowdown. Warren Buffett and others like him have given
much of their wealth away to myriad charities, though the IRS is never a
beneficiary. It would appear that Mr. Buffett, despite statements to the
contrary, believes that foundations such as the one Bill and Melinda Gates
established will be better stewards of his money than the federal government.

There
is no question that we are living through a period that could be considered a
Gilded Age. There has been a concentrated accumulation of enormous wealth. But
historically such wealth is not uncommon when the world is going through an
economic revolution – a sea change. The same thing happened in the late 19th
and very early 20th Centuries when oil, steel, railroads,
merchandising, automobiles, utilities, farm and mining equipment created
enormous family wealth. The same thing has happened now with the advent of the
internet and communication revolutions, and the explosion of hedge and private
equity funds. The gap between rich and poor was far greater during that earlier
time than today. Even so, much of that wealth was given away. Consider what
Andrew Carnegie did for public libraries, or other families, like the
Rockefellers, Fords and Bessemers did for hospitals, universities and museums. Would
these great institutions have been erected if the money had been left to
government bureaucrats? The same thing is happening today, with billions of
dollars being contributed to myriad eleemosynary organizations.

But
back to today’s populism: Divisive politics, pitting one group against another,
may win elections, but does little to promote the kind of economic growth
needed to mitigate those differences. Government has a role in addressing concerns
of the poor, the sick and the elderly. But we should never forget that skills,
work ethics and desires are never equal. An attempt to address outcomes will
inevitably put liberty at risk.

When
critics like Thomas Piketty complain that wealth is more unequally distributed in
the U.S. than elsewhere, they overlook what happens in dictatorial regimes – those places
where government has assumed ever-increasing responsibilities – and where
wealth is confiscated. It is not merchants, bankers or industrialists who get
rich in those countries, it is the political leadership. Ayatollah Ali Khamenei
of Iran
is estimated to be worth $90 billion. Vladimir Putin’s wealth is put at $40 to
$70 billion. Kim Jong-un is said to be worth $5 billion, Fidel and Raul Castro
at $1.3 billion. Xi Jinping, China’s
new President, has a net worth said to be in the “hundreds of millions,”
according to London’s
Telegraph. Even Laurent Lamothe, Prime Minister of Haiti, is estimated
to have a net worth of $175 million, but then he is only 42 years old and has
only recently assumed the position. Such wealth accruing to U.S.
politicians, whether while in office or afterwards, should send chills up the
spines of freedom-loving Americans.

The
concept of democracy is relatively new in the scheme of human history, going
back only 227 years to the election of George Washington. (Historians may argue
that democracy first appeared more than 2,400 years ago in the Greek city-states,
but its ultimate failure is only a reminder of democracy’s fragility.) Keep in
mind, while there are some in our society who are greedy and others who for
reasons of skill, hard work, or luck live lives with far more material goods
than most of us, the only real threat to our freedom is a government that decides
it knows better than its people, one in which property rights are no longer sacrosanct.

Politicians
who believe that the elimination of inequality justifies the confiscation of private
property are frightening. Their goal is personal power, not the idealistic
populism they espouse. Their interests are at odds with all who value freedom
and who see equality of opportunity as the real promise.

Friday, April 25, 2014

Having
worked on Wall Street for 47 years, I am constantly amazed at how little I know
about the market and how it works. Nevertheless, I feel reasonably comfortable
answering in the negative the question of whether the market is rigged,
despite allegations to the contrary by Michael Lewis in his fascinating book Flash
Boys. With about 200 broker-dealers, thirteen exchanges and approximately
45 “dark pools” (off exchange trading venues), I suspect that innate
competition helps subdue the natural greedy instincts of those who flock to
Wall Street. Like most endeavors, the more intense the competition the more
equitable the pricing. Competition, not regulation or price fixing, is the
means by which capitalism best discovers fair prices. Smart people have
certainly taken advantage of complexity, but that is a consequence of
technology.

Perhaps
it is a question of definition, but a rigged market to me suggests a cabal of
likeminded people colluding to enrich themselves at the expense of the public. Instinctively,
I am not a fan of High Frequency Traders, and Mr. Lewis has well articulated
how they have taken advantage of the system, with little or no social or
economic purpose other than personal gain. But I suspect Brad Katsuyama, as
quoted in Flash Boys, is right when he says: “I think most of them have
just rationalized that the market is creating the inefficiencies and they
(HFTs) are just capitalizing on them.” It has been a combination of
technological advances and the unintended consequences of government
deregulation and regulation that allowed HFTs to work their magic.

While
Cliff Asness and Michael Mendelson are quoted, in an op-ed in the April 1st
edition of the Wall Street Journal, as saying that high-frequency
trading has been around for 20 years, such trading clearly got a boost when the
SEC passed a rule in 2005 (U.S. Regulation NMS), which made it easier to trade
stocks on multiple exchanges, but also mandated better transparency and
consistent access to market bids and offers, regardless of where the individual
stock is traded. Its intent was to open large exchanges such as the NYSE and
NASDAQ to greater competition, including “dark pools.” The consequence was an
increase in high frequency trading, but with less clarity, as “dark pools”
definitionally have no transparency. At the same time, the proliferation of
HFTs meant that bids and offers displayed were often ephemeral, disappearing
once a legitimate bid or offer was made. The front-running of orders, according
to Mr. Lewis, was their motive, not providing liquidity, despite being paid to
do so by some exchanges.

Evolution,
as Darwin
noted, does not require the strongest or most intelligent of a species to
survive, but the one most adaptable. With technology ubiquitous and
increasingly powerful, markets and the way they trade have changed significantly.
In the dark ages, when I entered the business, the New York Stock Exchange was a
privately owned (by seat holders), quasi-public, socially responsible,
open-outcry institution, with brokers (seat holders) controlling orders and
specialists (also seat holders) charged with maintaining orderly markets. Retail
investors, as well as professional money managers had confidence in the system.
It worked. The NYSE was the most efficient market in the world during the first
two decades after World War II. Firms like Merrill Lynch (where I started in
the business) helped spread stock ownership to millions of Americans.

As
institutional investors grew in size, block trading emerged, with private brokerage
firms using their own capital to make bids and offers on blocks of stock –
blocks that were too big for individual specialists to absorb. Competition kept
pricing competitive. However, the natural governor inherent in partnerships
gave way to looser standards, as brokerage firms went public. Risks were
downloaded onto public shareholders, permitting traders to price more
aggressively. By 1997 ECNs (electronic communication networks) were becoming
competitive as providers of liquidity. Decimalization arrived in 2000. In 2006
the New York Stock Exchange, which had been owned by seat holders since its
founding in 1792, became a publicly owned business. In April 2007, the NYSE
merged with Euronext N.V. to form the first global equities exchange, and in
December 2012 the company merged with IntercontinentalExchange (ICE).

What
had been an icon of American capitalism, a quasi-public, socially responsible
institution was now owned by a group established in 2000 to trade energy
contracts. Volume on the NYSE has risen from under 10 million shares a day when
I entered the business to around 750 million shares a day today. Forty-seven
years ago the Exchange accounted for 100% of trading in NYSE-listed shares;
today it accounts for about 20%.

A
multiplicity of exchanges, including “dark pools,” along with no specialists to
maintain orderly order flow and no block traders to provide liquidity, created
a vacuum into which HFTs swarmed. The $64,000 question, which no one can
honestly answer, is: Have HFTs increased or decreased liquidity? Mr. Asness
answers in the affirmative, though he qualifies his answer, “…but it’s hard to
prove either way.” What is unassailable is that liquidity is not, and cannot be,
free. The middle man is often characterized as parasitic, but his presence
is necessary for markets to function. If a market maker buys at the offer and
sells at the bid all day long, he or she will lose money. High Frequency
Traders, who are the bad guys in Mr. Lewis’ book, are looking to make a penny,
or even a fraction thereof, on all shares traded. They do so by combining speed
with market intelligence derived from studying order flows. It doesn’t sound
like much, but one penny on a day in which 3.5 billion shares trade (about the
average for the year) is $35 million.
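The arithmetic in that last sentence is easy to verify. A quick back-of-the-envelope check, using only the figures quoted above (a penny per share on 3.5 billion shares a day):

```python
# Back-of-the-envelope check of the HFT take described above:
# one penny per share on an average trading day of 3.5 billion shares.
shares_per_day = 3_500_000_000
take_per_share = 0.01  # one cent

daily_take = shares_per_day * take_per_share
print(f"${daily_take:,.0f} per day")  # prints: $35,000,000 per day
```

At that rate, a full trading year of roughly 250 sessions would compound into billions of dollars, which is why so much capital chased the strategy.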

But
it is not the alleged “rigging” of the market that concerns me; it is the suspicion
that no one is in charge in case of a calamitous event. Markets, like our
credit system, rely on confidence. There will always be panics and market
crashes. They are inevitable and cannot be avoided. But when they occur, they
must be addressed quickly and confidence must be restored as soon as possible. Can
we count on government bureaucrats to do so? In the panic of 1907, it was J.P.
Morgan who intervened. In the aftermath of the October 1929 crash, it was
William Durant (co-founder of General Motors) and members of the Rockefeller
family who tried to stem the tide. (The market, in that instance, continued to
slide through the end of November before recovering. The later collapse into
the middle of June 1933 was more a function of bad fiscal, economic and
monetary decisions by government, combined with a failure to restore
confidence.) In 1987, the heads of equity trading at Salomon Brothers and
Goldman Sachs made a point of buying stocks on Tuesday morning, October 20th
when it appeared that the collapse of the day before would persist. In the fall
of 2008, the $5 billion investment made by Warren Buffett into Goldman Sachs
helped restore confidence.

It
is my fear that markets, increasingly under the control of machines and
robot-like investment managers, have become more susceptible to panic selling.
I doubt that HFTs would step into the gap, as did Morgan, Durant, Shopkorn,
Mnuchin, and Buffett in years prior. In the two decades following World War II,
90% of all stocks were owned individually by households. By 1960, half of all
households had some stock ownership. They had become capitalists and took pride
in their investments. To own a “piece of America” was not only a slogan; it
reflected an aspiration. By definition, investing in stocks for the long term
requires confidence in the future – a critical ingredient to building economic
growth. Today, individual households own less than 25% of all shares, and
investors are disparaged as one-percenters. It is true that more than half of
households own shares through mutual funds and various retirement plans, but those
represent multiple degrees of separation. The number of publicly traded
stocks has shrunk by 30% since 2000, in part because of the costs of complying with
government regulations. In the meantime, the Obama Administration, with its
focus on inequality, has had a tendency to downplay the importance of equity
investments and savings, preferring the political expediency of dividing the
people into capitalists and non-capitalists.

One
could argue about whether the market is rigged, but such debate distracts from
a far more critical fact: the impersonal nature of investing today, the use of
markets as casinos by algorithmically programmed computers, the speed with
which orders reach exchanges, holding periods measured in milliseconds, and the
demonization of capitalism by politicians are destroying confidence, damaging
economic growth and denigrating the concept of savings.

Wednesday, April 23, 2014

Christians
are generally considered the “haves,” the establishment, the status quo – perpetrators
rather than victims. But that is not a valid portrait. In varying degrees of
intensity and by myriad peoples, Christians have been singled out for killing
for 2000 years. Since 9/11, attacks on Christians have intensified (mostly by
Islamic extremists). The politically correct environment in which we live has meant
that many of these attacks receive minimal publicity. While attacks on Muslims
are categorized as “hate” crimes, when Christians are attacked it is the victim
who is often seen as the instigator. However, Pope Benedict XVI, who resigned
in 2013, recently claimed that Christians are the most persecuted group in the
contemporary world. The murder of Christians has only increased since his
warning.

While
it can be argued that Christianity has been a force for good, there have been instances
when harm has been done in the name of Christ. Early examples were the
Crusades (1095 – 1291), where, with promises of Plenary Indulgences from Pope
Urban II, knights and kings from across Europe traveled to the Middle East to
restore Christian access to holy places in and around Jerusalem. (By the 11th Century
Islam had been embedded in much of the Middle East
for 400 years.) The knights and kings who led those
armies were likely motivated more by the prospect of gold and jewels than by the fear
of doing time in Purgatory. The latter was reserved for the hapless minions who
were forced to accompany them. Greed has always been more primeval than
religiosity. In any case, Crusaders were not particularly Christian to those
who opposed them. Rape and pillage were all in a day’s work.

Another
early and blatant example of Christian persecution was the Spanish Inquisition.
As with the Crusades, the targets were Muslims, though Jews were victimized as
well. In 1478, Ferdinand II of Aragon and Isabella I of Castile
established the Tribunal of the Holy Office of the Inquisition to replace the
Medieval Inquisition which was under Papal control. More than seven hundred
years earlier, in 711, Moors had invaded the Iberian Peninsula, conquering and
ruling over most of what is today Spain and Portugal. While they were stopped
from invading the rest of Europe by Charles Martel in 732 at the Battle of Tours, Muslim rule in Spain only ended with the fall of Granada in 1492. The
Inquisition was established to ensure the orthodoxy of those who converted from
Judaism and Islam to Christianity. The brutality of the Inquisition could be
seen in the fact that after 1492 if one did not convert one was forced into
exile. Thousands were put to death. It was only in 1834 that Isabella II abolished
the tribunals, though by then they had not been used for several years.

It
is also true that Hitler and Stalin were Christian by heritage, if not by
belief, and Jews were singled out as victims. Germans may not have sent Jews to
gas chambers in the name of Christ, but there is no question that the Holocaust
was about ethnicity and racial purity.

Despite
its shortcomings, religion plays a major role in the lives of billions of people.
It provides comfort to the bereaved and the discomfited, and helps those suffering
from indiscriminate and wanton acts of evil. Nevertheless, there is no question that,
while religion has done more good than harm over the millennia, it does have a
dark side. Throughout history, religion, along with geography, natural
resources and economics, has been one of the major causes of war and suffering.

We,
in the comfort of our Western homes, often forget that it is the dispossessed
and those without hope who are most likely to seek solace in religion.
Prosperity foretells a rise in secularism and a decline in religiosity.
Prosperity brings all kinds of material goods – food is plentiful, shelter is
available, entertainment in the form of internet-connected cell phones is
ubiquitous – but often that secular fulfillment is accompanied by a want of
spirituality. It is why, despite the magnificence of their cathedrals, Europe’s pews are increasingly empty on Sundays. Gallup reports that 15% of French citizens and 10% of UK citizens
regularly attend church. While Muslims make up only 5% of the UK population,
estimates are that more people will be attending mosques in 2020 than churches.
And, 25% of mosques carry extremist literature that calls for the beheading of
lapsed Muslims, forbids interfaith marriages and orders women to remain
indoors. Do we really want to integrate Sharia Law with Blackstone?

In
the United States,
similar trends regarding church attendance are apparent, with less than 20% of
people attending on a regular basis. The numbers are higher for older people
and much lower for younger. Apparently, the concept of being “connected” does
not apply to the Deity. Christianity has been strongest among emerging nations,
in Africa and parts of Asia. Ironically, it is
in those nations where leisure is a luxury that people find time for God. It is
also in many of these nations where Christians are most persecuted.

According
to a Pew Research poll, 80% of the world’s population identifies with a
religious group. Of those, 2.2 billion identified as being Christian, 1.6
billion Muslim and 1 billion Hindu. While Muslim nations proudly call
themselves just that, Christian nations are either consumed with doubt, or too
politically correct to acknowledge what they are. A few days ago, when David
Cameron claimed Britain
to be a Christian nation – a supposition one would think obvious with the head
of the Church of England being the Queen – he was accused of “fostering
division.” His characterization, according to his accusers, will have “negative
consequences for politics and society.” Of course Mr. Cameron is correct. It is
only the PC police (today’s version of George Orwell’s fictional “Thought
Police”) that would argue it is not. Seventy-seven percent of Americans
identify as being Christian. Even though the U.S. has no state religion, it is,
in fact, a Christian nation.

The
problem for Christians is that oppressive governments in many emerging countries
battle for people’s minds and souls. They fear the competition Christianity
brings. It is why Kim Jong-un’s North Korea
ranks first on Open Doors’ list of countries in terms of Christian
persecution, and why Assad’s Syria
ranks third. Of the estimated 300,000 Christians in North Korea, some 50,000 to
70,000 live in concentration camps or prisons. Being caught with a Bible is
grounds for execution. Last year 2,123 Christians were killed – mostly in
Muslim nations – because of their faith. That was almost a doubling from the
1,201 killed in 2012. The figures come from Open Doors, a nondenominational
group that tracks Christian persecution worldwide. Of those murdered in 2013,
1,213 were killed in Syria,
612 in Nigeria and 88 in Pakistan. In Somalia, which
ranks second on Open Doors’ list, converts to Christianity from Islam are
threatened with execution. Rounding out the top ten on Open Doors’ list are Iraq, Afghanistan,
Saudi Arabia, Maldives, Pakistan,
Iran and Yemen – with the exception of North Korea,
all Muslim nations. For obvious reasons, the Christian population in these
countries has shrunk.

Since
the start of this year, 479 Christians have been killed for reasons of religion,
421 of them in Nigeria.
In February, in the village
of Izghe, 121 townspeople
were rounded up and summarily hacked to death by Boko Haram (an Islamic
terrorist group), while the attackers shouted praises to Allah.

Apart
from the three people killed at last year’s Boston Marathon by Muslim brothers,
four Christians in the United
States were killed in 2013 for being
Christian, in three separate incidents. They were all killed by Islamic
extremists. Two Muslim converts, each on a “mission from Allah,” shot
Christians, one in California, the other in Ohio. A third Muslim beheaded
two Coptic Christians in Buena Vista,
New Jersey. The latter, “a ritual
killing, religious in nature” is the way authorities put it. Freedom House, a U.S.
human rights organization, claims that mosques across the country (and there
are 3000 of them) carry literature describing non-Muslims as infidels and
promoting intolerance against Western society. Regardless, crimes committed by
Muslims against Christians and Jews are rarely deemed to be of the “hate”
variety.

The
murdering of Christians and Jews has reached epidemic proportions, much of
which – in this age where moral relativism prevents offending the very people
responsible for much of the killing – has gone unreported, or underreported. Wishing
something away does not make it disappear. Mr. Obama seems to feel the War on
Terror is over, with Osama bin Laden dead and al Qaeda on the run. Secretary of
State John Kerry has suggested that the emergence of the 21st
Century has mystically meant that civility will be the way of relations between
nations. Islamic extremists have little interest in living in harmony with
those of other faiths. President Bush was far closer to the mark when he spoke
of the war against terror being one that will persist for generations. There
will be no Battleship Missouri, aboard which the enemy will surrender and give
up their weapons. There will be no treaty signed in the mirrored halls of Versailles. It will at
some point peter out, but only when officials acknowledge what is happening and
confront force with force.

While
no single episode has had the drama of 9/11, Islamic terrorism, if anything,
has become more embedded in our lives, certainly as regards the discriminate
killing of Christians and Jews. Atomic weapons in the hands of nations, such as
North Korea today, Iran likely tomorrow, and Pakistan, no
longer the ally it was, means that the world is increasingly unsafe.

I
have no interest in whitewashing the role Christianity played in past episodes
of ethnic violence. Nevertheless, that does not excuse the increase in the
killings of Christians today. Ignoring reality is a manifestation of a cowardly
tolerance towards the intolerant. The wanton killing of Christians and Jews should
be condemned by all, not just by Western leaders who should not fear to condemn
intolerance no matter its origin, but also by secular leaders of Muslim
nations and by spiritual leaders of Islam who should not want to see their
religion hijacked by those who see genocide as an extension of policy, or who
have no interest in the fellowship of mankind.

Monday, April 14, 2014

Intolerance
of tolerance is certainly no virtue. But bowing to pressure from the intolerant
is cowardly. That is what Brandeis President Frederick Lawrence displayed when
he revoked the honorary degree the university had planned to bestow on Ayaan
Hirsi Ali at this spring’s commencement.

We
have reached a sad state when an American, Jewish-sponsored college founded in
1948, when the Holocaust still cast its genocidal shadow over a world reeling
from five years of world war, denied a promised honorary degree to a black
woman who dared take on one of the cruelest elements of religious bigotry the
world has ever known – radical, theocratic Islamism, with its inhumane
treatment of women through genital mutilation, forced marriages and “honor
killings.” Multiculturalism should not mean accepting the unacceptable.

Ms.
Ali is now an American citizen, married to historian Niall Ferguson, yet she
still lives under persistent threats of death. She left her native Somalia at the age of eight and was forced to
leave her adopted country of the Netherlands when she was told,
despite her prominence as a human rights activist and the fact she was a
center-right member of Parliament, adequate security could no longer be
provided. Death threats intensified following the release of the film Submission, for which she wrote the
screenplay, and after the shooting death of the film’s producer Theo van Gogh
by a member of Hofstad, an Islamic extremist group. Pinned to Mr. van Gogh’s
chest by the knife that had been stuck in his dead body was a note promising
similar retribution to Ms. Ali.

In
contrast, the only pressure Brandeis President Frederick Lawrence experienced
was from student and faculty activists, motivated by a perverted sense of multiculturalism
and political correctness, and from CAIR, the Council of American-Islamic
Relations.

What
made Mr. Lawrence’s actions so insufferable is that he masked his intolerance
in a cloak of patronizing tolerance – that, while Ms. Ali had the right to
express her anti-Islamist sentiments, having her appear at a Brandeis
commencement would be inconsistent with Brandeis’s core values. Those values, like
those of so many on the illiberal Left, apparently exclude diversity when it differs
from their cynical concepts of multicultural righteousness.

CAIR,
the group that led the charge to exclude Ayaan Hirsi Ali from receiving an
honorary degree, claims to be the largest Muslim civil rights and advocacy
organization. In reality, it is a Hamas-linked group that is a business-suited
front for the Muslim Brotherhood and other Islamic-extremist organizations. Its
charter, for example, calls for the destruction of Israel. CAIR led the charge that
forced ABC to scrub a new series about a teenage girl forced to live with an
extended family in Saudi Arabia,
“Alice in Arabia.”
It expressed fears that the series could engage in “stereotyping” that might lead to
the bullying of Muslim students. The show’s creator disagreed, saying that it
was a step toward greater tolerance for the understanding and empathy of women
in all cultures. Last week, two University
of Michigan campuses
scrubbed the screening of “Honor Diaries,” a 2013 documentary that explores
violence against women in honor-bound societies. It is a movie that exposes the
terrible abuses women and girls suffer in the name of family honor. Never mind
that it goes out of its way to convey respect for moderate Islam, or that it
won the Interfaith Award for Best Documentary at the Chicago International Film
Festival last October: CAIR and other Islamist groups objected and the
university caved. Where are the feminists who fought the good fight for women’s
rights? Or don’t Muslim women count as women?

CAIR
refers to itself as a “Muslim NAACP,” “but,” as Roger Simon of POLITICO.com
wrote recently, “to compare CAIR to the NAACP is like comparing Josef Mengele
to Ben Carson.” One would have more respect for CAIR if it used its
influence within the Muslim community to condemn the small number of Islamic extremists
who give the Muslim religion a bad name. Instead, it fosters Islamophobia by
castigating those who challenge Islam when it is used as a vehicle for
extremism – anyone who condemns killings or mutilations in the name of Allah is
portrayed as an Islamophobe. It was his false sense of the exigencies of
multiculturalism that caused Mr. Obama to refer to the 2009 Fort Hood
shootings as “workplace violence,” despite Major Nidal Hasan shouting “Allahu
Akbar” as he murdered 13 fellow soldiers.

It
could well be, as some claim, that Ayaan Hirsi Ali has allowed the tragic
events of her youth and young adulthood to color her view of the Muslim
religion unfairly, leading her to say that “violence is inherent in Islam.” But
keep in mind, she was made to undergo genital mutilation as a child, then
forced, at a young age, into marriage with a cousin she disliked. But the
question hangs: why have not peace-loving Muslims stepped forward to condemn
those who would kill and maim in the name of Allah? And why have our
universities, supposedly bastions of free speech, been so willing to ban speech
that opposes their narrow, multicultural views? Or is it simply that they fear
retaliation?

This
brings us back to our main point – the inexplicable action taken by Brandeis. Keep
in mind, this is the university that conferred honorary degrees on playwright
Tony Kushner, who called the creation of Israel
“a mistake,” and on Archbishop Desmond Tutu, who compared Israel to Nazi
Germany. The decision to pull Ms. Ali’s honorary degree was despicable for an
institution that prides itself on being liberal in the old-fashioned sense
of the word – open to all views. It was cowardly in that the press
release intimated that the decision was mutual, that it was made in
consultation with Ayaan Hirsi Ali. In a clarifying statement the next day, Ms.
Ali said when first approached by Brandeis: “I accepted partly because of the
institution’s distinguished history…I assumed that Brandeis wanted to honor me
for my work as a defender of the rights of women against abuses that are often
religious in origin.” In the same statement, after she had been told her name
had been withdrawn, she noted, “I was completely shocked when President
Frederick Lawrence called me – just a few hours before issuing a public
statement – to say such a decision had been made…I was not surprised when my
usual critics, notably CAIR, protested against my being honored in this way.
What did surprise me was the behavior of Brandeis.”

The
forgoing of an honorary degree will have no lasting impact on Ayaan Hirsi Ali,
who will continue to fight for women’s rights against the abusive treatment
brought about by the intolerance of religious fanatics. But it does mark
another notch in the closing of an American university’s mind, where political
correctness subsumes academic and individual freedom, where free expression and
diversity have been replaced with silence and conformity. It is we – students
and citizens – who are the losers, not Ms. Ali.

Friday, April 11, 2014

I
never dived to the bottom of the ocean, nor have I ever ascended Everest. I
never ran a marathon, nor did I (or will I) make a billion bucks. I never sang
at the Met, nor did I ever ski the Matterhorn.
But together, Caroline and I made it through fifty years of marriage – a feat more
daunting than those listed, and certainly one more cherished.

Neither
of our parents’ marriages made it to fifty years. Death intervened. Of our
four sets of grandparents, only one made it to fifty years: my paternal
grandparents. Ironically, they were the oldest of that batch to marry, both being
in their 30s, something unusual when they were married in 1907. There was a
small family party for them in 1957 in Wellesley,
which was good fun. But they seemed pretty old to me at the time. Consequently,
I do my best to act young and be as vigorous as possible when around my own
grandchildren!

It
doesn’t seem that long ago that I was standing at the altar in the chapel of
the Church of the Heavenly Rest on New
York’s Fifth
Avenue. My brother Frank was at my side, as were
two cousins and Caroline’s brother. My sister Betsy and the wife of Caroline’s
cousin were her attendants. The rector, Floyd Thomas, stood behind us. I was 23
and nervous. And then Caroline Elliott appeared coming down the aisle – a
vision of beauty – on her reluctant father’s arm. And why wouldn’t he be
reluctant? He was 71 years old, a Princeton and Harvard Law School graduate. I was a
boy from New Hampshire with a year to go in
college – the University
of New Hampshire – from
which I had dropped out for a couple of years to work and to go into the army. I
was not what one would have called a promising prospect. On the other hand, I
have been blest with an innate sense of optimism. I am one who prefers “what
might be” to “what could have been.”

The
first lines of Edgar Albert Guest’s poem “It Couldn’t Be Done” come to mind:

“Somebody said that it couldn’t be done.

But he, with a chuckle, replied

That maybe it couldn’t, but he would be one

Who wouldn’t say so ‘til he tried.”

(It
would be a mistake, though, to overplay the country boy-rube bit, as my father,
his father and both his grandfathers were Harvard men, while most of the males
in my mother’s family had gone to Yale. One exception was my favorite uncle who
went to Trinity. I was simply a lad late to mature.)

In
many respects, Caroline and I had an ideal start. Being young and in college
meant we had low overhead and no expectations about material goods. There was no
peer pressure. We grew into our new, married state. We both worked: Caroline
typing a manuscript, with me balancing three jobs and my courses. Between
classes, I drove a school bus, worked in a sandwich shop and wrote a sports
column for Foster’s Daily Democrat. Ten months later, in February, I had
completed my degree and had a job lined up with the Recordak division of
Eastman Kodak beginning in June. So we took $2000 we had saved (our rent was
$85.00 a month and we allocated $10.00 a week for groceries), bought two
roundtrip tickets to Paris, booked rooms for the night we arrived and the night
before we were to leave, and hired a Volkswagen Beetle. For the next eleven
weeks, with Arthur Frommer’s Europe on $5 a Day tucked in our bag, we
drove where impulse took us – France, Spain,
Italy, Austria, Germany,
Switzerland and back to France. It was
a time to unwind, a time to really know one another, and an opportunity to prepare
for the grown-up world we faced on our return.

In
1971, we ended up in Greenwich,
with our third child only a few days old. It was where we would live for the
next 24 years. (In the interim, I had left Kodak and joined Merrill Lynch in New Haven). I went to
work for the predecessor firm of Monness, Crespi, Hardt & Co., the firm to
which I returned in 1992, after an absence of seventeen years. Greenwich was where our
children grew up and where we made many of the friends we still have today. In
1993, we moved into the house in Old Lyme, which we had bought a couple of
years earlier, renovated and where we live today.

When
asked about the secret of staying married for fifty years (or just surviving
that long), my response is that there is no secret, other than that marrying
young increases the probabilities. A successful marriage obviously requires love for
the other person, but it also requires a willingness to share, to be empathetic
and supportive, and to accept that things will not always be just as
one wants. As much as anything, a successful marriage depends on luck. How well
can one know someone after a year or so of dating? How can one tell if one will
be a good mother or father? There is much that is left to chance, but like most
successful endeavors it also takes work, a willingness to listen and the
acceptance that each is an individual. In many respects Caroline and I are very
different, yet she is my best friend. There is no one with whom I would rather
have dinner or spend the weekend.

We
know we have been lucky. In a world marked by uncertainty, I am grateful for
the sense of permanence we have been afforded. And my sense is that the
permanence of our relationship has been good for our children.

We
will be celebrating by renewing our wedding vows this afternoon, something our
grandchildren have encouraged. Richard van Wely, who for many years was rector
at St. Barnabas Church in Greenwich,
will officiate. Caroline will have as her attendants our six granddaughters,
ranging in age from 5 to 13. I will be accompanied by our four grandsons, the
youngest being 9 and the oldest 13. The congregation will consist of our
children and their spouses, along with Richard’s attractive wife, Judy. As I
stand at the altar this time, fifty years later, I will not have the same fears
that consumed me fifty years ago. I promise not to think of the word “altar,”
as described by Ambrose Bierce in The Devil’s Dictionary: “The word is now
seldom used except with reference to the sacrifice of their liberty by a male
and female fool.”

We
will adjourn to the Belle Haven Club for a wedding supper, thanks to our son
Sydney and his wife, the authoress Beatriz. Next week, all 18 of us will go to
the Hillsboro Club in Florida
for a few days over Easter weekend, where Caroline and I will prepare for our
next fifty years. The last fifty has been like life in a playpen. I expect and
hope the next fifty will be more of the same.

Wednesday, April 9, 2014

Yesterday
was “National Equal Pay Day,” one of those silly appellations that are applied
to an increasing number of otherwise blameless days. In this case,
responsibility goes to the National Committee on Pay Equity (NCPE). The concern
is the gap between pay for women and that of men for “similar” work. Besides
the difficulty in defining “similar,” there are also the questions of
performance, risks and hours worked that any employer must consider when
thinking about pay. Nevertheless, days such as this are celebrated
more for the political good they afford needy politicians than for the
betterment of society.

Democrats
up for re-election are particularly needy as they face what could be a gray November.
The economy is limping along. ObamaCare may have signed up 7.1 million people,
but 6 million lost coverage in the process, so net sign-ups are not very
impressive. (As an aside, it would be interesting to know what we taxpayers
spent in software and advertising expenses to sign up a net of 1.1 million
people.) Overseas, we have created a mess, from the Middle East to Ukraine to the East China
Sea. We have apparently abandoned the principles of the Monroe
Doctrine. “Without the United
States in the lead,” as Victor Davis Hanson
wrote recently, “the world cannot remain the world as we have known it since
1946.” Scandals have rocked this Administration, bringing comparisons of “If
you like your doctor, you can keep your doctor” Obama to “I’m not a crook”
Nixon.

So
what is Mr. Obama to do? As they face re-election, no candidate wants him by their
side. So the President resorts to familiar and proven issues, those that poll
well and serve to fire up Democrats’ base – same-sex marriage, pre-school,
child care, family leave and pay equity.

The
original Equal Pay Act was signed into law in 1963 by President Kennedy, but
that proved inadequate to politicians who consider having to pay for birth
control to be a violation of a woman’s rights. So in 1996 “National Equal Pay
Day” was born. It comes on a Tuesday, because it represents how far into the
next work week women must work to earn what men made the previous
week. Yesterday, Mr. Obama signed an executive order barring federal
contractors from penalizing employees who discuss their compensation. (Will
that EO also bar union leaders from docking a member who complains about using
his dues to pay for politicians antithetical to his beliefs?) The President
also signed a memorandum that will require contractors to release salary
summary reports broken down by race and gender. Quality of work performed, to
this White House, is obviously of less importance than measurements of
equality.

What
gets lost in this miasma of insufferable verbiage is that no two people perform
the same job exactly the same. If a man and a woman are hired out of business
school by the same company and in the same department, yet the woman proves
more effective than her male counterpart, should she be punished by having her
pay level set at that of her underperforming compatriot? On an assembly line,
if a woman proves more adept at performing her job, should she be held back? If
a man and a woman hold “similar” jobs, but he puts in more hours, should he be
penalized?

It
is a silly and, in fact, demagogic argument that is potentially harmful to
employers and divisive to the people. Mr. Obama’s executive order is a gift for
trial lawyers, as the burden of proof, in an accusation of unfair pay, falls on
the employer, not the employee. So we can expect, as has been true in the past,
wealth to transfer from corporate coffers into the already gold-laden pockets
of trial lawyers.

Mr.
Obama is fond of telling us that we live in the 21st Century. Businesses
today compete globally, and must keep an eye on the bottom line. For-profit
businesses, including those who contract with governments, are motivated by – surprise
– profits. Without profits they would go out of business, which would result in
employees, regardless of gender or race, being out of work. Consequently, managements
look for efficiencies in plants, factories, offices and stores. Their purchasing
departments strive to get the best bargains. Human resource departments are
charged with attaining benefit packages that satisfy employees without
overburdening the corporation. Tax lawyers are paid for their ability to
navigate the maze that is the tax code. And they hire the people they believe are
best qualified for the jobs – individuals who can either save the business
money or generate the most profitable revenues. Their motivation is profit, not
social justice. It can be no other way. When considering future employees and
compensation, race, creed or gender pale in comparison to the qualifications of
the individual to be a profitable contributor.

Government
bureaucracies and eleemosynary institutions do not have the same motivations.
Social justice may be part of their mission. But for government to force
private businesses to apply redundant standards is both unrealistic and
ultimately risky to government, which depends on a robust private sector to pay
the taxes on which government feeds. Not-for-profit organizations would go unfunded.
I say redundant because the Equal Pay Act of 1963 made discrimination in pay
a criminal offense, and because, as the Bureau of Labor Statistics (BLS)
reports, single women who have never married made 96% of men’s earnings in
2012. Risk is another factor: ninety-two percent of work-related deaths were of
men.

If
there were such a thing as equal performance in equivalent jobs, I would be in
favor of equal pay. But there is not. Common sense takes a back seat to the brainless
advocates who argue for equality of outcomes. In their 2012 book, The
Declining Importance of Race and Gender in the Labor Market, June and David
O’Neill argue that nearly all the 23% raw gender pay gap can be attributed to
factors other than discrimination.

In
their desire to gain political advantage, regardless of the cost to truth and
civility, advocates for equality in outcomes tend to harm the most productive.
It was interesting to watch the squirming of White House Press Secretary Jay Carney
when confronted with the fact that White House female staff members make on
average $0.88 for every dollar male staff members make. The fact that they do
only proves my point – pay is not gender-based; it is determined by the job one
does. When Mr. Carney mentioned that those with similar jobs all made the same,
he was either lying or telling us that the White House uses income to further political
agendas, which is believable, but I suspect unlikely. Looking at a chart printed
in Tuesday’s New York Times (page A14), it would appear that the former
is more likely than the latter – different jobs pay different rates and pay is
based on ability, the specific job and hours worked. Of the 237 lower-paid
White House staff (those below $70,000), 54% are women. Of the 136 higher-paid
(above $100,000), 46% are women. Is that fair? I am no judge, but I assume that
Mr. Obama thinks it is. Mandating equality of outcomes cannot be done. Efforts
to do so tend to push people toward the lowest common denominator. The White
House cannot afford such outcomes and neither can businesses or
not-for-profits.

The
whole issue is a political football designed as a red herring – to keep the
eyes and ears of probing reporters from looking into scandals, like the IRS, Benghazi and Fast and
Furious, which are far more serious. In a world of robots, pay could be equal
as each machine could be programmed to do exactly what its mate does. But that
sort of thinking doesn’t work in the world of humans, a fact that the White
House recognizes, even if Jay Carney is not permitted to say so.

What
I do favor is the best pay for the best performance, regardless of race, gender
or creed. As such, pay and outcomes will never be equal, but opportunities should
always be.

Monday, April 7, 2014

Hornswoggle
is a word I have always liked. It is a verb meaning to bamboozle or to dupe.
While its origin is considered “unknown,” the word is generally thought to be native
to America.
The word is said to date to the early 19th Century, but it does not
appear in the 1828 Webster’s Dictionary. A man or a woman with an unfaithful spouse
can be described as having been hornswoggled. It was the kind of colorful word
we liked in New Hampshire,
where we never trusted city folk who tried to sell us something we didn’t want
or need. We suspected their motivation. We didn’t mind being the hornswoggler
but we didn’t want to be the hornswogglee.

Glancing
through the “Summary for Policymakers” just published by the Intergovernmental
Panel on Climate Change (IPCC), the word ‘hornswoggle’ came to mind. While the
IPCC, according to its own principles, is a policy-neutral organization, its
head Rajendra Pachauri, in an interview last September, said, “Humanity has
pushed the climate system to the brink.” He added, “We need to transition away
from fossil fuels.” With the release of the document a week ago, and sounding
just a mite less patronizing, he said, “Adaption alone is not going to solve
the problem and we need mitigation at the global level.”

In
some respects the IPCC has become its own worst enemy. The problem with government
(and non-government) bureaucracies is that they develop lives of their own.
Jobs and careers, in the case of the IPCC, depend on man-made influences being
seen as the principal cause of climate change. Such agencies also serve as vehicles
for politically correct politicians.

But
not all agreed with the inflexible tone of the report. Richard Tol, a professor
of economics at the University of Sussex and an expert on climate change,
removed his name from the Summary Report, despite having written one chapter
and contributed to two others. He did not feel that the Report accurately
reflected persistent improvements in technology, particularly as they pertain
to crop productivity, and he felt it downplayed the benefits of global warming,
such as reductions in deaths from cold stress and gains in crop yields.
Professor Tol believes in climate change, but takes exception to the adamant
tone of the Report.

Professor
Tol’s skepticism was matched by a paper published in a recent edition of
“American Journal of Agricultural Economics,” entitled Information Manipulation and Climate Agreements. In it, two
professors of economics, Fuhai Hong and Xiaojing Zhao note how environmental
groups and their media cohorts deliberately exaggerate for the “good of the
cause.” In 1989, the late Stanford professor of environmental studies Stephen
Schneider wrote: “…we have to offer up scary scenarios, make simplified,
dramatic statements, and make little mention of any doubts we might have.” In a
2003 issue of “National Science,” NASA global warming scientist James Hansen
conceded a similar attitude: that the use of “extreme scenarios” to dramatize
global warming “may have been appropriate at one time” to drive the public’s
attention to the issue.

Björn
Lomborg, the Danish professor and former director of the Environmental
Assessment Institute in Copenhagen,
wrote last fall: “We should accept that there is global warming. But we should
also accept that current policies are costly and have little upside.” “It
(global warming) is becoming a religion, and religions don’t worry too much
about facts,” says James Lovelock, a 94-year-old retired UK scientist,
famous for his Gaia hypothesis that Earth is a self-regulating, single
organism. In his 2006 book, The Revenge of Gaia, he wrote: “It’s just as
silly to be a [climate] denier as it is to be a believer.” He argues that
fracking and nuclear should power the UK, not wind farms.

While
all of these people believe that climate change is for real – that the Earth is
warming and that man has been part of the cause – they disagree about the
magnitude of the change and the extent of man’s culpability. We all exaggerate
to make points. But most of us do not do so with the sense of sanctimonious
righteousness common to those in the business of warning us about man’s effect
on global warming. For a few, like Al Gore and Michael Moore, exaggerating
fears of global warming provided an avenue to enormous personal wealth, while
using taxpayer dollars to fund start-up costs. Is there any wonder so many of
us feel hornswoggled?

The
Summary Report issued dire warnings on the dangers from man-made global
warming. It will “devastate” food supplies, cause “mass extinctions of plants
and animals, worsen droughts, and raise the risk of wars over resources.” The
headline in the New York Times expressed its editorial bias: “Worst is
to Come.” Without swift and decisive action, the Earth will “almost surely face
centuries of climbing temperatures, rising seas, species loss and dwindling
agricultural yields,” the paper editorialized. The Summary Report was a “call
to action” noted the Los Angeles Times. There was little in the report
that discussed an evolving and ever-changing Earth that has warmed and cooled
over the millennia. There was little in the report that would help governments
and individuals prepare for what will be inevitable changes to the Earth’s
climate over the next several decades.

It
is not that man has had no effect on climate. He most assuredly has. The
question is to what extent. It is too simplistic to say that carbon dioxide or
methane emissions are solely responsible for global warming (or cooling).
Globally, coal consumption in 2000 was 5.3 billion metric tons; in 2011, it was
8.1 billion metric tons. Yet temperatures have remained pretty constant over
the past fifteen years. There is much about the world we still do not
understand. More dangerous than the ignorant are those who believe they know
what they don’t know. Perversely, the decision by Mr. Obama’s Environmental
Protection Agency (EPA) to ban the use of coal in new, green-field power
generating plants in the U.S. has increased the likelihood that more coal will
be used for that purpose in China and India, as the commodity has become
cheaper. Keep in mind, inexpensive food, energy and shelter are more important
to people in developing and emerging nations than environmental concerns.

That
our colleges and universities have become forums to promote political ideology
is a well-known fact. A few days ago, the New York Times wrote
uncritically about courses on global warming being taught at the University of Oregon. The course is entitled, “The
Cultures of Climate Change.” Its goal is not to marshal evidence for climate
change as a human-caused crisis, or to measure its effects – the reality and
severity of it are taken as given – but to teach how to think about it, prepare
for it and respond to it. Instead of using scientific texts, the class will use
films, fiction, poetry and photography. Obviously, the purpose is indoctrination,
not instruction.

Forty
years ago climatologists were convinced the Earth was cooling, ice caps were
getting larger and arable land was shrinking. For most of the past four
decades they have been worried that the Earth is warming, with ice caps melting
and impairments to agriculture. As their rhetoric soared to match their
expectations of rising temperatures, the Earth’s temperatures began to
moderate. So, now their mantra is simply “climate change,” allowing them to be
right no matter which way the thermometer moves. Most of us would agree that
man has had an effect. Whether he has tilted the balance toward cooling, as was
believed in the 1970s, or toward warming, as is currently believed, no one can
tell for certain. Before we get overly excited about man’s effect it is worth
remembering that all creatures have some effect on their environment. There is
nothing permanent about life on Africa’s savannahs any more than there is about
life on the marshes that separate my house from the Connecticut
River. Life is in constant motion. Everything alive must die at
some point.

What
we need from the IPCC is a balance that recognizes that the need to change must
take into consideration the costs of that change – especially for those in
developing parts of the world. We need to prepare to adapt to inevitable change.
We need less ideology and more common sense. Maine Governor Paul LePage, in an
op-ed in last Friday’s Wall Street Journal, noted how an EPA mandate
requiring that all new wood stoves have reduced emissions immediately would,
besides putting hundreds of people out of work, worsen future emissions from
wood stoves in Maine. The exorbitant costs of the new stoves would mean that
all but the very wealthy would continue to use their current stoves. In their
condescending attitudes, what rich liberals like Mr. Obama and bureaucracies
like the EPA never consider is that regulations are regressive in their
economic consequences.

The
IPCC reminds me of the boy who cried wolf. When the wolf didn’t appear, the
people’s guard went down; so that when he did appear, the sheep were eaten.
With an ideological focus on prevention, rather than adaptation, the risk is that
we might not change our habits when the climate does change significantly. In
letting the argument be hijacked by climate extremists like Mr. Moore, Mr.
Gore, the editors of the New York Times and the Los Angeles Times,
and the professor at the University
of Oregon, we are being
hornswoggled by climate fanatics – to our own and the world’s detriment.

Friday, April 4, 2014

A
“small group of angry white liberals” is attempting to ban Condoleezza Rice,
the first African-American woman to be Secretary of State, from speaking at the
University of Minnesota on April 17 – this, according to an article in the
Daily Caller on March 29th. The group is led by a math teacher named
William Messing and a graduate student named Nick Theis, a member of the ill-named
Students for a Democratic Society (SDS), with assistance from the University of Minnesota’s SDS chapter. Messing
introduced a resolution in the school’s University Senate that calls on the
administration to rescind the invitation. In an op-ed in the Minneapolis
Star Tribune, Mr. Theis wrote that his objection had nothing to do with
party politics or freedom of speech, but was “simply an issue of human rights.”
Yeah! Right!

Unfortunately
this attempt to muzzle conservatives is not unique to Minnesota. The faculty at Rutgers University
has opposed Ms. Rice as this spring’s commencement speaker. She “lacks moral
authority” and fails to meet the standards of “exemplary citizenship,”
according to the faculty’s statement. The attempt to silence the Right is not
unique to Ms. Rice. Neurosurgeon Dr. Benjamin Carson, who spent 30 years at
Johns Hopkins Hospital and was emeritus fellow of the Yale Corporation, was
banned from speaking at last year’s Hopkins’ graduation ceremony by an on-line
petition from “liberal” students.

Black
pastor Reverend Kevin Johnson of the Bright Hope Baptist Church in North Philadelphia was
supposed to be the commencement speaker at his alma mater, Morehouse College.
But when he wrote an op-ed criticizing President Obama for “failing the Black
community” he was disinvited. Sandor Farkas is one of two students at Dartmouth (both
registered Independents) who were banned from a campus activists’ strategy
meeting as to how the college should honor Martin Luther King Day. Mr. Farkas’
comment: “What I am really frustrated at is that we can’t have reasonable
discussions about these issues.”

These
attempts to censor free speech at colleges and universities are in direct
contradiction to the concept that centers of learning should be marketplaces
for ideas. Harvard senior and columnist for the Harvard Crimson Sandra
Korn put it most bluntly: “If our university community opposes racism, sexism,
and heterosexism, why should we put up with research that counters our goals?” Her
article was titled, “Let’s Give Up on Academic Freedom in Favor of Justice.”
Besides pointing out the obvious, that she has already abandoned academic
freedom, does Ms. Korn feel that all heterosexuals are homophobic? How would
the biology department explain evolution without heterosexism?

The
plot thickens. The IRS scandal is, in its essence, a blatant attempt by government
to squelch free speech. That was also true of Mr. Obama’s demonization of Supreme
Court Justices for their decision in the Citizens United case. The Left wanted to
ban corporate giving to political campaigns with no restraint on union giving. There
is too much money in politics, but every effort to control spending has failed.
The best solution is to make the name of every donor public, regardless
of whether the gift is directed at the candidate, the campaign or a political
action committee. 501(c)(4)s should be disallowed. Sunshine and removing any
tax advantage will do more to limit spending on campaigns than any
government-proposed plan, as we all should know given the failure of
McCain-Feingold. As openthebooks.com would say, open the books.

Free
speech, as we all know, has limitations. It does not include the right to yell
“Fire!” in a crowded theater; it does not include the right to be abusive, to
call out obscenities, to disturb others, or to foment violence. The right to
privacy and the protection of intellectual property are also guaranteed
by the Constitution. It is government – not the individual – that the
Constitution confines, with a system of checks and balances. When President
Obama says he wants to do things with or without Congress, he speaks as
government, not as an individual, and thus should be subject to the restraints
imposed on him by the Constitution. To let him do as he pleases sets a
dangerous precedent.

In
all that we do and say, we should never forget that liberty is sacrosanct and
fragile. Listening last Sunday to four Ukrainians describe what was happening
to their country, I was struck by the realization that young people like Mr.
Theis and Ms. Korn and the students at Rutgers, Dartmouth and Johns Hopkins have
more in common with the Russians than with the Ukrainian protestors. The young
people gathered in Maidan Square wanted freedom, not socialism; independence,
not paternalism; free speech, not
ultimatums. In this country we take such rights for granted. However, in doing
so, we forget how valuable they are, and what they cost in blood and fortune.
We say we sympathize with those who struggle to be free, but we draw the line
at taking up arms. But what happens when and if tyranny arises at home? To say it
can’t happen is to ignore history and to misunderstand the nature of man.

We
need more discourse, not less. It provides balance. Encouraging equality of
outcomes is not the same as promoting equality of opportunity. The rights of
the gay community should be protected, but those rights should not come at the
destruction of the traditional family, which is happening, with almost half of
first-born American children born out of wedlock. Narrower gaps in income and
wealth are worthy goals, but not at the expense of stifling aspiration,
innovation and economic growth.

Conservatives
place their confidence in individuals. Liberals argue that a benign government does
a better job looking after those who most need help. But governments, as all
revolutionaries know, are not always benign. Power corrupts. Conservatives see
the expansion of government as inimical to the concept of freedom, including
free speech. Liberals don’t.

What
gives me confidence, however, that those who would stifle free speech will
ultimately lose this war is the proliferation of bloggers, talk radio,
cable TV and those like the Koch brothers and their antitheses, George Soros
and Tom Steyer. What these people ensure is that myriad voices will be heard. Speakers
and writers may be reviled by those who don’t like the message, but that is no
reason to not permit the messenger to speak – an act far more common to the
intolerant Left than the more tolerant Right. I was pleased that the Supreme
Court struck down the attempt to impose limits on campaign contributions in
Wednesday’s decision, McCutcheon v. FEC. As I wrote earlier, it is
sunshine, not mandates, that is needed. Those who deny free speech
are the ones who fear an opponent’s words may successfully challenge their own
and convert their followers.

In
1949, the FCC introduced the Fairness Doctrine, requiring broadcasters to air
contrasting opinions equitably. For the next thirty years, television news
was limited to three networks and a shrinking number of newspapers. There
was no talk radio, and there were no all-day news stations. With news sources limited, the
Fairness Doctrine was acceptable and enforceable. But those with opinions
outside the mainstream had a difficult time expounding their ideas. In 1969,
the Supreme Court ruled unanimously that the Doctrine was not only
constitutional, but also necessary to democracy. Finally, in 1987, with urging from
President Reagan, the FCC abolished the Fairness Doctrine. Reagan proved wiser
than the nine Justices. He recognized that news sources were mushrooming and
that the word “fair” had different meanings to different people. He knew that
people are better served with more sources of information, not one controlled
by a federal bureaucracy. He knew news people had biases. Reagan’s beliefs were
based on the concept that individuals are capable of making up their own
minds, that they don’t want or need their news filtered. His was a faith in
people, not institutions.

Unfortunately,
extremists in the Democrat Party now occupy the mainstream. It is disturbing
that those claiming to be liberal are vocally illiberal when it comes to dissenting
opinions. It is especially so when we see it in our universities, supposedly
bastions of classical liberalism. To the extent those university factories
produce graduates who never hear a contrary opinion, our democracy will suffer.
Fortunately, they are countered by those who challenge conventional thinking. It
is troubling, however, when the federal government utilizes the IRS to muzzle
dissent, the CIA to cover up Benghazi, the FCC
to attempt to place monitors in U.S.
newsrooms, and the Justice Department to be disingenuous regarding Fast and
Furious.

The
tragedy in curbing dissenting opinions is that it causes differences to be
misconstrued and accentuated. There is a difference in what conservatives and
liberals see as the role the U.S. should play in the world and in the size and
reach of government at home.
Nevertheless, in general, my progressive friends and I do not disagree as to
long term goals, but rather how best to achieve them. We all would like to live
freely, in a peaceful world, in which people can assemble, pray, write and
speak as they choose. We yearn for a country that allows for the pursuit of
happiness. We would all like the economy to grow, for jobs to be plentiful, for
poverty to diminish and for an educational system that permits our children
(and grandchildren) to become more competitive in the global economy. Where we
differ is how best to achieve those goals, and how to achieve them without
giving up liberties. Liberals believe government elites know best;
conservatives put their trust in people. But, regardless of which side of the
divide one stands on, we should all agree: free markets in ideas are as valuable
as free markets in economic matters.

Sadly,
speech is not free in our universities and colleges and it is increasingly curtailed
by an aggressive government. But thank God for cable networks and for the
internet, both of which allow the marketplace for ideas to survive and thrive. It
should not shock anyone that Fox News attracts almost three times the
viewership of MSNBC and almost four times that of CNN, despite the President’s
denunciations of Fox and despite CNN’s being played in every airport lounge in the
country. The downside of this barrage of news is that it provides little time
for reflection, something that might help young people like Ms. Korn and Mr.
Theis. As for the internet, its success raises another question: Why would Mr.
Obama want to transfer supervision of a resource that has been so
successful and is so important to people in countries whose governments are
less free than ours? But that’s a topic for another day.