jforeman

Comment/OpEd: The UK Home Office is Still Not Fit for Purpose (Daily Express, Jul 23, 2016)

THANKS TO Brexit and Theresa May’s arrival as Prime Minister, the UK may be about to see a sea-change in the often chaotic ways it has managed migration in recent years.

However it was not reassuring that during her first week as Home Secretary, Amber Rudd refused to commit to even a vague numerical target for reduced migration, despite the fact that bringing net migration down to the “tens of thousands” was a Tory general election pledge.

A significant amount of the annual migration into the UK is beyond the Home Secretary’s control because of Britain’s membership of the EU.

But the fact is that according to Migration Watch non-EU immigration is the largest element of net migration into Britain.

Moreover, those aspects of non-EU immigration that most infuriate the public – abuse of the asylum system, the flouting of visa rules by students and the seemingly inexplicable inability of the system to remove those who are here illegally – are all fixable given political will and adequate resources. No new laws are needed. The Home Office just has to do a better job of enforcing existing rules and keeping track of who comes and goes from the UK.

Perhaps the most urgent reform is one that was promised by the Cameron government and supposed to be in place by March 2015 but which was never implemented.

This is the re-establishment of embarkation controls in the country’s airports and seaports. Many countries, even in Europe, have passport control for people leaving the country.

It is the only practical way of ensuring a government has a reasonable idea of who is in the country and who has left.

Most people have no idea how inadequate our Government’s resources are when it comes to estimating both legal and illegal immigration.

The Office for National Statistics is forced to rely on something called the International Passenger Survey.

This is a voluntary polling of passengers who come in from certain airports, ports and the Channel Tunnel.

What has made the IPS an especially bad source of guesstimates for UK immigration is that, as the ONS admitted in 2014, its interviewers were sent to places such as Heathrow rather than the regional airports or coach terminals used by most migrants.

As a result it underestimated immigration by at least 350,000 over the previous decade.

But knowing who is here is not enough. Britain needs a Home Office able and willing to enforce migration laws.

One of the biggest spurs to illegal immigration into Britain has been the perception abroad that once you have made it into the UK, by whatever method, you are unlikely to be removed.

This is true whether you come in with false papers, overstay a visa, enrol in a bogus college or take part in a fake marriage – even if you have committed terrorist offences abroad, even if you commit a serious crime during your time in the UK.

Although there may be more than a million illegal immigrants in the UK, employers who knowingly employ illegal immigrants are rarely, if ever, prosecuted. Britain’s student visa regime has long been a notorious vehicle for illegal immigration.

And even though the Cameron governments closed down thousands of bogus colleges that existed only to get people into the country, there is a problem of students overstaying their visas.

Of the 100,000 non-EU people who come to this country every year on student visas, a huge proportion, perhaps half, simply vanishes after their visas expire.

Astonishingly the Home Office, which should be tracking those on student visas and then looking for those students who have overstayed, has routinely failed to do this.

Then there is the asylum system. As Alp Mehmet of Migration Watch points out, scores of thousands of people “are denied asylum every year but there is no record of them leaving”.

They don’t depart of their own free will and the Home Office is too under-resourced or inept to deport them.

This is only partly because many British judges favour an extreme interpretation of European laws about deporting people to any country whose legal system they believe is not up to British standards.

It is mostly because the Home Office has never committed adequate resources to enforcing immigration law and tribunal rulings, removing those who have lost the right to remain, or keeping track of people who may have to be removed.

Indeed it is no exaggeration to say it is the inadequacies of the Home Office that have ensured Britain’s immigration laws are a joke.

Ms Rudd has a chance to change all this and shed her own reputation as a minister whose talents lie in public relations rather than running a government department.

Of course it will take money to enable the Home Office to properly do its job of safeguarding the country’s borders.

Ms Rudd’s predecessor was, for whatever reason, unable to demand the necessary extra cash from George Osborne to fund the Border Force and other vital, underfunded parts of the Home Office.

Shortly before the referendum it was revealed that the UK has only three coastal patrol boats for its entire coastline and that no one is keeping tabs on the country’s airstrips.

It is in everyone’s interest that the Home Office finally becomes fit for purpose. Once people know that Britain’s rules are enforced it will promote respect for the system, deter illegals, encourage migrants who play by the rules and make British voters feel less taken advantage of – all of which feeds social harmony.

Comment/OpEd: Theresa May’s Record as Home Secretary is Alarming, Not Reassuring (Spectator Blog, Jul 16, 2016)

Despite David Cameron’s experience as a marketing man, his skills at reputation management were feeble compared to those of Theresa May. May was not a terrible Home Secretary but she was not a good one, still less an outstanding one.

Yes, she remained in office for six years. But longevity in office is hardly proof of success, even at the Home Office. Anyone who has worked in a large organisation has encountered long-serving, apparently unfireable incompetents, and one thing that the history of the Cameron administration surely proves is that being bad at your job rarely leads to losing that job.

Some kind of strange magic has prompted pundits and analysts to forget all the misfortunes and scandals of her tenure. Now seems a good time to remember them, and to consider the type of leadership style that they suggest.

To begin with there was the outcry over a relaxation of border checks on non-EU nationals that came about because of ‘unauthorised actions’ by a head of the Border Force who took a sensible pilot scheme too far. Later came the absurd vans carrying billboards telling illegal immigrants to leave. Then, the strange, secret advice deal with Saudi Arabia’s interior ministry that involved Mrs May travelling to that country whose criminal justice system is infamous for its barbarity and medieval cruelty. After that came the mistreatment of female asylum seekers by Home Office contractors at the Yarl’s Wood detention centre, a scandal that deepened when Mrs May banned the UN’s special rapporteur on violence against women from visiting the centre.

There was also Mrs May’s decision to opt into the European Arrest Warrant, which allows immediate extradition without prima facie evidence to EU countries, some of which have corrupt, third-world-style judicial systems. And in the case of the three radicalised British schoolgirls who flew to Turkey to join Isis, their recruiters were correct in predicting that they would be able to leave the country undetected. (Three years later, the Home Office has still failed to institute embarkation checks at British airports.) Together these illustrate the unhappy, almost South American combination of authoritarianism and lethargy that marked so much of the Home Office’s trajectory during Mrs May’s leadership.

As well as the baffling, infamous mistreatment of Afghan interpreters who had worked with British forces in Afghanistan (for which she characteristically escaped censure), Mrs May’s tenure at the Home Office saw a number of troubling decisions about who is and isn’t allowed to enter the UK on ideological and public safety grounds. Whatever one may think of the controversial American bloggers Pam Geller and Robert Spencer, neither has advocated violence, and both have spoken all around North America without incident. Their 2013 exclusion from the UK should concern anyone who takes freedom of speech seriously.

Then there was the cynical political correctness. Mrs May talked about coming down hard on hate crimes and lambasted the police about a lack of diversity. But she abjectly failed to identify the child rape rings of Rotherham, Rochdale, Sheffield, Bradford and Oxford as the racially and ethnically motivated hate crimes that they were.

When it came to police reform, May’s efforts were far less ambitious or impressive than anything achieved by Gove in the Department for Education or the Ministry of Justice. She certainly deserves credit for forcing through changes in pay, conditions and pensions and for potentially improving police leadership by allowing ‘direct entry’ to senior positions from civilian life. But many of the more disturbing tendencies in British policing have become worse under her leadership, most obviously the distortion of policing priorities by public relations concerns.

The nadir of this phenomenon was Operation Midland, one of the most disgraceful episodes in the history of modern British policing. Millions of pounds were spent investigating allegations that various former ministers, intelligence chiefs and other top officials had been part of a paedophile ring that raped and murdered young boys. The fact that both Detective Superintendent Kenny McDonald, the senior officer on the case, and his boss, Sir Bernard Hogan-Howe, emerged from this sinister debacle with their jobs, was rather more illustrative of Mrs May’s real attitude to the policing establishment than her famous speech to the Police Federation in 2014.

May has also done little to reverse various policing trends that have alienated the public from the police, including the abandonment of neighbourhood policing, the substitution of decoy-like PCSOs and CCTV for beat patrols, and the massaging of crime statistics. At the same time Mrs May has given the nod to massive, transformative budget cuts that may genuinely make Britain’s police forces unfit for purpose. There is an argument that smaller overall numbers and decreased budgets don’t automatically translate to fewer frontline officers or less effective policing. But this assumes that forces are well-run and that resources aren’t so depleted that they cannot function. However, if many of the country’s 43 separate forces are poorly managed and are culturally inclined to prioritise exciting, fashionable or easy aspects of policing – such as trawling social media for hate speech – over patrolling the streets, then smaller numbers will definitely make a difference for the worse.

Thanks to the cuts and also to Mrs May’s disdainful attacks on certain aspects of police culture, she leaves the Home Office with police morale at what may be an all-time low (though nowhere near as bad as in the armed forces). It is telling that many police officers believe that her loud opposition to ‘stop and search’ and criticism of inadequate police diversity have been typical May opportunism that had more to do with image-management and personal ambition than any genuine concern for minorities or civil liberties.

And what of May’s record on migration? During the last election campaign David Cameron took brickbats for the fact that net migration into the UK had actually increased from 244,000 in 2010 to 330,000 in 2014 rather than being brought down to less than 100,000 as he had committed. Mrs May, the cabinet minister actually responsible for making the government’s commitment a reality, faced remarkably little criticism or even questioning about why this had failed.

The target was probably an impossible one given Britain’s chaotic border arrangements in 2010, and May certainly could not be blamed for the attractiveness of the UK and its work opportunities for young EU citizens from countries with ever-worsening youth unemployment. But you would be hard pressed to find evidence of serious and effective effort to repair or reform the parts of the Home Office entrusted with border security and migration.

As the then shadow home secretary Yvette Cooper pointed out, three years after May took over the ministry, the number of people refused entry had dropped by 50 per cent, the backlog of finding failed asylum seekers had gone up, the number of foreign prisoners removed had gone down, and the number of illegal immigrants deported had also gone down. Tens of thousands of international students kicked out of the country by the Home Office – in a panicked response to a TV documentary about a test cheating scam – then turned out to have been wrongly deported. Meanwhile, bogus colleges that falsify ‘student’ records so that foreigners can work illegally in the UK have continued to flourish because the Home Office has an inadequate number of staff assigned to checking them.

The Border Force, now a separate agency with spiffy new uniforms, is demoralised, overstretched and facing deeper, remarkably ill-timed austerity cuts. It has less than a handful of patrol boats to guard the coastline even as the migrant crisis deepens, and it is unable to keep any watch at all on the country’s many small airfields. Yet, remarkably, the Home Secretary never showed any inclination to stand up to the Chancellor on its behalf – even in the wake of the Paris and Brussels terrorist attacks.

Despite her carefully fostered reputation for toughness, Mrs May’s record on extremism is perhaps the least impressive aspect of her checkered tenure at the Home Office. Any public official who seriously addresses radicalisation, ghettoisation and extremism risks being labeled an Islamophobe or worse. It takes a brave politician, one more committed to doing the right thing than to securing a glorious political future, to take on this hornets’ nest; Mrs May was not such a politician. This began to be clear during the Trojan Horse affair, when official reluctance to confront radicalisation in Birmingham schools prompted a concerned Education Secretary to venture onto the Home Secretary’s turf. (Her characteristic fury at this trespass was damaging to both departments at the time, and may well wreak havoc in the new government. Certainly her firing of Michael Gove as Justice Secretary, despite the fact that his incomplete prison reforms have been universally lauded, looks like a destructive act of petty vengeance and personal spite.)

It became more apparent when Mrs May, having delivered some appropriate sound-bites, avoided potential career-inhibiting controversy by ensuring that the Home Office’s efforts to deal with tricky issues like female genital mutilation, honour killings and forced marriage remained as low key – and low impact – as possible. But it is even more obvious in the investigation Mrs May eventually set up into whether Britain’s Sharia courts, some legal, some not, might possibly discriminate against women in matters of divorce, domestic violence and child custody, as a result of a ‘misuse’ of Sharia teaching. (In the past the Home Secretary has implicitly claimed a surprising intimacy with Islamic law and political thought, asserting in 2014 that the actions of Isis ‘have absolutely no basis in anything written in the Koran.’)

Of course, Mrs May is hardly the first ambitious politician to have disregarded principle or even the public good in order to smooth her ride to the top. Her defenders would argue that a pragmatic lack of ideological ballast is one of the qualities she shares with David Cameron.

More disturbing are the tendencies that have caused her to be nicknamed Teflon Theresa or McCavity May. As well as the buck passing that ensured that blame for all of the Home Office’s failings fell onto junior ministers and civil servants, Mrs May and her staff put tremendous effort into ensuring that she rarely – if ever – faced a Paxman-style grilling. And so good were they at applying pressure on the media that remarkably few critical articles about her have ever been published. There’s even a peculiar tendency for those that have been published to be taken down or become unavailable.

Ronald Reagan once said there is no limit to what you can achieve if you don’t mind who gets the credit. Unfortunately it is also true that if you mind very much who gets credit and blame, then you are unlikely to achieve a great deal. As Home Secretary Theresa May was hobbled by her own ambition. Perhaps now she has the power and position she worked so hard to get, her main priority can at last be the country she serves.

(Originally published on the Daily Telegraph site July 1, 2016; taken down later that afternoon: see https://order-order.com/2016/07/02/read-full-article-pulled-telegraph-pressure-may-campaign/)

In the run-up to the 2015 election one of the handicaps David Cameron had to finesse was the fact that net migration to the UK was three times as high as he had promised it would be. Remarkably, none of the opprobrium this failure provoked brought forth the name of Theresa May, the cabinet minister actually entrusted with bringing migration down. Then, as now, it was as if the icy Home Secretary had a dark magic that warded off all critical scrutiny.

The fact that her lead role in this fiasco went unmentioned reflects Mrs May’s clever, all-consuming efforts to burnish her image with a view to becoming prime minister. After all, Mrs May’s tenure as Home Secretary has been notably unsuccessful. Its abundant failures include a succession of derelictions that have left Britain’s borders and coastline at least as insecure as they were in 2010, and which mean that British governments still rely on guesswork to estimate how many people enter and leave the country.

People find this hard to credit because she exudes determination. Compared to many of her cabinet colleagues she has real gravitas. And few who follow British politics would deny that she is a deadly political infighter. Indeed Theresa May is to Westminster what Cersei Lannister is to Westeros in Game of Thrones: no one who challenges her survives unscarred; the welfare of her realm is a much lower priority than her craving for power.

Take the UK Border Force. Despite the increased terror threat, it was already a dangerously underfunded and demoralised agency when Mrs May announced in April that its budget was to be cut. Then in May, after two people-smugglers’ vessels were found sinking off the Kent coast, the public discovered that the Force has only three cutters protecting 7,700 miles of coastline. Italy by contrast has 600 boats patrolling its 4,722 miles.

Considering the impression Mrs May gives of being serious about security, it’s astonishing that she has also allowed the UK’s small airfields to go unpatrolled — despite their attraction for traffickers of people, drugs and arms, and the urgings of the security services.

Then there is the failure to establish exit checks at all the country’s airports and ports. These were supposed to be in place by March 2015.

Unfortunately the Border Force isn’t the only organisation under Mrs May’s control that is manifestly unfit for purpose. Recent years have seen a cavalcade of Home Office decisions about visas and deportations that suggest a department with a bizarre sense of the national interest. The most infamous episode was the refusal of visas to Afghan interpreters who served with the British forces in Afghanistan – as Lord Guthrie said, a national shame. Mrs May has kept so quiet about this and other scandals – such as the collapse of the E-borders IT system, at a cost of almost a billion pounds – that you might imagine someone else was in charge of the Home Office.

It is not just a matter of the odd error. Yvette Cooper pointed out in 2013 that despite Coalition rhetoric, the number of people refused entry to the UK had dropped by 50%, the backlog of finding failed asylum seekers had gone up and the number of illegal immigrants deported had gone down. You’d almost imagine that Mrs May was so busy defending her turf and polishing her image that there was no time left to embark on the major reforms her department obviously needs.

The reputation for effectiveness that Mrs May enjoys mostly derives from a single, endlessly cited event: the occasion in 2014 when she delivered some harsh truths to a conference of the Police Federation. Unfortunately this was an isolated incident that, given the lack of any subsequent (or previous) effort at police reform, seems to have been intended mainly for public consumption.

In general Mrs May has avoided taking on the most serious institutional problems that afflict British policing. These include, among other things, a disturbing willingness by some forces to let public relations concerns determine their policing priorities, widespread overreliance on CCTV, a common propensity to massage crime numbers, the extreme risk aversion manifested during the London riots, and the preference for diverting police resources to patrol social media rather than the country’s streets.

There is also little evidence that Mrs May has paid much attention to the failure of several forces to protect vulnerable girls from the ethnically-motivated sexual predation seen in Rotherham and elsewhere. Nor, despite her proclaimed feminism, has Mrs May done much to ensure that the authorities protect girls from certain ethnic groups from forced marriage and genital mutilation. But again, Mrs May has managed to evade criticism for this.

When considering her suitability for party leadership, it’s also worth remembering Mrs May’s notorious “lack of collegiality”. David Laws’ memoirs paint a vivid picture of a secretive, rigid, controlling, even vengeful minister, so unpleasant to colleagues that a dread of meetings with her was something that cabinet members from both parties could bond over.

Unsurprisingly, Mrs May’s overwhelming concern with taking credit and deflecting blame made for a difficult working relationship with her department, just as her propensity for briefing the press against cabinet colleagues made her its most disliked member in two successive governments.

It is possible that Mrs May’s intimidating ruthlessness could make her the right person to negotiate with EU leaders. However, there’s little in her record to suggest she possesses either strong negotiation skills or the ability to win allies among other leaders.

It’s surely about time – and not too late – for conservatives to look behind Mrs May’s carefully-wrought image and consider if she really is the right person to lead the party and the country. There’s a vast gulf between being effective in office, and being effective at promoting yourself; it’s not one that Theresa May has yet crossed.

Review of “The Last Thousand” by Jeffrey E. Stern

A 2015 exposé on the Buzzfeed website created a stir by savaging the notion that the massive expansion of education in Afghanistan has been one of the triumphs of the international military effort. It was titled “Ghost Students, Ghost Teachers, Ghost Schools.”

“As the American mission faltered, U.S. officials repeatedly trumpeted impressive statistics— the number of schools built, girls enrolled, textbooks distributed, teachers trained, and dollars spent —to help justify the 13 years and more than 2,000 Americans killed since the United States invaded,” wrote a Pakistani-American journalist named Azmat Khan. The U.S. government’s claims are, Khan said, “massively exaggerated, riddled with ghost schools, teachers and students that exist only on paper.”

One-tenth of the schools that Buzzfeed’s employees claimed to have visited were not operating or had not been built. Some U.S.-funded schools lacked running water, toilets, or electricity. Others were not built to international construction standards. Teacher salaries, often U.S.-subsidized, were being paid to teachers at nonexistent schools. In some places local warlords had managed to divert U.S. aid into their own pockets.

The tone and presentation of the article leave little doubt of its author’s conviction that 13 years of effort in Afghanistan, including the expenditure of 2,000 American lives and billions of dollars ($1 billion on education alone), were pointless and the entire intervention a horrendous mistake.

Unfortunately, it is all but certain that some of the gladdening numbers long cited by USAID and others are indeed inaccurate or misleading, especially given that they are based in large part on statistics supplied by various Afghan government ministries. The government of Afghanistan is neither good at, nor especially interested in, collecting accurate data. Here, as in all countries that receive massive amounts of overseas aid, local officials and NGOs have a tendency to tell foreign donors (and foreign reporters) what they think the latter want to hear. They are equally likely to exaggerate the effectiveness of a program or the desperate need for bigger, better intervention.

Moreover it would be remarkable if there weren’t legions of ghost teachers. No-show or nonexistent salaried employees are a problem in every Afghan government department. This is true even in the military: The NATO-led coalition battled for years to stop the practice whereby Afghan generals requested money to pay the salaries of units that existed only on paper. As for abandoned or incomplete school-construction projects, such things are par for the course not only in Afghanistan but everywhere in South Asia. India, Nepal, and Pakistan are littered with them. You don’t read about them much because no development effort has ever been put under the kind of (mostly hostile) scrutiny that has attended America’s attempt to drag Afghanistan into the modern era. Given the general record of all development aid over the past half century and the difficulty of getting anything done in a conflict-wrecked society like Afghanistan, it may well be the case that reconstruction efforts by the U.S. military and U.S. government in Afghanistan were relatively effective and efficient.

Despite all the money that may have been wasted or stolen, there really has been an astonishing education revolution in Afghanistan that is transforming the society. It is an undeniable fact that the U.S.-led intervention in Afghanistan has enabled the education of millions of children who would never have seen the inside of a school of any kind had it not been for the overthrow of the Taliban. The World Bank and UNICEF both estimate that at least 8 million Afghans are attending school. This means that even if a quarter of the children who are nominally enrolled in school aren’t getting any education at all, there are still 6 million other kids who are; in 2001 there were fewer than 1 million children in formal education, none of them female.

To get a sense of what education can achieve in Afghanistan, even in less than ideal circumstances, you can hardly do better than to read The Last Thousand, by the journalist and teacher Jeffrey E. Stern. It tells the extraordinary story of Marefat, a school on the outskirts of Kabul. Marefat (the Dari word means “knowledge” or “awareness”) was originally founded in a hut in a refugee camp in Pakistan. After the fall of the Taliban regime in November 2001, its founder, Aziz Royesh, brought the school to Afghanistan and set it up on a windblown patch of desert west of Kabul. By 2012, Teacher Aziz, as he is known to all, had enrolled a total of 4,000 pupils and was sending students to elite universities around the world, including Tufts, Brown, and Harvard.

The school primarily caters to the Hazara ethnic minority, of which Aziz (a former mujahideen fighter) is a member. As anyone who read The Kite Runner or saw the movie made from the bestselling novel by Khaled Hosseini knows, the Hazara have long been the victims of oppression by the majority Pashtuns and Tajiks. The Hazara (who account for about 10 percent of the Afghan population) bear a double ethnic burden. They are Shiites—heretics in the eyes of the Sunni majority of the country. And they look Asiatic. Indeed, they are widely but probably wrongly believed to be the descendants of Genghis Khan’s invaders.

Hazara were traded as slaves until the early 20th century. As late as the 1970s, they were barred from government schools and jobs and banned from owning property in downtown Kabul. As if certain parallels to another oppressed minority weren’t strong enough, the Hazara are well known for their appetite for education and resented for their business success since the establishment of a democratic constitution, and they have enthusiastically worked with the international military coalition—all of which has made them particular targets of the Taliban.

From the start, Aziz was determined to give his students an education that would inoculate them against the sectarian and ethnic extremism that had destroyed his country. He taught them to question everything and happily educated both boys and girls, separating them only when pressure from conservative politicians put the school’s survival at risk. (When fathers balked at allowing their daughters to go to school, Aziz assured them that a literate girl would be more valuable in the marriage market.) Eventually the school also found itself educating some of the illiterate parents of its students and similarly changing the lives of other adult members of the school community.

The school’s stunning success in the face of enormous obstacles won it and its brave, resourceful founder affection as well as benefactors among the “Internationals”—the foreign civilian and military community in Afghanistan. When John R. Allen, the tough U.S. Marine general in command of all international forces in Afghanistan, finished his tour in February 2013, he personally donated enough money to the school to fund 25 scholarships. Thanks to reports about the school by a British journalist, a “Marefat Dinner” at London’s Connaught Hotel co-sponsored by Moet & Chandon raised $150,000 for the school in 2011. But by early 2013, Teacher Aziz was in despair for Marefat’s future, thanks to terrorist threats against the school and President Obama’s declaration that he would pull out half of America’s forces within a year regardless of the military and political situation in the country.

It’s a fascinating story. Which makes it a shame that much of it is told in a rather self-indulgent and mannered way. Stern’s prose tends to exude a world-weary smugness that can feel unearned, especially given some shallow or ill-informed observations on subjects such as Genghis Khan, Blitzkrieg, and the effect of Vietnam on current U.S. commanders, and his apparent ignorance of the role of sexual honor in Hazara culture.

Most exasperating, Stern patronizingly assumes an unlikely ignorance on the part of the reader. There are few newspaper subscribers who, after 15 years of front-page stories from Afghanistan, have not heard of the grand assemblies known as loya jirgas, or who don’t know that Talib literally means student. Yet Stern refers to the former as “Grand Meetings” and the latter as “Knowledge Seekers.” He also has his characters refer to Internationals as “the Outsiders,” even though any Afghan you are likely to meet knows perfectly well that the foreign presence comes in different and identifiable national and organizational flavors: Americans, NATO, the UN, the Red Cross, Englistanis (British), and so on. The same shtick apparently frees Stern from the obligation to specify an actual date on which an event occurred, or the actual name of a town or province.

Even so, The Last Thousand is a powerful and important book, especially in the way Stern conveys the sense of betrayal and the terror that many Afghans feel at the prospect of international abandonment. The Hazara children and staff at the Marefat school fear a prospective entente with the Taliban enthusiastically promoted by foreign-policy “realists” in the U.S. and UK. They correctly believe it would lead to cultural concessions that could radically diminish their safety and freedom—if not a complete surrender to murderous Pashtun racism and Sunni bigotry.

The book’s main characters are concerned by what seemed to be the imminent, complete departure of all foreign forces as part of the “zero option.” This option was seriously considered by the United States in 2013 and 2014 when then–President Karzai, in the middle of a bizarre descent into (hashish-fueled) paranoia and poisonous anti-Westernism, refused to sign a bilateral security agreement with the Western powers.

Aziz confessed to Stern (who was teaching English at the school) that he himself was in despair but was trying to hide his gloom from his pupils. He began to urge his students, graduates, and protégés—especially the female ones—to be less vocal in their complaints about discrimination against Hazara, and he himself began controversially to cultivate unexpected allies such as the Pashtun presidential candidate Ashraf Ghani. But Marefat’s staff, students, and their parents had few illusions about the future. As one young girl said to Aziz: “If the Americans leave, we know there is no chance for us to continue our education.”

Although the future of Marefat and its Hazara pupils is uncertain, it is comforting that so much has already been achieved by the education revolution in Afghanistan. Assuming that the Taliban and its Pakistani government sponsors are not allowed to take over or prompt a collapse into civil war, this revolution may well have a tremendous and benign effect on the country’s future. After all, more than 70 percent of the Afghan population is under 25 and the median age is 17. Unlike their parents, these youths have grown up with television and radio (there are more than 90 TV stations and 174 FM radio stations), cellphones (there are at least 20 million mobile users), and even the Internet. Their horizons are wider than anything the leaders of the Taliban regime could even imagine.

As Stern relates in a hurried epilogue, the bilateral security agreement was finally signed in September 2014 after Karzai’s replacement by a new national unity government. There are still U.S. and other foreign troops in Afghanistan, even if not enough.

In Stern’s sympathetic portrayal of the Hazara and their predicament, it’s hard not to hear echoes of other persecuted minorities who put their trust in Western (and especially Anglo-Saxon) liberator-occupiers. The most recent example is the Montagnard hill tribes of Vietnam who fought alongside U.S. Special Forces and were brutally victimized by the victorious Stalinist regime after America pulled out of Indochina. Something similar happened to the Shan and Karen nations of Burma, who fought valiantly alongside the British during World War II but ever since have had to battle for survival against the majority Burmans who sided with the Japanese. In today’s Afghanistan, Gulbedin Hekmatyar, the Pakistan-backed Taliban leader, has overtly threatened the Hazara with something like the fate of the Harkis, the Algerians who fought with French during the war of independence between 1954 and 1962: At least 150,000 of the Harkis were slaughtered with the arrival of “peace.”

The Last Thousand should remind those who are “war-weary” in the U.S. (which really means being weary of reading about the war) that bringing the troops home is far from an unalloyed good. Having met the extraordinary Teacher Aziz and his brave staff and students through the eyes of Jeffrey Stern, and knowing the fate they could face at the hands of their enemies, one finds it hard to think of President Obama’s enthusiasm for withdrawal—an enthusiasm echoed distressingly by several candidates in the presidential race—as anything but thoughtless, heartless, trivial, and unworthy of America.

The Portents of Labour’s Extreme New Leader

In October 2015, the American novelist Jonathan Franzen gave a talk in London in which he expressed pleasure that Jeremy Corbyn had just been elected leader of Britain’s opposition Labour Party. To his evident surprise, Franzen’s endorsement was met with only scattered applause and then an embarrassed silence.

Most of Franzen’s audience were the same sort of people likely to attend a Franzen talk in New York: Upper-middle-class bien-pensant Guardian readers who revile the name Thatcher the way a New York Times home-delivery subscriber reviles the name Reagan. For them, as for most Labour members of Parliament, the elevation of Jeremy Corbyn offers little to celebrate. Indeed, it looks a lot like a disaster—a bizarre and potentially devastating epilogue to the shocking rout of the Labour Party at the May 2015 general election.

Franzen probably imagined Corbyn as a kind of British Bernie Sanders, a supposedly lovable old coot-crank leftie willing to speak truth to power—and so assumed that any British metropolitan liberal audience would be packed with his fans. In fact, for all the obvious parallels between the two men, Corbyn is a very different kind of politician working in a very different system and for very different goals. Sanders may call himself a socialist, but he is relatively mainstream next to Corbyn, an oddball and an extremist even in the eyes of many British socialists.

It may seem extraordinary that a party most observers and pollsters were sure would be brought back to power in 2015—and that has long enjoyed the unofficial support of much of the UK’s media, marketing, and arts establishments—now looks to be on the verge of disintegration. But even if no one a year ago could have predicted the takeover of the party by an uncharismatic extreme-left backbencher with a fondness for terrorists and anti-Semites, the Labour Party might well be collapsing due to economic and social changes that have exposed its own glaring internal contradictions.

The first stage of Labour’s meltdown was its unexpected defeat at the general election in May 2015. The experts and the polls had all predicted a hung Parliament and the formation of a coalition government led by Labour’s then-leader, Ed Milliband. But Labour lost 26 seats, was wiped out by nationalists in its former heartland of Scotland, and won less than 30 percent of the popular vote. The Liberal Democrats, the third party with whom Milliband had hoped to form a coalition, did far worse. Meanwhile the populist, anti-EU, anti-mass-immigration UK Independence Party (UKIP) won only one seat in the House of Commons but scored votes from some 3 million people—and took many more voters from Labour than from the Tories.

Milliband’s complacency about and ignorance of the concerns of ordinary working-class people played a major role in the defeat. So did his failure to contest the charge that Labour’s spendthrift ways under Tony Blair had made the 2008 financial crisis and recession much worse. Perhaps even more devastating was the widespread fear in England that Milliband would make a deal with Scottish nationalists that would require concessions such as getting rid of Britain’s nuclear deterrent. He had promised that he would never do this, but much of the public seemed to doubt the word of a man so ambitious to be prime minister that he had stabbed his own brother in the back. (David Milliband was set to take over the leadership of the party in 2010 when his younger brother, Ed, decided to challenge him from the left with the help of the party’s trade unionists.)

In the old industrial heartlands of the North and Midlands, Labour seemed at last to be paying a price for policies on immigration and social issues that are anathema to many in the old British working class. As a workers’ party as well as a socialist party, and one that draws on a Methodist as well as a Marxist tradition, Labour has always had to accommodate some relatively conservative, traditional, and even reactionary social and political attitudes prevalent among the working classes (among them affection for the monarchy). Today the cultural divisions within the party between middle-class activists, chattering-class liberals, ethnic-minority leaders, and the old working class can no longer be papered over.

With the ascension of Tony Blair to the leadership of the party in 1994, Labour began to pursue certain policies practically designed to alienate and drive out traditional working-class Labour voters and replace them not only with ordinary Britons who had grown tired of the nearly two-decade rule of the Tories but also with upper-middle-class opinion leaders attracted to multiculturalism and other fashionable enthusiasms.

One can even make a kind of quasi-Marxian argument that the more bourgeois the Labour Party has become over the decades, the more it has engaged in what amounts to conscious or unconscious class warfare against the working class it is supposed to represent. One of the first blows it struck was the abolition of the “grammar schools” (selective high schools similar to those of New York City) on the grounds that they were a manifestation of “elitism,” even though these schools gave millions of bright working-class children a chance to go to top universities. Then there was “slum clearance,” which resulted in the breakup and dispersal of strong working-class communities as residents were rehoused in high-rise tower blocks that might have been designed to encourage social breakdown and predation by teenage criminals. But the ultimate act of Labour anti-proletarianism came after the Party was recovering from the defection of working-class voters to Thatcherism and its gospel of opportunity and aspiration. This was the opening of the UK’s borders to mass immigration on an unprecedented scale by Tony Blair’s New Labour. Arguably this represented an attempt to break the indigenous working class both economically and culturally; inevitably, it was accompanied by a demonization of the unhappy indigenous working class as xenophobic and racist.

In the 2015 general election, many classic working-class Labour voters apparently couldn’t bring themselves to betray their tribe and vote Tory—but were comfortable voting for UKIP. This proved disastrous for Labour, which had once been able to count on the support of some two-thirds of working-class voters. But these cultural changes made it impossible for Labour to hold on to its old base in the same numbers. And its new base—the “ethnic” (read: Muslim) vote, a unionized public sector that is no longer expanding, and the middle-class liberals and leftists who populate the creative industries and the universities—is simply not large enough.

Labour should have won the election in 2015; it lost because of its own internal contradictions. Out of the recriminations and chaos that followed the defeat, there emerged Jeremy Corbyn.

To understand who Corbyn is and what he stands for, it helps to be familiar with the fictional character Dave Spart, a signature creation of the satirical magazine Private Eye. Spart is a parody of a left-wing activist with a beard and staring eyes and a predilection for hyperbole, clueless self-pity, and Marxist jargon, which spews forth from his column, “The Alternative Eye.” (He’s like a far-left version of Ed Anger, the fictional right-wing lunatic whose column graced the pages of the Weekly World News supermarket tabloid for decades.) A typical Spart column starts with a line like “The right-wing press have utterly, totally, and predictably unleashed a barrage of sickening hypocrisy and deliberate smears against the activities of a totally peaceful group of anarchists, i.e., myself and my colleagues.”

The column has given birth to the term spartist—which is used in the UK to refer to a type of humorless person or argument from the extreme left. There are thousands of real-life spartists to be found in the lesser reaches of academia, in Britain’s much-reduced trade-union movement, and in the public sector. For such activists, demonstrations and protests are a kind of super hobby, almost a way of life.

The 66-year-old Corbyn is the Ur-Spartist. He has always preferred marches and protests and speeches to more practical forms of politics. He was a member of Parliament for 32 years without ever holding any sort of post that would have moved him from the backbenches of the House of Commons to the front. During those three-plus decades, he has voted against his own party more than 500 times. Corbyn only escaped being “deselected” by Tony Blair—the process by which a person in Parliament can be removed from standing for his seat by his own party—because he was deemed harmless.

Many of Corbyn’s obsessions concern foreign policy. He is a bitter enemy of U.S. “imperialism,” a longtime champion of Third World revolutionary movements, and a sympathizer with any regime or organization, no matter how brutal or tyrannical, that claims to be battling American and Western hegemony. Corbyn was first elected to Parliament in 1983, and many of his critics in the Labour Party say he has never modified the views he picked up from his friends in the Trotskyite left as a young activist.

This is not entirely true, because Corbyn, like so much of the British left, has adapted to the post–Cold War world by embracing new enemies of the West and its values—in particular, those whom Christopher Hitchens labeled “Islamofascists.”

One of the qualities that sets spartists like Corbyn apart from their American counterparts is an almost erotic attraction to Islamism. They are fascinated rather than repelled by its call to violent jihad against the West. This is more than anti-Americanism or a desire to win support in Britain’s ghettoized Muslim communities. It is the newest expression of the cultural and national self-loathing that is such a strong characteristic of much progressive opinion in Anglo-Saxon countries—and which underlies much of the multiculturalist ideology that governs this body of opinion.

As a result, many on the British left today seem to have an astonishing ability to overlook, excuse, or even celebrate reactionary and atavistic beliefs and practices ranging from the murder of blaspheming authors to female genital mutilation. Corbyn has long been at the forefront of this tendency, not least in his capacity as longtime chair of Britain’s Stop the War Coalition. STWC is a pressure group that was founded to oppose not the war in Iraq but the war in Afghanistan. It was set up on September 21, 2001, by the Socialist Workers’ Party, with the Communist Party of Great Britain and the Muslim Association of Britain as junior partners. STWC supported the “legitimate struggle” of the Iraqi resistance to the U.S.-led coalition; declines to condemn Russian intervention in Syria and Ukraine; actively opposed the efforts of democrats, liberals, and civil-society activists against the Hussein, Assad, Gaddafi, and Iranian regimes; and has a soft spot for the Taliban.

Corbyn’s career-long anti-militarism goes well beyond the enthusiasm for unilateral nuclear disarmament that was widespread in and so damaging to the Labour Party in the 1980s, and which he still advocates today. He has called for the United Kingdom to leave NATO, argued against the admission to the alliance of Poland and the former Czechoslovakia, and more recently blamed the Ukrainian crisis on NATO provocation. In 2012, he apparently endorsed the scrapping of Britain’s armed forces in the manner of Costa Rica (which has a police force but no military).

As so often with the anti-Western left, however, Corbyn’s dislike of violence and military solutions mostly applies only to America and its allies. His pacifism—and his progressive beliefs in general—tend to evaporate when he considers a particular corner of the Middle East.

Indeed, Corbyn is an enthusiastic backer of some of the most violent, oppressive, and bigoted regimes and movements in the world. Only three weeks after an IRA bombing at the Conservative Party conference in Brighton in 1984 came close to killing Prime Minister Thatcher and wiping out her entire cabinet, Corbyn invited IRA leader Gerry Adams and two convicted terrorist bombers to the House of Commons. Neil Kinnock, then the leader of Labour and himself very much a man of the left, was appalled.

Corbyn is also an ardent supporter of the Chavistas who have wrecked Venezuela and thrown dissidents in prison. It goes almost without saying that he sees no evil in the Castro-family dictatorship in Cuba, and for a progressive he seems oddly untroubled by the reactionary attitudes of Vladimir Putin’s repressive, militarist kleptocracy in Russia.

Then we come to his relationship with Palestinian extremists and terrorists. A longtime patron of Britain’s Palestine Solidarity Committee, Corbyn described it as his “honor and pleasure” to host “our friends” from Hamas and Hezbollah in the House of Commons. If that weren’t enough, he also invited Raed Salah to tea at the House of Commons, even though the Palestinian activist whom Corbyn called “an honored citizen…who represents his people very well” has promoted the blood libel that Jews drink the blood of non-Jewish children. These events prompted a condemnation by Sadiq Khan MP, the Labour candidate for London’s mayoralty and a Muslim of Pakistani origin, who said that Corbyn’s support for Arab extremists could fuel anti-Semitic attacks in the UK.

That was no unrepresentative error. As Britain’s Jewish Chronicle also pointed out this year, Corbyn attended meetings of a pro-Palestinian organization called Deir Yassin Remembered. The group is run by the notorious Holocaust denier Paul Eisen. Corbyn is also a public supporter of the Reverend Stephen Sizer, a Church of England vicar notorious for promoting material on social media suggesting 9/11 was a Jewish plot.

Corbyn’s defense has been to say that he meets a lot of people who are concerned about the Middle East, but that doesn’t mean he agrees with their views. The obvious flaw of this dishonest argument is that Corbyn doesn’t make a habit of meeting either pro-Zionists or the Arab dissidents or Muslim liberals who are fighting against tyranny, terrorism, misogyny, and cruelty. And it was all too telling when, in an effort to clear the air, Corbyn addressed the Labour Friends of Israel without ever using the word Israel. It may not be the case that Corbyn himself is an anti-Semite—of course he denies being one—but he is certainly comfortable spending lots of quality time with those who are.

How could such a person become the leader of one of the world’s most august political parties? It took a set of peculiar circumstances. In the first place, he only received the requisite number of nominations from his fellow MPs to make it possible for him to stand for leader after the resignation of Ed Milliband because some foolish centrists thought his inclusion in the contest would “broaden the debate” and make it more interesting. They had not thought through the implications of a new election system that Milliband had put in place. An experiment in direct democracy, the new system shifted power from the MPs to the members in the country.

The party’s membership had shrunk over the years (as has that of the Tory Party), and so to boost its numbers, Milliband and his people decided to shift to a system in which new members could obtain a temporary membership in the party and take part in the vote for only £3 ($5). More than 100,000 did so. They included thousands of hard-left radicals who regard the Labour Party as a pro-capitalist sell-out. (They also included some Tories, encouraged by columnists like the Telegraph’s Toby Young, who urged his readers to vote for Corbyn in order to make Labour unelectable.) The result was a landslide for Corbyn.

Labour’s leadership was outplayed. The failure was in part generational. There is hardly anyone left in Labour who took part in or even remembers the bitter internal struggle in the late ’40s to find and exclude Communist and pro-Soviet infiltrators—one of the last great Labour anti-Communists, Denis Healey, died this October. (This was so successful that the British Trotskyite movement largely abandoned any attempt to gain power in Westminster, choosing instead to focus on infiltrating the education system in order to change the entire culture.) By the time Corbyn took over, most of Labour’s “modernizers”—those who had participated in the takeover of the party leadership by Tony Blair and his rival and successor Gordon Brown—had never encountered real Stalinists or Trotskyists and lacked the fortitude and ruthless skill to do battle with them.

Unfortunately for the centrists and modernizers, many of Corbyn’s people received their political education in extreme-left political circles, so brutal internal politics and fondness for purges and excommunications are (as Eliza Doolittle said) “mother’s milk” to them. For example: Corbyn’s right-hand men, John McDonnell and Ken Livingstone, were closely linked to a Trotskyite group called the Workers Revolutionary Party. The WRP was a deeply sinister political cult that included among its promoters not only the radical actors Vanessa and Corin Redgrave but also the directors of Britain’s National Theatre. Its creepy leader Gerry Healy was notorious for beating and raping female members of his party and took money from Muammar Gaddafi and Saddam Hussein.

Most people in British politics, and especially most British liberals, had fallen prey to the comforting delusion that the far left had disappeared—or that what remained of it was simply a grumpy element of Labour’s base rather than a devoted and deadly enemy of the center-left looking for an opportunity to go to war. As Nick Cohen, the author of What’s Left: How the Left Lost Its Way, has pointed out, this complacent assumption enabled the centrists to act as if they had no enemies to the left. Now they know otherwise.

Another reason for the seemingly irresistible rise of Corbyn and his comrades is what you might call Blair Derangement Syndrome. It is hard for Americans and other foreigners to understand what a toxic figure the former prime minister has become in his own country. Not only is he execrated in the UK more than George W. Bush is in the U.S., Blair is especially hated by his own party and on the left generally. It is a hatred that is unreasoning and fervid in almost exact proportion to the adoration he once enjoyed, and it feels like the kind of loathing that grows out of betrayed love. Those in the Labour Party who can’t stand Blair have accordingly rejected many if not all of the changes he wrought and the positions he took. And so, having eschewed Blairism, they were surprised when they lost two elections in a row to David Cameron—who, though a Tory, is basically Blair’s heir.

Blair is detested not because he has used his time after leaving office to pursue wealth and glamour and has become a kind of fixer for corrupt Central Asian tyrants and other unsavory characters. Rather, it is because he managed to win three general elections in a row by moving his party to the center. Those victories and 12 years in office forced the left to embrace the compromises of governance without having much to show for it. This, more than Blair’s enthusiasm for liberal interventionism or his role in the Iraq war or even his unwavering support of Israel during the 2008 Gaza war, drove the party first to select the more leftist of the two Milliband brothers and now to hand the reins to Corbyn.

As I write, Corbyn has been Leader of Her Majesty’s loyal opposition (a position with no equivalent in the United States) for a mere 10 weeks—and those 10 weeks have been disastrous both in terms of the polls and party unity. Corbyn’s own front bench has been on the verge of rebellion. Before the vote on the UK’s joining the air campaign in Syria, some senior members apparently threatened to resign from their shadow cabinet positions unless Corbyn moderated his staunch opposition to any British military action against ISIS in Syria. (It worked: Rather than face open revolt, Corbyn allowed a free vote instead of a “whipped” one, and 66 Labour MPs proceeded to vote for air strikes.)

Any notion that Corbyn’s elevation would prompt him to moderate his views quickly dissipated once he began recruiting his team. His shadow chancellor, John McDonnell, is one of the few people in Parliament as extreme as he. While serving as a London councillor in the 1980s, McDonnell lambasted Neil Kinnock, the relatively hard-left Labour leader defeated by Margaret Thatcher, as a “scab.” A fervent supporter of the IRA during the Northern Ireland troubles, McDonnell endorsed “the ballot, the bullet, and the bomb” and once half-joked that any MP who refused to meet with the “Provisionals” running the terror war against Great Britain should be “kneecapped” (the traditional Provo punishment involving the shattering of someone’s knee with a shotgun blast). Recently he made the headlines by waving a copy of Mao’s Little Red Book at George Osborne, the Chancellor of the Exchequer. As Nick Cohen has written of Corbyn and his circle: “These are not decent, well-meaning men who want to take Labour back to its roots…they are genuine extremists from a foul tradition, which has never before played a significant role in Labour Party history.”

During Corbyn’s first week as leader, he refused to sing the national anthem at a service commemorating the Battle of Britain, presumably because as a diehard anti-monarchist, he disagrees with the lyric “God save our Queen.” Soon after, he declared that as a staunch opponent of Britain’s nuclear arsenal, he would not push the button even if the country were attacked.

He expressed unease at the assassination by drone strike of the infamous British ISIS terrorist “Jihadi John.” Corbyn said it would have been “far better” had the beheader been arrested and tried in court. (He did not say how he envisaged Jihadi John ever being subject to arrest, let alone concede that such a thing could happen only due to military action against ISIS, which he opposes).

Corbyn’s reaction to the Paris attacks prompted fury from the right and despair in his own party. He seemed oddly unmoved and certainly not provoked to any sort of anger by the horror. Indeed, he lost his chance to score some easy points against Prime Minister Cameron’s posturing. Cameron, trying to play tough in the wake of military and policing cuts, announced that British security forces would now “shoot to kill” in the event of a terrorist attack in the UK—as if the normal procedure would be to shoot to wound. Any normal Labour leader of the last seven decades would have taken the prime minister to task for empty rhetoric while reminding the public of Labour’s traditional hard stance against terrorism in Northern Ireland and elsewhere. Instead, Corbyn bleated that he was “not happy” with a shoot-to-kill policy. It was “quite dangerous,” he declared. “And I think can often be counterproductive.”

While there is no question that Labour has suffered a titanic meltdown, and that Corbyn’s triumph may mean the end of Labour as we know it, it’s not yet clear whether Corbyn is truly as electorally toxic as the mainstream media and political class believe him to be. What some observers within Labour fear is that Corbyn could indeed become prime minister after having transformed the party into a very different organization and having shifted the balance of British politics far to the left.

They concede that there is little chance of Corbyn’s ever winning over the 2–3 million swing voters of “middle England” who have decided recent elections. But they worry that in a rerun of the leadership election, Corbyn might be able to recruit a million or more new, young voters who have no memory of the Cold War, let alone Labour’s failures in the 1970s, and who think that he is offering something fresh and new.

It might not only be naive young people who would vote for Corbyn despite his apparent lack of parliamentary or leadership skills. In Britain, there is a growing disdain for, and distrust of, slick professional politicians—and for good reason. It’s not hard to seem sincere or refreshingly possessed of genuine political convictions if you’re going up against someone like David Cameron, who even more than Tony Blair can exude cynicism, smugness, and a branding executive’s patronizing contempt for the public. The fact that Corbyn is relatively old and unglamorous might also play in his favor; the British public is tired of glib, photogenic, boyish men. Corbyn and McDonnell are “an authentic alternative to the focus-group-obsessed poll-driven policies of the Blair days,” Cohen writes—but it is an authenticity based in “authentic far-left prejudices and hypocrisies.” Those prejudices and hypocrisies could sound a death knell for Britain’s historic role in advancing the Western idea—an idea that is, in large measure, this country’s greatest achievement.

In Britain and Across the World, An Age-Old Schism Becomes Ever More Bitter (Sunday Times, Jan 3, 2016)

http://www.thesundaytimes.co.uk/sto/news/focus/article1652282.ece

Original Version:

For most Westerners the Shia-Sunni conflict has been a confusing but distant phenomenon that rumbles along in the background of Middle Eastern and South Asian politics: an obscure theological dispute within Islam that only makes headlines when a Shia mosque is destroyed in Pakistan or Hezbollah blows up a Sunni leader in Lebanon. But the rumble has been getting louder, and yesterday’s execution of the prominent Shia cleric Sheikh Nimr al-Nimr by the Saudi authorities could well turn it into a roar that will echo throughout the Middle East and beyond.

Since the souring of the Arab Spring, and especially since the beginning of the civil war in Syria, outsiders have become more aware that this ancient sectarian division is reflected in the struggle between two bitterly opposed power blocs in the region: a conservative Sunni one led by the Saudis and a Shia one led and inspired by Iran. Their various proxy militias are fighting each other not just in Iraq and Syria but also in Lebanon and Yemen. As bad as that may seem, this war may well be about to expand to include divided countries like Bahrain and even the Saudi kingdom itself. And it is far from unlikely that the sectarian struggle could spread much further, even into our own cities.

Having misunderstood and even fostered Sunni-Shia tensions in the past, Western countries have tended to underestimate the importance of the Sunni-Shia divide in recent decades. You could see this in the poor planning for the Iraq war and in the fact that many Western media organizations covering that war initially had no idea if their translators and fixers were members of the Sunni minority and therefore likely to be supporters of the Saddam regime.

Sunnis of course make up the vast majority of Muslims around the world, and most of them are willing to live alongside Shia even if they don’t like or respect their beliefs. But hardline Sunnis and Salafists refer to Shia Muslims as Rafidah, a strongly pejorative term which roughly translates as “the rejectors” (a reference to the Shiite rejection of the first Caliphs in favour of Ali, the prophet Muhammad’s cousin and son-in-law), and see them as polytheists. As apostates, the Shia are worse and more deserving of death even than Jews and Christians.

You can sense this even in the UK. Most British Muslims are Sunni. The concern expressed by British Muslims about the killing of the faithful in wars abroad never extended to the tens of thousands of Shia civilians slaughtered in mosques and market places during the Iraq war. Nor have there ever been any demonstrations against the large-scale killing of Shia Hazaras by the Taliban, or murderous attacks on Shiite places of worship in Pakistan.

One line of Salafist thought sees the Shia as a fifth column set up to destroy Islam (by the Jews of course), and blames Shiite traitors for every Muslim defeat from the Crusades onwards. Hardline Shia are equally hostile to the Sunnis but have rarely been in a position to persecute them.

The mutual suspicion runs deeper than a theological dispute. It is political in that Sunni rulers fear that Shia minorities (or majorities in the case of Bahrain) may be more loyal to Tehran than to the countries they live in. But it can also take bizarre forms: In Lebanon, both Sunni and Shia believe that the other community is prone to disgusting sexual immorality, and there are said to be some Sunnis who think that Shiites have little tails.

In any case it is often hard for non-Muslim Westerners to get a sense of the depth and intensity of Sunni-Shia hostility or to understand when that hostility is likely to overwhelm or be overwhelmed by other political or ethnic concerns.

Iraq’s Sunni Kurds were long happy to ally with the country’s Shiite Arabs against the ruling Sunni Arab minority. In Gaza, Hamas is willing to accept support from Shia Iran while Islamic Jihad is not. On the other hand, senior Salafist clerics in Saudi Arabia celebrated Israel’s recent killing of a Hezbollah leader.

Recent developments have made Sunni-Shia hostility more lethal and more dangerous. One is the massive global missionary effort funded by Salafist and Wahabi princes in Saudi Arabia and other parts of the Gulf. Another is the growing power and aggression of Iran, its remarkably successful drive to establish a “Shia Crescent” from Iran through Iraq into Syria and Lebanon. A third is the weakening of forces that used to dilute Sunni-Shia hostility, such as secular Arab nationalism and Pan-Arabism.

A fourth is the sheer ferocity of the fighting in Syria. A leading Saudi cleric, Mohammad al-Barrak, recently tweeted that Shiites are more harmful to the faithful than the Jews “because [the Shiite’s] crimes in four years have exceeded all the Jews’ crimes in 60 years”.

How bad could things get? Both the Saudi and Bahraini monarchies could face genuine uprisings. Lebanon is already on the brink of another civil war. But even more frightening perhaps is the prospect of attacks on Shia targets in the many countries and cities around the world – including in Europe – where there are Shia minorities surrounded by Sunni majorities. These could and probably would lead to reprisal terror attacks, most likely by Hezbollah and Iran’s Revolutionary Guards, both of whom have carried out operations as far away from the Middle East as Argentina. Much depends on whether the Saudi monarchy can appease or control its own furious Shiite minority, and whether calmer heads will prevail in both communities around the world.

About a decade ago, a five-car convoy of Toyota Land Cruisers pulled up in a cloud of dust at a remote village on the edge of a South Asian mountain range. The passengers, all of them Westerners apart from an interpreter, walked over to where a canopy had been set up by an advance team the previous day. About 25 villagers were already there, enticed by free cookies and snacks.

One of the new arrivals gave a quick talk that was translated into a local language and then the others handed leaflets to the villagers. Then the foreigners climbed into their Land Cruisers and raced back to the relative safety and comfort of the capital. The leaflets concerned a micro-finance scheme, and the men and women handing them out were part of a project sponsored by the World Bank.

Not a single person in the village ever read the leaflets for the simple reason that no one in the village could read, a problem that had apparently not occurred to the people running the project. Nevertheless, the forms sent to Washington would, no doubt, confirm that outreach had taken place, that awareness had been raised, and that a key step had been taken in the process of helping members of an impoverished community help themselves.

This expedition took place in Afghanistan, but it could have been in any of a dozen heavily aided countries. While it would be an exaggeration to say that no local person benefited from this particular project (after all, its foreign and local employees probably contributed a good deal to the capital’s economy), its wastefulness was arguably a betrayal both of the taxpayers who funded it and of its purported beneficiaries.

If that weren’t bad enough, even if this particular project had been better conceived and executed, and awareness really had been raised, it probably wouldn’t have done much good. That is because micro-finance, celebrated as a development panacea, simply doesn’t work in certain cultures. It can be successful in quasi-matriarchal societies such as Bangladesh, where it was invented; but it has abjectly failed in violently macho cultures like those of Rajasthan or Pashtun Afghanistan.

The point of this story isn’t to imply that all aid is a similarly arrogant waste of effort and money, but to serve as a counter-anecdote: a reminder that real aid requires more than just good intentions, and a snapshot of the realities that all too often lie behind the heartwarming imagery and simplistic appeals to compassion used by aid advocates when selling the work of their vast global industry to the public.

To be fair, the aid industry has in the last couple of decades come to acknowledge that good intentions are not enough. Hence the conferences and academic papers on “aid effectiveness,” the shift to “evidence-based aid” and the increasingly rigorous efforts to understand what programs work with real people in specific cultures. Critics, skeptics and disillusioned practitioners such as William Easterly, an economics professor at New York University, are now given a grudging hearing rather than ignored or dismissed as apostles of heartlessness.

That is not quite the same as conceding that seven decades and trillions of dollars in development aid have had remarkably disappointing results, in stark contrast to the Marshall Plan that was its original inspiration. And you will rarely encounter any acknowledgement that those countries that have emerged from long-endured poverty and underdevelopment, for instance South Korea after the 1960s, or some of today’s booming African economies, have done so for reasons unconnected with aid.

Another awkward fact is that many of the attempts to bring accountability, transparency and value for money to the enterprise of development aid have actually made it less rather than more effective. U.S. aid efforts are especially compromised by oversight requirements that would be comical if they didn’t do such a disservice to both the taxpayer and the theoretical beneficiaries of aid.

USAID in particular is notorious for an obsession with “metrics” strongly reminiscent of the McNamara approach to “winning” the Vietnam war; an approach that inevitably prompts managers to favor projects that produce crunchable data, no matter how useless those projects might otherwise be.

Moreover, as the anecdote above should suggest, a great deal of aid data isn’t worth the time it took to input into a spreadsheet. The more impoverished, chaotic and badly governed an aid-receiving country is, the less you can or should rely on official data or even aid agency estimates of its birth rates, population, mortality, literacy, family size or income. Most statistics from basket-case countries, those in which it is too difficult or dangerous for researchers and officials to visit villages far from the capital, are a combination of guesswork and garbage.

No one knows, for example, how many people live in countries like Afghanistan that have not had a census in decades, let alone how much they live on or how long they live. Often, statistics from even the largest and best-funded aid organizations are based on marketing needs rather than rigorous research. For instance, last year a U.N. agency claimed that malnutrition has gotten worse in Afghanistan since the overthrow of the Taliban, even though it’s almost impossible to know with any degree of accuracy how good or bad malnutrition was anywhere in the country in 2001 or how bad it is in large swaths of the country today.

In general, those who market development or humanitarian assistance to the public are still unwilling to admit that delivering effective aid is difficult in the best of circumstances, and even harder in the ill-governed, chaotic, impoverished societies where it seems most needed. They are even less likely to confront the reality that foreign aid all too often does actual harm.

This awkward fact is true of both development aid and humanitarian or emergency aid. The former accounts for more than 85 percent of American foreign aid even though it’s less visible to and much less understood by the public. And, if you take seriously the criticism coming from a growing number of African dissidents, activists and intellectuals, it has contributed massively to the corruption, misgovernment and tyranny that has kept their nations mired in misery.

Even before people such as Zambia’s Dambisa Moyo, Uganda’s Andrew Mwenda and Ghana’s George Ayittey became a public relations nightmare for the aid industry, some economists had noted a correlation between being aided on a huge scale and subsequently experiencing economic, political and social catastrophe.

It was after the great increase in aid to sub-Saharan Africa that began in 1970 that per capita income dropped and many African countries endured negative growth. Other circumstantial evidence for aid as a corrosive force includes the fact that the countries that have received the most non-military foreign aid in the last six decades have a disproportionate tendency to collapse into anarchy: Besides Afghanistan and Iraq, the countries that have received the most aid per capita include Somalia, pre-earthquake Haiti, Liberia, Nepal, Zaire and the Palestinian territories.

It’s almost as hard to measure the alleged harm inflicted by aid as it is to find reliable and truly relevant metrics for aid success. On the other hand, the evident failure of many heavily aided societies speaks volumes.

How aid feeds corruption

As Mwenda said, having such a huge source of unearned revenue allows the government to avoid accountability to the citizenry. This is true of his own Uganda, where foreign aid accounts for 50 percent of the government’s budget. There, President Yoweri Museveni, once hailed as a model of modern, democratic African leadership, has responded to the generosity of the rich world not by pursuing the U.N. Millennium Development Goals, but instead by purchasing top-of-the-line Russian Su-30 warplanes for his air force and a Gulfstream private jet for himself.

Nor is it just in Uganda that foreign aid actually seems to discourage what donors would regard as good behavior. A recent study in the Lancet medical journal showed that aid funding earmarked to supplement healthcare budgets in Africa invariably prompted recipient governments to decrease their own contributions.

It also enables such governments to avoid or postpone necessary reforms, such as the establishment of a working tax system. In Pakistan, for example, a country with a significant middle class as well as a wealthy ruling elite, less than 1 percent of the population pays income taxes. Because states with little or no income from taxation cannot afford to pay decent salaries, this makes large-scale official corruption and extortion all but inevitable.

Aid feeds corruption in other ways as well. This is partly because large-scale, state-to-state aid has the same economic and political effects as the discovery of a natural resource like oil. But it is also because so many aid agencies will do almost anything to ensure that their good works can continue.

This is especially true in humanitarian intervention. Disasters such as earthquakes and tsunamis can be tremendous windfalls for ruthless officials in places such as Sri Lanka, India and Pakistan. The aid agencies know that if they want to help the poor and vulnerable, they will have to pay bribes to government officials. And the government officials know that the agencies they are extorting would rather pay up than close their offices and pull out.

Large inflows of development aid also seem to encourage political instability. This makes sense to the extent that once the state becomes the sole source of wealth and leverage, getting control of it for one’s own party or tribe becomes all the more important, certainly worth cheating, fighting and killing to secure.

Aid can also encourage a dependency that is not just morally problematic, but also dangerous. Food aid is particularly destructive. When foreign aid agencies hand out grain, it bankrupts local farmers or at least discourages them from sowing next year’s crops, all but guaranteeing future shortages.

Afghan people carry their ration of American wheat from the United Nations High Commission for Refugees (UNHCR) food distribution center in Kabul, Afghanistan, Tuesday, July 9, 2002. (AP Photo/Sergei Grits)

At the same time, governments that ought to be preparing for the next famine don’t bother because they assume that the foreigners will deal with the problem. The United States is by far the worst offender in this regard. Its food aid programs are now, and have always been, little more than a corporate welfare program for American agribusiness. They boost the bottom line of companies such as Cargill while wreaking deadly havoc abroad.

On the other hand, the United States has pioneered aid to encourage the civil society organizations that are essential checks on “poor governance,” that is, irresponsible and corrupt government officials. Unfortunately, such efforts are often undermined by other forms of aid such as budget support. After all, it’s deeply discouraging for third world anti-corruption campaigners, civil society organizations and political dissidents when they see foreign aid agencies talk about the importance of good governance, democracy and human rights while handing over yet more money to tyrants and kleptocrats.

One of the less dramatic but no less damaging side effects of humanitarian aid is the distortion of the local economy when aid agencies arrive to set up refugee camps or hand out emergency rations. Not only do prices go up for everything from water to fuel, but professionals abandon their jobs to work as interpreters and drivers. The standard aid agency/media salary of $100 per day can be more than a doctor makes in a month.

Then there are the “taxes” that the agencies routinely pay local warlords or garrison commanders to secure permission to operate or in return for “security” in dangerous regions. These payments sometimes take the form of food, radios or even vehicles. As a result, the armed payee is not only wealthier, and better able to continue the fight against his rivals, he also gains vital prestige; local people see that foreigners pay court to him.

Sometimes agencies go further and allow militias or equally vicious army units or oppressive political parties to control who gets food and water. This notoriously happened in the Hutu refugee camps in Goma and happens today in parts of Ethiopia.

Some moral compromise is inevitable in the grueling, dangerous business of emergency aid. But again and again, as critics Linda Polman, David Rieff and Michael Maren have shown, aid agencies have followed the path of “Apocalypse Now’s” Col. Kurtz in pursuit of their ideals. They have become the enablers and accomplices of murderous militias and brutal regimes, have prolonged wars, and have even collaborated in forced relocations. The refugee camps they operate have become sanctuaries for terrorists and rear bases for guerrilla armies.

The most infamous example of this was the aid complex that grew up in Goma, in what is now eastern Congo but was then Zaire, in the wake of the Rwandan genocide. There, as chronicled by Linda Polman in her devastating book War Games, the world’s aid agencies and NGOs competed fiercely to help the Hutu power genocidaires who had fled Rwanda with their families.

As so often happens, they ran the refugee camps, taxing the population, taking vehicles and equipment when they needed it, and controlling the supply of food to civilians so as to favor their members. Even worse, they used the Goma refugee camps as bases for murderous raids into Rwanda. The massacres they carried out stopped only after the army of the new Rwandan government crossed the border and overran the camps.

As Rieff pointed out, an analogous situation would have been if at the end of World War II, an SS brigade fled from the death camps it was administering and took refuge, along with its families, in Switzerland, and then, fed by aid workers, raided into Germany in an effort to kill yet more Jews.

There are many other examples of conflict being fomented and prolonged by those housing and aiding refugees, accidentally or deliberately. Refugee warriors, as some have called them, operating from the sanctuary of camps established by the United Nations High Commissioner for Refugees and others, have created mayhem everywhere from the Thai-Cambodia border to Central America and the Middle East.

Sometimes aid agencies have allowed this to happen as a result of ignorance. Sometimes it’s a matter of Red-Cross-style humanitarian ideology taken to the edge: a conviction that even the guilty need to be fed or a belief that providing security in refugee camps would be an abandonment of neutrality. And sometimes it’s because those providing aid are supporting one side in a conflict, as the U.S. and other Western countries did from Pakistan during the Soviet-Afghan war.

For decades, Syria, Lebanon and Jordan allowed or encouraged Palestinian refugee camps to become bases for guerrilla and terrorist activity. This should make it clear that the aid world’s traditional ways of dealing with refugee flows are inadequate. Even purely civilian camps such as Zaatari, the sea of tented misery in Jordan that houses a million Syrians, quickly became hotbeds of radicalism and sinkholes of crime and violence, not least because they are unpoliced and because they are filled with working age men with nothing to do.

The American way of aid

A wounded Laotian soldier (wounded in action at Long Cheng mid February) walks to his cot at the USAID managed and financed hospital at Ban Xon in February 1971, during the Nixon administration. (AP Photo/Horst Faas)

American foreign assistance is carried out by a number of different agencies. USAID, founded in 1961, continues to be the largest and most important. Its priorities have shifted over the years.

During the Kennedy and Johnson administrations, USAID emphasized the development of infrastructure and embarked on large-scale projects modeled on the Tennessee Valley Authority. President Nixon took American aid in a different direction, working with Hubert Humphrey to pass the so-called “New Directions” legislation that prompted a new emphasis on health, education and rural development. It wasn’t until the Reagan administration that USAID began to emphasize democracy and governance.

The next revolution in American foreign aid took place during the administration of George W. Bush, who almost tripled USAID’s budget. The Bush administration also began two huge aid initiatives outside USAID: PEPFAR, America’s HIV/AIDS program, which has been a great success, and the Millennium Challenge Corp. But the most radical Bush administration shift in aid policy may have been its increase in aid to Africa. Among other initiatives, Bush more than quadrupled funding for education on the continent. It was subsequently cut by the Obama administration.

Although aid is traditionally divided into two main types, development aid and humanitarian aid, one can categorize American aid in terms of the places it’s sent, bearing in mind that some amount of foreign assistance goes to 100 countries.

There is aid given to genuinely poor countries in an honest effort to help needy people there. Then there is aid to relatively wealthy states whose elites are too irresponsible to take care of their own people. A good example is the aid the United States sends to India, a country that can afford to send rockets to Mars and which has its own growing aid program, but whose ruling elite is content to tolerate rates of malnutrition, illiteracy and curable disease that are worse than those of sub-Saharan Africa.

A third category is aid given as a foreign policy bribe. This is not the same thing as aid used as a tool of public diplomacy, because its target is a foreign country’s ruling elite. The most obvious examples are Egypt and Pakistan. America gives Egypt money and in return the enriched Egyptian military, with its prestigious American weaponry, promises not to attack Israel. U.S. aid to Egypt has preserved peace, but it has not been successful in its secondary purpose of promoting economic development and political stability.

Aid to Pakistan is arguably less successful. Its purpose was to persuade the Pakistani military and intelligence establishment to reduce its sponsorship of Islamist terrorism in the region and in particular its murderous efforts to destabilize the U.S.-supported government in Afghanistan in favor of its Taliban clients.

A fourth category is aid given as part of what was called the war on terrorism, which has been dominated by reconstruction efforts in Iraq and Afghanistan.

A fifth, linked category is aid used for the purpose of public diplomacy. This has become increasingly controversial in the aid community.

Controversies about the utility and effectiveness of aid do not necessarily break down along conventional Left vs. Right ideological lines. Interestingly, people who identify with the Left rather than the Right have recently argued that foreign aid does not win friends for America and should not be seen as a useful tool of public diplomacy.

They often refer to Pakistan and a study that showed that American humanitarian aid after the 2005 Kashmir earthquake did not have a lasting positive effect on Pakistani attitudes toward the United States. This is a problematic argument, not least because Pakistan is a special case. It is a heavily aided country in which key state actors foster anti-Americanism and have done so for a long time. An American rescue effort in one corner of the country was never likely to win over the population, especially as the state played down that effort in order to make its own efforts seem less feeble.

Moreover, those who insist that aid does not win friends abroad or influence foreign populations may have philosophical and ideological reasons for taking such a view. Many in the aid industry prefer to see aid as something that should be given without regard to any benefit to the donor country, other than that feeling of having done the right thing that comes from an altruistic act. Others are politically hostile to efforts by Western governments to win hearts and minds as part of the war on terrorism.

My experiences in aided countries in Africa and South Asia tend to contradict this argument. In Somalia, for example, the vital but decayed highway between the capital and the coast is still referred to affectionately as “the Chinese Road” some three decades after it was built.

It’s also no secret that in many parts of Africa, you encounter positive attitudes to contemporary China thanks to more recent infrastructure projects, despite the abuses and corruption that so often accompany Chinese economic activity.

A general view of a refugee camp is seen in the city of Kabul, Afghanistan, Thursday, June 12, 2008. (AP Photo/Musadeq Sadeq)

In Afghanistan’s Panjsher valley, any local will tell you how grateful the people are for the bridges built by the U.S. Provincial Reconstruction Team before it closed. This is a localized response to a local benefit. It would be naive, though not unusual on the part of U.S. officials, to expect people in other Afghan localities to be grateful for help given to their fellow countrymen.

At the same time, there seems to be evidence that bad aid makes things worse. That could take the shape of shoddy or failed projects, projects that employ disliked outsiders, and programs that everyone knows have been commandeered or ripped off by corrupt local officials.

It is probably fair to say that effective foreign aid can win friends for America, but mainly on a local basis and only if it reflects genuine local needs and preferences, and if the beneficiary population is not already steeped in anti-American prejudice.

Aid and Afghanistan

Anyone who follows media reports about the American-led reconstruction effort in Afghanistan — the greatest aid effort since the Marshall plan — could be forgiven for thinking it has been a total disaster. But anyone who has spent time there and seen how much has changed since 2001 knows that this is nonsense. The millions of girls in school, the physical and economic transformation of Kabul and other cities, the smooth highways that make commerce possible are only the most obvious manifestations of success. At the same time, the waste, theft, corruption and incompetence are at least as spectacular as these achievements.

It is not clear that Afghanistan is a radically more corrupt society than other countries that have been the target of major aid efforts, that its ruling elite is uniquely irresponsible, cynical and self-interested, or that foreign government agencies and NGOs working there have been especially naive and incompetent.

But it’s important to remember that aid to Afghanistan is not just on a uniquely large scale, offering vast opportunities for theft, misdirection and waste. It’s also much more closely scrutinized than any other aid effort in history.

Afghan police officers drag a sack full of blankets. Soldiers were meeting with village representatives to assess their needs, provide humanitarian aid assistance and to gain intelligence about the region. (AP Photo/Rafiq Maqbool)

Nothing like the same level of skeptical attention has ever been paid by media organizations to development or humanitarian aid programs in sub-Saharan Africa or South Asia. Nor has there been an equivalent of the Special Inspector General for Afghanistan Reconstruction turning a jaundiced eye on big, notoriously inefficient U.N. agencies such as UNHCR, or the efforts of giant nonprofits such as Oxfam and Save the Children.

On the other hand, Afghanistan may well be uniquely infertile ground for development aid, thanks to decades of brutalizing war, a historically feeble state whose primary function has been preying on those who lack access to effective armed force and a traditional political culture in which no one expects government officials to be better than licensed bandits.

Much of the controversy that has accompanied the aid effort in Afghanistan has involved criticism of work by the Defense Department and the military.

No one who has seen how weapons systems are procured for the U.S. military would be surprised if some Defense Department-funded aid projects in Afghanistan turned out to be wasteful, ill-considered and poorly administered. But whether they are that much worse than efforts funded by other government departments such as the State Department or USAID is another question. That they have tended to attract particular opprobrium from news media and the special IG could simply reflect institutional dislike of the military or opposition to the Afghan war.

There is evidence that in many places the military did a better job of providing aid than USAID and the rest of the aid establishment. This was partly because the military wasn’t hamstrung by security concerns; unlike USAID, its employees were willing and able to go anywhere in the country. Local commanders with the ability to hand out funds may have lacked development experience, but they were there where help was needed and, unlike many aid professionals, saw no shame in asking locals what assistance they wanted.

USAID’s bureaucratic, box-ticking approach was arguably unsuitable for a country as damaged, impoverished, misgoverned, traumatized and dysfunctional as Afghanistan. Where the military decided to build schools, it did so quickly and efficiently, assuming that American or Afghan aid agencies would then find teachers, buy schoolbooks and make the projects sustainable.

USAID, by contrast, was required to get the relevant permissions from the ministry of education in Kabul and then provincial ministries, both of which were incompetent and corrupt, and was so slow in the execution of its mandates that its tardiness threatened to undermine the war for hearts and minds.

The Obama administration and Aid

Despite what one might have expected from the candidate’s internationalist rhetoric, foreign aid was far from a priority for the first Obama administration. Key positions such as the head of the Office of U.S. Foreign Disaster Assistance went unfilled for an unconscionably long time, and overall aid spending fell. To the extent that the administration paid attention to foreign aid, its primary concern seems to have been to reverse or undo the priorities of the Bush administration.

Accordingly, efforts to promote democracy and civil society in third world countries were defunded. Countries that had been given more aid as an apparent reward for joining the international coalition in Iraq were now penalized for the same reason.

The second Obama administration has seen a relative normalization of aid policy and an increase in overall aid spending. Although democracy promotion is not the priority it was during the Bush years, it is still a sufficiently important part of U.S. foreign aid to cause USAID to be expelled from Bolivia and Ecuador. In both cases, USAID was targeted by authoritarian left-wing governments for supporting the kind of civil society organizations that can make a genuine difference to bad governance in poor countries.

But normalization is not necessarily a good thing. It means that the United States is still committed to the U.N.’s absurd Millennium Development Goals, a vast utopian list of targets whose realization would, as Rieff has put it, amount to “quite literally, the salvation of humanity,” and which was always hopelessly unrealistic.

It also means that those guiding America’s aid efforts continue to be naively enthusiastic about cooperation with big business and to put excessive faith in the potential of high technology to solve third world problems. It has become increasingly clear that the Gates foundation and other new philanthropic giants are influencing the overall direction of U.S. development aid in undesirable ways.

In particular, it has meant a heightened, even feverish emphasis on technological solutions for development problems, as if cheap laptops or genetically modified crops might really be the magic bullet that “ends poverty.” As David Rieff has pointed out, such “techno-messianism” has often failed in the past. If you have a high-tech cyberhammer, all problems start to look like nails.

But reducing poverty, promoting economic growth and rescuing failing states are — this is the real lesson of six decades of development aid — extremely difficult and complicated things to achieve. One gain can lead to new problems in the way that lowering infant mortality may have contributed to overpopulation and therefore malnutrition and even starvation in some African societies.

The challenge presented by the influence of Gates and other tech billionaires on American government aid policy is not just a matter of a techno-fetishism even more intense than that of the rest of American society. It’s also a matter of priorities, of which problems get the most attention. It may be that the only thing worse than aid directed by ignorant box-ticking bureaucrats or by self-serving aid industry ideologues, is aid directed by the spouses of Silicon Valley billionaires.

The way forward

Foreign aid has so far not been a key topic for presidential candidates. Those who have said anything on the subject have tended to be relatively uncontroversial.

Hillary Clinton is especially keen on aid that benefits women and girls. Jeb Bush believes that aid is a vital instrument of U.S. foreign policy and approves of the administration’s aid boost to Central America. Marco Rubio is, perhaps surprisingly, a stalwart advocate of foreign aid, though not, of course, to Cuba while it remains in the Castro family’s grasp. His fellow Republicans Chris Christie and Mike Huckabee also see aid as key to America’s moral authority, though the latter is particularly enthusiastic about faith-based aid efforts.

On the other hand, both Rand Paul and Rick Perry have expressed a libertarian or isolationist suspicion of foreign aid, although the latter has also indicated that he thinks aid should be used more explicitly as a foreign policy lever, calling for aid to Mexico and Central American states to be withheld until those countries do more to stop the flow of immigrants to the United States. This matters less now that Perry has dropped out of the race.

Ted Cruz supported Paul’s 2013 proposal to withhold aid from Egypt after the military coup that ousted President Mohammed Morsi, but does not seem to be against the idea of giving aid to key allies. On the other hand, Cruz did say that same year that “we need to stop sending foreign aid to nations that hate us.”

Donated goods from the U.S. military are seen hanging from parachutes over an earthquake-hit area of Pakistan on Friday, Oct. 14, 2005. (AP Photo/Musadeq Sadeq, Pool)

It’s a reasonable sentiment that could resonate with the public. Carrying out such a policy switch would entail stopping aid to Pakistan (one of the countries where American aid is not only unappreciated but seems actually to feed anti-American resentment), the Palestinian territories (whose citizens are per capita the most aided people on Earth), Turkey, China and Russia.

Whatever Republican and Democratic candidates say now, it seems unlikely that questions of cutting or boosting or reforming foreign aid will play a major role in the 2016 election. That is, unless the migrant and refugee crises in the Middle East and Europe get so much worse that there are loud and popular calls for Washington to intervene in some way.

If that does happen, then it will probably be the military that once again leads an American humanitarian effort, assisted, with the usual reservations and resentments, by USAID and other agencies. It is worth remembering that during the Asian tsunami and Philippine typhoon disasters, no aid agency rescued as many people, did as much good or could have done as much good as the United States Navy, and that this was a source of pride for most Americans.

Is the Sun Setting on the United Kingdom?

The Abolition of Britain – From Winston Churchill to Princess Diana, by Peter Hitchens

A few years ago, I was hiking up to an observatory in Georgetown on the Malaysian island of Penang. On the steep, winding road to the top, I fell into conversation with a well-dressed middle-aged man, a Malaysian Chinese, who told me about the problems his daughter faced getting into university because of the regime’s nastily racist program that favored ethnic Malays and penalized the ethnic Chinese minority. It was unfair, unjust. “You’re British,” he said. “You should do something about this.”

It was touching and not a little sad that he thought British influence still counted for so much, and that he automatically associated the concept of fair play with the former colonial power. From a historical point of view, he wasn’t entirely mistaken: Over the centuries, many people — African slaves in agony in the Middle Passage, Hindu widows being burned alive, Indian travelers strangled by religious lunatics, Belgian civilians brutalized by Wilhelmine soldiery, and Jews being kicked to death by Nazi brownshirts — have all wanted the British to do something about it, and eventually they did.

But then Britain and its prestige are perceived differently abroad than at home these days — especially by the political class. When Peter Hitchens, the former Trotskyite who is now Britain’s most forthright conservative pundit, laments the “abolition of Britain,” he isn’t talking just about the Blair government’s formal destruction of the United Kingdom as a unitary state or even the modernizing Kulturkampf against such vestiges of the imperialist, racist, class-ridden past as the breeches worn by the Lord Chancellor and the popular Royal Tournament show of military pageantry.

He’s also talking about the long-term shift in national self-perception that allowed all this to happen — a shift, strangely enough, that accelerated as Britain left the strikebound malaise of the late 1970s for the prosperity of the 1980s and 1990s. Essentially, the British seem to have reacted, rather belatedly, to the loss of empire with an orgy of self-contempt. Pushed along by a middle-class minority who passionately desire the submersion of Britain in a European superstate, this peculiar self-loathing has made the British particularly vulnerable to a virulent form of PC multiculturalism and to the idea that Britain’s institutions and traditions are, at best, outmoded and absurd.

“We allowed our patriotism to be turned into a joke, wise sexual restraint to be mocked as prudery, our families to be defamed as nests of violence, loathing, and abuse, our literature to be tossed aside as so much garbage, and our church turned into a department of the Social Security system,” Hitchens writes in his concluding chapter.

We let our schools become nurseries of resentment and ignorance, and humiliated our universities by forcing them to take unqualified students in large numbers. . . . We abandoned a coinage which. . . . spoke of tradition and authority. . . . We tore up every familiar thing in our landscape, adopted a means of transport wholly unfitted to our small crowded island, demolished the hearts of hundreds of handsome towns and cities, and in the meantime we castrated our criminal law, because we no longer knew what was right or wrong.

Some of these changes were organic and others artificial (though Hitchens, to the detriment of his argument, rarely distinguishes the two). Some were initiated by Labour governments, but a surprising number were the work of Conservative administrations.

So, for instance, the foreign office under Margaret Thatcher pursued a relentless policy of post-imperial betrayal, beginning with hints to the Argentines that Britain no longer cared about the Falkland Islands and culminating in the selling of the people of Hong Kong to Communist China — after first removing their right to reside in the United Kingdom, so they’d have no leverage and nowhere to run.

And so, for another instance, the Tories under John Major took the country deeper into the European Union — while reciting the mantra that further integration into the emerging superstate was the only way Britain could hope to exert any influence, now that it was merely a “fourth-rate power.” (This phrase is always delivered in tones of such gloomy satisfaction, no one notices that such a “rating” ignores factors like economic strength, nuclear deterrents, seats on the U.N. Security Council, and cultural influence.)

But Tory surrenders of sovereignty pale beside the changes instituted by the “New Labour” government of Tony Blair. For the most part, the British population has been an unenthusiastic but oddly resigned witness to even more revolutionary changes. (Though the drive to abolish British currency and replace it with the Euro provoked a surprisingly vocal opposition.) The most important of these changes are the constitutional “reforms” carried out merely because the need for such changes was self-evident to the London media elite that calls the tune in British society.

The fact that the United Kingdom seemed to work — despite the oddness and antiquity and irrationalism of its constitutional arrangements — was declared irrelevant. Sure, it provided reasonable prosperity, liberty, and security at least as effectively as systems in use on the Continent (or across the Atlantic). Sure, it proved less vulnerable to economic and political storms than, say, the modern German state since 1870 or the various republics, empires, and monarchies that have ruled France since 1789. But that’s all ancient history. The key thing is that nothing about the old United Kingdom conforms to what the new British elite conceives of as “modernity.”

The idea that there might be risks in sudden, radical constitutional change, that for a constitution to be effective it needs legitimacy and the emotional allegiance of the people, is not one that Britain’s hyper-rationalist but parochial reformers have given much thought to, despite the warnings flashed from Yugoslavia. For the new public-sector middle class and the metropolitan media elite, a single idea is paramount: Britain is a musty, provincial place “held back” by dated, irrational institutions and a culture that wrongly venerates a history that is essentially a record of shame and oppression.

In its mildest form, this idea is manifested in the culturalist theory of British decline that influenced Thatcher as much as Blair: the idea that postwar economic failure is inextricably linked to the persistence in Britain of a culture of deference. Better policy might well have been found by asking instead how a pair of small islands off the coast of Europe managed to become the world’s most powerful nation for a century and a half, producing a fair number of the world’s best scientists, poets, admirals, and statesmen. But those old successes were dismissed. As the newly elected Tony Blair put it in 1997 — so memorably and tellingly, in marketing-man’s jargon — Britain desperately needs to be “rebranded” as a “young country.”

That the Blair government has been able to tear so much down in so short a time with so little effective opposition is one of the most fascinating mysteries of modern politics. After all, it’s rare for a perfectly viable system of government to be dismantled in a time of peace and prosperity. Peter Hitchens understands that Britain came to this pass because of a series of social and cultural changes, some of them inevitable results of postwar exhaustion and impoverishment, but many more of them the products of cultural and class warfare.

Unfortunately his Abolition of Britain is arranged in such a scattershot way that it conveys no real sense of either the chronology or the interplay of the various factors that broke British morale and allowed a resentful section of the population, without previous experience of power and responsibility, to make a revolution. Still, The Abolition of Britain is an entertaining and moving read that helps explain why certain key strata of the British middle classes are such enthusiasts for eliminating the things that make Britain unique. It offers a key to such mysteries as how the British state could actually prosecute merchants for using non-metric measures, jail a farmer for defending himself against brutal robbers, and arrest a man for the “racist” act of flying a flag above a pub.

There are so many effective anecdotes in Hitchens’s book that it is difficult to pick one as particularly telling. So, for symbolic concision, how about the abolition of the flag? It was in 1997, the year of Blair’s election, that British Airways removed the Union Jack flag from the tails of its aircraft and replaced it with “ethnic” designs that it hoped foreign customers would find more sympathetic.

The airline’s then-CEO, Robert Ayling, apparently feared that foreigners associated the British national flag with skinheads, soccer hooligans, and imperialism. This was not based, of course, on any polling of Africans or Asians or Europeans. But Ayling did know that the Union Jack is associated with skinheads and soccer hooligans and imperialism by the media folk and the professional middle classes who now control Britain. These are people far too well-educated and sophisticated to have any truck with anything as atavistic as national pride and who simply cannot conceive that anyone would see a Union Jack as a symbol of something positive. (Britain is not in fact a flag-waving country; its inhabitants have long been embarrassed by the kind of loud patriotism associated with their continental neighbors or the United States. But there’s a difference between this kind of reticence and actual hostility to the flag.)

Kipling once asked, “What do they know of England who only England know?” The Blairite elite, for all their vacations in French or Tuscan villages, have much less experience of the outside world than the imperial elite they replaced. It’s why they don’t know that the French, whom they worship, are utterly unembarrassed by the traditional pageantry being scourged in Britain and would not dream of deconcessioning the tricolore. Have the Blairites never seen the Communist deputies saluting, as mounted republican guardsmen in breastplates and horsehair plumes lead the Bastille Day parade, just in front of the tanks? Apparently not, which is another reason no one in the new ruling elite even questions the assumption that Britain is an embarrassingly Ruritanian society, long overdue for a thorough house-cleaning.

Still less do they doubt that a country properly cleansed of cringe-inducing vestiges of a quaint, elitist past like the changing of the guard, Oxbridge, red telephone boxes, hereditary peers, and the monarchy will be both more efficient and more popular with foreign tourists. For them it is an article of faith that new is better.

Alas for Peter Hitchens, impassioned, perceptive, and courageous though he is, the opposite is also an article of faith: For him, all change is bad. Hitchens actually laments the advent of central heating and double glazing, because families are no longer brought together by having to huddle around a single hearth. When he contrasts the Britain of Princess Diana’s funeral with the Britain of Churchill’s funeral, his case that everything has gotten worse includes the “crazed over-use of private cars” and “the disappearance of hats and the decline of coats.”

Indeed, if you were going to be harsh you might almost subtitle this book “A compendious diatribe of everything I hate about Britain today, with minor, aesthetic irritations given the same weight as the destruction of the constitution.” There’s a silly chapter in which Hitchens bemoans the famous trial of D. H. Lawrence’s Lady Chatterley’s Lover, which made it all but impossible for the British government to ban books on the grounds of obscenity. Then there’s his notion that the “American Occupation” of Britain from 1941 to 1945 introduced adultery to British womanhood — a claim that would have amused Lord Nelson and Lady Hamilton.

But the most bizarrely wrong chapter is the one that blames the satirical television and wireless programs of the late 1950s and early 1960s for destroying national unity. The idea that a culture that survived Alexander Pope and Jonathan Swift could be brought down by Dudley Moore and Alan Bennett is preposterous. And if comedy “made an entire class too ridiculous to rule,” then P. G. Wodehouse and perhaps even Charles Dickens are also to blame.

Of course many things are worse in Britain than they were during the 1950s, the decade that Hitchens takes as his paradigm for the real, lost Britain. Even people of the Left look with disgust upon Tony Blair’s “Cool Britannia” with its ubiquitous youth culture awash with drugs, its government by glib marketing men, its increasing corruption, the ever-spreading coarseness, and the startling ubiquity of violent crime (you’re now much, much more likely to be mugged or burgled in London than in New York).

But is it so terrible that the food is better, that there are sidewalk cafes, that middle- and even working-class people can afford to travel, that the state plays a smaller role in the nation’s economic life (though a far greater one in other realms)? Some of Hitchens’s nostalgia fixes on things that were not especially British, or not laudably so — like censorship, or the prosecution and blackmailing of homosexuals. Other things Hitchens sees as quintessentially British were, in fact, freakish phenomena of the postwar decades. In particular, the placidity and gentleness in those years was an artificial state, the result of exhaustion and wartime discipline.

Hitchens should know that for centuries European and other visitors were struck by the amazing pugnaciousness of the English and by their quick sentimentality. (Two enjoyable recent books, Jeremy Paxman’s The English and Paul Langford’s Englishness Identified, take up this topic.) From the eighteenth century on, Britons were seen even by their many European admirers as terrifyingly violent. That’s why small numbers of them were able to defeat large numbers of foreigners either on the continent or the battlefields of empire. The British soccer hooligan is a mere return to form. So, too, the Victorians were famous for their weeping: They kept emotional reserve for important moments, like when they were about to be tortured by Fuzzy Wuzzies.

It’s a shame The Abolition of Britain includes so much cranky fogeyism (including nostalgia for the flogging of teenage criminals). It’s a shame, because at its best this book combines superb reporting (especially about the hijacking of education by frustrated leftists) with a heartbreaking analysis of one of the strangest revolutions in history. And in many ways it is the most important of the torrent of books that have dealt with the crisis of British identity.

What Hitchens understands is that bourgeois New Labour is far more revolutionary than any government before — although, ironically, it learned just how easy it is to defy tradition and make radical constitutional changes from Margaret Thatcher, who abolished the Greater London Council merely because it was dominated by her political enemies. Hitchens rightly sees the New Labour “project” as a kind of politically correct Thatcherism with a punitive cultural agenda aimed at certain class enemies. The House of Commons’s vote to abolish fox hunting is a perfect example: an interference in British liberty enacted by our urban middle-class rulers in order to kick toffs in the teeth — one that will put thousands of rural working-class people out of work. When Labour was dominated by cloth-capped, working-class socialists, ownership of the means of production may have been at issue, but the party never threatened the structure of the kingdom. Tony Blair heads the least socialist, least redistributive Labour government ever. Yet at the same time he has used the legally unchecked powers of a House of Commons majority to enact the most revolutionary changes in the British constitution since the Civil War of the 1640s.

It still isn’t clear whether the Blair government sees its steady stream of attacks on the old order’s structure and accouterments as a clever and harmless way of distracting its genuinely socialist members and supporters from their fiscal conservatism, or whether they actually know that traditions and rituals are rather more important than marginal tax rates when it comes to destroying the old United Kingdom they despise.

Because the reforms, enacted swiftly and without serious debate, were intended mostly to proclaim the new government’s difference from the Tories, they followed no consistent theory. Scotland and Wales got separate parliaments but continue to send MPs to Westminster where they make laws for the English (some 80 percent of the population) who do not have their own separate parliament.

Of course, it never occurred to the Blairites — who see themselves as technocrats above primitive feelings of attachment to nation or any community other than their own cosmopolitan class — that by tossing bones to the Welsh and Scots nationalist minorities they might awaken the long slumbering beast of English nationalism. These people have lived so long under the protection of an inclusive British nationalism that they couldn’t imagine that English nationalism, fed by growing submission to Europe and the unfair favoring of Scotland, would of necessity be racial and resentful. When a few old souls mentioned the danger of awakening nationalisms after centuries of peace and comity, they were laughed at by the Blairites. Now you see all over England the red cross of St. George, a symbol from the medieval past that spontaneously appeared in the hands of soccer fans and on the dashboards of London taxicabs. It’s enough to make Hitchens warn of “interesting times” ahead — in the scary sense of “interesting.” As he says, “When a people cease to believe their national myths and cease to know or respect their history, it does not follow that they become blandly smiling internationalists. Far from it.”

Of course, you can detect in the Blair generation’s discomfort with Britain’s past an element of envy and insecurity. It cannot be easy for middle-aged Britons to look back on the achievements of their fathers and grandfathers (who defeated Hitler and the Kaiser), or, worse still, those of their great grandfathers (who brought peace and prosperity to millions around the globe), without wishing to denigrate those achievements.

But if you want to understand why a significant chunk of the British population loathes Britain and wants to undo it, you have to look beyond generational resentment to class. An acquaintance of mine was on his way to a party for the fiftieth anniversary of VE day in 1995 when he bumped into Jon Snow, a well-known British broadcaster and fairly typical figure of the new British establishment. He asked Snow if he too were going to a VE celebration. Snow sneered back that he was going to “an anti-VE day party.” Not for him any of that jingoistic nostalgia for World War II.

As Orwell pointed out, the English intelligentsia has always been severed from the common culture of the country. But in the 1930s, the intellectuals were joined in their alienation by a significant number of mandarins, upper- and upper-middle-class civil servants, who responded to democratization and the simultaneous decline of British influence by deciding that their country would be better off ruled by Nazi Germany or the Soviet Union.

The modern equivalent is to transfer one’s allegiance to the “European ideal,” which means, in practice, rule by the smooth bureaucrats of Brussels. For the remnants of the mandarin class, there’s something comforting in the idea that Britain and Europe can be run by a sophisticated international elite — made up of chaps not unlike themselves.

“Europe” also solves a status problem for the new public-sector middle class. Unlike the treacherous mandarins, these people have not lost position; they never had it. They therefore define themselves as being more “civilized” than the country-house toffs above them and the bigoted proles below. And they take to an extreme the retardataire notion that everything is done better on the Continent. The basic idea is that if you are the kind of person sophisticated enough to appreciate wine and cappuccino — rather than beer and tea — then, of course, you must favor the transfer of sovereignty from Britain to Brussels.

There are good reasons for Americans to study Peter Hitchens’s The Abolition of Britain. It won’t be a good thing for America if British PC multiculturalists manage to discredit the parent culture of the United States. More important, however, is the lesson about the fragility of culture that Americans should take from this book. In his famous essay “England, Your England,” George Orwell wrote, “It needs some very great disaster, such as prolonged subjugation by a foreign enemy, to destroy a national culture.” But reading Hitchens you soon realize that Orwell was wrong: A culture can be destroyed from the inside, as well.

American soldiers really aren’t spoilt, trigger-happy yokels (Daily Telegraph, 25 July 2003)

Whether the deaths of Uday and Qusay Hussein were self-inflicted or not, the military operation to capture them was immaculate. There were no American deaths, 10 minutes of warnings were given over loudspeakers, and it was the Iraqis who opened fire. So sensitive was the American approach, they even rang the bell of the house before entering.

The neat operation fits squarely with the tenor of the whole American campaign, contrary to the popular negative depiction of its armed forces: that they are spoilt, well-equipped, steroid-pumped, crudely patriotic yokels who are trigger-happy yet cowardly in their application of overwhelming force.

And, unlike our chaps, none of them is supposed to have the slightest clue about Northern Ireland-style “peacekeeping”: never leaving their vehicles to go on foot patrols, never attempting to win hearts and minds by engaging with local communities and, of course, never removing their helmets, sunglasses and body armour to appear more human.

As a British journalist working for an American newspaper, who was embedded with American troops before, during and after the conquest of Saddam Hussein’s Iraq, I know this is all way off the mark; a collection of myths coloured by prejudice, fed by Hollywood’s tendentious depictions of Vietnam (fought by a very different US Army to today’s) and by memories of the Second World War.

The American soldiers I met were disciplined professionals. Many of them had extensive experience of peacekeeping in Kosovo and Bosnia and had worked alongside (or even been trained by) British troops. Thoughtful, mature for their years, and astonishingly racially integrated, they bore little resemblance to the disgruntled draftees in Platoon or Apocalypse Now.

Yes, American troops wear their helmets and armour even though removing them might ease local relations. But it’s easy to forget that British troops in Northern Ireland have very often worn helmets when patrolling unfriendly areas. And the disaster that took the lives of six Royal Military Police officers in Majar may indicate that American caution – whether it means wearing body armour, or ensuring that soldiers have sufficient back-up or are always in radio contact with headquarters – isn’t so foolish.

And it’s simply not true that the Americans don’t patrol at all, patrol only in tanks or never get out of their vehicles. I accompanied foot patrols in Baghdad as early as April 13, only days after Saddam’s presidential palace was taken. The unit carrying out these patrols was also assigned to escort SAS troopers around the city. The SAS men told me how impressed they were, not just with the Americans’ willingness to learn from them, but with their training and self-control.

The idea that American troops are lavishly equipped is also a myth, a fantasy bred out of resentment of American wealth in general. The battalion in which I was first embedded came to war in creaky, Vietnam-vintage M113 armoured personnel carriers, which frequently broke down in the desert.

The battalion fought in green heavyweight fatigues because the desert camouflage ones never arrived. And, though a shipment of desert boots turned up just before the invasion, many were the wrong size, so that these GIs had to make do with black leather clompers designed for northern Europe in December. Perhaps most resented by the troops, they were not issued with bullet-resistant vests, only flak jackets, making them vulnerable to small-arms fire.

Another myth is that the Americans are softies who live and fight in amazing, air-conditioned comfort. The truth is that the GIs encamped in and outside palaces and Ba’ath party mansions lack not only air-conditioning but also running water, unlike most of the population they guard.

And, unlike their British counterparts, they have no communication with their families at home. Many British troops are able to use the “e-bluey” system to email their loved ones on a frequent basis. The only time most GIs in Iraq ever get to let their spouses know they are well is if a passing journalist lets them have a couple of minutes on the Satphone.

And I remember what a thrill it was when I got my hands on a British ration box after nearly three months on American MREs (meals ready to eat). GIs bored of endless variations upon chilli and macaroni were amazed to find that British rations included things such as chicken and herb paté. And they were willing to trade everything from boots to whole cases of their own rations to get some.

Though the US Army lacks our regimental system, different American divisions vary greatly in culture and experience. The Third Infantry Division – the unit that reached Baghdad first and took the city in a feat of great boldness – has been kept in Iraq because its soldiers are clearly better than newcomers at the difficult task of winning hearts and minds in a newly conquered country.

You could see this in the way the tank commander, Captain Philip Wolford, broke the rules and walked around the area his company controlled, alone and bare-headed, chatting with the locals and organising food, medical care and even employment. I wish that more British reporters had gone into the streets with 3ID men such as Sgt Darren Swain, a no-nonsense soldier from Alabama who is loved in the Baghdad area his men call “Swainsville” because, off his own bat, he takes humvees out every morning to provide security at local schools.

More recently, American soldiers have been charged with the sensitive task of searching those who enter the Palace district of Baghdad. One Shi’ite mullah felt it a great dishonour to be searched. The soldier responsible, Captain Wolford, agreed to take him round the back of the building and search him in private. Once there, the mullah agreed to be searched. Captain Wolford refused then to search him – the agreement to comply was enough. The gentlemanly approach much pleased the mullah.

It is because of this kind of sensitivity that the Americans have slowly and quietly achieved the intelligence triumph that led to the discovery and killing of the sons of Saddam Hussein.

Scorsese’s film portrays racist mass murderers as victims

Martin Scorsese is rightly the most lauded living American film-maker – a beacon of integrity as well as a brilliant talent. But his bloody, visually gorgeous new epic, Gangs of New York, set in Civil War-era Manhattan, distorts history at least as egregiously as The Patriot, Braveheart or the recent remake of The Four Feathers. In its confused way, it puts even the revisionism of Oliver Stone to shame.

The film works so hard to make mid-19th-century Irish-American street gang members into politically correct modern heroes (and to fit them into Scorsese’s view of American history as one long ethnic rumble) that it radically distorts a great and terrible historical episode.

It treats the founding Anglo-Saxon Protestant culture of America with an ignorant contempt – where it doesn’t cleanse it from history altogether. Generally speaking, Hollywood sees that culture not as the root of treasured democratic freedoms, but as a fount of snobbery and dreary conformism. The paradoxical result of this Hollywood faux-Leftism is that the movie ends up casually glossing over the suffering of black Americans.

Gangs begins with a brutal battle in 1846 between two armies – “natives” (presumably Protestant) and immigrant Irish Catholics – for the control of the lawless Five Points area of Lower Manhattan. The leader of the Irish (Liam Neeson) is slain before the eyes of his five-year-old son, by the Natives’ leader, “Bill the Butcher” (a superb Daniel Day-Lewis). The son grows up to be Leonardo DiCaprio, a tough youth who comes back to the neighbourhood 16 years later determined to avenge his father’s death.

By 1862, Bill the Butcher has incorporated many of the Irish thugs – including DiCaprio – into his own criminal organisation. Eventually he comes to see the boy almost as the son he never had. When the time comes for the two of them to square off, with DiCaprio in charge of the reborn "Dead Rabbits" gang, the Civil War is casting its shadow over the city in the form of the 1863 Draft Riots.

These began with assaults on police by Irish immigrants enraged by Lincoln’s conscription order on July 11, 1863. Very quickly, they turned into a monstrous pogrom, with a 50,000-strong mob murdering and mutilating every black they could find.

The Coloured Orphans’ Asylum was set on fire, followed by several black churches and the Anglican mission in Five Points. The city’s small German Jewish population was also attacked. Panicked blacks fled to the safety of British and French vessels at anchor in the East and Hudson rivers. Many drowned. Those who were caught were often tortured and castrated before they were killed.

In the film, you don’t see any of this. Instead, a voice-over quoting from telegraph reports briefly mentions some of the mob’s racist violence. What you do see is the suppression of the riot: blue-clad troops massacring crudely armed civilians of all ages and both sexes. The rioters stand almost impassive, and are cut down by gunfire and mortar shells lobbed from warships in the harbour (a bombardment wholly invented by the film-makers).

The film's narrator claims – and it's a flat-out lie – that the mob was a multi-ethnic uprising of the city's poor, that Germans and Poles joined with the Irish immigrants against New York's epicene patricians and an unjust conscription policy that allowed the wealthy to buy their way out of military service for $300. In fact the city's 120,000 German immigrants, many of them Catholics, took no part in the riots; there were almost no Poles living in the city; and the rioters were almost entirely Irish.

They were furious with the city's blacks because its free negroes were often skilled artisans, economically and socially a rung or two above the newly arrived Irish, many of whom didn't speak English.

Yet the film consistently portrays the “nativist” Yankees, led by Daniel Day-Lewis’s Bill the Butcher, as racists and the Irish underclass criminals, led by Leonardo DiCaprio, as multiculturalists avant la lettre.

The film's misrepresentation of the "natives" begins early on. While the film's Irish Catholics have a vibrant, energetic culture, the "native" Americans merely have prejudice. And you would never know that New York's population included substantial numbers of Orange Ulstermen – a hundred people were killed in the city's Orange-Green rioting as late as July 12, 1871.

Nor would you know from Scorsese’s depiction that Yankees – Northern Americans of English, Scottish, Welsh and Dutch extraction – increasingly thought that they were fighting the Civil War to abolish slavery. In the words of their favourite battle hymn, Jesus died to make men holy, and they would die to make men free.

The ending of slavery isn’t on Scorsese’s map, because its inclusion would be too difficult: it would require honesty and courage to reveal that his heroes – the Celtic predecessors of today’s beloved mafia – were on the wrong side of the most significant moral and political struggle in America’s history.

Nor would you know that many Irish volunteers fought with spectacular bravery on behalf of the Union. Instead, everyone villainous in Gangs of New York is either a white Anglo-Saxon Protestant or an Irish Catholic who has sold out to WASPs.

There's something bizarre about glorifying a subculture that fought to undermine Lincoln's war to preserve the Union and end slavery. Scorsese is treating racist mass murderers as heroes and victims. Yes, the Irish were cruelly abused in their adopted country. But it's a strange modern fetish that assumes that victims cannot also be victimisers.

If, as the ad copy goes, "America was born in the streets", it was not in the squalid, savage turf struggles of the Five Points, but in the streets of Boston and Lexington in 1775 – where the people traduced here as having no identity or qualities outside their xenophobia fought for the liberties that all modern Americans take for granted.