I’ve seen the same thing, mainly from the USA where the notion ‘Christian=American’ is still strong, frequently on Facebook, and rather less from the UK, mainly because I’ve tended to unfriend the people who are happy to post Britain First memes which make a similar case.

It seems to me that there are two extremes of thought, neither of which does us much good. One is to say ‘all religions are the same, and are religions of peace, therefore it is not Islam’s fault’, and the other is to say ‘these people have one thing in common: they are Muslims, therefore Islam is the enemy’.

Leaving aside the Bible for a moment (if you are not a Christian, but have stumbled onto this article for whatever reason, this should give you some relief), it seems to me self-evident that all religions are not the same, and a desire for peace is not the universal goal of religion.

It is far too contentious to discuss Islam and Christianity at this point, so let’s look at something I know a little about: the Æsir. Norse religion, up to the early Middle Ages, is well known to us from TV shows, Led Zeppelin’s Immigrant Song, and, to a much lesser and less popular extent, from extant Old Norse texts, historical accounts and archaeological findings. As someone who has laboured away translating Skírnismál into English, let me try to simplify matters by saying that the Norse religion for which we have evidence was not a religion of peace, but one of war. More exactly, it was a religion in which only death in battle could assure a place in Valhalla (pronounced with a glottal stop for the double ‘l’, in case you wish to be seen as erudite at dinner parties), and where the principal aim of Odin (pronounced ‘O-th-in’, if you’re still worried about the dinner party) was to gather warriors for the final battle.

Religion as such is not intrinsically about peace.

Equally, it seems clear to me that believing the teachings of your religion to be ‘true’ in any worthwhile sense means that you believe that which contradicts them to be ‘untrue’. This would be uncontroversial if we were talking about whether there should be one or two spaces after a full stop (it’s one, actually), or whether the answer to 1 + 3 x 2 – 6 / 3 is 5, or some other number. As it’s religion, even whispering that you think someone else is wrong seems to be desperately illiberal, and rather marks you out as the person causing all the problems in the first place. Perhaps if we chose the words ‘accurate’ and ‘inaccurate’, it might make things a little easier.
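For the curious, that bit of arithmetic can be checked in a couple of lines of Python, which applies the usual precedence rules (and whose `/` always yields a decimal result):

```python
# Multiplication and division bind more tightly than addition and
# subtraction, so 1 + 3 x 2 - 6 / 3 resolves as 1 + (3 * 2) - (6 / 3).
result = 1 + 3 * 2 - 6 / 3  # 1 + 6 - 2
print(result)  # 5.0
```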

Even so, this does not mean we have to fall out.

Disagreeing about whether Jesus is a saviour, a prophet, or an impostor, whether Abraham took Isaac or Ishmael or no one onto the holy mountain, and whether the extent of God’s revelation is the Hebrew Bible, the Hebrew Bible and the Christian New Testament, both of those and the Qur’an, or nothing at all, does not require us to be enemies. In fact, Christians, Jews, Muslims, atheists and others have managed to live peacefully in many different sets of circumstances over the last 1400 years. Some of those circumstances were intrinsically unjust, others were not.

When the early Christian missionary Paul arrived in Athens, the book of Acts in the New Testament relates that he was distressed by the city’s many idols. As a good Jew, brought up as a Pharisee and subsequently a convert to the new faith of Christianity (though not denying his Judaism), Paul was only doing what was culturally native to him in being distressed in this way. However, as Acts records, Paul’s response was not to declare a pogrom against idols, or even get up a petition. He went to the Areopagus, and addressed those there in these terms:

“Men of Athens! I see that in every way you are very religious. For as I walked around and looked carefully at your objects of worship, I even found an altar with this inscription: TO AN UNKNOWN GOD. Now what you worship as something unknown I am going to proclaim to you.”

You can (if you like) cast doubt on whether this is really what Paul said in Athens, whether there really was a Paul, and, indeed, whether there really is an Athens, but in as much as the framework of Christian belief is set out in the New Testament, this is what it is.

Paul’s writing, by contemporary standards, is at times quite intemperate. However, his intemperance is almost always directed at people who describe themselves as Christians but reject (in word or action) whatever it is that Paul was teaching at the time. Neither he nor any of the other New Testament writers ever call for Christians to make the pagans their enemies, to attack their temples, denounce their gods, or do anything else to engage in a war of religion. The notion of a religion as an enemy is fundamentally foreign to the New Testament Christian perspective.

“Ah”, you might say, “but Paul and his friends did not have to suffer what we are suffering at the hands of Daesh (or ISIS)”.

This is where things really do get interesting.

Up until Constantine, and already in the time of Paul, there were frequent violent, and often fatal, attacks by pagans on Christians. Some of these are recorded in the book of Acts (again, cast doubt if you like), and many others were recorded by the Romans themselves. Some had imperial backing, to the point of requiring all copies of the Bible to be destroyed, and the execution of believers. We have a relative wealth of writings from the period. Many questions were raised among theologians, and there were some quite bitter disputes, but these were mainly about the status of Christians who abandoned their faith under persecution and then wanted to return to it. At no point in the first centuries of Christianity did the prevailing mood of the church become ‘treat the pagans as enemies’.

Daesh is clearly causing an enormous amount of suffering in today’s world, but it is not different in kind from the persecutions the early Christians faced. They were not instructed to make other religions their enemies, nor, by and large, did they do so.

The New Testament records Jesus as saying (and Paul, later, as repeating) ‘love your enemies, and pray for those who persecute you’. In this sense, individual Muslims, Jews, atheists, or people who do not consider faith or lack of it to be particularly important in their lives, can be our enemies, and we can both love them, and pray for them. The notion of a religion such as Islam being an ‘enemy’, though, is foreign to the New Testament world view.

Let’s fast-forward this to the present day. The kind of people who post on Facebook and encourage me to ‘like and share’ (I don’t) that Islam is the enemy will say that I’m just playing with words here. Surely (they would argue) it is clear that it is Islam which is the unifying factor among our enemies, and that we are deluding ourselves if we do not call it out for being so.

Let me give three responses.

First, anyone claiming to be a Christian (everyone else can skip to the next paragraph) needs to recognise that to be Christian at all means abandoning the right to call other people ‘enemies’. ‘Love your enemies’ has the direct implication that, after a while of loving them, you no longer look on them as enemies. Elsewhere, Paul writes that Christians should do their best to live at peace with those around them, and Peter argues that if we’re going to be persecuted, it’s better to be persecuted for being a Christian than for another reason. Trying to describe Islam as ‘the enemy’ is trying to extend a concept when Jesus Christ is calling us to abandon it altogether.

Second, whether or not Islam is a religion of peace (and, as a non-Muslim, I cannot possibly see how I could be qualified to have an opinion on the subject, and I feel other non-Muslims should exercise the same circumspection), it is clear that the vast majority of Muslims in North Africa, the Near East, the Middle East, South East Asia, former Soviet Central Asia, in the UK and across the rest of the world do not support Daesh or sympathise with them. Even the Sun’s ragged piece of journalism which claimed (in contradiction to the results of the survey) that one in five Muslims sympathised with jihadis, would have meant that four in five had no sympathy with them (leaving aside the fact that ‘jihad’ doesn’t mean ‘waging war’ but rather ‘exertion’, and is more commonly found in giving to the poor than in fighting battles).

Third, even from an entirely secular point of view, we are slipping back into bad, twentieth century, habits, if we try to identify an ideology and demonise it. “First there was Nazism. Then there was Communism. Now there is Islam.” The rhetoric is easy, but the conclusions are false.

Nazism was a genuinely evil ideology, linked at its roots with the exaltation of the selfishness of one group of people at the expense of the lives of other groups. Very few ideologies in the whole of history have ever been so vile, which is perhaps one reason why it endured for less than a generation. Nazism was not merely a vicious ideology. It was an aggressive one, and we fought a war against it, at great cost. The Nazis genuinely were the enemy.

Soviet Communism was totalitarian, imposed using many of the techniques of Nazism (though also using many techniques invented among the ‘Christian’ Tsars), and existed in a long period of cold enmity towards the West. The ‘continuous revolution’ of Soviet Communism genuinely was a threat to the West, and the language (and equipment) of the Cold War meant that the notion of the Soviet Union as ‘the enemy’ was not too far-fetched. However, as the McCarthy witch-hunts showed, making communism the enemy (and, by extension, any form of socialism, and by extension to that, any disquiet with American capitalism) resulted in a great deal of injustice which reduced the freedom of western society. In making communism the enemy, we became more like the enemy.

We have got used to the idea of having an ideological enemy. With Nazi Germany and Soviet Russia gone, and communist China one of our biggest trading partners, it is all too easy to look around and see Islam as the next big enemy. To do so is sloppy thinking, flies in the face of history, and demonises hundreds of millions of people who have never done us harm, nor wished it.

When war is contemplated, it is usual to attempt to co-opt every part of society into supporting it. Western rulers have all too frequently put crosses on their shields, flags or tanks and tried to claim that ‘x is the enemy of Christianity’.

The truth is that Christianity is a faith entirely based on the notion of forgiveness, of loving one’s enemies, of seeking reconciliation. Co-opting it for war does violence to the faith itself. This is far from saying that Christians should be pacifists. There may come a time when many Christians conclude, from a reasonable perspective based on the New Testament faith, that it is their duty to go to war. But not a war which is ‘Christianity versus Islam’ or ‘Christianity versus Communism’.

If we do go to war in Syria, let it be because it has a realistic, strategic goal of saving lives and putting to an end an injustice which should and must be stopped. As Christians, let us resist to the end all notions of Islam as the enemy, and a war between two faiths. The symbol of the cross—if it is to be co-opted at all—belongs on medicines and food parcels, not on warplanes and munitions.

Morality. Of all types of opinions, we hold our moral ones most tightly, utter them uninvited most frequently, and are most outraged when they are breached. The sub-text of every tabloid shock headline is ‘and this should not be’. In journalistic and political writing, all we have to do is preface our complaint with ‘it is an outrage that…’, and we know that we have gained some kind of an aura as moral pundits, superior to those we criticise.

One of the very best places to find moral outrage is Facebook. Once the hangout of students and other cool folk, Facebook is increasingly populated by grumpy middle-aged people and others whose intent it is to police the moral values of, well, everyone. Students and other cool folk long ago migrated to Instagram and WhatsApp, and, as often as not, maintain their Facebook accounts just to keep in touch with grumpy relatives.

Alongside the personality quizzes, pseudo-maths puzzles (where have the pictures of cats gone?) and general chit-chat, Facebook is the best place to pep up your daily moral outrage by seeing pictures of things you don’t like, with comforting text implying that you are a morally superior being by not liking them. If you are of a liberal tendency, you can be outraged by Republicans, right-wing Christians, oil companies and -ists of many kinds. If you are of a conservative tendency, you can be outraged by Democrats, extremist Muslims, feministas and -ists of most other kinds. If you sit somewhere in the middle, pictures of an empty House of Commons debating something important may well fill the gap.

A bit of moral outrage seems to tickle a spot in most of us. Newspapers have been selling on that basis for years. Far be it from me to try to limit people’s access to moral outrage. However, I do want to argue that such outrage is not a part of our moral sense at all, and that it can be quite dangerous: my ability to be outraged by, say, Donald Trump, can easily give me the impression that I am a moral person.

Our very best moral outrage is naturally reserved for hypocrisy. Ever since Jesus pointed this one out, it has been a perennial favourite. Most of us are able to keep in mind the things that we oppose. If we catch ourselves doing them, we generally drop our opposition to them and get wound up about something else instead. By this means, we can retain our stance against hypocrites, while making sure that we do not fall into their category.

Hypocrisy, though, is the symptom, not the cause. It is an outward expression of our natural and innate tendency toward self-deception. That we have learned to catch ourselves (usually) before making hypocritical statements merely demonstrates that we have gained some skills in thinking before we speak, at least in regard to a rather unfashionable vice.

Let me attempt to delve a little deeper.

The foundation of our shared morality—it seems to me—is in an innate sense which we all have which is often referred to as ‘conscience’. Our sense of right and wrong is intuitive rather than rational—at least, for things which are ‘obviously’ right and wrong. Some people have argued that conscience is identical to empathy, but I think this leads us very quickly into a cul-de-sac. The two senses overlap in many cases, but there are issues of morality such as stealing from a corporation where it is very hard to argue that the failing was an empathic one.

To some extent, what we regard as conscience is heavily shaped by upbringing and by society around us. There is nothing intrinsically wrong with eating with your fork in the right hand and your knife in the left, but for most British people, this prompts the same kind of reaction as the temptation to take a larger slice of the pie than the next person. However, anyone who spends much time with children (or who remembers what it was like to be a child) knows that ‘it’s not fair’ is a cry that does not have to be taught. Whether children get the notion of fairness from their parents or not, it is one that resonates early on, and is much more easily retained than that business with forks and knives, even though it is an abstract concept.

Here is the tricky bit. For most of us, when we are deciding whether something that we are contemplating is right or wrong, we consult our conscience. If introspection does not give us an immediate answer, our moral reasoning tends to proceed on the lines of ‘this is like that, and that is wrong’.[1]

However, when making our moral pronouncements about the behaviour of others, we tend to first try to articulate a rule, and then apply it to other people. Making moral pronouncements would seem to be merely a distillation of our sense of conscience, but is it?

A few months ago, there was an article on the BBC website about filtering on motorways when three lanes are brought down to two lanes. According to the article, correct driving is to remain in your lane as long as possible. When filtering from the right, it is the other drivers’ responsibility to let you in. Interesting as it was, the article was nothing like as interesting as the comments that followed it. The language people were using was the language of morality, and, as the debate progressed, there were real signs of outrage. Just last week, LBC showed a cyclist and taxi colliding, and asked ‘but who is in the wrong?’ Perhaps the headline drew in those most interested in a moral debate. Either way, the language of the debate was moralistic, not technical.

It should be quite evident that the Highway Code is not a matter of intuitive conscience. It is a set of conventions for driving which form the basis for passing the UK driving test. As any driver coming from mainland Europe or the USA can attest, the most fundamental UK convention, that we drive on the left, is not intuitive at all. It certainly isn’t a matter of superior morality. What’s more, when a UK driver goes to Belgium and drives on the right, they are not being hypocritical by doing so.

I give this example because I hope it shows that our willingness to make moral pronouncements and then to reason from them is not actually a function of our moral sense at all. The commentator who argued (with quite a high implication of moral outrage) that it is a driver’s duty to filter as soon as ‘lane closed ahead’ was signed up (and the many who agreed with him) was doing exactly that. Most of the commentators on the LBC thread (at least, when I last looked) did exactly the same thing: they first created a ‘rule’, and then applied it to the situation. In most cases, the rule they created was not in the Highway Code at all, and the commentators who argued from the Highway Code were often dismissed by others.

I don’t intend to make a pronouncement about the cyclist or the taxi driver. What I’m more interested in is this as an example of our delight in rule-making and pronouncement.

Any system of justice, of course, relies on being able to make pronouncements. Roman law, indeed, relies on written rules, and even the Anglo-Saxon common law has come to rely increasingly on regulation, often codifying what was previously judged on precedent. But law and morality are not the same thing, and were never intended to be.

While legal regulations have mushroomed, popular moral prescriptivism has exploded. In the last few days I have read articles and seen memes that tell me that it is immoral to explain things, immoral to accept refugees if there is any homelessness in one’s own country, immoral to own guns, immoral to control guns, even (I assume in jest) that it is immoral to put up Christmas decorations early. Political Correctness brings with it an ever tightening set of strictures, but its opposite, US Republican-style anti-political correctness, seems just as laden with rules, just about different things.

There are only three moral thinkers who, to me, have contributed substantially to the debate, and all of them have sought to reduce a super-complexity of moral rules down to just one or two positions.

John Stuart Mill, building on the work of Jeremy Bentham, posited Utilitarianism as a single moral theory which allowed us to dispense with other rules. I don’t agree with Utilitarianism, but I recognise its importance. The notion of ‘the maximum good to the maximum number of people’, when taken alongside minimising harm, is a framework which large, impersonal bodies, such as corporations and governments, can use effectively in many circumstances. There are lots of examples of where applying Utilitarianism would produce a result which was unjust, or even evil, and many of these have been condensed into moral philosophy puzzles involving crashing trams and other such crunch cases.

Immanuel Kant proposed the notion of a Categorical Imperative, which is “Act only according to that maxim whereby you can at the same time will that it should become a universal law without contradiction”. From this he derives: “Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end.” Further, “Therefore, every rational being must so act as if he were through his maxim always a legislating member in the universal kingdom of ends.”

Kant’s notion of the Categorical Imperative is widely cited (though not quoted) as something which it is not, quite. On that, more in a moment. On grand moral questions, it is well worth asking “if I did this, and it were to become the universal rule, would I be happy?” However, on day to day moral questions, issues of personality and personal style crowd in. As an Extrovert Intuiting Thinking Perceiving person (though I’m actually borderline on three of those), I have a view of the kind of things that make me happy, and which I think would make everyone happy. Some people do want a world full of parties, trying out the latest gadgets, bright clothes, late nights and loud music. To others, this would be hell on earth. One of the reasons for the explosion of new moral prescriptivism is that many people now imagine that they are legislating members of humanity.

Nonetheless, when applied as a personal code, Kant’s view is, I think, preferable to Mill’s, and Mill’s to today’s ad hoc prescriptivism. Mill’s needs a calculator to operate, whereas Kant’s needs a bit of introspection.

The moral thinker (it should surprise no one that I make this claim) who I think takes us the greatest distance is Jesus of Nazareth. He proposes two ‘laws’: ‘Love the Lord your God’, and ‘Love your neighbour as yourself’. When pressed on ‘who is my neighbour’, the answer is ‘anyone you encounter’. His other formulation is ‘do to others what you would have them do to you’.

Kant’s Categorical Imperative is often confused with this last pronouncement, typically referred to as the ‘golden rule’. Its negative form, ‘do not do to others what you would not have them do to you’, is relatively widespread before Jesus, but he is credited as being the first to put it forward as an injunction to ‘do’ rather than ‘refrain from doing’. It differs from Kant’s in that Kant is saying that you should only do that which you would want to be a universal rule. Jesus’s is more direct: ‘would you want it? Then do it’, and requires nothing in the way of extended introspection.

What Mill, Kant and Jesus all have in common is that they are proposing one or two simple rules by which moral agents (ie, us) can evaluate the actions we are about to take. Mill’s view can be applied retrospectively, in the sense of ‘did that produce the maximum good?’, but that is not its intention. In each case, they are rules for us, rather than rules for us to impose on others. Indeed, neither Kant’s position nor Jesus’s can be applied to someone else. I cannot know whether, at the time, someone did something because it was what they would want done to them, or because they wanted it to be the universal rule, or for entirely selfish reasons.

If we could simply wipe out all the extra moral rules, the extra bits of ethics, custom, judgement, prescription, outrage and memification, and go back to any one of Mill’s, Kant’s, or Jesus’s formulations—in other words, have less but better morality, rather than more but bittier—then we would be in a much better position to evaluate our own behaviour ahead of time, and be possessed of a much better understanding that it really isn’t our business to evaluate other people’s.

So, for everyone poised to create that new meme, or to post an outraged remark on Facebook or as a comment to a BBC article, or to pen the newspaper article that prompts storms of outrage, or to make a speech in the House of Commons denouncing this group or that group, or to create new legislation that forces people to behave ‘better’ (whatever better is), let me offer one final moral remark, also from Jesus of Nazareth: ‘Do not judge others, lest you be judged yourself.’

[1] Piaget, followed by Kohlberg, of course, argues that this is the form of reasoning only engaged in by those who are relatively morally developed. However, experimental evidence has not generally supported their view: we see quite sophisticated moral reasoning among children, and, indeed, children’s literature which is popular among children tends to have an intuitive rather than rule-bound moral sense.

Tomorrow, Apple releases iOS 9, which will allow you to block online advertisements. It’s just the next step in a movement towards online privacy which is gathering energy from two opposite directions. On the one hand, web users are increasingly nervous about what commercial enterprise knows about them. On the other, they are worried about government.

Commercial enterprise has been in the news this year, government snooping last year. The Ashley Madison fiasco will doubtless make a lot more people nervous about registering their personal details on, ahem, ‘personal’ sites. What is more oppressive to most users, though, is the way cookies are increasingly used to follow you from one site to another. If you have cookies turned ‘off’ or ‘only for sites I visit’, have an ad-blocker installed, and have enabled ‘Do Not Track’, then you may not have noticed this. Turn cookies to ‘on’, though, and disable your other privacy extensions, and you enter a new web: browse something on Amazon and then go to Facebook. Depending on what you actually looked at, you will see Facebook’s ads change to match your interactions. Start surfing the web more generally, and all kinds of ads will pop up tailored for you.

This tailoring experience is eerily like the scene in Minority Report where the protagonist gets ads served to him on the street based on retinal scans—except he’s just had replacement eyes, and now the ads think he is a middle-aged Japanese man. Consumers have been clamouring for a more personal experience for years, but the way ads follow you round the web is actually quite creepy. A couple of weeks ago one of my Facebook friends changed his settings and found this happening. He thought he’d been hacked. In a certain sense, he had.

The Guardian Newspaper, one of the more significant web news-presences, now alerts you if you have ad-blocking turned on, inviting you to support the Guardian by other means. It’s an interesting approach, a bit like the new speed signs in Birmingham which say ‘Thank you’ if your speed is at or below the speed limit, rather than just shouting at you if it thinks you’re going too fast. It’s an attractive and consistent response from the Guardian. Others, notably the Murdoch press, have already put their newspapers behind paywalls.

There’s a funny little meme going round Facebook at the moment.

It purports to be a genuine letter to the UK Passport Office, sent by someone who cannot understand why the government needs him to confirm his address details, as they should already be on file. Actually, if you follow it through to the end, the irate citizen complains that his family have been in the country since 1776. Interesting year, 1776, as it wasn’t one of the bumper years for UK immigration, though it was the year of US Independence. A little online checking reveals that this is a meme that was originally American, popular in 2011, hastily ‘upgraded’ to British by someone who doesn’t know that we don’t carry ‘National Health cards’ to prove we are entitled to medicine.

What’s interesting is that the fictional correspondent holds up Sky, which installed a satellite dish in 1988 and still keeps contacting him, as an example of the way government ought to work. In 2011, that kind of ‘joined-up’ thinking was still popular, probably one of the last relics of the turn-of-the-millennium belief in the power of big data to help the individual.

I personally don’t have a real problem with government holding information about me. Every year I am delighted by the way DVLA allows me to do my car tax with essentially no fuss, and now allows me to access my MOT. GCHQ needs a warrant if they want to search my emails, in the same way that the police need a warrant if they want to search my house. There are checks and balances in place.

I’m altogether more worried about Google, eBay and others colluding on my data. Now, of course, the way cookies work, there is not actually any individual who has access to my browsing history. What happens is, when I go to Amazon, a cookie—a small fragment of data—is saved by my browser, which then sends it back with later requests to the domain that set it. Because advertising networks embed their content in pages across many different sites, a cookie set while I browse one site can identify me on the next, tailoring my experience as I go. It’s really just an extension of the way Amazon uses your product browsing history to offer you products you might find interesting.
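As an illustration, here is a toy model of the mechanism (a minimal sketch, not any real browser’s implementation, and all the domain names are invented). The point is that the browser hands a cookie back only to the domain that set it, but the same ad-network domain is embedded on many sites, so it can recognise you across all of them:

```python
# A toy model of third-party cookie tracking. Each domain gets its own
# cookie, but an ad network embedded on several sites sees its cookie
# on every one of them, linking the visits together.

class Browser:
    def __init__(self):
        self.cookies = {}  # domain -> identifier stored for that domain

    def visit(self, page_domain, embedded_domains):
        """Load a page: the page itself and every embedded domain each
        receive a request, and each may set or read its own cookie."""
        requested = [page_domain] + embedded_domains
        for domain in requested:
            if domain not in self.cookies:
                # First contact: the domain sets an identifying cookie.
                self.cookies[domain] = f"id-set-on-{page_domain}"
        # Each domain sees only its own cookie with the request.
        return {domain: self.cookies[domain] for domain in requested}

browser = Browser()
browser.visit("amazon.example", ["ads.example"])
seen = browser.visit("facebook.example", ["ads.example"])
# The ad network receives the identifier it set during the first visit,
# linking the two browsing sessions to the same person:
print(seen["ads.example"])  # id-set-on-amazon.example
```

Blocking third-party cookies, or an ad-blocker, breaks exactly this link: the ad network either never sets its identifier or never gets it back.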

Even that, though, may be alarming. A few months ago I bought some halogen lights for a display. Since then, for a few months, Amazon kept suggesting I buy products to help with home hydroponics: buying histories had associated purchasers of halogen lights with people who like to grow their own recreational substances. If you share a computer, you might not particularly want your partner, spouse, children, or, heaven help you, parents, to start checking the attic and the greenhouse in case you were growing your own.

Apple’s decision to introduce ad-blocking is partly an enhancement of its brand-offering: it’s positioned itself for years as the friend of your online privacy, never selling your data, not allowing the government to tap it, and pursuing Google for trying to circumvent the privacy settings in its Safari web-browser. It’s also a nice shot across arch-competitor Google’s bows: Google is an advertisement-serving company, and its entire revenue model is based around using your data to sell targeted advertisements to people who want to sell to you.

A few days ago I allowed Google to use my location in Google Maps. I’d turned this off some time before, and thought ‘what the heck, how can it hurt?’ Since then, all the websites like Buzzfeed and its cousins which serve up random advertisements have been telling me that a mum in Oxford makes £770 an hour, apart from the ones that tell me that an eerily similar looking mum in Worcester makes £770 an hour. I accept that even the biggest of big data sheds don’t know where Marlcliff is, given that there are people who live five miles away who have never heard of it. I’m a bit more surprised they haven’t heard of Stratford-upon-Avon, though, any day now, I look forward to seeing pictures of the California-tanned mother (sometimes described as a ‘hot’ mum, which may account for the tan) who will by then be earning £770 an hour in Shakespeare town. She is always pictured as just climbing out of a gull-wing sports car, which probably explains how she gets about so much.

Given that these advertisements are served up on pages where every single story appears to relate to Oxford (or Worcester), I doubt I will really be taken in. Nonetheless, there are some real problems, and they need to be addressed.

What’s really wrong with bad data privacy?

A lot of people might well say ‘what’s the harm?’, especially if they know that the information isn’t really leaving my computer to be stored in a gargantuan off-site database.

I think there are some very significant problems. Legislation lags ten years behind or more, but industry codes of practice and consumer pressure could work to solve them.

Problem 1: the vulnerable

Extreme personalisation creates an entirely false impression of trustworthiness. While most web users who happily share dodgy memes on Facebook are altogether more circumspect when it comes to parting with cash, there are large numbers of vulnerable people who are more easily taken in. If someone were to come round to a vulnerable person’s house every day, chat with them, and eventually sell them hundreds or even thousands of pounds worth of products they didn’t need, we would consider this to be a confidence trick and expect the perpetrator—if caught—to go to jail. If the operation of cookies, big data and online advertising does it, for now at least, we accept it.

Problem 2: data profiling makes you one of the vulnerable

Even without actually harvesting your personal data, big companies are able to use the masses of information about you and about people like you to build ever more accurate profiles. Most of us believe that we would resist the attentions of a confidence trickster, but, in reality, there are constant stories about how hard-nosed, alert, bright people have been caught in a sting of some kind. As often as not this is a newspaper-led sting, since those are the ones we read about. The ability to take someone for a ride is based on doing far more research than any of us would imagine anyone ever doing. Clearly, there is a limit to the number of stings of this kind that newspapers can fund. However, at the big data level, if you have all your cookies turned on, your ads unblocked and your location available, businesses can build more or less perfect profiles of you. Suddenly—because of the intensity of information—we are all in the vulnerable category, because we can be targeted with advertisements so tailored for us that they almost seem to be reading our minds.

Problem 3: your information is not safe

Over the past couple of years, Facebook, Twitter and Adobe have all contacted me to tell me that I need to change my password because of a breach of their own data protection. Facebook, Twitter and Adobe are big companies. How many small companies have been breached without ever finding out? How many knew there was an issue, but chose not to inform me? Generally speaking, these breaches aren’t a problem for me. Neither Facebook nor Twitter has my credit card details, and I use different passwords for different sites, thanks to Apple’s Keychain. However, if anyone had any illusions about this, we should now be aware that no website is unbreachable, no password uncrackable, and no promise of data integrity can really be kept. If the breaches can be contained to the sites in question, there is little problem. However, when the web ‘knows’ about you, the situation changes.

For myself, I no longer want to read about hot mums in Oxford or Worcester who make £770 an hour. I don’t want Facebook to know what I browsed on Amazon. I don’t want to be profiled alongside drug-growers because I bought some lights. To me these are irritations, and they are solved by switching on every method of privacy that comes with my Mac. As of tomorrow, I shall enjoy using iOS 9 to block these ads from my iPhone.

But that’s just me: right now it is only the irritation that gets to me, like the automated voices that ring me up to tell me about PPI, which I never had.

However, looking at the phishing emails which I receive, and which Sophos anti-virus very kindly warns me about, scammers are getting more and more sophisticated. I hope that I will always spot them, or the OS and the anti-virus will spot them for me, but I’m not entirely sure.

As for large businesses, their offerings are now so tailored and so sophisticated that I can be pretty certain that I’ve bought things I didn’t need at prices I didn’t have to pay, simply because of over-personalised marketing.

My parents grew up during the Blitz. My father once described to me how, for years, he could not hear the sound of an aeroplane overhead without fear. I was born in the late 1960s—part of the generation that believed the Bomb might fall at any moment: the ones who read When the Wind Blows, read leaked copies of Protect and Survive, and celebrated when the Berlin Wall came down.

In 1986, I stayed with a family in Würzburg. The mother told me how her mother had thrown her into a hay-cart crossing the border in 1961. A few weeks later, the East-West German border was closed. She never saw her mother again.

In the 1990s, the next-door neighbour at our office in Ghent was a man who had survived Auschwitz. He used to come in to use our photocopier. Sometimes, very gently, he would tell me about his experiences.

The community where I grew up, Stechford, was one of the rather less salubrious parts of Birmingham. It was multi-cultural before the notion was popular. Alongside teenage ribbing, I saw some genuinely nasty examples of racism, much of it behaviour copied from older children. As the ’70s and ’80s progressed, I felt that attitudes were improving. By 1988, my feeling was that most people agreed that racism was wrong, even if they struggled sometimes to put that into practice—hence the well-worn, and all too revealing, phrase, “I’m not a racist, but…”

By 2001, back when I first stood for parliament, the landscape had changed. I was out of the UK from 1988 to 1996, and in some ways I’ve never quite adjusted to some of the cultural changes that took place while I was away. I look blankly when people talk to me about 1990s TV shows, bands and cultural phenomena. All we could get was Radio 4 Longwave, entirely given over to cricket during Test Match Special—a delight to me, but less useful in following cultural development.

What really struck me in 2001 was that the casual racism of the 1970s had returned in a new form. This time it was directed at asylum seekers, or, as the press always referred to them, ‘bogus asylum seekers’. Technically speaking, of course, you cannot have a bogus asylum seeker. Anyone who is seeking asylum is doing so. If they are in the country illegally and not seeking asylum, then they are illegal immigrants, not bogus asylum seekers.

Even the term ‘asylum seeker’ was new to me. In the 1980s we still called people ‘refugees’. I don’t know when the change in usage happened, but, as I now understand, it was a fairly cynical ploy to change the way people think by changing the words they use. Britain has international obligations to refugees. ‘Asylum seekers’, by contrast, are merely people who are candidates to be ‘refugees’. Etymologically, this is nonsense: a refugee is someone seeking refuge. By logical definition, an asylum seeker is a refugee. By UK legal definition, and in the popular press, they are not.

During the First and Second World Wars, Britain welcomed enormous numbers of refugees. I learned today that a quarter of a million Belgians came to Britain during the First World War—one out of every 40 people in Belgium—and returned to their homes once the war was over. During the 1970s and ’80s, it would have been popularly unthinkable (though I’m sure it still happened) to turn down asylum seekers coming over from the Soviet Union—victims of the Gulags and the purges.

Today, people are telling me that Britain is full, that migrants are a ‘swarm’, that people are coming to this country paradoxically only because they want our jobs, and only because they want to claim benefits. I’ve seen dozens of Facebook memes shared by people who I really thought knew better alleging that migrants are housed in multi-million pound homes with thousands of pounds a month in benefits, while British ex-servicemen are forced onto the streets. My heart goes out to anyone and everyone who is forced onto the streets, but none of them were forced there by asylum seekers.

We did experience a worldwide recession in 2007. It was not caused by immigrants, migrants, refugees or asylum seekers, nor was it caused by East Europeans. It’s easy to blame the bankers, but the truth is that Western economies have been pursuing ever greater prosperity since the 1950s. We have been happy to vote in governments that relaxed rules on financial transactions, and happy to buy into trickle-down economics. We’ve turned a blind eye to the startling increase in wealth inequality, just as long as we ourselves became ever more prosperous.

What was once an aspiration—to be more prosperous than our parents, and to increase our standard of living year by year—is now regarded as a right. Collectively, we reacted with outrage when our prosperity dropped in 2007, and took more than five years to recover to, and then exceed, its 2007 levels.

During the same period, xenophobia—when measured by the success of avowedly xenophobic political parties, distribution of Britain First memes on Facebook, and the rhetoric used by the mainstream press and some mainstream politicians—has also risen. Commentators pointed out that this always happens during recessions. Clearly there is a correlation, but recession itself is not the cause.

Popular response to the refugee crisis, or migrant crisis if you prefer, has oscillated between compassion and selfishness. ‘Someone should do something’ versus ‘they must not come here’. Britain is 14th in the league table of European countries accepting refugees, notwithstanding the fact that refugees are far more likely to be able to speak English than most other European languages. Arguments about people being ‘economic migrants’, ‘illegally trafficked’ and so on do not wash. If Germany and Scandinavia can accept people, there is no reason why we should not.

It is true that there is a housing crisis in Britain—but this is a crisis of suitable accommodation in the south of England and in prosperous cities where the majority of high-paid jobs are to be found. In many parts of Britain, property prices are actually falling and properties go unsold or unlet. The people who climb into desperately unsafe boats to cross the Mediterranean, or who wait for months in a shanty-camp at Calais, are not asking to be housed in chic boho streets or quiet suburbs. They merely want to be somewhere where they can be safe, and where they can start to rebuild their shattered lives.

The ultimate cause of the refugee crisis is war. Politicians may claim that their main concern is to welcome only genuine refugees while excluding economic migrants, but we are seeing people flee their countries in large numbers right now not because they suddenly decided they wanted to become rich, but because war has driven them from their homes.

We did not start this particular war, but we cannot claim to have no responsibility for it. ISIL, the Syrian civil war, the Libyan crisis, the Arab Spring, the second Iraq war, the first Iraq war, the Iranian revolution and the ongoing crisis in Israel were all influenced by a pattern of British intervention in North Africa and the Near and Middle East which goes back to before Lawrence of Arabia. There are moments in that history of intervention which, in retrospect, we can probably be proud of. There are passages which, while well-intentioned, produced largely harmful results. Sadly, there were also interventions which it is hard to characterise as anything but purely self-serving.

Even if this were a conflict in which we had no hand, and never had, Britain remains a signatory to the international Convention and Protocol Relating to the Status of Refugees. This is not an onerous or burdensome protocol. It does no more than solidify the way in which Britain treated refugees during the First and Second World Wars, and extend that to cover refugees of subsequent conflicts and persecutions. It does not create an open door for anyone who happens to feel like it to come to Britain, nor does it offer refugees a better standard of living than the one they had before they came under threat. Furthermore, it has enough signatories that even the numbers of refugees leaving North Africa at this time could be easily distributed around Europe without creating any particular drain on any economy or national life. Of course, the one place least able to accommodate economic shocks—Greece—is working to satisfy its obligations right on the front lines.

Britain must stop posturing and playing politics. The next General Election is five years away. If the government, by some extraordinary generosity, were to welcome more than Britain’s ‘fair share’, everyone would have realised by 2020 that it had made very little difference to our national life. At the moment, we are in no danger whatsoever of accepting our ‘fair share’.

The very fact that we are trying to second-guess whether refugees are ‘genuine’ or not says an enormous amount about us, and nothing about the people claiming our help under international treaties, and under the common bond of humanity.

Even if our desire for prosperity has somehow corroded our compassion to the point that we no longer want to respond as our nation once did, there remains an underlying, unyielding moral and legal obligation.

By international law, we should help the refugees. By the most basic human morality, we must.


About…

My name is Martin Turner, I'm a political activist, chartered public relations practitioner, musician, committed Christian, and commentator on things technological, literary and otherwise creative.
