In a recent commencement address, Facebook founder Mark Zuckerberg called for the implementation of Universal Basic Income (UBI): a well-known and forward-looking concept in the social sciences, whereby the state provides everyone with an income sufficient to meet their basic survival needs, such as food, shelter and clothing, irrespective of whether they are gainfully employed.

I am greatly encouraged by this resurgence of interest in UBI amongst Zuckerberg, Musk and other Silicon Valley bigwigs. I noticed, however, that Zuckerberg’s speech in support of UBI drew a hostile reaction of non-trivial proportions across social media, and thought it worthwhile to say a few words in support of his cause.

Advancing technology and increasing automation are leading to fewer jobs.

People have a dire need for a financial cushion, so that they can educate themselves as adults, engage in quality parenting, or perform other productive activities at different stages of their lives that don’t provide a direct cash return.

Basic financial security plays an undeniable role in fostering entrepreneurship.

I would like to reinforce Zuckerberg’s case for UBI by expanding and extending his rationale. I am charitably assuming, of course, that Zuckerberg’s interest in the matter goes deeper than a mere desire to absolve people of the need to work so that they can spend all day on Facebook – a rather sardonic yet common enough reaction to his address.

Let us for a moment explore the ethical underpinnings of the objections to UBI. The commonest objection to UBI is that it gives people “a free lunch”, and thus “spoils” them. I saw this particular objection echoed time and again in the commentary surrounding Zuckerberg’s address. One commentator stated it with poetic elegance, citing the Good Book. “In the sweat of thy face shalt thou eat bread”, he chimed in.

I don’t buy this ethic. Iron Age religions codified our inherited instincts to forage and hunt, which were perfectly natural, into a dogmatic ethical principle that one doesn’t deserve to eat unless one has worked for it. There are two problems with this rather unfortunate paraphrasing of our natural instincts. The first is that nature sets no precedent for frowning upon idle eaters who reach out effortlessly for an easy meal procured by someone else. A male lion, mooching about on the savanna whilst the rest of his pride sweats hard to bring down a kill, may simply saunter over and dig into the carcass without causing any ill feeling.

Of course one sometimes has to “work” to obtain a meal in nature, but not everyone has to work for it all the time, and, more importantly, providing food to others is not something for which one need demand an explicit return. This is the second problem with the canonical viewpoint. Social animals such as lions, gorillas and meerkats instinctively understand that opportunity is the biggest success factor in nature, and the individual who “wins the bread” shares it without placing demands. An ancient Homo sapiens may have brought down a boar and dragged it over to his tribal dwelling, to be shared with his kinsmen with altruistic pleasure. Group cooperation, and adaptation for altruism amongst kin, are well-known Darwinian processes.

Civilization and the rise of religious ethics changed this protocol of feeding each other free of charge, by sub-optimally placing a mandatory barter value on a meal; it had to be obtained by working (and usually working for someone else). To put it plainly, we were told we have to toil for every f…ing meal. We were conditioned to feel squeamish if we had procured our lunch effortlessly, even if it harmed no one.

Another fallacy, which again I suspect has its roots in the folk psychology of religions, is the idea that poverty is the main driver of success. One particularly benevolent commentator on the Zuckerberg story had these words of wisdom to offer: “poverty will be merely a step you take towards success”. Really?! Contrary to this rather masochistic view, the majority of those I’ve met who lost jobs through no fault of their own will attest that their subsequent success depended hugely on how much financial support they received when they were “down”.

Rather than being a driver of employability, the fear of starvation often rushes and muddles the process of getting back on one’s feet. Your friends and relatives push you to take any kind of job, which often doesn’t match your skillset, causing further disruption to your career and more psychological distress.

I quote from a conversation the political scientist Charles Murray had with the philosopher Sam Harris, in which Murray says that an income stream actually improves moral agency, contrary to popular belief. It’s much easier for society to demand more from someone whose basic survival needs are already met. “Don’t tell us you are helpless, because you aren’t helpless; the question is whether you are going to do anything to further improve your lot” is something we can tell those who are unproductive yet receiving a basic income from the state. In contrast, far too many homeless people without a predictable income are powerless to land a job interview, simply because they can’t afford to dress tastefully. This fact reinforces Zuckerberg’s third point.

Let us bring in another perspective on Zuckerberg’s second point. Many young people sacrifice their best years helping others, at the expense of helping themselves to a comfortable salary. If one has raised a child (or looked after an aged parent or grandparent), one has discharged an important practical responsibility towards maintaining a civilized and productive society, for which one ideally ought to receive some material benefit. However, when such a dutiful person looks to make a living after a hiatus in paid employment, they often face a forbidding society that won’t employ them again because they have a “broken track record”, or are “too old”, or are judged “overqualified” if they seek a “lesser” job than the one they last held.

To expand on Zuckerberg’s first point, it is more than mere automation that future employment seekers must worry about. The demography of the working world is shifting towards the upper end of the IQ and EQ bell curves. The rise in importance of IT is a classic example. Coming from this industry myself, I can say that not everyone is cut out to be a good software engineer; in fact, very few people are. Successful lateral career moves into software engineering are an absolute rarity, and worse, the employability of graduates keeps dropping over the years. It is harder to become an expert software engineer in 2017 than it was to become a successful corporate executive in 1980, in real terms.

The eminent historian Yuval Noah Harari predicts that, barring other catastrophic possibilities like extreme climate change or nuclear war wiping us out, humanity is reliably on course towards freeing itself from the shackles of existential labor, and morphing into a species that spends most of its lifetime on recreational pursuits, whether intellectual or physical. Hence the title of his latest book, “Homo Deus” – human gods. Work, including food production and delivery, will soon be delegated to technology, and humanity will be left to worry about doing things to please themselves, or each other. This doesn’t sound like all too bad a predicament for us, particularly if one doesn’t subscribe to those silly Iron Age philosophies about the sanctity of laboring for one’s meal.

I’ve purposefully not discussed the economics of moving towards UBI and, ultimately, a labor-less, recreation-focused society. I’ll leave that discussion to the economists and other experts. Suffice it to say that a very promising trial is in progress in Finland.

However, I argue strongly against any moral objections to freeing ourselves from the need to labor for our basic needs. Social norms are evolving, and it’s time that we freed our minds of the ancient burden of mere survival, in order to move 100% into the more joyous space of innovation and recreation. Just as Homo erectus evolved towards freeing two of its four limbs to use tools and develop its mind, Homo deus ought to evolve towards freeing its mind of the worry of survival, and focus on developing its technology and the quality of its leisure time, at an accelerated pace.


Many years ago, my uncle – who was a doctor – told me, “There are no systems of medicine, just a system [singular]”. What he meant was that the only effective “system of medicine” known to humankind is the one that discovers new ways to heal the sick through rational supposition (about a drug or a clinical method), and subsequent confirmation through controlled experiments.

Over 150 years have passed since Pasteur and Koch confirmed the germ theory of disease, and it’s been nearly 70 years since the basics of health science and modern medicine were introduced into middle- and upper-school curricula in our own country. Yet I find that this foundational truth about the empirical nature of medicine has not taken root in the ethos of Sri Lankans. I see that a worryingly large number of compatriots believe there are several alternative “systems of medicine” at our disposal, such as “Western Medicine”, “Ayurveda Medicine”, “Acupuncture”, “Homeopathic Medicine”, “Astral Medicine”, “Alternative Medicine” or “Indigenous Medicine”.

Furthermore, the fact that there is a functioning government Ministry for “Indigenous Medicine” shows how far and wide this retrogressive misconception is entrenched in Lankan society. I believe it’s high time that movers, shakers and socially conscious individuals mustered their courage and the necessary resources to launch a massive campaign to educate the masses away from this harmful notion of alternative “medical systems”. There are many “disruptive” campaigns afoot in Lanka to raise society’s consciousness about problem areas like Gay Rights, Smoking, Drinking, Drug Abuse, Women’s Rights and Children’s Rights. The addition of PSEUDOMEDICINE to this list is long overdue.

If one were to properly survey the magnitude of the damage caused by so-called “alternative systems” of medicine, calculated as the loss of life, debilitation and needless discomfort caused by the maltreatment of diseases, and the money frittered away on bogus therapies for chronic or incurable conditions, one might be stunned by its enormity. It may well prove to overshadow the combined “cost” to the nation of the aforementioned problem areas, such as Smoking, Drinking and Drug Abuse.

I confess I’ve forgotten most of the details of elementary medical science that I learned at school; yet I was impressed enough by the subject matter to have etched in memory such useful principles as the Double-Blind Trial, the Hippocratic Oath (i.e. doctors swearing first to do no harm to the patient), the Germ Theory of Disease and How Infections Spread, the Theory of Immunity and How Vaccination Works, the Hereditary Nature of Some Illnesses, the Unreliability of Anecdotal Evidence, and the Difference Between a Virus and a Bacterium.

Let us recollect for a moment the concept of the Double-Blind Clinical Trial; if memory serves, this is something we learn about in our GCE O/L class. Any new medicine undergoes a trial period to test its effectiveness, during which neither the researchers of the drug nor its potential beneficiaries know who gets the potent pills and who gets the dummy pills that are thrown into the bargain to eliminate subjective human bias. We learned that an impartial third-party adjudicator randomly assigns patients to pills (potent or dummy), and that this same third party gathers the raw results, performs the statistical analysis, and presents only the final outcome to the research team.
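The protocol just described can be sketched as a toy simulation. Everything below is invented for illustration (the patient count, the recovery probabilities, the `run_trial` name); the point is only that random assignment and unblinding are kept with the impartial third party, away from both the “researcher” and the “patient”.

```python
import random

def run_trial(n_patients=1000, p_recover_drug=0.60, p_recover_placebo=0.35, seed=42):
    """Toy simulation of a randomized, blinded drug trial.

    The recovery probabilities are made up; in a real trial they are
    exactly what we don't know and are trying to estimate.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n_patients):
        # The impartial third party randomly assigns each patient a pill;
        # neither "researcher" nor "patient" logic inspects the assignment.
        gets_drug = rng.random() < 0.5
        p = p_recover_drug if gets_drug else p_recover_placebo
        recovered = rng.random() < p
        results.append((gets_drug, recovered))
    # Unblinding and analysis happen only after all data are collected.
    drug = [r for g, r in results if g]
    placebo = [r for g, r in results if not g]
    rate_drug = sum(drug) / len(drug)
    rate_placebo = sum(placebo) / len(placebo)
    return rate_drug, rate_placebo

rate_drug, rate_placebo = run_trial()
print(f"drug: {rate_drug:.2f}, placebo: {rate_placebo:.2f}")
```

With a genuine effect built in, the drug group’s recovery rate comes out reliably higher than the placebo group’s; with equal probabilities the two rates would differ only by noise, which is precisely what the statistical analysis must rule out.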

Have such impartial clinical trials ever been conducted to test the effectiveness of these so-called alternative methods of treatment? I challenge readers to present a single credible experiment conducted on popular “alternative medicines” like the thailayas, guliyas and arishtas of Ayurveda, published as a case study in a peer-reviewed journal. At best these substances facilitate the placebo effect – where a patient’s psychology improves immediately because they think they are under treatment, perhaps causing some degree of physiological improvement in turn, due to a reduction in stress. At worst, some compounds in these “medicines” (such as alcohol and heavy metals) can be toxic when ingested over long periods, aggravating the original condition or causing other illnesses to crop up.

I suspect, however, that the biggest problem is the countless unreported cases of patients delaying proper medical attention for their complaints because they counted instead on an “alternative” therapy to do its work. When their condition becomes acute, they are rushed to hospital, where oftentimes it is too late: septicemia or other complications set in, causing death.

The way so-called Western Medicine is administered in our country is far from perfect. Abuses range from incompetence, to the indiscriminate prescription of antibiotics for colds (which are caused by viruses and thus unaffected by antibiotics, unless there is a secondary bacterial infection that needs treatment), to delays in the treatment of acute infections for fear of accountability for side effects, to the administering of drugs without informing patients of their side effects or taking any precautions against them.

The naked truth though is that in spite of these common imperfections in its practice, the “Western” system of medicine remains the only effective and self-improving system of medicine available to us, and its benefits far outweigh its drawbacks. There is simply no comparison with “alternative medicines”; they are mere hocus pocus, and represent an early historical attempt at healing the sick. They were superseded by modern, evidence-based medicine around 150 years ago. We must move on.

What worries me most, and what I am trying to address in this appeal to Lankan society, is that the knowledge we are taught at school about health and medicine ought to shape our subconscious instincts about the world. Much as gravity makes us shy of heights, or the volatility of petrol makes us shy of lighting matches near open fuel tanks, one would expect the educated masses to shy away from pseudo-medicines and quacks reflexively. It is this instinct, to know when we are stepping outside of the medical system into woo-woo land, that I feel we ought to inculcate in our children.

The movement to educate society about how to look after ourselves and our loved ones in times of ill health is worthy of being elevated to a profound social campaign akin to human rights, anti-smoking or gay rights, where the consciousness of the masses is sensitized to the issue through direct action. Where are the NGOs promoting health awareness?

I am by no means advocating here that we must become completely mechanistic in our approach to helping sick people. All human existential problems in general, and illness in particular, must without doubt be approached with a touch of spirituality. I personally am an atheist, unless one considers a belief in a deistic order in nature that transcends parochial religion as being religious. Yet I can certainly empathize with a more religious-minded person who says a prayer for her loved ones to recover. Any compassionate human being ought to be able to relate to this need for an almighty’s help when one feels utterly helpless. However, a loving, spiritual approach to patient care clearly doesn’t include allowing charlatans to deceive patients and aggravate illnesses, or holding off on more effective treatments out of sheer ignorance. It is this ignorance that we must eradicate.

If you care about your loved ones, and want them to be able to get the best possible medical attention when they fall sick, then please join this campaign and echo this mantra.

When you fall sick

Let’s learn about our bodies,
and how we fall sick.
There is just one system of medicine,
that makes us well quick.

Or even if it doesn’t,
and it only eases the pain,
it’s far, far better,
than suffering in vain.

Do say a prayer,
to heal your sister,
but don’t waste your time,
take her to a doctor.

When you fall sick, charlatans will rush forth,
they will play upon your vulnerability, and take you up the garden path.
It’s only your education, and your desire to know the truth,
that will save you and your family, from the devil’s hearth.


I feel that wearing Burkinis (and indeed Burkas) doesn’t make good dress sense at this point in history. I particularly dislike the Burkini fad because I believe it symbolizes an outdated and implicitly offensive view of normative relations between the two genders. As civilized human beings, we have an obligation to conduct ourselves inoffensively in public places, if we can help it. Please allow me to explain myself.

I readily concede two possible handicaps, which may impair my judgment on this matter. I’m not a woman, and I’m not a Muslim. I’d be grateful to stand corrected, through rational discussion.

I believe that anyone has a human right to wear a Burkini. Any attempt to introduce a law banning Burkinis would violate so many fundamental human rights during the process of enforcing it, that such a ban would result in a moral travesty. Forcibly stripping the garment (and the dignity) of a woman is simply unthinkable to me.

Sadly, something of this sort happened in Nice last month. I am very disappointed with those French authorities that were responsible for this physical violence against Burkini wearers. I recoil from the notion that Muslim women must be “taught a lesson” physically, for revealing their religious identity through their clothing. If anyone wants to wage a “war” against what they feel is a highly offensive dress sense, then the proper thing to do would be to reach into the hearts and minds of the wearers.

I find nothing offensive in the mere physical appearance of the Burkini. Nor does it appear to be an impractical garment for the circumstances it was designed for. The Burkini is not quite like the Burka. Burkas were originally meant to be universal, commonplace clothing for women, yet they inhibit physical dexterity and restrict the range of activities available in today’s world, such as running for the bus, motorcycling, walking in the brush, exercising in the park or even driving a car.

The Burkini has no such shortcomings, in my view, within its envisioned purpose. It is more or less a loose, hooded wet suit, suitable for wading into the water, swimming (although a figure-hugging wet suit made of the proper material might be more streamlined), or even hanging about the beach while avoiding a suntan. Burkinis might also be useful for those with skin conditions or hair loss that they’d like to hide when taking a dip. They come in attractive colors, can complement a woman’s figure, and are pleasing to the eye.

I understand that the Burkini was designed with good intentions. Aheda Zanetti presumably developed it as a step forward in the emancipation of Islamic women, allowing them to swim or wade in public places without revealing their skin and hair, thereby helping them conform to the Islamic tradition of “modesty” in women. Women who wouldn’t swim before, for fear of raising eyebrows in Muslim society by wearing a “revealing” swimsuit, are able to swim now.

I can appreciate that some women, who have followed certain wardrobe habits through tradition, might feel awkward changing them. Perhaps it is similar to the awkwardness I felt the very first time I jogged in the park in running shorts (I was a very shy teenager). I agree that you cannot be forced to wear something you feel awkward in, such as a swimsuit.

I don’t think, however, that it’s a major leap of faith to change one’s dress sense. Islamic societies have changed dress patterns rather rapidly at various points in history, in countries like Iran, Iraq and Syria, and even in Sri Lanka, where I come from. Muslims have lived harmoniously in cosmopolitan Lankan society wearing both western and eastern (Sari) dress for centuries. It’s only within the past decade and a half that we Lankans have seen the Burka come into fashion amongst Muslim women. Their mothers didn’t wear them.

The Burkini and its “parent” garment, the Burka, cannot be isolated from the loud religious symbolism that underpins them. Everyone knows that only Muslim ladies may wear them, and that it would be an offence (in the eyes of a Muslim) for someone who doesn’t subscribe to Islamic teachings to wear them. This is quite unlike other traditional garments such as Saris or Kurtas, which were originally worn by a particular culture but carry no exclusionist philosophy. Christian Lankans and atheist Londoners have been seen wearing Saris and Kurtas for decades.

In a day and age where inter-cultural collaboration has led to better prospects for humanity, I feel it’s a little ostentatious to flaunt one’s inner religious beliefs as if they were the most important thing about oneself to announce to the rest of the world. I feel the same way about the garb of nuns or priests, although nuns and priests by definition are renunciants from society; they would presumably like to discourage interaction with other people, except for solicited religious discourse. For women of the world, working closely with men and women of other cultures and religious denominations, I wonder if this flaunting of one’s religion makes good sense. It’s sort of like warning people that you belong to some intolerant cult.

Although some people might want to characterize Islam as such, I’m hopeful it’s not.

I am put off by the gender-demeaning symbolism of Burkinis and Burkas. The integrity and self-respect of both genders are challenged by this symbolism. Just think about it. In the case of the full Burka, we often find a well-dressed and otherwise attractive woman covered in what can only be described as a black cloth bag, to hide the “shameful” body she was born with. What are we ashamed of here?

Long before the advent of Islam, different human races had striven towards an optimal balance in body covering, balancing protection (from weather and sexual aggression) with display (of one’s unique identity and attractiveness). As dress sense evolved, common patterns emerged: one’s vulnerable places were often tastefully covered, whilst the rest of the body (the head, arms, hair, midriff and feet) was often exposed, and adorned, for dexterity, recognition and beauty. Sure, there were variations in the extent of cover, mainly based on climate: those residing in temperate countries covered more of themselves because it was cold, and those in the tropics covered less because it was warm. There was no concept, for either gender, of hiding one’s entire body as a shameful object. The fur coat of the Eskimo and the Sari of the North Indian are examples of naturally evolved wardrobe.

Furthermore, the majority of societies around the world developed systems of ethics, and rules of law, that strictly forbade the molestation of women by men on sight, for their bodily attractiveness. If we take Western Europe as an example, lawmakers and leaders improved social conditions over centuries to allow attractive, figure-enhancing dress to be worn by women without danger of coming to harm. The incidence of rape or violent sexual harassment attributable to the wearing of so-called “revealing” clothing is statistically insignificant in Western Europe today.

The philosophy of encasing women in order to protect them from the marauding instincts of men sets rather a low standard for men, and for the beautiful affair of human courtship. Since the days of the Enlightenment, Western social norms have neither accepted nor allowed disrespectful sexual submission; instead they expect high standards of restraint when it comes to sexual conduct. Women are not raped because they choose to be sexually attractive; rather, women occasionally get raped because of the psychopathic or violent behavior of errant men. Society has been trained to despise such men, and to protect the freedom of women (and men) to express their sexuality (i.e. their capacity for sexual feelings and their sexual orientation) openly, as a necessary part of friendly, nonviolent courtship.

Western traditions around courtship are fine-grained, such as reading the right body language before venturing into a kiss. Sex and courtship have evolved away from the coarse-grained affair described in the ancient religious texts, where women either covered themselves to look nondescript or got plundered by sex-starved men. Courtship is about mutual attraction, love and consent today. Westerners, or even easterners like myself who happened to grow up in a liberal, evolved society, feel a tad uncomfortable at being implicitly branded as potential molesters of women.

I have an intuition that dressing wisely involves finding some middle ground between nudity and complete encasement in a cloth bag. Do you not feel this instinctively? That we should look nice and confident to others, but at the same time not offend them? That we should change how we dress based on our activities, our desire for comfort, and the weather?

If you do, I urge you to dress not for isolation, but for the occasion. If your society forbids you to do so, fight it nonviolently.


Watching the recent sparring between Dan Dennett and Sam Harris over the nature of “free will”[1][2] – the idea that human beings have conscious volition over their physical actions – has helped me immensely to refine my own opinion[3] about this ancient and fascinating human intuition.

Sam’s view falls squarely in line with what neuroscience tells us. We know today that our subjective thoughts about a given physical action (like “let’s turn on the light”) are preceded by unconscious neural activity that, if detected by the appropriate gadgetry, predicts the decision we “make consciously”. For example, if we were hooked up to the right kind of EEG or fMRI scanner to measure our neuronal activity, we’d first detect the neural processes that would make us “turn on the light switch”, and only thereafter would we have the subjective thought “let’s turn on the light”. Finally, we’d physically turn the light on. Benjamin Libet first demonstrated this rather spooky phenomenon back in the early 1980s, through his famous experiments[4].

It seems that Sam’s argument against free will emerges from this foundational scientific discovery, and is strengthened by his own unique intuition, based on introspection, that we don’t know where our thoughts come from before they actually occur to us[5]. We don’t rationalize why we want to throw the light switch on until the thought comes into our minds. Of course there would be physical causes for our actions, such as the ambient lighting in the room being low. It seems we make contact with the external world unconsciously, a path of action falls into place unconsciously, and thereafter a thought bubbles into consciousness, like “let’s turn on the light”. Similarly, a myriad other thoughts may bubble into consciousness retrospectively, such as “it’s dark, that’s why I put the light on”. We are able to connect the dots – i.e. attach semantics to our actions in our consciousness – but only so far as our sensory inputs and other unconscious cognitive processes allow.

Therefore, a close study of the nature of our subjective thoughts and their relationship to our physical actions seems to nullify the long-held notion that we are in some sense absolutely free to “consciously preside and decide” over a multiplicity of options, when faced with a physical situation. Moreover, it seems that we usually don’t “think first and act later”. Rather, it seems on deeper analysis that we actually act deterministically, and rationalize or attach meaning to our actions later.

Sam has contrasted this important lesson from neuroscience with the widespread advocacy for punishment, and the keenness for judgment, in today’s society. Everyone wants to judge and punish others, because they falsely believe that wrongdoers “consciously decide” their actions and are therefore accountable for them in some absolute sense. In contrast, the concept of accountability, it seems, has no scientific grounding in an ontology of determinism, however complex and convoluted the deterministic processes that generate a given wayward action may be. We are never ultimately accountable for our actions in some puritanical sense, where, if time were reversed magically, we’d have been able to “not switch the light on” (or, more appropriately, “not plunge the knife in”). Therefore, according to Sam, we are better off sans the concept of punishment, and its allied judicial proceedings that focus on inflicting suffering on the wrongdoer for making his “bad decision”. The practical implication of Sam’s central theme is not new; many countries already practice a restorative system of justice (as opposed to a punitive system, which was instinctively rejected even by ancient sages like Christ) that, in theory at least, aligns well with the absence of free will.

It seems we ought to live in a world where wrongdoers are re-branded as errant human beings: beings who are misguided, poorly trained or otherwise psychologically or physiologically maladapted to harmonious living. It is a well-known hypothesis that some people are by nature unempathetic towards others, due to physical abnormalities in their brains such as deficiencies in their mirror neuron systems[6]. It would seem less useful to understand such persons as “evil”. So we are better off coming to terms with the fact that we live in a society burdened by weirdos, but not by “morally depraved persons” who must be punished or purified of their transgressions.

Therefore, systems of justice must focus on protection, prevention, restoration, and behavioral modification via training, where the latter is possible. The lives of grossly harmful persons may still require termination, but merely as an act of self-defense by the larger society. We don’t require a death sentence, nor do we judge people to be “evil” – instead we either sequester or (if absolutely necessary) kill extremely dangerous people as painlessly as possible, when we have concrete evidence of their impending transgressions.

Dan Dennett brings an entirely different, but equally important, dimension to the debate on free will. In summary, he believes that free will is a useful practical intuition (although perhaps an illusion in a theoretical sense) because it is an effective way of minimizing and marginalizing errant behavior when living in an interconnected society. It is socially advantageous to be “offended” and slap back when someone slaps you, rather than contemplate the inevitability of the first slap and the lack of volition on the part of the slapper. By placing the mantle of accountability on others, and punishing the wrongdoer, society jockeys towards a harmonious balance position, where errant behavior is minimal.

The instinct to assume a capacity for absolute freedom of choice in our neighbours must have evolved for the above reason. To dwell on this point: how does the attribution of free agency to others become ubiquitous in a society?

The sense of pain (or discomfort) is the ultimate learning tool of evolution. Pain is useful because we retain memories of painful experiences. When one has received a nasty slap for a particular action one has taken, the pertinent neural network associates that discomfort with the action. The next time an opportunity presents itself for a similar action, an extra parameter comes into play during the early, unconscious part of the neural processing. The slap is not carried out, because its painful consequence is also fed into the neural network. When the subjective thought surfaces, the meaning of one’s action is expressed: “Let’s not slap this guy; he may slap back”. So in a world where we punish people because they are “accountable” for their actions, we find society conditioned reflexively to expect reprisal, and hence to be more guarded. In the early days of human evolution, this would have amounted to an avoidance of death, and hence genes that predisposed a person towards retributive action, if they indeed exist, would have been selected for.
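The feedback loop described above can be caricatured in a few lines of code. This is a toy sketch under my own invented names (`PainLearner`, `consider`, `experience`) and is in no way a model of real neural tissue; it shows only how a painful consequence, fed back as a negative value, suppresses the same action at the next opportunity.

```python
# Toy negative-reinforcement learner: a painful outcome lowers an
# action's learned value, so the action is suppressed next time.
class PainLearner:
    def __init__(self):
        self.value = {}  # action name -> learned value

    def consider(self, action):
        # Act only if the learned value is non-negative (no bad memories).
        return self.value.get(action, 0.0) >= 0.0

    def experience(self, action, pain):
        # Feed the painful consequence back into the "network".
        self.value[action] = self.value.get(action, 0.0) - pain

agent = PainLearner()
assert agent.consider("slap")       # no memory yet: the slap happens
agent.experience("slap", pain=1.0)  # ...and earns a slap in return
assert not agent.consider("slap")   # next time, the action is withheld
```

The suppression happens before any “subjective thought” is consulted, which is the Dennettian point: a society of such learners converges on guarded behavior whether or not anyone in it is “really” free.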

There seem to be merits to both Sam’s and Dan’s points of view. I personally have developed a wariness towards adopting Dan’s “social” or “3rd person” hypothesis of free will (which I used to empathize with some years ago).

The reason is this. Unlike in our evolutionary past, where person-to-person violence (strike and counterstrike) played an active, mediating role in behavior, we live today in a world where person-to-person violence appears to be on the decline7. Society has been trained to avoid person-to-person violence. Instead, large-scale violence organized via memes or catchy intellectual instruments of a punitive nature seems to be the order of the day. “Assad is an evil dictator” or “the west is greedy for Middle Eastern oil and is destroying the Islamic world trying to grab it, so we must defend ourselves” or “corrupt dictators are running some countries, let’s punish them and do their countries a favor” or “Russia is an evil empire” are the sort of intellectual instruments behind global violence and suffering today. And at some corner of these catchy thought patterns lies the potent core idea of punishing leaders of countries, or even entire nations or communities, for what is perceived (and oversimplified) as their willful wrongdoing. The consequences of these crude attempts at justice often leave the world worse off than before.

It serves as a deterrent for persons who may be contemplating violent crimes

It serves as a “redress” for the victims of a violent crime

In Sri Lanka, Capital Punishment has thankfully been shelved for nearly half a century. Yet we occasionally hear a public outcry for its re-introduction in the aftermath of a horrendous crime. Citizens feel outraged by a particularly vile act, and want “something done about it” to prevent it happening again. Capital Punishment suddenly seems an attractive solution to politicians, who feel answerable to the demands of the general public. Recently, no less a person than the President threw his weight behind Capital Punishment, saying he was working towards its reinstatement in 2016.

It is noteworthy that a mere century and a half ago, judicial experts and the intellectual community at large would have sided with such an intuition in favor of Capital Punishment.

Even that great rationalist luminary of the 19th Century, JS Mill, famously argued in parliament in favor of capital punishment, albeit for the most extreme of cases1: “…when the attendant circumstances suggest no palliation of the guilt, no hope that the culprit may even yet not be unworthy to live among mankind, nothing to make it probable that the crime was an exception to his general character rather than a consequence of it, then I confess it appears to me that to deprive the criminal of the life of which he has proved himself to be unworthy, solemnly to blot him out from the fellowship of mankind and from the catalogue of the living is most appropriate.”

However, times have changed since the days of Mill. Two whole new branches of science have since emerged that may have something definitive to say about the efficacy of Capital Punishment: namely, psychology and sociology.

We know today that motive #2 (Capital Punishment is a deterrent) is empirically false2, 3, 4, 5, 6, and we know that motives #1 (“punishment”) and #3 (“redress”) are mere “window dressing” for a retributive instinct that was perhaps useful in stone-age tribal societies. Contrary to this primitive instinct, many a moral philosopher, both ancient7 and modern8, has rejected capital punishment (and worse still, retribution as a solace for victims) as an uncivilized way of conducting human affairs.

Albert Camus, that outstanding French writer and philosopher, highlighted the concern:

“But what is capital punishment if not the most premeditated of murders, to which no criminal act, no matter how calculated, can be compared? If there were to be a real equivalence, the death penalty would have to be pronounced upon a criminal who had forewarned his victim of the very moment he would put him to a horrible death, and who, from that time on, had kept him confined at his own discretion for a period of months. It is not in private life that one meets such monsters”.

The utopian human being with perfect mannerisms and an unfailing character is an imaginary socio-psychological construct, a conceptual role model for children. The uneasy truth is that human intent is fickle, governed by a nervous system whose structure and function are fraught with aberrations, which cannot be eliminated through nurture alone. A sociopathic personality, for example, could be the direct consequence of a poor endowment of mirror neurons, or other genetic mutations that attenuate empathy from birth9, 10, 11. The details are somewhat complex and beyond the scope of this essay, but those with such subtle birth “defects”, no matter how peaceable their childhood influences may have been, may be saddled with a fundamental inability to empathize with other living creatures. They may not even be able to empathize with themselves, in a self-reflective manner. Not being able to feel for someone (or even for one’s own self) makes it easy for one to cause injury or distress to others.

An Iowa Supreme Court Justice made this observation as far back as 1840:

“Crime indicates a diseased mind in the same manner that sickness and pain do a diseased body. And as in the one case we provide hospitals for the treatment of severe and contagious diseases, so in the other, prisons and asylums should be provided for similar reasons.”

If society ends up killing every such person who yields to his natural instinct (to strike, rape or plunder), rather than finding ways to curb or neutralize their behavior, we then get into a fascinatingly diabolical downward spiral. The more we kill those who lack empathy, in order to better the lives of those who have it, the more we lower the empathy of the empathetic. We know that a taste for judicial killing brutalizes society12, as was the case in Victorian England, where public hanging made life cheap, and people even more violent. We find such a brutal society today in Saudi Arabia, where domestic workers are abused13, and where murder and sex crimes are rampant. The executioner hacks away to no avail.

That is precisely why the more enlightened nations (including Sri Lanka) aspire to practice Restorative/Preventative Justice14.

We should also not make any mistake on the legitimacy of the actual act; Capital Punishment is a premeditated violent crime committed by the state, according to modern jurisprudence. It is not an act of self-defense (as Camus and others have clearly pointed out). Perpetrators are often executed years after their bad acts were committed, by which time their attitudes have changed dramatically for the better. There are ample such cases widely publicized in the media15.

There probably are a dozen other reasons16 for permanently abolishing capital punishment and resorting to a lengthy prison sentence, ranging from the danger of punishing the innocent to the cost of the entire procedure outweighing the cost of a life sentence. To quote Jeffrey A. Fagan, Professor of Law at Columbia Law School:

“As states across the country adopt reforms to reduce the pandemic of errors in capital punishment, we wonder whether such necessary and admirable efforts to avoid error and the horror of the execution of the innocent won’t—after many hundreds of millions of dollars of trying—burden the country with a death penalty that will be ineffective, unreasonably expensive, and politically corrosive to the broader search for justice.”

There is one very special reason why Sri Lanka should think twice about this measure. Sri Lanka is the Asian poster child for a country operating a genuinely restorative system of justice, supposedly drawing inspiration from the compassionate philosophy of Gautama Buddha17. It is a true sign of our sociocultural progress, in comparison with our neighbors. It is disappointing to see our President succumb to the knee-jerk reaction of the mob (or worse, genuinely “believe in” Capital Punishment), rather than stand firm and explain to the people the hard truth that we cannot win the war against crimes of passion and deviance through attrition.

Our President, in his speech, abandoned originality of thinking and hid behind the fact that the USA and China leverage capital punishment. Let us quote that preeminent American moral philosopher Sam Harris on this matter:

“…especially in the United States, is a barbaric system of imprisonment—to say nothing of capital punishment—that should make all citizens ashamed”.

Did our President take the sensible step of consulting a bona fide Sri Lankan criminologist or sociologist on the matter, or at least have his staff perform a literature survey and advise him, prior to making his announcement? Dr. L. B. L. De Alwis, ex-Chief JMO, published an excellent analysis of the Lankan situation18 in The Sri Lanka Journal of Forensic Medicine (December 2011), where he strikes at the core of the problem, along with a superb background analysis. Let us quote:

“In my opinion it is not the Non-implementation of the death penalty that has contributed to the rise of grave crime, especially murder, in Sri Lanka, but the release of murderers, rapists, drug barons, extortionists, highway robbers etc. sentenced to death or to long term rigorous imprisonment by the Judiciary, but later released by the executive in the shortest possible time for petty political advantage”.

Yasantha Kodagoda and other Lankan legal luminaries have held similar views19 over the years.

To conclude, it’s simply awful when a terrible crime happens, like when a child is raped and killed (the crime that fuelled the President’s declaration). Our hearts go out to the victim’s kin.

The state owes four things to society in such cases:

A swift and accurate dispensing of justice, where the perpetrators are correctly identified, tried fairly and sentenced

The next of kin of the victims are provided with appropriate counseling and support, to the utmost possible degree

The lessons learned from the incident (if any) are shared for the broader education of the general public

A firm discouragement of lawlessness and mob-justice, which would interfere with the official criminal investigation

The third point is important, and its value should not be underestimated. Education, awareness and vigilance are the real weapons against such “personal” crimes. Subtle profiling of violent or deviant persons, cautioning parents and children about how to stay safe in ungated, low-income neighborhoods where dangers lurk (as appears to have been the case in the particular crime that fuelled the President’s declaration20), enforcing better policing etc. are all steps to be facilitated by the state.

What the state does not owe society is a reactionary, “quick fix”, which would prejudice or pervert the broader course of justice in our country, and create an unhealthy punitive culture amongst our children. We leave the reader with this quote.

“I have never heard a murderer say they thought about the death penalty as consequence of their actions prior to committing their crimes.”


Thank you for this instructive discussion with Paul Bloom, which I listened to with great interest and found worthy of sharing amongst friends. The thoughts that came up in this conversation, such as the difficulties in having a rational, considered argument in politics, resonated well with the concerns surrounding the ongoing political discourse in Sri Lanka, where I’m domiciled. I feel you are doing a fantastic job in bringing up for debate the ethical underpinnings of happenings in contemporary society. If nothing else, these discussions would teach the world how to grapple rationally with the morality of our age, without relying entirely on prevalent dogma and precedent. You guys simply rock!

I’d like to contribute what I hope is a meaningful rejoinder to the thoughts expressed regarding the killing of Cecil the Lion. It appeared to me that the two of you were generally in agreement that there was a gross overreaction in social media to the killing of this magnificent creature for sport. You were also concerned that this social media buzz spawned physical attacks on the hunter (or at least his lodgings back home in the US), which you saw as immoral.

Whilst I decidedly agree with the latter concern, I equally strongly disagree with the former. There are many reasons why a mass outrage in social media, to the killing of Cecil, was both ethical and timely.

Let us first level the playing field between the human animal and others. Animals have varying degrees of “sentience” (or “consciousness” or “subjective experience”, call it what you like) akin to us humans: a position that many reputed scientists and philosophers of mind seem to be gradually gravitating towards. Even those intellectuals who used to be known for their hardline materialistic views, such as Dan Dennett, have in the recent past professed that consciousness is some sort of epiphenomenon that “comes in the baggage” of complex, evolved creatures that exhibit purposeful behavior [Ref: 1, 2]. According to Dennett, a cat may not reflect much on its experiences, yet it very likely experiences a world moment-to-moment (note the word “a”; it may not be identical to our world, it may be diminutive or different). The more primitive aspects of experience, such as pain, would likely be very similar across different animals. I’m sure Thomas Metzinger, Christof Koch or any other such astute thinker who has reflected enough to comprehend the relationship between evolution, behavior and consciousness would agree: animals are (varyingly) sentient [Ref: 3, 4].

This brings us to the first point in my argument. Leaving everything else aside, it was unethical to kill Cecil because he was a conscious creature who experienced a world, its pains and pleasures, like we do. One could argue that since Cecil’s nervous system and behavior were less sophisticated than ours, it was somewhat less unethical to kill Cecil than it would have been to kill Paul, for example. 🙂 It would perhaps have been a less painful experience than the one that would have taken place had Paul been the victim instead. Granted. But it would still be a bad business, ethically.

Let us come to the second point. It is one thing to kill for sustenance, and another to kill for sport. Killing for sport not only ignores the suffering of the animal; it also glorifies the act as a pleasurable activity for humans. The joy of sport seems to me to be an acquired taste or thrill that one enjoys after conquering the initial empathetic revulsion produced by the mirror neurons in our nervous system. Sport also brings about the latent pleasure of basking in the glory of one’s success, as the idol of like-minded hunters.

Moreover, killing for sport is symbolic of man having the moral right to dominate other species on this planet, and destroy them not just for food, but also for the sheer joy of the experience. After all, who cares about “dumb animals”? This is a notion prevalent in Judeo-Christian cultures. To us “cultural” Buddhists, however, it seems natural that we have no more moral right to be thrilled in killing an unwilling lion, than to be thrilled in having sex with an unwilling woman.

This brings us to my third point. As you yourself seem to suggest, there is a difference between a natural revulsion to killing, like the revulsion to an unpleasant odor, and a considered moral objection to causing pain or ending the lives of other conscious creatures, however “less sophisticated” their inner experiences may be. I agree that we are fighting against a strong, inherited taste for consuming meat, and that some of us conscientious omnivores would readily do the “more ethical thing”: engage in the consumption of seemingly less-sentient creatures like shellfish, and stop the consumption of beef or pork until such time as we could culture the muscle tissue of these tasty animals. Or at the very least, consume the meat of these large, sentient creatures sparingly. So if I’ve understood you right, we both agree that we should care about the fate of animals with complex nervous systems, and avoid eating them as much as our cravings will allow.

Let us now come to the heart of the matter. There are only somewhere between 16,000 and 47,000 African Lions at present [Ref: 5], in an entire continent of 30 million square kilometers. To the best of our understanding, this is a pitifully low number, and in spite of the various professed “advantages” in allowing hunting for sport, such as local culling or as an enticement for tourism, I believe that the evidence points towards hunting/poaching as a significant contributory factor in the decline of this species towards extinction. Controlled hunting of a few lions may not be the main reason (some folks might argue that it is), but loss of habitat to humans most certainly is. The question is, in this context of an impending extinction due to human callousness (or carelessness), what moral right do we humans have to take pleasure in wiping out this magnificent species from our planet? We have none, and this is my fourth point.

Of course, the infamous dentist (and his fellow hunters) could claim ignorance of these facts. Here is where social media comes in. The Dentist, or indeed any other (more civilized) human being, will have no trouble accepting that the murder of humans for sport is taboo. There is no need for social media to go buzzing when one Zimbabwean human murders another with a bow and arrow. Why? Because almost all of civilization would immediately agree it’s a moral travesty, and (presumably) the local officials would duly restrain the perpetrator until he is rehabilitated (or restrained for life, if he proves to be incorrigible). If the culprit were an American, there would not be much fuss in extradition for judgment, as long as there was agreement that the terms of trial would be fair and civilized (e.g. no capital punishment, torture etc.). Or at the very least, there would be an equivalent trial conducted in the United States on behalf of the country where the offense was committed.

It is the audacity and misplaced ethics of the “sportsman” hunter that must be challenged. Remember, this is not some starving tribesman who shot a lion with his bow and arrow for meat. Neither is this a remote villager fending for his livestock and shooting a marauding lion. This is a Dentist operating from a plush office in America, eagerly paying money to illegally commit a moral travesty. If not for the chatter on social media, this would be just one more such travesty gone unnoticed, and conceptually unchallenged. Our friend would be warming his bottom (or pulling out teeth) back in his clinic, gazing admiringly at his brand-new trophy.

The fuss on social media would at least make American hunters think twice about rushing to Africa to kill lions and destroy that continent’s biodiversity. It is an effective and practical deterrent, and more importantly, the best possible “raising of consciousness” (as Dawkins might put it) that happened in recent times with respect to the conservation of wildlife, and the banning of hunting for sport.

I stand by the concern that the Dentist should not be physically harmed. But I would be appalled if A) he were not appropriately reprimanded (I don’t like to use the word “punished”, as I strongly believe there is no place for punitive justice today), and B) the pleasure-loving hunters of the world were not apprised of the ethical concerns stemming from their violent hobby. It would be quite appropriate for our dear Dr. to have to look abashed and say, “Yes, it’s me, the naughty chap who shot Cecil. I’m sorry.” to everyone who recognizes him, for years to come. It’s a small price for him to pay, in return for a better world that has a place for other rare, sentient and beautiful species in it.

PS: In Sri Lanka, our biggest wild cat, Panthera pardus kotiya (“Kotiya” or Sri Lankan Leopard), used to roam in the hundreds of thousands throughout the island back in the days of the colonial British. They were hunted and poached towards extinction, first by the colonials and then by the locals, until we now have around 25 individuals in Wilpattu National Park and perhaps 250 at Yala. Arguments in favor of hunting were heard over the years, long after hunting was strictly banned in the 1970s. There is almost no hunting now, but there are almost no leopards either.



The dominance of memes and hearsay over facts and evidence in contemporary Sri Lankan journalism

The Editorial Column in last Sunday’s Observer (presumably written by Lakshman Gunasekara, who is listed online as its Editor1) is a piece that professes to analyze our ambient sociopolitical situation and serve as a guide to the voter2 in the upcoming General Election.

The Editor begins with this salvo. “On January 8, 2015, Sri Lanka also became known for near-heroic social and political struggles that toppled a bumbling dictator-plunderer who allowed sorcerers and astrologers to define his, and by default, his country’s destiny.”

He goes on to say, “After the narrow shave for the UPFA in retaining the Uva administration, analysts were already doing their psephological analysis and coming to the conclusion that with the disillusionment of the Sinhala rural poor demonstrated in the Uva, any political force that relied solely on that vote bank now ran the risk of losing out. Notwithstanding that political reality, the Rajapaksa regime thought fit – on the advice of their astrologers, not psephologists – to hold presidential polls two years in advance and lost it and, convincingly, at that.”

Let us dwell on the essence of these two statements.

MR is a dictator who plundered our nation and,

MR held the presidential election early because his astrologer said so.

Can this Editor back these two claims with some evidence?

After all, these are profound statements to be made by the Editor of one of the nation’s widest selling Sunday newspapers3, which has a powerful influence over the general public.

If these two statements are true, then our Penal Code compels us to act upon them. If they are false, then I’m sure the Elections Commissioner would be compelled to act upon them, in order to uphold the law governing free and fair elections. Remember, election candidates have rights too; they cannot be subjected to irresponsible, malicious defamation by news media. There is no attempt by this Editor to frame these aspersions into proper perspective, with, for example, “it was rumored in some quarters that MR held the election because his astrologer said so”. And there is certainly no evidence put forward by him to support these two statements in his piece.

Perhaps we ought to take example from his singular style of critical thinking, which throws empirical evidence, fair-mindedness and intellectual honesty out the window, and begin paying homage to gossip, rumor and a free casting of aspersions. Perhaps we should already “know” that this Editor is a mere hack of a political party opposing MR’s candidature, and that he is beating the drum of corruption and despotism to please his unscrupulous paymaster.

However, let us not stoop to his lowly depths, and simply call a spade a spade. This Editor has no qualms in deviating from the best practices in intellectual judgment and honesty dictated by civilization since the days of the Renaissance, such as citing evidence and dealing with factual statements. It appears that when evidence (such as that accepted in a court of law) is lacking to buttress the propaganda that this Editor’s political allegiances demand, he seems content to fish out whatever political gossip has been digested by his system. In doing so, he makes a feeble attempt at defending himself by hiding behind phrases such as “Sri Lanka became known as…”

If one were to use internet-age slang as a reaction, one would say WTF??? Doesn’t this person want to talk about facts that can be useful to the voter, or at least share a constructive bit of advice to the political parties involved in the election, in the hope of better shaping our nation’s future? Is his brain so devoid of ideas or facts that the only weapon he has to support his political view is slander?

Furthermore, does he really think that the wise thing to do at this point in time is to remind the voter that some wisecracker in 1983 called Sri Lanka a “tear drop”? What is the intellectual underpinning behind this type of argument? We cannot find a single nation in the world that hasn’t had its historical share of grief, and all nations have had a “bad name”. Perhaps at the next US presidential election we should remind the American voters of Mark Twain’s quip, “God created war so that Americans would learn geography.” Perhaps true in some humorous way, but what purpose does it serve?

We expect more from the Editor of Sri Lanka’s preeminent newspaper. We’d like him to pay some attention to detail, discuss facts and form opinions based on evidence. We’d like him to refrain from the sort of idle talk that one would imagine being indulged in by an uneducated, drunken village thug (“Mahatheyo mung horu okkoma” | “Sir, these chaps are all crooks”). Why would we need erudite newspaper editors if that were the standard of their political analysis? One could go to the local pub, rant about how all journalists are paid hacks of politicians, and save LKR 50/-.

Personally, I make no bones about the fact that I was a supporter of MR’s broader national policies. I deeply value the contribution made by MR towards Sri Lanka’s socioeconomic development, and his decisive eradication of terrorism. Sure, there were rumors of corruption during his tenure, which remain completely unsubstantiated to date. There were nepotistic tendencies displayed, which were self-evident4. However, as voters in the upcoming election, what we ought to care about is:

Who is the best person to elect locally, out of the given lot? This is a general election; we cannot vote based on the party leader’s record alone. Besides, we granted more autonomy and administrative capacities to parliament, precisely because we wanted to elect better persons locally.

Which party has the better national policy?

Which party has the stronger leadership team, with a better track record of success in difficult times?

It’s fascinating that this Editor takes offense that MR is leading a general election campaign after one single election defeat. One would have thought that RW, with a handsome record of 29 election defeats5 under his belt, would induce more squeamishness. The fact is though, we should not care either way; what matters is who can do a better job right now.

The MR administration defeated terrorism, built roads and highways, uplifted the image, cleanliness and public works in Colombo, facilitated commercial enterprise directly (through building renovation, telecommunication infrastructure, lower taxation for small businesses), supported farmers through subsidies and increased our agricultural production manifold, built ports and airports (yes, for those who think that the Hambantota harbor is idle, take a look at the cover photo of vehicles on the dock for transshipment), built public works, schools and hospitals in the war affected North and East, and boosted Sri Lanka’s global position to one of the fastest growing economies in the world6. Their achievements are well-documented7. Sure they did some quirky or even disturbing things, like muscling out the Chief Justice (with the backing of the parliamentary majority, note) because her policies were at odds with theirs. Most notably, we hardly ever heard them speak of the previous government during their tenure.

In comparison, the MS administration came in and changed the constitution, reducing the powers of the Executive President, as a “solution” to excesses of government. They cut prices of fuel and some essential commodities. They passed a “freedom of information” bill. They gained bipartisan support for these constitutional amendments. On the flipside, there was a parliamentary COPE committee formally investigating an unprecedented fraud in the Central Bank, allegedly orchestrated by its newly appointed Governor!8, 9 Parliament was dissolved just prior to the release of that important report. They arrested members of opposing political parties for alleged fraudulent practices, but we have no report as yet of their culpability before the eyes of the law. Oh yes, and they spend 100% of their airtime blaming the previous government.

In answer to a question posed on the TV program “Wada Pitiya”10, Eran Wickramaratne, a senior UNP MP and economist, acknowledged that there was not one single development project or investment initiated by the UNP during their recent 7-month coalition government.

We understand that people vote at elections based on widely varying personal experiences and preferences. However, we urge everyone to be cautious of media skullduggery like that promulgated by the Editor of the Observer.

Nepotism in the former UPFA government. The MR Administration had two of MR’s brothers as cabinet ministers, and a third as his secretary of defense (a non-ministerial public office with extensive power over the defense establishment of the island). This third brother was also put in charge of urban development8. MR openly backed his son’s nomination on the UPFA ticket as a member of parliament. This son, once an MP, was also given extensive powers to campaign on behalf of the government. 70% of the national 2014 budget was allocated to portfolios or departments managed by the Rajapaksa brothers, in a cabinet of over 100 ministers. MR was instrumental in bringing forward legislation to change the constitution, allowing for his nomination as a presidential candidate for an unprecedented third term. See: https://en.wikipedia.org/wiki/Nepotism