The Strategic Issues & International Relations Forum is a venue to discuss issues pertaining to India's security environment, her strategic outlook on global affairs, and the effect of international relations on the Indian Subcontinent. We request members to kindly stay within the mandate of this forum and keep their exchanges of views on a civilised level, however vehemently any disagreement may be felt. All feedback regarding forum usage may be sent to the moderators using the Feedback Form or by clicking the Report Post icon on any objectionable post for proper action. Please note that the views expressed by the Members and Moderators on these discussion boards are those of the individuals only and do not reflect the official policy or view of the Bharat-Rakshak.com Website. Copyright violation is strictly prohibited and may result in revocation of your posting rights - please read the FAQ for full details. Users must also abide by the Forum Guidelines at all times.

The long read

There is no such thing as western civilisation

The values of liberty, tolerance and rational inquiry are not the birthright of a single culture. In fact, the very notion of something called ‘western culture’ is a modern invention.

By Kwame Anthony Appiah, Wednesday 9 November 2016

Like many Englishmen who suffered from tuberculosis in the 19th century, Sir Edward Burnett Tylor went abroad on medical advice, seeking the drier air of warmer regions. Tylor came from a prosperous Quaker business family, so he had the resources for a long trip. In 1855, in his early 20s, he left for the New World, and, after befriending a Quaker archaeologist he met on his travels, he ended up riding on horseback through the Mexican countryside, visiting Aztec ruins and dusty pueblos. Tylor was impressed by what he called “the evidence of an immense ancient population”. And his Mexican sojourn fired in him an enthusiasm for the study of faraway societies, ancient and modern, that lasted for the rest of his life. In 1871, he published his masterwork, Primitive Culture, which can lay claim to being the first work of modern anthropology.

Primitive Culture was, in some respects, a quarrel with another book that had “culture” in the title: Matthew Arnold’s Culture and Anarchy, a collection that had appeared just two years earlier. For Arnold, culture was the “pursuit of our total perfection by means of getting to know, on all the matters which most concern us, the best which has been thought and said in the world”. Arnold wasn’t interested in anything as narrow as class-bound connoisseurship: he had in mind a moral and aesthetic ideal, which found expression in art and literature and music and philosophy.

But Tylor thought that the word could mean something quite different, and in part for institutional reasons, he was able to see that it did. For Tylor was eventually appointed to direct the University Museum at Oxford, and then, in 1896, he was appointed to the first chair of anthropology there. It is to Tylor more than anyone else that we owe the idea that anthropology is the study of something called “culture”, which he defined as “that complex whole which includes knowledge, belief, arts, morals, law, customs, and any other capabilities and habits acquired by man as a member of society”. Civilisation, as Arnold understood it, was merely one of culture’s many modes.

Nowadays, when people speak about culture, it is usually either Tylor’s or Arnold’s notion that they have in mind. The two concepts of culture are, in some respects, antagonistic. Arnold’s ideal was “the man of culture” and he would have considered “primitive culture” an oxymoron. Tylor thought it absurd to propose that a person could lack culture. Yet these contrasting notions of culture are locked together in our concept of western culture, which many people think defines the identity of modern western people. So let me try to untangle some of our confusions about the culture, both Tylorian and Arnoldian, of what we have come to call the west.

Someone asked Mahatma Gandhi what he thought of western civilisation, and he replied: “I think it would be a very good idea.” Like many of the best stories, alas, this one is probably apocryphal; but also like many of the best stories, it has survived because it has the flavour of truth. But my own response would have been very different: I think you should give up the very idea of western civilisation. It is at best the source of a great deal of confusion, at worst an obstacle to facing some of the great political challenges of our time. I hesitate to disagree with even the Gandhi of legend, but I believe western civilisation is not at all a good idea, and western culture is no improvement.

One reason for the confusions “western culture” spawns comes from confusions about the west. We have used the expression “the west” to do very different jobs. Rudyard Kipling, England’s poet of empire, wrote, “Oh, east is east and west is west, and never the twain shall meet”, contrasting Europe and Asia, but ignoring everywhere else. During the cold war, “the west” was one side of the iron curtain; “the east” its opposite and enemy. This usage, too, effectively disregarded most of the world. Often, in recent years, “the west” means the north Atlantic: Europe and her former colonies in North America. The opposite here is a non-western world in Africa, Asia and Latin America – now dubbed “the global south” – though many people in Latin America will claim a western inheritance, too. This way of talking notices the whole world, but lumps a whole lot of extremely different societies together, while delicately carving around Australians and New Zealanders and white South Africans, so that “western” here can look simply like a euphemism for white.

Of course, we often also talk today of the western world to contrast it not with the south but with the Muslim world. And Muslim thinkers sometimes speak in a parallel way, distinguishing between Dar al-Islam, the home of Islam, and Dar al-Kufr, the home of unbelief. I would like to explore this opposition further. Because European and American debates today about whether western culture is fundamentally Christian inherit a genealogy in which Christendom is replaced by Europe and then by the idea of the west.

This civilisational identity has roots going back nearly 1,300 years, then. But to tell the full story, we need to begin even earlier.

For the Greek historian Herodotus, writing in the fifth century BC, the world was divided into three parts. To the east was Asia, to the south was a continent he called Libya, and the rest was Europe. He knew that people and goods and ideas could travel easily between the continents: he himself travelled up the Nile as far as Aswan, and on both sides of the Hellespont, the traditional boundary between Europe and Asia. Herodotus admitted to being puzzled, in fact, as to “why the earth, which is one, has three names, all women’s”. Still, despite his puzzlement, these continents were for the Greeks and their Roman heirs the largest significant geographical divisions of the world.

But here’s the important point: it would not have occurred to Herodotus to think that these three names corresponded to three kinds of people: Europeans, Asians, and Africans. He was born at Halicarnassus – Bodrum in modern Turkey. Yet being born in Asia Minor didn’t make him an Asian; it left him a Greek. And the Celts, in the far west of Europe, were much stranger to him than the Persians or the Egyptians, about whom he knew rather a lot. Herodotus only uses the word “European” as an adjective, never as a noun. For a millennium after his day, no one else spoke of Europeans as a people, either.

Then the geography Herodotus knew was radically reshaped by the rise of Islam, which burst out of Arabia in the seventh century, spreading with astonishing rapidity north and east and west. After the prophet’s death in 632, the Arabs managed in a mere 30 years to defeat the Persian empire that reached through central Asia as far as India, and to wrest provinces from Rome’s residue in Byzantium.

The Umayyad dynasty, which began in 661, pushed on west into north Africa and east into central Asia. In early 711, it sent an army across the straits of Gibraltar into Spain, which the Arabs called al-Andalus, where it attacked the Visigoths who had ruled much of the Roman province of Hispania for two centuries. Within seven years, most of the Iberian Peninsula was under Muslim rule; not until 1492, nearly 800 years later, was the whole peninsula under Christian sovereignty again.

The Muslim conquerors of Spain had not planned to stop at the Pyrenees, and they made regular attempts in the early years to move further north. But near Tours, in 732 CE, Charles Martel, Charlemagne’s grandfather, defeated the forces of al-Andalus, and this decisive battle effectively ended the Arab attempts at the conquest of Frankish Europe. The 18th-century historian Edward Gibbon, overstating somewhat, observed that if the Arabs had won at Tours, they could have sailed up the Thames. “Perhaps,” he added, “the interpretation of the Koran would now be taught in the schools of Oxford, and her pulpits might demonstrate to a circumcised people the sanctity and truth of the revelation of Mahomet.”

The world according to Herodotus.

What matters for our purposes is that the first recorded use of a word for Europeans as a kind of person, so far as I know, comes out of this history of conflict. In a Latin chronicle, written in 754 in Spain, the author refers to the victors of the Battle of Tours as “Europenses”, Europeans. So, simply put, the very idea of a “European” was first used to contrast Christians and Muslims. (Even this, however, is a bit of a simplification. In the middle of the eighth century much of Europe was not yet Christian.)

Now, nobody in medieval Europe would have used the word “western” for that job. For one thing, the coast of Morocco, home of the Moors, stretches west of Ireland. For another, there were Muslim rulers in the Iberian Peninsula – part of the continent that Herodotus called Europe – until nearly the 16th century. The natural contrast was not between Islam and the west, but between Christendom and Dar al Islam, each of which regarded the other as infidels, defined by their unbelief.

Starting in the late 14th century, the Turks who created the Ottoman empire gradually extended their rule into parts of Europe: Bulgaria, Greece, the Balkans, and Hungary. Only in 1529, with the defeat of Suleiman the Magnificent’s army at Vienna, did the reconquest of eastern Europe begin. It was a slow process. It wasn’t until 1699 that the Ottomans finally lost their Hungarian possessions; Greece became independent only in the early 19th century, Bulgaria even later.

Yet Christendom was never a sealed or purely Christian inheritance. For one thing, the educated classes of Christian Europe took many of their ideas from the pagan societies that preceded them. At the end of the 12th century, Chrétien de Troyes, born a couple of hundred kilometres south-west of Paris, celebrated these earlier roots: “Greece once had the greatest reputation for chivalry and learning,” he wrote. “Then chivalry went to Rome, and so did all of learning, which now has come to France.”

The idea that the best of the culture of Greece was passed by way of Rome into western Europe gradually became, in the middle ages, a commonplace. In fact this process had a name. It was called the “translatio studii”: the transfer of learning. And it was an astonishingly persistent idea. More than six centuries later, Georg Wilhelm Friedrich Hegel, the great German philosopher, told the students of the high school he ran in Nuremberg: “The foundation of higher study must be and remain Greek literature in the first place, Roman in the second.”

So from the late middle ages until now, people have thought of the best in the culture of Greece and Rome as a civilisational inheritance, passed on like a precious golden nugget, dug out of the earth by the Greeks, transferred, when the Roman empire conquered them, to Rome. Partitioned between the Flemish and Florentine courts and the Venetian Republic in the Renaissance, its fragments passed through cities such as Avignon, Paris, Amsterdam, Weimar, Edinburgh and London, and were finally reunited – pieced together like the broken shards of a Grecian urn – in the academies of Europe and the United States.

There are many ways of embellishing the story of the golden nugget. But they all face a historical difficulty; if, that is, you want to make the golden nugget the core of a civilisation opposed to Islam. Because the classical inheritance it identifies was shared with Muslim learning. In the Baghdad of the ninth-century Abbasid caliphate, the palace library featured the works of Plato and Aristotle, Pythagoras and Euclid, translated into Arabic. In the centuries that Petrarch called the Dark Ages, when Christian Europe made little contribution to the study of Greek classical philosophy, and many of the texts were lost, these works were preserved by Muslim scholars. Much of our modern understanding of classical philosophy among the ancient Greeks we have only because those texts were recovered by European scholars in the Renaissance from the Arabs.

In the mind of its Christian chronicler, as we saw, the battle of Tours pitted Europeans against Islam; but the Muslims of al-Andalus, bellicose as they were, did not think that fighting for territory meant that you could not share ideas. By the end of the first millennium, the cities of the Caliphate of Cordoba were marked by the cohabitation of Jews, Christians, and Muslims, of Berbers, Visigoths, Slavs and countless others.

There were no recognised rabbis or Muslim scholars at the court of Charlemagne; in the cities of al-Andalus there were bishops and synagogues. Racemondo, Catholic bishop of Elvira, was Cordoba’s ambassador to the courts of the Byzantine and the Holy Roman empires. Hasdai ibn Shaprut, leader of Cordoba’s Jewish community in the middle of the 10th century, was not only a great medical scholar, he was the chairman of the Caliph’s medical council; and when the Emperor Constantine in Byzantium sent the Caliph a copy of Dioscorides’s De Materia Medica, he took up Ibn Shaprut’s suggestion to have it translated into Arabic, and Cordoba became one of the great centres of medical knowledge in Europe. The translation into Latin of the works of Ibn Rushd, born in Cordoba in the 12th century, began the European rediscovery of Aristotle. He was known in Latin as Averroes, or more commonly just as “The Commentator”, because of his commentaries on Aristotle. So the classical traditions that are meant to distinguish western civilisation from the inheritors of the caliphates are actually a point of kinship with them.

But the golden-nugget story was bound to be beset by difficulties. It imagines western culture as the expression of an essence – a something – which has been passed from hand to hand on its historic journey. The pitfalls of this sort of essentialism are evident in a wide range of cases. Whether you are discussing religion, nationality, race or culture, people have supposed that an identity that survives through time and space must be propelled by some potent common essence. But that is simply a mistake. What was England like in the days of Chaucer, father of English literature, who died more than 600 years ago? Take whatever you think was distinctive of it, whatever combination of customs, ideas, and material things that made England characteristically English then. Whatever you choose to distinguish Englishness now, it isn’t going to be that. Rather, as time rolls on, each generation inherits the label from an earlier one; and, in each generation, the label comes with a legacy. But as the legacies are lost or exchanged for other treasures, the label keeps moving on. And so, when some of those in one generation move from the territory to which English identity was once tied – move, for example, to a New England – the label can even travel beyond the territory. Identities can be held together by narratives, in short, without essences. You don’t get to be called “English” because there’s an essence that this label follows; you’re English because our rules determine that you are entitled to the label by being somehow connected with a place called England.

So how did the people of the north Atlantic, and some of their kin around the world, get connected to a realm we call the west, and gain an identity as participants in something called western culture?

James Gillray’s 1805 cartoon, The Plumb Pudding in Danger, depicts prime minister William Pitt and Napoleon Bonaparte carving up the world

It will help to recognise that the term “western culture” is surprisingly modern – more recent certainly than the phonograph. Tylor never spoke of it. And indeed he had no reason to, since he was profoundly aware of the internal cultural diversity even of his own country. In 1871 he reported evidence of witchcraft in rural Somerset. A blast of wind in a pub had blown some roasted onions stabbed with pins out of the chimney. “One,” Tylor wrote, “had on it the name of a brother magistrate of mine, whom the wizard, who was the alehouse-keeper, held in particular hatred ... and whom apparently he designed to get rid of by stabbing and roasting an onion representing him.” Primitive culture, indeed.

So the very idea of the “west,” to name a heritage and object of study, doesn’t really emerge until the 1890s, during a heated era of imperialism, and gains broader currency only in the 20th century. When, around the time of the first world war, Oswald Spengler wrote the influential book translated as The Decline of the West – a book that introduced many readers to the concept – he scoffed at the notion that there were continuities between western culture and the classical world. During a visit to the Balkans in the late 1930s, the writer and journalist Rebecca West recounted a visitor’s sense that “it’s uncomfortably recent, the blow that would have smashed the whole of our western culture”. The “recent blow” in question was the Turkish siege of Vienna in 1683.

If the notion of Christendom was an artefact of a prolonged military struggle against Muslim forces, our modern concept of western culture largely took its present shape during the cold war. In the chill of battle, we forged a grand narrative about Athenian democracy, the Magna Carta, Copernican revolution, and so on. Plato to Nato. Western culture was, at its core, individualistic and democratic and liberty-minded and tolerant and progressive and rational and scientific. Never mind that pre-modern Europe was none of these things, and that until the past century democracy was the exception in Europe – something that few stalwarts of western thought had anything good to say about. The idea that tolerance was constitutive of something called western culture would have surprised Edward Burnett Tylor, who, as a Quaker, had been barred from attending England’s great universities. To be blunt: if western culture were real, we wouldn’t spend so much time talking it up.

Of course, once western culture could be a term of praise, it was bound to become a term of dispraise, too. Critics of western culture, producing a photonegative emphasising slavery, subjugation, racism, militarism, and genocide, were committed to the very same essentialism, even if they saw a nugget not of gold but of arsenic.

Talk of “western culture” has had a larger implausibility to overcome. It places, at the heart of identity, all manner of exalted intellectual and artistic achievements – philosophy, literature, art, music; the things Arnold prized and humanists study. But if western culture was there in Troyes in the late 12th century when Chrétien was alive, it had little to do with the lives of most of his fellow citizens, who did not know Latin or Greek, and had never heard of Plato. Today the classical heritage plays no greater role in the everyday lives of most Americans or Britons. Are these Arnoldian achievements that hold us together? Of course not. What holds us together, surely, is Tylor’s broad sense of culture: our customs of dress and greeting, the habits of behaviour that shape relations between men and women, parents and children, cops and civilians, shop assistants and consumers. Intellectuals like me have a tendency to suppose that the things we care about are the most important things. I don’t say they don’t matter. But they matter less than the story of the golden nugget suggests.

So how have we bridged the chasm here? How have we managed to tell ourselves that we are rightful inheritors of Plato, Aquinas, and Kant, when the stuff of our existence is more Beyoncé and Burger King? Well, by fusing the Tylorian picture and the Arnoldian one, the realm of the everyday and the realm of the ideal. And the key to this was something that was already present in Tylor’s work. Remember his famous definition: it began with culture as “that complex whole”. What you’re hearing is something we can call organicism. A vision of culture not as a loose assemblage of disparate fragments but as an organic unity, each component, like the organs in a body, carefully adapted to occupy a particular place, each part essential to the functioning of the whole. The Eurovision song contest, the cutouts of Matisse, the dialogues of Plato are all parts of a larger whole. As such, each is a holding in your cultural library, so to speak, even if you have never personally checked it out. Even if it isn’t your jam, it is still your heritage and possession. Organicism explained how our everyday selves could be dusted with gold.

Now, there are organic wholes in our cultural life: the music, the words, the set-design, the dance of an opera fit and are meant to fit together. It is, in the word Wagner invented, a Gesamtkunstwerk, a total work of art. But there isn’t one great big whole called culture that organically unites all these parts. Spain, in the heart of “the west,” resisted liberal democracy for two generations after it took off in India and Japan in “the east,” the home of Oriental despotism. Jefferson’s cultural inheritance – Athenian liberty, Anglo-Saxon freedom – did not preserve the United States from creating a slave republic. At the same time, Franz Kafka and Miles Davis can live together as easily – perhaps even more easily – than Kafka and his fellow Austro-Hungarian Johann Strauss. You will find hip-hop in the streets of Tokyo. The same is true in cuisine: Britons once swapped their fish and chips for chicken tikka masala, now, I gather, they’re all having a cheeky Nando’s.

Once we abandon organicism, we can take up the more cosmopolitan picture in which every element of culture, from philosophy or cuisine to the style of bodily movement, is separable in principle from all the others – you really can walk and talk like an African-American and think with Matthew Arnold and Immanuel Kant, as well as with Martin Luther King and Miles Davis. No Muslim essence stops the inhabitants of Dar al-Islam from taking up anything from western civilisation, including Christianity or democracy. No western essence is there to stop a New Yorker of any ancestry taking up Islam.

The stories we tell that connect Plato or Aristotle or Cicero or Saint Augustine to contemporary culture in the north Atlantic world have some truth in them, of course. We have self-conscious traditions of scholarship and argumentation. The delusion is to think that it suffices that we have access to these values, as if they are tracks on a Spotify playlist we have never quite listened to. If these thinkers are part of our Arnoldian culture, there is no guarantee that what is best in them will continue to mean something to the children of those who now look back to them, any more than the centrality of Aristotle to Muslim thought for hundreds of years guarantees him an important place in modern Muslim cultures.

Values aren’t a birthright: you need to keep caring about them. Living in the west, however you define it, being western, provides no guarantee that you will care about western civilisation. The values European humanists like to espouse belong just as easily to an African or an Asian who takes them up with enthusiasm as to a European. By that very logic, of course, they do not belong to a European who has not taken the trouble to understand and absorb them. The same, of course, is true in the other direction. The story of the golden nugget suggests that we cannot help caring about the traditions of “the west” because they are ours: in fact, the opposite is true. They are only ours if we care about them. A culture of liberty, tolerance, and rational inquiry: that would be a good idea. But these values represent choices to make, not tracks laid down by a western destiny.

In the year of Edward Burnett Tylor’s death, what we have been taught to call western civilisation stumbled into a death match with itself: the Allies and the Central Powers hurled bodies at each other, marching young men to their deaths in order to “defend civilisation”. The blood-soaked fields and gas-poisoned trenches would have shocked Tylor’s evolutionist, progressivist hopes, and confirmed Arnold’s worst fears about what civilisation really meant. Arnold and Tylor would have agreed, at least, on this: culture isn’t a box to check on the questionnaire of humanity; it is a process you join, a life lived with others.

Culture – like religion and nation and race – provides a source of identity for contemporary human beings. And, like all three, it can become a form of confinement, conceptual mistakes underwriting moral ones. Yet all of them can also give contours to our freedom. Social identities connect the small scale where we live our lives alongside our kith and kin with larger movements, causes, and concerns. They can make a wider world intelligible, alive, and urgent. They can expand our horizons to communities larger than the ones we personally inhabit. But our lives must make sense, too, at the largest of all scales. We live in an era in which our actions, in the realm of ideology as in the realm of technology, increasingly have global effects. When it comes to the compass of our concern and compassion, humanity as a whole is not too broad a horizon.

We live with seven billion fellow humans on a small, warming planet. The cosmopolitan impulse that draws on our common humanity is no longer a luxury; it has become a necessity. And in encapsulating that creed I can draw on a frequent presence in courses in western civilisation, because I don’t think I can improve on the formulation of the dramatist Terence: a former slave from Roman Africa, a Latin interpreter of Greek comedies, a writer from classical Europe who called himself Terence the African. He once wrote, “Homo sum, humani nihil a me alienum puto.” “I am human, I think nothing human alien to me.” Now there’s an identity worth holding on to.

• This is an edited version of Kwame Anthony Appiah’s BBC Reith lecture, Culture, the fourth part of the series Mistaken Identities, which is available on the Radio 4 website

I am starting to think there are two forms of white racism: one based on moral superiority and one based on power superiority. Moral-superiority-based white racism is very much interested in finishing off all cultural competition to its dominance. Power-superiority-based racists preserve the cultural identity of their opponents and want to be respected and awed by them, expecting those cultures to follow their natural demise and allow others to dominate eventually.

The former uses Islam as an agent of destruction of other cultures. And not just Islam but many other forms of ideology as well: Communism, missionary Christianity, media propaganda to denigrate culture and leadership, corruption, mass immigration, academic monopolism, etc.

It is a truism that America has become a more diverse country. It is also a beautiful thing to watch. Visitors from other countries, particularly those having trouble incorporating different ethnic groups and faiths, are amazed that we manage to pull it off. Not perfectly, of course, but certainly better than any European or Asian nation today. It’s an extraordinary success story

And then he follows up by stating that it is dying:

One of the many lessons of the recent presidential election campaign and its repugnant outcome is that the age of identity liberalism must be brought to an end.

And then the presstitute goes on to demonstrate how western universalism has coloured his lens.

Recently I performed a little experiment during a sabbatical in France: For a full year I read only European publications, not American ones. My thought was to try seeing the world as European readers did. But it was far more instructive to return home and realize how the lens of identity has transformed American reporting in recent years. How often, for example, the laziest story in American journalism — about the “first X to do Y” — is told and retold. Fascination with the identity drama has even affected foreign reporting, which is in distressingly short supply. However interesting it may be to read, say, about the fate of transgender people in Egypt, it contributes nothing to educating Americans about the powerful political and religious currents that will determine Egypt’s future, and indirectly, our own. No major news outlet in Europe would think of adopting such a focus.

And finally he justifies why western universalism is important.

We began by singing the national anthem, and then sat down to listen to a recording of Roosevelt’s speech. As I looked out into the crowd, and saw the array of different faces, I was struck by how focused they were on what they shared. And listening to Roosevelt’s stirring voice as he invoked the freedom of speech, the freedom of worship, the freedom from want and the freedom from fear — freedoms that Roosevelt demanded for “everyone in the world” — I was reminded of what the real foundations of modern American liberalism are.

TSJones wrote: the problem with the 3 million deported? .....is that a lot of them come right back and live in sanctuary cities.......we have very porous borders.....

"Porous border" is a big misnomer. It used to be porous once upon a time, but that is no longer the case; the Tortilla Curtain is firmly in place now. Walls and border fencing alone cover more than a third of the entire length of the border. They cover all of the populated areas and some unpopulated ones. Then there is the virtual fence of cameras, motion sensors and drones, which again covers a substantial chunk of the border. The rest of the border has natural barriers and is hostile to crossing. An estimated 5,000 people have died in the last 10 years while crossing the border.

Lastly there are 20,000+ border patrol agents manning the border.

This is the real face of western white "universalism", whether under Obama or Trump. And Shri Trump Super Senior migrated from Germany about 100 years back. It's simple: might is right. This is the real universalism of the west. White migration is OK, not mestizo migration. White nations are more equal than others. This caste has the right to dictate terms to others. They will frame the dharma which others have to follow. They will enact, interpret, and enforce it according to their interests.

Some call themselves Liberals, others call themselves Progressives, but almost all seem to be very supportive of certain fascist and regressive medieval ideologies, giving support to them under the tagline "Freedom of Religion".

In the same spirit that Seculars are called out as Sickulars, I would like to rename Progressives as #ProRegressives.

What's the latest on left-lib idiot "philosopher" Martha Nussbaum? Has the cat got her tongue? Why don't I see any denunciation of her homeland / Trump post-election, like the denunciations of Modi / Gujarat we saw as she went about sermonizing in India?

She was quite vocal about re-election of Modi being a 'black mark' on Gujarat years back (when he was still CM) - but when it comes to Trump she comes up with this very 'thoughtful' (!!) piece on why anger and misogyny are built into the 'very structure of human development' and therefore there would not be a point in blaming any particular society: http://www.abc.net.au/religion/articles ... 574917.htm

One has to differentiate between Liberal and the more general "Left" here. There are people to the Left of Nussbaum (in the US political spectrum) who accuse her of being "conservative" because of her moral absolutism (she, like Hillary Clinton and Tim Kaine, considers Western Civilization and post-Enlightenment Western Ethics to be morally superior to any other system in the world).

This is exactly where Nussbaum derives her self-proclaimed authority to attack Hindu political assertiveness as savage, atavistic fascism... she considers herself, as a spokesperson of superior Western civilization, uniquely empowered to stand in judgment over the heathens. Many academics to the "left" of Nussbaum find her arrogance indistinguishable from the American Exceptionalism that characterizes all conservative discourse in the US.

To influence a larger section of academia, Nussbaum pretends to a "Liberal" posture when attacking Hindu civilization (Hindus are victimizing poor minority Muslims, etc. etc.). However, as the shameless apologia in the article linked above amply demonstrates, it is a perfectly empty posture. She is a Western Civilizational Supremacist, that's all.

I've been reading some interesting things on this pernicious topic of "Feminism". Feminism, not as in equality of the sexes... no. It's about radical feminist domination in the west and how it's slowly creeping into, and influencing, India.

Quote

"Feminists aren’t for women’s choice, they are for women doing exactly what they are told to do by feminists. It is a very communist set of ideals, which is why you see a big crossover between feminism and Marxism."

Quote

Some of you may be asking: if feminism is a low percentage of the American population, how can it hurt women?

Well, cancer is also a low percentage of its sufferer’s cells, but it still damages the entire body by spreading to vital organs. A low percentage of Americans may be feminists, but feminism has spread into the most vital sectors of American life: academia, media, and politics.

About 1 year ago, the United Nations suggested we censor the entire Internet to save feminists' feelings.

They released a completely insane report called “Cyber Violence Against Women And Girls: A Global Wake-up Call”.

Now, if I was writing a report on “cyberviolence,” I could do it in 140 characters, but someone infinitely more gifted already has.

The UN report itself contains a number of bizarre attempts to equate critical tweets on the internet with physical violence.

“A cyber-touch is recognised as equally as harmful as a physical touch” says the report. In their press release, UN Women claim that “cyber violence … places a premium on emotional bandwidth.”

Whatever the ****** “emotional bandwidth” is.

The UN isn’t so worried about women being raped and harassed in Europe, or discriminated against in the Middle East. If it were, it wouldn’t have put Saudi Arabia on its human rights committee.

They care that people criticize feminists on the Internet. Think about that for a while.

Beyond the UN, look at Twitter and its Trust and Safety Council, which basically killed Twitter by embracing the nonsense idea of “online harassment.” Naturally, the death became obvious once they banned their most fabulous and entertaining user.

"Feminists on campus will try to fill your head with garbage. One example is their rape-on-campus stats: are they up to "4 out of 5 girls will be raped" yet? They always change the stat! It's always garbage.

Feminists on campus will talk your ear off about the wage gap. It doesn’t matter how many economists show that the wage gap is a result of choices made.

America is the most free society for women on earth. Women are free to study what they want, and they naturally gravitate towards majors that suit them.

Some of these are high paying, like nursing and occupational therapy, both of which are more than 85% female. But most are not. Elementary Education and social work have similar percentages of female students, but much lower pay.

Men on the other hand dominate high paying STEM majors like Electrical Engineering. Women are free to study engineering, and often make fine engineers.

But why on earth would feminism feel the need to shoehorn women into studying a subject they don’t want to? The movement that claimed to be about female empowerment now seems to be about female conformity.

You’re probably quite depressed by now.

Don’t be.

Together, we are winning the war against feminism. We are publicizing their tricks on campus and off, exposing their lies, and proving that women don’t need shrill harpies calling them weak victims of a made up patriarchy.

I greatly admire the young people who have to fight feminism in the trenches of academia, as opposed to from a giant tour bus with their face on the side.

Keep resisting, keep pointing out the facts, and keep laughing. Together, with facts and laughter, we can fight off this disease. Thank you.

OK... this is going to be a three-part series... I found the mother of all Feminazi series... and I am so ashamed, the bitch in this series happens to be an Indian woman. She has a Brit accent but happens to be a totally psycho bitch... anyway... enjoy!

I heard that admissions to the humanities have dropped in the US, as parents are seeing a drop in prospective employment opportunities. Why would any parent willingly fund $20,000 per annum in tuition, only for the child to sit on their bottom for a long time afterwards? Perhaps the foundation of western universalism will take more of these hammer blows in the time to come.

There is the visible government situated around the Mall in Washington, and then there is another, more shadowy, more indefinable government that is not explained in Civics 101 or observable to tourists at the White House or the Capitol. The former is traditional Washington partisan politics: the tip of the iceberg that a public watching C-SPAN sees daily and which is theoretically controllable via elections. The subsurface part of the iceberg I shall call the Deep State, which operates according to its own compass heading regardless of who is formally in power.

During the last five years, the news media have been flooded with pundits decrying the broken politics of Washington. The conventional wisdom has it that partisan gridlock and dysfunction have become the new normal. That is certainly the case, and I have been among the harshest critics of this development. But it is also imperative to acknowledge the limits of this critique as it applies to the American governmental system.

Despite this apparent impotence, President Obama can liquidate American citizens without due processes, detain prisoners indefinitely without charge, conduct dragnet surveillance on the American people without judicial warrant and engage in unprecedented — at least since the McCarthy era — witch hunts against federal employees (the so-called “Insider Threat Program”). Within the United States, this power is characterized by massive displays of intimidating force by militarized federal, state and local law enforcement. Abroad, President Obama can start wars at will and engage in virtually any other activity whatsoever without so much as a by-your-leave from Congress, such as arranging the forced landing of a plane carrying a sovereign head of state over foreign territory. Despite the habitual cant of congressional Republicans about executive overreach by Obama, the would-be dictator, we have until recently heard very little from them about these actions — with the minor exception of comments from gadfly Senator Rand Paul of Kentucky. Democrats, save a few mavericks such as Ron Wyden of Oregon, are not unduly troubled, either — even to the extent of permitting seemingly perjured congressional testimony under oath by executive branch officials on the subject of illegal surveillance.

Yes, there is another government concealed behind the one that is visible at either end of Pennsylvania Avenue, a hybrid entity of public and private institutions ruling the country according to consistent patterns in season and out, connected to, but only intermittently controlled by, the visible state whose leaders we choose. My analysis of this phenomenon is not an exposé of a secret, conspiratorial cabal; the state within a state is hiding mostly in plain sight, and its operators mainly act in the light of day. Nor can this other government be accurately termed an “establishment.” All complex societies have an establishment, a social network committed to its own enrichment and perpetuation. In terms of its scope, financial resources and sheer global reach, the American hybrid state, the Deep State, is in a class by itself. That said, it is neither omniscient nor invincible. The institution is not so much sinister (although it has highly sinister aspects) as it is relentlessly well entrenched. Far from being invincible, its failures, such as those in Iraq, Afghanistan and Libya, are routine enough that it is only the Deep State’s protectiveness towards its higher-ranking personnel that allows them to escape the consequences of their frequent ineptitude.

How did I come to write an analysis of the Deep State, and why am I equipped to write it? As a congressional staff member for 28 years specializing in national security and possessing a top secret security clearance, I was at least on the fringes of the world I am describing, if neither totally in it by virtue of full membership nor of it by psychological disposition. But, like virtually every employed person, I became, to some extent, assimilated into the culture of the institution I worked for, and only by slow degrees, starting before the invasion of Iraq, did I begin fundamentally to question the reasons of state that motivate the people who are, to quote George W. Bush, “the deciders.”

Cultural assimilation is partly a matter of what psychologist Irving L. Janis called “groupthink,” the chameleon-like ability of people to adopt the views of their superiors and peers. This syndrome is endemic to Washington: The town is characterized by sudden fads, be it negotiating biennial budgeting, making grand bargains or invading countries. Then, after a while, all the town’s cool kids drop those ideas as if they were radioactive. As in the military, everybody has to get on board with the mission, and questioning it is not a career-enhancing move. The universe of people who will critically examine the goings-on at the institutions they work for is always going to be a small one. As Upton Sinclair said, “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”

The Deep State does not consist of the entire government. It is a hybrid of national security and law enforcement agencies: the Department of Defense, the Department of State, the Department of Homeland Security, the Central Intelligence Agency and the Justice Department. I also include the Department of the Treasury because of its jurisdiction over financial flows, its enforcement of international sanctions and its organic symbiosis with Wall Street. All these agencies are coordinated by the Executive Office of the President via the National Security Council. Certain key areas of the judiciary belong to the Deep State, such as the Foreign Intelligence Surveillance Court, whose actions are mysterious even to most members of Congress. Also included are a handful of vital federal trial courts, such as the Eastern District of Virginia and the Southern District of Manhattan, where sensitive proceedings in national security cases are conducted. The final government component (and possibly last in precedence among the formal branches of government established by the Constitution) is a kind of rump Congress consisting of the congressional leadership and some (but not all) of the members of the defense and intelligence committees. The rest of Congress, normally so fractious and partisan, is mostly only intermittently aware of the Deep State and when required usually submits to a few well-chosen words from the State’s emissaries.

I saw this submissiveness on many occasions. One memorable incident was passage of the Foreign Intelligence Surveillance Amendments Act of 2008. This legislation retroactively legalized the Bush administration’s illegal and unconstitutional surveillance first revealed by The New York Times in 2005 and indemnified the telecommunications companies for their cooperation in these acts. The bill passed easily: All that was required was the invocation of the word “terrorism” and most members of Congress responded like iron filings obeying a magnet. One who responded in that fashion was Senator Barack Obama, soon to be coronated as the presidential nominee at the Democratic National Convention in Denver. He had already won the most delegates by campaigning to the left of his main opponent, Hillary Clinton, on the excesses of the global war on terror and the erosion of constitutional liberties.

As the indemnification vote showed, the Deep State does not consist only of government agencies. What is euphemistically called “private enterprise” is an integral part of its operations. In a special series in The Washington Post called “Top Secret America,” Dana Priest and William K. Arkin described the scope of the privatized Deep State and the degree to which it has metastasized after the September 11 attacks. There are now 854,000 contract personnel with top-secret clearances — a number greater than that of top-secret-cleared civilian employees of the government. While they work throughout the country and the world, their heavy concentration in and around the Washington suburbs is unmistakable: Since 9/11, 33 facilities for top-secret intelligence have been built or are under construction. Combined, they occupy the floor space of almost three Pentagons — about 17 million square feet. Seventy percent of the intelligence community’s budget goes to paying contracts. And the membrane between government and industry is highly permeable: The Director of National Intelligence, James R. Clapper, is a former executive of Booz Allen Hamilton, one of the government’s largest intelligence contractors. His predecessor as director, Admiral Mike McConnell, is the current vice chairman of the same company; Booz Allen is 99 percent dependent on government business. These contractors now set the political and social tone of Washington, just as they are increasingly setting the direction of the country, but they are doing it quietly, their doings unrecorded in the Congressional Record or the Federal Register, and are rarely subject to congressional hearings.

Washington is the most important node of the Deep State that has taken over America, but it is not the only one. Invisible threads of money and ambition connect the town to other nodes. One is Wall Street, which supplies the cash that keeps the political machine quiescent and operating as a diversionary marionette theater. Should the politicians forget their lines and threaten the status quo, Wall Street floods the town with cash and lawyers to help the hired hands remember their own best interests. The executives of the financial giants even have de facto criminal immunity. On March 6, 2013, testifying before the Senate Judiciary Committee, Attorney General Eric Holder stated the following: “I am concerned that the size of some of these institutions becomes so large that it does become difficult for us to prosecute them when we are hit with indications that if you do prosecute, if you do bring a criminal charge, it will have a negative impact on the national economy, perhaps even the world economy.” This, from the chief law enforcement officer of a justice system that has practically abolished the constitutional right to trial for poorer defendants charged with certain crimes. It is not too much to say that Wall Street may be the ultimate owner of the Deep State and its strategies, if for no other reason than that it has the money to reward government operatives with a second career that is lucrative beyond the dreams of avarice — certainly beyond the dreams of a salaried government employee. [3]

The attitude of many members of Congress towards Wall Street was memorably expressed by Rep. Spencer Bachus (R-AL), the incoming chairman of the House Financial Services Committee, in 2010: “In Washington, the view is that the banks are to be regulated, and my view is that Washington and the regulators are there to serve the banks.”

The corridor between Manhattan and Washington is a well trodden highway for the personalities we have all gotten to know in the period since the massive deregulation of Wall Street: Robert Rubin, Lawrence Summers, Henry Paulson, Timothy Geithner and many others. Not all the traffic involves persons connected with the purely financial operations of the government: In 2013, General David Petraeus joined KKR (formerly Kohlberg Kravis Roberts) of 9 West 57th Street, New York, a private equity firm with $62.3 billion in assets. KKR specializes in management buyouts and leveraged finance. General Petraeus’ expertise in these areas is unclear. His ability to peddle influence, however, is a known and valued commodity. Unlike Cincinnatus, the military commanders of the Deep State do not take up the plow once they lay down the sword. Petraeus also obtained a sinecure as a non-resident senior fellow at the Belfer Center for Science and International Affairs at Harvard. The Ivy League is, of course, the preferred bleaching tub and charm school of the American oligarchy. [4]

Petraeus and most of the avatars of the Deep State — the White House advisers who urged Obama not to impose compensation limits on Wall Street CEOs, the contractor-connected think tank experts who besought us to “stay the course” in Iraq, the economic gurus who perpetually demonstrate that globalization and deregulation are a blessing that makes us all better off in the long run — are careful to pretend that they have no ideology. Their preferred pose is that of the politically neutral technocrat offering well considered advice based on profound expertise. That is nonsense. They are deeply dyed in the hue of the official ideology of the governing class, an ideology that is neither specifically Democrat nor Republican. Domestically, whatever they might privately believe about essentially diversionary social issues such as abortion or gay marriage, they almost invariably believe in the “Washington Consensus”: financialization, outsourcing, privatization, deregulation and the commodifying of labor. Internationally, they espouse 21st-century “American Exceptionalism”: the right and duty of the United States to meddle in every region of the world with coercive diplomacy and boots on the ground and to ignore painfully won international norms of civilized behavior. To paraphrase what Sir John Harrington said more than 400 years ago about treason, now that the ideology of the Deep State has prospered, none dare call it ideology. [5] That is why describing torture with the word “torture” on broadcast television is treated less as political heresy than as an inexcusable lapse of Washington etiquette: Like smoking a cigarette on camera, these days it is simply “not done.”

After Edward Snowden’s revelations about the extent and depth of surveillance by the National Security Agency, it has become publicly evident that Silicon Valley is a vital node of the Deep State as well. Unlike military and intelligence contractors, Silicon Valley overwhelmingly sells to the private market, but its business is so important to the government that a strange relationship has emerged. While the government could simply dragoon the high technology companies to do the NSA’s bidding, it would prefer cooperation with so important an engine of the nation’s economy, perhaps with an implied quid pro quo. Perhaps this explains the extraordinary indulgence the government shows the Valley in intellectual property matters. If an American “jailbreaks” his smartphone (i.e., modifies it so that it can use a service provider other than the one dictated by the manufacturer), he could receive a fine of up to $500,000 and several years in prison; so much for a citizen’s vaunted property rights to what he purchases. The libertarian pose of the Silicon Valley moguls, so carefully cultivated in their public relations, has always been a sham. Silicon Valley has long been tracking for commercial purposes the activities of every person who uses an electronic device, so it is hardly surprising that the Deep State should emulate the Valley and do the same for its own purposes. Nor is it surprising that it should conscript the Valley’s assistance.

Still, despite the essential roles of lower Manhattan and Silicon Valley, the center of gravity of the Deep State is firmly situated in and around the Beltway. The Deep State’s physical expansion and consolidation around the Beltway would seem to make a mockery of the frequent pronouncement that governance in Washington is dysfunctional and broken.
That the secret and unaccountable Deep State floats freely above the gridlock between both ends of Pennsylvania Avenue is the paradox of American government in the 21st century: drone strikes, data mining, secret prisons and Panopticon-like control on the one hand; and on the other, the ordinary, visible parliamentary institutions of self-government declining to the status of a banana republic amid the gradual collapse of public infrastructure.

This paradox is evident even within the Beltway itself, the richest metropolitan area in the nation. Although demographers and urban researchers invariably count Washington as a “world city,” that is not always evident to those who live there. Virtually every time there is a severe summer thunderstorm, tens — or even hundreds — of thousands of residents lose power, often for many days. There are occasional water restrictions over wide areas because water mains, poorly constructed and inadequately maintained, have burst. [6] The Washington metropolitan area considers it a Herculean task just to build a rail link to its international airport — with luck it may be completed by 2018.

We are faced with two disagreeable implications. First, that the Deep State is so heavily entrenched, so well protected by surveillance, firepower, money and its ability to co-opt resistance that it is almost impervious to change. Second, that just as in so many previous empires, the Deep State is populated with those whose instinctive reaction to the failure of their policies is to double down on those very policies in the future.

.....

Even Wall Street’s rentier operations have been affected: After helping finance the tea party to advance its own plutocratic ambitions, America’s Big Money is now regretting the Frankenstein’s monster it has created. Like children playing with dynamite, the tea party and its compulsion to drive the nation into credit default has alarmed the grown-ups commanding the heights of capital; the latter are now telling the politicians they thought they had hired to knock it off.

The final factor is Silicon Valley.....That said, evidence is accumulating that Silicon Valley is losing billions in overseas business from companies, individuals and governments that want to maintain privacy. For high tech entrepreneurs, the cash nexus is ultimately more compelling than the Deep State’s demand for patriotic cooperation.

The outcome of all these developments is uncertain. The Deep State, based on the twin pillars of national security imperative and corporate hegemony, has until recently seemed unshakable and the latest events may only be a temporary perturbation in its trajectory. But history has a way of toppling the altars of the mighty. While the two great materialist and determinist ideologies of the twentieth century, Marxism and the Washington Consensus, successively decreed that the dictatorship of the proletariat and the dictatorship of the market were inevitable, the future is actually indeterminate. It may be that deep economic and social currents create the framework of history, but those currents can be channeled, eddied, or even reversed by circumstance, chance and human agency.

Throughout history, state systems with outsized pretensions to power have reacted to their environments in two ways. The first strategy, reflecting the ossification of its ruling elites, consists of repeating that nothing is wrong, that the status quo reflects the nation’s unique good fortune in being favored by God and that those calling for change are merely subversive troublemakers. As the French ancien régime, the Romanov dynasty and the Habsburg emperors discovered, the strategy works splendidly for a while, particularly if one has a talent for dismissing unpleasant facts. The final results, however, are likely to be thoroughly disappointing.

The second strategy is one embraced to varying degrees and with differing goals, by figures of such contrasting personalities as Mustafa Kemal Atatürk, Franklin D. Roosevelt, Charles de Gaulle and Deng Xiaoping. They were certainly not revolutionaries by temperament; if anything, their natures were conservative. But they understood that the political cultures in which they lived were fossilized and incapable of adapting to the times. In their drive to reform and modernize the political systems they inherited, their first obstacles to overcome were the outworn myths that encrusted the thinking of the elites of their time.

The second, the reformers, offers a profusion of nostrums to turn the nation around: public financing of elections to sever the artery of money between the corporate components of the Deep State and financially dependent elected officials, government “insourcing” to reverse the tide of outsourcing of government functions and the conflicts of interest that it creates, a tax policy that values human labor over financial manipulation ...

For a non-superficial observer, there's no universalism evident in the working of US society except the universalisms of all societies grappling with physical and economic securities/anxieties.

We think that as scholars we are committed to the truth, to understanding it as objectively as possible; but in the view of the left-wing scholars who dominate academia in the West, and I suppose in India as well, truth means alignment with the status quo. They therefore feel no shame or embarrassment in disregarding the truth, because in their view it is only a so-called truth: a truth constructed by the upper, dominant section of society to wield power over the lower, dominated section, in order to gain its consent in its own oppression. Literature such as the Pañcatantra served that purpose in ancient times, and so the purpose of scholarship should be to demonstrate how it contributed to the hegemonic discourse.

Therefore, I don’t think we can get much further by developing critiques of Western scholarship. They are essential, but they are not sufficient. We need rather to develop our own scholarly interpretations of our texts and traditions, in a manner that is more persuasive and interesting than what Western scholars have done, and then use that as a platform for critiquing Indology.

panduranghari wrote:I heard that admissions to the humanities have dropped in the US, as parents are seeing a drop in prospective employment opportunities. Why would any parent willingly fund $20,000 per annum in tuition, only for the child to sit on their bottom for a long time afterwards? Perhaps the foundation of western universalism will take more of these hammer blows in the time to come.

20k PA is in-state tuition for state schools. It is more like 40k+ for out-of-state students or private schools. The total cost is upwards of 55k PA. So we are talking 225k for a UG degree. If it is STEM, then add another 25k for a grand total of 250k or more. If the family income is 75k PA, there are no need-based scholarships. Merit scholarships are very few in number and will offset only 35-50% of the cost. Amru parents are fscked when it comes to children's education and healthcare. Let us not even talk about pension, social security, etc.
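The back-of-the-envelope arithmetic above can be sketched as follows. This is a rough illustration only; all the per-year figures are the poster's ballpark estimates, not official numbers:

```python
# Rough cost-of-attendance arithmetic from the post above.
# All figures are the poster's ballpark estimates in USD per annum.
per_year_total = 55_000            # tuition + living costs, out-of-state / private
years = 4
ug_cost = per_year_total * years   # 220_000 (the post rounds up to ~225k)
stem_extra = 25_000                # additional estimated cost for a STEM program
stem_cost = ug_cost + stem_extra   # 245_000 (the post rounds up to ~250k)
print(ug_cost, stem_cost)          # 220000 245000
```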

If you were to come across someone who cried in the streets, who saw the world in terms of black and white and made death threats against strangers, who cowered in a special room and made public displays of naked self-harm and bloodletting, you might conclude that they were suffering from a personality disorder.

All these symptoms can be found in the High Conflict personality disorder category known as Axis II in DSM-V, including Anti-Social PD, Histrionic PD, Paranoid PD, Narcissistic PD, and Borderline PD.

Alternatively, you might reason that these are the everyday behaviors of the modern Social Justice Warrior (SJW).

Of course, not every SJW has a personality condition, but sufferers from High Conflict disorders are often drawn to extreme beliefs and behaviors under the illusion that they are acting politically.

A 2016 UK survey found that, since 1990, rates of depression and anxiety among the young have increased by 70%, while the American Counseling Association has reported a “rising tide of personality disorders among millennials.”

That such disorders appear to be an acute problem with this generation may be an unintended outcome of the unprecedented experiment conducted in the 1990s and 2000s by progressive parents.

Persecution Complex and the "Safe Space"

In 2014, a survey of 100,000 college students at 53 U.S. campuses by the American College Health Association found that 84% of U.S. students feel unable to cope, while more than half experience overwhelming anxiety.

A byproduct of such fear has been the growth of the “safe space,” a safe-haven for minority groups and distressed students from what they perceive as threats within campus life. Safe spaces contain comforting objects that evoke childhood -- bean bags, soothing music, Play-Doh, coloring books. The spaces often forbid entry to straight white men or political opponents.

The idea of “running to the safe space” is a form of psychological regression. The safe space presents a fantasy barrier against imagined exterior evils, and so encourages persecution paranoia and hyper-fragility. These are all symptoms of histrionic, borderline, and paranoid personality disorders that emerge from problems with the early child-parent bond.

The majority of millennial children (now aged 18-34) had two working parents; this was partly an ideological project of feminism and partly economic necessity. The downside was the damage done by daycare, services for which grew by 250% between the 1970s and ’90s (see Laura Perrins’ work on psychological trauma caused by daycare). According to Bowlby’s Maternal Deprivation Thesis, babies require two years of intimate attention to enable them to form the caregiver-child bond essential for secure ego formation. Any disturbance of this process will “predispose the children to respond in an anti-social way to later stresses.”

The National Institute of Child Health and Human Development has found:

Children in full-time day care were close to three times more likely to show behavior problems than those cared for by their mothers at home. The more time in child care of any kind or quality, the more aggressive the child.

The result is young people who, a decade and a half after daycare, scream at the parent/State for not protecting them sufficiently. It is no coincidence that “safe spaces” resemble daycare centers. Unfortunately, “safe spaces” enforce the distressed person’s fear of the world, trapping them in their original trauma within a psychological frame of permanent and inescapable victimhood.

“Trigger Warnings” and “Helicopter Parents”

For the SJW, everyday speech contains a multitude of “microaggressions,” or subconscious power dynamics which conceal colonial or patriarchal oppression. Failing to use the words prescribed by SJW activists -- most particularly in the case of “trans-people” -- is seen as an act of violence equivalent to physical assault. See, for example, a statement made by a protester at UC Berkeley in January 2017 at a protest event that turned into a violent riot: “Your free speech is raping and killing us.”

People with High Conflict Personality disorders experience similarly paranoid emotions about hidden messages, omnipotent threats, and imminent violence. They are hyper-alert and live with higher than normal levels of cortisol and adrenalin, which in turn causes lasting neurological damage, affecting their ability to reason and to regulate emotion. They panic easily and regress to infantile distress.

Faced with histrionic students, university staff end up behaving like “Helicopter Parents”: those largely absent, full-time working parents who overcompensated by flying in to fuss over their child. Attempting to assuage parental guilt, one of the tools they used was “positive parenting” -- a philosophy created by social Progressives. Parents were taught not to scold or punish, and instead to use “positive reinforcement” in an attempt to raise their children with “high self-esteem.” This ideology also became fashionable within an increasingly progressive school system that awarded children prizes for “non-competitive sports” and for merely taking part in school activities.

As they passed from day care through high school, these children with artificially enforced high self-esteem were also told that they were morally superior to the generations that came before. They were inducted into politically correct language and were even taught to lecture their own parents on racism, equality, and ecology.
From the ages of six to eighteen, they took part in yearly multiculturalist “save the planet” projects. They were told they had a heroic destiny as “agents of change.” A false picture of the world and a vastly inflated sense of self-importance did not compensate for the foundational trauma of parental neglect. Instead, as Dr. Jean Twenge has explained, Positive Parenting created young people with a “narcissistic wound” for whom the real world would be perceived as a threat to self-worth.

Border Violation and Self-Harm

The Positive Parenting movement expounded the beliefs that “there is no boundary between you and your child” and that “you are friends and equals.” For the child growing up without “paternalistic laws” and boundaries, the only way to find limits was to attack the only boundary it knew: its own bodily boundaries.

In this light it is worth exploring why the Fourth Wave feminist/social justice activist group known as “Femen” should mimic the outward signs of the BPD sufferer. Their trademark form of protest is public toplessness, with slogans written over the belly and breasts in fake or real blood. One classic Femen image is of an almost-naked woman holding a protest sign that reads: “Rape Me. I’m a Slut.” The intention may have been to demonstrate that no matter how provocatively a woman dresses, she is still not “asking for it.” But public nudity as a protest against sexual violation is a contradictory signal -- and sending out conflicted messages around dangerous sexual subjects is a symptom of BPD and NPD. The Femen protester may subconsciously be saying, “show me boundaries and control, show me authority and concern.” She might be displaying the pain of living within a self-in-contradiction.

Contradiction and Splitting

SJW protests are awash with contradictions. SJWs claim to fight for freedom, but are opposed to freedom of speech, support banning videos and books, and support the violent disruption of public talks, as was seen with the riots at UC Berkeley, Middlebury College, and elsewhere. SJWs believe in a world with “no boundaries” where “everyone is equal” -- free immigration, open access to healthcare and education, etc. -- but at the same time are obsessed with creating segregated spaces. While they protest against the “fascist patriarchal state,” they are, at the same time, fundamentally Statist, demanding that the government police language for them and punish their enemies. While SJWs claim to fight for human rights, they parade the symbol of the largest genocides in history -- the Communist flag. They are pro-feminist, and at the same time defend Sharia law.

Living-in-contradiction is similar to the “Love me -- I hate you” dynamic in Borderline pathology called “Splitting.” In splitting, everything is “all or nothing,” and the thing that was passionately idealized suddenly becomes an object of hatred. Traitors are everywhere. This was exemplified by the expulsion of gay men and “TERFs” -- “trans-exclusionary radical feminists” -- from LGBT+ groups by Intersectional feminists.

Along with splitting come the symptoms of low impulse control, histrionics, dysphoria, a pervasive sense of emptiness, suicidal ideation, and self-harming. Symbolic demonstrations of self-harming behaviors are widely used in SJW protests. Along with smearing faces with fake blood to signify female oppression, members of a protest group called Lesbians and Gays Support the Migrants in the UK in 2015 took razors to their arms in public to “spill rivers of blood.” With an attempted self-immolation and a reported “contagion” of suicide threats occurring during the Trump protests, thousands attempted to use politics as an alibi for a deeper inner compulsion to self-harm.

The Results of the Human Experiment

Trapped among infant neglect, artificially elevated self-esteem, and identity dysphoria, the millennials were set up for a fall. When they were pushed out of their parental homes in the 2010s, they discovered they did not have the tools to construct stable selves. They couldn’t blame their parents or teachers. Instead they searched for a vast, abstract, all-encompassing enemy. In identity politics they found a temporary unity, through hatred of Patriarchy, of Capitalism, of White Men. In President Trump they found their perfect enemy.

In the stages before psychosis, sufferers from High Conflict Personality disorders fixate on one object of hate. Subconsciously, they need this super-enemy so they can feel whole. This is the tragic truth of the identity politics of the SJW. Without a totalizing object of blame, the personality of the warrior for social justice falls apart.

While the SJWs idealize themselves as victims of omnipresent evil, they are in fact the victims of well-meaning liberal parents and progressive teachers who subjected them to an experiment in social engineering. They were the guinea pigs of the progressive project. Older generations of radicals then exploited their volatility and rage for political ends. What the Social Justice Warriors are actually asking for, when they scream at us, is our help.

HOW FRENCH “INTELLECTUALS” RUINED THE WEST: POSTMODERNISM AND ITS IMPACT, EXPLAINED

| by Helen Pluckrose |

Postmodernism presents a threat not only to liberal democracy but to modernity itself. That may sound like a bold or even hyperbolic claim, but the reality is that the cluster of ideas and values at the root of postmodernism have broken the bounds of academia and gained great cultural power in western society. The irrational and identitarian “symptoms” of postmodernism are easily recognizable and much criticized, but the ethos underlying them is not well understood. This is partly because postmodernists rarely explain themselves clearly and partly because of the inherent contradictions and inconsistencies of a way of thought which denies that a stable reality or reliable knowledge exists. However, there are consistent ideas at the root of postmodernism and understanding them is essential if we intend to counter them. They underlie the problems we see today in Social Justice Activism, undermine the credibility of the Left and threaten to return us to an irrational and tribal “pre-modern” culture.

Postmodernism, most simply, is an artistic and philosophical movement which began in France in the 1960s and produced bewildering art and even more bewildering “theory.” It drew on avant-garde and surrealist art and earlier philosophical ideas, particularly those of Nietzsche and Heidegger, for its anti-realism and rejection of the concept of the unified and coherent individual. It reacted against the liberal humanism of the modernist artistic and intellectual movements, which its proponents saw as naïvely universalizing a western, middle-class and male experience.

It rejected philosophy which valued ethics, reason and clarity with the same accusation. Structuralism, a movement which (often over-confidently) attempted to analyze human culture and psychology according to consistent structures of relationships, came under attack. Marxism, with its understanding of society through class and economic structures, was regarded as equally rigid and simplistic. Above all, postmodernists attacked science and its goal of attaining objective knowledge about a reality which exists independently of human perceptions, which they saw as merely another form of constructed ideology dominated by bourgeois, western assumptions. Decidedly left-wing, postmodernism had both a nihilistic and a revolutionary ethos which resonated with a post-war, post-empire zeitgeist in the West. As postmodernism continued to develop and diversify, its initially stronger nihilistic deconstructive phase became secondary (but still fundamental) to its revolutionary “identity politics” phase.

It has been a matter of contention whether postmodernism is a reaction against modernity. The modern era is the period of history which saw Renaissance Humanism, the Enlightenment, the Scientific Revolution and the development of liberal values and human rights; the period when Western societies gradually came to value reason and science over faith and superstition as routes to knowledge, and developed a concept of the person as an individual member of the human race deserving of rights and freedoms rather than as part of various collectives subject to rigid hierarchical roles in society.

The Encyclopaedia Britannica says postmodernism “is largely a reaction against the philosophical assumptions and values of the modern period of Western (specifically European) history” whilst the Stanford Encyclopaedia of Philosophy denies this and says “Rather, its differences lie within modernity itself, and postmodernism is a continuation of modern thinking in another mode.” I’d suggest the difference lies in whether we see modernity in terms of what was produced or what was destroyed. If we see the essence of modernity as the development of science and reason as well as humanism and universal liberalism, postmodernists are opposed to it. If we see modernity as the tearing down of structures of power including feudalism, the Church, patriarchy, and Empire, postmodernists are attempting to continue it, but their targets are now science, reason, humanism and liberalism. Consequently, the roots of postmodernism are inherently political and revolutionary, albeit in a destructive or, as they would term it, deconstructive way.

The term “postmodern” was coined by Jean-François Lyotard in his 1979 book, The Postmodern Condition. He defined the postmodern condition as “an incredulity towards metanarratives.” A metanarrative is a wide-ranging and cohesive explanation for large phenomena. Religions and other totalizing ideologies are metanarratives in their attempts to explain the meaning of life or all of society’s ills. Lyotard advocated replacing these with “mininarratives” to get at smaller and more personal “truths.” He addressed Christianity and Marxism in this way but also science.

In his view, “there is a strict interlinkage between the kind of language called science and the kind called ethics and politics” (p8). By tying science and the knowledge it produces to government and power he rejects its claim to objectivity. Lyotard describes this incredulous postmodern condition as a general one, and argues that from the end of the 19th century, “an internal erosion of the legitimacy principle of knowledge” began to cause a change in the status of knowledge (p39). By the 1960s, the resulting “doubt” and “demoralization” of scientists had made “an impact on the central problem of legitimization” (p8). No number of scientists telling him they are not demoralized nor any more doubtful than befits the practitioners of a method whose results are always provisional and whose hypotheses are never “proven” could sway him from this.

We see in Lyotard an explicit epistemic relativity (belief in personal or culturally specific truths or facts) and the advocacy of privileging “lived experience” over empirical evidence. We see too the promotion of a version of pluralism which privileges the views of minority groups over the general consensus of scientists or liberal democratic ethics which are presented as authoritarian and dogmatic. This is consistent in postmodern thought.

Michel Foucault’s work is also centered on language and relativism although he applied this to history and culture. He called this approach “archeology” because he saw himself as “uncovering” aspects of historical culture through recorded discourses (speech which promotes or assumes a particular view). For Foucault, discourses control what can be “known” and in different periods and places, different systems of institutional power control discourses. Therefore, knowledge is a direct product of power. “In any given culture and at any given moment, there is always only one ‘episteme’ that defines the conditions of possibility of all knowledge, whether expressed in theory or silently invested in a practice.”[1]

Furthermore, people themselves were culturally constructed. “The individual, with his identity and characteristics, is the product of a relation of power exercised over bodies, multiplicities, movements, desires, forces.”[2] He leaves almost no room for individual agency or autonomy. As Christopher Butler says, Foucault “relies on beliefs about the inherent evil of the individual’s class position, or professional position, seen as ‘discourse’, regardless of the morality of his or her individual conduct.”[3] He presents medieval feudalism and modern liberal democracy as equally oppressive, and advocates criticizing and attacking institutions to unmask the “political violence that has always exercised itself obscurely through them.” [4]

We see in Foucault the most extreme expression of cultural relativity read through structures of power in which shared humanity and individuality are almost entirely absent. Instead, people are constructed by their position in relation to dominant cultural ideas either as oppressors or oppressed. Judith Butler drew on Foucault for her foundational role in queer theory focusing on the culturally constructed nature of gender, as did Edward Said in his similar role in post-colonialism and “Orientalism” and Kimberlé Crenshaw in her development of “intersectionality” and advocacy of identity politics. We see too the equation of language with violence and coercion and the equation of reason and universal liberalism with oppression.

It was Jacques Derrida who introduced the concept of “deconstruction,” and he too argued for cultural constructivism and cultural and personal relativity. He focused even more explicitly on language. Derrida’s best-known pronouncement “There is no outside-text” relates to his rejection of the idea that words refer to anything straightforwardly. Rather, “there are only contexts without any center of absolute anchoring.” [5]

Therefore the author of a text is not the authority on its meaning. The reader or listener makes their own equally valid meaning and every text “engenders infinitely new contexts in an absolutely nonsaturable fashion.” Derrida coined the term différance, which he derived from the verb “différer,” meaning both “to defer” and “to differ.” This was to indicate that not only is meaning never final but it is constructed by differences, specifically by oppositions. The word “young” only makes sense in its relationship with the word “old” and he argued, following Saussure, that meaning is constructed by the conflict of these elemental oppositions which, to him, always form a positive and negative. “Man” is positive and “woman” negative. “Occident” is positive and “Orient” negative. He insisted that “We are not dealing with the peaceful co-existence of a vis-a-vis, but rather with a violent hierarchy. One of the two terms governs the other (axiologically, logically, etc.), or has the upper hand. To deconstruct the opposition, first of all, is to overturn the hierarchy at a given moment.”[6] Deconstruction, therefore, involves inverting these perceived hierarchies, making “woman” and “Orient” positive and “man” and “Occident” negative. This is to be done ironically to reveal the culturally constructed and arbitrary nature of these perceived oppositions in unequal conflict.

We see in Derrida further relativity, both cultural and epistemic, and further justification for identity politics. There is an explicit denial that differences can be other than oppositional and therefore a rejection of Enlightenment liberalism’s values of overcoming differences and focusing on universal human rights and individual freedom and empowerment. We see here the basis of “ironic misandry” and the mantra “reverse racism isn’t real” and the idea that identity dictates what can be understood. We see too a rejection of the need for clarity in speech and argument, of the need to understand the other’s point of view, and of the need to avoid misinterpretation. The intention of the speaker is irrelevant. What matters is the impact of speech. This, along with Foucauldian ideas, underlies the current belief in the deeply damaging nature of “microaggressions” and misuse of terminology related to gender, race or sexuality.


Lyotard, Foucault, and Derrida are just three of the “founding fathers” of postmodernism but their ideas share common themes with other influential “theorists” and were taken up by later postmodernists who applied them to an increasingly diverse range of disciplines within the social sciences and humanities. We’ve seen that this includes an intense sensitivity to language on the level of the word and a feeling that what the speaker means is less important than how it is received, no matter how radical the interpretation. Shared humanity and individuality are essentially illusions and people are propagators or victims of discourses depending on their social position; a position which is dependent on identity far more than their individual engagement with society. Morality is culturally relative, as is reality itself. Empirical evidence is suspect and so are any culturally dominant ideas including science, reason, and universal liberalism. These are Enlightenment values which are naïve, totalizing and oppressive, and there is a moral necessity to smash them. Far more important is the lived experience, narratives and beliefs of “marginalized” groups all of which are equally “true” but must now be privileged over Enlightenment values to reverse an oppressive, unjust and entirely arbitrary social construction of reality, morality and knowledge.

The desire to “smash” the status quo, challenge widely held values and institutions and champion the marginalized is absolutely liberal in ethos. Opposing it is resolutely conservative. This is the historical reality, but we are at a unique point in history where the status quo is fairly consistently liberal, with a liberalism that upholds the values of freedom, equal rights and opportunities for everyone regardless of gender, race and sexuality. The result is confusion in which life-long liberals wishing to conserve this kind of liberal status quo find themselves considered conservative and those wishing to avoid conservatism at all costs find themselves defending irrationalism and illiberalism. Whilst the first postmodernists mostly challenged discourse with discourse, the activists motivated by their ideas are becoming more authoritarian and following those ideas to their logical conclusion. Freedom of speech is under threat because speech is now dangerous. So dangerous that people considering themselves liberal can now justify responding to it with violence. The need to argue a case persuasively using reasoned argument is now often replaced with references to identity and pure rage.

Despite all the evidence that racism, sexism, homophobia, transphobia and xenophobia are at an all-time low in Western societies, Leftist academics and SocJus activists display a fatalistic pessimism, enabled by postmodern interpretative “reading” practices which valorize confirmation bias. The authoritarian power of the postmodern academics and activists seems to be invisible to them whilst being apparent to everyone else. As Andrew Sullivan says of intersectionality:

“It posits a classic orthodoxy through which all of human experience is explained — and through which all speech must be filtered. … Like the Puritanism once familiar in New England, intersectionality controls language and the very terms of discourse.” [7]

Postmodernism has become a Lyotardian metanarrative, a Foucauldian system of discursive power, and a Derridean oppressive hierarchy.

The logical problem of self-referentiality has been pointed out to postmodernists by philosophers fairly constantly but it is one they have yet to address convincingly. As Christopher Butler points out, “the plausibility of Lyotard’s claim for the decline of metanarratives in the late 20th century ultimately depends upon an appeal to the cultural condition of an intellectual minority.” In other words, Lyotard’s claim comes directly from the discourses surrounding him in his bourgeois academic bubble and is, in fact, a metanarrative towards which he is not remotely incredulous. Equally, Foucault’s argument that knowledge is historically contingent must itself be historically contingent, and one wonders why Derrida bothered to explain the infinite malleability of texts at such length if I could read his entire body of work and claim it to be a story about bunny rabbits with the same degree of authority.

This is, of course, not the only criticism commonly made of postmodernism. The most glaring problem, that of epistemic cultural relativity, has been addressed by philosophers and scientists. The philosopher David Detmer, in Challenging Postmodernism, says:

“Consider this example, provided by Erazim Kohak, ‘When I try, unsuccessfully, to squeeze a tennis ball into a wine bottle, I need not try several wine bottles and several tennis balls before, using Mill’s canons of induction, I arrive inductively at the hypothesis that tennis balls do not fit into wine bottles’… We are now in a position to turn the tables on [postmodernist claims of cultural relativity] and ask, ‘If I judge that tennis balls do not fit into wine bottles, can you show precisely how it is that my gender, historical and spatial location, class, ethnicity, etc., undermine the objectivity of this judgement?’” [8]

However, he has not found postmodernists committed to explaining their reasoning, and describes a bewildering conversation with the postmodern philosopher Laurie Calhoun:

“When I had occasion to ask her whether or not it was a fact that giraffes are taller than ants, she replied that it was not a fact, but rather an article of religious faith in our culture.”

Physicists Alan Sokal and Jean Bricmont address the same problem from the perspective of science in Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science:

“Who could now seriously deny the ‘grand narrative’ of evolution, except someone in the grip of a far less plausible master narrative such as Creationism? And who would wish to deny the truth of basic physics? The answer was, ‘some postmodernists.’”

and

“There is something very odd indeed in the belief that in looking, say, for causal laws or a unified theory, or in asking whether atoms really do obey the laws of quantum mechanics, the activities of scientists are somehow inherently ‘bourgeois’ or ‘Eurocentric’ or ‘masculinist’, or even ‘militarist.'”

How much of a threat is postmodernism to science? There are certainly some external attacks. In the recent protests against a talk given by Charles Murray at Middlebury, the protesters chanted, as one,

“Science has always been used to legitimize racism, sexism, classism, transphobia, ableism, and homophobia, all veiled as rational and fact, and supported by the government and state. In this world today, there is little that is true ‘fact.'”[9]

When the organizers of the March for Science tweeted:

“colonization, racism, immigration, native rights, sexism, ableism, queer-, trans-, intersex-phobia, & econ justice are scientific issues,”[10] many scientists immediately criticized this politicization of science and derailment of the focus from the preservation of science to intersectional ideology. In South Africa, the #ScienceMustFall and #DecolonizeScience progressive student movement announced that science was only one way of knowing that people had been taught to accept. They suggested witchcraft as one alternative. [11]

Despite this, science as a methodology is not going anywhere. It cannot be “adapted” to include epistemic relativity and “alternative ways of knowing.” It can, however, lose public confidence and thereby, state funding, and this is a threat not to be underestimated. Also, at a time in which world rulers doubt climate change, parents believe false claims that vaccines cause autism and people turn to homeopaths and naturopaths for solutions to serious medical conditions, it is dangerous to the degree of an existential threat to further damage people’s confidence in the empirical sciences.

The social sciences and humanities, however, are in danger of changing out of all recognition. Some disciplines within the social sciences already have. Cultural anthropology, sociology, cultural studies and gender studies, for example, have succumbed almost entirely not only to moral relativity but epistemic relativity. English (literature) too, in my experience, is teaching a thoroughly postmodern orthodoxy. Philosophy, as we have seen, is divided. So is history.

Empirical historians are often criticized by the postmodernists among us for claiming to know what really happened in the past. Christopher Butler recalls Diane Purkiss’ accusation that Keith Thomas was enabling a myth that grounded men’s historical identity in “the powerlessness and speechlessness of women” when he provided evidence that accused witches were usually powerless beggar women. Presumably, he should have claimed, against the evidence, that they were wealthy women or better still, men. As Butler says,

“It seems as though Thomas’s empirical claims here have simply run foul of Purkiss’s rival organizing principle for historical narrative – that it should be used to support contemporary notions of female empowerment” (p36)

I encountered the same problem when trying to write about race and gender at the turn of the seventeenth century. I’d argued that Shakespeare’s audiences would not have found Desdemona’s attraction to Black Othello, who was Christian and a soldier for Venice, so difficult to understand, because prejudice against skin color did not become prevalent until a little later in the seventeenth century when the Atlantic Slave Trade gained steam, and because religious and national differences were far more profound before that. I was told this was problematic by an eminent professor and asked how Black communities in contemporary America would feel about my claim. If today’s African Americans felt bad about it, it was implied, it either could not have been true in the seventeenth century or it is morally wrong to mention it. As Christopher Butler says,

“Postmodernist thought sees the culture as containing a number of perpetually competing stories, whose effectiveness depends not so much on an appeal to an independent standard of judgement, as upon their appeal to the communities in which they circulate.”

I fear for the future of the humanities.

The dangers of postmodernism are not limited to pockets of society which center around academia and Social Justice, however. Relativist ideas, sensitivity to language and focus on identity over humanity or individuality have gained dominance in wider society. It is much easier to say what you feel than rigorously examine the evidence. The freedom to “interpret” reality according to one’s own values feeds into the very human tendency towards confirmation bias and motivated reasoning.

It has become commonplace to note that the far-Right is now using identity politics and epistemic relativism in a very similar way to the postmodern-Left. Of course, elements of the far-Right have always been divisive on the grounds of race, gender and sexuality and prone to irrational and anti-science views but postmodernism has produced a culture more widely receptive to this. Kenan Malik describes this shift,

“When I suggested earlier that the idea of ‘alternative facts’ draws upon ‘a set of concepts that in recent decades have been used by radicals’, I was not suggesting that Kellyanne Conway, or Steve Bannon, still less Donald Trump, have been reading up on Foucault or Baudrillard… It is rather that sections of academia and of the left have in recent decades helped create a culture in which relativized views of facts and knowledge seem untroubling, and hence made it easier for the reactionary right not just to re-appropriate but also to promote reactionary ideas.”[12]

This “set of concepts” threatens to take us back to a time before the Enlightenment, when “reason” was regarded as not only inferior to faith but as a sin. James K. A. Smith, Reformed theologian and professor of philosophy, has been quick to see the advantages for Christianity and regards postmodernism as “a fresh wind of the Spirit sent to revitalize the dry bones of the church” (p18). In Who’s Afraid of Postmodernism?: Taking Derrida, Lyotard, and Foucault to Church, he says,

“A thoughtful engagement with postmodernism will encourage us to look backward. We will see that much that goes under the banner of postmodern philosophy has one eye on ancient and medieval sources and constitutes a significant recovery of premodern ways of knowing, being, and doing.” (p25)

and

“Postmodernism can be a catalyst for the church to reclaim its faith not as a system of truth dictated by a neutral reason but rather as a story that requires ‘eyes to see and ears to hear’” (p125)

We on the Left should be very afraid of what “our side” has produced. Of course, not every problem in society today is the fault of postmodern thinking, and it is not helpful to suggest that it is. The rise of populism and nationalism in the US and across Europe is also due to a strong existing far-Right and the fear of Islamism produced by the refugee crisis. Taking a rigidly “anti-SJW” stance and blaming everything on this element of the Left is itself rife with motivated reasoning and confirmation bias. The Left is not responsible for the far-Right or the religious-Right or secular nationalism, but it is responsible for not engaging with reasonable concerns reasonably, and thereby making itself harder for reasonable people to support. It is responsible for its own fragmentation, purity demands and divisiveness, which make even the far-Right appear comparatively coherent and cohesive.

In order to regain credibility, the Left needs to recover a strong, coherent and reasonable liberalism. To do this, we need to out-discourse the postmodern-Left. We need to meet their oppositions, divisions and hierarchies with universal principles of freedom, equality and justice. There must be a consistency of liberal principles in opposition to all attempts to evaluate or limit people by race, gender or sexuality. We must address the concerns about immigration, globalism and authoritarian identity politics that are currently empowering the far-Right, rather than calling people who express them “racist,” “sexist” or “homophobic” and accusing them of wanting to commit verbal violence. We can do this whilst continuing to oppose authoritarian factions of the Right who genuinely are racist, sexist and homophobic, but can now hide behind a façade of reasonable opposition to the postmodern-Left.

Our current crisis is not one of Left versus Right but of consistency, reason, humility and universal liberalism versus inconsistency, irrationalism, zealous certainty and tribal authoritarianism. The future of freedom, equality and justice looks equally bleak whether the postmodern Left or the post-truth Right wins this current war. Those of us who value liberal democracy and the fruits of the Enlightenment and Scientific Revolution and modernity itself must provide a better option.

—————————

Helen Pluckrose is a researcher in the humanities who focuses on late medieval/early modern religious writing for and about women. She is critical of postmodernism and cultural constructivism, which she sees as currently dominating the humanities. You can connect with her on Twitter @HPluckrose

Popular Western theories of human development rest on the belief that we are born dependent, and that the task of socialization is to raise increasingly independent, individualistic people. This developmental model treats separation from others as a sign of maturity. Individuals in this model are able to "stand on their own two feet." My colleagues and I believe that this developmental model has eroded or weakened the place of relationship in our culture.

Other cultures, primarily Eastern, focus on the centrality of relationships to all human health and well-being. When human beings develop within the context of human relationships, a much more sophisticated interpersonal neural network is built, one that allows a person to participate in relationships in a way that calms the stress response system, builds the immune system, and creates a sense of belonging. In this setting, when a person is separated from his group, a warning alarm of pain sounds, telling him that he is in danger.

So, there's a whole physiology there just waiting to be tapped into, if we are setting up social societies in a way that really focuses on the centrality of relationships to health and well-being.

That is true regardless of the patient's age. For example, the College Diabetes Network is a program available in hundreds of colleges across the country. It was formed in recognition that the transition to college can be a challenging time for teens, who need to take more responsibility for their lives and their health. Chronic illnesses, such as diabetes, can make them feel different at their new college. In an effort to belong, they may overlook the challenges of their medical condition and, with no parents to remind them to check sugars or monitor what they eat, their A1C levels can rise dramatically.

Within this network, college students with diabetes text each other their blood sugars. The results are impressive: simply reaching out in this way helps students keep their blood sugars under control through interpersonal accountability.

Unfortunately, as a society, we have drifted away from the idea that our elders have great wisdom to contribute to the next generation. So many aging people have lost a sense of meaning or purpose, and as friends die, there is great risk for isolation. This should be a major health concern for all practitioners.

In order to regain credibility, the Left needs to recover a strong, coherent and reasonable liberalism. To do this, we need to out-discourse the postmodern-Left. We need to meet their oppositions, divisions and hierarchies with universal principles of freedom, equality and justice.

This is excellent. The Left wants to go post-postmodern, epitomised by Corbynomics in the UK. History shows the Left never goes back to the beginning without a bloody revolution. My reading is that the author wants to prevent this otherwise inevitable outcome.

I first noticed our national lurch toward fantasy in 2004, after President George W. Bush’s political mastermind, Karl Rove, came up with the remarkable phrase reality-based community. People in “the reality-based community,” he told a reporter, “believe that solutions emerge from your judicious study of discernible reality … That’s not the way the world really works anymore.” A year later, The Colbert Report went on the air. In the first few minutes of the first episode, Stephen Colbert, playing his right-wing-populist commentator character, performed a feature called “The Word.” His first selection: truthiness. “Now, I’m sure some of the ‘word police,’ the ‘wordinistas’ over at Webster’s, are gonna say, ‘Hey, that’s not a word!’ Well, anybody who knows me knows that I’m no fan of dictionaries or reference books. They’re elitist. Constantly telling us what is or isn’t true. Or what did or didn’t happen. Who’s Britannica to tell me the Panama Canal was finished in 1914? If I wanna say it happened in 1941, that’s my right. I don’t trust books—they’re all fact, no heart … Face it, folks, we are a divided nation … divided between those who think with their head and those who know with their heart … Because that’s where the truth comes from, ladies and gentlemen—the gut.”

Whoa, yes, I thought: exactly. America had changed since I was young, when truthiness and reality-based community wouldn’t have made any sense as jokes.

Much more than the other billion or so people in the developed world, we Americans believe—really believe—in the supernatural and the miraculous, in Satan on Earth, in reports of recent trips to and from heaven, and in a story of life’s instantaneous creation several thousand years ago. We believe that the government and its co-conspirators are hiding all sorts of monstrous and shocking truths from us, concerning assassinations, extraterrestrials, the genesis of aids, the 9/11 attacks, the dangers of vaccines, and so much more. And this was all true before we became familiar with the terms post-factual and post-truth, before we elected a president with an astoundingly open mind about conspiracy theories, what’s true and what’s false, the nature of reality.

Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.

Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily. Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.

Esalen was the center of the cyclone of the youth rebellion. It was one of the central places, like Mecca for the Islamic culture. Esalen was a pilgrimage center for hundreds and thousands of youth interested in some sense of transcendence, breakthrough consciousness, LSD, the sexual revolution, encounter, being sensitive, finding your body, yoga—all of these things were at first filtered into the culture through Esalen. By 1966, ’67, and ’68, Esalen was making a world impact.

Not long before Esalen was founded, one of its co-founders, Dick Price, had suffered a mental breakdown and been involuntarily committed to a private psychiatric hospital for a year. His new institute embraced the radical notion that psychosis and other mental illnesses were labels imposed by the straight world on eccentrics and visionaries, that they were primarily tools of coercion and control. This was the big idea behind One Flew Over the Cuckoo’s Nest, of course. And within the psychiatric profession itself this idea had two influential proponents, who each published unorthodox manifestos at the beginning of the decade—R. D. Laing (The Divided Self) and Thomas Szasz (The Myth of Mental Illness). “Madness,” Laing wrote when Esalen was new, “is potentially liberation and renewal.” Esalen’s founders were big Laing fans, and the institute became a hotbed for the idea that insanity was just an alternative way of perceiving reality.

Its author was Theodore Roszak, age 35, a Bay Area professor who thereby coined the word counterculture. Roszak spends 270 pages glorying in the younger generation’s “brave” rejection of expertise and “all that our culture values as ‘reason’ and ‘reality.’ ” (Note the scare quotes.) So-called experts, after all, are “on the payroll of the state and/or corporate structure.” A chapter called “The Myth of Objective Consciousness” argues that science is really just a state religion. To create “a new culture in which the non-intellective capacities … become the arbiters of the good [and] the true,” he writes, “nothing less is required than the subversion of the scientific world view, with its entrenched commitment to an egocentric and cerebral mode of consciousness.” He welcomes the “radical rejection of science and technological values.”

Granted complete freedom of thought, Thomas Jefferson and company assumed, most people would follow the path of reason. Wasn’t it pretty to think so.

During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood. Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs. All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe. The borders between fiction and nonfiction are permeable, maybe nonexistent. The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.

These ideas percolated across multiple academic fields. In 1965, the French philosopher Michel Foucault’s Madness and Civilization was published in America, echoing Laing’s skepticism of the concept of mental illness; by the 1970s, he was arguing that rationality itself is a coercive “regime of truth”—oppression by other means. Foucault’s suspicion of reason became deeply and widely embedded in American academia.

Meanwhile, over in sociology, in 1966 a pair of professors published The Social Construction of Reality, one of the most influential works in their field. The rulers of any tribe or society do not just dictate customs and laws; they are the masters of everyone’s perceptions, defining reality itself. To create the all-encompassing stage sets that everyone inhabits, rulers first use crude mythology, then more elaborate religion, and finally the “extreme step” of modern science. “Reality”? “Knowledge”? “If we were going to be meticulous,” Berger and Luckmann wrote, “we would put quotation marks around the two aforementioned terms every time we used them.”

Americans felt newly entitled to believe absolutely anything. I’m pretty certain that the unprecedented surge of UFO reports in the ’70s was not evidence of extraterrestrials’ increasing presence but a symptom of Americans’ credulity and magical thinking suddenly unloosed. We wanted to believe in extraterrestrials, so we did. What made the UFO mania historically significant rather than just amusing, however, was the web of elaborate stories that were now being spun: not just of sightings but of landings and abductions—and of government cover-ups and secret alliances with interplanetary beings. Those earnest beliefs planted more seeds for the extravagant American conspiracy thinking that by the turn of the century would be rampant and seriously toxic.

The sincerely credulous are perfect suckers, and in the late ’60s, a convicted thief and embezzler named Erich von Däniken published Chariots of the Gods?, positing that extraterrestrials helped build the Egyptian pyramids, Stonehenge, and the giant stone heads on Easter Island. That book and its many sequels sold tens of millions of copies, and the documentary based on it had a huge box-office take in 1970. Americans were ready to believe von Däniken’s fantasy to a degree they simply wouldn’t have been a decade earlier, before the ’60s sea change. Certainly a decade earlier NBC wouldn’t have aired an hour-long version of the documentary in prime time. And while I’m at it: Until we’d passed through the ’60s and half of the ’70s, I’m pretty sure we wouldn’t have given the presidency to some dude, especially a born-again Christian, who said he’d recently seen a huge, color-shifting, luminescent UFO hovering near him.

Relativism became entrenched in academia—tenured, you could say. Michel Foucault’s rival Jean Baudrillard became a celebrity among American intellectuals by declaring that rationalism was a tool of oppressors that no longer worked as a way of understanding the world, pointless and doomed. In other words, as he wrote in 1986, “the secret of theory”—this whole intellectual realm now called itself simply “theory”—“is that truth does not exist.”

This kind of thinking was by no means limited to the ivory tower. The intellectuals’ new outlook was as much a product as a cause of the smog of subjectivity that now hung thick over the whole American mindscape. After the ’60s, truth was relative, criticizing was equal to victimizing, individual liberty became absolute, and everyone was permitted to believe or disbelieve whatever they wished. The distinction between opinion and fact was crumbling on many fronts.

Many Americans announced that they’d experienced fantastic horrors and adventures, abuse by Satanists, and abduction by extraterrestrials, and their claims began to be taken seriously. Parts of the establishment—psychology and psychiatry, academia, religion, law enforcement—encouraged people to believe that all sorts of imaginary traumas were real.

America didn’t seem as weird and crazy as it had around 1970. But that’s because Americans had stopped noticing the weirdness and craziness. We had defined every sort of deviancy down. And as the cultural critic Neil Postman put it in his 1985 jeremiad about how TV was replacing meaningful public discourse with entertainment, we were in the process of amusing ourselves to death.

In 1998, as soon as we learned that President Bill Clinton had been fellated by an intern in the West Wing, his popularity spiked. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment. American politics happened on television; it was a TV series, a reality show just before TV became glutted with reality shows. A titillating new story line that goosed the ratings of an existing series was an established scripted-TV gimmick. The audience had started getting bored with The Clinton Administration, but the Monica Lewinsky subplot got people interested again.

Before the web, cockamamy ideas and outright falsehoods could not spread nearly as fast or as widely, so it was much easier for reason and reasonableness to prevail. Before the web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants. In the digital age, however, every tribe and fiefdom and principality and region of Fantasyland—every screwball with a computer and an internet connection—suddenly had an unprecedented way to instruct and rile up and mobilize believers, and to recruit more. False beliefs were rendered both more real-seeming and more contagious, creating a kind of fantasy cascade in which millions of bedoozled Americans surfed and swam.

By “sacred”, I mean that which endures beyond our lifetime and beyond the lifetime of our children, the enduring characteristics that make us unique and will continue to distinguish us from the other peoples of the world, and which cannot be violated without destroying our sense of who we are. The sacred is what a country’s soldiers are willing to die to protect; unless there is something for which we are willing to die, we will find nothing for which we are willing to live.

The New Nationalism arose in opposition to the prevailing social philosophy of modern liberalism, namely the assertion that every individual should invent his identity. Existence precedes essence, in Jean-Paul Sartre’s cartoonish existentialism, which for him meant that we can invent our own essence. Many septic currents converged to create what we might term the postmodern notion of identity: Rousseau’s New Man, Nietzsche’s nihilism, Heidegger’s historicism, Ibsen’s assault on bourgeois society, Freud’s conjuring of the unconscious, and of course Sartre. Everything, including the most fundamental determinant of nature, namely gender, has now become a social construct; tradition has become a black museum of past instances of racism, misogyny and colonialism; and self-invention has become the universal and exclusive sacrament of a new secular religion. We mean many things by the term globalism, but the psychic content of globalisation is to leave not one stone of our past atop another so that the ground may be cleared for the arbitrary assertion of individual identity. The conceit that we can “define and express” our own identity is perhaps the cruellest hoax ever perpetrated on civilised peoples. We are neither clever enough nor strong enough to do so; to the extent that we succeed in such an endeavour, we make ourselves into monsters. There is a better word for postmodern identity, and that is anomie. Cut off from its past, the postmodern West has no vision of its future, and its most characteristic response is to fail to bring children into the world, thus ensuring that it will have no future of any kind.

To discuss theological criteria for the constitution of a secular republic runs against the grain of modern political thought, even though constitutional restrictions on popular sovereignty imply reliance on an authority that is greater than human. In a republic the people are sovereign, yet the purpose of a constitution is precisely to restrict the power of any future majority . . . The only basis for a polity to accept severe restrictions on popular majority rule is the conviction that the founding constitution derives its power from a higher form of sovereignty than the voters in any given legislative session. Without such a theological foundation, a republic cannot feel bound by the rules laid down by its founders. A purely secular republic would self-destruct because it could not protect its constitution from constant amendment.

The sanctity of the British Constitution is embodied in the monarch, who is anointed in emulation of the ancient kings of Israel for this reason.[?]

Alexander Gauland, the most influential spokesman for AfD, has characterised Americans as “a people thrown together at random without its own culture”. In fact, it is far easier to identify the unique characteristics of American culture than it is in the case of German culture. America is the progeny of Britain’s radical Protestants, who believed that sovereignty and sanctity must be founded in the individual citizen rather than in the person of the monarch. American political thought flows directly from the revelation theology of the Reformation, which first appeared in the West with John Wycliffe and the 14th-century Lollards, and reemerges in the writings of John Selden and John Milton. Sola scriptura presumes that every individual receives revelation directly from Scripture, and a state founded on American principles thus presumes a nation of Bible readers — which America emphatically was at the time of our Revolution.

Whether or not they attend Church of England services, Britons retain a pronounced sense of the sacred — one might say in spite of the feckless Church of England rather than because of it. Whether or not they read the Bible (and most still do), Americans retain a sense of the sacred which is pervaded by the radical Protestantism of 17th-century British thinkers. This concept of individual sovereignty precludes a monarchy and a state church. No national culture is so monothematically obsessed with the theme of individual redemption. It is a routine observation that the English Separatists who founded the Plymouth Bay Colony saw themselves as a new Mission in the Wilderness whose task was to found a new City on a Hill, an “almost chosen people”, in Lincoln’s bon mot. That is unremarkable; what still astonishes about the United States is that a nonconforming religious doctrine embraced by a small minority of Britons became the foundation of the popular culture of the world’s most powerful country.

America’s high culture is sparse and in many respects deficient, but its provenance and purpose are unmistakable. It is unimportant that not one American in a thousand has actually waded through Bunyan’s allegory. All of our popular fiction, from Mark Twain’s Huckleberry Finn to the protagonists of Westerns and hard-boiled detective fiction, descends from John Bunyan’s Pilgrim’s Progress. Huckleberry Finn, who lights out to the new territory ahead of the others; the cowboy who rides off into the sunset; the private detective who fades into the urban nightscape; and the entire host of misfits and loners who stock the pantheon of American literary protagonists are all recognisable versions of Bunyan’s Christian. American literary critics have given little thought to the provenance of their popular heroes, although in private conversation the late Professor Harry Jaffa identified Huck as an avatar of Bunyan’s pilgrim. They are not the knights errant of European Romanticism, but hard men and sinners who hold to their own code of honour, and stand up to corrupt authority. The outlaw William Munny in Clint Eastwood’s 1990 film Unforgiven is a Christian pilgrim in the American understanding as much as is Huckleberry Finn.

Our “America First” President stands squarely in the mainstream of American culture. Donald Trump, I argued on the day of his Inauguration, “is instantly recognisable as the protagonist in a Christian drama: the lone avenger who stands up to the wicked powers of the world and calls them out for combat. Ted Cruz, though an engaged and enthusiastic evangelical Christian, failed to understand the religious impulse of the American electorate. They did not want a politician-pastor to preach to them what they already knew. They wanted a hero, sinner though he be, to give battle to the forces of evil — a Jephtha or a Saul.” No matter how far it strays from conventional religion, American culture remains stamped indelibly with the Calvinist doctrine of total depravity. It does not look for saints who live exemplary lives but for sinners in search of redemption.

American identity has proven robust, transmitting the characteristics of the Puritan founders into its popular culture. It has pulverised the ethnic cultures that immigrants brought with them and removed virtually all memory of them by the second or third generation. Unlike the cultures of Europe, it depends on an imagined rather than an actual past. If Britain’s national epics are the histories of Shakespeare or Malory’s retelling of Arthur, America’s national epic is the King James Bible. The American pilgrimage emulates the history of Israel, with one profound difference: in the Protestant incarnation of an “almost chosen people”, Americans know that the City of God is not of this world, and that the goal of the earthly pilgrimage always beckons over the horizon. In a thousand versions, Americans are the poor wayfaring stranger of the song, hoping that “there is no sickness, toil or danger in that bright land to which I go”. In keeping with the limits of imagined memory, American culture is thin and repetitive. It provided poor soil for the flourishing of a literary high culture. Nonetheless, the American sense of the sacred is renewed and reinforced by a popular culture that retells the story of the individual journey to redemption.

What Abraham Lincoln called “the mystic chords of memory” are Hebraic rather than Anglo-Saxon. America retained Britain’s regard for individual rights, but it shifted the source of these rights to the direct and immediate relationship of God and the individual citizen. In different ways, Britain and America anchor their identities in that most ancient and robust of all national cultures, namely Israel. The Americans are Hebrews of the imagination; their mother country hopes to build Jerusalem in England’s green and pleasant land, and identifies its monarchy in symbolic as well as mythological fashion with the throne of David. Curiously, this divide mirrors a Biblical ambiguity over the desirability of monarchy which persisted through ancient and medieval rabbinic commentaries.

Although a great gulf is fixed between the sense of the sacred in Great Britain and America, they remain siblings who in different ways draw upon Israel’s idea of the sacred as conveyed by the Bible. What is sacred in both countries, moreover, pervades the popular culture, and is understood as easily by the least-educated citizens as by the mandarins of their high culture. Herr Gauland could not be more wrong to claim that Americans have no culture. German culture is a far more elusive entity, which helps to explain why German nationalism remains a problematic movement.

A brief aside on the question of what identity is and how it is formed is in order. Identity is above all being and time; we do not need to consult Heidegger on this point, for St Augustine explained it a millennium and a half before him in Confessions XI. The durability of biblical identity stems in part from the fact that it anchors memory in time. “Revelation is the first thing to set its mark firmly into the middle of time; only after Revelation do we have an immovable Before and Afterward,” wrote Franz Rosenzweig. “Then there is a reckoning of time independent of the reckoner and the place of reckoning, valid for all the places of the world.” The peoples of the world look back in time, through innumerable conquests, migrations, and forced assimilations that stretch beyond all records, and encounter not time, but “once upon a time,” in myth. America’s identity, like Israel’s, begins with an event. The peoples who imagine themselves to be the autochthonous sons of the soil find that memory dissipates into the mists of time.

That is a conundrum for the Christian world, which sees its spiritual origin at Golgotha but its ethnic origin in the impenetrable mists of the distant past. To be a whole person, the Christian must find a way to reconcile these two demarcations of memory. One solution is to embrace myth. C.S. Lewis argued that Christianity “is simply a true myth: a myth working on us in the same way as the others, but with this tremendous difference that it really happened: and one must be content to accept it in the same way, remembering that it is God’s myth where the others are men’s myths: i.e. the Pagan stories are God expressing Himself through the minds of poets, using such images as He found there, while Christianity is God expressing Himself through what we call real things.”

All of this was said before and better by the earliest German Romantics in the immediate aftermath of the French Revolution. It was Germany’s bad luck to fight its way to nationhood in response to the Napoleonic conquests, at a time when its intellectuals largely had abandoned religion in favour of philosophy. The modern notion of self-invention erupts into history through the Cult of Reason. Its prophets were Rousseau and Kant, who were armed by the French Revolution. In Germany, the first salvo against the Cult of Reason came from J. G. Fichte, who argued that Kant’s transcendental reason could only exist in human consciousness, and that this consciousness must arise from nationality. Fichte’s young student Novalis inquired rather how national consciousness might be formed in the first place. Consciousness, he argued long before Heidegger, is time-consciousness, and our sense of ourselves in the present moment is an ecstatic unity of memory and anticipation. But upon what memory might the Germans draw? In his 1799 essay “Christianity or Europe,” Novalis proposed a return to the Christianity of the early Middle Ages, but a Christianity that would ennoble the tales (Märchen) of the past. The disenchanted world which Schiller bemoaned in his poem “The Gods of Greece” would thus become reenchanted (wiederverzaubert) through the fairytale world that underlay medieval Christianity.

That, in summary, was the Romantic programme, and a very bad idea it turned out to be. Where Schiller and Hölderlin hoped to re-enchant the world with Hellenism, the Romantics succeeded only in dredging up the raw material for a revived paganism. Heinrich Heine warned in 1834 that “if ever the Cross — the taming talisman — were to fracture, then the wildness of the old warriors will clatter to the surface, their mad berserkers’ rage . . . and the old stone gods will raise themselves up out of the forgotten dust and rub the dust of a millennium from their eyes, and Thor with his giant hammer will at last rise up and smash the Gothic domes.” In the hands of Richard Wagner, the Nibelungenlied became an anti-Christian manifesto. In the 19th century Germany brought forth a peerless high culture in literature, music and the sciences, but its national sense of the sacred lost its way in the land of Märchen and was bewitched by the pagan past.

Germany had the misfortune to worship itself in the past, and in consequence most Germans today have no access to the sacred. They do not know how to begin to ask the relevant questions.

Arjun wrote:What's the latest on left-lib idiot ''philosopher'' Martha Nussbaum ? The cat got her tongue ?? Why don't I see any denunciation of her homeland / Trump post elections, like we had seen of Modi / Gujarat as she went about sermonizing in India?

While tiresome and unfair complaints about Jews are also a staple of world history, Dr. Mark Kalthoff opens with a rare line of critique that turns into a compliment, saying the Hebrews’ main contributions to the West’s heritage lie not in the arts, military victories, or technological advancement, but in religion. While a relatively tiny minority of people now and ever have been religiously Jewish, that minority religion has had inestimably large historical effects.

As thinkers including Russell Kirk have famously noted, culture itself arises largely from religious ideas and practices. For a thing called a culture to exist, its participants must have some kind of understood social order that “has a purchase on us,” Kalthoff says, some transcendent truths the people hold in common. These truths are almost universally religious in nature, as they must be accepted on faith rather than demonstrated through scientific proofs. Kalthoff outlines four key beliefs about the Hebrew deity crucial to creating the West: God is one, not many; he is transcendent, not a part of this world; he is sovereign; and he is good.

He represents the orthodox point of view. He doesn't beat around the bush by claiming that the varna vyavastha isn't birth based, which it certainly is. He explains the traditional (paramparic) way of educating a person for a job versus the modern (vyakulta) form of education. This created a fairly stable society: employment was essentially guaranteed, in-house education was available to everyone, and people pretty much knew what they were going to be in the future. That stability enabled the society to remain cohesive even with massive internal divisions.

This is hardly unique to India; it is basically how feudal societies operated in most parts of the world, including the Christian West. Western authors and thinkers understood this long ago and have managed to appreciate it for what it was rather than condemning it as an eternal dark age. At the same time, Western thinkers - even conservatives and the orthodox - understand that this is something you simply cannot implement in the modern era.

However, Nishchalananda Saraswati goes further and says that this varna vyavastha could even be implemented in the modern era, which frankly sounds quite impractical. That said, liberalism and the social changes that have come with it have definitely created problems of their own:

1. Status anxiety. Everyone wants more and more, and there are only finite resources in the world. Further, those who are 'losers' are considered to have deserved it (since we have a 'meritocracy').
2. Technology and automation have created an environment where anyone could be rendered obsolete at any time, thus becoming a 'loser'.

In such a world, the concepts of liberalism, democracy, etc. come increasingly under question, and it is in this context that we find ourselves contemplating alternate world-views.

Nishchalananda Saraswati doesn't acknowledge problems with the varna system. While we can make comparisons for ancient, and to some extent medieval, India and show that its creative, cultural and economic output was very great compared to other civilizations of the time despite having a 'caste system', we can't make the same claims for modern India. Part of that can be explained by the Mughal/British invasions and colonization, but not fully. Nishchalananda overestimates the importance of parampara (legacy thinking) and underestimates revolutionary thinking. This is particularly acute in the fields of technology and engineering. Machines are designed with previous machines in mind (incremental innovation), but there are also entirely new kinds of machines that represent disruptive innovation. And all major technology developments have been disruptive in nature.

I can say that I am certainly happy that these types of people are not given power in a secular country. Ashay Naik has already enunciated a similar view of Hindu secularism - the separation of paramarthika from vyavaharika - in his interpretation of the Panchatantra ("Natural Enmity"). Separation between organized religious institutions and the State is something that existed in ancient India as well, and we would do well to follow that example. At the same time, secularism should not mean a rejection of religion or the appeasement of minorities, which is the view constantly espoused by Marxists/Nehruvians.

ramana wrote:BTW, I heard on the radio today that Steve Bannon is inspired by some French philosopher who also inspired Mussolini.

Ramana saar, you're referring to the Italian philosopher Julius Evola, who was also heavily influenced by Indic scriptures - although, to paraphrase Rajiv Malhotra, from a Western gaze. I have some of his books in ePub format if interested; if nothing else, they help one understand the thought process of the average Breitbart fanatic.

csaurabh wrote:Nishchalananda Saraswati doesn't acknowledge problems with the varna system. While we can make comparisons for ancient, and to some extent medieval, India and show that its creative, cultural and economic output was very great compared to other civilizations of the time despite having a 'caste system', we can't make the same claims for modern India. Part of that can be explained by the Mughal/British invasions and colonization, but not fully. Nishchalananda overestimates the importance of parampara (legacy thinking) and underestimates revolutionary thinking. This is particularly acute in the fields of technology and engineering. Machines are designed with previous machines in mind (incremental innovation), but there are also entirely new kinds of machines that represent disruptive innovation. And all major technology developments have been disruptive in nature.

I can say that I am certainly happy that these types of people are not given power in a secular country. Ashay Naik has already enunciated a similar view of Hindu secularism - the separation of paramarthika from vyavaharika - in his interpretation of the Panchatantra ("Natural Enmity"). Separation between organized religious institutions and the State is something that existed in ancient India as well, and we would do well to follow that example. At the same time, secularism should not mean a rejection of religion or the appeasement of minorities, which is the view constantly espoused by Marxists/Nehruvians.

Let me come to the defense of Shri Nishchalananda Saraswati ji. First, thanks for posting the video. To understand what the Shankaracharya is saying, one has to really understand our purusharthas/objectives, and then understand that our systems of VarnAshrama were designed to meet those objectives. Many of the critics misunderstand this basic premise of what VarnAshrama is trying to do, and the inevitable comparisons with Western systems and paradigms follow from this misdiagnosis. Comparisons and contrasts with European experiences of class and the organized church - the experiences of another people's history - quickly follow, leading to a complete mess and muddled thinking even among folks sympathetic to Indic systems.

The statement that major technology developments have been disruptive in nature certainly affects the form of VarnAshrama as practiced in a given time frame; however, it does not change the basic nature of the objectives. IOW: one is free to evolve VarnAshrama or even junk it, as long as the focus of the system remains on those objectives. My humble submission is that an evolved VarnAshrama, and correspondingly an evolved Dharma Shastra, is the need of the hour. However, this is possible only once we are done with the kool-aid our current society is on, based on the individual, a rights-based framework and equality. Technology has changed, systems have changed, but not man and his quest to define what is considered righteous. This sense of what is right involves not just the individual: man, being a social being, has to expand it to the family, the community, the nation and the worlds. People like me call this system Sanatana Dharma.

The so-called orthodox, like the Shankaracharya, are the true preservers of this Dharma. We can evolve the means and methods, but the objectives have to remain the same.

csaurabh wrote:Nishchalananda Saraswati doesn't acknowledge problems with the varna system. While we can make comparisons of ancient India and even medieval India to some extent and show that India's creative, cultural and economic output was very great compared to other civilizations of the time despite having a 'caste system', we can't make the same claims for modern India. ... And all major technology developments have been disruptive in nature.

Thanks for posting the video, but I am not sure you listened to what was said (even if you disagree with what he said, please do a good purva paksha). He clearly uses the word bhranti. That is to say, the claim that the Varna/Jati system has problems is itself falling into a trap - a bhranti - created by the British and others who want you to destroy the very system that has ensured the survival of your civilization. His explanation of bheda and abheda is right out of the books in which these very ideas were argued to establish the system's cost-effectiveness, scientific basis (based on pramanas, if you understand what that means) and righteousness, or Dharma. The problem for "moderns" is that they lack the skill to do a good purva paksha: they learn some other framework - and that too only semi-correctly - and then pounce on applying that learning, without understanding that to take on the Varna/Jati system (if that is what you want to do), one has to first do a clear analysis of its framework and then show how to debunk it.

Further, he clearly says that the use of machines and the aggressive harnessing of nature will result in the deterioration of the very substrate that provides us sustenance, and he uses the scriptures to prove his point. Ignoring all this and going on to say that he is ignoring revolutionary thinking is falling into exactly what he has advised you to avoid onlee. If you want to debunk his argument, you have to show that a non-Varna/Jati system has been capable of producing sustainable development where there is no identity crisis (not status anxiety) for the individual. So you have your work cut out for you...

Just saying "we can't make the same claims for modern India", etc. will not do, or make any difference to the experiential basis of why the system holds. A million legislations and bad laws later, this confounding system - howsoever bastardized into the Caste System - has survived, as proved by every matrimonial ad in India.

His argument is that Sanatana Dharma requires the Varna Ashrama system to provide a framework. A more contemporary way to say the same is as follows: the Varna Ashrama system is the only sustainable way to preserve the well-being and identity of the individual (bheda) and that of the species (abheda). Before one jumps to a quick dismissal (via LBW) of the learned swamy, one has to understand his framework as he understands it.

ShauryaT wrote:Let me come to the defense of Shri Nishchalananda Saraswati ji. First, thanks for posting the video. To understand what the Shankaracharya is saying, one has to really understand our purusharthas/objectives, and then understand that our systems of VarnAshrama were designed to meet those objectives.

...

The so-called orthodox, like the Shankaracharya, are the true preservers of this Dharma. We can evolve the means and methods, but the objectives have to remain the same.

Yes, but... The Shankaracharya is happily suggesting intellectual perfection, and all his solutions are intellectual. In the old world, humans who listened to such individuals took their ideas and acted; there is no such luxury in these times. Which means, whose father what goes onlee?

Also, there is something uncomfortable about sitting on doughy bottoms and viewing the world enact its maya as a detached advaitin! After all, धर्मो रक्षति रक्षितः (Dharma, protected, protects!) - the tautology demands action. The failure is in educating about the objectives and in not acting! This is compounded by the deracinated merely regurgitating sermons from someone else's pulpit!

While emphasizing that in what follows I neither attack nor defend the caste system, let us look at the existing descriptions and their consequences.

(a) Caste is an antiquated social system that arose in the dim past of India. If this is true, it has survived many challenges – the onslaught of Buddhism and the Bhakti movements, the Islamic and British colonization, Indian independence, world capitalism – and might even survive ‘globalization’. It follows, then, that the caste system is a very stable social organization.

(b) There exists no centralized authority to enforce the caste system across the length and breadth of India. In that case, it is an autonomous and decentralized organization.

(c) All kinds of social and political regulations, whether by the British or by the Indians, have not been able to eradicate this system. If true, it means that the caste system is a self-reproducing social structure.

(d) The caste system exists among the Hindus, the Sikhs, the Jains, the Christians, the Muslims… It has also existed in different environments. This means that the system adapts itself to whatever environment it finds itself in.

(e) Because new castes have come and gone over the centuries, this system must also be dynamic.

(f) Since the caste system is present in different political organizations and survives under different political regimes, it is also neutral with respect to political ideologies.

Even though more can be said, this is enough for us. A simple redescription of what we think we know about the caste system tells us that it is an autonomous, decentralized, stable, adaptive, dynamic, self-reproducing social organization. It is also neutral with respect to political, religious and economic doctrines and environments. If indeed such a system ever existed, would it also not have been the most ideal form of social organization one could ever think of?