Essential Concepts in Sociology

Social life is in a constant process of change, and sociology cannot afford to stand still. Sociology today is theoretically diverse, covers a huge range of subjects and draws on a broad array of research methods. Central to this endeavour is the use of core concepts and ideas which allow sociologists to make sense of societies, though our understanding of these concepts is constantly evolving and changing. This clear and jargon-free book introduces a careful selection of essential concepts that have helped to shape sociology, and others that continue to do so. Going beyond brief, dictionary-style definitions, Anthony Giddens and Philip W. Sutton provide an extended discussion of each concept which sets it into historical and theoretical context, explores its main meanings in use, introduces some relevant criticisms and points readers to its ongoing development in contemporary research and theorizing. Organized in ten thematic sections, the book offers a portrait of sociology through its essential concepts, ranging from capitalism, identity and deviance to citizenship, the environment and intersectionality. It will be essential reading for all those new to sociology, as well as those seeking a reliable route map for a rapidly changing world.


8
Health, Illness and the Body
Biomedicine
Working Definition
A Western model of medical practice in which disease is defined objectively, in accordance with the presence of recognized physical symptoms, and scientifically derived medical treatments are sought to restore the body to health.
Origins of the Concept
Before the industrial age and the advent of a scientific understanding of disease, people relied on the traditional remedies passed down through the family and a variety of healers who had a special status in the community. Some of these older forms of healing continue to exist today, though in the developed countries they come under the general umbrella of ‘complementary therapies’ or ‘alternative medicine’. They are ‘alternative’ because, for more than two hundred years, Western ideas about medicine have been dominant, as expressed in the biomedical model of health. Biomedicine rose to dominance along with the modern scientific methods on which it is based, and both form the basis of most national healthcare systems across the world. As science was applied to illness, disease came to be defined objectively in terms of identifiable and objective ‘signs’ located in the body, as opposed to the symptoms being experienced by the patient. Formal medical care by trained ‘experts’ became the accepted way of treating both physical and mental illnesses. Medicine also became a tool of reform for behaviours or conditions perceived as ‘deviant’ – from crime to homosexuality and mental illness.
Meaning and Interpretation
The biomedical model of health has several central elements. Disease is viewed as a breakdown within the human body which diverts it from its ‘normal’ state of being or ‘health’. To restore the body to health, the cause of the disease must be isolated, treated and eliminated. Biomedicine treats the mind and body separately, so, when patients attend for diagnosis, medical professionals view them as essentially ‘sick bodies’ rather than as rounded individuals. The focus is on curing their disease, which can be investigated and treated in isolation from all personal factors.
Medical specialists adopt a ‘medical gaze’, a detached approach to viewing and treating the sick patient. The treatment is to be carried out in a neutral, value-free manner, with information collected and compiled, in clinical terms, in a patient’s official file. Properly trained medical specialists are considered the only experts in the treatment of disease and the medical profession adheres to a recognized code of ethics. There is no room for self-taught healers or ‘non-scientific’ medical practices. The hospital represents the most appropriate environment in which to treat serious illnesses, as these treatments often rely on some combination of technology, medication or surgery.
Critical Points
Over the past thirty years or so the biomedical model has been the object of growing criticism, and much of the sociological literature in this area has a critical tone. Some scholars claim that the effectiveness of scientific medicine is overrated. In particular, some historians of medicine argue that, in spite of the prestige that modern medicine has acquired, improvements in the overall health of populations have very little to do with the implementation of a biomedical model of illness (McKeown 1976). Most of the dramatic public health improvements seen since the early nineteenth century can actually be attributed to social and environmental changes, such as public sanitation systems, more effective food production methods and better nutrition, alongside public health campaigns leading to effective hygiene practices. McKeown’s argument is that these general social and environmental advances contributed more to lowering mortality and morbidity rates than the interventions of scientific medicine. The admittedly significant impact of pharmaceuticals, vaccinations and hospital treatment came to the fore only towards the middle of the twentieth century.
Ivan Illich (1975) even suggested that modern medicine has done more harm than good because of iatrogenesis, or ‘physician-caused’ disease. Illich argued that there are three types: clinical, social and cultural iatrogenesis. Clinical iatrogenesis is where medical treatment makes the patient worse or creates new conditions. Social iatrogenesis is where medicine expands into more and more areas, creating an artificial demand for its services. Social iatrogenesis, Illich maintained, leads to cultural iatrogenesis, where the ability to cope with the challenges of everyday life is progressively reduced by medical explanations and alternatives. To critics like Illich, the scope of modern medicine should be dramatically reduced.
A further line of criticism is that biomedicine discounts the opinions and experiences of the patients it seeks to treat. Because medicine is based on objective, scientific understanding, there is no need to listen to the individual interpretations that patients give. Critics argue that effective treatment can take place only when the patient is treated as a thinking, capable being with their own valid understanding. The division between medics and patients can often lead to misunderstandings and a lack of trust, social factors that can interfere with diagnosis and treatment.
Finally, scientific medicine presents itself as superior to any alternative form. However, alternative therapies, some old and some very recently devised, have risen to prominence over recent decades. Many people today are likely to make use of acupuncture, homeopathy, reflexology, chiropractic and many more. The reasons for this are complex, but sociologists suggest that people turn to alternative medicine when all biomedical treatments have failed, when they have lost faith in scientific medicine, or when their conditions are chronic and not easily ‘cured’. The last point is highly significant.
Medical sociologists identified a shift over the twentieth century in the types of illnesses people face, away from acute and towards chronic, often lifelong ones such as diabetes, high blood pressure and arthritis. As chronic conditions become more common, medicine seems less powerful and the biomedical model seems less appropriate. As such conditions need to be managed rather than cured, patients themselves can become experts on how best to handle their own health, and this is tending to change the doctor–patient relationship as the patient’s opinion and experience becomes crucial to treatment regimes. The patient has become an active, ‘whole’ being whose overall well-being – not just physical health – is important.
Continuing Relevance
Biomedicine has, over recent decades, faced an onslaught of criticism, which shows no sign of abating. However, we have to remember that it remains the dominant model for healthcare systems around the world, and the preventative vaccinations against life-threatening conditions such as polio and tuberculosis have transformed infant mortality rates and saved many lives. In times of health crisis, such as the swine flu outbreak of 2009 or the emergence and spread of HIV/AIDS in the 1980s, people still look to medical science to provide effective treatments, which probably indicates an underlying assumption that biomedicine is a superior form.
However, it is now generally accepted that chronic and disabling conditions have become much more salient and politically significant, and the sociology of health and illness needs to embrace disability studies if the field is to remain vibrant. Scambler and Scambler’s (2010) edited collection brings together some of the innovative scholarship in this area, uniting around the contention that chronic illness and disabilities amount to ‘assaults on the lifeworld’ which demand that we grasp the interrelation of the psychological, biological and sociological if we are to understand them properly.
The rise of alternative medicine presents a constant challenge for mainstream healthcare – should alternative therapies be kept out or allowed in? The relationship between these two systems is explored in a study by Mizrachi et al. (2005) which looks at collaborations in an Israeli hospital setting between biomedical practitioners and alternative therapists, mainly acupuncturists. Alternative therapists had managed to ‘invade the fortress’, but they had signally failed to shift the boundaries between the two systems. Biomedical professionals adopted a strategy of ‘boundary at work’ or ‘on the job’ rather than a formal top-down policy, in order to contain the potential competitor while also avoiding increasing tensions. Using a variety of subtle methods, biomedical professionals are able to control the alternative practitioners but also have to afford them a measure of legitimacy.
References and Further Reading
Illich, I. (1975) Medical Nemesis: The Expropriation of Health (London: Calder & Boyars).
McKeown, T. (1976) The Role of Medicine: Dream, Mirage or Nemesis? (Oxford: Blackwell).
Mizrachi, N., Shuval, J. T., and Gross, S. (2005) ‘Boundary at Work: Alternative Medicine in Biomedical Settings’, Sociology of Health and Illness, 27(1): 20–43.
Nettleton, S. (2013) The Sociology of Health and Illness (3rd edn, Cambridge: Polity), esp. chapter 1.
Scambler, G., and Scambler, S. (eds) (2010) New Directions in the Sociology of Chronic and Disabling Conditions: Assaults on the Lifeworld (Basingstoke: Palgrave Macmillan).
Medicalization
Working Definition
The process through which lifestyle matters, such as weight, smoking or sexual practices, become transformed into medical issues to be treated by medical professionals.
Origins of the Concept
The concept of medicalization was devised in the 1960s and 1970s as part of a critical attack on the perceived dangers of an expanding medical profession, which some saw as becoming too powerful. Critics such as Ivan Illich, Irving Zola, R. D. Laing, Thomas Szasz and Michel Foucault saw medicine as a form of social control, with patients falling under the supervision of medical professionals. Szasz, for example, criticized the growing expertise of psychiatry and described many conditions that were labelled ‘mental illness’ as simply ‘problems with living’. Some behaviours which were best characterized as adaptations to difficult circumstances were being medicalized and people brought under the control and supervision of experts with the power to detain them. Since the 1970s, the concept of medicalization has moved into the mainstream of sociological studies of health and illness.
Meaning and Interpretation
For sociologists who are critical of the biomedical model, the medical profession as a whole holds a position of power that is unwarranted and even dangerous. One aspect of this social power comes from the ability of the medical profession to define exactly what does and what does not constitute illness and health. By doing so, medics are arbiters of ‘medical truth’, and their views have to be taken seriously by governments and the general public. However, a more stringent criticism of modern medicine concerns the way that, over time, it has continually expanded into more and more realms of life that were previously considered private or just part of everyday lifestyles. This long-term process is described as medicalization.
Feminist sociologists have shown how many aspects of women’s lives, such as pregnancy and childbirth, have been medicalized and appropriated by modern medicine. In the developed world, childbirth routinely takes place in hospitals under the direction of predominantly male specialists. Pregnancy, a common and natural phenomenon, has come to be treated as something akin to an ‘illness’ that is laden with risks and dangers and thus has to be constantly monitored using the latest technologies, such as ultrasound scans and other examinations. Although this may seem a ‘good thing’, as medicine has helped to lower the child mortality rate, ensuring that a majority of babies and mothers survive childbirth, feminists see that as a partial story. Women have also lost control over this process, a key part of their lives, and their opinions and knowledge are deemed irrelevant by the new experts.
Similar concerns about the medicalization of apparently ‘normal’ conditions have been raised in relation to hyperactivity in young children, unhappiness or mild depression – commonly regulated with the help of medications such as Prozac – and persistent tiredness, which has been redefined as chronic fatigue syndrome. An issue with such episodes of medicalization is that, once a condition has been diagnosed in medical terms, the ‘cure’ tends to be found in medicines and drugs which bring with them side effects.
Ivan Illich argued forcefully that the expansion of modern medicine has done more harm than good because of iatrogenesis, or ‘physician-caused’ illness. According to Illich, one type of iatrogenesis is social iatrogenesis, or medicalization, which creates an artificial demand for medical services. As medicalization progresses, people become less able to deal with their own health and more dependent on healthcare professionals. This dependency leads to a greater demand for health services and the expansion of medical services in a vicious upward cycle that pushes health budgets higher at the expense of other services. For Illich, the key to changing this is to challenge the power of medics in society.
Critical Points
Critics of the medicalization thesis see it as somewhat overplayed. There are some problems with the expansion of medicine into new areas, but medicalization also brings many benefits. Moving childbirth into hospitals may have sidelined some local ‘experts’, but the major benefit is that the overwhelming majority of babies are born safely and even very premature babies are likely to survive. Historical accounts of childbirth before modern medicine now read like horror stories, and it was common for babies and/or mothers to die in the process. Surely no one would want to deny that hospital childbirth, for all its faults, is genuinely an improvement? Similarly, medicalization can allow people with some conditions to have them taken seriously and to find help. Those suffering with chronic fatigue syndrome were often seen as malingerers, people with ME struggled to convince others of the reality of their symptoms, and children with ADHD were seen as just plain naughty before the condition was identified as a genuine medical problem. Medicalization may not be as damaging or dangerous as some social theorists believe.
Continuing Relevance
The medicalization thesis has been an important strand of criticism in many sociological studies, and the recent challenges to biomedical dominance seem to suggest that the thesis has found a receptive audience. But we do need to temper our criticisms with the recognition that modern healthcare systems are capable of change, such as the introduction of some less invasive complementary therapies into the mainstream. What was once a radical and, in truth, rather eccentric and marginal approach to biomedicine and health has in the twenty-first century quite rapidly become part of many accounts of health and illness.
The issue of obesity is now seen as a global medical problem which threatens to overwhelm national health systems. Wray and Deery (2008) look at the way that body size has been brought under the gendered medical gaze, with particular implications for women’s body image and self-esteem. In particular, large body size has come to be seen as symbolic of broader moral failures and unnecessary over-indulgence. The authors argue that this illegitimate connection threatens to undermine women’s perceptions of an equal right to healthcare as well as leading them to question their sense of self.
What has sleep to do with medicalization? One study of newspaper representations of the health problems of insomnia and snoring suggests that sleep may be yet another part of life to be medicalized (Williams et al. 2008). The authors show that two quite similar and related issues – insomnia and snoring – are treated differently in media reports of sleep problems. In the case of insomnia, the condition is reported as a symptom rather than an illness, and one that is related to the individual’s habits. In this way, although quite sympathetic, newspapers suggest behavioural changes, with pills and treatments viewed as a ‘last resort’. By contrast, snoring is seen as akin to passive smoking – affecting others – and a clear health problem in its own right, leading potentially to serious conditions such as sleep apnoea. Not just medical professionals, then, but journalists too play a key role in the social processes leading to medicalization.
References and Further Reading
Nye, R. A. (2003) ‘The Evolution of the Concept of Medicalization in the Late Twentieth Century’, Journal of the History of the Behavioral Sciences, 39(2): 115–29.
Williams, S. J., Seale, C., Boden, S., Lowe, P. K., and Steinberg, D. L. (2008) ‘Medicalization and Beyond: The Social Construction of Insomnia and Snoring in the News’, Health, 12(2): 251–68.
Wray, S., and Deery, R. (2008) ‘The Medicalization of Body Size and Women’s Healthcare’, Health Care for Women International, 29(3): 227–43.
Sick Role
Working Definition
A concept devised by Talcott Parsons to explain the social expectations attached to illness and the behaviour of sick people, deviation from which leads to sanctions and social stigma.
Origins of the Concept
When people fall ill they seek advice from medical professionals, who examine them, provide a diagnosis, and suggest a course of treatment aimed at restoring them back to health. This is an apparently simple and self-explanatory process – but not according to the American sociologist Talcott Parsons. Parsons (1952) observed that, although health and illness appear to be simple matters that lie outside the remit of sociology, in fact there is good reason to believe that we should approach them as social phenomena, using standard sociological concepts. Parsons argued that when people are ill they behave in certain socially approved ways, and if they deviate from these they may not be accepted as ‘ill’ at all. He also saw that there exist some key gatekeepers who sanction our illness as well as our return to health. The concept of a ‘sick role’ fell out of favour along with general functionalism in sociology during the 1970s and 1980s, but there has been some interest in reviving it for use in the comparative study of sickness across societies.
Meaning and Interpretation
For sociologists, people are not only individually sick, they also have to learn what society expects of them when they are sick. Parsons argued that there exists a sick role, a way of ‘being ill’, which societies impose on individuals. This is necessary in order that the disruptive impact of illness on the smooth operation of social institutions can be minimized. When we are ill it often becomes impossible to continue to attend work, perform routine household tasks or play our usual part in family life. As a result, our illness has an impact on work colleagues, family members and friends, and the ripples of our inability to participate fully in society effectively spread the burden of illness to others. The sick role is therefore a way of establishing what we should expect of ill people and how they should behave. For Parsons, people have to learn how to be ill. That is, they have to understand what is expected of them when they are sick and to put that knowledge into action should they fall ill.
First, people are not personally responsible for being sick and therefore cannot be blamed: scientific medicine understands that most illnesses are not the fault of the individual sick person and that the onset of illness is unrelated to the individual’s behaviour or actions. Second, the sick role entitles people to some rights and privileges, including the withdrawal from work and family duties, while behaviour that would normally be unacceptable will be tolerated. Third, the sick person must work to regain their health by consulting a medical expert and agreeing to become a ‘patient’. This is crucial. The sick role is strictly temporary and ‘conditional’, contingent on the sick person actively trying to get well. In order to occupy the sick role, an individual must receive the sanction of a medical professional who legitimates their claim of illness. The patient is expected to cooperate in his or her own recovery by following ‘doctor’s orders’, but should they not comply their special status may be revoked.
A useful threefold distinction was provided by Freidson (1970), who identified conditional, unconditional and illegitimate sick roles. The conditional legitimate sick role is generally a short-term role performed by those who are ill but are expected to get well again quite soon. By contrast, the unconditional legitimate role applies when individuals have chronic conditions that require management but are unlikely ever to be cured completely. Consequently, this sick role is expected to be a permanent one, and no stigma or sanctions will be enforced if the person does not get well. The illegitimate sick role occurs when people suffer from conditions for which they are widely perceived to be, at least partly, responsible. Illnesses related to alcoholism, obesity or smoking are current examples for which people may well be treated with suspicion or be stigmatized. Freidson’s typology helps us to make sense of the reasons for the different ways that groups of people are treated when they are ill.
Critical Points
Parsons’s thesis of the sick role has been very influential, linking individual illness to the institutional structure of society. But as Parsonian functionalism lost ground, so too did his thesis of the sick role. One missing element is the actual experience of ‘being ill’. How do people experience acute or chronic illness and what impact does it have on their self-identity? This simple question led to a raft of new empirical studies in medical sociology which paid little heed to the ideas of Parsons. The consensual character of his ideas has also been seen as failing accurately to describe many encounters between patients and medical professionals. Empirical work since Parsons has detailed numerous cases of conflict as patients challenge the competence and diagnoses of medics. Such challenges have arguably become more widespread given the less deferential attitude towards ‘experts’ since the late twentieth century. The increased take-up of alternative and complementary therapies shows that many people are prepared to look beyond the dominant biomedical model.
The sick role itself is also far more complex and unclear than Parsons’s model suggests. People who develop symptoms may avoid visiting the doctor, sometimes for years, and live without a diagnosis or playing a sick role, but they are clearly still ill. Additionally, the sick role model does not take account of misdiagnoses and medical errors and negligence. Perhaps more seriously, as the disease burden has swung away from acute illness towards chronic conditions such as diabetes and arthritis, there is no universal set of role expectations for people living with such conditions, the impacts of which are many and varied. Hence the concept of a sick role may be less helpful today than was once the case.
Continuing Relevance
Parsons’s concept of the sick role is often thought to be less useful in today’s age of healthcare consumers, who are more knowledgeable and reflexive than the more deferential recipients of the 1950s. However, Turner (2009) argues that most societies do develop sick roles, but these differ. In many Western societies, for example, there exists an individualized sick role, which means that hospital stays for non-life-threatening conditions are generally quite short, visiting hours are limited and the number of visitors is strictly controlled. However, in Japan, a more communal sick role is the norm. Patients tend to stay in hospital longer after their medical treatment is completed and the average hospital stay is much longer than in Western societies. Hospital visits are also more informal, with family and friends often eating together and staying for longer periods. Turner suggests that we can still learn much about the social bases of health from such a comparative sociology of sick roles.
The sick role may appear simple and obvious, but, as Glenton (2003) argues, some people struggle to achieve it and, by not being able to do so, become more rather than less dependent on doctors. In her study of back pain sufferers, many express the fear that they are not believed, that they are seen as malingerers or hypochondriacs, or that they have a form of mental illness. Essentially their status as ‘patient’ is undermined by the problems of presenting their illness adequately for medical diagnosis, which can lead to delegitimation. Glenton interprets this problem as a failure to achieve the sick role. As such, this shows that Parsons’s description still pertains for medics and patients and provides useful evidence that, in spite of the common assumption to the contrary, chronic conditions are not beyond the reach of his original thesis.
References and Further Reading
Freidson, E. (1970) Profession of Medicine: A Study of the Sociology of Applied Knowledge (New York: Dodd, Mead).
Glenton, C. (2003) ‘Chronic Back Pain Sufferers: Striving for the Sick Role’, Social Science and Medicine, 57(11): 2243–52.
Parsons, T. (1952) The Social System (London: Tavistock).
Shilling, C. (2002) ‘Culture, the “Sick Role” and the Consumption of Health’, British Journal of Sociology, 53(4): 621–38.
Turner, B. S. (2009) Medical Power and Social Knowledge (2nd edn, Thousand Oaks, CA: Sage), esp. chapter 3.
White, K. (2009) An Introduction to the Sociology of Health and Illness (London: Sage), esp. chapter 6.
Social Model of Disability
Working Definition
An approach which locates the ‘cause’ of the disadvantages associated with disability within society and its organization rather than within the individual person.
Origins of the Concept
Until very recently, Western societies contained a dominant individualistic model of disability. This model suggested that individual limitations or ‘disabilities’ are the main cause of the problems experienced by disabled people in finding work, moving around, and becoming full citizens in society. In the individual model of disability, bodily ‘abnormality’ is seen as causing some degree of ‘disability’ or functional limitation. Medical specialists play a central role in the individual model because it is their job to offer curative and rehabilitative diagnoses to disabled people. For this reason the individual model is often described as a ‘medical model’. This model of disability was challenged by activists from within an emergent disabled people’s movement from the 1970s onwards.
In late 1960s America and Britain, an alternative perspective was developed which rejected the dominant model and saw disability as a political rather than a medical issue. A new ‘social model’ of disability emerged which separated impairments (individual problems such as loss of a limb) from disability (disadvantages caused by organizations not making provision for people with such impairments). The social model has been the subject of much research and development since then and has strongly influenced equal rights legislation aimed at forcing organizations to make ‘reasonable provision’ for disabled people. However, in more recent years there has been criticism that the social model needs to be amended to take account of the actual experience of disability.
Meaning and Interpretation
In the UK, the Union of Physically Impaired against Segregation (UPIAS) adopted, in its 1976 manifesto, a radical definition of disability based on the separation of impairment and disability. UPIAS accepted the definition of physical ‘impairment’ as a biomedical property of individuals, extending it to include non-physical, sensory and intellectual forms of impairment. Disability, though, was understood no longer as the problem of individuals but in terms of the social barriers that people with impairments face in order to participate fully in society. Disability was therefore a denial of full citizenship and a form of discrimination.
Mike Oliver (1983) was the first theorist to make explicit the differences between the individual and the social models of disability, and the social model soon became the focus of disability activism and academic studies. The social model provided a coherent explanation of why the social, cultural or historical barriers against disabled people have come about. Historically, many barriers were erected against disabled people’s full participation in society, especially during the Industrial Revolution, when they were effectively excluded from the labour market as capitalist factories began to base employment on individual waged labour. Many disabled people were unable to find or retain jobs, and the state’s response was harsh deterrence and institutionalization. Indeed, even today, disabled people’s presence in the workforce remains disproportionately small.
The social model has been enormously influential in shaping the way that we think about disability today. Although it originated in the UK, the social model has gained global influence. In focusing on the removal of social barriers to full participation, it allows disabled people to concentrate on political strategy. This has led some to argue that, in accepting the social model, disabled people have formed ‘a new social movement’. In replacing the individual model, which identifies the ‘invalidity’ of the individual as the cause of disability, with a model in which disability is the result of oppression, the social model has been seen by many disabled people as ‘liberating’.
Critical Points
Since the late 1980s, several lines of criticism have been developed against the social model. Some see that it pays no attention to the often painful or uncomfortable experiences of impairment, which are central to many disabled people’s lives. Shakespeare and Watson (2002) state: ‘We are not just disabled people, we are also people with impairments, and to pretend otherwise is to ignore a major part of our biographies.’ Against this, advocates of the social model maintain that, rather than denying everyday experiences of impairment, the social model merely seeks to focus attention on the social barriers to full participation in society.
Medical sociologists often reject the social model, arguing that the division between impairment and disability, on which it rests, is false. These critics claim that the social model separates impairment, which is defined biomedically, from disability, which is defined socially. Medical sociologists see both disability and impairment as socially structured and closely interrelated. For instance, it is not easy to define where one ends and the other begins. Failure to design suitable wheelchair access to a building clearly creates a socially constructed disabling barrier to wheelchair users, but there are many more cases where it is impossible to remove all the sources of disability. Some argue that to be impaired by constant pain or significant intellectual limitation, for example, disables the individual from full participation in society in a way that cannot be removed by social changes. Hence, any full account of disability must also take into account disability caused by impairments, not just those caused by society.
Continuing Relevance
The social model was a radical move in both the academic study of disability and the political engagement of disabled people with the rest of society. And, despite the criticisms noted above, there do not seem to be any alternatives forthcoming to challenge it. The concept of disability itself has been transformed by the social model, and the sociology of disability was only possible after its introduction. The social model has shown, above all else, that disability is not something that can be left to the medical profession; it needs to be studied across all of the social sciences too.
The social model approach was adopted by Guo and her colleagues (2005) to examine some of the social barriers to Internet use in China. Using a survey method, the study sampled 122 people across twenty-five provinces. The survey found that only a minority of disabled people were Internet users, but for these individuals the Internet did increase the frequency and quality of their social interactions and helped to reduce social barriers. They were also able to interact with a much larger group of people than would be possible in the ‘real world’. However, the findings suggest that a clear digital divide is emerging among disabled people in China, with the majority currently unable to access the Internet. The social model suggests that solutions to this problem are to be found in the reorganization of existing social life and the reshaping of social policies.
References and Further Reading
Barnes, C., and Mercer, G. (2008) Disability (Cambridge: Polity), esp. chapters 1 and 2.
Gabel, S., and Peters, S. (2004) ‘Presage of a Paradigm Shift? Beyond the Social Model of Disability toward Resistance Theories of Disability’, Disability and Society, 19(6): 585–600.
Guo, B., Bricout, J., and Huang, J. (2005) ‘A Common Open Space or a Digital Divide? A Social Model Perspective on the Online Disability Community in China’, Disability and Society, 20(1): 49–66.
Oliver, M. (1983) Social Work with Disabled People (Basingstoke: Macmillan).
Sapey, B. (2004) ‘Disability and Social Exclusion in the Information Society’, in J. Swain et al. (eds), Disabling Barriers – Enabling Environments (London: Sage), pp. 273–9.
Shakespeare, T., and Watson, N. (2002) ‘The Social Model of Disability: An Outdated Ideology?’, Research in Social Science and Disability, 2: 9–28.
Social Self
Working Definition
The formation of self-awareness as the individual human organism reacts to the varied reactions of others towards it.
Origins of the Concept
It has often been said that human beings are the only creatures who know that they exist and that they will die. Sociologically, this means that human individuals have an awareness of self. George Herbert Mead’s (1934) ideas on how the self is created constitute one of the most influential and genuinely sociological theories of self-formation. Mead insisted that a sociological perspective is necessary if we are to understand how the self emerges and develops, and his ideas formed the main basis for the symbolic interactionist tradition in sociology. He argued that, although the self, once created, amounts to the ability to ‘think things through’, it is an embodied self which resides within a real human individual and, unlike similar concepts such as the ‘soul’ or ‘spirit’, cannot be conceived without this.
Meaning and Interpretation
Mead’s theory aims to understand how infants begin to develop a sense of themselves as social beings through imitation and play. Young children can be observed mimicking the actions of parents and other children – holding pretend tea parties, digging in plant pots or vacuuming carpets with toy cleaners – having seen adults do similar things. This is the start of the self-formation process. When they move on to playing games, around the age of four or five, the next stage begins. Engaging in play means children have to start to take on aspects of social roles rather than simply mimicking what they see. Mead calls this ‘taking the role of the other’, which demands that children see their play from the standpoint of other people; it is at this point that a social self starts to emerge. In taking the role of others and effectively seeing themselves ‘from the outside’, as it were, they also begin to grasp that they are separate people from those others.
Mead’s theory is based on the idea of a two-part self: an ‘I’ and a ‘me’. The ‘I’ represents the human organism, the unsocialized element of the self. The ‘me’ develops through social interactions, beginning with imitation and play, as discussed above. The social ‘me’ starts to form at the age of around eight or nine, when play moves on into more organized games with numerous players. To learn organized games, children must understand not just the rules of the game but their place within it, along with the other roles that exist in the game. Children start to see themselves as if from the outside and, rather than adopting a single role, take on the role of a ‘generalized other’. It therefore becomes possible for individuals to develop self-consciousness through an ‘internal conversation’ between the individual, organismic ‘I’ and the socially generated ‘me’. And it is this internal conversation that we ordinarily refer to as ‘thinking’, a way of ‘talking to ourselves’, as it were. Developing a sense of self is the bedrock on which quite complex personal and social identities are constructed.
Critical Points
One criticism of Mead’s thesis is that it presents the process of self-formation as relatively unproblematic, whereas others have suggested that the process is full of conflict and emotional turmoil and can leave scars which last a lifetime. This is particularly the case in early socialization, when children acquire their sense of gender identity. Sigmund Freud and later Freudians argue that unconscious thoughts and feelings play a much more important role in self-formation and gender identity than Mead’s theory allows for. The process through which boys and girls break their intimate ties with parents can be traumatic for many. Even where the process is relatively smooth, it can leave boys with difficulty in forming personal relationships as they grow up. Self-formation is difficult and involves the repression of unconscious desires, an aspect that is absent from Mead’s thesis. Others argue that Mead has little to say about the effects of unbalanced parental power relationships on the socialization of children, which can lead to selves that do not function well and are riven with internal tension and contradictions.
Continuing Relevance
Mead’s theory was very important for the development of sociology. It was the first genuinely sociological theory of self-formation, which insisted that, if we are properly to understand ourselves, we must start with the social process of human interaction. In this way he showed that the self is not an innate part of our biology, nor does it emerge simply with the developing human brain. What Mead demonstrated is that the study of the individual self cannot be divorced from the study of society, and that requires a sociological perspective.
We may perceive ourselves as individuals, but what happens to our individual selves within intimate relationships and how does their breakdown affect the self? This is explored in an article that looks at the breakdown of romantic relationships and its impact on people’s self-concept or sense of ‘me’ (Slotter et al. 2009). In strongly committed romantic relationships, people’s selves become intertwined and less clearly defined, evidenced in the routine use of terms such as ‘we’, ‘our’ and ‘us’. The ending of such relationships often results in distress and sadness, but it can also lead to changes in the content and structure of the self as individuals reorganize and reshape their lives. This study shows that many people subjectively perceive post-breakup confusion about their self and feel the self to be smaller. As both Mead and Norbert Elias argue, our experience of individuality actually belies the fact that the self is inevitably a social self that is shaped in interactions and relationships.
Sociologists have discussed the radical social changes of recent decades, including globalization, the spread of information technology, mass migration, travel and the compression of time and space, and restructured gender relations, to name a few. We would expect such changes to have an impact on people’s sense of self, and Adams (2007) brings together accounts of macro-social change and theories of shifting forms of self-identity. For instance, some theorists suggest that, as class identification diminishes, people’s individual selves are effectively cut adrift and become more vulnerable to uncertainty and anomie. Yet others see this shift as offering the possibility for a more reflexive form of social self that is capable of taking advantage of newly available freedoms. Adams helps us to understand recent theories of large-scale social change and their impact on self-formation.
References and Further Reading
Adams, M. (2007) Self and Social Change (London: Sage).
Burkitt, I. (2008) Social Selves: Theories of Self and Society (2nd edn, London: Sage).
Mead, G. H. (1934) Mind, Self and Society, ed. C. W. Morris (Chicago: University of Chicago Press).
Slotter, E. B., Gardner, W. L., and Finkel, E. J. (2009) ‘Who am I without You? The Influence of Romantic Break-Up on the Self Concept’, Personality and Social Psychology Bulletin, 36(2): 147–60.
Stigma
Working Definition
Physical or social characteristics that are identified as demeaning or are socially disapproved of, bringing opprobrium, social distance or discrimination.
Origins of the Concept
Sociological studies of stigma and processes of stigmatization have been conducted largely within the symbolic interactionist tradition from the 1960s onwards. Some early work, such as that of Goffman ([1963] 1990), theorized how stigmatizing processes work to produce discrimination and also investigated how the stigmatized person responds. For Goffman, there are some important differences depending on the type of stigma, which governs the extent to which people can manage their self-identity and protect their sense of self. Another source of ideas on stigma came from the disabled people’s movement. An important early challenge to the individual model of disability was Paul Hunt’s Stigma: The Experience of Disability (1966). Hunt argued that, rather than disabled people’s problems being seen as arising from their impairments, it was interactions between disabled people and able-bodied people that led to the stigmatizing of disability. In more recent times the concept has been successfully used to explore the situation of people with HIV/AIDS and other health-related conditions.
Meaning and Interpretation
The most successful and systematic account of the production of stigma is that of Erving Goffman. Goffman’s work is an excellent example of the close linkage between social identity and embodiment, as he shows how some physical aspects of a person’s body can present problems once these have been categorized by others as sources of stigma. He shows, for example, how disabled people can be stigmatized on the basis of readily observable physical impairments. Nonetheless, not all sources of stigma are physical, as stigma can reside in biographical features, character ‘flaws’ or personal relationships.
Stigma can take many forms. Physical stigma, such as a visible impairment, can often be hard or impossible to hide from others, and Goffman argues this can make the management of identities more difficult. Where this is the case, we can refer to a ‘discredited’ stigma – one that has to be acknowledged in interactions. Biographical stigma, such as a previous criminal conviction, can be easier to hide from others, and in this case we can speak of a ‘discrediting’ stigma – one that may lead to stigmatizing should it become more widely known. Managing this type may be somewhat easier, but it does still have to be continually controlled. A character stigma, such as associating with drug users, may also be a discrediting stigma, but it may turn into a discredited stigma if the person is observed with the wrong crowd. Note that Goffman is not suggesting people should hide stigma; he is just trying to make sense of how the process of stigmatization works in the real world and how people use strategies to avoid becoming stigmatized.
Goffman argued that stigma is a social relationship of devaluation in which one individual is disqualified from full social acceptance by others. Stigmatization often appears in a medical context as people become ill and their identity is changed – sometimes temporarily, but at other times, such as with chronic illnesses, permanently. Goffman argued that inherent in the process of stigmatization is social control. Stigmatizing groups is one way in which society at large controls their behaviour. In some cases, the stigma is never removed and the person is never fully accepted into society. This was true of many early AIDS patients and it continues in some countries.
Homosexuality has long been stigmatized in many countries around the world, and since the 1960s the hatred of gay men and lesbians has been described as homophobia. This may take the form of derogatory language and name-calling but also outright violence. In 2016, a gunman targeted gay men in a nightclub in Orlando, Florida, killing forty-nine people and injuring fifty-three others – at that time the worst mass shooting in US history. One of the key settings for homophobic abuse has long been schools, in which terms such as ‘poof’, ‘sissy’, ‘queer’ and many more have been and continue to be common currency in the playground. Given that childhood is crucial in the formation of the social self, homophobic abuse in schools has been seen as a key aspect in the reproduction of ‘heterosexism’ in society. Sarah Nettleton (2013) notes that, because AIDS was first found among gay men in the USA, it was originally called GRID – Gay Related Immune Deficiency – and it was suggested that a ‘fast-lane’ gay lifestyle actually caused the disease, which was often referred to in the media as a ‘gay plague’. Although this was false, epidemiological interpretations of gay men as part of ‘high-risk groups’ tended to reinforce the division between such groups and the ‘heterosexual general public’.
Critical Points
One of the deficiencies with studies of stigma is the relative lack of interest in resistance to stigmatizing processes. At the individual level, people may simply refuse to accept the stigmatizing label, though in isolation they are not very likely to be successful. However, collective forms of resistance can be very significant in challenging stigma. Disabled people’s movements and gay and lesbian movements challenged mainstream interpretations of their discredited and discrediting stigmas, often by protests and direct action campaigns. Highly visible symbolic protests and the head-on tackling of discriminatory language and labelling generated pressure for change and new equal rights legislation and helped to shift attitudes in society. Stigmatizing processes are perhaps more open to change than the earlier theories allowed for.
Continuing Relevance
The concept of stigma continues to be useful. Research into self-injurious behaviour, for example, shows how those who engage in practices of self-harm are keenly aware of the possible stigmatizing of their behaviour, choosing the body sites that are most easily hidden from view in public situations in order to avoid their discrediting stigma becoming discredited. Similarly, studies of eating disorders such as anorexia nervosa show that people go to great lengths to try and keep their behaviour hidden in order to manage their presentation of self, and thus their identity, rather than losing control over it to others and in the process facing the imposition of social stigma.
The continuing relevance of the concept of stigma is clear in the study of sexual promiscuity labels and AIDS in Thailand by Kit Yee Chan (2009) and her colleagues. This research used a mixed-methods approach to explore the perceptions of nurses in Bangkok towards the risk of being accidentally exposed to HIV in their work roles. The authors found that nurses’ fear of HIV was rooted mainly in the social ostracism they associated with being HIV-positive rather than in the medical consequences of infection. Although the nurses were well aware that the probability of actual infection at work was very low, they still had a fear which was sustained by what they perceived to be the social consequences of HIV. This social fear was reinforced by their observation at close hand of the stigma attached to their patients.
Goffman argued that stigma can accrue from almost any aspect of people’s lives. Caroline Howarth (2006) looked at how conceptualizing ‘race’ as a social stigma may help us to understand the process of stigmatizing ‘race’ but also how communities can contest and change the processes that lead to discrimination. Drawing on material from three qualitative studies, Howarth argues that, as the stigma attached to ‘race’ cannot be hidden or disguised, resistance and attempts to overthrow the stigmatizing regime have to be collaborative. Her article describes various examples of this in schools and church groups which aim to provide ‘social psychological spaces’ in which the operation of stigma can be challenged.
References and Further Reading
Chan, K. Y., Rungpueng, A., and Reidpath, D. (2009) ‘AIDS and the Stigma of Sexual Promiscuity: Thai Nurses’ Risk Perceptions of Occupational Exposure to HIV’, Culture, Health and Sexuality, 11(4): 353–68.
Goffman, E. ([1963] 1990) Stigma: Notes on the Management of Spoiled Identity (London: Penguin), esp. chapters 1 and 2.
Green, G. (2009) The End of Stigma: Changes in the Experience of Long-Term Illness (London: Routledge), esp. chapters 1 and 2.
Howarth, C. (2006) ‘Race as Stigma: Positioning the Stigmatized as Agents, Not Objects’, Journal of Community and Applied Social Psychology, 16(6): 442–51.
Hunt, P. (1966) Stigma: The Experience of Disability (London: Chapman).
Nettleton, S. (2013) The Sociology of Health and Illness (3rd edn, Cambridge: Polity).
10
Political Sociology
Authority
Working Definition
The legitimate power which one person or group holds over another.
Origins of the Concept
Max Weber’s ([1925] 1979) political sociology is the starting point for most studies of power, politics and authority. Weber saw power as the ability of people or groups to get their own way, even against opposition, but people can be said to be in positions of authority only when they are able to issue commands and have a reasonable expectation that those commands will be carried out. Authority therefore rests on the belief among those receiving commands that the person giving them is doing so legitimately. That is, their position is accepted as authoritative. Authority can be seen in operation in adult–child relations; within families, where the head of household makes decisions; within organizations, where managers are seen as having the right to give orders; in the armed forces, where a strict system of rank and authority is in place; and in politics, where governments introduce laws which they expect to be obeyed.
Meaning and Interpretation
Weber argued that systems of authority differ across societies and also over time. He distinguished three types of authority in history: traditional, charismatic and rational-legal. However, all three are ideal types – heuristic tools devised to assist researchers as they approach real-world phenomena. And though Weber’s scheme may appear chronological – from traditional to charismatic to rational-legal – any of the three types could become dominant, and it is more usual for two or three to exist at the same time.
Traditional authority is power that is legitimized through respect for long-established cultural patterns transmitted over generations. In this system, people obey commands on the basis of the traditional status of rulers. The legitimacy of traditional authorities comes from the knowledge and acceptance that this is the way things have been organized in the past. Weber gives the example of hereditary family rule of nobles in medieval Europe, echoes of which continue in aristocratic and royal families. In traditional authority, people’s allegiance is to particular individuals and not to the rules they put in place. In practice, this means that people obey rulers, not rules, and feel they owe them personal fidelity.
Charismatic authority tends to disrupt traditional forms and has been the source of innovation and change in history. It is based on the devotion felt by subordinates towards a leader who is believed to possess exceptional qualities. The concept of charisma has proved difficult to pin down, though, as it is unclear whether these special qualities actually inhere in the personality of the leader or whether they exist only in the perception of followers. Historical examples include Jesus Christ, Adolf Hitler and Mahatma Gandhi, though heroic soldiers, ‘saintly’ individuals and political leaders have all been described as ‘charismatic’. One thing all charismatic leaders must do is to provide occasional ‘proof’ of their special qualities, and if such proof is not forthcoming the charismatic person may come under challenge. Weber saw that this made charismatic authority essentially unstable, reinforced by the fact that, when the leader dies, a crisis of belief and legitimacy is likely to follow. When charismatic systems begin to take on a more routinized form, they tend to be transformed into traditional or legal-rational systems.
With the emergence of capitalism, Weber saw traditional authority giving way to a new form of legal-rational authority. This is power that is legitimized through legally enacted rules and regulations and combines a belief in the law with formal rationality in decision-making. It is found in modern organizations and bureaucracies and in democratic systems of government that direct the political life of a society. Rational-legal authority can only be exercised when decisions and commands have been arrived at through ‘due’ process, not according to tradition or individual whim. Bureaucracy is the typical form of legal-rational authority.
Critical Points
One longstanding criticism of Weber’s typology is that, although he identified four types of social action, there are only three systems of authority. The ‘missing’ category appears to be value-rational authority, where legitimacy rests on the absolute value attached to a set of norms. Essentially this is an ideological form of authority in which legitimacy is given to leaders on the basis of their pursuit of a goal or end. This fourth logical type rests on obedience to the ideological goal rather than on individuals, and commands issued are legitimized to the extent that they relate to the ultimate goal. Examples would include strongly ‘ideological’ systems such as religious organizations or early Soviet communism.
In recent years sociologists have discussed the emergence of a celebrity culture which glorifies individuals on the basis of their media presence rather than their achievements. This culture has also impacted on political life, and leading politicians now tend to be evaluated on their personalities as presented in the mass media. Some sociologists have suggested that this undermines or short-circuits legal-rational democratic processes and presents a threat to democratic values. Neil Postman (1986), for example, warned that politics was in danger of becoming a mere adjunct of show business.
Continuing Relevance
Weber’s classification allows for mixtures of the three types to coexist, even though one may be dominant. For example, modern Britain has a system of legal-rational authority, but in political life the House of Lords plays a part in government and the monarch still has a constitutional place. This mixing of ideal types gives Weber’s scheme flexibility, and it continues to be useful for political sociologists. However, the spread of celebrity culture into the world of politics has raised some questions about the basis of a political leader’s authority. It is commonplace today for politicians to manage their public image and for political parties to court popular celebrities such as pop stars, actors and sportspeople. Similarly, in the USA, two former actors, Ronald Reagan and Arnold Schwarzenegger, became president and a state governor respectively. This encroachment of celebrity into political life is often seen as self-evidently pernicious.
However, Street (2004) argues not only that celebrity politics can be traced back to at least the eighteenth century but also that the advent of the celebrity politician is not incompatible with the authority of representative democracy. Indeed, rather than being at odds with the principles of democratic representation, celebrity politics can be seen as an extension of them. ‘Representativeness’ is not a concept that is restricted to party manifestos and policy proposals; it also includes the style, aesthetics and attractiveness of politicians. All of these elements help to forge identification between politicians and those they claim to represent. And it is through political style and appearance that politicians communicate their relationship to voters and their future plans, reducing complex political arguments to a form with which citizens can identify.
Political scientists have often seen small political parties as relying more on a charismatic leader to help bridge the resource gap with the major parties. But do charismatic leaders really carry the authority to help small parties win votes? Van der Brug and Mughan (2007) bring empirical evidence from Dutch elections to bear on this issue. They analysed three elections, looking at the performance of right-wing populist parties, and concluded that the influence of their leaders was essentially no greater than that of the leaders of the larger established parties. The study also rejects the notion that those who vote for right-wing parties are motivated primarily by a vague sense of dissatisfaction rather than genuine support for the policies promoted by party leaders. Right-wing voters, they suggest, weigh the same kinds of considerations as all other voters, and their choices are no less ‘rational’ and no more swayed by charismatic forms of authority.
References and Further Reading
Morrison, K. (2006) Marx, Durkheim, Weber: Formations of Modern Social Thought (2nd edn, London: Sage), esp. pp. 361–73.
Postman, N. (1986) Amusing Ourselves to Death: Public Discourse in the Age of Show Business (London: Heinemann).
Street, J. (2004) ‘In Defence of Celebrity Politics: Popular Culture and Political Representation’, British Journal of Politics and International Relations, 6: 435–52.
Van der Brug, W., and Mughan, A. (2007) ‘Charisma, Leader Effects and Support for Right-Wing Populist Parties’, Party Politics, 13(1): 29–51.
Weber, M. ([1925] 1979) Economy and Society: An Outline of Interpretive Sociology (Berkeley: University of California Press).
Citizenship
Working Definition
A status accorded to individuals within a specific nation or political community which carries with it certain rights and responsibilities.
Origins of the Concept
The concept of citizenship originated in the city states of ancient Greece, where the status of ‘citizen’ was afforded to some of those living within the city boundary. In that sense, citizenship was a symbol of social status. In many societies before the modern period, the monarch or emperor ruled over a mass of people who had no proper means of being part of a system of governing. Indeed, in many societies with low levels of literacy, the bulk of the population had very little knowledge of government and politics at all. The idea that ordinary people could have individual rights or take part in political decision-making was quite alien, as such privileges were restricted to high-status members of the society. Today, people are considered to be and see themselves as citizens, usually part of a national community, with the rights and responsibilities that go with that status. Marshall ([1950] 1973) saw citizenship as emerging alongside industrialization and traced the evolution of citizenship in Britain (specifically England) from eighteenth-century civil rights, through nineteenth-century political rights, to twentieth-century social rights.
Meaning and Interpretation
In the modern world, citizenship is a social status granted to members of nation states on the basis of residence. Citizenship therefore grants certain privileges, though these are balanced by duties which citizens are expected to accept. For example, citizens have the right to expect the state to protect them, but the state also expects citizens to act reasonably and not to take up arms against other citizens or government. The concept of citizenship has been divided into different types, with each new form building on the previous type.
Civil citizenship emerged with modern property ownership, as this imposed certain mutual obligations on people to respect one another’s right to property, leading to a mutual responsibility for the maintenance of social order. Political rights were restricted to property owners, and large numbers of people were left outside formal politics. In a second stage, political citizenship involved the gradual extension of voting rights to working-class groups and women, and certain rights of free association were introduced, such as those allowing the formation of trade unions, while ideas of free speech also emerged. The third stage, social citizenship, saw citizenship rights extended to social welfare and a shared responsibility for collective provision of welfare and other benefits. People were expected to contribute to the social fund used for supporting the vulnerable and, as a result, enjoyed the right to a share of the welfare safety net when they needed it.
In recent years, some have argued that we are moving into a fourth stage, described as environmental citizenship. In this stage, citizens have new rights to expect a clean, safe environment but also a new duty not to pollute the human or natural environment. A more radical version of ‘ecological citizenship’ envisages extending the protections embedded in the human rights of citizenship to some animals. Ecological citizenship would involve new obligations to non-human animals, to future generations of human beings, and to maintaining the integrity of the natural environment. Obligations to future generations also mean working towards sustainability over a long time period. In essence, ecological or environmental citizenship introduces a new demand for people to take account of the human ‘ecological footprint’ – the impact of human activity on the natural environment and natural processes.
Critical Points
Marshall’s conception of citizenship is problematic as it is based on the experience of one nation state, Britain. In France, Germany and other countries, citizenship did not ‘evolve’ in the way he describes. Some have also seen his approach as simply a post hoc description – this is what happened – rather than being genuinely explanatory. Why were political rights granted to the working classes and women at a specific historical moment, for instance? Was this really just part of a natural ‘evolution’? Trade unions, for example, had to fight hard for an extension of the franchise, which other groups fought equally hard against. Similarly, even in Britain, the voting age for men and women did not reach parity until 1928, well into the twentieth century, which is much later than Marshall’s scheme allows for. In short, it is not clear exactly why civil rights had to lead to political rights, which then had to lead to social rights, and this process requires proper explanation.
The attempt by administrations in both the USA and the UK in the 1980s to cut government spending and ‘roll back the state’ shows that citizenship is never so firmly established that it cannot be reversed. The politics of austerity that followed the 2008 financial crisis also led many governments to cut back on public spending and to extend the principle of conditionality to more welfare benefits, thus changing the content of social citizenship rights. Recent globalization theories have challenged the basis of the nation-state-based model of citizenship. For instance, the European Union offers a regional form of citizenship which grants some rights, such as the right to travel and to work, which nation states have to respect. European citizens can also challenge legal decisions made at nation-state level at the regional European level. Cosmopolitan thinkers see the possible extension of citizenship to the global level, with individuals having the status of global citizen, though we are a very long way from this vision at present.
Continuing Relevance
Though there are some issues and challenges to the nation-state model of citizenship, the basic concept of citizenship as involving rights and duties remains sound. Indeed, some of the more recent political debate has involved a rethinking of how to enable citizens to become more active as a means of reinvigorating politics and community life. The continual pressure for the expansion of rights and responsibilities continues to inform our understanding of what citizenship is and should be.
Redley and Weinberg (2007) tackle the question of whether the liberal democratic model of citizenship is capable of integrating people with learning disabilities. Can this democratic model, which demands intellectual ability and independence as prerequisites, politically empower those with intellectual impairments? This ethnographic study explored what can be learned from a UK initiative, the Parliament for People with Learning Disabilities (PPLD). The PPLD embraced a clear liberal democratic preference for ‘self-advocacy’ by people with learning disabilities. However, the study found several practical interactional obstacles to the liberal democratic preference for self-advocacy. Some participants were just not audible, some spoke ‘inappropriately’ (that is, did not move the discussion forward) and others did not take the floor when invited to do so. While the authors support the basic principle of self-advocacy, they argue that this principle needs to be bolstered by a concern with care, security and well-being if full citizenship is to be realized for people with learning disabilities.
The citizenship experience of two generations of British-Pakistani Muslims is explored in Hussain and Bagguley’s (2005) qualitative research in the aftermath of the 2001 ‘riots’ in some northern English towns and cities. In particular, the authors maintain that citizenship is a form of identity as well as a set of entitlements and that the identity of being a citizen is not necessarily shared by all. First-generation migrants from Pakistan did not generally consider themselves to be British citizens but reported that they lived in Britain, which remained an essentially foreign country to them. However, second-generation British Pakistanis had a strong sense of themselves as British-born citizens with all the rights that identity confers. For this second generation, the electoral success and overt racist language of the far-right British National Party posed a direct threat to their status as British citizens as well as to their ethnic identity.
References and Further Reading
Bellamy, R. (2008) Citizenship: A Very Short Introduction (Oxford: Oxford University Press).
Dobson, A., and Bell, D. (eds) (2006) Environmental Citizenship (Cambridge, MA: MIT Press).
Hussain, Y., and Bagguley, P. (2005) ‘Citizenship, Ethnicity and Identity: British Pakistanis after the 2001 “Riots”’, Sociology, 39(3): 407–25.
Marshall, T. H. ([1950] 1973) Class, Citizenship and Social Development (Westport, CT: Greenwood Press).
Redley, M., and Weinberg, D. (2007) ‘Learning Disability and the Limits of Liberal Citizenship: Interactional Impediments to Political Empowerment’, Sociology of Health and Illness, 29(5): 767–86.
Civil Society
Working Definition
The sphere of society made up of all those networks, voluntary associations, businesses, clubs, organizations and families formed by citizens independently of government.
Origins of the Concept
The concept of civil society can be traced right back to ancient times, when it was tied to notions of civility and people treating one another with respect. However, modern conceptions of civil society draw from Alexis de Tocqueville’s nineteenth-century idea of ‘civic associations’, such as lodges, charities and religious groups, which he found in abundance in the USA. Tocqueville saw the existence of thousands of such associations not just as performing useful functions but as fundamental to sustaining the country’s democratic culture (Eberly 2000). For much of the twentieth century, sociologists and political theorists had little to say about civil society, but there has been a resurgence of interest since the 1980s. Of late, interest has shifted to cosmopolitan theories of a global civil society which, for the first time, offer the promise of an effective global form of citizenship.
Meaning and Interpretation
The concept of civil society is close to that of the public sphere. However, the latter is generally taken as all of those public spaces in which discussion and debate about society and its political decisions take place. In contrast, civil society consists of voluntary groups, clubs and other organized forms of civic association. However, there are many disagreements about what civil society entails. For some it does not include businesses, for others families are excluded, and yet others see three distinct realms: state, market and civil society.
There are also fundamental disagreements about the nature of civil society. For some, it represents a space for the expression of active citizenship and a democratic bulwark against authoritarianism. This view glosses over the distinct possibility that organizations and voluntary groups are, to some extent, in competition with one another (for resources and members) and that the relations between them may be much less cooperative than the more positive assessments suggest. In the Marxist tradition, civil society is even less of a progressive arena of voluntarism and creativity. Marx saw civil society, along with the rest of the cultural superstructure, as implicated in transmitting the ideological and cultural dominance of capitalism and its values. However, later neo-Marxists, especially Gramsci, acknowledged that such ideological domination was never complete and that civil society at least offered opportunities to build a counter-cultural challenge (Edwards 2014).
Reinvigoration of the concept of civil society in the late 1980s seems to have been stimulated by events in Eastern Europe and the collapse of Soviet-style communism. Strengthening civil society seemed a useful way of counterbalancing the power of states, and in recent years it has also been invoked as an effective means of peacemaking in places such as Northern Ireland, Kosovo and Afghanistan (Harris 2003: 2). Establishing inclusive voluntary associations and networks could help to build strong social foundations beyond the actions of governments.
The concept has been extended more recently by cosmopolitan thinkers whose research agenda has become established in the social sciences. Beck (2006) argues that the ideas of a universal citizenship and global civil society were historically the preserve of well-travelled and well-connected social elites who voluntarily chose to see themselves as ‘Europeans’ or ‘citizens of the world’. But, due to processes of globalization, this outlook now has much stronger roots in reality and is potentially more effective. As global communications and interactions become more common, a global civil society may be evolving. For example, campaigners against landmines, against tax avoidance by multinational companies and against fundamentalist terrorism are able to link up with sympathizers around the world in global networks that help to constitute a global civil society (Kaldor 2003).
Critical Points
Some studies assume that a strong civil society inevitably strengthens democracy and that the development of the two runs in tandem. However, this is not necessarily the case. Many voluntary organizations and clubs are far from democratic, and there is no reason to suppose that they should be. Promoting civil society as a panacea for democratic deficits in formal politics or as a counterbalance to authoritarian leadership may therefore be misguided. Some voluntary groups may enjoy high levels of social capital – such as the National Rifle Association in the USA – and have access to government, which gives them much more power than other groups to influence policy without having to run for election.
Not everyone agrees that civil society is in a state of rude health. Robert Putnam’s (2000) study of civic associations in America found much evidence that civic ties and membership of voluntary bodies were actually declining. He argued that parent–teacher associations, the National Federation of Women’s Clubs, the League of Women Voters and the Red Cross had all experienced membership declines of roughly 50 per cent since the 1960s. Fewer people reported that they socialized with neighbours or even trusted them. Similar, if less dramatic, results were also found in the UK and Australia, though Sweden, the Netherlands and Japan had stable or rising levels of social capital (social networks) (Halpern 2005). The picture is therefore mixed, but it does not augur well for ideas of a global civil society.
Cosmopolitan theories which perceive a global form of civil society emerging seem poorly supported by the evidence. So far, cosmopolitan mentality and practice seem to be restricted to Western activists and academics who hold a normative commitment to the project or to wealthy global tourists who are able to take full advantage of opportunities for international mobility. For most people, a commitment to the nation or the local community remains the dominant source of identification.
Continuing Relevance
In contrast to some of the more optimistic perspectives on the possibility of a future global civil society, the 2008 global financial crisis has led to some much less sanguine analyses. One example is Pianta’s (2013) paper on the prospects for a concerted response from within civil society. Noting the ‘democratic deficit’ in the EU, Pianta argues that the eurozone crisis has heightened awareness of this, as decisions are made and imposed on citizens without their proper involvement. On the other hand, there have been strong reactions across Europe from civil society actors, illustrating the potential strength of citizens’ groups. However, so far, these groups are not united in their approach and remain divided on how best to increase democratic participation.
It is often remarked that the spread of the Internet is a key factor in the constitution of an emerging global civil society, enabling global communications, debate and interaction. However, Naughton (2001) argues that the Internet may not be as unproblematic as it appears. Most studies assume that it is simply a resource to be used. But this is rather naïve. While the open source nature of the Internet is in line with the values of a global civil society, this radical openness is not inevitable, and there are governments and corporate interests that are pressing for change. The increasing web presence of corporate advertising in many subtle and not-so-subtle forms shows how the character of the Internet may be changing. The huge digital divide between the information-rich and information-poor countries is also a barrier to global communications. Naughton claims that, for too long, cyberspace has been seen as very different from the ‘real world’, but in fact the two are converging around essentially similar power struggles between civil society and corporate and government interests.
References and Further Reading
Beck, U. (2006) Cosmopolitan Vision (Cambridge: Polity).
Eberly, D. E. (ed.) (2000) The Essential Civil Society Reader (Lanham, MD: Rowman & Littlefield).
Edwards, M. (2014) Civil Society (3rd edn, Cambridge: Polity).
Halpern, D. (2005) Social Capital (Cambridge: Polity).
Harris, J. (ed.) (2003) Civil Society in British History: Ideas, Identities, Institutions (Oxford: Oxford University Press).
Kaldor, M. (2003) Global Civil Society: An Answer to War (Cambridge: Polity).
Naughton, J. (2001) ‘Contested Space: The Internet and Global Civil Society’, in H. A. M. Glasius and M. Kaldor (eds), Global Civil Society (London: Sage).
Pianta, M. (2013) ‘Democracy Lost: The Financial Crisis in Europe and the Role of Civil Society’, Journal of Civil Society, 9(2): 148–61.
Putnam, R. (2000) Bowling Alone: The Collapse and Revival of American Community (New York: Simon & Schuster).
Conflict
Working Definition
The struggle for supremacy between social groups, which involves tensions, divisions and competing interests.
Origins of the Concept
Conflict is as old as human society and, though today we may see it as unacceptable and something to be prevented, in broad historical terms, conflict and conquest have shaped the human world and led to the spread of humanity across the globe. Western colonial expansion was based on the naked exploitation of subject populations and natural resources, but, by creating new conflict relationships over a larger geographical spread, colonialism also promoted more global interconnectedness. For Georg Simmel, conflict is a form of human association in which people are brought into contact with one another and through which unity can be achieved. This is an important starting point because it helps us avoid the idea that conflict is the ending of relationships and interactions. Simmel’s point is that conflict forces parties to acknowledge each other even though the relationship may be antagonistic.
Sociological studies of conflict are often seen as forming a ‘conflict tradition’, though there appears to be little common theoretical ground apart from a general focus on clashes of interest between large social groups. Most studies have adopted either a Marxist or a Weberian approach to conflict, and a majority explore intra-society conflicts such as those centring on major inequalities, among them social class, gender and ethnicity. Conflict sociologies were popularized in the 1960s, partly as a reaction to the dominant structural functionalist paradigm and partly in response to increasing conflicts within and between societies at that time. Functionalism seemed better able to explain consensus and conformity than conflict, and many sociologists turned away from Parsons and Durkheim and moved towards Marx and Weber for inspiration. Today conflict theories are well established, and sociology is better equipped to understand and explain phenomena such as social movements, terrorism and war.
Meaning and Interpretation
Conflict is a very general term, covering everything from disputes between two individuals to international wars involving many states. In practice, sociology has concentrated on the structured social conflicts which are embedded within society rather than, say, wars between nation states, which were relatively neglected until quite recently. The quest for power and wealth, attempts to gain status, and social inequalities lead to the formation of distinct social groups with shared interests and identities that pursue those interests against others. Conflict theory therefore sees the potential for strife as always present.
The conflict perspective is one of the main traditions of inquiry in sociology and encompasses numerous theoretical approaches. Marxism, feminism, many Weberian perspectives and more – all use some version of conflict theory. Conflict theories investigate the importance of those social structures within society which produce chronic tensions and opposition that occasionally flare into violence. Some theories, such as Marxism, put structured class conflicts at the centre of society as the dynamic that drives forward social change. Simmel’s point is worth restating here, namely that, although they are in conflict, social classes are also embedded within relationships of mutual dependence. Under capitalism, workers depend on capitalists to provide them with the jobs and income they need to survive, but capitalists need workers to deliver the products and services that make profits.
Conflict theories are by no means all Marxist. Many conflict studies have been influenced more by the ideas of Max Weber, who saw much broader conflicts arising on more than a class basis. Conflicts can be based on political differences, status competition, gender divisions or ethnic hatred, all of which may be relatively unrelated to or independent of class. Patriarchal power works to the advantage of men and the disadvantage of women wherever they are situated in the class structure, though class position may well exacerbate the multiple problems faced by working-class women. Similarly, the episodes of genocidal violence by Hutus against Tutsis in Rwanda (1994) and by Serb armed forces against Bosniaks in Srebrenica (1995), as well as the mass murder committed by the German Nazi state against Jewish populations in Europe during the Second World War (1939–45), have been viewed primarily as arising from traditional ethnic rivalries and racist hatred rather than class conflict. None of this suggests that class is not important, of course – merely that the true importance of class, gender, ‘race’, ethnicity, and so on, can only be assessed in real-world research studies.
Critical Points
The difference between conflict and competition is sometimes elided in conflict theory. Social groups may be in competitive relationships over access to resources, but competition does not always lead to conflict. Unless competitive relations lead to actions aimed at achieving supremacy over an identified enemy, the competition may not develop any further. Similarly, is it correct to describe, say, class relations as class conflict? It may be possible to demonstrate that social class groups have some differing interests, but, unless these lead to attempts to establish supremacy over the class ‘enemy’, is there any real basis for theorizing class in conflict terms?
Over recent decades, there has also been a move towards analysing peace processes rather than simply conflict situations. Sociologists have started to apply themselves to the study of dispute resolution, reconciliation processes and peacekeeping efforts, and this growing body of work may well take conflict theories in different directions.
Continuing Relevance
Conflict theory and studies of conflict in sociology have never been more numerous. Research into ‘civilizational’ clashes, anti-capitalist protests, the ‘new terrorism’, ‘new wars’, genocide, hate crimes and much more has expanded over the last thirty years, and sociologists have had to use their conceptual and theoretical tools to analyse these new episodes of serious conflict. As globalization processes have gathered pace, and following the end of the Cold War, new conflicts have emerged.
An up-to-date account of the scholarship in the field of conflict and its resolution can be found in Bercovitch, Kremenyuk and Zartman’s (2009) edited collection. The authors remind us that the historical evidence shows conflict to be ‘normal, ubiquitous and inevitable … an inherent feature of human existence’ (2009: 3). It is important to be realistic about this fact. However, what should be possible is the management and/or control of the violent expression of conflict, and this has become the focus of recent academic research. Given the multiple dimensions of human conflict, including political issues, personal motivations and a shifting international context, it is not surprising that the analysis of conflict resolution is a multidisciplinary endeavour, and the collection contains numerous examples.
Nonetheless, a thoroughly sociological contribution is John Brewer’s (2010) theoretical account of peace processes and their likelihood of success – a previously neglected issue. Brewer identifies three basic types of peace process after a violent conflict has subsided: conquest, cartography and compromise. Broadly, the conquest situation exists after wars between nation states or in civil and colonial wars; the cartography situation is when peace is achieved mainly by geographical separation; and compromise covers situations in which previous combatants have to negotiate to end violence and agree a reasonable settlement. However, which of these processes is possible depends both on the extent of shared nationality, values and norms, and on the degree to which participants retain or lose their historical and cultural capital. Brewer’s scheme aims to bring a better sense of what is realistic and achievable in specific post-conflict situations.
References and Further Reading
Bercovitch, J., Kremenyuk, V., and Zartman, I. W. (2009) ‘Introduction: The Nature of Conflict and Conflict Resolution’, in J. Bercovitch, V. Kremenyuk and I. W. Zartman (eds), The Sage Handbook of Conflict Resolution (London: Sage).
Brewer, J. (2010) Peace Processes: A Sociological Approach (Cambridge: Polity).
Joseph, J. (2003) Social Theory: Conflict, Cohesion and Consent (Edinburgh: Edinburgh University Press).
Democracy
Working Definition
A political system providing for the participation of citizens in political decision-making, either directly or through the election of political representatives.
Origins of the Concept
The concept of democracy comes from the Greek demokratia, bringing together demos (‘the people’) and kratos (‘rule’ or ‘power’). The radical nature of this concept is clear; it suggests that societies should be ruled by ‘the people’ themselves, rather than by emperors, monarchs or unelected dictators. However, although a direct type of mass democratic participation was practised in ancient Greece, important decisions of governance were taken by a much smaller group of ‘citizens’ with special rights not afforded to the rest of the population. Democratic rule has also taken differing forms at varying times and in different societies, not least because what is meant by ‘the people’ has changed over time and location. At various times, the concept of ‘the people’ has been restricted to adult men, just to those who owned property, and to male and female adults – but only those beyond a certain age. Representative democracy – in which people elect representatives to act on their behalf – has become the normal method of achieving ‘rule by the people’. With the ending of Eastern European communism in the 1990s, representative forms of ‘liberal’ democracy have been seen as the dominant model around the world.
Meaning and Interpretation
Democracy is generally seen as the political system most able to ensure political equality, protect liberty and freedom, defend the common interest, meet citizens’ needs, promote moral self-development, and enable effective decision-making which takes everyone’s interests into account (Held 2006). Representative democracy is a political system in which decisions affecting a community are taken not by its members directly, but by those they have elected. In national governments, representative democracy takes the form of elections to congresses, parliaments or similar national bodies. Representative democracy also exists at other levels, such as in provinces or states within an overall national community, cities, counties, boroughs and other regions. Countries in which voters can choose between two or more parties and in which the mass of the adult population has the right to vote are usually called ‘liberal’ democracies and include Britain, the USA, Japan and Australia.
Since the early 1980s, a number of countries in Latin America, such as Chile, Bolivia and Argentina, have undergone the transition from authoritarian military rule to democracy. Similarly, following the collapse of the communist bloc in 1989, many Eastern European states – Russia, Poland and Czechoslovakia, for example – have become democratic. And, in Africa, a number of previously undemocratic nations – among them Benin, Ghana, Mozambique and South Africa – have come to embrace democratic ideals. Democracy is no longer concentrated primarily in Western countries but is now endorsed, at least in principle, as the desired form of government in many areas of the world.
One reason for this may be that other political systems have simply failed. In that respect, perhaps democracy has shown it meets the needs of the mass of people better than other systems. However, although some have made this argument, it seems likely that globalizing processes have played an important role in spreading democracy around the world. Increasing cross-national contacts have invigorated democratic movements in many countries, while a global media and advances in information and communications technology have exposed people in non-democratic states to democratic ideals, increasing internal pressure on political elites.
More importantly, global media and instant communications spread news of democratic revolutions and mobilizations. News of the revolution in Poland in 1989 travelled rapidly to Hungary, providing pro-democracy activists there with a useful, regionally appropriate model for their own protests, while the so-called Arab Spring in 2011 saw a wave of demonstrations and protests that forced out leaders in Tunisia, Egypt, Libya and Yemen, as well as leading to a destructive civil war in Syria. International organizations such as the United Nations and the European Union play an increasingly important role in global politics and have put external pressure on non-democratic states to change.
Critical Points
The dominance of representative democracy is not absolute. Aspects of participatory democracy play a part in democracies even today. Small communities in New England, USA, still hold annual ‘town meetings’, for example, while referenda in many countries may be gaining in popularity. Direct consultation of this kind is feasible where a specific issue can be put to the electorate with just one or two questions to be answered. Referenda are regularly used at the national level in some European countries to inform important policy decisions, such as whether national governments should sign up to a new European Constitution. They have also been used to decide contentious issues of secession in ethnic nationalist regions such as Quebec, the predominantly French-speaking province of Canada, and, as in the UK’s referendum in 2016, to give the public a say on remaining in or leaving the EU.
The general trend towards democracy should not be seen as inevitable. In Poland, the Czech Republic and Hungary, liberal democracy seems to be taking a firm hold. But in other countries, such as the former Central Asian Republics of the Soviet Union, Yugoslavia and even in Russia itself, democracy is still fragile. Another reason not to assume democracy has ‘won’ is that, almost everywhere, established democracies are facing internal problems. In Britain, for instance, the numbers who vote in European, general and local elections have declined considerably since the early 1990s. A perception that political elites do not properly represent the people’s interests – particularly evident during the expenses scandal of 2009 – has led to a loss of trust in politicians and formal democratic politics. There is also evidence that people may be turning to less formal ways of ‘doing politics’, such as forming social movements or voluntary groups to campaign on specific issues.
Continuing Relevance
Francis Fukuyama ([1992] 2006) once argued that the ideological battles of earlier eras are over and we stand at ‘the end of history’. No one defends monarchism, fascism or communism any more; capitalism has won the struggle with socialism, and liberal democracy is the unchallenged victor. Certainly recent evidence supports this contention. However, cosmopolitan thinkers now argue that national democracies are no longer able to handle the demands of global processes.
Cosmopolitan democracy is seen by many advocates as an ambitious project of post-national politics. However, Calhoun (2007) argues that not only is this project rather premature but it may also be positively dangerous. It is premature because, since the early 1990s, a series of violent conflicts, episodes of genocide (including within Europe), terrorism and responses to it, and international economic recession have shown that cosmopolitanism remains an illusory dream. It is also a dream that has accompanied modernity from its inception, and it may well be bound up with nationalism rather than being its direct opposite. More than this, nationalism is a key source of identification for large numbers of people and many liberation movements and is by no means inherently dangerous. Indeed, national identification remains a vital force in the struggle for democracy, social integration and citizenship and is easily underestimated by cosmopolitan thinkers. Calhoun’s is one of the more spirited and constructive critiques of cosmopolitan democracy currently available.
Democracies take time to become established, and some scholars suggest that newer democratic regimes tend to be less stable on account of the failure by political parties to instil loyalty among supporters. However, in an historical analysis of democratic development and political affiliations in Argentina over an entire century, Lupu and Stokes (2010) found that electoral stability grew in periods of democracy but declined again during dictatorships. Their study suggests that new and putative democracies can be severely disrupted by military coups, which prevent a democratic culture from taking root. One aspect of this is that the constant disruption of democracy by military coups effectively interrupts elections, erodes grassroots party activity, and thus presents an obstacle to the cumulative partisan loyalty needed to stabilize democratic systems.
References and Further Reading
Calhoun, C. (2007) Nations Matter: Culture, History and the Cosmopolitan Dream (London: Routledge).
Fukuyama, F. ([1992] 2006) The End of History and the Last Man (London: Hamish Hamilton).
Held, D. (2006) Models of Democracy (3rd edn, Cambridge: Polity).
Lupu, N., and Stokes, S. (2010) ‘Democracy, Interrupted: Regime Change and Partisanship in Twentieth-Century Argentina’, Electoral Studies, 29(1) March: 91–104.
Nation State
Working Definition
The combination of a large community (nation) and a territorial, political form (state), creating a cultural-political entity, now the most widespread ‘survival unit’ across the world.
Origins of the Concept
The nation state appears to be the normal, even natural, political-cultural entity in the modern world. But, like all social phenomena, nation states have a history that can be traced. Most scholars agree that the modern nation state is relatively recent, dating from the late seventeenth and eighteenth centuries. Between the fifteenth and eighteenth centuries, Europe was ruled by absolutist and constitutional monarchies that had absorbed many smaller political units to produce fewer but much stronger states which coexisted in a competitive struggle for power. This system of sovereign states produced the Westphalian conception of international law (1648), based on the right of states to self-government and with interstate disputes being legitimately settled by force.
The Westphalian system laid the foundations for the transition to the modern nation state, which was ushered in by the English Revolution of 1640–88 and the French Revolution of 1789, symbolically marking the end of feudal social relations. However, it was the demands of industrialization that created the need for a more effective system of government and administration, and, since the basis of society was no longer the local village or town but a much larger unit, mass education and a planned education system based on an ‘official language’ became the main means whereby a large-scale society could be organized and kept unified. Nation states are thought to have become dominant due to their gaining a monopoly of the legitimate means of taxation and violence, which gave them both enormous military power and the loyalty of large populations.
Meaning and Interpretation
The concepts of the nation, the nation state, nationalism and national identity are among the most contested and difficult to pin down in the whole of sociology. Yet they may appear quite simple. For instance, a nation is a large community, while a state is the political form which guarantees that community its security. However, nations are not necessarily homogeneous cultures with a shared language, history and traditions. The United Kingdom, for example, is a nation state consisting of England, Scotland, Wales and Northern Ireland and has several languages and different historical traditions. It is also a multicultural society with many more cultures and traditions – hence British citizens are an extremely diverse group with many languages and numerous religions.
Benedict Anderson (2006) argues that nations are ‘imagined communities’ rather than concrete ‘things’, with diverse groups bound together by a perception or imagination of what constitutes the cultural entity to which they feel they belong. Just because they are ‘imagined’, though, does not mean they have no reality. When many people act on the basis of a perceived national community, they bring about a shared national identity that binds them together.
Nationalism is in some ways quite modern, but it also draws on sentiments and forms of symbolism that go back much further into the past. Smith (1986) has argued that there is often a line of descent between a nation and historical ethnic communities – what he calls ‘ethnies’. Over time, Western European countries and regions saw one ethnie gradually becoming dominant over others. For instance, in France, several language communities vied with each other until the nineteenth century. However, once the state made French the official and only language that was taught and used in schools, competing ethnies rapidly lost ground. A similar process occurred in the UK, where English became the dominant language across the constituent nations of the union. Other languages did not completely disappear in this process. For example, Welsh, Scottish Gaelic and Irish Gaelic are still spoken in parts of the UK, while Basque continues to be used in parts of Spain and France (in the Basque Country). Survival of these languages is an important aspect of keeping alive the continuity of past and present for the ethnies which use them.
Critical Points
Sociologists are happier to discuss states than nations, simply because the concept of the nation is so hard to pin down. But the concept of the nation state can also be seen as rather woolly at the edges, as there exist several types of ‘nations without states’. A nation state may accept cultural differences within its minorities and grant them a certain amount of active development, as with Scotland, Wales and Northern Ireland within the UK as a whole. In 1999, Scotland and Wales achieved more autonomy with the introduction of a Scottish Parliament and a Welsh Assembly respectively. However, Scotland and Wales are not independent nation states. The Scottish independence referendum of September 2014 delivered a majority against independence and in favour of remaining within the United Kingdom state. The National Assembly of Quebec (a French-speaking province of Canada) and the Flemish Parliament (a Dutch-speaking area of northern Belgium) are other examples of devolved political bodies in nations that are not fully independent states. Many other nations within states still have no legal status or recognition, including Tibetans in China and Kurds in the region that takes in parts of Armenia, Turkey, Syria, Iran and Iraq.
Nation-building and nation states in the developing world have generally not followed the same path as the developed countries. In large measure this is because many developing countries were colonized by Western states and achieved independence only in the latter part of the twentieth century. The creation of many nation states in the developing world was the subject of quite arbitrary boundary decisions that failed properly to consider historically developed ethnic and cultural divisions. Post-independence, the mix of ethnies and other groups made the promotion of a distinct national identity much more difficult and politically contentious in these countries. The same issues and problems did not arise to the same extent in regions that did not suffer colonization, such as Japan, China and Korea, which were already more culturally unified.
Continuing Relevance
Arguably, one of the main factors in changing national identity today is globalization, which creates conflicting pressures between centralization and decentralization. On the one hand, the powers of business organizations and political units (such as transnational corporations and organizations) become more concentrated, but on the other there is pressure for decentralization. As a result, globalization creates a dual threat to national identity: centralizing pressure from above and decentralizing pressure from below.