Professor of Sociology

Celebrity culture has given women the confidence to defy and challenge those with power and influence in show business.

“You brought the flames and you put me through hell.” The words are from the American singer Kesha’s recent track “Praying” and are, in many people’s opinion, directed squarely at her former producer Dr. Luke, with whom she has been locked in a legal battle for years.

There have been allegations of sexual abuse made by Kesha (pictured above, by rocor) that Dr. Luke (Lukasz Sebastian Gottwald) denies. She signed for his Kemosabe Records, an imprint of Sony, in 2005, when she was 18. Their relationship was fractious pretty much from the outset, though it was creatively fertile and made her a star.

But in 2014, Kesha, or Ke$ha as she was then known, went into rehab, where she told doctors that Dr. Luke had drugged, sexually abused and physically assaulted her. When she emerged, she replaced the $ with an s in her name and filed a lawsuit accusing him of sexual assault and battery, sexual harassment, gender violence, civil harassment, unfair business practices, and intentional and negligent infliction of emotional distress. He countersued for defamation.

A ruling in March concluded that Kesha had entered into a contract after the time she alleged the abuse began; this seemed to undermine the singer’s allegations, since it suggested that any abusive behavior by Dr. Luke would have been foreseeable to her. The conflict appears to have subsided, at least for the time being. Dr. Luke continues to produce music.

Indecorous stories about the singer, songwriter and R&B producer R. Kelly have been circulating for several years. The latest broke a couple of weeks ago and centered on his alleged immurement of several young women. According to allegations, these women live in properties in Atlanta and Chicago, owned or rented by Kelly, where every aspect of their lives is controlled — down to what they eat and wear, how they address him (“daddy,” apparently) and when they have sex with him. A parent of one of the alleged captives despaired: “It was as if she was brainwashed. [She] looked like a prisoner … she just kept saying she’s in love and [Kelly] is the one who cares for her … if I get her back, I can get her treatment for victims of cults.”

Men with Power

While there are huge differences in the two cases, there are similarities: In both cases the alleged wrongdoer is a man who is successful in the entertainment industry and respected for his artistic output. The cases bring into grim focus an ugly aspect of show business — men with a certain status can be controlling abusers of the opposite sex.

The focus seems sharper now than ever. Bill Cosby was recently in court accused of numerous offenses. A deadlocked jury resulted in a mistrial, but the stories of drugs, intimidation and sex resonated with other cases, particularly the many that have emerged in the UK recently. The most infamous of these concerned the television presenter Jimmy Savile, who died in 2011 but was posthumously disgraced after it was found he had abused 60 people, aged from 5 to 75.


Most of us will assume the paradigm is the O.J. Simpson case, which involved the 1994 murder of his ex-wife Nicole Brown Simpson and culminated in what many still regard as the trial of the century in 1995. Simpson was cleared, but later served nine years behind bars for a 2007 armed robbery at a Las Vegas casino hotel. He was recently released.

But these kinds of incidents are probably as old as the entertainment industry itself. Enrico Caruso, the world’s preeminent tenor of the early 20th century and one of the most famous figures of the time, was in 1906 prosecuted for molesting a woman in New York City. At the trial Caruso was said to have imposed himself on six women in total. Caruso was found guilty and fined $10, the maximum amount allowed by law. Since then there have been standout cases. Roman Polanski in 1977, Woody Allen in 1992, Mike Tyson in 1992 (I count sports as part of the entertainment industry). But after the Simpson case, there seems to have been a prevalence of cases involving men who have exploited their status, influence, authority or a combination of all these to abuse women.

This is probably a misleading perception. More likely, we are just more aware of such cases. Why? Obviously, the media are much more likely to pounce on this type of case nowadays. Our appetites are probably more salacious now than ever. We take delight in pronouncing our own judgments in supermarkets or at the office. And the media feed this. But there is more.

Code of Silence

Celebrity culture has delivered many gifts, many of them unwelcome. But an agreeable aspect of its largesse is the confidence it has given women. I’m not talking about confidence in its most general sense, though I do think this has been affected by our preoccupation with celebrities. I mean the confidence to defy and challenge what were once regarded as indomitable show business figures with power and influence enough to get pretty much what they liked and do as they pleased — with anyone they chose.

Women, young and old, have been emboldened because they’re no longer awestricken by the kind of men who in previous eras were popularly regarded as inaccessible, unapproachable and, in some cases, godlike. In any case, there are probably hundreds, if not thousands, of cases that have been buried by Hollywood’s super-efficient publicity machine. Stars, especially male stars, were surrounded by an invisible defensive shield, a shield that dissolved as celebrity culture took shape. The once-remote stars were humanized into celebrities — the kind of people who would stand next to you and chat at a bar.

One of the features of Britain’s Savile case was the apparent hesitance of women in the 1970s to raise a whisper about men in the public eye. They weren’t just star-struck; they were terrified, not by the man, but by his aura – that immanent quality possessed by public figures of the time. Not now, of course: Fans exchange views on social media, take selfies with them and track their movements online. All of which has rendered them more ordinary. And being ordinary means having the same sort of inadequacies and being capable of the same kind of transgressions as anybody else.

Without caricaturing every powerful man in the entertainment industry as a sex-seeking missile, it seems reasonable to assume that the casting couch of Hollywood lore has some basis in reality and that attractive but powerless young women have been awarded roles in return for granting sexual favors.

Whatever happened to the predators? There are probably plenty of them around, though their pursuit of young women has been restrained, paradoxically, by the spate of cases that have dominated news in recent years. Any time a man contemplates making an unwelcome sexual advance on a woman, the possibility that she’ll react with the fury of Judith beheading Holofernes must cross his mind. Flashing before him are thoughts of a career-ending indictment, a shaming court case and even a prison sentence.

Learning about sexual coercion in the entertainment industry horrifies us, but it also reminds us that the days when young women did as they were told and obeyed a code of silence are gone.

Are we witnessing the disappearance of childhood, at least in the way we’ve understood it for generations?

Everyone must at some point wonder if the internet, and the apparent dependency it has introduced, is a benediction digitally bonded to a curse — or several curses. Barely a week goes by without some cautionary study warning of trolling, addiction, the decline of interpersonal skills and the decomposition of community life, all because we have our eyes fixed to our screens.

The evidence is, as any self-respecting cynic realizes, skewed in favor of tradition: Few researchers are prepared to assess screens in context. We now have populations that not only have to engage with screens but also find doing so rewarding. They like navigating their way around cyberspace.

The internet has given us a weightless world of wonder and, for the most part, the supposed negative side effects have the character of the scares that accompanied the growth in the popularity of television in the late 1950s and early 1960s. If the doomsayers of the time were to be believed, it shortened our attention spans, blitzed our cognitive capacities, ruined family life and so on. All of these predictions were ill-founded, as we now know.

Recently though, a report made me realize I’ve been too dismissive. It concerned a 5-year-old English boy who was investigated by police for allegedly sexting. For those who don’t know, sexting means sending sexually explicit photographs, messages or other kinds of materials via smartphones.

The case of the child came to light in the context of a report on the rise in sexting among young people: More than 4,000 children have been dealt with by British police for sexting since 2013. The most common age of these children is 13 or 14, but younger people, as the standout case indicates, are also taking up the practice.

Disappearing Childhood

Every sentient being knows that childhood is disappearing. I’m defining childhood orthodoxly as the state or period of being a child, and a child as a human below the age of puberty. In Britain, the age of consent to any form of sexual activity is 16, regardless of gender or sexual orientation.

Self-accredited experts on the subject offer advice to parents, much of which is obvious or useless. Apart from the usual suggestions to update software, change passwords and avoid clicking on unusual links, parents are often urged to familiarize themselves with security settings, implement family settings and make children aware of the risks of exchanging information with people they meet online.

I should perhaps own up at this stage: I’m not a parent. So everything I write has to be understood in this context. I am sure people over the age of 30 have little or no idea what their children do online. They can adjust their settings to the highest level of security and counsel their kids with an earnestness that will make them giggle with disbelief. Children today, probably like children of previous generations, will carry on heedlessly and, if necessary, in wilful contravention of their parents’ advice. That’s what children do. All of which makes the report even more discomforting than it already is. The inescapable conclusion is: We can’t do anything about the growing number of children who choose to pursue what they regard as a rewarding experience. We’re destined to watch helplessly as more and more kids send sexually charged text and images.

This is an ugly question to raise, but are we witnessing the disappearance of childhood, at least in the way we’ve understood it for generations? Childhood never stays the same.

As recently as 100 years ago, children were not the same as they are today. Produced often by accident, they were typically parts of much larger families than today and, should they survive their first 10 years, were sent out to work part-time or full-time and their income became a contribution to the family’s economy.

They were probably loved, though not in the same way that children are today. Orphaned children (of which there were plenty) would hardly be loved at all; they would spend their tender years in orphanages and be dispatched to the outside world as quickly as possible. A hundred years before this, kids were sent to work in factories or as chimney sweeps’ assistants (they climbed inside the chimneys). In other words, we’ve understood children and childhood in a particular way since the end of World War II. This may be about to change.

Mark of Maturity

It’s unsettling to imagine children as sexual beings. Britain’s youngest parent was 12 when she gave birth (11 when she conceived, as a result of rape). Rates of teenage pregnancy in the UK have halved in the past two decades and are now at their lowest levels since record keeping began in the late 1960s. Sex and relationship education, contraceptives and changes in the status of pregnancy have been factors — pregnancy may once have been a mark of maturity, but it now carries more stigma than kudos.

But the provision of sex and relationship education and the widening of the availability of contraceptives are, of course, predicated on the assumption that young people are interested in and willing to engage in actual sex. And the widening awareness that their immersion in the net will surely bring will stimulate that interest even more.

Children are, it seems, sexualizing themselves. I mean by this that they are exhibiting themselves in a way they fully realize will be interpreted as erotic. This is disturbing in itself, more so when it’s realized that the images and text, once posted, are no longer under their control. Are children net-savvy enough to know that once they hit “send” their pictures and other materials are in the public domain to be appropriated and used potentially by anybody? I suspect from my own research that many are and many more will become so over the next few years.

But I’m sure none of them has the intellectual or emotional maturity to make what we’d consider an informed decision about whether it’s right. This is a moral choice made by people without the sophistication or experience to comprehend the probable consequences of their actions. Of course, we all make mistakes; that’s how we learn – by responding to errors. There’s not much leeway with sexting: the decision to distribute is irreversible and its consequences, in practical terms, impossible to undo.

A non-parent like me finds this unnerving. So I presume any right-minded parent will be even more disturbed. We waste energy fretting over the imagined ill effects of our preoccupation with screens. For the most part, the alarmists are misguided scaremongers who struggle to keep up. But in this one important respect, there is a demand for creative thought. Somehow we must avert the potential reconfiguring of childhood.

Screens present most of us with agreeable and convenient portals to work and pleasure. To the young, screens are the places where they learn and play. In an ideal world, they should be for observing, not for being observed.

*[Ellis Cashmore is a member of the Screen Society research team investigating the cultural impact of digital media. The current questionnaire is here. The results will be published in 2018 by Palgrave Macmillan.]

Given open competition, women could achieve parity with men in virtually all events, apart from those very few that require the rawest of muscle power.

“Dear John, I adore and respect you but please please keep me out of your statements that are not factually based.” Serena Williams (pictured above) was replying via Twitter to John McEnroe’s impolitic remark that if she ventured to play tennis against men she would be “like, 700” in the rankings.

McEnroe’s statements actually were “factually based,” at least in the way evolution is factually based: It’s so well established that no new evidence is likely to alter our understanding of it substantially. Similarly, it appears self-evident that a female athlete, no matter how proficient, could not beat an equivalently proficient male, still less a man of her own rank. Williams versus Andy Murray?

Serena gave her own assessment of such a match in 2013 on US television’s David Letterman Show: “If I were to play Andy Murray, I would lose 6-0, 6-0 in five to six minutes, maybe 10 minutes.” She was more confident as an 18-year-old in 1999 when she claimed: “I can beat the men,” and asked for a wild card entry for the Eurocard Open in Stuttgart, then one of the elite tournaments on the men’s circuit. Her entry didn’t materialize.

Yet in sports in which women are allowed to compete against men, they’ve fared quite well. In equestrianism, for example, women and men compete on equal terms in completely gender-integrated contests: Whether in show jumping, three-day eventing, dressage, endurance or driving disciplines, women regularly beat men. In sailing, too, there is integration in solo ocean racing (though since 1988 women have competed in a separate category in Olympic sailing events). There are other sports in which women and men participate together, such as ultra-marathoning, curling and climbing, but none of these is what we’d call a major sport.

Marathon running is a major sport, and until 2011 women were allowed to run in the same event as men, at least in the city marathons. There were two competitions, with the first woman home receiving a separate accolade, but, for practical purposes, women ran alongside men. Then the London Marathon, presumably at the behest of television, split the races so that the elite female runners started before the men.

In 2003, Britain’s Paula Radcliffe finished the London Marathon in 2 hours, 15 minutes and 25 seconds. The time has not been bettered by any woman since. Running against men, it seems, put women on their mettle, and their performances reflected this. What if they’d been allowed to continue running with men? We can probably guess they’d never have got on even terms with men. But Radcliffe’s best time would almost certainly have been beaten, several times over.

The Best of Us

Sport has, since the 19th century, been based on the unarguable maxim that competition brings the athletic best out of us. Striving to win something by establishing superiority over others is a sure-fire way of reaching our limits. So, the question we asked of marathons stands with tennis: How would the world’s number one female fare in a head-to-head with the top male had women been playing competitively against men for the past 100 years?

Again, many will argue that the results would be basically the same, the support this time coming from the copious evidence on the physical differences between the sexes — that is, differences that do not stem from social or cultural influences. There are differences in, for example, adipose tissue, respiratory volumes, the activity of sweat glands and other areas, but there is also similarity: Women’s bodies respond to training in the same way as men’s. It’s possible that women can close the gap in strength to within 5% — crucial in some though not all sports.

You can probably guess where I am going with this. Say we could turn the clock back 100 years and dissolve the distinction between men’s and women’s — or, to use a term tennis still favors, ladies’ — competition at Wimbledon and elsewhere. Where would we be now? It’s likely that female tennis players would at first be annihilated, then merely well beaten, and perhaps eventually only edged out by men. For how long? Sixty years? Maybe longer. But what about today?

It’s misleading to compare performances in male and female events, which have developed separately. Tennis has long been open at least to those women with resources sufficient to afford it, but only in the most playful mixed doubles have they been allowed to confront male adversaries. One-off exhibitions between the likes of an aged Bobby Riggs and Billie Jean King (and, before her, Margaret Court) owed more to theater than competitive sport, though the Battle of the Sexes, as the media hailed it in 1973, was a victory of sorts for King. It is the subject of a film to be released later this year.

Men Only

Now, women compete in every sport, even the ones that were once strictly “men only.” The once-exclusive male preserve of combat sports has been breached: Professional women cage fighters appear regularly on major MMA bills, and women’s tae kwon do has featured as a competitive event since the 2000 Olympic Games. Women are involved in virtually every form of combat sport.

Over the years, women have not achieved as much as men in terms of prestigious titles or money: The highest-paid female athlete is currently Serena Williams, who ranks 50 places behind the top earner, Cristiano Ronaldo; in fact, she is the only woman in the top 100. Yet the conclusion that women can’t achieve the same levels doesn’t follow logically from the premise that they are biologically different. In fact, it could be argued that, if women had been regarded as physically equal to men, they would perform at similar standards, and that the only reason they don’t is that they’ve been regarded as biologically incapable for so long.

It would be ridiculous to deny that there are differences, but think of the body as a process, not a thing: It is constantly changing physically and culturally, as are our perceptions of it. Sporting performance promotes changes in muscular strength and oxygen uptake; changes in diet and climatic conditions induce bodily changes too, of course.

Athletes, in particular Caster Semenya, have complicated the traditional male-female binary. In 2009, testosterone testing was introduced to identify cases where testosterone levels were elevated above an arbitrary threshold, a condition termed hyperandrogenism. Semenya was excluded until 2015, when the rule was suspended and she returned. The Indian sprinter Dutee Chand was dropped from the 2014 Commonwealth Games but, at the last minute, successfully appealed to the Court of Arbitration for Sport, which ruled that there was insufficient evidence that testosterone increased female athletic performance. Athletics’ governing organization, the IAAF, is due next month to deliver a clarification on this issue.

In our particular culture and at this stage in history, we understand women and their relationship with men in one way; in another place and at another time, this relationship may be understood quite differently. It is a matter of convention that we organize sports into women’s and men’s events, just as it’s a convention to award Oscars for the “best actor,” a man, and the “best actress,” a term that’s still used to describe the best female actor.

The Second Sex

There can be no argument that the experience of women in sports virtually replicates their more general experience. They have been seen and treated not only as different from men, but also as inferior in many respects. Historically, women’s position has been subordinate to that of men. They have been systematically excluded from high-ranking, prestigious jobs and made to organize their lives around domestic or private priorities, while men have busied themselves in the public spheres of industry and commerce.

As the breadwinner, the male has occupied a central position in the family and has tended to treat women’s earnings as supplementary income only or, more importantly, to rely on women as unpaid homeworkers, making their contribution appear peripheral. Traditionally, females have been encouraged to seek work, but only in the short term: Women’s strivings, it was held, should be toward getting married, bearing children and raising a family.

Since the late 1960s and the advent of legal abortion and reliable contraception, women in the West have been able to exercise much more choice in their own fertility and this has been accompanied by feminist critiques of male dominance. Studies showed wide discrepancies in earning power and this prompted legislation on both sides of the Atlantic designed to ensure equality in incomes for comparable jobs.

One of the loudest cries of feminists concerned the abuses of the female body: Women, it was argued, have not had control over their own bodies; their bodies have been appropriated by men, not only for work, but for display. “Sex objects” was how many women described themselves: ogled by men and utilized, often dispassionately.

Against this, they recoiled. Even today, at practically any tennis tournament, the media will almost certainly gravitate toward the best-looking rather than the best player. Maria Sharapova, for instance, earns about $20m per year from commercial endorsements. Eugenie Bouchard, ranked only 61st in the world, makes nearly $6m a year from ads, suggesting aesthetics often outweigh sporting performance for advertisers. Danica Patrick, the stock car racing driver, earns around $6m from endorsements, while cage fighter Ronda Rousey makes a comparatively meager $4m from advertising.

Women are underrepresented in politics compared to their total number in the population. They consistently earn less than their equivalent males and are increasingly asked to work part-time. Despite recent changes in the number of places in higher education occupied by women, they tend to opt for subjects, like sociology and art, that won’t necessarily guarantee them jobs in science and industry. When they do penetrate the boundaries of the professions, they find that having to compete in what is, to all intents and purposes, a man’s world, has its hidden disadvantages — what many call the glass ceiling.

Women’s experience has been one of denial: They simply have not been allowed to enter sports, again because of a mistaken belief in their natural predisposition. In the late 19th and early 20th centuries they were considered too frail to withstand the physical exertions of sport. Then they were warned that their reproductive capacities would be harmed by exercise. They were even told to beware of virilization — the development of male physical characteristics, such as muscle bulk, facial hair and a deep voice. Historically, women who excelled at, or even participated in, sports were called “mannish” and regarded as unnatural. Even as recently as 1967, when Kathrine Switzer became the first woman to run the Boston Marathon as an official entrant (she applied as “K. V. Switzer”), she was pilloried and, because of her run, the Amateur Athletics Union barred women from all competitions with male runners.

Because of this, the encouragement, facilities and, importantly, competition available to males from an early age haven’t been extended to them. In the very few areas where the gates have recently been opened — the marathon being the obvious example — women’s progress has been extraordinary. Given open competition, women could achieve parity with men in virtually all events, apart from those very few that require the rawest of muscle power.

The vast majority of events need fineness of judgment, quickness of reaction, balance, and anticipation; women have no disadvantages in these respects. Their only disadvantage is what many people believe about them: In sports as in life, women will simply never catch up.

“I had lived this woman’s life from the age of 15 to 65 as she was sexually abused, beaten, treated like dirt. I really felt the injustice and I was called nigger just one time too many on screen,” Halle Berry told The Daily Mail. It was 1993 and she had just finished playing the title role in Alex Haley’s Queen, the concluding part of the Roots saga. “I was going to give up acting and become a full-time civil rights activist.”

She didn’t, of course. She went on to grander roles, more bravura performances and, in 2001, became the first African-American woman to win the best actress Oscar for her role in Monster’s Ball.

Whether the experience of playing the daughter of a slave and a white plantation owner who tries to pass as white in the period after the American Civil War impressed Berry indelibly isn’t certain. Berry had shown an awareness of history when she dedicated her Oscar: “This moment is so much bigger than me. This moment is for Dorothy Dandridge, Lena Horne, Diahann Carroll. This is for every faceless woman of colour who now has a chance tonight because this door has been opened.”

But in an interview with Teen Vogue‘s Elaine Welteroth, she described receiving the Oscar as one of her “lowest moments.” “I thought it meant something, but I think it meant nothing.”

Still sleekly beautiful as she approaches her 51st birthday, Berry expressed her interest in “making more opportunities for people of colour … and I’m trying to figure out how to help and add more diversity to the academy.” In her own quiet, prepossessing way, Berry has become one of the most thoughtful, yet provocative artists of recent years. She has raised issues that have been uncomfortable to discuss, yet relevant to the experience of African-Americans today. She has reminded us that, whether they like it or not, all black actors are, in some sense, political figures.

For example, in 2011, after splitting up with her partner, a white Canadian, Berry pressed for custody of their daughter. Custody was contested, and Berry based her claim on her daughter’s ethnicity: She was black, insisted Berry, drawing on what has become known as the one-drop rule. This is an old idiomatic phrase stipulating that anyone with any trace of sub-Saharan ancestry, however minute (“one drop”), can’t be considered white and, in the absence of an alternative lineage — for example, Native American, Asian, Arab, Australian Aboriginal — is considered black.

The rule has no biological or genealogical foundation, though. In 1910, when Tennessee enshrined the rule in law, it was popularly regarded as having scientific status, however spurious. By 1925, almost every state in America had some form of one-drop rule on the statute books. This was four decades before the civil rights legislation of the 1960s; Jim Crow segregation was in full force. Anti-miscegenation laws that prohibited unions of people considered to be of different racial types remained until 1967, when the Supreme Court struck them down. Berry’s case reflected the interest in “authentic” black culture that spread across popular culture, leading to a redefinition of the roles available to black actors and, indeed, a redefinition of blackness itself. It also resonated with historical memories and emotions.

Berry herself had an African-American father and a white mother, who was from Liverpool. Her parents divorced and she was brought up by her mother in Cleveland, Ohio. Prior to the custody argument, she had declared that she considered herself bi-racial, this referring to a child with a black parent and a white parent: “I do identify with my white heritage. I was raised by my white mother and every day of my life I have always been aware of the fact that I am bi-racial.”

Berry had occasionally talked about the particular predicament of bi-racial people but had never made an issue of it. At various points, she had also used black, African-American and woman of colour to describe herself. She had, in measured terms, talked about how she never felt accepted as white, despite her white mother. But her appeal to the one-drop rule seemed a bit like a physicist trying to explain the movements of celestial bodies by citing astrology.

Actually, while it seemed irrational, Berry’s explanation of her actions was far removed from any kind of faux biology or pseudoscience. “I’m black and I’m her [daughter’s] mother, and I believe in the one-drop theory. I’m not going to put a label on it. I had to decide for myself and that’s what she’s going to have to decide — how she identifies herself in the world,” she was quoted by Chloe Tilley of BBC World Service.

In resisting conventional census categories or labels such as bi-racial or multiracial, she was not resorting to another label, black, as if returning to a default setting. Black, in her argument (at least, as I interpret it), is no longer a label: it is a response to a label — a response, that is, to not being white. Blackness, on this account, doesn’t describe a colour, a physical condition, a lifestyle or even an ethnic status in the conventional sense: It is a reaction to being regarded as different or distinct.

Black no longer describes a designated group of people: It is the way in which those who have been identified as distinct from and opposite to whites have reacted; their answer. When Berry allowed, “that’s what she’s going to have to decide,” she meant that her daughter has some measure of discretion in the way she responds. Blackness is now a flexible and negotiable action; not the fixed status it once was.

Rachel Dolezal, a branch president of the National Association for the Advancement of Colored People (NAACP), advanced a similar argument when she proclaimed herself black, even though several critics called her a fraud. Her argument was perhaps too sophisticated for most to grasp: blackness, like whiteness, is a culturally created label that is often confused with a biological description. If she identifies with blackness, the only fraud is in perpetuating artificial categories invented by Europeans to subordinate slaves in the 17th century.

Similarly, Berry’s wish that her daughter will mature in a world where she can make choices about her own identity, including her ethnic affiliations, and perhaps even change these as she moves from situation to situation, is a difficult one to grasp, but one the world is going to have to. Blackness is not a thing: It is, as I say, a response.

Berry is using her status and intellectual ingenuity to prompt debates that affect not just actors, but everybody. At times, her arguments are worthy, yet confusing; there is surely a contradiction in appealing for choice in ethnicity while citing an ancient, racist justification that would be endorsed by the Ku Klux Klan.

The volte-face on the Oscars is also perplexing. The virtual exclusion of African-American artists from nominations was dramatically reversed earlier this year, though this seemed suspiciously like Hollywood tokenism. Perhaps this too-obvious payment of lip service has convinced her that more structural change is called for. She now says: “I want to start directing, I want to start producing more [and] I want to start being a part of making more opportunities for people of colour.”

Her changes of mind reflect a restless intellect pulsing with ideas. You don’t have to agree with Berry to acknowledge that she is a woman for the hour when popular culture needs to speak to its time.

Ellis Cashmore is the author of “Elizabeth Taylor,” “Beyond Black” and “Celebrity Culture.” He is a visiting professor of sociology at Aston University and has previously worked at the universities of Hong Kong and Tampa.

A well-publicized bout of illicit sex never did any celebrity’s reputation harm; often, a lot of good. But how does it affect a politician’s? Up to last week, you’d probably say: ruinously. But the election of Donald Trump (pictured above) amid a tsunami of accusations from women who claim he either touched or propositioned them inappropriately has forced us to change our minds.

It’s at least possible that, far from being damaged by the allegations, Trump actually profited from them. For a while he looked almost like a victim: someone whose every wink and flirtatious gesture over the past thirty years had suddenly returned as a baleful curse. Practically every day for about a fortnight, fresh grievances appeared; women, who had been silent for years, decided it was time to make their claims public.

After a while, it seemed Trump’s denials were useless and that his presidential campaign was wrecked; at least, that was the verdict of many journalists. But maybe there was a rebound in public sympathy: voters who doubted the authenticity of the claimants lent their support to the beleaguered Trump.

He wasn’t the first US President or prospective President to have extricated himself from a potentially career-wrecking sex scandal and perhaps Trump owes his survival to the strategy adopted by none other than the husband of his presidential rival.

Bill Clinton is a liminal figure, occupying a position on both sides of the celebrity-politician divide: he had a successful political career as governor of Arkansas, 1979-81 and 1983-92, before becoming president. Clinton cut a beguiling figure en route to the presidency: telegenic and good-looking, he also had the sheen of authenticity, appearing natural and relaxed on television.

Clinton arrived at the White House in 1993 in the middle of a media revolution, with cable television providing a 24-hour news cycle. His arrival also coincided with a voyeurism diffusing through the population: consumers’ interest in private lives practically commissioned the media’s intrusive approach and obliged even presidents to expose themselves. On one memorable occasion in 1992, Clinton donned Ray-Bans and played saxophone on a late night talk show. Yet there was more celebrity to Clinton than anyone dared to imagine and, in 1998, he became the central figure of a sex scandal bigger than anything dreamt up by Madonna.

There was a stunning moment shortly after the scandal broke when Clinton appeared on national television and affirmed: “I did not have sexual relations with that woman.” That woman was Monica Lewinsky, a White House intern, and her account of her relationship with the President was somewhat different. The US President is always a figure of great interest by virtue of his position (there’s never been a female President), and this and the several other allegations of sexual peccadilloes that followed marked Clinton out as someone worthy of even greater interest.

Clinton was the US President for two terms of office and, for a while, under threat of impeachment. So the scandal could have had wider-reaching repercussions than it actually did. And the fact that Lewinsky actually worked in politics gave it added relevance. As the concupiscent details of the case unfurled — the semen-stained dress, the cigar, the secretly recorded phone conversations — interest built and, for the final two years of the twentieth century, Lewinsky was one of the most famous women in the world. Her celebrity status manifested in several books about her, an assortment of well-paid endorsement deals, her own line of accessories and a reality tv program in which she featured. She then faded from view.

The affair should have hurt, even destroyed, Clinton. Why didn’t it? He had narrowly avoided a controversy about his wilder years as a student, when he issued his famous “I did not inhale” disclaimer about his supposed marijuana smoking. The Lewinsky denial could have undermined his credibility.

In December 1998, within months of the denial, Clinton achieved the highest approval rating of his career: 73%. His average approval rating during his time in office was 55.1%, below John F. Kennedy’s, but above those of Reagan, Jimmy Carter and George W. Bush, among others. He enjoyed a consistently high approval rating among the “baby boomer” generation (those born in the immediate post-second world war period). An experts’ poll in 2011 placed Clinton 19th in the all-time list of presidents. Maybe honesty was no longer part of the presidential job description.

Clinton remained President till 2001, when he left office after serving his complete second term. He also acquired a status distinct from that of other politicians, who leave legacies. Clinton could have been remembered for bringing together Israel’s Yitzhak Rabin and Yasser Arafat of the Palestine Liberation Organization on the White House lawn in 1993, or signing the 1994 Kremlin Accords, which de-targeted strategic nuclear missiles, or organizing peace talks for Bosnia and Herzegovina in 1995, or ordering cruise missile strikes on Afghanistan in 1998. He could also be remembered as the first president to have solicited the public’s favour in spite of deeds that would have damned politicians from earlier eras.

Clinton, though, was a politician for the celebrity era. Squeaky-clean politicians whose worst vice was an extra-marital fling were, by the 1990s, remnants of another age. Compare his experience with that of former civil-rights leader and Washington DC mayor Marion Barry, who, in 1990, was convicted of cocaine possession. A female friend had lured him into a police sting: at their assignation, hidden cameras captured him smoking crack. During his six-week trial, accounts of his sex and drug binges, backed by evidence from pimps and pushers, were relayed to homes via television. He served six months in jail, but two months after his release, he returned to the city council and, within three years, was re-elected mayor. In another sex-related case, New York governor Eliot Spitzer resigned after being implicated in a federal investigation into inter-state prostitution in 2008. He barely broke stride, returning in his own television show, his credibility intact.

John Edwards, a 2004 vice presidential candidate, had an affair with a woman while his wife was dying of cancer. This was scandal enough to blow him off course in his bid for the presidency in 2007, but he would probably have navigated his way back had it not been for allegations that he masterminded a $1 million cover-up of his affair, misusing funds from two wealthy campaign donors. Substance abuse, carnal activities and sundry other deviant behaviours are, it seems, forgivable; in a way, they humanize a politician, exposing a few of the kind of flaws all of us secrete.

Clinton sailed close to the wind; but it blew in his favour. The political culture in which he prospered had lost the stiffness and propriety of earlier eras and his sexual misconduct was not thought venal. Clinton brought a sense of showmanship and his occasional peccadillo only intensified the drama of his presidency. Even in the midst of the Lewinsky scandal, he battled on like a rock star in his fifties, determined to show his audience he had a few good songs in him. Clinton may not have been the greatest President, but he was surely the most consumable and, as if to prove this, he still tours the world, giving guest lectures, signing copies of his own books, receiving invitations to do spots on tv shows and doing what celebrities do – appear.

There’s no evidence that Trump studied Clinton’s expert manoeuvring. But you can be sure the people surrounding him were aware that a resolve to remain unforthcoming, distant and aloofly silent about the allegations was the only response realistically available to their man. Denial would have just led to further accusations, implicating him in a vortex of claim and counter-claim. Speechlessness effectively killed the narrative. A stream of allegations became repetitive and uninteresting after a while.

Is there irony in this? After all, most celebrities revel in sex scandals. It reminds us that political celebrities – and Trump is now arguably the paragon of these – are different from other celebs. They still need to engage with us in a way that reminds us that they have that indefinable quality of ordinariness; they also need to keep us in close contact via social as well as traditional media; and they need to surrender their private lives to us – after all, we feel entitled not just to know but to own celebrities.

Yet politicians don’t just entertain us: they make decisions that affect our material lives and, possibly, those of our children. We like to know that, for all their flaws and foibles, they have our interests in mind. Trump has skilfully persuaded Americans that, for all his reputed dalliances, he is a man who can be trusted to put his followers’ interests before his own. This is a rare feat for a politician today.

Q: It’s sixty years this month since the release of the film Giant (40th anniversary poster above). This was a big film in the 1950s, but never ranks among the likes of The Godfather, Casablanca, or Gone With The Wind as a twentieth century classic. But I know you’re going to tell us that it has cultural significance that escapes most of us.

A: It’s only what you’d expect from me, isn’t it? You can see a half-century of popular culture in Giant. Three mortal figures advance towards immortality in this film.

Q: Well, that’s quite a claim. Continue.

A: First, the story. Edna Ferber’s book Giant concerns an oil-and-ranching family modeled on the Kleberg family, who ran (and still run today) the vast King Ranch in South Texas. Giant is the story of a simple cowhand who becomes a conniving, bigoted oil tycoon and cattle baron, and of his strong-willed wife, transplanted from the greenery of her native Maryland, who curbs his Southern vulgarities with her Eastern civility. Serialized in Ladies’ Home Journal beginning in the spring of 1952, Giant was released that fall to immense sales, quickly leaping onto the New York Times best-seller list. But the film based on the novel secretes another story. Warner Brothers, having secured the rights to Ferber’s work amid much competition from other studios, cast Rock Hudson in the central role of Bick Benedict, the Texas rancher. Hudson, then 29, was what was known in the mid-1950s as “beefcake,” meaning an outstandingly handsome and muscular man who radiated heterosexual attractiveness. Montgomery Clift, also possessed of exceptional good looks, was earmarked for the role of Jett Rink, the poor dirt farmer who strikes it rich, thought to be based on Glenn McCarthy, a flamboyant oil millionaire known as “King of the Wildcatters” (a wildcatter is a prospector who sinks exploratory oil wells). But the producers were suspicious of Clift’s drinking and opted for the then relatively untested method actor James Dean, who had made East of Eden (1955) and seemed an acceptable risk. Dean was also handsome, but, in his case, haunted-looking, which was fashionably impressive: he looked, to use a term that originated at the time and has persisted since, cool.

Q: And the role of Leslie?

A: Grace Kelly was a natural for the role of Bick’s wife. The humble, blonde Philadelphia beauty who became a Hollywood star had not yet fled to become a European princess and looked perfect. She was about as hot as it was possible to be at the time. She’d been in High Noon, Dial M for Murder and To Catch a Thief. The film’s director, George Stevens, had briefly considered Elizabeth Taylor, but, at 23, she seemed too young (Kelly was nearly two-and-a-half years older). The story goes that Hudson, possibly wary that the hugely popular Kelly might steal his thunder, argued Taylor’s case and eventually got his way. Remember: Taylor was not yet the scandalizing hellcat she became, though she had gone through her first unruly marriage and was now married to the English actor Michael Wilding. But she had not yet taken on a role that was truly adult, and this role demanded that she age an improbable twenty-five years over the course of Ferber’s saga.

Q: Now, you describe Hudson as beefcake. But he later became the first Hollywood star to die from Aids. He was gay, if memory serves. So?

A: This was the 1950s. America hadn’t even started contemplating repealing its sodomy laws, as they called them. Hudson was shut tight in the closet. In fact he was married to his agent’s assistant. In those days, they were called “lavender marriages,” meaning they were designed to remove suspicions about an actor’s sexual preferences.

Q: So, there were suspicions about Hudson?

A: In the film industry, for sure. But don’t forget, in the 1950s, Hollywood operated a smooth-functioning publicity operation and allowed only the information it wanted released to escape to the outside world. Had it become known that Hudson was gay – and he didn’t come out until only weeks before his death in 1985 – it would have killed off his professional career instantly.

Q: Did Taylor know?

A: Almost certainly. And, if she didn’t when they started filming, she would have known soon enough, if only because he didn’t make a move on her. She was one of the most desirable women in the world at the time and her marriage was apparently on the rocks. If there had been social media back then, we would have all got rolling reports on them.

Q: And Dean?

A: Well, his heterosexual credentials were also called into question, though not as conspicuously as Hudson’s, of course. Then again, the gossip, rumour and hearsay surrounding Dean has never ceased. When someone dies, especially prematurely, it seems to provide the world with licence to think, say and share whatever they choose. Dean was killed in a road accident before filming had even finished. He’d completed his scenes and was driving his Porsche Spyder near Cholame, California. This was 1955. Dean (who was born in 1931), like Marlon Brando (born 1924), was one of those mid-20th century glamor-rebels challenging a society in the throes of a social, cultural and psychological adjustment to peacetime. Their political aspirations were captured in Brando’s answer to “Hey Johnny, what are you rebelling against?” in The Wild One (1953): “What’ve you got?” Elvis was another pin-up rebel without a cause, conviction or purpose. Dean, perhaps more than the others, encoded the mood of his generation. It was a generation that had not yet assimilated changes in the cultural politics of sex: Dean was unequivocally male and that meant his glazed handsomeness was intended to excite young women. It did. But that was just the visible tip of Dean’s ultra-cool iceberg. The Dean myth grew bigger, appreciably bigger, than the man. Check this picture of him in cruciform mode, with Taylor looking at him almost worshipfully.

Q: Let me pause briefly to reflect: the film featured Hudson, who was, for all the world knew, a straight lady’s man, but who later took on iconic importance when he became the first Aids victim from the Hollywood A-list. There was also Taylor, who, at that time, was still four years away from her scandalous affair with Eddie Fisher, who had been best man at her third wedding and was married to one of the world’s most popular girl-next-door types, Debbie Reynolds, the mother of his children. And Dean, who died young and handsome, whose image was to adorn millions of posters, tee-shirts and coffee mugs, and who was to become the subject of books and movies. He was one of those characters who, as they say, captured the zeitgeist.

A: Correct.

Q: I get it: they were all, in their own ways, icons of the late twentieth century.

A: Yes, though the affair with Fisher was only the start of Taylor’s notoriety. In the early 1960s, she met Richard Burton in Italy on the set of Cleopatra (see below). Still married to Fisher, she became involved with the Welsh actor, himself married and with children. The timing of the clandestine affair was perfect in a sense. The Italian photojournalists who later became known to us all as paparazzi were just beginning their exploits and caught Taylor and Burton in flagrante. The image quickly circulated around the world, heralding the arrival of a new type of journalism.

Q: And ultimately, the rise of what we now recognize as celebrity culture.

A: I’d say so. Now do you understand what I mean when I say Hudson, Taylor and Dean were three mortals advancing towards immortality? In a way, all three have left their impressions on our culture.

Q: What made you think of this?

A: I claim no credit. An American journalist, Amanda Champagne, who writes for Closer, asked me to comment on the film as we approach its anniversary and, as I was thinking about the production, it occurred to me that the three main actors were far from cultural behemoths in 1956, when the film was released. But, over subsequent decades, each became colossally significant in completely different ways.

In a way, he was right: one game had indeed finished. Ali fought only once more. His health had been deteriorating for several years before the ill-advised Holmes fight and the savaging he took repulsed even his sternest critics. Ali the “fearsome warrior,” as Hauser calls him, would disappear, replaced by a “benevolent monarch and ultimately to a benign venerated figure”.

And now that venerated figure has died, aged 74.

Muhammad Ali was also a symbol of black protest, a cipher for the anti-Vietnam movement, a martyr (or traitor, depending on one’s perspective), a self-regarding braggart, and many more things besides. While there have been several sports icons, none has approached Ali in terms of complexity, endowment and sheer potency. Jeffrey Sammons suggests: “Perhaps no single person embodied the ethic of protest and intersected with so many lives, ordinary and extraordinary.”

Born into two nations

Born in Louisville, Kentucky, in the segregated south, Cassius Clay, as he was christened, was made forcibly aware of America’s “two nations,” one black, one white. After winning a gold medal at the 1960 Rome Olympics, he returned home to be refused service at a restaurant. This kind of incident was to influence his later commitments.

Clay both infuriated and fascinated audiences with his outrageous claims to be the greatest boxer of all time, his belittling of opponents, his poetry and his habit of predicting (often accurately) the round in which his fights would end. “It’s hard to be modest when you’re as great as I am,” he remarked.

He beat Sonny Liston for the world heavyweight title in 1964 and easily dismissed him in the rematch. Between the two fights, he proclaimed his change of name to Muhammad Ali, reflecting his conversion to Islam. While he’d made public his membership of the Nation of Islam (NoI), sometimes known as the Black Muslims, prior to the first Liston fight, few understood the implications. The NoI was led by Elijah Muhammad and had among its most famous followers Malcolm X, who kept company with Ali and who was to be assassinated in 1965.

Among the NoI’s principles was a belief that whites were intent on keeping black people in a state of subjugation and that integration was not only impossible but undesirable. Blacks and whites should live separately, preferably in different states. This view stood in stark contrast to North America’s melting-pot ideal.

Ali’s commitment deepened and the media, which had earlier warmed to his extravagance, turned against him. A rift occurred between Ali and Joe Louis, the former heavyweight champion who was once described as “a credit to his race.” This presaged several other conflicts with black boxers who Ali believed had allowed themselves to become assimilated into white America and had failed to face themselves as true black people.

Sting like a bee

The events that followed Ali’s call-up by the military in February 1966 were played out against a background of growing resistance to US involvement in the Vietnam War. Ali’s oft-quoted remark “I ain’t got no quarrel with them Vietcong” made headlines around the world. He insisted that his conscience, not cowardice, guided his decision not to serve in the military and so, to many, he became a mighty signifier of pacifism. To others he was just another draft dodger.

At the nadir of his popularity, he fought Ernie Terrell, who, like Patterson, persisted in calling him “Clay.” The fight in Houston had a grim subtext, with Ali constantly taunting Terrell. “What’s my name, Uncle Tom?” Ali asked Terrell as he administered a callous beating. Ali prolonged the torment until the 14th round. Media reaction to the fight was wholly negative. Jimmy Cannon, a boxing writer of the day, wrote:

It was a bad fight, nasty with the evil of religious fanaticism. This wasn’t an athletic contest. It was a kind of lynching … [Ali] is a vicious propagandist for a spiteful mob that works the religious underworld.

Wilderness years

Ali’s refusal to serve in the armed forces resulted in a five-year legal struggle, during which time Ali was stripped of his title. During his exile, Ali had angered the NoI by announcing his wish to return to boxing if this was ever possible. Elijah, the supreme minister, denounced Ali for playing “the white man’s games of civilisation”. He meant sports.

Other evaluations of sport were gathering force. The black power-inspired protests of John Carlos and Tommie Smith at the 1968 Olympics, combined with the anti-apartheid movement in South Africa, had made clear that sport could be used to amplify the experiences of black people the world over. While Ali was a bête noire for many whites and indeed blacks, several civil rights leaders, sports performers and entertainers came out publicly in his defence. He was hailed as their champion.

Given the growing respect he was afforded, he was seen as an influential figure. Ali’s moves were monitored by government intelligence organizations; his conversations were wiretapped. But the mood of the times was changing: he was widely regarded as a martyr by the by-then formidable anti-war movement and practically anyone who felt affinity with civil rights.

His years of exile over, he returned to boxing. But the prospect of a smooth transition back to the title was dashed in March 1971 by Joe Frazier (see picture above), who had taken the title in Ali’s absence and defended it with unexpected tenacity in a contest that started one of the most virulent rivalries in sport. Ali had called Frazier a “white man’s champion” and declared: “Any black man who’s for Joe Frazier is a traitor.” Ali lost once to Frazier and beat him twice over the following years, every fight viciously fought.

Ali had to wait until 1974 before getting another chance at the world title. By this time, Ali, at 32, was not favoured; in fact, many feared for his well being against the hitherto unbeaten George Foreman. The fight in Zaire became immortalised as “The Rumble in the Jungle” and Ali emerged again as champion.

In June 1979, Ali announced his retirement from boxing. At 37, he appeared to have made a graceful exit when he moved to Los Angeles with his third wife, Veronica, whom he had married two years before. His first marriage lasted less than a year, ending in 1966; Ali married again in 1967, again in 1977 and then, in 1986, to his fourth and final wife, Yolanda Williams.

Hauser estimates Ali’s career earnings to 1979 to be “tens of millions of dollars”. Yet, on his retirement, Ali was not wealthy.

Within 15 months of his retirement, Ali returned to the ring, his principal motivation being money. He also made several poor business investments and, while prolonging his sports career seemed suicidal, he managed one more fight, again ending in defeat. He was 39 and had fought 61 times.

In 1984, he disappointed his supporters when he nominally supported Ronald Reagan’s re-election bid. He also endorsed George Bush in 1988. The Republican Party’s policies, particularly in regard to affirmative action programs, were widely seen as detrimental to the interests of African Americans and Ali’s actions were, for many, tantamount to a betrayal.

London Olympics 2012: Ali as global icon. Owen Humphreys/PA Wire

Ali’s public appearances gave substance to stories of his ill health. By 1987, he was the subject of much medical interest. Slurred speech and uncoordinated bodily movements gave rise to several theories about his condition, which was ultimately revealed as Parkinson’s syndrome. His public appearances became rarer and he became Hauser’s “benign venerated figure.”

Over a period of five decades, Ali excited a variety of responses: admiration and respect, but also condemnation. At different points in his life, he drew the adulation of young people committed to peace, civil rights and black power; and the anger of those pursuing social integration.

Ali engaged with the central issues that preoccupied America: race and war. But it would be remiss to understand him as a symbol of social healing; much of his mission was to expose and, perhaps, to deepen divisions. He preached peace, yet aligned himself with a movement that sanctioned racial separation and the subordination of women. He accepted a role with the liberal Democratic administration of Jimmy Carter, yet later sided with reactionaries, Reagan and Bush. He advocated black pride, yet disparaged and dehumanised fellow blacks. He taught the importance of self-determination, yet allowed himself to be sucked into so many doubtful business deals that he was forced to prolong his career to the point where his dignity was effaced. Like any towering symbol, he had very human contradictions.

Ellis Cashmore discusses reactions to his new book with his commissioning editor at Bloomsbury, Katie Gallof.

Katie Gallof: Well, your new book on Elizabeth Taylor is provoking some reaction, isn’t it? It seems you’ve captivated some reviewers, and infuriated others. Liz Smith, in particular, has moved from the first response to the second. What goes on here?

Ellis Cashmore: First let me introduce Liz Smith, @LizSmth, who, in all probability, doesn’t need much of an introduction. She’s the most experienced and arguably most respected society journalist in the world and, even in her nineties, files an influential column called New York Social Diary in which she chronicles the lives of celebrities. To call her a gossip columnist – which I do in the book – is really like describing the Sistine Chapel as a church. She is the doyenne of celebrity journalists.

KG: She was a friend of Elizabeth Taylor, right?

EC: Absolutely. A confidante too, I would surmise. Certainly, Liz Smith covered Elizabeth Taylor’s career in depth and for a period of time that qualifies her to comment authoritatively on virtually any aspect of her life.

KG: And your book is, of course, about Taylor’s life, but also the cultural changes she both lived through and, in her way, instigated.

EC: Yes, my argument is that Taylor ushered in what we now call celebrity culture: audiences were as fascinated by her private life as they were by her dramatic performances and she was adept at manipulating the media in a way that suited her own ends perfectly. In a genuine sense, she helped cultivate our appetite for scandal, particularly with her tempestuous romance with Richard Burton. We take this for granted now, of course. But La Liz, as Liz Smith calls her, was the first Hollywood star to capture fans in this way. Incidentally, Liz Smith wrote about Taylor and Burton: “They trusted me and eventually I became the only journalist who could get to them.”

KG: So what did Liz Smith think about your book?

EC: In her column New York Social Diary, she offered her view that I “intelligently and dramatically” address the changing status of fame, specifically how Taylor benefited from scandals that would have ruined lesser stars, whether Taylor deliberately started those scandals, if she delighted in or squirmed from the global fame she acquired and how she turned her fame to her own purposes. In a lovely phrase, Liz Smith notes my analysis of “How she [Taylor] made mythology out of her travails and happiness.” You can imagine how thrilled I was when she concluded: “I found myself agreeing with most of his conclusions, perhaps because I myself had come to believe, and had written those same conclusions, over the many, many years I knew and had unprecedented access to the star of stars.”

KG: Praise indeed from someone who has been writing about the stars for at least four decades. I understand she launched her renowned New York Daily News column in 1976.

EC: Yes. In fact, she implicitly invited me to contact her for further information when she wrote that her input could have “made his good book better.” I don’t doubt this.

KG: So what’s changed?

EC: Three days later in another New York Social Diary column, Liz Smith wrote that the more she thought about my book’s references to her, the more “pissed-off” she became. Naturally, it wasn’t my intention to upset her and I don’t think there was any inaccuracy in my account. But I recorded how she was present at many pivotal events in Taylor’s career and was closer to her than any other journalist. This led some writers to assume she lost some objectivity and became too chummy. This wasn’t my criticism: in fact, it came from Ann Gerhart, who, in 1993, wrote critically after Liz Smith had emceed a press conference at which Taylor introduced her range of fragrances: “Now, the veteran gossip columnist is a celebrity in her own right, by virtue of her years of access and hefty salary, and many times she has hosted various functions to raise money for charity. But a journalist serving as a flack, helping an interview subject hustle a commercial venture, that’s something entirely different and smacked, to us, of ethics violations.”

KG: That was certainly a stinging censure.

EC: It was. Though, in a sense, journalists can, and indeed have to, become familiar, if not friendly, with their subjects. Remember, Gerhart’s remarks were 23 years ago. Today, we consumers expect journalists to provide insider accounts of the most personal details of celebrities’ private lives. This is not sycophancy, and Liz Smith was ahead of her time in this respect. I know she grumbles that many critics have given her “bitchy write-ups,” but I’m hoping she doesn’t include me. In writing the book, I’ve tried to be analytical and detached.

KG: I notice that, at the end of the book, you include her in the roll of influential individuals who, in their own way, shaped Taylor and, in turn, the world in which she lived.

EC: Indeed I do. The whole book is as much about the times of Elizabeth Taylor as about her life. She was inseparable from her cultural context and, of course, Liz Smith was part of that context. I quote her poignant phrase after Taylor died: “She was only 79, but had lived a thousand years, had fired up and exhausted endless fantasies for herself and the millions who watched her.”

Katie Gallof is Bloomsbury’s Senior Commissioning Editor for Film and Media Studies. She’s based in New York. katie.gallof@bloomsbury.com @BloomsburyMedia

Q: A contagious disease, the threat of violence, insanitary conditions, a constitutional crisis, and now … a doping crisis! All in a day’s work for the organizers of the Rio de Janeiro Olympics, eh?

A: Yes: every summer Olympics has its share of problems in the lead-up to the tournament, but they’re usually about getting the stadiums built in time, or completing the transport links. For Rio, these are minor problems: they have much more serious crises to avert. Do you want me to go through them?

A: Cataclysm might be overstating it a bit, but the Zika virus certainly has the potential to develop into a global pandemic. Zika is the virus spread by mosquitoes — those pesky little long-legged flies with a taste for human blood. Aedes aegypti is the name of a species of mozzie that carries the Zika virus, and if one bites a pregnant woman, her baby could develop a devastating birth defect. This has already happened in Rio. The danger is that, if some of the expected 500,000 visitors to the Olympics get bitten and then return home, the virus goes with them. The European Centre for Disease Prevention and Control says the mosquito type has recently been reported in Madeira, the Netherlands and the north-eastern Black Sea coast. You can bet that, after the Olympics, it will be in many, many other places too.

Q: So this is potentially a huge public health risk. Who in their right mind would go to a part of the world where this kind of mosquito thrives?

A: The Olympics has great pulling power, and not only for audiences. Athletes train for four years and, when they finally get the chance to compete in the most prestigious tournament in the world, they will run a cost-benefit calculation through their minds and decide it’s a risk worth taking.

A: And I find his arguments compelling. I was in a discussion with him recently and agree with his findings. But I don’t think the Olympic organizers will listen. As always, money overpowers everything, including public health considerations.

Q: How much money are we talking about?

A: Well, let’s start with the sponsors’ money. The International Olympic Committee, or just IOC for short, has about 30 global corporations in its team of commercial “partners,” as it likes to call them. These include Samsung, Visa and Omega, as well as the ever-present pair, Coca-Cola and McDonald’s. They pay for the rights to use the Olympic rings logo, advertise themselves as Olympic sponsors and generally associate themselves with the Olympic brand. Because of the different levels and lengths of contracts, I can only estimate the value of sponsorships for this particular tournament, but I don’t think $1 billion would be wide of the mark. And I know we tend to use the word billion as we used million a decade ago. But remember: a billion is a thousand times more than a million.

Q: A thousand million American dollars? That’s £691,488,810. Serious money!

A: Actually, it gets more serious. The media deals are enormously complex because they’re often structured over several Olympic cycles, and there are subcontractors who buy the broadcast rights to whole territories and then sell on to individual broadcasters. The IOC has one particularly lucrative contract with NBC television, worth $7.5 billion, which stretches to 2032. But for this single Olympic games, the overall value of media contracts is, I’d say, slightly north of $4.1 billion.

Q: Why so much?

A: Advertising. The 2012 London Olympics was broadcast to 115 different countries, reaching an audience of 3.8 billion homes. That’s a formidable reach, and very, very few televised events can claim such a fantastic demographic. Football’s World Cup is one of them, of course. So, if you’re an advertiser and you want to show your products to the biggest possible consumer market, then you advertise during the Olympics. And TV and radio companies charge you more. So they make money. The IOC charges a lot in the confident expectation that broadcasters will cough up, secure in the knowledge that they can charge advertisers a premium. The USA’s NBC charges about $100,000 per 30-second slot and has already taken $1 billion in advertising. The rate is dwarfed by those attached to some sporting events, like the Super Bowl, but, of course, the Olympics lasts over two weeks. So a cancellation at this late stage would create pandemonium for both sponsors and broadcasters.

Q: But surely the huge corporations tied up with the Olympics are insured against a cancellation or some other kind of catastrophe.

A: Definitely. But imagine the brand damage: the Olympics is a popular portal for advertising and marketing because of its connotations: health, wholesomeness, purity, virtue — squeaky-cleanliness. Public health disasters are not part of the brand profile.

Q: Which brings me to the other potential problems. I was reading that the Brazilian footballer Rivaldo had warned prospective travellers to stay away from Rio. He thinks they will be exposing themselves to violence.

A: I’m always skeptical about these kinds of warnings. Every big city in the world carries its own menace: cities are, almost by definition, places where rich and poor live side-by-side. Well, perhaps not side-by-side: there are affluent and impoverished areas of most cities. Rio is no different. Of course, there are dangerous parts, and most clued-up travellers will give them a wide berth. All the same, when someone like Rivaldo reckons Brazil is getting “more ugly,” I guess we should take notice. You might expect Brazilian athletes to support the Games and encourage fans from everywhere to flock to Rio. He’s warning them off. Add to this the report that Rio’s Olympic waterways are rife with pathogens — bacteria that can cause disease — and that corruption is rife, and you come up with the picture of a country that is not quite fit-for-purpose as an Olympic host. As a matter of fact, Rio and the Olympics make Qatar and football’s World Cup look like a match made in heaven!

Q: Almost inevitably there’s been an ominous doping scandal, the difference this time being that this one has arrived before rather than during or after the Games.

A: Let me recap: Russia is already suspended from the Olympics and it will petition to have its suspension lifted before the start. Its case is now being considered. Kenya has also been mentioned, though nothing has materialized thus far. Russia is known to have had a state-sponsored doping programme. Kenya was recently declared “non-compliant” with the World Anti-Doping Agency’s rules. It would be a major blow if either or both nations were excluded from the Games for drugs violations. Russia was fourth in the 2012 medals table, and Kenya is the preeminent force in middle- and long-distance running. And it gets worse: dozens of athletes expecting to compete in Rio de Janeiro could be barred from the Games. The International Olympic Committee announced that it had retested urine samples taken at the Beijing Olympics of 2008 and would retest more from the 2012 tournament. The intention is presumably to strip those who tested positive of their medals and, if they planned to compete in Brazil, ban them.

Q: Hang on, I’m not quite getting this. The athletes at Beijing and London were tested in 2008 and 2012 respectively and, we presume, came out clean and so kept their medals. How can the testers change their minds now and declare them “cheats”? It seems to go against the entire ethos of sports. I mean, there’s a contest, an outcome and winners are declared. OK, we know dope-testing can take a few days. But eight years? This means that every single medallist at Rio keeps the medal conditionally and, if at some future unspecified time, their sample turns up a banned substance, their medal could be annulled.

A: That’s it. Every result at Rio will be provisional. And it will remain provisional for ever. The testing equipment available now will detect some substances. But athletes are intelligent enough to realize that, if they intend to enhance their athletic performance, they don’t want to use substances that will be detected. That’s why they use designer drugs, these being drugs that are synthetically made to escape detection. It’s possible that at some point in the future, the testers will catch up — as they apparently have with some of the substances used in 2008 and 2012 — but, there’s also a better-than-even chance that they’ll never devise tests sophisticated enough to catch them.

Q: All the same, this has to be an unsatisfactory state of affairs. It means that the 78,838 fans at the Estádio do Maracanã (pictured above) plus the 3 billion+ tv audiences will be watching events in which the results will be inconclusive and always subject to change.

A: Correct. But try to think of the Olympic Games less as a sporting tournament and more as a spectacular exhibition — a showcase for the world’s seventh-biggest economy. Between August 5 and 21, there will be plenty of competition, but there’ll also be the grand opening and closing ceremonies and two-and-a-half weeks of the most intensive marketing imaginable. The Chariots of Fire bolted long ago.

Q: Prince (pictured above) is the latest in an incredible series of celebrity deaths this year. The unexpected death of David Bowie was an ominous start to 2016. Since then we’ve lost Harry Potter star Alan Rickman, Pulitzer Prize-winning novelist Harper Lee, Glenn Frey of the Eagles, and renowned singer Natalie Cole, among others.

A: And don’t forget Lemmy, of Motörhead, who died last December. These are all people who have, in some way, helped shape all of our lives. The impact of some, particularly Bowie, has been substantial. It’s hard to imagine anyone between the ages of, say, 50 and 70 who hasn’t been affected by him. The response to his death was one of those great “outpourings,” as we now call them, like the one that followed the death of Princess Diana in 1997.

A: Not so publicly. Nowadays, there’s an exhibitionist quality about our grieving. We feel almost a sense of obligation, as if we’re participating in a ritual. There’s nothing wrong or artificial about it: it’s just part of a more generic cultural shift towards expressing everything, including our innermost feelings. It reminds us that even the personal is actually social.

Q: I’m not sure exactly what you mean by that, but I assume you’re hinting that the emotions we presume are instinctive states of mind and distinguishable from our outward expressions are not as private as we think.

A: That’s pretty much it, yes. Everything we are is made possible by our participation in society.

Q: OK, let me push you towards answering a more specific question about the meaning of celebrity deaths.

A: Here’s the thing: celebrities today are not like the Hollywood stars of the 1940s or 1950s or, earlier, the great political, military or even religious leaders, all of whom stood above us on pedestals. We put them there, of course; but we were comfortable looking up to them — as if they were godlike creatures, untouchable and inaccessible. Today, celebs are just like us: we communicate with them via Twitter and Instagram, we learn their “secrets,” we invest part of our own lives in theirs. In sum, we treat them as ordinary human beings, except they are in the media. We might respect some of them; others we might just like; still others we might hate. As long as they somehow elicit a reaction from us, we follow them. To use today’s term, they engage us. That’s all a celebrity needs to do.

Q: And when they die, they remind us that they’re just flesh and blood like the rest of us, right?

A: You’re ahead of me. That’s exactly right: death is the ultimate reminder of mortality. We don’t wish our celebrities to be dead, of course; but we are macabrely reassured by their passing.

Q: I guess illness functions similarly.

A: Yes. As you know, my recent book Elizabeth Taylor: A Private Life for Public Consumption approaches the star as a harbinger — a person or thing that announces or signals the approach of another era, in Taylor’s case celebrity culture. Throughout her life, she was bedeviled by serious illness. Everyone knew this because every bout of sickness was generously covered by the media. She became ill publicly. As the consummate celebrity, Taylor knew exactly how to use this to her advantage: she manipulated the media perfectly in a way designed to squeeze the maximum amount of sympathy from her public. She actually advised her friend Michael Jackson that he could exploit his own illnesses. My point is that, when we hear of the ill health of celebrities, it is, again, one of those reminders that they’re just as susceptible to sickness as anyone else. And we find that comforting. It sounds perverse, but that’s just one of a number of perversities in celebrity culture.