Tuesday, 26 April 2011

One of the most interesting books I ever read is The Handmaid's Tale
by Margaret Atwood. It's a novel about a dystopian future America
(called Gilead) in which women are categorised by their value as
reproductive objects. The story focuses on Wives, the Handmaids who are
their husbands' concubines, and the Jezebels and Unwomen who cannot be
integrated into this new, fundamentalist society.

One thing the book touches on is the overlap of far-right and far-left
ideals which results in the oppression of women in Gilead. People in the
middle, who had no particular investment or opinion either way, got
caught in the resulting military dictatorship. They probably approved of
some of the early stages without looking into the motives of the people
behind them, and implicitly endorsed a future they probably didn't want
to live in.

Reproduction in Gilead is regulated by the idea that sex is inherently
degrading to women. The book references a past (our present) where
feminists teamed up with conservatives in campaigns against pornography.
The consequences of this alliance, however, only empowered feminism's
worst enemies. Descriptions of the narrator's feminist mother burning
books - then being sent to labour camps as an Unwoman - show feminism
allying itself with the religious right, then being discarded by those
'allies'.
(When I read this as a teenager it was powerful food for thought. Also,
it was kind of nice to read a sci-fi book told exclusively from a
woman's perspective, by an authentic female voice. A lot of sci-fi has
too much allegory about it for my taste, and the women all end up as
traitors or queens. It was refreshing to read a book that had a point to
make, but made it with the voice of someone who did not know what the
'right' or 'correct' thing was, or have a particular moral agenda.
Offred, the narrator, is in many ways only a vessel. Anyway.)

Handmaids in the film of the novel, watched over by blue-suited Wives.

Silly as it seems, the book has greatly influenced how I interpret, well, loads of stuff. Take, for instance, the irony of a parent's recent complaint against The Handmaid's Tale being
taught at his son's school, because it is "rife with brutality towards
and mistreatment of women" and contravenes the school's policies of
respect and tolerance.

Wow. Just wow. That is Not Getting It on so many levels, you hardly know where to begin.

And yet... it is only degrees away from a lot of the arguments against
adult entertainment, against sex work. It's hard not to feel defensive
about sex work when it seems like just about everyone hates you. The
right can't decide if you need to be in prison or saved; the left,
whether you need to be in a shelter or an 'exiting' programme. There are
few accepted stories for sex workers other than Criminal or Victim.

The more closely you look at the key players behind some of the stories
popping up lately - particularly ones about trafficking or sexualisation
- the more you notice some odd pairings. A group working closely with
the anti-gay, anti-abortion US lobbying group Family Research Council
using a female MP as the mouthpiece of their opinions on the internet
and porn. The well-known UK feminists lending their names to
international groups with questionable agendas.

There are so many ways to use women outside of sexual commerce. What is
the more damaging - selling a service, or not realising you're selling
out?

There's a saying where I come from: you got to dance with the one who brung you.
I wonder when everyone gets to the end of their dance cards, what
promises they've made and what obligations they'll have to honour.

Friday, 15 April 2011

While various areas of sex work have little in common apart from the
'sex' bit, increasingly they are lumped together in the eyes of the
public, government, and media as something that is affecting society
more than before and needs attention now.

The reasons for this are numerous. One particular influence is the rise of what is known as the Rescue Industry, an umbrella term coined by Laura Agustin
to cover people not in the flesh trade, who nevertheless profit from
attempting to end sex work of all kinds. Did I say "profit"? Yes, I sure
did.

Issues such as trafficking, sex work, and pornography are hot topics for
people who claim their main motivation is to help those involved. Help
is a great thing. There are loads of people who could all use a little
help, in all professions and walks of life. But when does the reasonable
goal of helping others cross the line into infantilising others... and
helping yourself?

Cynical? Maybe a little. On one hand many of the people concerned about
the welfare of sex workers are no doubt motivated by a genuine desire to
help others. Particularly those they think of as unable to defend
themselves. But the flipside of this concern is that everyone needs
money to survive. As other charities have discovered in the past,
sometimes the desire to have a high profile and keep the wheels greased overtakes the benefit to the people you were trying to help.

Charities aside - and, let it be said, there are many worthy and honest
ones - there are also the academics, researchers, and writers who earn
their living not through hands-on effort, but by writing papers. Papers
which allow them to win grants. Grants so that they can write more
papers.

This is a world I know well from my time as a cancer research academic.
We can't all save lives. But we do all have to earn a crust. Still,
sometimes the ratio of money available to size of the problem seems far
out of whack. You do start to wonder how much of what is said and
written is born from genuine concern, and how much is just chasing
another year's salary.

Is there enough money in it to even bother making this criticism? Well,
thanks to a little tool that compares the money from funding grants over
time, we can make a rough guess of what it's worth. For instance,
funding for studying trafficking is enormous - in 2009, it was funded
worldwide to the tune of nearly a billion US dollars. This is a total greater than the amount of grant money awarded to study lung cancer,
which of course, is also devastating, and affects far more people. And
spending on trafficking since 2000 has dwarfed the grant awards on such
important international health concerns as malnutrition, malaria, or tuberculosis - conditions that kill millions of people worldwide every year, and affect hundreds of millions more.

Another way in which opposing sex work brings financial benefit is
through the Proceeds of Crime Act 2002. Police know, for instance, that
since running a brothel is illegal, if a brothel owner is prosecuted,
any money and property retrieved from the 'crime scene' becomes theirs.
When police resources are limited, does the temptation of profit
influence whether victimless crimes are prosecuted more vigorously
than they otherwise would be? Hard to know for sure. It's a handy little
coincidence, the pre-Olympic crackdown on brothels and the recent cuts
in police funding, isn't it? You can read more about the criticisms of such crackdowns in the Grauniad.

Hanna Morris, who ran a brothel, lost her abuse of process case against the police. She rang 999 when masked and armed gunmen threatened her business...
only to find herself arrested, and the violent criminals never pursued
or apprehended. It's impossible to know for certain, but one can imagine
plenty of situations in which police - with restricted time and money -
must make choices: unknown violent criminals who may be difficult and
expensive to catch, or women technically breaking the law standing right
in front of you, with cash assets?

The outcome of the Hanna Morris case certainly sends a message, but I'm not
convinced it's the message of 'protecting women' that some people prefer
to promote.

Monday, 11 April 2011

There is an excellent article by investigative journalist Nick Davies, a
primer on how the UK trafficking numbers were blown out of all
proportion: Anatomy of a moral panic. (For commentary about US-based numbers, here's a blog about the topic.)

To summarise Davies's article, a paper which estimated a relatively
small number - between 142 and 1,420 trafficked sex workers in the UK per year -
was misreported and misinterpreted, ending up with people claiming
4,000 (or even as many as 80,000!) trafficked women entering the UK sex
trade every year.

Part of the problem with these kinds of numbers is that while they're
very, very wrong, they are also difficult to disprove. And the idea,
once it enters mainstream media, is difficult to dislodge, even with
facts.

In many social research fields, exact numbers can be hard to come by.
Even seemingly straightforward calculations are fraught with error.
Let's take a simple example. Imagine you were trying to count the number
of people living in the world (and that you, like Father Christmas, are
able to get to every household in an unfeasibly short amount of time).
It would be a hard job. By the time the count was finished, loads of
those counted would have died, and even more would have been born. An
actual number that represents the real number of living people on earth
at any one time? It's impossible. So, the world population is actually
an estimate made based on some facts known about the countries of the
world, their last population estimate, and their birth and death rates.

Making this kind of an estimate is a “Fermi problem”.
Enrico Fermi, one of the physicists who worked on the Manhattan
Project, was reputedly able to make accurate guesses based on limited
information.

Here’s an example of a Fermi problem in action. I was at a pub quiz one
week, and our team was tied for the lead. The tiebreaker was the
question “How many performances did Yul Brynner have as the King of Siam
in The King and I on Broadway?” As the only former drama geek in our team, it came down to me.

I calculated that Brynner probably did 8 performances a week ("once a
day and twice on Sundays", as the saying goes). It’s a full-time job, so
minus a two-week holiday, Brynner was probably performing 50 weeks a
year. I wasn’t sure how many years it ran but knew he had been in at
least one revival of the popular musical, so let’s say ten years of
being the King total. That makes an estimate of:

8 shows a week x 50 weeks a year x 10 years = 4,000 shows

Sounds pretty high, right? We won the tiebreak (and the quiz) because,
as it turned out, the real answer is 4,525. I was off by over 10%, which
would be terrible for science, but was good enough for the quiz. The
other team guessed 600... way too low. Picking a number out of thin air,
as the other team probably did, is fraught with error. It’s hard to
make good guesses with no information. Apply a few basic assumptions,
however, and your accuracy goes up rapidly.
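The estimate above is easy to sketch in a few lines of Python; the inputs are the same guesses from the story, assumptions rather than data:

```python
# Fermi estimate of Brynner's total performances, built from the
# guesses in the text (assumptions, not data):
shows_per_week = 8    # standard Broadway schedule
weeks_per_year = 50   # full-time job minus a two-week holiday
years_as_king = 10    # original run plus at least one revival

estimate = shows_per_week * weeks_per_year * years_as_king
actual = 4525  # the quizmaster's answer

relative_error = abs(actual - estimate) / actual
print(estimate)                        # 4000
print(round(relative_error * 100, 1))  # 11.6 -- off by just over 10%
```

Three rough assumptions, multiplied together, land within 12% of the truth; that's the whole trick.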

Fermi problems are great for pub quizzes, less so for evidence-based
reporting. Common-or-garden estimates are not the stuff on which good
research is built. At the very least, applying a set of assumptions to
estimate a number should meet two major criteria:

1. The assumptions must have some foundation in reality. Eight Broadway performances a week is reasonable; 80 wouldn’t be.

2. The method of calculation needs to be explained. If an
assumption turns out to be wrong, the calculation can then be adjusted. I
don’t think the other person on my team would have bought 4,000 as an
answer if he hadn’t seen my reasoning.

What does this have to do with the trafficking estimates?

The people who claim there are thousands, or even tens of thousands, of
sex slaves in Britain are claiming an unrealistically high number. So
unrealistic in fact that if it were true, that would mean the vast
majority of prostitution in Britain was undertaken by trafficked people.
That violates the first principle - basis in reality. Some people
involved in sex work have encountered people who may have been
trafficked; the vast majority have not. So either there's a whole other
sex industry going on that no one in the sex industry knows about, or...
the calculations are wrong.

Part of the difficulty with fighting such unrealistic claims, however,
is getting good estimates to counter them. There is no comprehensive UK
mapping of sex workers, much less trafficked ones, but there are some
estimates. As part of the European Network for HIV/STD Prevention in
Prostitution (EUROPAP), Hilary Kinnell contacted projects providing
services for sex workers. [pdf]
She had 17 responses. The average number of prostitutes per project was
665. She then multiplied that figure by 120, the total number of
projects on her mailing list, to get an estimate of 79,800. This total
includes women, men, and transgender women and men sex workers in the
UK.
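Kinnell's scale-up is simple arithmetic; here it is as a quick Python sanity check, using the figures quoted above:

```python
# Kinnell's EUROPAP scale-up: mean sex workers per responding project,
# multiplied by the total number of projects on the mailing list.
responses = 17               # projects that replied
mean_per_project = 665       # average across those 17 responses
projects_on_list = 120       # the whole mailing list

uk_estimate = mean_per_project * projects_on_list
print(uk_estimate)  # 79800
```

Note the leap baked in: the 17 responders are assumed to be representative of all 120 projects, which is exactly the weakness Kinnell herself points out below.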

Kinnell notes there are obvious problems with this particular Fermi
problem: the centres responding might be larger than most, some sex
workers might use more than one centre. She finds it strange that the
number - ten years old, a huge estimate, and taken out of context - is still
quoted. "The figure was picked up by all kinds of people and quoted with
great confidence but I was never myself at all confident about it. I
felt it could be higher, but it also could have been lower."

Meanwhile data from the UK Network of Sex Work Projects (UKNSWP) records
an estimate of 17,081 sex workers in some kind of contact with centres.
Of these 4,178 - about 24% - work on the street. A larger total for all
sex workers was 48,393. More recent, and rather lower, than the 1999
estimate. So if the trafficking hype is correct, that would make
anywhere from one in 12 to as high as one in 2 sex workers in the UK the
victim of trafficking.
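Those proportions can be checked in a couple of lines; all inputs are the figures from the text, with 4,000 being the low end of the inflated trafficking claims (the 80,000 claim would exceed the entire estimated workforce):

```python
# Sanity-checking the proportions quoted above, using figures from
# the text. The 4,000 is the lower of the inflated trafficking claims.
in_contact = 17_081      # UKNSWP: sex workers in contact with projects
street_based = 4_178     # of whom work on the street
all_workers = 48_393     # larger UKNSWP estimate for all UK sex workers
claimed_trafficked = 4_000

street_share = street_based / in_contact
one_in_n = all_workers / claimed_trafficked

print(round(street_share * 100))  # 24 -- about 24% work on the street
print(round(one_in_n))            # 12 -- one in 12 would be trafficked
```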

Let's go back to the paper which kicked this all off, the one that
estimated a range of 142 to 1,420 trafficked sex workers in the UK. Now,
a note about that number: it included not only women who were
trafficked against their will, but also women who willingly arrived
(perhaps illegally) to the UK for sex work. In other words, Kelly and
Regan’s total included both unwilling and willing sex migrants.

Part of the problem is how different groups define “trafficked”. Some
assume that if someone is not British and is working in the sex trade,
she must be trafficked. That’s quite a leap in logic! Hold on a sec - I
was born abroad. And I worked in the sex trade. Does that mean they
count me as "trafficked"? WTF?

The Poppy Project reported in 2004 that 80% of prostitutes in London
flats were foreign-born. But there is no evidence that those women were
trafficked or that this high proportion of foreign sex workers to
natives is true of the entire UK. (In fact, evidence puts the UK-wide
proportion closer to 37%.)

‘Foreign-born’ also includes citizens of other EU countries, who have
the automatic right to live and work in the UK. Eaves, the organisation
that includes the Poppy Project, did an interesting nip-and-tuck on
reporting the origins of women working in the sex trade in London. In
their 2004 report Sex In the City [pdf],
they claimed 25% of women working in London were from Eastern Europe.
But look closer - they have classified Italy and Greece as “Eastern
European” countries.

Why? Well, the reason given is “because these ethnicities are often
used to code women from the Balkan region, advised by pimps and
traffickers to lie about their ethnicity to avoid immigration issues.”
Hey, my dad is Italian... if I said this to a researcher, does that mean
they would assume I'm really Eastern European? That violates the second
principle of the realistic estimate: show your work clearly. It’s the
kind of sloppy calculation that throws all subsequent conclusions into
question. It's bad Fermi.

So if some people who come here voluntarily can be erroneously called
“trafficked,” then what is “trafficking”, exactly? The Palermo Protocol
to Prevent, Suppress and Punish Trafficking in Persons Especially Women
and Children, part of the 2000 UN Convention against Transnational
Organized Crime, defines ‘trafficking’ as

…the threat or use of force or other forms of coercion, of
abduction, of fraud, of deception, of the abuse of power or of a
position of vulnerability or of the giving or receiving of payments or
benefits to achieve the consent of a person having control over another
person, for the purpose of exploitation.

In other words, illegal migration for purposes of economic advantage, if undertaken willingly, is not trafficking. If nothing else, it's worth remembering the excellent analogy offered by Charlie Glickman:

Sex work is to Trafficking as Consensual Sex is to Rape.

Just because rape exists (and is rightly both reviled and illegal), that
doesn't mean banning sex would solve any problems. What Glickman's
statement encapsulates so brilliantly is that while trafficking occurs
within sex work, that in and of itself is no good reason to either
equate the two, or to ban sex work. Pumping up the trafficking numbers
might be great for getting media attention, but it does nothing to solve
the real problems of people who are really trafficked.

Claiming huge numbers of trafficked sex slaves where they do not exist
distracts attention and resources from the (far smaller) number of
people forced into sex who genuinely need assistance. And I for one
think inflating a problem is not only unethical, it's dangerous to real
victims. Let's get our terminology right, at the very least. Let's start
with realistic research and maybe someday we'll get realistic results.

Monday, 4 April 2011

There's a lot to be said on the subject of trafficking in
general, and in the UK in particular. First off I'd like to link two
excellent articles on the topic by investigative journalist Nick Davies
as a primer on how the trafficking numbers get blown out of all
proportion, and the result: Anatomy of a moral panic, and Inquiry fails to find single trafficker.

For some, the word trafficking evokes scenes of beaten and smuggled women a la The Wire.
For others, the image most strongly associated with trafficking is that
of the drowned Chinese immigrants of the Morecambe Bay disaster. And
for a special few, trafficking invokes the opportunity to attach their
agenda to international events... particularly sporting events.

So what's the connection between high-profile sporting events and trafficking - is there even evidence for a connection at all?

One hardly ever sees mention of prostitution anymore where human sex
trafficking is not also invoked. It's bizarre, this assumption that the
vast majority of men are not only paying for sex, but willing to pay for
sex with unwilling partners. Says a lot about what the people making
these assumptions think of men, I guess.

“Aids and HIV warning to South Africa World Cup fans” featured prominently on the BBC website
in the run-up to the 2010 World Cup. The warnings were widespread, not
only in the UK, but all over the world. They implied that with the
upcoming football tournament, not only were prostitutes preying on
innocent fans of footy, but pimps and smugglers were ramping up the
trade in sex slavery as well.

According to reports seeded by social work groups and charities, some
40,000 prostitutes were set to arrive in South Africa, many of them
trafficked – coincidentally, the identical number that had been
predicted (but never materialised) for Germany’s World Cup in 2006. Expect to see similar, if not identical, numbers "projected" in advance of the 2012 London Olympics.

With the expected number of fans going to the World Cup in South Africa
estimated at 450,000, that just doesn’t pass the sniff test. One working
girl for every 11 people at the World Cup? Wow. That’s
hospitality provision on a level Premier League teams’ Christmas parties
would envy! And tying these numbers together with anti-trafficking
efforts, well, that's powerful stuff. It's basically saying that out of
any coachload of supporters turning up to watch the matches, several
would have been paying to have sex with unwilling partners smuggled in
to South Africa.

Now, I've never been a big football fan myself, but implying that a
large percentage of them are active endorsers of sex slavery? Stretches
the bounds of credibility juuuuust a little bit too far for me.

As it happens, the claim about widespread sex tourism was refuted several months later
when a UN Population Fund report showed sex workers’ activity didn’t go
up at all. Prostitution was not affected. Neither was trafficking.

But the propaganda machine continues apace.

Early 2011 saw reports of the tens of thousands of women who were
“expected” to be trafficked into Dallas for the Super Bowl. The
projected numbers were identical to those supposed to have been
trafficked for the World Cup in South Africa, the Ryder Cup in Wales,
the 2006 World Cup in Germany, the 2004 Olympics in Athens, and the 2000
Olympics in Sydney. In every one of these examples, the projections
have neither been supported by evidence beforehand nor proven to have
happened afterwards. And yet the usual suspects keep trotting out the
same stories and the same numbers anyway. With the London Olympics
on their way, conferences and fundraising events are already popping up
to ‘raise awareness’ of trafficking issues.

Where does this come from? The consistency points to a well-organised
and well-funded campaign to keep bringing the same arguments around
again, hogging column inches while the reality goes largely unreported.

Several agendas are involved, without doubt. But one in particular is a
strategy devised by Hunt Alternatives Fund to make sure sex for money is
presented as badly as possible. And they use celebrities and writers of
headlines to do so.

But who exactly are Hunt Alternatives Fund, and what's their agenda? As ever, that's a topic for another time.

Friday, 1 April 2011

The causes of rape are not well understood. If they were, it would be
easier to fight them, since we would know how to apply resources. The
fact that rape is such a difficult, under-reported, under-investigated,
and under-prosecuted problem indicates that we really don't know all
that much about its causes.

Because the numbers involved are relatively small, the fluctuations in
rate could be influenced by any number of things not actually to do with
rape per se. After all, the number of reports could change even
when the rate of rape stays the same. There could be subtle reasons why.
A sympathetic and approachable officer in a particular area, for
instance. Availability of crisis support and hotlines. Changes in, or
absence of, these things. It's not only hard to say - it's impossible.
Far easier, and what this series of posts has sought to do, is to sieve
out what does not cause rape, so as better to focus on the real job at hand.

And of course, being small numbers... sometimes a fluctuation is just a fluctuation.

Better evidence collection and better prosecution might help. But we
also need to think hard about preventing rape, not just punishing it.
When someone claims a cause that is not a real cause, this can derail
the real struggle against violence. If the focus is on lap dancing, in
spite of the fact that it has no connection with rape, it is potentially
diverting resources from preventing and investigating the real causes
of crime.

It’s because rape is such a serious crime that researchers must be at
least as rigorous in their analysis as they would with other serious
events like cancer. Otherwise, it’s not real analysis. It’s throwing
numbers around without context. It’s producing reports that look and
feel like real research without the methodology to back them up. It is
cargo cult science.

To avoid becoming cargo cult scientists, Richard Feynman said
researchers must be willing to question their results, and investigate
possible flaws in a theory. Researchers should pursue a level of honesty
that is rare in everyday life. "We've learned from experience that the
truth will come out,” Feynman said. “Other experimenters will repeat
your experiment and find out whether you were wrong or right. Nature's
phenomena will agree or they'll disagree with your theory. And, although
you may gain some temporary fame and excitement, you will not gain a
good reputation as a scientist if you haven't tried to be very careful
in this kind of work. And it's this type of integrity, this kind of care
not to fool yourself, that is missing to a large extent in much of the
research in Cargo Cult Science."

Both cargo cult science and real science look similar on the outside to a
layperson, in that they contain numbers and try to come to some sort of
conclusion. Even if the Lilith report had managed to get its
calculations correct in the first place, there still would have been
plenty of clues that while the look and feel of real research was being
imitated, the content wasn’t up to scratch.

1. It doesn’t calculate a rate. Rates are the bread
and butter of incidence statistics, and a written-in-stone requirement
of any report dealing with a population group. How do I know? Because I
used to write papers reporting children's cancer rates. No rate = no
paper. If one year’s incidence is being compared to another, expect to
see rates, not raw numbers.

2. It doesn’t show a long-term trend. In the Lilith report, a
small number of years were reported. Rapes before the lap dancing clubs
weren’t shown, so they couldn’t be compared. Rapes more than two years
after weren’t shown, so it was impossible to see if the trend was real.

3. It doesn’t use a control group. Control groups, when it comes to population statistics like these, are hard.
I get it. There's no Truman Show bubble world kept somewhere for us to
compare everything to. But as we say where I come from, hard cheese. You
make do. Mention was made in the report of other boroughs (such as
Islington) which have lap dancing, but crimes in areas of London without
lap dancing were not even mentioned so no comparison could be made. The
rest of the country was not considered.

4. It makes a causal connection without direct evidence for a cause, and doesn’t consider other factors. Statisticians
talk about “confounders” – the other factors that can affect your
results. On the basis of a short-term miscalculated trend, a cause and
effect relationship is claimed between lap dancing and rape. However,
this does not take into account the types of rapes reported, any
possible correlation with crime hotspots within the borough, or any
other possible contributing factors. Again, I know from personal
experience this shit is hard. But that's no reason not to make an effort.
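To make criterion 1 concrete, here's a minimal sketch of why a rate beats a raw count; the borough figures are invented purely for illustration:

```python
# Why a rate beats a raw count: the bigger borough has more raw cases,
# but the smaller borough has the higher incidence. All figures here
# are invented for illustration only.
def rate_per_100k(cases: int, population: int) -> float:
    """Incidence rate per 100,000 population."""
    return cases / population * 100_000

cases_a, pop_a = 90, 220_000   # hypothetical Borough A
cases_b, pop_b = 60, 110_000   # hypothetical Borough B

print(round(rate_per_100k(cases_a, pop_a), 1))  # 40.9 per 100,000
print(round(rate_per_100k(cases_b, pop_b), 1))  # 54.5 per 100,000
```

Borough A "wins" on raw numbers (90 vs 60), but once you divide by population, B's rate is a third higher. Compare raw counts across years or boroughs of different sizes and you'll draw exactly the wrong conclusion.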

The lack of statistical rigour in the report is far from the Lilith paper's only problem, however.

A pervasive feature of poor research is that it often starts from an
assumed position, and any data falling outside of that position are
ignored. The writers come to the study with a bias and look to find ways
for the numbers to fit with their preconceived notions of what the
truth should be rather than what it actually is.

We can see this on the very first page of the Lilith report with statements like:

“This ‘fast fantasy’ approach is demeaning and insulting to women…”

“Lap dance… [is] not going away without a fight.”

“[I]f Camden were to change
its policy on lap dance and striptease establishments, then this good
practice could spread through London.”

It’s clear from the outset that the writers of the report have a
particular agenda – prohibiting adult entertainment. Which is fine,
since everyone's entitled to a say in what happens in their communities.

I don't object to opinions. Think lap dancing is a sin? Great, that's
fine for you. Think it's oppressing women? Great, I look forward to your
paper. What gets my goat is invoking a semblance of statistical
analysis. I'm a (former) statistician, yo. You're on my turf now.
Everyone is entitled to an opinion and also entitled to express it. But
if the writer of any scientific research were so openly biased from the
beginning, there is no chance the report would be accepted by a
reputable journal.

Claiming the methods of science, without buying in to the philosophy of how and why they work, is unethical. If you don’t play by the same rules, you can’t use the same tools.

The tone of the report is so attached to its assumptions that it does not address several other theoretical problems.

The report focuses on the difference in rapes between 1999 and 2002.
However, in its first paragraph, the report states that lap dancing
‘arrived in Britain in 1997 with the opening of Secrets in Hammersmith’.
So why pick and choose statistics starting two years later? If the
opening of lap dancing clubs had an impact, wouldn’t you expect the
impact to be evident reasonably soon afterwards?

Actually, you wouldn’t – not because someone has proven that the lag
time between opening a strip club and increase in rape is 2 years, but
because no one has conclusively proven there is any link between the two
at all. So the choice of year can be completely arbitrary and it does
not matter. Strip clubs are not correlated with rapes in any credible
study.

In fact, the question of what effect adult services have on local crime
has been studied so thoroughly that there can now be studies of the
studies, or what statisticians call “meta-analysis”. A meta-analysis, or
pooled analysis, combines the results of published studies by many
different groups in order to arrive at an overall conclusion.
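For the curious, the simplest flavour is a fixed-effect (inverse-variance) pooled estimate: each study's result is weighted by one over its variance, so more precise studies count for more. A minimal sketch, with study numbers invented for illustration:

```python
# Minimal fixed-effect meta-analysis: weight each study's effect
# estimate by 1/variance, so precise studies dominate the pooled
# result. The three "studies" below are invented for illustration.
def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)
    return pooled, pooled_variance

# Three hypothetical studies of some effect size (e.g. a log rate ratio):
effects = [0.10, -0.05, 0.02]
variances = [0.04, 0.01, 0.02]

estimate, variance = pooled_effect(effects, variances)
print(round(estimate, 4), round(variance, 4))  # -0.0086 0.0057
```

The pooled estimate hugs the middle study because it has the smallest variance, which is the whole point: one noisy outlier can't drag the conclusion around on its own.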

A meta-analysis examined 110 papers that claimed adult businesses increased crime rates. 73%
of these were records of political discussions, not actual studies.
Removing these and anecdotal reports about only one crime incident, the
authors were left with 29 studies. Of the papers that did not contain
flaws, there was no correlation between any adult-oriented business and
any negative effect. Of the ten most frequently cited papers, not one
met the minimum standards for good research – comparable controls,
sufficient time, and valid data collection.

So while many people might be tempted to believe dodgy statistics
because they sound like something that “should” be true, the analysis
shows no demonstrable link between adult entertainment and crime. The
idea that adult businesses have negative fallout for communities is a
myth that should be put to bed for good.

Another problem with the original paper is the lack of an appropriate
control population to compare the results with. Having a control
population is particularly important in assessing risk. Controls are
populations where the thing you want to test – strip clubs, in the case
of the Lilith report – doesn’t exist, for comparison purposes.

For instance, in order to suggest a link between smoking and lung
cancer, the original epidemiologists had to examine lung cancer rates
not only in smokers, but also in non-smokers. You need to show that the
factor being examined - smoking, or in our case, lap dance clubs - is
the influencing factor.

Lack of a control group means that the numbers of rapes in Camden were
not reported against the rape stats in a non-lapdancing area. It’s
perfectly possible that the trends happening in Camden were happening
everywhere, regardless of whether there was lap dancing or not. It’s
impossible to know from the Lilith study if such other parts of London
were experiencing similar trends in their crime rates.

The report makes comparisons between Camden, Westminster and Islington,
all of which contain lap dancing clubs. As far as control populations
go, that’s no good: you need somewhere where it doesn’t happen. Kind of
like a placebo group in a medical trial.

So let’s run the statistics using Camden, one of the other areas they
picked which does have strip clubs, and an additional borough that has
none at all. Because crime can be influenced by factors such as poverty,
it would preferably be of a similar demographic profile. Then an
assessment of the occurrence of rape in that area can be made, for
comparison’s sake. Without doing this, it’s impossible to say whether
any trend was locally concentrated or happening everywhere regardless of
strip clubs.

Lambeth has a somewhat larger population than Camden and similar makeup
in terms of ethnic origin. It contains no lap dancing clubs at all.
Islington has a somewhat smaller population than the other two boroughs
and has two venues licensed for fully nude lapdancing. And since these
statistics are also available for the entire country, let’s throw that
in too. After all, the original claim was that Camden's rape stats were three times the national average.

Comprehensive statistics are available for crimes reported to police
throughout England and Wales, so these are straightforward to find.

I shan't bore you with another table, though of course, those numbers
are available (both from me and from the Metropolitan Police) if you're
interested. It pains me to leave one out, because I love tables like a
fat kid loves cake. But one woman's cakey feast is another's sugar rush
nightmare, so. Let's skip straight to the graph. Again, the years
covered by the Lilith paper for Camden are highlighted in red:

The graph shows that adding a comparison changes the picture considerably.
It no longer appears that lap-dancing clubs lead to an increase in
rape, since boroughs with fewer or no clubs had consistently higher
rates than Camden’s. The data from the original study is shown to be a
small blip in a larger – downward – trend all over London.

If there was a relationship between the number of lap dancing clubs and
the occurrence of rape, you would expect Lambeth to be lowest of the
three because it has no clubs. Islington would be higher because it has a
couple, and Camden highest because it has more than those other
boroughs. But Camden turns out to be the lowest of the three. There does
not appear to be any relationship between the number of lap dancing
clubs in a borough and the risk of rape.

The trend for the three London boroughs shows clearly that Lambeth (with
no lap dancing) and Islington (with only 2 clubs) both have rates that
are higher than Camden’s. All three have decreased over time, as well,
which is why it pays to look at the longer trend rather than
cherry-picking a few years in statistics. Apart from the early 2000s
peak, Camden’s numbers are close to the overall rate for England and
Wales, and are sometimes even below it. This is a far cry from the
“three times the national average” claimed by the Lilith report.

All things considered, you might wonder why the Lilith report chose to
look at Camden at all. According to the introduction, it was because
“Lilith and Eaves believe that Camden’s opinion and acts carry great
weight with other London boroughs.” From the analytical point of view
(especially as no references or other reasons are given), that doesn’t
make sense.

If we were to take this graph as our only evidence, we might conclude
that the risk of rape goes up not because of the presence of lap dancing
clubs, but because of living in London, with Camden actually safer in
that regard than other boroughs. We might also be tempted to conclude that
the presence of lap dancing clubs in fact indicates a safer borough in
terms of rape.

Naturally, that would be a very rash conclusion, something a responsible
statistician would be reluctant to suggest. It would require far more
data from the rest of London and the entire nation before such an idea
could be entertained. But that’s the point – in order to make a conclusion
about the effects of social phenomena in general, you need a huge
amount of information to back it up. One limited study of a crime
statistic is not enough and should never be allowed to stand on its own.

Interestingly enough, there are other places where the opening of lap
dancing clubs also seems to correspond with a reduction in rape and
assaults. One of these is Newquay, in Cornwall.

In 2010, local paper Newquay Voice obtained Devon and Cornwall Constabulary’s figures of sexual assaults.
They found that the total number of recorded sexual assaults (including
rapes) in and around Newquay peaked at 71 in 2005, the year before
Newquay's first lap dance club opened. In 2006, the year following its
opening, the number fell to 51.

In 2007, when the town’s second lap dancing venue opened, the total
number of recorded sexual assaults fell again to 41, then dropped to 27
in 2008 when a third lap dancing club opened. In 2009, the number rose
slightly, but with a total of 33 offences, it is still less than half
the total from before the clubs appeared. Here are the incidence rate
calculations (using midyear population levels for the council of
Restormel, where Newquay is located):
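The year-on-year swings in the raw counts just quoted can be checked with a few lines of Python (this is only the raw counts, not the incidence rates, which also need Restormel's mid-year populations):

```python
# Recorded sexual assaults in and around Newquay, as quoted above
counts = {2005: 71, 2006: 51, 2007: 41, 2008: 27, 2009: 33}
years = sorted(counts)
for prev, cur in zip(years, years[1:]):
    change = (counts[cur] - counts[prev]) / counts[prev] * 100
    print(f"{prev}-{cur}: {change:+.0f}%")
```

That shows falls of roughly 28%, 20% and 34% in the years after each club opened, followed by the 22% rise in 2009.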

Again, this is only a single example – to conclusively demonstrate that
an increase in lap dancing corresponds with a decrease in rape and
sexual assault, there would have to be many more such results, over longer
time periods, from many places. However, it does reinforce the same thing
the statistics from Camden show: that lap dancing definitely does not
correlate with higher occurrence of rape. And if there is no rise in
rape, then it is impossible to claim that lap dancing “causes” rape.

Rape is widely thought to be a vastly under-reported crime. The
calculations don’t tell us whether rapes were under-reported for the
area in any particular year, nor what might cause that.

What it does tell us is that the original claim made in the Lilith
report – that the number of reported rapes is rising – is not true. It
was not true in 2003, it continues not to be true, yet the myth that
rapes rose 50% after lap dancing clubs opened in Camden is still being
reported, even as recently as August 2009.

Even more important than correcting the errors, as outlined in the
previous posts, is looking at the longer-term trends. Rapes might go up
one year, or two, or three… and they might fall the next. There are
natural fluctuations that can mask the overall trend. The more data we
have to analyse, the more accurate the results. The more accurate the
results, the more informed the reporting.

A problem common to dealing with small numbers is making a hasty
generalisation. This is a fallacy that happens when someone makes a
large conclusion based on a small sample of evidence, such as an initial
result that disappears later, when more data are collected.

Here's an example: Let’s say you’d never been to York before, and went
there for a five-minute visit while changing trains. Let’s also say that
while getting off the train you saw exactly three people – all of them
with red hair. It would be a hasty generalisation to then go around
saying that everyone in York is ginger. And yet, given the very small
number of observations, saying so (while obviously not true) would not
be a contradiction of the evidence you collected.
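A simulation makes the same point. Suppose (these numbers are entirely invented for illustration) that only 10% of the city's residents actually had red hair; a random three-person glimpse would still occasionally show nothing but redheads:

```python
import random

random.seed(42)
# Invented city: 10 redheads in every 100 residents
population = ["ginger"] * 10 + ["other"] * 90

# How often does a random three-person glimpse contain ONLY redheads?
trials = 100_000
all_ginger = sum(
    all(p == "ginger" for p in random.sample(population, 3))
    for _ in range(trials)
)
print(all_ginger / trials)  # small, but not zero
```

The exact probability is (10/100) × (9/99) × (8/98), about 0.07% – rare, but any unlucky observer who hit it would have "evidence" for a very wrong generalisation.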

Small numbers are a problem in statistics, because the less information
we have, the less we can reliably say about it. Dealing with this problem
means having to collect more evidence where available, making pertinent
comparisons, and applying more than just simple arithmetic. Reported
rapes are relatively rare, so writing about rape statistics requires
special attention.

Now, just because a crime is rare doesn’t mean it isn’t serious. Rape is
extremely serious. No matter how many people are raped, it’s too many.
One rape in the course of a year would be a tragedy; 72 is obviously a
big problem. However, regardless of the fact that rape is a horrific
crime, it’s also not very commonly documented. By comparison, the rate
of breast cancer among women in the UK hovers around 120 per 100,000 per
year, or more than three times higher than the rate of reported rapes
in Camden in 1999.

It’s important to also find out whether the rate was a one-off, or
whether the rise implied in the Lilith report was sustained. So let’s
calculate rates, but this time for a longer timespan. We know that
between 1999 and 2000 the rate of reported rapes in Camden rose. But did
the trend continue? Have a look at the results:

The change in rates fluctuates a lot on a year-to-year basis! Surprised?
Actually, that’s another feature of dealing with small numbers. Because
the event is uncommon, a few incidents either way have more power to
change the trend. Which is why percentage change for a couple of years,
even if a lot different from what was originally reported, is not a good
indication of what is really happening. (Or as I like to say, more
years equals more better!)

But without the trend, the door is left open for people to misinterpret
the statistics in a way that could be sensationalist and scary. As an
example, let’s say there was 1 death due to vending machines falling
over in Glasgow one year, and then 2 the next. Irresponsible reporting
might say “Vending machine deaths double in one year!” Technically, that’s
true - but it misses the spirit of what is really going on. It makes
people think the risk of being squashed by a vending machine is going
through the roof, when in fact there aren’t many at all… and there might
be fewer next year.
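A quick sketch shows just how violently percentages swing at these scales (the counts are the hypothetical ones from the example above, plus an invented third year):

```python
# Hypothetical yearly vending-machine deaths in Glasgow
deaths = [1, 2, 1]
for prev, cur in zip(deaths, deaths[1:]):
    print(f"{(cur - prev) / prev * 100:+.0f}%")  # +100%, then -50%
```

One extra incident produces a "doubling"; one fewer, a "halving". Neither headline tells you anything about risk.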

If we graph the rates, we can see if the trend is rising, falling, or
staying the same. The years covered in the Lilith study are highlighted
in red:

For the ten years 1999-2008, it appears the trend for rate of reported rapes in Camden is actually falling, not rising.

Let's look in depth at one year's change in rape statistics in Camden.
In 1999, the Metropolitan police recorded 72 reports of rape. In 2000,
the number was 88. The Met numbers are available to the public, so they
can’t be disputed. And those numbers went up. This much the Lilith report got
right. But is that all there is to the story?

The problem with numbers on their own is they don't say anything about context. The number may rise from year to year, but if the population is going up as well, the rate might not be changing at all.

Imagine, for instance, if a paper claimed London has 1000% more Chinese
restaurants than it did 40 years ago, but didn't report the relative
populations for those years. You wouldn't think much of the numbers. Of
course the raw number would have gone up - the population got a lot
bigger from 1970 to 2010. Without context, the numbers don't mean very
much.

When the population grows, you have to take that change into account.
What you need is not just the raw number of crimes reported, but also
the population of the area from one year to the next. This is used to
calculate not the number of crimes, but the rate. Rate and number are
two different things, but many people (even those who should know
better) use them interchangeably, and this creates confusion.

You don't have to be a London native (or even a Daily Mail reader) to
know the population is going up. It's on the rise in Camden. But is it
going up enough to make the rate of rapes look different from the number? Let's see.

Whenever numbers of incidents are reported, they should be used to
calculate the rate of occurrence. This gives you an estimate of how many
times the crime occurred per 100,000 population. So let’s look at those
rape numbers again. For the year 1999, we have 72 rapes reported in
Camden and - according to National Statistics - a population of 195,700
people.

To determine how many rapes occurred per 100,000 residents, we divide
the number of rapes by the total population. Then we multiply by
100,000:

72 ÷ 195,700 × 100,000 = 36.8

This tells us that in 1999, there were 36.8 reported rapes for every
100,000 residents of Camden. Performing the same rate calculation for
2000, when the population was 202,800 and the number of rapes 88, gives
us a rate of 43.4.
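Sketched in Python, using the figures quoted above, the calculation looks like this:

```python
# Incidence rate per 100,000 residents
def rate_per_100k(incidents, population):
    return incidents / population * 100_000

# Camden: reported rapes and mid-year population for each year
print(round(rate_per_100k(72, 195_700), 1))  # 1999: 36.8
print(round(rate_per_100k(88, 202_800), 1))  # 2000: 43.4
```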

Mathematically calculating the change in rate from one year to the next
gives us the percent change, be it a rise or a fall. The change in rate
from 1999 to 2000, or the change from 36.8 to 43.4, is 17.9%.
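The same percent-change arithmetic, as a one-function Python sketch:

```python
# Percent change between two rates (or any two numbers)
def pct_change(old, new):
    return (new - old) / old * 100

print(round(pct_change(36.8, 43.4), 1))  # 17.9
```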

That is considerably different from 50%. So the rate (which is what
counts) of rapes in Camden did not go up by 50% after the lap-dancing
clubs opened. If you include the even more modest increases in 2001 and
2002, you still come up with a result that is nowhere close to the
Lilith report’s original claim. The combined change from 1999 to 2002 is
a rate increase of 26.9% - in other words, about half of what was
originally reported.

So not only did the media take six years to correct the error in the
Lilith report, they didn’t even get it right the second time around. But
the story doesn’t end there...

The borough of Camden in north London is a vibrant and diverse quarter
of the city. From the Bloomsbury of feminist hero Virginia Woolf to the
leafy expanse of Hampstead Heath, it embraces a colourful past and
present. In the modern iconography of London, Camden Lock is as famous
for its nightlife as Kentish Town and Chalk Farm are for their music
venues. At night the area comes alive, host to almost 2,000 pubs, 130
licensed entertainment venues, and seven lap dancing clubs.

To the uninitiated the Spearmint Rhino may look like any of the handful
of similar establishments in the area, but it has been the epicentre of
controversy since its opening. Not only was it one of the first clubs
granted an all-nude licence in the 1990s, it also paved the way for
similar clubs across London and the rest of the UK.

Spearmint Rhino was notable not only for full nudity but also for its
style. It gained a reputation for having a less seedy atmosphere than
previous clubs in Soho.

Comfortable leather chairs curl around the customers of Britain's first
all-nude strip club. The topless dancers at Stringfellows were modest
titillation by comparison. Spearmint Rhino’s arrival signalled a new era
of adult entertainment in the capital.

Customers responded by making lap dancing the talk of London. 'Table
dancing has moved into the mainstream,' wrote Ben Flanagan in the
Observer. 'The clubs, previously perceived as sleazy and hostile, are
now seen as ideal venues for a corporate night out or a bit of
celebrity-spotting.'

So when a 2003 study of the impact of lap dancing clubs in London
reported a 50% rise in rapes in surrounding areas, people were aghast.
Even worse, the number of rapes was claimed to be three times the
national average. As a statistic, it sounded shocking, but it also had
the ring of truth to it. Lap dancing was as controversial as it was
popular. News outlets all over the UK reported the results as evidence of
why the UK should not give in to the creeping infestation of
‘high-street’ lap dancing chains. But was this claim actually true?

The "Report on Lap Dancing and Striptease in the Borough of Camden" [pdf]
was produced by Lilith R&D, part of the Eaves charity founded to
support homeless and vulnerable women. The stated aim of Lilith,
according to its website, is “to eliminate all aspects of violence
against women”. A very worthy ideal, and an important issue. But the
intentions of the authors don’t make the relationship between their
stated concern (violence) and the subject of the paper (lap dancing) any
more reliable than anyone else’s. And there are a large number of
problems with the report, simply from the statistical standpoint, that
undermine any such connection.

The first flaw in the report is the lack of connection between the
outcome (rape) and the supposed cause (lap dancing). In well-conducted
studies, you expect the researchers to show some connection between the
thing being studied and the outcome being measured. Otherwise, what you
have is a case of ‘correlation is not the same as causation’. In other
words, just because two things happened at the same time doesn’t make
them related.

The complete lack of cited research about stripping causing sex crimes
is unsurprising, because no such results exist. A lot of reports have
claimed the two are related, but repeated studies from many fields have
all failed to connect them – I'll discuss this more later.

The next flaw is an evident unfamiliarity with calculating reliable statistics.

According to the Lilith report, rapes in Camden had been on the rise
since 1999 and showed no signs of dropping. The number reported in 1999
in the borough was 72 rapes. By 2000 it was 88, 2001 had 91 and for
2002, the number of reported rapes in Camden was 96. As far as the
report was concerned, the numbers spoke for themselves.

Only, there's a bit of a problem with their maths. And here’s where the
evidence for the paper being more cargo cult than reliable research
starts to show through.

If you look at only the numbers themselves, the difference from 1999 to
2002 is the difference from 72 to 96. That’s a difference of 24 rapes,
which is only a 33% increase – not the 50% originally claimed. A pretty
basic error in mathematics, and one that was surprisingly resistant to
being corrected. It was only six years later, in early 2009, that the Guardian reported this elementary miscalculation. The original claim of 50% is still widely reported without being corrected, however.
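The arithmetic is simple enough to check in a couple of lines of Python, using the counts from the report:

```python
# Raw-count change from 72 reported rapes (1999) to 96 (2002)
increase = (96 - 72) / 72 * 100
print(round(increase))  # 33 - not the 50 originally claimed
```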

But actually, it turns out the increase wasn’t even as high as 33%. In
the next part, I'll discuss what it means to calculate rates using the
changes in population to make valid comparisons.