Cameras "Predict" Crimes

The £7,000 device, nicknamed "the Bug", consists of a ring of eight cameras scanning in all directions. It uses software to detect whether anybody is walking or loitering in a way that marks them out from the crowd. A ninth camera then zooms in to follow them if it thinks they are behaving suspiciously.

[...]

"The camera picks up on unusual movement, zooms in on someone and gathers evidence from a face and clothing, acting as a 24-hour operator without someone having to be there," said Jason Butler, head of CCTV at Luton borough council. "We have kids with Asbos telling us they hate the thing because it follows them wherever they go."

This is interesting. It moves us further along the continuum into thoughtcrimes, but as near as I can tell, the system just collects evidence on people it thinks suspicious, just in case. Assuming the data is erased immediately afterward, it's much less invasive than actually accosting someone for thoughtcrime; the cost of false alarms is minimal.

I doubt it works nearly as well as the article claims, but that's likely to change in 5 to 10 years. For example, there's a lot of research being done in the area of microfacial expressions to detect lying and other thoughts. This is the sort of technological advance that we need to be talking about in terms of security, privacy, and liberty.

Comments

"I doubt it works nearly as well as the article claims, but that's likely to change in 5 to 10 years"

I don't think so. Really. We seem to have this false belief that the "truth" is there to be found. Take a good criminal or a messed-up teenager: they are not lying, because they believe it.

Further, the problem I have with these systems is that the data will *not* be deleted. And what about probable cause? Getting caught on one of these cameras might then become sufficient for probable cause of committing a crime.

I think we just need to get over the fact that sometimes bad things happen. We are just too afraid.

Being one of those unfortunates who live and work around London (UK) and are therefore likely to be caught on CCTV around 300-400 times a day, I have ambivalent thoughts about the system.

First off, I predict that it will have some well-publicised early successes and then become steadily less and less useful (the norm for all CCTV security systems).

The upside, I guess, is that it will probably need fewer CCTV operators, who normally follow their prejudices (e.g. the young kid with the hood over his head, or the blonde woman wearing a short skirt).

As for the criminals, they will learn fairly quickly how to avoid getting caught by the system, so the only thing to have changed is the profit margin of another snake-oil security company...

"Stuart Thompson, managing director of Viseum, conceded that the camera might zero in on an innocent member of the public, but he denied it was intrusive, claiming that the innocent had nothing to fear."

The problem is not really the cameras themselves. I wouldn't really mind such a system very much if it were used in the way described.

The issue is more that by buying and installing these systems, we do two things. First, we voluntarily set up an infrastructure that can be abused by some power-hungry politician, or by an ordinary criminal (e.g. for blackmail); often they are the same person. There are not even laws regulating maximum storage times, minimum storage security, etc., that would contain the possible damage somewhat.

Secondly, we condition everyone that constant surveillance is ok. This is probably the greater cost overall, since it degrades the stability of a (free) society.

And greg is right, how long before someone is told in court: "So if it wasn't you, why did the Sentinel® pick you out of the crowd 2 minutes before?"

As for the technical feasibility: These systems are meant to defeat (i.e., pick out and record) small-time crooks, and given the state of the art, they will work pretty well even in crowds in 5 years or so. Against knowledgeable opponents, they have no chance, not now and not in 10 years.

"This system would spot if someone was behaving strangely: for example, by constantly changing direction or going up and down stairs."

Which sounds exactly like the behaviour pattern of someone waiting for a date. Nice.

Of course all the critics of CCTV in the UK fail to realise that as our criminals routinely wear striped shirts, face masks and carry large sacks marked SWAG they are relatively easy to spot by camera, leaving the police free to carry on with their vital work of shooting people who look a bit foreign.

Even if it works, and nobody cares about "innocent till proven guilty", there is still another problem: it targets only low-level crime/security problems. Corruption, fraud, and other white-collar crimes - often much more damaging to society - go undetected yet again.

Cool. Now, once we learn what it considers suspicious, we can create diversions, and they will end up missing the real evidence.

Seriously, consider the soldiers (I think it was Marines) in Iraq who kidnapped a suspect and used their knowledge of the surveillance, and its limitations, to set up the suspect and kill him in a staged firefight.

"Hmm... to me, it sounds more like just attempting to automate looking out for someone who is acting 'hinky'. In that case, wouldn't this be a good thing?"

This is actually a really good point. At the point where these systems are as good as a well-trained security guard looking for suspicious behavior, they will have real security value. Then, two questions remain: what are the false alarm rates, and is the data on innocents stored?

"Then, two questions remain: what are the false alarm rates, and is the data on innocents stored?"

The answer to the first is not really relevant, provided it is less than the available resources can handle.

The second you already know the answer to in the U.K., which is:

"Yes, and for as long as resources allow."

This is based on the simple premise that you don't know when you have missed something that might be of use in the future. It is the same reasoning used for keeping DNA profiles and fingerprints on file even though the person has not committed a crime.

How on Earth can one test a system like this? It's relatively easy to make a system that will notice strange behavior, but how can the designer know what behavior actually correlates with crime?

It was inevitable that people would build AIs to sift through mountains of surveillance data. This twist of having an AI decide whether to gather the images of a particular passer-by in the first place is just temporary, until the memory and optics get good enough for it to record detailed video of everyone.

@bruce: "Then, two questions remain: what are the false alarm rates, and is the data on innocents stored?"

Two words: facial recognition, which will turn at some point (if not already; I'm no expert in this field) into facial fingerprinting. Whether your actual image is stored is one thing, but once your face is fingerprinted, tracking your movements and recording them over long periods of time becomes much more achievable. There will almost certainly be ways of tying this in with formal identification: ATMs or anywhere automated payment is made, tollways, etc.

Obviously there are problems - growing or removing facial hair, glasses, hats, etc. - but this won't stop people trying. Given we need photo ID for so many things these days (car licence, boat licence, gun licence, volunteer service identification, corporate employment, etc.), how long before these are tied together and facial fingerprinting is the precursor rather than a by-product? You'll be allowed to shave your mo or grow a beard, but it will be illegal not to have your photo ID updated, and accordingly your facial fingerprint...
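
The tracking-by-fingerprint idea above can be sketched in a few lines. Assume a hypothetical system that has already reduced each face to a fixed-length embedding vector; "fingerprint matching" is then just nearest-neighbour search by cosine similarity. All names and numbers here are invented for illustration, not any real product's API:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_fingerprint(probe, gallery, threshold=0.9):
    """Return the ID of the best-matching enrolled face, or None.

    `gallery` maps person-ID -> embedding vector. A real system would
    use an approximate nearest-neighbour index, not a linear scan.
    """
    best_id, best_score = None, threshold
    for person_id, emb in gallery.items():
        score = cosine_similarity(probe, emb)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy demonstration with 3-dimensional "embeddings".
gallery = {"alice": [1.0, 0.0, 0.1], "bob": [0.0, 1.0, 0.2]}
print(match_fingerprint([0.9, 0.1, 0.1], gallery))  # closest match: alice
```

Once such IDs are tied to payment terminals or toll gates, as the comment suggests, each match becomes a time-stamped location record, which is what makes long-term movement tracking cheap.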

Re: attempting to automate looking out for someone who is acting 'hinky'

@ Bruce

> A the point where these systems are as good as a well-trained security guard looking
> for suspicious behavior, they will have real security value. Then, two questions
> remain: what are the false alarm rates, and is the data on innocents stored?

Ah, you forgot the third question: what is the actual escalation process when an alarm is triggered?

Does the camera contain a loudspeaker that shouts out, "DANGER, WILL ROBINSON"? Does the camera call the local police? What is their response time? When they arrive, can they easily identify the source of the alarm? What do they do? Do they arrest everyone, just to be sure? Does triggering a camera alarm give them probable cause to search everyone in the area? Just the suspect? Will the police now interrogate the suspect? Will this turn into "the camera says you're guilty, therefore I will treat you like a convicted felon instead of a suspected citizen"? By the time these things have that fine-grained analytical ability, we may have a robotic police force. Does the ED-209 summarily execute the suspect?

False alarm rates are critical, but as we know, the false alarm rates for polygraphs are pretty high, and yet they are still treated as authoritative evidence by a lot of people.

I worked for the company who developed Viseum (Caederus in Swansea) until about 3 years ago.

As I understand it, development has continued since then on an ad-hoc basis with one part-time developer, so I doubt the system has changed all that much since I worked there.

The system was designed to track "objects" between certain speeds - a job it did pretty well (sometimes getting confused if people went behind lamp-posts, etc.). I'm guessing that it's been extended to simply note these objects as "more interesting" when these tracked objects stop for a certain length of time. The system tries to follow the most interesting object it can see for as long as possible, or until another object becomes more interesting.
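
That "most interesting object" logic can be sketched as follows. This is a guess at the behaviour described above, not the actual Viseum code; the speed band, the stop-time weighting, and all the numbers are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """One tracked object: recent speeds (m/s) and time spent stationary."""
    object_id: int
    speeds: list = field(default_factory=list)
    seconds_stopped: float = 0.0

def interest_score(track, min_speed=0.3, max_speed=3.0, stop_weight=2.0):
    """Invented heuristic: objects moving in the tracked speed band score 1;
    time spent stopped adds weight, mimicking the 'more interesting when a
    tracked object stops for a certain length of time' rule."""
    if not track.speeds:
        return 0.0
    avg = sum(track.speeds) / len(track.speeds)
    in_band = 1.0 if min_speed <= avg <= max_speed else 0.0
    return in_band + stop_weight * track.seconds_stopped

def most_interesting(tracks):
    """The PTZ camera follows whichever object currently scores highest;
    a new high scorer steals the camera, as described above."""
    return max(tracks, key=interest_score, default=None)

walker = Track(1, speeds=[1.2, 1.4, 1.3])
loiterer = Track(2, speeds=[1.0, 0.2, 0.0], seconds_stopped=40.0)
print(most_interesting([walker, loiterer]).object_id)  # prints 2, the loiterer
```

The lamp-post confusion mentioned above would show up here as a track being dropped and a new object ID assigned when the person reappears.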

"...near as I can tell, the system just collects evidence on people it thinks suspicious, just in case. Assuming the data is erased immediately afterward, it's much less invasive than actually accosting someone for thoughtcrime; the cost of false alarms is minimal."

That was certainly the case when I was there. Data was recorded for 28 days then discarded.

I could imagine a demonstration of this system. A few hundred shills walk through the field of view, then one of them acts hinky, and the system zooms in on him, to the amazement and wonder of the invited guests. "Ta-da, there it is folks. Now, start shoveling money our way."

In a real-world application, it will be a different story. Suppose a thousand people are in the field of view, and 100 of them fire off the 'hinky' detector. Which 99 will be spared the closer inspection? The luck of the draw means that a malefactor is almost guaranteed to be spared.

If the sensitivity is reduced to keep the false alarms down to a tolerable level, will this help?

No, it will not. Criminal behaviors are statistical rarities. For counts, figure one act per person per second in real time. Almost every single act will then be someone doing something law-abiding, like walking the way they're going, or standing while waiting for the bus, or window shopping.

Out of a thousand people in view, how many will actually be pickpockets? Very few, if any. Most of what a pickpocket does is expertly blend in with the crowd while surveilling it for his next victim. The actual criminal acts -- the pick and the pass (to the partner) -- will be barely detectable to all but the trained eye, and will be over in an instant.

Watching for exceedingly rare events guarantees that virtually all positives will be false positives, and the occasional acts of interest will usually be false negatives -- undetected.
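
The base-rate point above can be made concrete with Bayes' rule. The numbers below are invented for illustration: say 1 pickpocket per 1,000 people in view, a detector that flags 90% of pickpockets, and a 10% false-alarm rate on everyone else:

```python
def positive_predictive_value(prevalence, sensitivity, false_alarm_rate):
    """Fraction of flagged people who are actually offenders (Bayes' rule)."""
    true_pos = prevalence * sensitivity          # offenders correctly flagged
    false_pos = (1 - prevalence) * false_alarm_rate  # innocents flagged
    return true_pos / (true_pos + false_pos)

# 1 offender per 1,000 people; 90% detection rate; 10% false-alarm rate.
ppv = positive_predictive_value(0.001, 0.9, 0.1)
print(f"{ppv:.4f}")  # prints 0.0089: fewer than 1 in 100 flags is a real offender
```

So out of a thousand people, roughly 100 innocents get flagged for every 0.9 pickpockets, exactly the "virtually all positives will be false positives" outcome described above; even a tenfold-better false-alarm rate still leaves flagged innocents outnumbering flagged offenders ten to one.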

Suppose a shooting occurs out of view, and the shooter, along with dozens of people chasing after him, comes into the field of view. Which runner will the system select for closer scrutiny?

Or make it a few dozen gang members chasing after their intended victim. Which runner will the system select for closer scrutiny?

That said, the system would be very good at detecting and tracking a drunk stumbling around an empty parking lot at 3 a.m. But so would a cheap commercial VCR setup.

@Roy: You asked "Which runner will the system select for closer scrutiny?"

If everyone was running in a group it'd capture the whole group. If they split into separate groups then it'd decide which was most interesting (based on a load of configurable settings - size, speed, entering areas that had been designated as being "watch" areas etc.) and pick one to track. If in the meantime the other group became more interesting it would switch to that group instead.

Of course, it captures not only from the PTZ camera, but also from all the fixed reference cameras around it. These were fixed wide-angle, so they wouldn't capture much in the way of detail.

@Mike: You said "Of course this couldn't possibly be a tourist who is a bit lost, or anything like that... could it?"

Yes, it certainly could, and it'd probably record that lost tourist too. But remember that this (as it stands) captures pictures which can be viewed at a later date *if* something happens and then gets reported to the police, or similar. At that point there's a timeframe to search through, and it'll have captured all the suspicious-looking events.

If nothing happens, that footage will fall by the wayside and get deleted after a pre-determined amount of time.

So, you'd rather have them profile based on behavioral attributes than on anything else, because a terrorist is not an ethnicity, it's a behavioral set determined by intent to act in a certain way.

But you'd rather not have surveillance that is based on a heuristic of pattern-violation because we're bordering on "thoughtcrime"? Acting in an unusual fashion is most often not criminal. Acting criminally is criminal. However, criminal behavior is aberrant. Notice the lack of enforcement inherent in replacing guards with cameras. Notice the emphasis on behavioral profiling, with a system that is programmed to detect pattern violations, not skin-color violations. If you're going to have an observation system prone to abuse, it should at least be one based on good heuristics. Punish the authorities for abusing the information, not for gathering information based on good analysis tactics.

It sure could. We work in a part of the city (Toronto) that is near the waterfront, near the theatre district, near downtown, near a major residential area, near the world's tallest tower, and near a 50,000 seat stadium. I've occasionally surprised co-workers by offering to help tourists find their way -- before even the tourists had figured out they were lost.

In every case, they were behaving differently than both people who work in the area and tourists who were not lost.

(This part of Toronto can be particularly hard to navigate for the inexperienced tourist because it requires navigating in three dimensions. I've seen people lost when they were literally on top of where they wanted to go.)

You don't need cameras, just a whole lot of computing power.

"In fact, a crime mapping and forecasting system is already in the alpha-test stage in two American cities. With funding from the Justice Department, computer scientist Andreas Olligschlaeger, criminologist Jacqueline Cohen, and I amassed individual reports from police departments in Pittsburgh and in Rochester, New York."

Source: Wired, "Cloudy, With a Chance of Theft", http://www.wired.com/wired/archive/11.09/view.html

They were able to predict monthly criminal activity before it happened, with 80 percent accuracy. A good start. It makes cameras look like toys.

Spider listed some very peculiar ASBOs (the British version of the restraining order). Such orders can get quite creepy in some very different ways.

1. Many of them I've read of in news stories from the UK come off sounding like, "don't break the law (again), or I'll be forced to speak harshly to you again." (Not that restraining orders in the USA are much better, but American laws and court orders aren't hollow threats so often.)

2. The rest amount to judge-made laws imposed only on certain people. There certainly are reasons for this sometimes - e.g., if the defendant's excuse for a long string of petty crimes is, "I was drunk," forbidding him to drink just might prevent a recurrence, at a lower cost than prison and a better chance of him staying reformed - but I can't imagine the reason behind most of the orders Spider cited, and the notion of a judge imposing different laws on you than on me is just plain creepy.

3. In the USA, it is frighteningly easy to restrict a man's rights by getting a "domestic violence" restraining order, without anything resembling due process as I know it. My daughter's ex was calling up their kid's preschool and frightening the staff. (He never frightened her; she'd have kicked his ass, but he did like to bully those that don't know how to defend themselves.) She had to merely fill out a form alleging that one morning, and by 5:00 PM the court posted an order barring him from contacting or going near the preschool. She wasn't abusing the system, but look at how easy it is to abuse a system that takes hearsay that someone else was "frightened" as evidence of wrongdoing, and gives the accused no chance to defend himself until after he has lost rights for weeks or months while waiting for a court date.

I heard that the kids in Nottingham used to run in sight of the surveillance cameras. The operators thought something was up, and sent cops after them. Finally, running on the streets was more or less forbidden; the person who told me this had a hard time finding a route to go jogging.

Now, you can't forbid people to act hinky. That makes a great way of spamming the system.

Some months ago I heard an item on the BBC World Service. The UK Government was privatizing one of its military research organizations. They had someone from the organization on the Beeb telling us what great technology they could offer the public. One was a technology that could be used to detect people squirming in airline seats; the interviewee said it could be used by airline staff to spot terrorists.

The BBC interviewer didn't ask a single intelligent counter-question: e.g., aren't many people nervous when flying, and what if the terrorists aren't? Instead, the interviewer gave this guy a free run.

The BBC do this a lot, even on their supposedly tech-savvy "Digital World" program. They did a piece about voting over the Internet in Lithuania. The reporter babbled about how exciting and convenient it was. She didn't give any consideration or air-time to the prospect that it could be hijacked. Given the negative publicity over Diebold's elections in the U.S., and that this was supposedly a tech program, you'd think they'd have clue enough to mention it.

@m: completely agreed, and everyone should be advised to do the same...

This is just another money-making activity set up by the UK government. True, these cameras could potentially detect some criminal activity - but do you remember the case where a woman with a buggy put her bag on top of it, then walked on - and she was picked up for littering because the security camera didn't recognise her buggy?

@Derp: that sounds very much like the BS that the "Every Child Matters" government programme is. How can you predict crime to that level of accuracy? Sure, there are a few people in the system with a high probability of committing crime, and a few more who are part of organised crime. But what about the rest?

Is this not just another sea in the ocean of surveillance that is possible because there are companies out there that can make money out of this "opportunity"? ["opportunity" is a word Tony Blair used very much in his reply to the people against ID cards]

This is a real-life story of how surveillance cameras made my life better on Sunday and gave me more freedom.

I live in Cambridge (UK). On Sunday it was a very nice hot day, so I decided I wanted to read my book outside in a quiet place. So I cycled to the local science park (which is private land) and sat next to their lake, about 20 feet from some of the office buildings. I sat on one of the seats that were put there by the users of the building. There are no fences to keep me out, no keep-out signs, just a lot of cameras to track people. I saw that a camera was zoomed in on me to check what I was doing; this was a lot better than being disturbed by a guard asking me what I was doing and then maybe telling me to leave.

There is a manned office on site that checks all the cameras and then directs the guard to where he is most useful. I saw him walking along the other side of the lake (this may have been so that I knew I was being watched). Before they put in cameras, they were planning to close off the science park and put guards on all the gates.

@spider
You said:
"Here are some of the more rare reasons why people have been given "Anti-Social Behavior Orders"

* Two teenage boys from east Manchester forbidden to wear one golf glove.

* A 13-year-old forbidden to use the word "grass"."

I am not a fan of ASBOs in general, but given that we have them, these two examples are not that unusual. The first is presumably some sort of gang symbol. The second is because "grass" is slang for informer. I assume this kid was intimidating someone and shouting that they were a grass. In some areas, this can have a similar effect to accusing someone of being a paedophile.

I would like to draw your attention towards a project by a Spanish university (Universidad Rey Juan Carlos) currently in pilot phase at Barajas Airport in Madrid. It detects pieces of luggage left unattended and uses face recognition to detect known criminals.

By the book.
Prediction and Classification: Criminal Justice Decision Making, a collection of commissioned essays by distinguished international scholars, is the ninth volume in the Crime and Justice series. Like its predecessors, Prediction and Classification is essential reading for scholars and researchers seeking a unified source of knowledge about crime, its causes, and its cure. http://www.press.uchicago.edu/cgi-bin/hfs.cgi/00/2396.ctl

Methods for Estimating Crime Rates of Individuals
Describes methods for analyzing offenders' crime commission data and deriving (1) individuals' crime commission rates and (2) rate distributions for groups of offenders with specified characteristics. Uncertain data are treated as censored observations, to obtain nonparametric maximum-likelihood estimates of the distribution of observed crime rates. No standard distributional form was found satisfactory for all crime types, and some types apparently do not occur according to a Poisson process. Shrinkage estimators of individuals' crime commission propensities are obtained by dividing offenders into groups and shrinking data toward a regression estimate of an individual's propensity, based on personal characteristics. A new multivariate distributional form for characterizing the joint distribution of individual crime counts is derived and fit to inmate survey data. Populations that can be surveyed (e.g., prisoners) are unrepresentative of target offender populations of primary interest. Sampling probabilities of surveyed individuals are estimated with stochastic models, allowing estimation of crime rate distributions in target populations. http://www.rand.org/pubs/reports/R2730/
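
The shrinkage idea in that abstract can be illustrated with its simplest case, a Gamma-Poisson (empirical Bayes) model: an individual's estimated rate is pulled from their raw count toward the group mean, with short observation windows shrunk the hardest. The prior parameters below are invented for illustration; the RAND report's actual estimators are considerably more elaborate:

```python
def shrunken_rate(count, years_observed, prior_mean, prior_strength):
    """Gamma-Poisson posterior mean for an individual's offence rate.

    Equivalent to a weighted average of the raw rate (count / years_observed)
    and the group mean, with the prior counting as `prior_strength`
    pseudo-years of evidence.
    """
    alpha = prior_mean * prior_strength  # Gamma shape
    beta = prior_strength                # Gamma rate, in pseudo-years
    return (alpha + count) / (beta + years_observed)

# Group average: 2 offences/year; prior worth 5 pseudo-years of data.
# Someone with 10 offences in 1 year is shrunk well below a raw rate of 10:
print(shrunken_rate(10, 1.0, 2.0, 5.0))   # prints (10+10)/6 = 3.333...
# With 10 years of observation, the individual's own data dominates:
print(shrunken_rate(100, 10.0, 2.0, 5.0)) # prints (10+100)/15 = 7.333...
```

The abstract's caveat applies here too: if some crime types aren't Poisson, this posterior mean is the wrong model, which is why the report fits more flexible distributional forms.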