Posted
by
Soulskill
on Friday March 02, 2012 @04:05PM
from the nope-nope-nope dept.

alphadogg writes "The little cameras in your home are multiplying. There are the ones you bought, perhaps your SLR or digital camera, but also those that just kind of show up in your current phone, your old phone, your laptop, your game console, and soon your TV and set-top box. Varun Arora, founder of startup GotoCamera in Singapore, wants you to turn them all on and let his company's algorithms analyze what they show, then sell the results as marketing data, in a sort of visual version of what Google and other firms do with search results and free email services."

You missed a step. First they need to come up with some incentive, let's call it a Judas goat, to sign on and let their programs sift through our pictures. This is a little more complicated than web bugs and tracking cookies, since it requires more effort on our part than logging into Facebook or searching through Google.

Pardon me for being crude, but what are these nutjobs thinking? All it takes is someone in the household going "Sure, we'll do that!" and then little 15-year-old Suzie walks by the camera on the way from her bathroom to her bedroom and *boom!* the company behind this has just analyzed child porn. Congrats!

I cover up or disconnect all cameras in my home that might be turned on remotely for one simple reason -- it is my private home. Period. The end.

[...] and then little 15-year-old Suzie walks by the camera on the way from her bathroom to her bedroom and *boom!* the company behind this has just analyzed child porn

Putting aside for a moment that perhaps such households should think for a moment before opting into such plans, I have to ask... what, exactly, would be wrong with that scenario?

For one thing, it's not child pornography. The law may interpret it as such, especially if it ends up being treated as such by the person caught on camera or their legal guardians, but naked people walking in front of a camera does not necessarily pornography make.

For another, my computer could be analyzing child pornography all day long, every day of the week. Perhaps it's analyzing it to see whether it's known pornography or new pornography. If it's new, perhaps it's trying facial recognition to see whether this is a person whose case has already been handled, or whether it's a new case that should be flagged as such.

But given that the system doesn't know what the material is in the first place, perhaps it's analyzing the picture, sees what looks like a human form, detects that either there's no clothing or the person happens to walk around in a wetsuit that matches their skin color, and either way decides to discard the data. The analyzing software may be much more interested in that bright rectangular surface called your TV, to see what programs you watch.

People are way, way too jumpy about this stuff. Next thing you know an adult can't go to a lake for a swim because there's also kids who like diving into the water and have issues keeping their bikini bottoms on* and you just might see that. Oh noes.

It's different if that's the purpose of going swimming there in the first place, of course. Just as it would be different if one of this company's goals were to catch people naked (adults: blackmail; kids: the CP market?), or if, as part of its operation, the material were made available to third parties who in turn might have such motives.

Probably about 10 seconds after the first time they recorded and then looked at child porn (i.e., a nude "kid" under the age of 18 in most U.S. states). That's a strict liability crime in most states. It also makes you a registered sex offender.