Is it true? New service detects processed photos

Fourandsix Technologies, Inc. has launched izitru.com (pronounced 'is it true'), a new, free service and companion iPhone app that can determine whether an image has been processed. After you upload a JPEG file to the site, izitru runs six image analysis tests that attempt to determine whether the image has been altered since it was captured with a digital camera. Izitru then assigns the image a 'trust' rating.

In a quick test, I uploaded an image that had been resized but was otherwise untouched. It received a yellow warning indicating 'potential file modification' and a note that 'our forensic tests suggest this file has been re-saved since initial capture'.

While you may not be able to learn exactly how an image has been altered, it seems izitru does a good job when presented with an untouched file. For example, I uploaded a second file, which had not been altered or re-saved, and it received a 'high trust' rating.

It will be interesting to see how it ranks other images according to the amount of post-processing. If enough people disagree with the results and click the 'challenge' button, izitru will re-evaluate the image using more advanced testing and, if appropriate, the trust rating will be updated.

Izitru says its image tests examine how different cameras store files and look at the artifacts that are introduced when images are saved multiple times.

Each uploaded image is stored online and given a unique URL, which can be kept private or shared from within the service or app to various social media sites and via email. Images are kept online for those who want to refer others to the test results but will be deleted if no views have been recorded for six months. Images can be deleted manually immediately after the test or, if you signed in with Facebook, Google or Twitter, deleted from your account at a later time.

Fourandsix Technologies will also make an API available for businesses and third-party websites such as insurance companies and social media sites to bypass the izitru.com site and send images directly to the server for analysis. This service, however, is fee-based.

By the way, it does say it will flag an image yellow for being resaved if izitru doesn't yet have a proper profile for a certain camera model. Shouldn't it instead show the message 'Camera not yet profiled, can't verify' in that case?

Every single image I have tried, regardless of how it's been modified or not, an original JPEG or a converted raw, gave me the same yellow flag for being resaved. For fun I also tried one where I had added some fake water with Flood, filling 30% of the frame. Same result: it said the image had been resaved, nothing else.

If a tool such as this can only find evidence of a photo being resaved, that should still earn a green flag. Anything else makes no sense, as getting verified would then force people to shoot JPEGs and do all modifications in-camera rather than shooting raw and doing proper processing. Not to mention that the best way to prove a photo is real is to present the unadulterated raw file.

And since izitru picked up on neither the real nor the fake in any of the examples I tried, from original to resaved to heavily modified, I'd say it doesn't do its job.

Who can say that you didn't fake the scene when shooting raw? You could also photograph a fake picture, and most software would not recognize that. The best approach would be real-time uploading + geodata + someone taking a picture of someone taking a picture of someone taking a picture of you...

this is much worse than i expected. Check out my photo (you can even see that I didn't fix all the power cables on the right). Izitru didn't tell me anything except that it was resaved: http://izitru.com/a4A2V

Now compare it to FotoForensics, where you can clearly see the ridiculously bad heal job at the bottom (I wasn't trying to hide anything; it's plain black :) ): http://fotoforensics.com/analysis.php?id=a17ef19435fbcc553cbb523709232001f2db1bfa.3690818

I tried the Fotoforensics one with a photo with big, sloppy cloning of a huge area, done in Lightroom. It didn't show up at all, so I'm not sure I'd trust that either... Although, as you say, it does seem to be definitely better than izitru.

So basically JPEGs straight out of the camera would likely pass. Never mind that these JPEGs straight out of the camera have lens distortion that affects people's faces, while a raw file that has been corrected for lens distortion is less likely to pass.

i am just not seeing the use for this unless it can break down into specifics. otherwise what is the point?

that's like going to a restaurant and using a sophisticated state-of-the-art algorithm to tell you if the food is cooked.

well, yes, no kidding it's cooked, but to what degree is it cooked, with what method, and is it cooked appropriately for the dish or not? identifying whether it is cooked is not useful in the slightest.

Yes and no. Film and the camera it was shot in could be examined by an expert, and one could conclusively determine if it was faked. The negative would provide a means of repudiating a modified print.

The digital negative has no non-repudiation mechanism, so it's inherently NOT the same. Security logs on truly secure systems have a non-repudiation mechanism, usually a cryptographic checksum.

In the context of a camera, the camera would use a private key to sign/encrypt the cryptographic checksum of the image data. (The signing agent could also be a wifi-attached service or some other mechanism.) Then, should someone need to validate the digital negative, they need only decrypt the checksum value with the camera's public key. The checksum can be verified against the image data, and you have verified that this particular camera took that particular negative.
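The sign-and-verify flow described above can be sketched with a toy RSA key pair. Everything here is illustrative: the hard-coded numbers are absurdly small, and a real camera would use a proper key size, padding scheme, and tamper-resistant key storage.

```python
import hashlib

# Toy RSA key: n = 61 * 53 = 3233, e = 17, d = 2753 (17 * 2753 = 1 mod 3120).
# Illustrative only; real keys are thousands of bits with proper padding.
N, E, D = 3233, 17, 2753  # (N, E) is public; D stays inside the camera

def checksum(image_bytes: bytes) -> int:
    # SHA-256 digest of the "negative", reduced mod N to fit the toy key.
    return int.from_bytes(hashlib.sha256(image_bytes).digest(), "big") % N

def camera_sign(image_bytes: bytes) -> int:
    # The camera signs the checksum with its private exponent D.
    return pow(checksum(image_bytes), D, N)

def verify(image_bytes: bytes, signature: int) -> bool:
    # Anyone holding the camera's public key (N, E) can check the signature.
    return pow(signature, E, N) == checksum(image_bytes)

negative = b"raw sensor data from the camera"
sig = camera_sign(negative)
print(verify(negative, sig))         # True: this camera produced these bytes
print(verify(negative + b"!", sig))  # tampered bytes should fail verification
```

Note that this only proves the bytes left the camera unchanged; it says nothing about what was in front of the lens.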

The value of this goes beyond the trivial use of determining whether a supermodel's eyebrows were digitally waxed. It could easily establish forgeries and settle ownership issues. Furthermore, a system could be constructed to create a certificate-chain-like verification mechanism that would establish derived works. So photo B is an edited version of photo A, but B's crypto gobbledygook is chained to A's.
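The derived-work chain could be sketched like this, under the simplifying assumption that signing is replaced by plain hashing: each edited photo's record commits to both its own bytes and its parent's record, so B's provenance is verifiably chained to A's. (A real system would sign these digests with a private key, as above.)

```python
import hashlib

def record(data: bytes, parent: bytes = b"") -> bytes:
    # A child photo's record commits to its own pixels AND its parent's
    # record, forming a verifiable edit chain. Sketch only: a real system
    # would sign this digest rather than publish a bare hash.
    return hashlib.sha256(parent + data).digest()

photo_a = b"original capture"
rec_a = record(photo_a)            # root of the chain
photo_b = b"edited version"
rec_b = record(photo_b, rec_a)     # B provably derives from A

# Verification: recompute the chain from A's bytes and B's bytes.
print(record(photo_b, record(photo_a)) == rec_b)  # True
```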

Oh for heaven's sake. We already went through this with Neal Krawetz, who claimed the World Press Photo prizewinner was fake but turned out not to be. Collectors and art historians spend millions of dollars every year to try to determine if paintings are fake, and they can't agree. Why do we think a cheap or free app would be able to do what PhDs and forensic experts cannot?

I'd love to have access to the original test images. It's super easy to tell if an image has been edited, even expertly, as each layer of edits has its own signature.

Perhaps DPR should have a "Is It Fake?" challenge with 10 pictures, and see if we, as human beings, are better than the soul-less cloudy-cloud. We fill out a true / false form for each and whoever gets it right is put in a pool to win something.

EXIF data does not contain a non-repudiation mechanism and can be forged easily. It's a bad litmus test for both real and fake photos.

It seems that it only considers OOC images as unmodified, and even resizing throws a red flag, so on the consumer level it's not sufficiently flexible. But I doubt very much that their tests are of sufficiently high accuracy to be accepted as legal evidence by any court. Then what's the point of such a service?

I guess the question's going to be how the system evolves through use. If enough people use it actively (flagging up images for reassessment), they may be able to use a "big data" approach to refine the system and eliminate the current shortcomings.

The problem with this is the inability to distinguish between the sort of processing that has no effect on authenticity (white balance, colour profile, levels/curves) and significant modifications such as compositing.

For the verification to really be of use, you'd ideally need checkboxes that let you declare what changes you've made before evaluation (colour/contrast as above, conversion to black & white, JPEG conversion from RAW), and have the site confirm that nothing ELSE was done.

They should combine this with that site that evaluates your photo to see if you are perceived as trustworthy. In one pass you could find out that (a) the person looks honest and (b) their photo has been altered.

If I stand where Ansel Adams stood, take the exact same shot, and do the same PP in the dark room, can I sell it as an Ansel Adams? Because that's the argument... Provenance matters, so by extension the EXIF matters.

It didn't work. I tried one unmodified JPEG out of camera, one processed raw file turned into a JPEG, and one heavily manipulated JPEG edited in Photoshop. In all three cases it said the photos were modified, without further information.

Mine is that all ooc jpegs are already processed. Ooc jpegs aren't reality, even if you haven't touched them since they came from the camera. An ooc could have been processed with a pinhole, miniature, dramatic, etc. art filter, but even if it wasn't, it was processed in some way. It reflects a choice, if not on your part, then on the part of the camera manufacturer, who chose a default processing standard.

Not sure what to think of this service. I uploaded one of my 4x5" scans (yes, a scan from large format film) and it was marked as potentially modified. That means every image that originated on film is 'modified'.

Every half-decent image that is to be used professionally will have curves/contrast/colors adjusted in post processing - after all - there is no 'true' colour setting in any camera out there since whatever comes out of a scanner needs further adjustments.

I had expected them to test whether there have been pixel content modifications to the file, like cloning etc. I tested that too, but still got the same message. So to me this sort of testing is pretty much useless, since the result is too ambiguous.

I vaguely remember hearing an interview a few years back on NPR radio, where a computer scientist found a way to detect if a shot had been altered in Photoshop. He was using this technology for legal/court cases, etc. I wonder if this is a by-product of that guy's work?

Philosophically, perhaps, but not technologically. That method analyzed the image noise signature, which would allow it to detect things like compositing and cloning/healing; areas of the picture that weren't part of the original capture would have different noise patterns from the rest of the image. (Unless, of course, you knew exactly what the software was looking for, reduced the original image noise to insignificance, and overlaid consistent artificial camera-sourced noise to create a unified noise signature throughout the image. With sufficient technical chops, you can even back-port an edited/composited image to a "raw" file. It's just a lot more trouble than a person is likely to go to unless there was a lot at stake.)
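A minimal sketch of that noise-signature idea, on synthetic data: estimate the noise spread in each block of the image and flag blocks that deviate sharply from the image-wide median. The 32x32 "image", the block size, and the 3x threshold are all arbitrary choices for illustration, and real detectors work on high-pass residuals of actual photos rather than flat synthetic frames.

```python
import random
import statistics

random.seed(42)
BLOCK = 8

# Synthetic 32x32 "image": flat grey with sensor noise sigma = 4,
# except a pasted 8x8 patch (rows 8-15, cols 8-15) with sigma = 0.5,
# standing in for a composited region with a different noise signature.
img = [[128 + random.gauss(0, 4) for _ in range(32)] for _ in range(32)]
for r in range(8, 16):
    for c in range(8, 16):
        img[r][c] = 128 + random.gauss(0, 0.5)

def block_sigma(r0: int, c0: int) -> float:
    # Spread of pixel values within one block as a crude noise estimate.
    vals = [img[r][c] for r in range(r0, r0 + BLOCK)
                      for c in range(c0, c0 + BLOCK)]
    return statistics.pstdev(vals)

sigmas = {(r, c): block_sigma(r, c)
          for r in range(0, 32, BLOCK) for c in range(0, 32, BLOCK)}
median = statistics.median(sigmas.values())

# Blocks whose noise differs sharply from the median are suspect.
suspect = [pos for pos, s in sigmas.items()
           if s < median / 3 or s > median * 3]
print(suspect)  # the low-noise pasted patch at block (8, 8)
```

As the comment above notes, the same logic is exactly why the method can be defeated: overlay a consistent artificial noise field and every block matches the median again.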

What a silly, silly service. By all accounts its authenticity rating is contested at best. I really feel this is more a symptom of what is wrong with modern photography than anything else. The fact that it relies on JPEGs is all you need to consider. Really dumb stuff, I must say.

So everything that is not OOC JPEG is potentially altered? I thought this service can detect if an image is a composite of several images or if the model in the photo has been altered to look slimmer or something like that.

I only see some sense in it if you need to prove that something is shown as-is, as when documenting a crime scene or an insurance claim. Then this function could supplement something like the image data verification feature that Canon DSLRs can add to any photo taken...

Good try. However, how would it judge JPEGs generated from RAW files, where the RAW files have changed white balance or other things? The end result depends on which RAW converter you use. If they can solve that, then it will be a very useful service indeed.

To be honest, that izitru site seems like one of those mood or personality detector toys you might get in a Christmas cracker. You will always get an answer, but it's pure luck if it matches the question.

datiswous, every image requires alteration of some sort. most competitions should have rules on what not to do, usually revolving around compositing, removal or addition.

if we are being true to the image, if i shoot raw, there are a whole host of things i can do with the data that should be allowed because that data was captured. as long as i am not, say, cloning, it is true to what i shot from camera.

say i recovered a highlight to reveal more detail. that should be allowed. the in-camera jpeg may not process that detail but the raw file sure does. it just needs to be revealed. i did not create that detail. it's there. you need to edit to get it to show. as long as i am not cloning it in or compositing from another source/location, it's a valid edit.

Ok, I gave this a try... You can only upload jpeg images. So if you shoot RAW or Tiff, you have to convert your images to jpeg first, so it will always flag them up as modified...

The real kicker is that a service that is dedicated to verifying 'authenticity' of images then allows you to (wait for it...) CROP the image you just uploaded and had verified.. and display a CROPPED version of your 'verified' image... Excuse me? How is this not totally conflicting with the service you aim to provide?

I'd think that "this is the purported use case" and "it's essentially worthless for that use case" isn't exactly "refuting your own argument", since badi never actually argued that it was useful (the smiley is usually a dead giveaway in such cases).

It is if you are relying on photographic evidence to support guilt or innocence. Though POV can affect the interpretation of an image, I'd hate to be arguing the case for my innocence against a prosecutor who is allowed to photoshop a pistol or bloody knife into my hand.

that's not useful for that either. let's say i took a photo that happened to be of a crime in progress, but the perp's face is covered and turned away from me. but wait... there's a window he's facing, except it's blown out by a highlight reflection... wait again, what if i were to pull the detail out of that highlight area and suddenly you are able to see his face reflected? that's a pretty heavy edit.

what if you took a picture of a getaway car but the license plate is covered by heavy shadow but you pulled detail out to reveal the numbers.

even in court, these things would hold up.

for this service, without being able to get into specifics, it's useless.

@wansai Those are edge cases where expert witnesses are used to explain the reconstruction, not "look, here is what is clearly the defendant with a knife in his hand" instances. This is "is what is in the presented image actually what is in the presented image". Playing CSI wannabe does not invalidate the basic utility of this tool just because it doesn't cover your 1% "thanks very much, Mr Bauer" use case. This is the way tools are developed: it works for 80%; anything else you do manually.