DARPA sets its sights on image manipulation
23 October 2015 - 11:18, in News

Thank goodness TMZ revealed that Hollywood Life’s “UNTOUCHED AND PRE-PHOTOSHOP” images of Kim Kardashian’s butt-baring Paper magazine photo from last year were fake.

Otherwise, how would we possibly have been able to tell, simply by looking at images like this, this, or this, that the image had been manipulated?

(Note: a spokesperson for Paper magazine said that none of the Photoshopping done on the actual photo was “drastic.”)

The Defense Advanced Research Projects Agency (DARPA) feels our pain and is looking to build an algorithm-based platform that can detect image manipulation.

As the agency described in materials for a Media Forensics (MediFor) Proposers Day held earlier this month, there’s currently no measure for the integrity of images and video.

That’s an issue, given the ever-swelling flood of imagery on the internet: according to DARPA, imagery “assets” posted per day had shot up to 1.8 billion as of 2014.

(That’s a lot of assets inflation. And before you bring it up, yes, I agree, Kardashian’s assets don’t need inflating.)

At this point, image analysis to detect modification relies on ad hoc, manual assessment, DARPA says.

For an example of what that kind of “is it real?” mulling looks like, you can check out the 2008 incident in which the image of the launch of four Iranian missiles graced the front pages of a slew of major newspapers and news sites.

Note the criteria used both in the New York Times story and in the comments section to consider whether the photo was faked (which it was).

For example:

Chicken Sockpuppet July 10, 2008 · 9:32 pm
I am surprised by the number of people asking if it is even photoshopped (or GIMP’d, take your pick). It obviously is – take a look of the color of the sky around the faked missile image – it’s the same gradiant as the higher missile the image was chopped from! Forget the smoke and the dirt (even that is an obvious give away), the color of the sky pulled down to the horizon level is enough to give it away…

Colette July 10, 2008 · 7:51 pm
It looks like the two missiles on either end are also duplicates of each other (notice the puff of smoke on the right side of each). So how many missiles were successfully launched? At most 2? 50% failure rate. Maybe the whole thing was shopped. Iran may not have squat. They don’t even have decent graphic artists. My 17 year old daughter has better photoshop skills.

…and just for laughs, my personal favorite:

Mike July 10, 2008 · 7:58 pm
I think the pic with four missiles is more pleasing to the eye, but I was wondering if someone could add some sheep in the foreground and maybe a 7-11 somewhere?

Assessing the Iranian missile photo manually exemplifies what DARPA’s talking about: ad hoc analysis that doesn’t scale to match the current billions and billions of images being posted every day.

What DARPA has to say about it:

The forensic tools used today lack robustness and scalability and address only some aspects of media authentication; an end-to-end platform to perform a complete and automated forensic analysis does not exist. Although there are a few applications for image manipulation detection in the commercial sector, they are typically limited to a yes/no decision about the source being an “original” asset, obtained directly from an imaging device.

As a result, media authentication is typically performed manually using a variety of ad hoc methods that are often more art than science, and forensics analysts rely heavily on their own background and experience.

These are the essential elements of media integrity that DARPA’s indicated for its desired end-to-end platform:

  • Digital integrity. Assessment of the pixels or their representations, such as blurred edges or replicated pixels; think of the suspiciously identical smoke plumes in the Iranian missile photo. (A toy version of that check is sketched just after this list.)
  • Physical integrity. Does the image break the laws of physics? For example, are the shadows and lighting consistent with the time of day and the date the image was purportedly captured?
  • Semantic integrity. Are the dates/times of the image accurate, or was it repurposed? An example: last year, Russian state TV offered photographic “proof” of Ukrainian involvement in the tragic downing of Malaysia Airlines flight MH17, but several bloggers dubbed the photograph a clear fake, citing a cloud pattern and other details to show that part of the photo dated back to 2012. Uncovering the provenance of an asset or event like this will be one of the platform’s primary challenges.
  • Integrity reasoning. How do you pull all of the above indicators into one integrated assessment?
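
To make the “replicated pixels” idea concrete, here’s a minimal sketch of a naive copy-move check: carve the image into overlapping blocks and flag any block that appears, byte for byte, in more than one place. This is our illustration, not DARPA’s method; the filename and block size are assumptions, and an exact-match comparison would miss clones that were recompressed, resized or retouched.

```python
# Naive copy-move detection: identical pixel blocks at different positions
# can betray a cloned region (like the duplicated smoke plumes above).
# Real forensic tools are far more robust to noise and compression.
from collections import defaultdict

import numpy as np
from PIL import Image

BLOCK = 16  # block size in pixels; an arbitrary choice for this sketch

def find_duplicate_blocks(path):
    """Return groups of (x, y) positions whose BLOCK x BLOCK pixels match exactly."""
    pixels = np.asarray(Image.open(path).convert("L"))  # grayscale
    seen = defaultdict(list)
    height, width = pixels.shape
    for y in range(0, height - BLOCK, BLOCK // 2):      # overlapping steps
        for x in range(0, width - BLOCK, BLOCK // 2):
            block = pixels[y:y + BLOCK, x:x + BLOCK]
            if block.std() < 2:  # skip flat regions (plain sky) to cut false hits
                continue
            seen[block.tobytes()].append((x, y))        # exact bytes as the "hash"
    return [spots for spots in seen.values() if len(spots) > 1]

if __name__ == "__main__":
    for spots in find_duplicate_blocks("suspect.jpg"):  # hypothetical filename
        print("identical block found at:", spots)
```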

There are only about, oh, a gazillion potential manipulations the future platform will have to analyze, including copy/paste, eraser/inpainting, cropping, metadata forgery, retouching, multiple compression, artificial shadows, artificial light sources, resizing, blurring/sharpening, contrast, rotation, median smoothing, lightening/darkening, text overlays, warping, and color adjustment.
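
One of those, multiple compression, has a classic (if easily fooled) heuristic: error level analysis. Resave a JPEG at a known quality and look at where the result differs most from the original; a region pasted in from another source often recompresses differently from its surroundings. A rough sketch, assuming the Pillow library and hypothetical filenames:

```python
# Error level analysis (ELA) sketch: recompress a JPEG and diff it against
# the original. Bright areas in the result recompressed differently and may
# deserve a closer look; they're a hint, not proof, of manipulation.
from io import BytesIO

from PIL import Image, ImageChops

def error_level_map(path, quality=90):
    """Resave a JPEG at a known quality and return the per-pixel difference."""
    original = Image.open(path).convert("RGB")
    buffer = BytesIO()
    original.save(buffer, "JPEG", quality=quality)  # recompress in memory
    buffer.seek(0)
    resaved = Image.open(buffer)
    return ImageChops.difference(original, resaved)

if __name__ == "__main__":
    error_level_map("suspect.jpg").save("ela_map.png")  # hypothetical filenames
```

Multiply weak heuristics like these across the full list above and you get a sense of the fusion problem that DARPA’s “integrity reasoning” element is meant to solve.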

It certainly sounds like quite the technology challenge. If DARPA succeeds and manages to fund a platform that detects doctored images, we’ll pop a bottle of champagne on its behind.

Excuse me, I mean on its behalf.

Image of Kim Kardashian courtesy of Tinseltown / Shutterstock.com
