Adobe’s Engineers Develop AI Routine to Track Fake News Images

Adobe has announced that it is working on new artificial intelligence technology that can spot the difference between genuine news photos and fake ones. The company's flagship product, Photoshop, has long been used by those who want to generate phony images for manufactured news stories. Given the recent attention paid to dishonesty in the media, this has become something of a hot-button issue.

Newer AI routines could look for certain markers indicating that an image has been tampered with in Photoshop or another graphics editing package. Machine learning technology is gradually helping these routines recognize the signs that Photoshop incidentally leaves behind.

When an image is saved in Photoshop, it usually gets altered ever so slightly at the pixel level. Lighting, noise distribution and edges are often changed even if the user never intended to alter them.
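One common way to surface that kind of pixel-level inconsistency is error level analysis, which re-compresses a JPEG and looks at how unevenly different regions respond. The sketch below is only an illustration of that general idea, not Adobe's method; it assumes Pillow and NumPy are installed, and the quality setting and file name are placeholders.

```python
# Rough error level analysis (ELA) sketch: pasted or re-edited regions often
# compress differently, so they stand out in the difference map.
import io

import numpy as np
from PIL import Image, ImageChops


def error_level_map(path: str, quality: int = 90) -> np.ndarray:
    """Re-save a JPEG at a known quality and return the per-pixel difference."""
    original = Image.open(path).convert("RGB")

    # Re-compress the image in memory at a fixed quality level.
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer)

    # The absolute difference highlights regions with unusual error levels.
    diff = ImageChops.difference(original, resaved)
    return np.asarray(diff, dtype=np.float32)


if __name__ == "__main__":
    ela = error_level_map("suspect_photo.jpg")  # placeholder file name
    print(f"Mean error level: {ela.mean():.2f}, max: {ela.max():.0f}")
```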

This is merely a side effect of using graphics editing software. On top of this, watermarks and metadata are usually far more obvious signs that a photograph has been edited to at least some degree.
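As a simple illustration of the metadata point, many editors, Photoshop included, record themselves in the EXIF "Software" tag. The snippet below is a minimal sketch assuming Pillow is available; the keyword list and file name are illustrative, and a missing tag proves nothing since metadata is trivial to strip.

```python
# Check the EXIF Software tag (0x0131) for traces of a known image editor.
from PIL import Image

EDITOR_KEYWORDS = ("photoshop", "gimp", "lightroom")  # illustrative list


def editing_software(path: str) -> str | None:
    """Return the recorded editing software, if the EXIF Software tag names one."""
    exif = Image.open(path).getexif()
    software = exif.get(0x0131)  # 0x0131 is the Software tag in the EXIF/TIFF spec
    if software and any(name in software.lower() for name in EDITOR_KEYWORDS):
        return software
    return None


if __name__ == "__main__":
    hit = editing_software("suspect_photo.jpg")  # placeholder file name
    print(f"Editing software recorded: {hit}" if hit else "No editor tag found.")
```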

If the color of an image is off, forensic computing tools may be able to tell that it has been manipulated. Facebook and Google have used tools like this for some time, but this is quite possibly the first time that the developer of a popular graphics package has built tools designed to look for fake images created with its own software.

Critics have said, however, that a very simple shape recognition program could do the same thing. Some have even argued that questionable news services could figure out how these techniques work and then defeat them.

Nevertheless, commentators have pointed out that Adobe's engineers have decades of experience with photo manipulation, which puts them ahead of the competition when it comes to deploying AI in this way.

The goal of this particular package seems to be detecting the cloning and pasting of new subject matter, which would certainly reduce concerns about news sites publishing photos in which a person has been edited into a place they could never have been.
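To make the clone-and-paste idea concrete, a very naive copy-move check looks for identical pixel blocks appearing in two distant parts of the same image. The sketch below is only a toy version of that approach, not Adobe's detector; real systems use robust features rather than exact matches, and the block size, distance threshold, and file name here are arbitrary.

```python
# Naive copy-move (clone) detection: hash fixed-size blocks and flag exact
# duplicates that are far apart, which can indicate cloned regions.
import numpy as np
from PIL import Image


def find_cloned_blocks(path: str, block: int = 16, min_distance: int = 32):
    """Return coordinate pairs of pixel blocks that match exactly."""
    gray = np.asarray(Image.open(path).convert("L"))
    seen: dict[bytes, tuple[int, int]] = {}
    matches = []

    for y in range(0, gray.shape[0] - block, block):
        for x in range(0, gray.shape[1] - block, block):
            key = gray[y:y + block, x:x + block].tobytes()
            if key in seen:
                sy, sx = seen[key]
                # Only distant exact matches are suspicious; note that flat
                # regions (sky, walls) can still produce false positives.
                if abs(sy - y) + abs(sx - x) >= min_distance:
                    matches.append(((sy, sx), (y, x)))
            else:
                seen[key] = (y, x)
    return matches


if __name__ == "__main__":
    for a, b in find_cloned_blocks("suspect_photo.jpg"):  # placeholder file name
        print(f"Possible clone: block at {a} repeated at {b}")
```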


John Rendace


John is a GNU/Linux expert with a hobbyist's background in C/C++, Web development, storage and file system technologies. In his free time, he maintains custom and vintage PC hardware. He's been compiling his own software from source since the DOS days and still prefers using the command line all these years later.