Camera Makers Develop New Tools To Fight AI Deep Fakes and Stolen Works

REDPIXEL.PL / shutterstock.com

For decades now, artists have had to worry about people copying their style, reselling their work as their own, or outright stealing their art. These thieves take a moment to slightly modify a piece, then publish it as their own original work, building an entire industry on artistic theft. With original artists receiving no financial compensation for their work, and often not even acknowledgment (aka exposure bucks), the courts have been inundated with copyright complaints.

Now, the three major camera brands, Canon, Nikon, and Sony, are coming together to assign “digital watermarks” to images captured by their equipment. This embedded data will establish who took an image or video and provide tamper-resistant proof to back it up.

As Hot Air explained, “Digital signatures will contain information such as the date, time, location, and photographer of the image and will be resistant to tampering. This will help photojournalists and other professionals who need to ensure the credibility of their work. Nikon will offer this feature in its mirrorless cameras, while Sony and Canon will also incorporate it in their professional-grade mirrorless SLR cameras.”

The article continues, “The three camera giants have agreed on a global standard for digital signatures, which will make them compatible with a web-based tool called Verify. This tool, launched by an alliance of global news organizations, technology companies, and camera makers, will allow anyone to check the credentials of an image for free. Verify will display the relevant information if an image has a digital signature. If artificial intelligence creates or alters an image, Verify will flag it as having ‘No Content Credentials.’”
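The core idea behind a tamper-resistant signature like this is simple: the camera binds the image data and its metadata together cryptographically at capture time, so any later edit breaks the verification check. The sketch below illustrates that general principle only; it uses Python's standard-library HMAC as a stand-in for the public-key signatures real cameras use, and the function names, key, and metadata fields are invented for illustration.

```python
import hashlib
import hmac
import json

def sign_capture(image_bytes: bytes, metadata: dict, camera_key: bytes) -> str:
    """Bind the image and its metadata together with a keyed hash.
    Illustrative stand-in for the camera's real public-key signature."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(camera_key, payload, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, metadata: dict,
                   camera_key: bytes, signature: str) -> bool:
    """Recompute the signature and compare; any edit to the pixels
    or the metadata changes the payload and fails the check."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    expected = hmac.new(camera_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Hypothetical capture: raw bytes plus the kind of metadata the article describes.
camera_key = b"camera-signing-key"  # stand-in secret; real cameras use key pairs
image = b"\x89PNG...raw pixel data..."
metadata = {"date": "2024-01-05", "location": "40.7,-74.0", "photographer": "J. Doe"}

sig = sign_capture(image, metadata, camera_key)
untouched = verify_capture(image, metadata, camera_key, sig)          # True
tampered = verify_capture(image + b"edit", metadata, camera_key, sig)  # False
```

A verification site like Verify would run the second step with the camera maker's public key instead of a shared secret, which is what lets anyone check an image without being able to forge a signature.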

However, packing all this information into a single file can be seen as a huge risk to digital security, as well as to the privacy of the photographer. Opting into this program stamps these images with sensitive data, including your name and the location of the shot. Embedded geotagging and copyright information have been around for over a decade, but baking them in at the moment of capture makes them a permanent piece of embedded data, something hackers have been cracking and exploiting for some time now.

Another problem comes from the lack of Verify-backed platforms. Instagram and Facebook aren’t currently plugged in, meaning someone would need to run a manual check to confirm an image’s authenticity. A time-consuming and effort-wasting step for many, this seems to mainly benefit the editor looking for the easiest targets to exploit by avoiding established artists.

Currently, Instagram and Twitter see an average of 95 million new images uploaded daily. Running that volume of images through such software is nearly impossible. Likewise, the old pen-and-paper forms of proof have become outdated and easy to fake. For many artists, this is now a problem with no real solution, short of keeping a lawyer on retainer and frequently scouring the web for copies of their images used without permission, both time-consuming and costly efforts.

While this isn’t a perfect solution, it is a step in the right direction. Spend some time with any photographer who uses social media, and you’ll hear the horror stories of people using their work without permission: clients stealing ‘proof’ photos for social media and never buying the unlocked versions, or other ‘artists’ cropping out or removing a watermark to claim the work as their own so they can ‘impress’ others.

All in all, these tools can help deter thieves, but criminals will always find a way if they want it badly enough. This is the kind of crime that law enforcement doesn’t take seriously and that the court system largely hates to have clogging up its dockets. Given the cost of a proper lawyer, many artists lack the resources to go after someone stealing their work.