Artists have long sought ways to protect their creative works from unauthorized use, especially by AI models trained on vast amounts of internet data.
As VentureBeat reports, Nightshade v1.0 is a cutting-edge tool released by computer scientists at the University of Chicago that gives artists a digital shield to keep their work from being ingested by AI models without permission.
Nightshade is the more “aggressive” counterpart of its predecessor, Glaze, a tool designed to obfuscate an artist’s style from AI models.
The changes Glaze makes to a work of art are “like ultraviolet light” and cannot be detected by the naked eye. “The model has mathematical capabilities that allow it to view images in a completely different way than how the human eye sees them,” Shawn Shan, a graduate researcher at the University of Chicago, told IT Brew.
Similarly, Nightshade embeds pixel-level changes in the artwork that are unnoticeable to the human eye but, according to VentureBeat, effectively act as a hallucinogenic “poison” to the AI, causing it to completely misinterpret the content. A photo of an idyllic landscape could suddenly be recognized by the AI as a fashionable accessory; a cow, for example, could be turned into a leather handbag.
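To make the idea concrete, here is a minimal, purely illustrative Python sketch of an imperceptible, bounded pixel perturbation. It is not Nightshade’s actual method, which optimizes its changes against a model’s internal representation rather than using random noise; the file names and the epsilon bound below are hypothetical.

```python
# Conceptual sketch only: NOT Nightshade's algorithm. It shows the general
# idea of a pixel-level change bounded tightly enough to stay invisible
# to the human eye. Requires numpy and Pillow.
import numpy as np
from PIL import Image

def add_bounded_perturbation(path_in: str, path_out: str, epsilon: int = 4) -> None:
    """Add a small perturbation, clipped to +/- epsilon per color channel."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # In a real poisoning attack this perturbation would be optimized so a
    # model mislabels the image (e.g. "cow" -> "handbag"); here it is random.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)

    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical file names for illustration.
add_bounded_perturbation("landscape.png", "landscape_shaded.png")
```

In a real poisoning attack, the perturbation would be chosen so that a model trained on the altered image learns the wrong association (the cow-to-handbag example above), while a human viewer still sees the original picture.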
The tool is available for Macs with Apple’s M1, M2, or M3 chips and for PCs running Windows 10 or 11.
According to the outlet, many artists have welcomed Nightshade with open arms, including Kelly McKernan, a plaintiff in a highly publicized copyright infringement class action lawsuit against AI art companies such as Midjourney and DeviantArt. But critics have denounced the tool as a covert attack on AI models and companies, with some even calling it “illegal” hacking.
Oh, that’s an insane way to deal with it.
He objects to people glazing their own images because, in his eyes, it’s “illegal.”
He likened it to having his PC hacked because it “interferes with his business.” I’m happy pic.twitter.com/BhMP73BkUb
— Jade (@jadel4w) January 19, 2024
Nightshade’s development team stands behind its work: the aim is not to wreak havoc on AI models, but to tip the economic scales by making it costly to ignore artists’ copyrights when training on scraped data. They argue this will make legal licensing agreements the more attractive option.