This Free Tool 'Poisons' AI Models to Prevent Them From Stealing Your Work — But Some Say It's Akin to 'Illegal' Hacking

Nightshade v1.0 empowers artists to protect their works from unauthorized AI training.
- The new tool alters images at the pixel level to mislead AI models while appearing unchanged to human viewers.
- Nightshade aims to increase the cost of training on unlicensed data and encourage AI developers to seek licensing agreements.
Artists have long sought ways to protect their creative works from unsanctioned use, particularly by AI models that train on vast swathes of internet data, often without permission.
Enter Nightshade v1.0: a cutting-edge tool released by computer scientists at the University of Chicago that provides artists with a digital shield to guard their creations against unwanted AI consumption, VentureBeat reported.
Nightshade is the "offensive" counterpart to its predecessor, Glaze — a tool designed to obfuscate an artist's style from AIs.
Glaze's changes to works of art are "like UV light" — not detectable by the naked eye. "The models, they have mathematical functions looking at images very, very differently from just how the human eye looks," Shawn Shan, a graduate researcher at the University of Chicago, told IT Brew.
Similarly, Nightshade embeds pixel-level alterations within an artwork that are inconspicuous to the human eye — but its tweaks effectively act as a hallucinogenic "poison" for AI, causing models to misinterpret the content entirely, according to VentureBeat. Pictures of pastoral scenes might suddenly be recognized by AI as fashionable accessories — for example, a cow becomes a leather purse.
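The core idea — tiny, bounded per-pixel changes that leave an image looking the same to humans while shifting what a model "sees" — can be illustrated with a toy example. The sketch below is not Nightshade's actual algorithm; it uses a made-up two-class linear model and computes the smallest uniform per-pixel nudge that pushes a mid-gray "image" across the model's decision boundary, from "cow" to "purse" (or vice versa):

```python
# Conceptual sketch of pixel-level perturbation, NOT Nightshade's real method:
# a small, bounded change per pixel flips a toy linear classifier's label.
import numpy as np

rng = np.random.default_rng(0)

# Toy 8x8 grayscale "image", mid-gray so small tweaks stay within [0, 1]
image = np.full((8, 8), 0.5)

# Toy linear two-class model: row 0 scores "cow", row 1 scores "purse"
W = rng.normal(size=(2, 64))
labels = ["cow", "purse"]

def predict(img):
    return labels[int(np.argmax(W @ img.ravel()))]

orig = predict(image)
target = "purse" if orig == "cow" else "cow"
d = W[labels.index(target)] - W[labels.index(orig)]  # direction toward target

# Smallest uniform per-pixel step (L-infinity) that crosses the boundary:
# moving each pixel by eps * sign(d) raises the target's score by eps * sum|d|
margin = -image.ravel() @ d            # > 0: target class currently loses
eps = 1.01 * margin / np.abs(d).sum()  # just past the decision boundary
poisoned = np.clip(image + (eps * np.sign(d)).reshape(8, 8), 0.0, 1.0)

print(f"max per-pixel change: {np.abs(poisoned - image).max():.3f}")
print(orig, "->", predict(poisoned))  # the label flips
```

Real models are nonlinear and far larger, so tools like Nightshade rely on more elaborate optimization, but the economics are the same: the perturbation budget per pixel stays small enough to be invisible while the model's interpretation changes.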
The tool is tailored for Macs with Apple's M1, M2, or M3 chips, or PCs running Windows 10 or 11.
Many artists, including Kelly McKernan — a plaintiff in the highly publicized copyright infringement class-action lawsuit against AI art firms, including Midjourney and DeviantArt — have welcomed Nightshade with open arms, per the outlet. However, critics denounce the tool as a veiled attack on AI models and companies, with one going so far as to call it "illegal" hacking.
"Ahahah the cope is insane," wrote Jade (@jadel4w) on X on January 19, 2024. "Dude is legit arguing against glazing your images because it's 'illegal' in his eyes. He compared it to having his PC hacked because it 'disrupts his operation.' I am delighted." pic.twitter.com/BhMP73BkUb
The development team behind Nightshade stands by their creation, arguing that their intention is not to wreak havoc upon AI models but to tip the economic scales, making it less financially viable to ignore artists' copyrights — and more attractive to engage in lawful licensing agreements.