
Fighting Back Against AI With Poison Pills

Think of it like slipping a few wrong answers into a stolen test. Artists will soon be able to poison their work to sabotage the AI systems copying it without permission.

How? A new tool called Nightshade.

Nightshade lets artists subtly tweak their images before posting them online. The changes are invisible to the human eye, but they corrupt the data that AI systems ingest and learn from.
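To get a feel for the mechanics, here’s a minimal sketch in Python (assuming numpy and Pillow are installed). This is not Nightshade’s actual algorithm, which carefully optimizes each perturbation against a target model; bounded random noise stands in here just to show how small a change we’re talking about.

```python
# Toy illustration only: Nightshade optimizes its perturbation against a
# target model, while this stand-in uses bounded random noise just to show
# the scale of a change a human viewer would never notice.
import numpy as np
from PIL import Image

def perturb(src_path: str, out_path: str, epsilon: int = 4) -> None:
    """Add a perturbation of at most +/- epsilon to every pixel channel."""
    img = np.asarray(Image.open(src_path).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    # Save losslessly: heavy JPEG compression could wipe out a perturbation.
    Image.fromarray(poisoned).save(out_path, format="PNG")

# Hypothetical file names, for illustration.
perturb("artwork.png", "artwork_shaded.png")
```

A budget of plus or minus 4 on a 0–255 pixel scale is far below what a viewer would notice. Nightshade spends a similarly small budget, but aims it so the image reads as a different concept to the model.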

Researchers at the University of Chicago developed Nightshade.

In tests against the popular Stable Diffusion model, feeding it just 50 poisoned dog photos made the AI start generating mutant dogs. At 100 poisoned samples, it produced cats instead of dogs!
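To see why sample counts matter, here’s a toy sketch with synthetic data (hypothetical numbers, scikit-learn, and crude label flipping rather than Nightshade’s optimized perturbations):

```python
# Toy demo: flip labels on n "dog" training points and measure how a
# simple classifier handles clean dogs afterwards. Synthetic data only;
# this is crude label flipping, not Nightshade's optimized attack.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_dog = rng.normal(-1.0, 1.0, size=(500, 10))   # stand-in "dog" features
X_cat = rng.normal(+1.0, 1.0, size=(500, 10))   # stand-in "cat" features
X = np.vstack([X_dog, X_cat])
y = np.array([0] * 500 + [1] * 500)             # 0 = dog, 1 = cat

X_test = rng.normal(-1.0, 1.0, size=(200, 10))  # clean held-out dogs

for n_poison in (0, 50, 100, 300):
    y_poisoned = y.copy()
    y_poisoned[:n_poison] = 1                   # relabel n dogs as cats
    clf = LogisticRegression(max_iter=1000).fit(X, y_poisoned)
    dog_acc = (clf.predict(X_test) == 0).mean()
    print(f"{n_poison:3d} poisoned samples -> clean-dog accuracy {dog_acc:.2f}")
```

A crude attack like this barely moves the model until flipped labels dominate the “dog” cluster. That’s what makes the researchers’ numbers striking: their optimized, visually clean samples corrupt a concept at far smaller counts.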

The effects spread beyond the poisoned images, too. Since AI models link related concepts, messing with “dogs” also scrambles outputs for “puppy,” “husky,” and “wolf.” Pretty wild, right?

Here’s the deal: AI models need massive training datasets scraped from the internet to learn. A lot of that data is copyrighted art and imagery collected without the artists’ consent.

By subtly tweaking their work, artists can degrade the AI systems profiting from their creations. The threat of poisoning might push companies to get proper licenses and pay artists.

Now, the researchers admit Nightshade could be misused by bad actors. But it takes thousands of samples to really damage large AI models. And defenses against these “data poisoning” attacks are still lacking.
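What would a defense even look like? One natural idea is an alignment filter: drop scraped training pairs whose image and caption disagree in an embedding model such as CLIP. Here’s a minimal sketch of that idea, assuming the Hugging Face transformers library and the openai/clip-vit-base-patch32 checkpoint; the threshold is an illustrative guess, not a value from the Nightshade paper.

```python
# Sketch of a candidate "alignment filter" defense: drop scraped
# (image, caption) pairs whose CLIP image and text embeddings disagree.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def alignment_score(image_path: str, caption: str) -> float:
    """Cosine similarity between the CLIP image and text embeddings."""
    inputs = processor(text=[caption], images=Image.open(image_path),
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        out = model(**inputs)
    img = out.image_embeds / out.image_embeds.norm(dim=-1, keepdim=True)
    txt = out.text_embeds / out.text_embeds.norm(dim=-1, keepdim=True)
    return float((img @ txt.T).squeeze())

THRESHOLD = 0.25  # illustrative cutoff, not from the Nightshade paper
# Hypothetical scraped sample:
if alignment_score("scraped_image.png", "a photo of a dog") < THRESHOLD:
    print("flagged: caption and image disagree; possible poisoned sample")
```

Part of what makes these attacks hard to stop is that a well-crafted poison image can stay close enough to its caption in embedding space to slip past exactly this kind of filter.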

So what does it all mean? Nightshade gives artists a smart new way to fight back against AI abuse of their work. It may help rebalance the power dynamic between artists and AI companies.

Even at this early stage, tools like Nightshade show that artists won’t stay silent while AI systems use their creations unchecked. If companies keep ignoring artists’ copyrights and IP, their products will suffer from the resulting defects.

Nightshade isn’t a magic wand against unethical AI. But it’s an interesting new form of protest by artists against an industry barreling forward recklessly. How AI developers respond will be telling.

This art poisoning highlights the need for dialogue on AI ethics. Because creative resistance ✊🏻 is just getting started.
