This tool poisons AI to stop it from stealing artists' work
- Published: Aug 26, 2024
- University of Chicago Professor Ben Zhao and his team of PhD students created tools Nightshade and Glaze to protect artists and art from generative AI. FULL STORY: abc7.ws/48LOIGS
We need one for music producers
While I agree, what visual artists are going through would be the equivalent of record labels and music-production software companies starting generative AI companies that took all the work they've published, or helped create, to oust music artists and remove them from the equation.
I freaking hope laws are passed to protect music artists from this.
Record label, or self-published, it should be protected.
It's also a double-edged sword...
Music artists have services such as Audible Magic, which can detect uploads to platforms that infringe copyright. If something were to alter audio in that way, it could render such services useless. Visual artists don't have that kind of service or protection, so they need this.
totally agree, ai music should burn.
As an artist who also learns to make music, I 100% agree. Screw greedy AI companies!
That's much easier to defend against AI.
this is excellent! human artists deserve to be protected from this theft
There's hope!
Amazing.
good people flock together, bad people also flock together, great news.
Will definitely be using it👍
This is amazing!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Ben Zhao and the team are doing the Lord's work! Fuck generative AI crap. Hope we can develop similar tech for music, voice acting, and writing. It's not just about the issue of stealing creatives' work, but the harm of deepfakes. Even though those have been around for a while, generative AI has definitely made it much easier to commit cybercrimes.
justice
yes yes yes yesss I'm so happy, now I can still make a living, yayyy
I just don't understand: these AI companies have billions of dollars, so wouldn't it be easier to just pay artists royalties for the artwork used to train their AI systems?
Why steal?
And if this is the attitude, then that says a lot about their intentions for the rest of humanity.
why steal?
greed
They may want the liberty to push the capability of their beloved project before recognizing the destruction caused in the wake of their decisions, especially when nobody is stopping them and they are writing the book on the ethics of this new frontier (AI/data scraping).
In a nutshell: greed. They're corrupt to the core and try to sugarcoat it by making people think they're doing it for the "good".
Yes. This is good.
I want the bad cat mona lisa
Thank you for this story, I downloaded nightshade the same day I first saw this story and now I’m able to protect myself and my body of work.
Go Steven Zapata! 👏
😮
Great... I support Nightshade.
The thing is, many people aren't creative as they used to be.
I feel the same way. Most of the things created with image generators are waifu anime, robot women, or astronauts riding horses on distant planets.
I believe that is also one reason why the models ultimately collapse, because they are uncreative and have nothing to do with human understanding of the world.
Wrong.
In your own bubble that may look like that.
There have been, and always will be uncreative people who ride on the backs of the creative people who aren't afraid to go off the beaten path. Creative people are visionaries and risk takers. People have vision, and take risk to varying degrees, and those who have none of it in them, but want to appear as if they do, will copy, cheat, and steal. This has rung true since the dawn of man. People should accept that if they do not have unique vision and aren't willing to take risks, they should work together closely with people that do, instead of stealing to directly compete because their ego is oversized. There's more than enough room in the world for people who are not necessarily creative. It's not the negative thing they think it is, they could just be more wired for procedural tasks, factual information, etcetera. And there's real value in that, and people like that make great teammates and counterparts to creative visionaries.
There are still so many incredible, creative artists out there making modern-day history. We take them for granted because they're here with us.
@@caterpillar4153 right? it just looks good but that's basically it. there's never a deep meaning or a creative twist.
It's a great little segment, but the tool only works in captivity. It doesn't work on the large models because they're too big to affect in any real way. A single artist would need to submit thousands of their own images to have any real effect. Nightshade is no panacea. You see tons of these articles about what it does, but you don't see a single one about how well it works in the real world, and there's a reason for that.
I wish you were brighter.
This is how humans work.
Look at trade.
The reality of humanity forms through the stories people share.
This is a modern story about the will to fight against seemingly impossible odds.
This is a good story.
This will shape a different future than what ai companies planned.
This is a seed being planted.
But you are looking for a tree and fruits.
Patience.
Nightshade hasn’t been available to the public until recently, so we are not talking about only one artist protecting their work
There’s gonna be many artworks with the warning and when scrapers take them without second thoughts, let’s see what happens
When someone immediately resorts to personal attacks, it's a sign that big box they're carrying is empty.
Be mollified. That's all this will accomplish in the long run. Math is not on the side of either Nightshade or the academic team who created it. We have seen this played out before with Glaze, and that didn't work in real world conditions either.
The topic of artificial intelligence being trained on other people's work is usually misunderstood by anybody who is not a programmer themselves. Spend some time reading up on the science of generative AI and latent space. It doesn't work the way most people think it does.
@@GeneTurnbow apologies first.
probably yes.
But what do we know about the future.
We never expected a fraction of this uproar.
But here we are.
The problem started with the sneaky attitude of the ai companies.
If there had been any transparency, that would have actually helped the spread of AI.
But we don't live in a nice world.
And we forget that it is our job to make it nice for all. Not just for ourselves.
Greater power comes with greater responsibility.
And I am well aware of machine learning systems.
It is not the problem.
Business people are.
I ain't reading all that @@GeneTurnbow