sadly glaze does not work. All someone has to do is make an image 1% smaller or larger and then boom, they can keep training. Glaze and every other method does not work; the only thing that works is just not putting your art online.
@@michaeldata5741 Literally what I was thinking: just blur the image (which is similar to making it smaller), then keep doing that and see what the classification AI most often outputs. That would keep this from ever being a thing.
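The intuition behind the blur/resize idea in the comments above can be shown with a toy sketch (pure Python on a made-up grid, not real Glaze output): adversarial perturbations live mostly in high-frequency pixel detail, so a simple low-pass filter like a 3x3 box blur tends to wash them out while leaving the underlying image nearly intact.

```python
def box_blur(img):
    """3x3 box blur over a 2D grid of grayscale values (edges clamped)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

# A flat gray image plus a tiny high-frequency "poison" checkerboard:
clean = [[128.0] * 8 for _ in range(8)]
poison = [[128.0 + (2 if (x + y) % 2 == 0 else -2) for x in range(8)]
          for y in range(8)]
blurred = box_blur(poison)

# Worst-case deviation from the clean image, before and after blurring:
max_err_before = max(abs(poison[y][x] - clean[y][x])
                     for y in range(8) for x in range(8))
max_err_after = max(abs(blurred[y][x] - clean[y][x])
                    for y in range(8) for x in range(8))
print(max_err_before, round(max_err_after, 2))  # -> 2.0 0.22
```

Real tools like Glaze are designed to be more robust than a checkerboard, so this is only an illustration of why commenters expect simple filtering to weaken, not necessarily erase, such perturbations.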
Watching AI artbros moaning about how glazing and nightshading is "dRiNKinG pOiSon hOpiNg tO hUrT oTheRs" is priceless. Oh, and they are starting to demand their prompts be protected by copyright laws too, the audacity of those parasites is just stunning to watch.
You're fighting ghosts because nobody is complaining about them. They don't work, are very easy to detect, and can be de-noised (un-poisoned) if someone really, REALLY wants to use it as part of training.
@@Eisenbison Literally one of the first search results for Nightshade is a subreddit full of pro-AI losers complaining that software like Nightshade is on par with murder and should be illegal.
@@nsacockroach4099 If it were really worth the companies' time, they'd make models dedicated to denoising images for a fraction of the cost YT has been spending to try and fight the adblockers (and losing). But as it turns out, the vast majority of images available aren't poisoned, and the people most paranoid about protecting their works don't have anything companies would consider worth using.
I feel bad for Greg Rutkowski. He's the most AI-emulated living artist. His original work is now vastly outnumbered by AI ones, and some searches almost exclusively give AI results when you look him up
The saddest part is that using his name in a prompt barely changes anything in the final image, and certainly does not make the model emulate his style. It was just a placebo effect making AI prompters think the images looked better with it.
@@snowolf494 I did, cause I like art, and he is one of the top concept artists in the industry; he's done a shit ton of work on Dungeons & Dragons. It literally doesn't matter if more people know him, because more people can steal his work for free. No different than you turds trying to pay an artist in Exposure.
I work in a large game studio, and my boss trained a model using my colleague's artwork, then named the model after himself, since the art was supposedly made on company time. (He actually used his personal ArtStation works as well.) He then fired anyone who was openly against AI (3 artists in total). Suffice it to say, the results were bad, and the parent company fired that boss for unrelated reasons. But creativity hasn't recovered.
He really used the tool wrong. Sure, you can just train an AI, but that doesn't mean what you get is any use. It surely looks pretty, but that's about it. Real work has to go into it, even with AI images; they are only "good enough" (not in terms of quality, but in terms of what you get vs what you wanted or envisioned).
he fired 3 artists just because they were against AI? sure dude, either you are making the whole story up or you are seriously misrepresenting it
He is a shitty boss for reasons unrelated to AI. I fully support the transformative use of training data but would not recommend anyone do this type of thing. Human artists are still capable of producing much better images than AI models, even very lousy artists.
I have a friend who left their studio because not only is their work now used to train models, they also now work on fixing the imperfections of those generated images. It's a disgusting situation, fixing something generated instead of being creative. Several months later the boss contacted them to return, but I don't think they went back.
At least currently AI art is still a relatively unexplored artform, especially when making videos. There are plenty of ways for people to improve and get creative, and try to find new uses for it.
Just like factory machines won’t ever steal our joy of creating things yourself, engraving them, spray painting, doing fun projects with your siblings/children/parents… :)
AI also won't do all the little details, composition, etc. When you create something there is always a meaning behind it; even something like a color scheme may have implications. AI can't do that.
This. AI development should focus on making it a tool instead of a replacement for artists. Digital art, for example, allowed a next level of erasing, color layers, line thickness, the ability to print at many resolutions and so on, but it didn't go and replace drawing on paper. AI should become a tool too; one that aids with the harder or less favorable parts of the art process, instead of copying homework.
I have photoshopped family photos with the help of AI to great success. It's such a great tool for things like that. It's a shame there are bad actors in AI trying to replicate others' styles for their own benefit. I'd be okay with generating whole new images if they kept the art to themselves or on places where they can't gain popularity over real artists. Or if you're a content creator using them in place of stock images or something, though you may as well use watermarked free ones at that point
@@apieceoftoast768 Right? Imagine how fucking cool it would be if you could use AI to, say, generate a physically accurate brush stroke with a depth map, with infinitely-zoomable precision. Or to simulate some ultra fancy physical or entirely otherworldly effect of the canvas, the colors and so on. Instead we get people typing five words into a text box and then calling themselves artists.
It should be legally required for all generative AIs to watermark their work, since it needs to be differentiated from IP. First, it shouldn't be copyrightable; it wasn't designed by anyone. Second, who knows who should get the credit for making it, since it's been trained on billions of images. We also need to change when images can be used to train AI. It should be OPT IN, meaning if you want to use my image you need my consent.
@@blobymcblobface That's an overexaggeration. I think you'd need to prove that the art had a significant human hand in making it, definitely more than a few pixels or even touching up the eyes and hands. And that could work for corporate models that make pieces for commercial use, but good luck enforcing that law on open-source models. Even if you forced the next release of Stable Diffusion to add a watermark, people would just make a modified version without it in a matter of hours, and then the devs at Stability AI would look at you and raise their eyebrows as if to say "What did you expect to happen?" That's just the nature of open-source software.
@@Eisenbison yeah, and that kind of content without a watermark would be illegal and could be taken down. You could crop out a watermark too, why didn't you say that? There would need to be enforcement, like with any policy, to ensure that AI content IS watermarked. There would be loopholes, like with almost any policy. Are you saying my solution is bad because it's not 100% effective, or what are you doing here? It would be a step in the right direction in my opinion; if you're saying it isn't, please articulate why. It seems to me like you're just trying to shut me up.
@@Eisenbison and yes, I was using hyperbole. Sorry, apparently you have an issue with figurative language, so I won't use any. It's the artist's disease, and I feel strongly about this topic because my livelihood is being ruined.
I'm just a hobbyist and not a professional artist, so this doesn't really affect me, but I can empathise with people who do make a living from their art. I think AI art is fun to mess around with, or possibly as a tool for reference, but I do hate seeing AI art flood art sites. I want to see something actually drawn by a human. Even with some art sites having settings to suppress AI art, not everyone properly tags their art as AI. Not to mention people selling AI art or running Patreons with their AI art, etc. Most of the ones I've seen are just low effort, the basic stuff you get from typing prompts. I know cause that's the sort of stuff I got quickly messing around with some AI image generators. You can pretty much tell AI art instantly depending on the style. Anime art in particular seems to have a distinct AI style.
I'm definitely someone who isn't too talented as an artist, and I really like the idea of tinkering with AI to do fun things I could never do before. But I also understand the plight of artists and how they fear for their careers due to this new AI technology.
@@Bruh-zx2mc what do you think? What could I possibly do with AI that an inexperienced person like me would never do to the same standard otherwise? I mean absolutely nothing against artists, I sympathize with them wholeheartedly, but I fear you don't care about that...
It is true that AI isn't intelligent. It isn't creative; it's still a deterministic program. It can't be spontaneous, it can't create new things. "AI" is heavily deceptive marketing. AI can be very useful, but currently it is not as good as humans, and it never will be as long as it remains deterministic. AI IS DETERMINISTIC. We MUST protect people who actually create new things.
People use AI to create new things. That spontaneity comes from the AI artist, when they make creating AI art a process. AI art itself is made from a random seed, which yes is deterministic, but even something like changing the order of the prompts or adding an extra comma to the end changes the result slightly. How much you use it and in what way is up to you.
You don't need to replicate something exactly for it to be good enough. It's like how lighting has worked in games for so long: it was never simulated realistically, just emulated closely enough.
About nightshade, the results have yet to be reproduced. People trained models using poisoned artworks and didn't notice any significant effects. So there's that...
I also feel like you could just train another AI to look for the "poison" and remove it / ignore it. It's an interesting idea but it feels like it wouldn't actually be very effective in the long run.
@@system-pn5qw This is accurate, as is adding any truly random sample of noise. Unfortunately for artists, it's very easy to have a model train on an image essentially as a human sees it. The poisoned labeling method also mentioned in this video will be infinitely more effective.
Sadly I think the battle is already lost. How many millions of old images are there on the internet from inactive or unaware artists? While it will be harder for an AI to learn new styles in the future, the classical realistic styles from the million Google results for dogs before 2020 remain.
@@zeppie_ that is _part_ of it, but yes, stopping AI models IS something I would be for, at least in its current form where AI art is presented as if it is equivalent to other art forms. Look at the studies in the video with concerns about discouraging new artists. Look at art contests where AI generated images won. FFS look at the photography competitions where AI art won! If every single AI generated image could be tagged such that anyone could tell it was generated in part or whole with an AI algorithm at a glance that would be the most balanced solution, sadly as the algorithms improve it will soon become impossible to tell the difference. "AI artists" have proven to not give a shit about tagging art as AI generated, wanting others to think they were able to produce the image with their artistic talent. We will be left with doubt that any photo is reflective of reality, and that any piece of art was an expression of the human experience rather than the soulless hallucinations of a machine learning algorithm that was trained on stolen artwork.
My machine learning class taught us that in a generative adversarial network (GAN), any method of detecting an AI model can in theory be used to train that same model. GANs can also be used to avoid poisoning, which can lead to an arms race: the glaze itself gets used to train the AI model, while glaze learns to become better at poisoning. This is really only a temporary solution until the training models catch up using this technique.
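The point above, that any detection method can be turned into a training signal, can be made concrete with a one-dimensional toy (no real GAN or diffusion model involved): a "generator" value is nudged by plain gradient descent to minimize a differentiable "detector" score, standing in for a model learning to evade the very thing built to catch it.

```python
def detector(g, target=0.7):
    """Toy detector: high score means 'looks poisoned/AI-made'."""
    return (g - target) ** 2

def detector_grad(g, target=0.7):
    """Analytic gradient of the toy detector with respect to g."""
    return 2 * (g - target)

# The "generator" uses the detector's own gradient as its training signal,
# exactly the GAN-style feedback loop the comment describes:
g = 0.0
for _ in range(200):
    g -= 0.1 * detector_grad(g)

# g has converged to the detector's blind spot, driving the score to ~0.
print(round(g, 3), detector(g) < 1e-10)
```

Real detectors aren't smooth scalar functions, but the same logic is why publishing a reliable poison detector hands attackers a loss function to train against.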
@@theonesithtorulethemall I literally had to reread this 37.8 times, and I still don't understand what you're trying to say, and I'm someone raised in a household full of terrible English speaking skills I'm sorry-
Midjourney devs got caught discussing laundering and creating a database of artists to train off of, which is now evidence in a current lawsuit. I dunno why anyone thinks it's fine to use copyrighted work without permission in commercial products. It's not like copyright law stops applying to these companies just because the software is new.
While it's still ongoing, my understanding is they were downloading, storing and maintaining a local database of copyrighted artwork that their commercial product was utilizing, which is how they're in such a legal clustertruck. Getty Images is suing Stability AI for reproducing their watermarks and scraping their website; the law hasn't changed, AI or not: you can't reproduce others' watermarks or train AI on a copyrighted database for commercial uses. I don't agree with certain aspects of copyright law, especially the "70 years after the death of the creator" type stuff, but it is what it is, and a lot of these AI companies (Midjourney being a perfect example) flat out ignored warnings from folks saying what they're doing is gonna get them sued. In terms of your more specific scenario, let's try something like: an AI camera staring at a screen going through various Google image searches to train from, so it's not connected to the local computer at all. Would that be iterative enough to not bring down the lawyers on this commercial AI product? One day it may get iterative enough to go undetected, but even then you're still selling software that can reproduce copyrighted material, so in your more specific ideal scenario the best case would be that it's stuck in a legal grey limbo. I'd imagine these recent lawsuits will shape additional copyright law around generative AI going forward. It's kind of like the paid mods debacle, when Bethesda stepped into a legal minefield where they needed to vet every mod to ensure no copyrighted material (such as assets ripped from other games) would end up in said mods, so they'd need a dedicated team reviewing every mod prior to it being sold.
With the sheer volume AI can generate, and efforts from said companies to hide where or how their models source data, it's an impossible-to-police scenario. My opinion is that anyone trying to push heavily into commercial AI products without an existing backlog of owned content to pull from (such as Adobe, Microsoft, etc.) is walking into a legal grey minefield. In terms of free software, free AI generating things that people are not going to commercially benefit from? I don't personally see an issue with that, but the reality is, much like the very silly NFT rush, everyone is currently spamming AI art on every possible website that has a marketplace for it, drowning out real people by sheer volume. I have no idea how those kinds of issues will be solved. @@2kliksphilip
@@2kliksphilip The comparison doesn't work because AI models are not human. We also don't give copyright to chimps that create artwork because copyright law, at least in the US, is designed to protect human expression. The real question is whether OpenAI & co. were within the grounds of Fair Use when designing the model, and right now that's looking like a losing battle. They are now arguing in court that they should have a copyright exemption because they "need" to harvest copyrighted and licensed data to develop GPT and DALLE. In my opinion, when your company is generating revenue from a product, you should be required to pay for the labor that went into creating it. Doubly so when your product competes with that labor.
@@janus798 It doesn't matter whether the AI is human or not; what matters is that the final product does not contain any copyrighted content, only the metadata gathered from training. Any lawsuit is bound to fail on that alone.
@@Bomberman66Hell If the copyrighted content did not exist, the product would not exist. Just because a tablespoon of salt becomes saline in a stew, that doesn't mean the chef didn't use salt in the pot. This argument makes no sense, which is why OpenAI themselves aren't arguing it. Furthermore, the New York Times, researchers at Google, and numerous independent users have gotten GPT to print verbatim NYT articles and Midjourney to repeatedly generate literal screenshots of the film Dune. These models demonstrably have some representation of unauthorized copyrighted content inside them, and even if they didn't, it wouldn't matter, because the content made the product.
it's a nice thought, and I understand why artists are excited by these efforts, but the fact is 1) no one has been able to reproduce this poisoning effect yet; who knows if it even works on, say, SDXL-Turbo, or an already finetuned model. 2) it would be trivial to download 10,000 images from LAION, run the poisoning on them, and train a model to convert poisoned to non-poisoned images. Then in training, you simply add a new step to "de-poison" the image. Bypassing this isn't a matter of if it can be done, or even when it'll be done, but whether anyone bothers in the first place.
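The "add a de-poison step to training" idea above could slot into a data pipeline like this sketch. Everything here is hypothetical: `avg_pool_2x` (simple 2x block averaging) is only a stand-in for whatever learned poisoned-to-clean converter a lab might actually train, and `training_batches` stands in for a real data loader.

```python
def avg_pool_2x(img):
    """Downscale a 2D grayscale grid by 2x via block averaging
    (a cheap stand-in for a learned de-poisoning model)."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

def training_batches(dataset, depoison=avg_pool_2x):
    """Yield images with the extra 'de-poison' step applied before
    they ever reach the trainer."""
    for img in dataset:
        yield depoison(img)

# One fake 8x8 "artwork"; after the de-poison step it arrives as 4x4.
dataset = [[[float((x + y) % 256) for x in range(8)] for y in range(8)]]
batch = list(training_batches(dataset))
print(len(batch), len(batch[0]), len(batch[0][0]))  # -> 1 4 4
```

The structural point is the one the comment makes: the defense sits in the artwork, but the counter sits in the pipeline, where it can be swapped or upgraded at any time.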
this idea of "poisoning" or "glazing" art is fundamentally flawed. AI developers are constantly striving for more human-like AI. The whole point of these "protection" methods is that it's hard for humans to tell the difference. So as AI advances, it will stop being affected by these things, because it'll see the image more like a human sees it.
The only way to end this would be copyright and authoring metadata embedded in image files, like what Adobe proposed, but that would also mean the end of online anonymity to some extent.
@@vibaj16 Agreed. Do they not think these devs are already protecting their models with some sort of auto-filtering system? You're dealing with AI devs who have worked on this stuff since 2015. You're not dealing with some average developer making CRUD apps; you're dealing with the best of the best. A large model like Stable Diffusion will most likely already have enough data to train on and to know which images are right and which are wrong. This whole thing feels like synthetic sugar people eat to convince themselves that it cures their disease.
@@apierror When the artist's name is already being used for prompts, I doubt artists really give a sh*t about anonymity. You can't be scared of being exposed to the public when you already are.
@@vibaj16 The thing about so-called human-like AI is that it's pure marketing Kool-Aid. The current models that actually exist are still just maths, statistics, maths, etc.
@@arrebarre I am sure about that... these paid anti-AI products can be very easily circumvented!!! No, what are you thinking, these anti-AI products are very professional, and seem to work very well too
It's the same as cheat vs anti-cheat, will be an ongoing war where they force the other side to use more and more sophisticated techniques until one side quits.
The issue I see with this approach is that a painting, once poisoned and published, cannot be "updated", whereas classifiers and models get better over time. Training data may be corrupted now because of these images, but I suspect the technology will advance fast enough to be immune to this in no time.
It's entirely on platforms; the image needs to be reprocessed, true. It would be nice if there were a platform that processed all art and reacted to advancements, like YouTube keeping your initial upload but serving lower-quality transcodes. That works if we assume no one is storing tens or thousands of TBs of training images once they go live, which is likely.
I imagine that smart artists are going to keep their originals, and can run said originals through new poisoning/masking systems as the tech progresses to keep up with those who seek to bypass it. Might not help a ton where you want an original post to remain up, but for an online portfolio or personal site it's easy enough to just upload a new image to replace the old. Maybe we'll see people delving into the Internet Archive for older versions of poisoned/masked images? But at that point we're in the range of people copying a specific artist, rather than the intended use case of stopping webcrawling image-vacuuming bots that use every image they can find that vaguely resembles a style.
It's going to be an ongoing two-way street. Glaze and Nightshade will continue to be worked on and improved, and AI will... probably continue to improve, depending on what gets released when, what with so much of it being proprietary. The first iterations of something aren't a good way to judge lasting impact or future improvement when it comes to tech.
Simple solution: don't let AI models continue to improve. Most of the biggest ones are made on stolen training data and labeled by people paid far less than minimum wage. And courts have decided that if a given piece of media wasn't made by a human, it can't be copyrighted.
it already is immune. Glaze isn't new; it's been doing its thing for a whole year now. They're taking the "up me, up you" route against AI, which they can't possibly win, because defense requires more work than attack. If anything, poisoning has been helping AI. At first it was handled by just skipping data that had been poisoned, which didn't work well and was sometimes overzealous, ruining a data set. Nowadays models are fed examples of what is poisoned and use that data to better detect other poisoned data. The only thing that can protect the integrity of digital art in the long run is legislation; all this other shit is just drama and business.
Some report it takes from 20 minutes to 12 hours to process an image with it (the tool is itself based on an AI model). Also, "Use deepbooru for caption" and "Use BLIP for caption" both still tagged nightshaded images fine when I tried, so it does not seem to help against auto-tagging tools. AI training specifically is a process: you need to both tag images and cut/scale them into squares of a specific size before training, and I feel this step alone tends to be enough to limit the effectiveness of any pixel-alteration-based method.
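As a concrete illustration of the crop/scale step mentioned above (a generic sketch, not any specific trainer's code): training sets are commonly center-cropped to a square before being rescaled to a fixed resolution, and that alone re-samples every pixel a perturbation was tuned for.

```python
def square_crop_box(w, h):
    """Return the (left, top, right, bottom) box for the largest
    centered square inside a w x h image."""
    side = min(w, h)
    left = (w - side) // 2
    top = (h - side) // 2
    return (left, top, left + side, top + side)

# A 1024x768 artwork gets center-cropped to a 768x768 square first,
# then rescaled to the training resolution (e.g. 512x512); both steps
# move pixels away from where the poisoning tool placed them.
box = square_crop_box(1024, 768)
print(box)  # -> (128, 0, 896, 768)
```

This is why the commenter suspects pixel-level defenses degrade before training even begins: the trainer never sees the published pixels one-to-one.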
Also, nowadays you only need 1 image to be able to copy a style, since inpainting has become really strong. Simply inpaint the bottom of the image first, then the top, so the image is 100% replaced. (AI is really good at extending images outward in the same style.)
@@willhart2188 So you're saying that in the process of preparing a picture, to remove any possible Glaze or whatever, you need to gen-AI fill at least two halves, leading to a huge loss of accuracy or diversity, making the new model more like the old model? Eh, I'm saying that because I don't see "100%" at all there. I agree with auto-tagging being most likely unaffected, but I'm not certain that's unexpected. I can't confirm the times to Glaze - suggested presets are ranging from
I used AI a couple of times just for references; I'm an artist who free-draws both digitally and traditionally. I noticed when I was generating poses that one was literally a drawing of a girl lying down or doing yoga, but her head was turned in an uncomfortable way, which was possibly the AI's doing. You can still see the signature of the artist.
it depends on the model, some models rely too much on the original artwork while other models like dall-e are more creative, and barely rely on the original art
Every time I see AI Artworks on Social media, I try to mute those "artists". I have nothing against them, but I'm not sure if their AI "art" is legit or not
AI generations vs. true art is starting to turn into the -Hacker- Cheater (sorry) vs Developer war. A developer blocks a way to cheat, a hacker finds a workaround, rinse and repeat.
The thing about hackers is that there is a monetary incentive for hacking, as you can get access to valuable data. With AI poisoning, there's no incentive for the developers of these poisoning tools, apart from the feeling of doing the right thing.
@@annoyannoy I think you only know one side of the coin. There are malicious hackers, yes. But there are also hackers who use their skills for positive things. Only recently, hackers caught a large rail manufacturer using built-in mechanisms to brick entire trains should they be serviced by someone else. I would argue that there are many more of these whitehat hackers than malicious blackhat hackers; it's just that the ones with malicious intent make headlines.
@@annoyannoy Hacking is not only data hacking or systems hacking, but really doing anything you want with something. A hacker can be someone 3D printing or changing how their electronics work.
Glaze + NS would probably fill the same niche as a watermark. A paying customer would probably want the original unaltered copy, but those watermarks aren't _for_ paying customers.
It's so bizarre how many techbros will just approve of unprecedented mass copyright infringement because it makes the slop get made a little quicker. People talk about dystopian futures due to the government/technology etc. but the people who will bring forth that dystopia are americans, willingly.
It's not unprecedented. This has been a "legal" thing since the PATRIOT act, when the NSA has argued that processing personal information from mass surveillance into metadata, is in fact not mass surveillance, because you don't retain the original personal data and cannot reconstruct it just with one profile (this is how AI is trained, your "original" is not stored in its model). You just get a blob of processed metadata that doesn't mean anything by itself. So if you're okay with the US spying on you and training their models on you for the past two decades, then you should just be quiet. Americans have shoveled their own grave on this matter 20 years ago.
As a hobby artist I find this quite the enjoyable development; looks like we are getting more ammo for our side of the war after the corporations caught us off-guard at the start
On a more serious note, I really doubt this technique will have any effect on large model training. These techniques work by using a small proxy CLIP model, and the hope is that if they work on this model, they will also confuse a larger model trained on a different dataset, with potentially a different architecture, while not introducing too many artifacts. Nightshade's promise was also to confuse auto-labelers, but now that it's out, even with the strongest setting I can't manage to fool GPT-4V, or even an open-source and much smaller model like LLaVA-1.5; as another paper has shown, the bigger the image encoder gets, the more aligned it is to human vision. In the end I think some people will spend tons of compute to create images with adversarial noise that will help the models learn more robustly and not take the shortcuts the proxy model did (the reason why it was vulnerable): since the images, with adversarial noise or not, will be labeled correctly, the model will actually learn that the patterns that deceived the proxy model are not what a human would think a cow is, for example, and the model should end up more aligned with what a human would expect.
And that's not getting into the fact that all these countermeasures have a noticeable effect on quality. You can't glaze a lot of the art that's shared online, and even for the art you can, counter-countermeasures are popping up regularly.
Thank you for covering this; the war has just begun, and Nightshade is a fantastic first offensive. If any of you who sympathize are in the States, please consider donating any amount to the Concept Art Assoc.'s fundraiser for protecting artists against AI. They are making wonderful legal efforts.
Thank you philip. As an artist and a long time follower of yours I really appreciate you putting light on this :-) I look up to you and really admire everything you do^^
unfortunately, if glaze gets popular, a de-glaze AI will probably be developed to restore the original for training AI. I don't think it would even be that difficult for them.
It's not even needed; some current models aren't even fooled. It's a losing game, trying to trick something that can learn from said tricks (if they even work on the model in question in the first place).
I am not an artist, but I enjoy the stuff PEOPLE create. These days, if it's not completely obvious, I just don't trust artists who have no works posted before 2020. If there are more people thinking like me, it means that becoming an artist has now become much harder. Also, the art that looks the best gets hit the most.
It's a little depressing that you don't trust artists who started after 2020. I feel like I can instantly tell an AI generated image from regular art with 99.9999% certainty
The problem with both these options is that they fundamentally alter the picture and introduce noisy artifacts. Passing a picture into some edge-refining tool would most likely break this. Obviously it blocks data scrapers trying to automate the process, but anyone with enough time to actually want to copy a specific artist would probably be able to do so without too much hassle.
I genuinely appreciate you making this video, cause I feel my art being used by someone for AI is unavoidable at some point during my art journey. This is a good video I'm saving, and it is definitely useful for when I post my art online while keeping the clear version and the original files to myself. I really didn't expect that this is something I would learn from you on this sort of matter xd
Poisoning these models would be a really good use of trolling, especially if we can get selective trolling based on how damaging the tool is. Modern AI art programs are dangerous for our careers and the artistic world we want to live in, but AI can be really cool and useful too. E.g. much of what chatbots do (outside of when they try to write stories) is amazing. AI even has artistic uses, such as easily cleaning up brush strokes, turning 20 tedious brush strokes into 1 key.
artists immediately 180ing on fair use when it no longer suits them and demanding a total extension of copyright to styles and themes makes me glad AI will replace these useless eaters
Positioning the current use of AI models as being “for education” exclusively is laughable. Companies are using AI models trained on data obtained without permission in for-profit ventures, and have been for quite some time.
most of the art i see is already reposted so many times that it's a crunchy pixel mess, so this filter wouldn't really change how i see the work anyway.
I believe the entire concept of intellectual property needs to be re-examined thoroughly. No one wants the artists to starve, but at the same time, if we go really hard with AI licensing laws, only gigacorporations will be able to buy enough training data to create AIs. It will end the open-source grass roots level AI development right then and there. I believe that to be the more dystopic alternative.
I couldn't agree more. The "options" are not "AI exists or AI doesn't exist". The options are "Everyone has access to AI" or "Big companies have access to AI". There's no denying that it will be a rough time for artists, but i believe being as open about is as possible is the best course of action. Even if big corporations agree to pay artists for their training data, they would do so once and then have their perfect AI. In the long term there is no winning for artists, apart from trying to work WITH AI, using it as a tool to speed up their own creation process.
@@Magnos We are way past the point of choking AI development out by restricting its access to training data. Even if no new data were given to AIs, the existing models have enough training data, that the models can begin using pictures generated by AI to train their models. We only really need human artwork for the initial seed, and after that the selection weights will do the rest. Sure, it will be faster with quality human work as of now, but we most certainly don't have a scarcity of data. Using real works for training is just the easy way. All restricting access to more data will accomplish is raise the barrier for entry for AI developers, meaning we will end up with some mystery Google blackbox AI, rather than thousand individual models, most of which being open source.
Yes. But also: why are artists worried if they claim AI cannot produce "real art"? Seems like a bit of a double standard. You can't make that claim and at the same time say it will result in a decline in creativity. You could make the point that oversaturation and the lack of professional artists will result in fewer people taking it on as a job, but there's a reason it's a dream job to many people besides the money. There is nothing inherently degrading about making it a hobby. Like with automation in general, it will just mean there's less demand for that kind of labour. Given enough time, development and data, this could make creating art more accessible than ever before, if the user gets more control and a more consistent quality of output. I understand wanting to be compensated for contributing to a technology they never agreed to be a part of, but a lot of artists are just flat-out anti-AI and are pushing for the entire thing to be illegal or heavily regulated, like you said. If you don't want your art to be "stolen*" then don't put it on the internet. OpenAI is already going to such ridiculous lengths not to offend licence holders or anyone else that their company slogan may as well be "Please don't sue us." It was extremely useful for coding until almost every request resulted in placeholders being put in instead of actual code. I feel sorry for anyone still paying $20 a month for this if that's what they use it for.
I think it's not Artists vs. AI, it's Artists vs. profit-oriented people with no creativity using AI. At the end of the day AI is a tool, and I don't mind people using it if they have the awareness to use it to help. But for people planning to use it to copy or replace artists: you're literally trying to kill creativity and the connection to the soul through a facade of progress. And much later people are going to realize that these AI-generated images are just boring and will look elsewhere.
DLsite has "dlst" files that are encrypted, and can only be unlocked by people who have bought them (you need to be signed in). They also require a specific program to open, that prevents taking screenshots of the comics downloaded. It detects that and will show as a black page instead, even when trying to use external screenshot program. Yet still there are people getting around that and pirating stuff.
That's not possible, as images are just a series of color data. If the data to mask it from AI were separate, it would only affect normal users; people training AI would just not read that part.
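The point above — that the pixels *are* the data — can be shown with a toy sketch in plain Python: a Glaze-style perturbation has to be written into the same numbers everyone sees, and even a simple blur averages it away. This is illustrative only; real perturbations and real counters are far more sophisticated.

```python
# Toy illustration: an "image" is just a grid of numbers.
# A Glaze-style perturbation modifies those same numbers,
# so it can't be skipped the way separate metadata could be.

def perturb(image, strength=2):
    """Add a tiny alternating offset to every pixel (stand-in for a poison mask)."""
    return [
        [min(255, max(0, px + (strength if (x + y) % 2 else -strength)))
         for x, px in enumerate(row)]
        for y, row in enumerate(image)
    ]

def box_blur(image):
    """3x3 box blur: averaging neighbors smears any per-pixel perturbation."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            neighbors = [
                image[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
            ]
            row.append(sum(neighbors) // len(neighbors))
        out.append(row)
    return out

flat = [[128] * 8 for _ in range(8)]  # uniform gray image
poisoned = perturb(flat)              # perturbation lives in the pixel values
blurred = box_blur(poisoned)          # blurring averages it back out
```

The perturbed image differs from the original in every pixel, yet after one blur pass the interior values are back within one step of the original gray.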
It is definitely a dream solution. Due to how AI works, there will always be workarounds that it either learned during training on its own or was deliberately told to use against the blockers. It is far too complex for really any simple solution, and if one is found, a new architecture will render it irrelevant.
The dream solution would be a platform where you can upload your work and it's protected for you. Instagram was good for that, since you can't really save images from there. But with a filter it would be dope. 😢
I think glazing will create demand for prints. If a user wants to commission an unglazed image, they can pay extra to get a physical copy. People that want a beautiful SFW artwork will likely pony up the dough. On the other hand, the porn artist will still be raking it in with the glazed images, since you don't need the pic to be high-res to wank to it. But really, there should be laws to protect art from being used as training data. I feel like the best solution for both parties is if the artist is given the right to sell the rights to use their art as training data; AI could then become an impetus for artists to make money instead of being put out of work.
This makes me wonder if one day the artifacts we see as a result of tools like Glaze and Nightshade will be looked at the same way we look at lines on a TV or old grimy static. Will glaze, somewhat ironically, be seen as an artistic expression of defiance and a mood?
When I first saw glaze, as shown in your video, I actually thought it was AI generated artwork, before thinking instead that it was JPEG artifacts. Not sure what that means for the tool if the first thought that someone has when they see glaze is that it's AI artifacts.
I used to be really pro AI art, even made plugins for the popular UIs, but the more I dig into the community, the more "crypto-bro" it becomes. I went onto the Reddit thread about Holly's situation and the comments there pretty much look straight out of a crypto bro's vocabulary. It's an Us vs. Them type deal to them, and anyone who questions the morality of AI using other people's work is seen as FUD.
I hope to see a future where tech continues to advance and we can create new things that were impossible before, but I don't like AI images flooding sites for artists. I want to side with artists because they're being taken advantage of by big companies, but I also want to see the technology get better. Really conflicted on this.
The other side isn't any different, sorry to break it to you. Furthermore, they reinforce each other. It's a choice between crypto bros and technically illiterate moral panickers spreading misinformation about technology to push their agenda.
@@LutraLovegood And yet the most popular sites for sharing art and portfolios (ArtStation and DeviantArt) actively promote it. It's a massive fuck you to artists.
Re: would they notice? I've seen larger errors in AI generation go unnoticed: strange architecture, the number of teeth in a face, finger shapes on hands, or the way foliage is too chaotic even for nature. As an artist myself, I feel that if a work of art has those defects after being run through Glaze and/or Nightshade, it would be indistinguishable from an aspect of the style in which the artist renders their work, though it may be more obvious if the art isn't a painting.
Yes, and digital photography reduced the need for jobs developing film and the sale of film for cameras in general. Polaroids are a niche product now. Cassette players are a bygone era. CDs as well. A whole lot, perhaps the majority, of artists digitally paint now, lowering the demand for canvas and paints. Someone has always suffered as we have advanced the way we do things. If you're a good artist, you will find work. The artist's name is a brand as well. I find it bordering on stupid to try to "fight AI" when it's not going to disappear. At this point, it would make just as much sense to start fighting everyone who can use their eyes and copy art styles as well.
@@nustaniel AI is still a bad tool for hacks without taste, a problem previous innovations just didn't have. It might not go away, but we can still ruin its reputation.
@@Cyliandre441 But there's no need to "ruin its reputation." It comes off as juvenile in my eyes. A bunch of artists afraid it'll take their jobs, when it won't. If you are a good artist, you will find work. People will always want to hire a real artist; AI won't replace that need, no matter how good it gets as it is further developed. Traditional painters, sculptors and so on still get work nowadays, and in some cases very good sums of money for it too. I don't care to use AI outside of the curiosity to try something new, but I see it as a tool. There's so much wonky stuff about AI-generated art that an artist needs to go in and fix it anyway. I guess traditional painters should revolt and ruin the reputation of using digital software with Undo, Cut, Copy and Paste as well. (They tried, btw.) "Cheats and tools for hacks who can't paint!" AI, as far as I see it, is a tool that artists can use if they want. To everyone else it's a toy. I also don't see what taste has to do with it. If the generated artwork looks good, it looks good. If it doesn't, it doesn't. There's a bunch of art drawn by humans I wouldn't consider good taste either. Get off that pompous high horse.
@@Cyliandre441 It is also a tool for people who don't have thousands of hours to gather experience, hundreds of hours to make the pictures they want, or for that matter thousands of dollars to commission everything. You won't ruin the reputation; you'll just look like sour assholes angry that progress caught up. Now art is accessible and available to the uninitiated, without having to spend thousands of hours or thousands of dollars. You know what artists said about digital artists? They were hacks, soulless, they could never replace them, etc. You sound the same. Stop whining and being an elitist prick.
@@nustaniel stop with the comparison to "copying" art styles. I absolutely hate people that have never studied art coming with this stupid shallow argument.
I thought AI art was cool at first but after 2023 I kinda wish it would've never existed. It's just so effortless and worst of all you see it everywhere. It's literally spam, just google fanart or something and you'll see.
As a temporary measure there should be some kind of custom small watermark or marker: if you include a certain string of numbers or text in the uploaded file, the model won't take it.
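The marker idea above is easy to sketch, but it is purely an honor system: the byte string used here (`NOAI-TRAIN`) is a made-up convention, not a real standard, and a scraper only skips the file if it chooses to run a check like this. It is also trivially strippable, since the marker sits in metadata, not in the pixels.

```python
# Sketch of a scraper-side opt-out check, assuming a (hypothetical)
# convention where creators embed the byte string b"NOAI-TRAIN"
# somewhere in a file's metadata. Only works if trainers cooperate.

OPT_OUT_MARKER = b"NOAI-TRAIN"

def may_train_on(file_bytes: bytes) -> bool:
    """Return False if the file carries the opt-out marker anywhere in it."""
    return OPT_OUT_MARKER not in file_bytes

tagged = b"\x89PNG...metadata...NOAI-TRAIN...pixels..."
untagged = b"\x89PNG...pixels..."
```

A real version would parse proper metadata chunks (e.g. PNG text chunks or EXIF fields) rather than scanning raw bytes, but the trust problem is the same either way.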
I'll make a bold statement: AI isn't even AI, it's just really, really advanced code. Until some prodigy makes a truly sentient AI that is similar to a person, it is nothing more than a more advanced machine, and machines can't exactly have human creativity.
And the saddest thing about this is that people are actually calling themselves artists and calling this their art, because it's only a "tool". It's like saying I know Chinese and I'm a translator because I know how to use Google Translate.
Programmers: ❌ Automate routine work so that people can concentrate on a carefree life full of entertainment and art. ✅ Automate entertainment and art so that people are forced to concentrate on boring routine work.
This! Yes. I think this allows even emerging artists to price their original work way higher than what may once have been advised. An artist I follow shares her process of creating her art. Her paintings sell at 12 to 17 thousand dollars after she hit the jackpot in sales once she went on TikTok. So now I'm thinking that pricing an original 24x30 painting at a thousand dollars as an emerging artist somehow doesn't seem so high a price tag anymore.
Human creativity and expression do not need an automated solution. Generative AI causes unemployment and benefits no one but the people at the top. Generative AI is also great at creating misinformation, CSAM and other dangerous things. AI should stick to research and medicine, where it can actually benefit people.
This is truly a tough problem. Artists are absolutely justified in being upset about everything going on with AI training on their work for free. However, I fear that the cock's out of the bag, and there's no getting it back in. Wait… cat?
Yeah, this is likely true. The big companies running DALL-E and Midjourney have already scraped datasets of probably billions of images. I somewhat doubt that new data is even a topic in improving image generators; properly tagging all those billions of images is likely more of a challenge.
They're not justified in the slightest. AI 'steals' in the same way that you or I 'steal' by just looking at things and remembering the conclusions we drew from doing so. They're perfectly fine having people look at their art, draw conclusions from it and remember those conclusions, but somehow it becomes an issue when people use a program to do the exact same thing. If they weren't just complaining over sour grapes, they wouldn't have publicly released their artwork. They did, therefore they have no leg to stand on.
@@ChaosSwissroIl I agree with that completely. The AI really does just see things, learns, and uses it to draw upon when making something a user has asked for. Sure, there's instances of it basically just replicating some things nearly identically, but that's a small part of the problem. A real person can't remember billions of sources of inspiration and create something new with it in seconds. I think that's where I feel most sympathy with artists. The only way people will commission artists for custom and unique works moving forward is just going to be out of pure principle. Which works for individuals requesting new art, but businesses and corporations don't operate on principles, they operate on profit margins... Artists have every right to be upset about all their work being used, without permission, to train AI that is going to put them out of job, or potential careers. I say this as someone that's been fascinated with, and used AI for all sorts of stuff over the years. I like it, and use it almost daily. I can't even think of a good compromise to try to make everyone happy.
@@ChaosSwissroIl those models don't "learn" anything, they don't see or have even a remote understanding of the real world. They don't feel anything, have no emotions. AI models literally assemble Frankenstein-esque copies of the "trained" material, and are designed to circumvent copyright law. It's basically a machine designed to screw artists, so they're very justified.
@@BRUXXUS You're completely wrong. AI pumps out thousands of images, but it has no knowledge of what it creates, it doesn't see any relationship between objects nor even sees them. It literally has no awareness of anything going on in the real world and can't put emotion into anything. It can perfectly replicate and recall thousands of images, and assemble them in a Frankenstein-esque way. That is not art, nor is the learning process similar to a human
I still think, as an artist, that it is copyright infringement to use artworks you do not have the rights to for anything intended for distribution, including training AI.
Idk how anyone can justify AI art. IT IS LITERALLY STEALING. These people TOOK HER IMAGES they stole her art and had ai regurgitate her art. It's not okay. They shouldn't be allowed to do so. That's literally as if they stole her art prints and then started selling them claiming it as their own art. ISN'T THAT ILLEGAL? How can anyone defend AI.
This sort of controversy gives me very similar vibes to react content, where AI training seems to be taking content without permission and only taking it down after a request, which is what a lot of reactors do. I wonder about that for you, Philip, as I'm sure you might've been a victim of reaction content. What's your stance on the two? Do you feel the same about both, or do you think it's different for original YouTube content being reacted to vs. original artwork being trained on?
With most reaction content it's clear that it's against copyright, as it's not fair use (for most reaction channels). But going after those channels is super hard and tedious. A community has started suing them, and so far none of them have appeared in court.
No, AI training is not even close to react content. React content literally copy-pastes the original with an added facecam, while AI models remember thousands of images and spit out Frankenstein-esque assembled combinations of them. Both are unethical and infringe copyright. F ck both.
I feel AI art is even more fair than most reaction streamers. It isn’t reposting art that it didn’t make, it’s learning from art and then making its own.
Genie's out of the bottle and it's not going back in regardless of what anyone wants. The conversation should be about how artists can get compensated for their work going forward, but trying to "stop" this ain't ever going to happen. It literally _can't_ happen. That's not how disruptive inventions like this work. _Someone_ will use it and that's that. I mean, remember Metallica trying to stop Napster? Doesn't work.
Nah, the conversation should be about how AI in its current state couldn't function without the works of others, used without their consent. Saying "Genie's out of the bottle lol" isn't going to make the use of other artists' intellectual works (without their consent, and en masse to boot) suddenly okay. Emerging technology isn't immune to regulation, but the law does need to catch up to AI.
@@thatradioboy Yeah, those are _feelings._ Non-enforceable feelings. Like I said, people _will_ use this technology... and in the coming years you won't even be able to detect that it was AI that was involved at all. What then? Think laws are going to stop that?
@@Cimlite Why do you AI Bros have to be such snarky douchebags lol? AI uses people’s intellectual property without their permission, that isn’t a feeling bro that’s a fact and is the reason why so many people dislike this technology. And AI generated images will always have aspects that call them out as AI, no matter how good the technology gets.
@@thatradioboy I'm no "AI bro", I'm just a realist about these things. AI has been invented, and it's not going anywhere. Trying to legislate it away is a fool's errand. You say that generated images will always have tells, and that's just not accurate. Even today, there are images that fool people (just look at the AI painting that won an art award). Right now, you can't tell when it's a good image, and the ratio of bad images to good ones is just going to keep improving. A few years and being an artist is a lost cause; that's not something I'm happy about, but it's what's going to happen no matter how you feel about it.
Nothing wrong with the tech; there is no way of stopping progress. The fault is in ethics and copyright. Commercial AI should only be trained on datasets they fully own the rights to.
I think training on copyright data should be fine. Using AI to generate copyright infringing art / or to impersonate people is not OK. AI is fine as long as it is used as a tool for making something new, and also when you don't try to pass it as something you drew yourself.
@@milandavid7223 It is not stealing, as long as what you make is transformative, and adds to it in a meaningful way. How you use the tool is important.
Glad you dedicated a video to this side of the conversation as well. I must say I was a bit annoyed with the general public's praise of generated imagery that came at the cost of using copyrighted artwork, when using that artwork directly for commercial purposes wouldn't be dared. But developments like this make me hopeful that, if copyrighted artwork doesn't become unusable in models outright, it'll at least become banned for commercial usage.
GOOD. If their whole purpose was to get rid of artists, we artists have zero compassion when we say we want to get rid of these "AI" image generation software packages.
I feel like artists shouldn't pretend they care about copyright law, especially when so much of the internet's culture is based on freely taking random images you find and remixing them.
I can't exactly blame the trolls. AIs like SD are generally trained using mass copyright infringement; the models arguably shouldn't even exist in the first place. It sucks to see artists' work get stolen for profit with no recourse, at least this way they can try to deter plagiarists. It's not even that hard to bypass this issue, big models could've just limited their image queries to things licensed under something permissive. As for private models, it'll probably become an arms race a-la cybersecurity.
Here's a thing: Copyright laws are already outdated and harm artists, especially fanartists. Making the laws stricter will definitely evoke "be careful what you wish for".
It's a nice thought, but unfortunately I really don't think it's gonna be enough. AI models keep being updated, and eventually an anti-poison feature will simply be part of the software.
it's an arms race and eventually the amount of compute required to generate a non-poisoned image becomes prohibitively expensive and not worth anyone's time. Hopefully.
So, for those wondering, here's how the current process works: a model trains and makes an image, a different model looks at the image, says whether it looks off and points out flaws, and the first model then trains on this feedback. The "poison" system is probably just exploiting subliminal noise patterns, which is literally just a bug. The entire issue they are "working with" (exploiting) will likely be completely fixed within the next few months at the longest.
I don't really care about the results of these arguments, but I do love hearing them. It really makes you feel like you're in the future. I wonder, if AI does kill professional human art careers, if that will force humans to do art for the joy of it alone, which is probably the healthiest reason to do art in the first place. Time will tell
Yeah, artists complain that companies and profits corrupt art.... but art has essentially always been for profit. And now that there is a potential to do art just for the love of art, they are whining as well.
This argument is moot; remove the word art completely. There are people who have spent years of their lives working on a skill in order to make a living, and AI has started getting these people laid off, and in the future it will continue to do more of that. In reality a job is a job at the end of the day; telling someone who has dedicated their life to a skill "it's ok, you can work minimum wage and make art for fun" isn't a solution to their problem. That's why copyright is being brought up; make no mistake, this is about whether these people will continue to be able to feed themselves. If your response to this is "fuck them, get a real job", that's pretty shortsighted. Every job is at risk, and until some magical perfect UBI utopia falls into our laps, this is a crisis of what we are going to do for these people.
@@SioxerNikita People talk about how they wish they were free to work on their art full-time without having to pay bills. It's fantasy, talking points of a perfect world. Telling them "it's ok, now you won't have any income from art at all" just deletes them. Decades of work and training now mean nothing. "Work at Burger King and make an OC in your small amount of free time and you will definitely be happy now!" misses both the current point and the point of the conversation prior.
@@epocfeal Well, welcome to the world of shoemakers... welcome to the world of 99% of European farmers... welcome to the world of 95% of factory workers... welcome to the ... I think you got the point. This is just the "I like the benefits society has gotten from all the automation... until it hits the field I like!"
Only publish poisoned images, and only send customers a picture with an imperceptible watermark (I'm sure it can be done), so if it leaks you know who leaked it and can sue.
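The imperceptible-watermark part of the idea above can be sketched with a minimal least-significant-bit scheme: each customer's copy gets an ID hidden in the lowest bit of the first few pixel values, invisible to a human but readable from a leaked copy. This toy version would not survive re-encoding, resizing or cropping; real forensic watermarks are far more robust.

```python
# Toy LSB watermark: hide a customer ID in the lowest bit of each
# of the first `bits` pixel values. Not robust to re-encoding.

def embed(pixels, customer_id, bits=16):
    """Return a copy of `pixels` with `customer_id` hidden in the low bits."""
    payload = [(customer_id >> i) & 1 for i in range(bits)]
    out = list(pixels)
    for i, bit in enumerate(payload):
        out[i] = (out[i] & ~1) | bit  # overwrite only the lowest bit
    return out

def extract(pixels, bits=16):
    """Read the hidden ID back out of the low bits."""
    return sum((pixels[i] & 1) << i for i in range(bits))

original = [200, 13, 55, 254, 90, 128] * 4  # 24 pixel values
marked = embed(original, customer_id=42)
```

Each pixel changes by at most 1 out of 255, which is why the mark is imperceptible, and also why any lossy re-save destroys it.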
I don't think this will do much in the long run. If we as humans can see through the filter, an AI will eventually see through it too. A copyright-compliant AI dataset should be created and slowly but surely enhanced enough to be useful for further AI training. There is no point in "war". Solar Sands has a good couple of videos on this; AI models are inevitable.
@@Pigness7 Censorship often just makes martyrs. Then all of a sudden making AI models is the "path of freedom". I'm hoping for artists adapting to the new reality. An artist with AI knowledge will make art with more quality than any unexperienced joe could ever make, or even dream to make.
@@Pigness7 then there's still Mistral and Claude and Gemini and millions of copies of other openly available LLMs... Once you open Pandora's Box there is no closing it. Going back is not an option, only learning to live in tomorrow. This, ladies and gentlemen, is how people react to technological progress, and this is the first true breakthrough most under 30s have ever experienced.
AI is already at the point where it can use generated images to improve, so all this effort is being put into an area that is already being moved away from. And as you mentioned, the individual style copying is going to be done by individuals, and they will just pick images that aren't poisoned. Unless the artist never shares or sells an image, it can be acquired and used to train a personal model. I have looked at the paper about poisoning the training data, and I feel it would only sort of work if the AI researchers/developers stopped working on their projects and sabotaged their own work. Part of improving the technology is making sure the quality of incoming data is good: either it would detect that an image is glazed and categorize it as too low quality to use, or, if it is label-poisoned like with the dog/cat thing, it would see that the label doesn't match what it has seen in the past and throw it out too. And I guess you have then succeeded in having an image not be used in an AI model, but you have also either reduced the image quality for all the humans you share it with or ruined the accessibility of the image by giving it the wrong description. I support artists and enjoy making art; I just don't like spreading false hope, or information that doesn't seem to have much to back it up.
I almost feel sorry for the people defending this AI crap. They don't understand the artists because they don't know what it feels like to have ever created anything of value in their life.
I actually think poisoning techniques will work long term. The reason is a matter of economic incentives. For the people that produce AI models, there are strongly diminishing returns to adding new pieces to the training set: if you already have billions and billions of images, adding in one additional artist will only fractionally improve the value of your algorithm. Meanwhile, for artists trying to protect their style, the value of that protection is fixed. Thus artists and creators of adversarial methods will continue to improve their methodology, while at a certain point the likes of Stable Diffusion will just stop training and take the existing result as the final version. There are also multiple ways to corrupt images, and it is relatively easy to devise new ones. It becomes increasingly difficult for an algorithm to detect an increasing number of differently corrupted images without making mistakes.
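A toy back-of-envelope for the diminishing-returns point above. The log-scaling assumption is purely illustrative (an assumption, not an empirical law about real models), but it shows the shape of the argument: one artist's few hundred images matter a lot at small scale and almost nothing at web scale.

```python
import math

# Toy illustration: suppose model quality grew like log(N) in dataset
# size N. Then the marginal value of one more artist's ~500 images
# collapses as N grows.

def marginal_gain(n_images, added=500):
    """Quality gain from adding `added` images to a dataset of `n_images`."""
    return math.log(n_images + added) - math.log(n_images)

small_lab = marginal_gain(10_000)         # early-stage dataset
web_scale = marginal_gain(5_000_000_000)  # billions of images
```

Under this assumption the same 500 images are worth hundreds of thousands of times more to a small dataset than to a web-scale one, which is exactly the asymmetry the comment describes.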
There are companies and some very dedicated nerds pitted against each other. Luckily, if there's enough passion for it, the public will win. At least that's what I hope. I'm fine with AI images, but we need to force them to use images they have permission to use instead of scraping the web for everything they can find.
I choked on my drink when I saw the "restoration" of that old photo, was not expecting that
Same brother
Improvement*
timestamp?
1:25
This is a great way to use AI
This feels like a modern-age water mark
facts
Is*
Sadly, Glaze does not work. All someone has to do is make an image 1% smaller or larger, then boom, they can keep training. Glaze and every other method do not work; the only thing that does is just not putting your art online.
@@michaeldata5741 I've read on NS's website that it is not affected by edits to the image.
@@michaeldata5741 Literally what I was thinking: just blur the image (which is similar to making it smaller), then keep doing that and see what the classification AI most often outputs. It would literally keep this from ever being a thing.
Watching AI artbros moaning about how glazing and nightshading is "dRiNKinG pOiSon hOpiNg tO hUrT oTheRs" is priceless. Oh, and they are starting to demand their prompts be protected by copyright laws too, the audacity of those parasites is just stunning to watch.
You're fighting ghosts because nobody is complaining about them. They don't work, are very easy to detect, and can be de-noised (un-poisoned) if someone really, REALLY wants to use it as part of training.
@@Eisenbison Literally one of the first search results for Nightshade is a subreddit full of pro-AI losers complaining that software like Nightshade is on par with murder and should be illegal.
@@Eisenbison The last thing was mentioned at the end of this video.
It would make it non-lucrative for companies to do that.
@@nsacockroach4099 If it were really worth the companies' time, they'd make models dedicated to denoising images for a fraction of the cost YouTube has been spending to try and fight adblockers (and losing). But as it turns out, the vast majority of images available aren't poisoned, and the people most paranoid about protecting their works don't have anything companies would consider worth using.
sd3:
I feel bad for Greg Rutkowski. He's the most AI-emulated living artist. His original work is now vastly outnumbered by AI ones, and some searches almost exclusively give AI results when you look him up
The saddest part is that using his name in a prompt barely changes anything in the final image, and certainly does not make the model emulate his style. It was just a placebo effect making AI prompters think the images looked better with it.
I mean, did you know about him before AI came along? I bet no one did.
@@snowolf494 Well, now it's ruined for everyone, including Greg himself.
@@snowolf494 I did, because I like art, and he is one of the top concept artists in the industry; he's done a shit ton of work on Dungeons & Dragons.
It literally doesn't matter if more people know him, because more people can steal his work for free. No different from you turds trying to pay an artist in Exposure.
@@1itemorless Congratulations, you are very cultured. Now try asking anyone in the street what a "Greg Rutkowski" is.
I work in a large game studio, and my boss trained a model using my colleague's artwork, then named the model after himself, since the art was supposedly made on company time. (He actually used his personal ArtStation works as well.) He then fired anyone who was openly against AI (3 artists in total). Suffice to say, the results were bad, and the parent company fired that boss for unrelated reasons. But creativity hasn't recovered.
That is shitty behavior, and it wouldn't even have benefited anyone. Not surprised he was fired for unrelated reasons.
He really used the tool wrong.
Sure, you can just train an AI, but that doesn't mean what you get is any use.
It surely looks pretty, but that's about it.
Real work has to go into it, even with AI images; they are only 'good enough' (not in terms of quality, but in terms of what you get vs. what you wanted or envisioned).
He fired 3 artists just because they were against AI? Sure, dude. Either you are making the whole story up or you are seriously misrepresenting it.
He is a shitty boss for reasons unrelated to AI. I fully support the transformative use of training data but would not recommend anyone do this type of thing. Human artists are still capable of producing much better images than AI models, even very lousy artists.
I have a friend who left their studio because not only is their work now used to train models, they were also made to fix the imperfections of those generated images. It's a disgusting situation, fixing something generated instead of being creative. Several months later the boss contacted them to return, but I don't think they took the offer.
"I don't need to explain why this will happen; you know I am right." Pure poetry and simply true.
it is inevitable, internet may be changing but it's still the internet
ClOCKS is always the answer 🐓
Too bad literally everyone else isn't allowed to use this argument.
I'm currently pushing my art skills to the next level and I don't think AI will ever match the sheer joy of making art and then continuing to improve
Doing art for the joy of doing art is the most pure way to create, don't let money or AI take that from you
and you won't have to pay a huge electric bill x)
At least currently AI art is still a relatively unexplored artform, especially when making videos. There are plenty of ways for people to improve and get creative, and try to find new uses for it.
Just like factory machines won’t ever steal our joy of creating things yourself, engraving them, spray painting, doing fun projects with your siblings/children/parents… :)
AI also wouldn't do all the little details, composition, etc. When you create something there is always a meaning behind it; even something like a color scheme may carry implications. AI can't do that.
i hate ai because a lot of people don't use it as a tool but to replace art effort
This.
AI development should focus on making it a tool, instead of a replacement for artists.
Digital art, for example, enabled a next level of erasing, color layers, line thickness, the ability to print at many resolutions and so on, but it didn't go and replace drawing on paper or anything.
AI should become a tool too; one that aids with the harder or more unfavorable parts in the art process, instead of copying homework.
I have photoshopped family photos with the help of AI to great success. It's such a great tool for things like that. It's a shame there are bad actors in AI trying to replicate others' styles for their own benefit. I'd be okay with generating whole new images if they kept the art to themselves or on places where they can't gain popularity over real artists. Or if you're a content creator using them in place of stock images or something, though you may as well use watermarked free ones at that point
@@phoenixvance6642 Yeah, tbh even seeing it in a youtube thumbnail puts me off that channel, cuz even then it's being used for profit
@@apieceoftoast768 Right? Imagine how fucking cool it would be if you could use AI to, say, generate a physically accurate brush stroke with a depth map, with infinitely-zoomable precision. Or to simulate some form of ultra fancy physical or entirely otherworldly effect of the canvas, the colors and so on.
Instead we get people typing five words into a text box and then calling themselves artists.
How about just don't use it at all.
It should be legally required for all generative AIs to watermark their work, since it needs to be differentiated from IP. First, it shouldn't be copyrightable; it wasn't designed by anyone. Second, who knows who should get the credit for making it? It's been trained on billions of images.
We also need to change when images can be used to train AI. It should be OPT IN, meaning if you want to use my image you need my consent.
I think you're behind the times a bit... The courts have already ruled that AI art can't be copyrighted.
@@Eisenbison In the US. AND if I change one pixel I can copyright it. Hope that helps. The watermarks would make it less of an issue
@@blobymcblobface That's an exaggeration. I think you'd need to prove that the art had a significant human hand in making it, definitely more than a few pixels or even touching up the eyes and hands.
And that could work for corporate models that make pieces for commercial use, but good luck enforcing that law on open-source models. Even if you forced the next release of Stable Diffusion to add a watermark, people would just make a modified version without it in a matter of hours, then the devs at Stability AI would look at you and raise their eyebrows as if to say "What did you expect to happen?"
That's just the nature of open-source software.
@@Eisenbison Yeah, and that kind of content without a watermark would be illegal and could be taken down. You could crop out a watermark too, why didn't you say that? There would need to be enforcement, like with any policy, to ensure that AI content IS watermarked. There would be loopholes, like with almost any policy. Are you saying my solution is bad because it's not 100% effective, or what are you doing here? It would be a step in the right direction in my opinion; if you're saying it isn't, please articulate why. It seems to me like you're just trying to shut me up.
@@Eisenbison And yes, I was using hyperbole. Sorry, apparently you have an issue with figurative language, so I won't use any. It's the artist's disease, and I feel strongly about this topic because my livelihood is being ruined.
I'm just a hobbyist and not a professional artist, so this doesn't really affect me, but I can empathise with people who do make a living from their art. I think AI art is fun to mess around with, or possibly as a tool for reference, but I do hate seeing AI art flood art sites. I want to see something actually drawn by a human. Even with some art sites having settings to suppress AI art, not everyone properly tags their art as AI. Not to mention people selling AI art or running patreons with their AI art etc. Most of the ones I've seen are just low effort and the basic stuff you get from typing prompts. I know cause that's the sort of stuff I got quickly messing around with some AI image generators. You can pretty much tell AI art instantly depending on the style. Anime art in particular seems to have a distinct AI style.
I'm definitely someone who isn't too talented as an artist, and I really like the idea of tinkering with AI to do fun things I could never do before.
But I also understand the plight of artists and how they fear for their careers due to this new AI technology.
"or possibly as a tool for reference"
Tools for references already exist, they're called cameras.
@@GameMaker3_5 "I really like the idea of tinkering with AI to do fun things I could never do before" Such as?
Those AI art patreons will be almost exclusively full of porn art. Going to assume most people don't care if their porn is AI generated or hand drawn.
@@Bruh-zx2mc what do you think? What could I possibly do with AI that an inexperienced person like me would never do to the same standard otherwise?
I mean absolutely nothing against artists, I sympathize with them wholeheartedly, but I fear you don't care about that...
It is true that AI isn't intelligent.
It isn't creative; it's still a deterministic program. It can't be spontaneous, it can't create new things. "AI" is heavily deceptive marketing. AI can be very useful, but currently it is not as good as humans, and it never will be as long as it remains deterministic.
AI IS DETERMINISTIC.
We MUST protect people that actually create new things.
Getting AI to mimic your brain is closer than you think, and humans, despite their supposed unpredictability, are ironically predictable.
@@GregorianMG What are you fucking smoking? A single neuron has more complexity than a neural network.
People use AI to create new things. That spontaneity comes from the AI artist, when they make creating AI art a process. AI art itself is made from a random seed, which yes is deterministic, but even something like changing the order of the prompts or adding an extra comma at the end changes the result slightly. How much you use it, and in what way, is up to you.
You don't need to do exactly as one thing for it to be good enough, like how lighting has worked in games for so long, it was never replicated realistically, just emulated close enough.
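The determinism and seed-sensitivity described above is easy to demonstrate with a toy stand-in for a diffusion sampler. This is purely an illustrative sketch (the hashing scheme and `toy_generate` function are made up, not a real model): the same prompt and seed always reproduce the same output, while adding a single comma to the prompt changes it.

```python
import hashlib
import random

def toy_generate(prompt: str, seed: int) -> list[float]:
    """Toy stand-in for a diffusion sampler: fully determined by (prompt, seed)."""
    digest = hashlib.sha256(f"{seed}:{prompt}".encode()).digest()
    rng = random.Random(digest)  # Random accepts bytes as a seed
    return [round(rng.random(), 6) for _ in range(4)]

a = toy_generate("a cow in a field", seed=42)
b = toy_generate("a cow in a field", seed=42)
c = toy_generate("a cow in a field,", seed=42)  # one extra comma

assert a == b  # same seed and prompt: identical result
assert a != c  # tiny prompt change: different result
```

Real samplers are deterministic in the same sense: fix the seed, prompt, and sampler settings and you get the same image back.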
it is intelligent, probably more intelligent than the likes of you
About nightshade, the results have yet to be reproduced. People trained models using poisoned artworks and didn't notice any significant effects. So there's that...
I also feel like you could just train another AI to look for the "poison" and remove it / ignore it. It's an interesting idea but it feels like it wouldn't actually be very effective in the long run.
Yeah I really doubt it will do much of anything
Yep, I don't see how this would ever work
You can just run it through a smooth upscale and you're done. The result may be a little worse but still recognisable.
@@system-pn5qw This is accurate, or adding any truly random sample of noise. Unfortunately for artists, it's very easy to have a model train identically to how a human sees an image. The poisoned labeling method also mentioned in this video will be infinitely more effective.
Sadly I think the battle is already lost.
How many millions of old images are there on the internet from inactive or unaware artists? While it will be harder for an AI to learn new styles in the future, the classical realistic styles of a million Google results for dogs from before 2020 remain.
This is not about stopping the creation of AI models. This is about stopping people from using artist's artworks without permission
@@zeppie_ that is _part_ of it, but yes, stopping AI models IS something I would be for, at least in its current form where AI art is presented as if it is equivalent to other art forms. Look at the studies in the video with concerns about discouraging new artists. Look at art contests where AI generated images won. FFS look at the photography competitions where AI art won! If every single AI generated image could be tagged such that anyone could tell it was generated in part or whole with an AI algorithm at a glance that would be the most balanced solution, sadly as the algorithms improve it will soon become impossible to tell the difference. "AI artists" have proven to not give a shit about tagging art as AI generated, wanting others to think they were able to produce the image with their artistic talent. We will be left with doubt that any photo is reflective of reality, and that any piece of art was an expression of the human experience rather than the soulless hallucinations of a machine learning algorithm that was trained on stolen artwork.
My machine learning class taught us that in a generative adversarial network (GAN), any method of detecting an AI model can in theory be used to train that same model. GANs can also be used to avoid poisoning, which can lead to an arms race: the glaze itself is used to train the AI model, and glaze learns to become better at poisoning. This is really only a temporary solution until the training pipeline catches up with this technique.
Jsut train an ai on how images shuld look and have it simply restore images and all glazing that does not ruin images will fail
@@theonesithtorulethemall
I literally had to reread this 37.8 times, and I still don't understand what you're trying to say, and I'm someone raised in a household full of terrible English speaking skills
I'm sorry-
@@theonesithtorulethemallJsut
Midjourney devs got caught discussing laundering and creating a database of artists to train off of, which is now evidence in a current lawsuit. I dunno why anyone thinks it's fine to use copyrighted work without permission in commercial products? It's not like copyright law stops applying to these companies because of new software.
Where I live it's literally legal to do so, as the law is written.
While it's still ongoing, it's my understanding they were downloading, storing and maintaining a local database of copyrighted artwork that their commercial product was utilizing, which is how they're in such a legal clustertruck. Getty Images is suing Stability AI for reproducing their watermarks and scraping their website. The law hasn't changed, AI or not: you can't reproduce others' watermarks or train AI on a copyrighted database for commercial use.
I don't agree with certain aspects of copyright law especially things like "70 years after the death of the creator" type stuff but it is what it is and a lot of these AI companies in a perfect example of Midjourney flat out ignored warnings from folks saying what they're doing is gonna get them sued.
In terms of your more specific scenario, let's try something like: an AI camera staring at a screen going through various Google image searches to train from, so it's not connected to the local computer at all. Would that be iterative enough to not bring down the lawyers on this commercial AI product? One day it may get iterative enough to go undetected, but even then you're still selling software that can reproduce copyrighted material. In your more specific ideal scenario, the best case would be that it's stuck in a legal grey limbo. I'd imagine these recent lawsuits will shape additional copyright law around generative AI going forward.
It's kind of like the paid mods debacle, when Bethesda stepped into a legal minefield where they needed to vet every mod to ensure no copyrighted material (such as assets ripped from other games) would end up in said mods, so they'd need a dedicated team reviewing every mod prior to it being sold. With the sheer volume AI can generate, and the efforts from said companies to hide where or how their models source data, it's an impossible-to-police scenario.
My opinion is that anyone trying to push heavily into commercial AI products without an existing backlog of owned content to pull from (such as Adobe, Microsoft, etc.) is walking into a legal grey minefield. In terms of free software, free AI generating things that people are not going to commercially benefit from? I don't personally see an issue with that, but the reality is, much like the very silly NFT rush, everyone is currently spamming AI art on every possible website that has a marketplace for it, drowning out real people by sheer volume. I have no idea how those kinds of issues will be solved. @@2kliksphilip
@@2kliksphilip The comparison doesn't work because AI models are not human. We also don't give copyright to chimps that create artwork because copyright law, at least in the US, is designed to protect human expression.
The real question is whether OpenAI & co. were within the grounds of Fair Use when designing the model, and right now that's looking like a losing battle. They are now arguing in court that they should have a copyright exemption because they "need" to harvest copyrighted and licensed data to develop GPT and DALLE.
In my opinion, when your company is generating revenue from a product, you should be required to pay for the labor that went into creating it. Doubly so when your product competes with that labor.
@@janus798 It doesn't matter if AI is human or not. What matters is that the final product does not use any copyrighted content, only the metadata gathered from training. Any lawsuit is bound to fail from that alone.
@@Bomberman66Hell If the copyrighted content did not exist, the product would not exist. Just because a tablespoon of salt becomes saline in a stew that doesn't mean the chef didn't use salt in the pot. This argument makes no sense and is why OpenAI themselves isn't arguing it. Furthermore, the New York Times, researchers at Google, and numerous independent users have gotten GPT to print verbatim NYT articles and Midjourney to repeatedly generate literal screenshots of the film Dune. These models demonstrably have some representation of unauthorized copyrighted content in their database, and if they didn't, it wouldn't matter because the content made the product.
it's a nice thought, and I understand why artists are excited by these efforts, but the fact is
1) no one has been able to reproduce this poisoning effect yet, who knows if it even works on say, SDXL-turbo, or an already finetuned model.
2) it would be trivial to download 10,000 images from LAION, run the poisoning on them, and train a model to convert poisoned to non-poisoned images. Then in training, you simply add a new step to "de-poison" the image.
Bypassing this isn't a matter of if it can be done, or even when it'll be done, but if anyone bothers in the first place.
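Point 2 above can be sketched in a few lines of numpy. This is a deliberately minimal toy, not a real training pipeline: the perturbation here is a fixed vector, and the "de-poisoner" is just a mean difference over paired samples (a real attack would train a learned denoiser on such pairs, since real perturbations are image-dependent).

```python
import numpy as np

rng = np.random.default_rng(0)
clean = rng.random((1000, 16))           # stand-in for clean image patches
poison = 0.05 * np.sin(np.arange(16))    # stand-in for a fixed adversarial perturbation
poisoned = clean + poison                # "run the poisoning" on images we already have

# Simplest possible de-poisoner: with paired poisoned/clean data,
# the perturbation is just the mean difference between the two sets.
estimated = (poisoned - clean).mean(axis=0)
restored = poisoned - estimated

assert np.allclose(restored, clean)  # perturbation fully removed
```

The key asymmetry it illustrates: whoever holds both the poisoning tool and clean images can manufacture unlimited training pairs for the counter-model.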
this idea of "poisoning" or "glazing" art is fundamentally flawed. AI developers are constantly striving for more human-like AI. The whole point of these "protection" methods is that it's hard for humans to tell the difference. So as AI advances, it will stop being affected by these things, because it'll see the image more like a human sees it.
The only way to end this would be copyright and authoring metadata embedded in image files, like what Adobe proposed, but that would also mean the end of online anonymity to some extent.
@@vibaj16 Agreed. Do they not think these devs are already protecting their models with some sort of auto-filtering system? You're dealing with AI devs who have worked on this stuff since 2015. You're not dealing with your average developer making CRUD apps. You're dealing with the best of the best.
A large model like Stable Diffusion will most likely have enough data to train on and know which images are right and which are wrong already. This whole thing feels like synthetic sugar people eat to convince themselves that it cures their disease.
@@apierror When the artist's name is already being used for prompts, I doubt artists really give a sh*t about anonymity. You can't be scared of being exposed to the public when you already are.
@@vibaj16 The thing about so called human-like AI is that it's pure marketing kool aid. The current models that actually exist are still just maths statistics maths etc.
congrats for winning the best content creator award philip!
:D
Best content creator
It was 3kliksphilip
@@selohcin From HLTV
@@Yeenosaurus Thank you!
I imagine someone will just train a denoiser for images generated with nightshade and glaze to get around this.
it is bound to happen
you could probably use Topaz do get rid of the protections
@@arrebarre i am sure about that... these paid anti-ai products can be very easily circumvented!!!
no, what are you thinking, these anti-ai products are very professional, and seem to work very well, as well
Stable Diffusion is literally a denoiser
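That comment is right about the architecture: diffusion models are trained to predict the noise added to an image. A minimal numpy sketch of the DDPM-style parameterization (the `alpha_bar` value is made up for illustration) shows why a perfect noise predictor is exactly a perfect denoiser:

```python
import numpy as np

rng = np.random.default_rng(0)
x0 = rng.random(8)             # stand-in for a clean image
alpha_bar = 0.7                # cumulative noise-schedule value at some step t
eps = rng.standard_normal(8)   # the Gaussian noise the model learns to predict

# Forward (noising) process: x_t = sqrt(a)*x0 + sqrt(1-a)*eps
xt = np.sqrt(alpha_bar) * x0 + np.sqrt(1 - alpha_bar) * eps

# If the model predicted eps perfectly, the clean image is recovered exactly.
x0_hat = (xt - np.sqrt(1 - alpha_bar) * eps) / np.sqrt(alpha_bar)
assert np.allclose(x0_hat, x0)
```

So "train a denoiser to strip the glaze" isn't a new capability someone would have to invent; removing structured noise is the one thing these models already do at every sampling step.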
It's the same as cheat vs anti-cheat, will be an ongoing war where they force the other side to use more and more sophisticated techniques until one side quits.
The issue I see with this approach is that a painting, once poisoned and published, cannot be "updated", whereas classifiers and models get better over time. Training data may be corrupted now because of these images, but I suspect the technology will advance fast enough to be immune to this in no time.
it's entirely on platforms - the image needs to be reprocessed - true.
It would be nice if there were a platform that would process all art and react to advancements, like YouTube, which keeps your initial upload but serves transcodes of lower quality.
That works if we assume no one is storing tens or thousands of TBs of training images once they go live, which is likely.
I imagine that smart artists are going to keep their originals, and can run said originals through new poisoning/masking systems as the tech progresses to keep up with those who seek to bypass it. Might not help a ton with something where you want the original post to remain up, but for an online portfolio/personal site it's easy enough to just upload a new image to replace the old. Maybe we'll see people delving into the Internet Archive for older versions of poisoned/masked images? But at that point we're in the range of people copying a specific artist, rather than the intended use case of stopping webcrawling image-vacuuming bots that use every image they can find that vaguely resembles a style.
It's going to be an ongoing two-way street. Glaze and Nightshade will continue to be worked on and improved, and AI will... probably continue to improve depending on what gets released when, what with so much of it being proprietary. The first iterations of something aren't a good way to judge lasting impact or future improvement when it comes to tech.
Simple solution: don't let AI models continue to improve. Most of the biggest ones are built on stolen training data and labeled by people paid far less than minimum wage. And courts have decided that if a given piece of media wasn't made by a human, it can't be copyrighted.
It already is immune. Glaze isn't new; it's been doing its thing for a whole year now. They're taking the "up me, up you" route against AI, which they can't possibly win, because defense requires more work than attack.
If anything, poisoning has been helping AI. At first it was handled by just bypassing data that had been poisoned, which didn't work well and was sometimes overzealous, ruining a data set.
Nowadays it's fed which thing is poisoned and uses that data to better detect other poisoned data.
The only thing that can protect the integrity of digital art in the long run is legislation, all this other shit is just drama and business.
Some report it takes from 20 minutes to 12 hours to process an image with that (the tool is itself based on an AI model). Also "Use deepbooru for caption" and "Use BLIP for caption" both still tagged nightshaded images fine when I tried, so it does not seem to help against auto-tagging tools. AI training specifically is a process: you need to both tag images and cut/scale them to squares of a specific size before training, and I feel this step alone tends to be enough to limit the effectiveness of any pixel-alteration-based method.
Also nowadays you only need 1 image to be able to copy a style, since inpainting has become really strong. Simply inpaint the bottom of the image first, then the top of the image, so the image is 100% replaced. (AI is really good at extending images outward in same style)
@@willhart2188
So you're saying that in the process of preparing a picture, to remove any possible Glaze or whatever the case may be, you need to gen-AI fill at least two halves, leading to a huge loss of accuracy or diversity, making the new model more like the old model?
Eh, I'm saying that because I don't see "100%" at all there.
I agree with auto-tagging being most likely unaffected, but I'm not certain if that's unexpected.
I can't confirm the times to Glaze - suggested presets are ranging from
I've used AI a couple of times just for references; I'm an artist who draws freehand, both digital and traditional. I noticed when I was generating poses that one was literally a drawing of a girl lying down or doing yoga, but her head was turned in an uncomfortable way, which was possibly the AI's doing. You could still see the signature of the artist.
it depends on the model, some models rely too much on the original artwork while other models like dall-e are more creative, and barely rely on the original art
@@infinitehexington AI is never original, tho, it just seems "original"
Every time I see AI Artworks on Social media, I try to mute those "artists". I have nothing against them, but I'm not sure if their AI "art" is legit or not
That uncle restoration caught me off guard, can't even lie
AI generations and true art is starting to turn into the -Hacker- Cheater (sorry) vs Developer war. A developer blocks a way to cheat, a hacker finds a workaround, rinse and repeat.
The thing about hackers is that there is a monetary incentive for hacking, since you can get access to valuable data. With poisoning AI, there's no incentive for the developers of these poisoning tools apart from the feeling of doing the right thing.
6:40
except both sides enter this game as hackers, trying to break the system for an advantage
@@annoyannoy i think you only know one side of the coin. There are Malicious hackers yes. But there are also Hackers who use their activities for positive things. Only recently hackers caught a large rail manufacturer using inbuilt mechanisms to brick entire trains should they be serviced by someone else.
I would argue that there are many more of these whitehat hackers than the malicious blackhat hackers. Just that the ones with malicious intent cause headlines.
@@annoyannoy Hacking is not only data hacking or systems hacking; it's doing whatever you want with something.
A hacker can be someone 3D printing, or changing how their electronics work.
Glaze + NS would probably fill the same niche as a watermark. A paying customer would probably want the original unaltered copy, but those watermarks aren't _for_ paying customers.
It's so bizarre how many techbros will just approve of unprecedented mass copyright infringement because it makes the slop get made a little quicker.
People talk about dystopian futures due to the government/technology etc. but the people who will bring forth that dystopia are americans, willingly.
It's not unprecedented. This has been a "legal" thing since the PATRIOT act, when the NSA has argued that processing personal information from mass surveillance into metadata, is in fact not mass surveillance, because you don't retain the original personal data and cannot reconstruct it just with one profile (this is how AI is trained, your "original" is not stored in its model). You just get a blob of processed metadata that doesn't mean anything by itself. So if you're okay with the US spying on you and training their models on you for the past two decades, then you should just be quiet. Americans have shoveled their own grave on this matter 20 years ago.
Well said
Fuck usa
I don't like the idea of having to make all images of art lower quality. And AI is definitely gonna just learn and adjust to this
It's an arms race.
As a hobby artist I find this quite an enjoyable development. Looks like we are getting more ammo for our side of the war after the corporations caught us off guard at the start.
On a serious note, I really doubt this technique will have the intended effect on large model training. These techniques work by using a small proxy CLIP model, and the hope is that if the noise fools this model, it will also confuse a larger model trained on a different dataset, with potentially a different architecture, while not introducing too many artifacts. Nightshade also promised to confuse auto-labelers, but now that it's out, even with the strongest setting I can't manage to fool GPT-4V, or even an open-source and much smaller model like LLaVA-1.5; as another paper has shown, the bigger the image encoder becomes, the more aligned it is with human vision. In the end I think some people will spend tons of compute creating images with adversarial noise that will actually help the models learn more robustly and not take the shortcuts the proxy model did (the reason it was vulnerable). Since the images will be labeled correctly whether or not they carry the adversarial noise, the model will learn that the patterns that deceived the proxy model are not what a human would think a cow is, for example, and it should end up aligning more with what a human would expect.
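The "proxy model" weakness described above can be sketched with the classic FGSM-style construction on a toy linear classifier (pure numpy, purely illustrative; real attacks operate on deep networks): a tiny per-pixel perturbation aligned with the proxy's weights reliably flips the proxy's output, but nothing guarantees it transfers to a differently-weighted model.

```python
import numpy as np

rng = np.random.default_rng(1)
w_proxy = rng.standard_normal(16)               # the attacker's proxy model
x = -0.01 * w_proxy / np.linalg.norm(w_proxy)   # input the proxy scores slightly negative
assert w_proxy @ x < 0

# FGSM-style step: move each "pixel" a tiny amount along sign(w_proxy).
eps = 0.05
x_adv = x + eps * np.sign(w_proxy)
assert w_proxy @ x_adv > 0  # proxy model fooled

# A different model (different weights) gets no such guarantee: the
# perturbation's alignment with w_other is essentially random, so
# transfer to bigger, differently-trained models is unreliable.
w_other = rng.standard_normal(16)
transfer_effect = w_other @ (eps * np.sign(w_proxy))  # sign not guaranteed
```

The proxy flip is guaranteed here because the perturbation's effect on the proxy score is `eps * sum(|w|)`, which always dominates the small starting margin; against `w_other` that dot product can land either way.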
And that's not getting into the fact that all these countermeasures have a noticeable effect on quality. You can't glaze a lot of the art that's shared online, and even for the art you can, counter-countermeasures for this stuff pop up regularly.
Thank you for covering this; the war has just begun, and Nightshade is a fantastic first offensive. If any of you who sympathize are in the States, please consider donating any amount to the Concept Art Association's fundraiser for protecting artists against AI. They are making wonderful legal efforts.
Thank you philip. As an artist and a long time follower of yours I really appreciate you putting light on this :-) I look up to you and really admire everything you do^^
Thanks philip. I'm having a panic attack because of 6:47
unfortunately, if glaze gets popular, a de-glaze AI will probably be developed to restore the original for training AI. I don't think it would even be that difficult for them.
It’s not even needed, some current models aren’t even fooled. It’s a losing game trying to trick something that can learn from said tricks (if they even work on the model in question in the first place)
Apparently just applying a gaussian blur gets rid of nightshade, so more robust methods are clearly needed
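The blur claim is plausible because adversarial perturbations live mostly in high spatial frequencies, which a Gaussian blur attenuates far more than smooth image content. A toy 1-D numpy sketch (illustrative sine-wave signals and parameters, not Nightshade's actual perturbation):

```python
import numpy as np

def gaussian_kernel(sigma: float, radius: int) -> np.ndarray:
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()  # normalize so the blur preserves overall brightness

def blur(signal: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    return np.convolve(signal, gaussian_kernel(sigma, radius=6), mode="same")

x = np.linspace(0, 2 * np.pi, 256)
content = np.sin(x)                   # smooth "image content"
perturbation = 0.1 * np.sin(40 * x)   # high-frequency "glaze-like" noise

blurred = blur(content + perturbation)
# The blur strips most of the perturbation while barely touching the content.
assert np.abs(blurred - content).max() < 0.08
```

Of course the blur also softens real fine detail, which is why attackers would pair it with an upscaler; the point is only that the perturbation is far easier to suppress than the content is to preserve.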
I am not an artist but I enjoy the stuff PEOPLE create. These days, if it's not completely obvious, I just don't trust artists that have no works posted before 2020. If there are more people thinking like me, it means that becoming an artist now has become much harder. Also, the art that looks the best gets hit the most.
It's a little depressing that you don't trust artists who started after 2020. I feel like I can instantly tell an AI generated image from regular art with 99.9999% certainty
It's interesting you mention this because I've seen a certain inclination that some beginners have towards making use of AI
The problem with both these options is that they fundamentally alter the picture and introduce noisy artifacts. Passing a picture into some edge-refining tool would most likely break this. Obviously it blocks data scrapers trying to automate the process, but anyone with enough time to actually want to copy a specific artist would probably be able to do so without too much hassle.
I genuinely appreciate you making this video, cuz I feel my art being used by someone for AI is unavoidable at some point during my art journey. This is a good video I'm saving, and it's definitely useful for when I post my art online while keeping the clean versions and original files myself. I really didn't expect this is something I would learn from you on this sort of matter xd
The human urge to spew mass produced forgettable garbage at the lowest effort possible.
Poisoning these models would be a really good use of trolling, especially if we can get selective trolling based on how damaging the tool is. Modern art AI programs are dangerous for our careers and the artistic world we want to live in, but AI can be really cool and useful too. Much of what chatbots do (outside of when they try to write stories) is amazing, and AI even has artistic uses, such as easily cleaning up brush strokes, turning 20 tedious brush strokes into 1 keystroke.
Its really shitty that companies are having a "ask for forgiveness not permission" model for AI stuff, you should ALWAYS get permission first.
In our contemporary cyberpunk dystopia, only the poor need to ask for permission.
It is called Fair Use. That is the whole purpose behind it: so people can use works for research and educational purposes without needing permission.
artists immediately 180ing on fair use when it no longer suits them and demanding a total extension of copyright to styles and themes makes me glad AI will replace these useless eaters
Who's going to opt in though? Nobody. None of our established legal frameworks are capable of properly dealing with this as is.
Positioning the current use of AI models as being “for education” exclusively is laughable. Companies are using AI models trained on data obtained without permission in for-profit ventures, and have been for quite some time.
most of the art i see has already been reposted so many times that it's a crunchy pixel mess, so this filter wouldn't really change how i see the work anyway.
Jumpscare at 6:47
I don't see any jumpscare
@@accountwontlastlong1 probably meant the soundtrack; 2kliksdad's music is somewhat strange and experimental
I believe the entire concept of intellectual property needs to be re-examined thoroughly. No one wants the artists to starve, but at the same time, if we go really hard with AI licensing laws, only gigacorporations will be able to buy enough training data to create AIs. It will end the open-source grass roots level AI development right then and there. I believe that to be the more dystopic alternative.
I couldn't agree more. The "options" are not "AI exists or AI doesn't exist". The options are "everyone has access to AI" or "big companies have access to AI". There's no denying that it will be a rough time for artists, but I believe being as open about it as possible is the best course of action.
Even if big corporations agree to pay artists for their training data, they would do so once and then have their perfect AI. In the long term there is no winning for artists, apart from trying to work WITH AI, using it as a tool to speed up their own creation process.
@@Magnos We are way past the point of choking AI development out by restricting its access to training data. Even if no new data were given to AIs, the existing models have enough training data, that the models can begin using pictures generated by AI to train their models. We only really need human artwork for the initial seed, and after that the selection weights will do the rest. Sure, it will be faster with quality human work as of now, but we most certainly don't have a scarcity of data. Using real works for training is just the easy way.
All restricting access to more data will accomplish is raise the barrier for entry for AI developers, meaning we will end up with some mystery Google blackbox AI, rather than thousand individual models, most of which being open source.
@@MidWitPride No, you deeply underestimate the cannibalistic and incestuous nature of AI training
There's a clip out there of an Adobe representative saying they want artists to be able to copyright their style. I'm afraid of what happens if that gets enforced.
Yes. But also: Why are artists worried if they claim AI cannot produce "real art"? Seems like a bit of a double standard. You can't make that claim and at the same time say it will result in a decline in creativity. You could make the point that oversaturation and the lack of professional artists will result in fewer people taking it on as a job, but there's a reason that it's a dream job to many people besides the money. There is nothing inherently degrading about making it a hobby. Like with automation in general, it will just mean there's less demand for that kind of labour. Given enough time, development, and data, this could make creating art more accessible than ever before, if you give the user more control and more consistent quality in what they're generating.
I understand wanting to get compensated for contributing to a technology they never agreed to be a part of, but a lot of artists are just flat out anti AI and are pushing for the entire thing to be illegal or heavily regulated like you said. If you don't want your art to be "stolen*" then don't put it on the internet.
OpenAI is going to such ridiculous lengths not to offend licence holders or anyone else already that their company slogan may as well be "Please don't sue us.". It was extremely useful for coding until almost every request resulted in placeholders being put in instead of actual code. I feel sorry for anyone still paying $20 a month for this if that's what they use it for.
I loved the Dr. Seuss-esque image "cockification" at the end - great video!
I think it's not Artists vs. AI, it's Artists vs. profit-oriented ppl with no creativity using AI. At the end of the day AI is a tool and I don't mind ppl using it if they have the consciousness to use it to help. But for ppl who're planning to use it to copy or replace artists: you're literally trying to kill creativity and the connection to the soul through the facade of progress. And much later ppl are going to realize that these AI-gen images are just boring and will look elsewhere.
Great timing. Saw stuff about nightshade but didn't find a good video on it. Now I have one
The dream solution would be a special type of file that artists can save their work under that has built-in blockers for AI.
DLsite has "dlst" files that are encrypted, and can only be unlocked by people who have bought them (you need to be signed in). They also require a specific program to open, that prevents taking screenshots of the comics downloaded. It detects that and will show as a black page instead, even when trying to use external screenshot program. Yet still there are people getting around that and pirating stuff.
DRM is demonic.
That's not possible, as images are just a series of color data. If the data to mask it from AI is separate, it will only affect normal users; people training AI will just not read that part.
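To illustrate that point: any "don't train on me" flag would have to ride along as metadata, separate from the pixels, and a scraper can simply drop it while keeping the image intact. A rough stdlib-Python sketch, assuming the flag lives in an ancillary PNG chunk (the `noai` key here is hypothetical):

```python
import struct
import zlib

# Chunks a PNG decoder strictly needs; everything else (tEXt, iTXt,
# hypothetical opt-out flags, etc.) is ancillary and can be discarded.
CRITICAL = {b"IHDR", b"PLTE", b"IDAT", b"IEND"}


def strip_ancillary_chunks(png_bytes: bytes) -> bytes:
    """Return the PNG with all ancillary (metadata) chunks removed.

    PNG layout: 8-byte signature, then chunks of
    [4-byte big-endian length][4-byte type][data][4-byte CRC].
    """
    sig = png_bytes[:8]
    assert sig == b"\x89PNG\r\n\x1a\n", "not a PNG file"
    out = [sig]
    i = 8
    while i < len(png_bytes):
        (length,) = struct.unpack(">I", png_bytes[i:i + 4])
        ctype = png_bytes[i + 4:i + 8]
        end = i + 8 + length + 4  # header + data + CRC
        if ctype in CRITICAL:
            out.append(png_bytes[i:end])  # keep pixel-carrying chunks
        i = end  # skip ancillary chunks entirely
    return b"".join(out)
```

Re-encoding through any image library has the same effect, which is why Glaze and Nightshade perturb the pixels themselves rather than attaching metadata.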
It is definitely a dream solution; due to how AI works, there will always be workarounds, whether learned during training on its own or deliberately built in to ignore the blockers. It is far too complex for really any simple solution, and if one is found, a new architecture will render it irrelevant
The dream solution would be a platform where you can upload your work and it's protected for you. Instagram was good for that, since you can't really save images from there. But with a filter it would be dope. 😢
I think glazing will create demand for prints. If a user wants to commission an unglazed image, they can pay extra to get a physical copy. People that want a beautiful SFW artwork will likely pony up the dough. On the other hand, the porn artist will still be raking it in with the glazed images, since you don't need the pic to be high-res to wank to it. But really, there should be laws to protect art from being used as training data. I feel like the best solution for both parties is if the artist is given the right to sell rights to use their art as training data; AI could instead become an impetus for artists to make money instead of being put out of work.
Museums, the only place where you can find non-tainted images.
the weird thing about glaze is that it distorts the image in a way that could be confused for AI generation
I think that's kind of the point, it's meant to poison the model more than anything, that being its main intention.
This makes me wonder if one day the artifacts we see as a result of stuff like Glaze and Nightshade will be looked at the same way we look at lines in a TV or old grimy static. Will glaze, ironically in a way, be seen as an artistic expression of defiance and a mood?
When I first saw glaze, as shown in your video, I actually thought it was AI generated artwork, before thinking instead that it was JPEG artifacts. Not sure what that means for the tool if the first thought that someone has when they see glaze is that it's AI artifacts.
I mean, given that Glaze is an AI tool, they kind of ARE AI artifacts, lol
I used to be really pro AI art, even made plugins for the popular UIs, but the more I dig into the community, the more "crypto-bro" it becomes. I went onto Holly's reddit thread and the comments there pretty much look straight out of a crypto bro's vocabulary; it's an Us vs. Them type deal to them, and anyone who questions the morality of AI and using other people's work is seen as FUD.
I hope to see a future where tech continues to advance and we can create new thing that were impossible before, but I don't like AI images flooding sites for artists.
I want to side with artists because they're being taken advantage of by big companies, but I also want to see the technology get better.
Really conflicted on this.
@@Speejays2 Yeah, AI generated pictures shouldn't be shared in the same spaces or ways others are. The public at large is too easily fooled.
The other side isn't any different, sorry to break it to you. Furthermore, they reinforce each other
It's a choice between crypto bros and technically illiterate moral panickers spreading misinformation about technology to push their agenda.
@@LutraLovegood And yet the most popular sites for sharing art and portfolios (ArtStation and DeviantArt) actively promote it. It’s a massive fuck you to artists
re: would they notice? - I've seen larger errors in AI generation go unnoticed, like strange architecture, the amount of teeth in a face, finger shapes on hands, or the way foliage is too chaotic even for nature. As an artist myself I feel that if a work of art has those defects after being run through Glaze and/or Nightshade, it would be indistinguishable from an aspect of the style in which the artist renders their work, though it may be more obvious if the art isn't a painting.
it might stop AI from stealing your particular art, but it won't stop AI from lowering your wages
Yes, and digital photography reduced the need for jobs developing film and the sale of said film for cameras in general. Polaroids are a niche product now. Cassette players are a bygone era, CDs as well. A whole lot, perhaps the majority, of artists use digital software to digitally paint now, lowering the demand for canvas and paints. Someone has always suffered as we have advanced the way we do things. If you're a good artist, you will find work. The artist's name is a brand as well. I find it bordering on stupid to try to "fight AI" when it's not going to disappear. At this point, it would make just as much sense to start fighting everyone who can use their eyes and copy art styles as well.
@@nustaniel AI is still a bad tool for hacks without taste. A problem previous innovations just didn't have. It might not go away but we can still ruin its reputation.
@@Cyliandre441 But there's no need to "ruin its reputation." It comes off as juvenile in my eyes. A bunch of artists afraid it'll take their jobs, when it won't. If you are a good artist, you will find work. People will always want to hire a real artist. AI won't replace that need, no matter how good it can get as it is further developed. Traditional painters, sculptors and so on still get work nowadays, and in some cases very good sums of money for it too. I don't care to use AI outside of the curiosity to try something new, but I see it as a tool. There's so much wonky stuff about AI generated art that an artist needs to go in and fix it anyway. I guess traditional painters should revolt and ruin the reputation of using digital software with Undo, Cut, Copy and Paste as well. (They tried btw.) "Cheats and tools for hacks who can't paint!" AI as far as I see it is a tool that artists can use, if they want. To everyone else it's a toy. I also don't see what taste has to do with it. If the generated artwork looks good, it looks good. If it doesn't, it doesn't. There's a bunch of art drawn by humans I wouldn't consider good taste as well. Get off that pompous high horse.
@@Cyliandre441 It is also a tool for people who don't have thousands of hours to gather experience, and hundreds of hours to get the pictures they want... Or for that matter thousands of dollars to commission everything.
You won't ruin the reputation, you'll just look like sour assholes angry that progress caught up... Now art is accessible and available to the uninitiated... Without having to spend thousands of hours, or thousands of dollars.
You know what artists said about digital artists? They were hacks, soulless, they could never replace them, etc.
You sound the same. Stop whining and being an elitist prick.
@@nustaniel stop with the comparison to "copying" art styles.
I absolutely hate people that have never studied art coming with this stupid shallow argument.
I thought AI art was cool at first but after 2023 I kinda wish it would've never existed. It's just so effortless and worst of all you see it everywhere. It's literally spam, just google fanart or something and you'll see.
You can add "-ai" on the search, and it will help filter a lot of it.
I sure love adding an AI tag every time I search for something, internet is blooming! Love how there isn't a way to permanently filter these out
AI will never truly replace human imagination. Keep doing you!
As a temporary measure there should be some kind of custom small watermark, or something like: if you include this string of numbers or text in the uploaded file, the model won't take it
ill make a bold statement
AI isn't even AI, it's just really, really advanced code
until some prodigy makes a truly sentient AI that is similar to a person, it is nothing more than a more advanced machine; and machines can't exactly have human creativity
EXACTLY!
I like that you are using AI art for the video thumbnails. The artist has two left hands.
no? the hands are clearly correct, what are you talking about?
Those stats were heartbreaking 😭
God dammit, why do I think your humor is fantastic - I love that you embrace it without going too far off the rails
You can also double enchant your image so half of the image has Glaze and the other half is Nightshade. So a +2 enchant 😂
And the saddest thing about this is people are actually calling themselves artists and calling this their art. Because it's only a "tool". It's like saying I know Chinese and I'm a translator because I know how to use Google Translate.
Generative art tools have been around for a long time. A lot of digital art is based on them.
Nightshade and Glaze, the good AI
You're not wrong about what the search results will turn into 💀🤣
Programmers:
❌Automate routine work so that people concentrate on a carefree life full of entertainment and art.
✅Automate entertainment and art so that people are forced to concentrate on boring routine work.
The time has come that physical artworks have become more valuable again
This! Yes. I think this allows even emerging artists to price their original work way higher than what may once have been advised when starting out. An artist I follow shares her process of creating her art. Her paintings sell at 12 to 17 thousand dollars after she hit the jackpot in sales once, going on TikTok. So now I'm thinking that pricing an original painting, say a 24x30, at a thousand dollars as an emerging artist somehow doesn't seem so high a price tag anymore.
How authentic and honest is the nightshade or glaze? I feel like now I have trust issues with everything new that comes out!
Human creativity and expression does not need an automated solution. Generative AI causes unemployment and benefits no one but the people at the top. Generative AI is also great at creating misinformation, CSAM and other dangerous things. AI should stick to research and medicine where it can actually benefit people.
"Wild West era for AI" is the best way to put it
generative ai is the worst piece of technology ever invented, why are we in such a rush to replace what makes humans special
This is truly a tough problem. Artists are absolutely justified in being upset about everything going on with AI and training on their work for free.
However, I fear that the cock’s out of the bag, and there’s no getting it back in. Wait… cat?
Yeah, this is likely true. The big companies running Dall-e and Midjourney already have scraped datasets of probably billions of images.
I somewhat doubt that new data is even a topic in improving image generators. Properly tagging all those billions of images is likely more of a challenge.
They're not justified in the slightest. AI 'steals' in the same way that you or I 'steal' by just looking at things and remembering the conclusions we drew from doing so. They're perfectly fine having people look at their art and having people draw conclusions from them and remember those conclusions, but somehow it becomes an issue when people use a program to do the exact same thing. If they weren't just complaining over sour grapes, then they wouldn't have publicly released their artwork. They did, therefore they have no leg to stand on.
@@ChaosSwissroIl I agree with that completely. The AI really does just see things, learns, and uses it to draw upon when making something a user has asked for. Sure, there's instances of it basically just replicating some things nearly identically, but that's a small part of the problem.
A real person can't remember billions of sources of inspiration and create something new with it in seconds. I think that's where I feel most sympathy with artists.
The only way people will commission artists for custom and unique works moving forward is just going to be out of pure principle. Which works for individuals requesting new art, but businesses and corporations don't operate on principles, they operate on profit margins...
Artists have every right to be upset about all their work being used, without permission, to train AI that is going to put them out of job, or potential careers.
I say this as someone that's been fascinated with, and used AI for all sorts of stuff over the years. I like it, and use it almost daily.
I can't even think of a good compromise to try to make everyone happy.
@@ChaosSwissroIl those models don't "learn" anything, they don't see or have even a remote understanding of the real world. They don't feel anything, have no emotions. AI models literally assemble Frankenstein-esque copies of the "trained" material, and are designed to circumvent copyright law. It's basically a machine designed to screw artists, so they're very justified.
@@BRUXXUS You're completely wrong. AI pumps out thousands of images, but it has no knowledge of what it creates, it doesn't see any relationship between objects nor even sees them. It literally has no awareness of anything going on in the real world and can't put emotion into anything. It can perfectly replicate and recall thousands of images, and assemble them in a Frankenstein-esque way. That is not art, nor is the learning process similar to a human
I'm so happy that this exists. I'm so depressed that it has to.
I still think as an artist, it is copyright infringement to use artworks that you do not have the rights to, to do anything with intention to distribute, including training ai
Idk how anyone can justify AI art. IT IS LITERALLY STEALING. These people TOOK HER IMAGES they stole her art and had ai regurgitate her art. It's not okay. They shouldn't be allowed to do so. That's literally as if they stole her art prints and then started selling them claiming it as their own art. ISN'T THAT ILLEGAL? How can anyone defend AI.
People defend a lot of insane things. I mean, a quarter of the population is voting for jeffrey epstein's friend in November.
Yup, people who defend ai just want to profit off it because they have no skill themselves-but obviously won’t just admit that
This sort of controversy gives me very similar vibes to react content, where AI training seems to be taking content without permission and only taking it out after a request, which is what a lot of reactors do. I wonder about that for you, Philip, as I'm sure you might've been a victim of reaction content. What's your stance between the two? Do you feel the same about both? Or do you think it's different for original YouTube content being reacted to vs original artwork being trained on?
With most reaction content it's clear that it's against copyright, as it's not fair use (for most reaction channels).
Going after them is super hard and tedious, but a community has started suing them, and so far none of them have appeared in court.
No, AI training is not even close to react content. React content literally copy-pastes the original with an added facecam, while AI models remember thousands of images and spit out Frankenstein-esque assembled combinations of those. Both are unethical and infringe copyright. F*ck both
There could not be a more transformative use of content than using it as AI training data
I feel AI art is even more fair than most reaction streamers. It isn’t reposting art that it didn’t make, it’s learning from art and then making its own.
Genie's out of the bottle and it's not going back in regardless of what anyone wants. The conversation should be about how artists can get compensated for their work going forward, but trying to "stop" this ain't ever going to happen. It literally _can't_ happen. That's not how disruptive inventions like this work. _Someone_ will use it and that's that.
I mean, remember Metallica trying to stop Napster? Doesn't work.
Nah, the conversation should be about how AI in it’s current state couldn’t function without the works of others without their consent.
Saying “Genie’s out of the bottle lol” isn’t going to make the use of other artists’ intellectual works (without their consent and en masse to boot) suddenly okay.
Emerging technology isn’t immune to regulation, but the law does need to catch up to AI.
@@thatradioboy Yeah, those are _feelings._ Non-enforceable feelings.
Like I said, people _will_ use this technology... and in the coming years you won't even be able to detect that it was AI that was involved at all. What then? Think laws are going to stop that?
@@Cimlite Why do you AI Bros have to be such snarky douchebags lol?
AI uses people’s intellectual property without their permission, that isn’t a feeling bro that’s a fact and is the reason why so many people dislike this technology.
And AI generated images will always have aspects that call them out as AI, no matter how good the technology gets.
@@thatradioboy I'm no "AI bro", I'm just a realist about these things. AI is invented, and it's not going anywhere. Trying to legislate it away is a fool's errand. You say that generated images will always have tells, and that's just not accurate. Even today, there's images that fools people (just look at the AI painting that won an art award). Right now, you can't tell when it's a good image... and the bad ones to good ones ratio is just going to get better and better. A few years and being an artist is a lost cause - and that's not something I'm happy about, but it's what's going to happen no matter how you feel about it.
@@thatradioboyThey never said any of this was okay (even though I think it’s fine). It’s just a matter of a fact: AI is unstoppable at this rate.
Nothing wrong with the tech. There is no way of stopping progress.
The fault is in ethics and copyright.
Commercial AI should only be trained with datasets they fully own the rights to
I think training on copyright data should be fine. Using AI to generate copyright infringing art / or to impersonate people is not OK. AI is fine as long as it is used as a tool for making something new, and also when you don't try to pass it as something you drew yourself.
@@willhart2188 But then it's not even an AI question, really. You shouldn't steal someone's work regardless of the tools you use to do it.
@@milandavid7223 It is not stealing, as long as what you make is transformative, and adds to it in a meaningful way. How you use the tool is important.
Should you only be able to view and memorize images you have full right of ownership to?
I GOT SO SCARED WHEN I HEARD CABOOSING AGAIN. PLEASE STOP
Glad you dedicated a video to this side of the conversation as well. Must say I was a bit annoyed with the general public's praise of generated imagery that came at the cost of using copyrighted artwork, when using it that way commercially wouldn't have been dared before. But developments like this make me hopeful that, even if models using copyrighted artwork don't become unusable, their commercial use will at least be banned.
GOOD, if their whole purpose was to get rid of artists, we artists have zero compassion when we say we want to get rid of these "AI" image generators.
Great video, learned something useful for an essay I'm currently procrastinating on writing. I'm sure it'll come in handy soon enough. Thanks!
i feel like artists shouldn't pretend like they care about copyright law esp when so much of the internet's culture is based on freely taking random images you find and remixing them
Imagine if Nightshade and Glaze were implemented into art programs like Krita or Blender
Given the open-source nature of those programs, I'm sure somebody is already implementing this!
it won't take long for AI to surpass Nightshade and Glaze
But you can already ruin your artwork by running the brush tool across the canvas
@@fagadafa and thus a cat and mouse game has begun
but the cat is blind and the mouse (AI) is Jerry on steroids @@bas_ee
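For what it's worth, the "just shrink or blur it" counterattack mentioned at the top of the thread works in principle because adversarial perturbations live in high-frequency detail that averaging destroys. A toy sketch in plain Python (nested lists stand in for a grayscale image, and the checkerboard noise is an idealized stand-in for a glaze-like perturbation, not the real algorithm):

```python
def perturb(img, eps):
    """Add a +eps/-eps checkerboard: an idealized high-frequency 'poison'."""
    return [
        [px + (eps if (x + y) % 2 == 0 else -eps) for x, px in enumerate(row)]
        for y, row in enumerate(img)
    ]


def downsample_2x(img):
    """Average each 2x2 block -- a crude low-pass filter, like shrinking the image."""
    h, w = len(img), len(img[0])
    return [
        [
            (img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]
```

Because every 2x2 block holds two +eps and two -eps cells, the averaging cancels this toy perturbation exactly; real perturbations aren't that neatly structured, so the actual arms race is messier, but the intuition is the same.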
I can't exactly blame the trolls. AIs like SD are generally trained using mass copyright infringement; the models arguably shouldn't even exist in the first place. It sucks to see artists' work get stolen for profit with no recourse, at least this way they can try to deter plagiarists.
It's not even that hard to bypass this issue, big models could've just limited their image queries to things licensed under something permissive. As for private models, it'll probably become an arms race a-la cybersecurity.
I for one welcome our cocks overlords.
Training AI is not copyright infringement
@@PrintScreen. How about copying them 1-1?
Here's a thing: Copyright laws are already outdated and harm artists, especially fanartists. Making the laws stricter will definitely evoke "be careful what you wish for".
cope, bootlicker
It's a nice thought, but unfortunately I really don't think it's gonna be enough. AI models keep being updated, and eventually an anti-poison feature will simply be part of the software.
An anti-poison feature would definitely give them legal trouble if AI laws get put in place
it's an arms race and eventually the amount of compute required to generate a non-poisoned image becomes prohibitively expensive and not worth anyone's time. Hopefully.
So, for those wondering, here's how the current process works.
Model trains, makes image, different model looks at image and says if it looks off and points out flaws, model then trains on this feedback. The "poison" system is probably just exploiting subliminal noise patterns which is literally just a bug. The entire issue they are "working with" (exploiting) will likely be completely fixed/solved in the next few months at the longest.
I don't really care about the results of these arguments, but I do love hearing them. It really makes you feel like you're in the future.
I wonder, if AI does kill professional human art careers, if that will force humans to do art for the joy of it alone, which is probably the healthiest reason to do art in the first place. Time will tell
Yeah, artists complain that companies and profits corrupt art.... but art has essentially always been for profit. And now that there is a potential to do art just for the love of art, they are whining as well.
This argument is moot - remove the word art completely. There are people who have spent years of their lives working on a skill in order to make a living, and AI has started getting these people laid off, and in the future it will continue to do more of that. In reality a job is a job at the end of the day; telling someone who has dedicated their life to a skill "it's OK, you can work minimum wage and make art for fun" isn't a solution to their problem. That's why copyright is being brought up; make no mistake, this is an effort to see if these people will continue to be able to feed themselves.
If your response to this is "fuck them, get a real job", that's pretty shortsighted. Every job is at risk, and until some magical perfect UBI utopia falls into our laps, this is a crisis of what we are going to do for these people.
@@SioxerNikita People talk about how they wish they were free to work on their art full-time without having to pay bills. It's fantasy. Talking points of a perfect world.
Telling them "it's OK, now you won't have any income from art at all" just deletes them. Decades of work and training now mean nothing. "Work at Burger King and make an OC in your small amount of free time and you will definitely be happy now!" misses both the current point and the point of the conversation prior.
@@epocfeal Well, welcome to the world of shoemakers... welcome to the world of 99% of European farmers... welcome to the world of 95% of factory workers... welcome to the ... I think you got the point.
This is just the "I like the benefits society has gotten from all the automation... until it hits the field I like!"
@@SioxerNikita Automation has always been a shitty thing, return to monke
Only publish poisoned images, and only send customers a picture with an imperceptible watermark (I'm sure it can be done), so if it leaks you know who leaked it and can sue
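The "imperceptible watermark" idea is at least technically plausible; the textbook toy version hides a buyer ID in the least significant bit of the first few pixels, shifting each value by at most 1. A minimal stdlib-Python sketch (pixels as a flat list of 0-255 ints; note this naive scheme is wiped out by any recompression, so real forensic watermarks are far more robust than this):

```python
def embed_id(pixels, customer_id, bits=16):
    """Hide a customer ID in the least significant bits of the first `bits` pixels."""
    out = list(pixels)  # leave the caller's list untouched
    for i in range(bits):
        bit = (customer_id >> i) & 1
        out[i] = (out[i] & ~1) | bit  # overwrite only the lowest bit
    return out


def extract_id(pixels, bits=16):
    """Recover the hidden ID from the same pixel positions."""
    return sum((pixels[i] & 1) << i for i in range(bits))
```

Since each pixel changes by at most one intensity level out of 256, the mark is invisible to the eye, which is the property the comment above is after.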
Bro nuked his ad revenue in the last 30 seconds 😂
So glad the default state is "your art will be stolen by default without consent. Click here to stop that."
Theft by definition requires depriving the original owner of property. No such thing as stealing intellectual "property".
@superlazydog2183 Copyright exists precisely because you couldn't claim someone stole something that you still have in your possession.
@superlazydog2183 Then why are copyright violations prosecuted under entirely different laws, if it is just theft?
"instead of cocks we will get cocks"
and then they'll do the reverse: instead of boobs we'll get moobs
I don't think this will do much in the long run. If we as humans can see through the filter, an AI will eventually see through it too. A copyright-compliant AI dataset should be created and slowly but surely enhanced enough to be useful for further AI training. There is no point in "war". Solar Sands has a good couple of videos on this; AI models are inevitable.
There's no way to stop the advance of AI, only to slow it down. It's like fighting a battle you cannot win.
Sledgehammer to chat GPT's servers would work
@@Pigness7 Censorship often just makes martyrs. Then all of a sudden making AI models is the "path of freedom".
I'm hoping for artists adapting to the new reality. An artist with AI knowledge will make art with more quality than any inexperienced joe could ever make, or even dream to make.
@@Pigness7 then there's still Mistral and Claude and Gemini and millions of copies of other openly available LLMs... Once you open Pandora's Box there is no closing it. Going back is not an option, only learning to live in tomorrow. This, ladies and gentlemen, is how people react to technological progress, and this is the first true breakthrough most under 30s have ever experienced.
What's stopping people/AI/scrubbing tools from just screenshotting the image lmao
AI is already at the point where it can use generated images to improve, so all this effort is being put into an area that is already being pulled away from. And as you mentioned, the individual style copying is going to be done by individuals, and they will just pick images that aren't poisoned. And unless the artist never shares/sells an image, it can be acquired and used to train a personal model.
I have looked at the paper about poisoning the training data, and I feel it would only sort of work if the AI researchers/developers stopped working on their projects and sabotaged their own work. Part of improving the technology is making sure the quality of the incoming data is good, so it would detect either that an image is glazed and categorize it as too low quality to use, or, if it is label-poisoned like with the dog/cat thing, that the label doesn't match what it has seen in the past and throw it out too. And I guess you have succeeded in having an image not be used in an AI model, but you have also either reduced the image quality for all the humans you share it with or ruined the accessibility of the image by giving it the wrong description.
I support artists and enjoy making art, I just don't like spreading false hope, or information that doesn't seem to have much to back it up.
Using generated content to improve is the basics of GANs (Generative Adversarial Networks), so this is a problem that has already been solved.
I almost feel sorry for the people defending this AI crap. They don't understand the artists because they don't know what it feels like to have created anything of value ever in their life.
This is a good idea. How do I apply glaze and nightshade? (Especially nightshade, that one sounds absolutely hilarious)
Absolutely horrific times.
The horror hasn't even started yet 😂
I actually think poisoning techniques will work long term.
The reason is a matter of economic incentives. For people that produce AI algorithms, there is a strong diminishing return to adding new pieces to the training set. If you already have billions and billions of images, adding in one additional artist will only fractionally improve the value of your algorithm. Meanwhile for artists trying to protect their style, the value of that protection is fixed. Thus artists and creators of adversarial methods will continue to improve their methodology, while at a certain point the likes of Stable Diffusion will just stop training the algorithm and take the existing result as the final version.
There are also multiple ways to corrupt images, and it is relatively easy to devise new ones. It becomes increasingly difficult for an algorithm to detect an increasing number of differently corrupted images, without making mistakes.
"Shit on your food to stop others from eating!"
Your food in your fridge
@@d4darwin458 For normal people, yes, but artists are all scum
There are companies and some very dedicated nerds pitted against each other.
Luckily if there’s a passion for it the public will win. At least that’s what I hope. I’m fine with ai images, but we need to force them to use images they have permission to use instead of scraping the web for everything they can find.