Anti-AI Art Bill Backfires? How C2PA Really Works

  • Published: 11 Sep 2024

Comments • 796

  • @bycloudAI
    @bycloudAI  Год назад +23

    The first 1,000 people to use the link will get a 1-month free trial of Skillshare skl.sh/bycloud07231 !
    and please keep in mind that this video is to raise awareness of C2PA and how dumb it is, not to address, attack, or criticize artists like Karla Ortiz for advocating for artists' rights.

    • @RyanStrobelVFX
      @RyanStrobelVFX Год назад

      What if we reworked the system of training?
      What if the training process included another sub-process that marked the neurons associated with the image being trained on, then stored that along with the model.
      That way, another program could run the model, highlight the neurons used to create an image, and determine the closest latent origin of that image. Basically the AI citing its sources.
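
The comment above is essentially asking for training-data attribution. Purely as a toy sketch (NumPy only, random weights standing in for a real model, and a top-k activation "signature" as invented bookkeeping), here is roughly what "marking the neurons associated with each training image and citing the closest source" could look like; attribution for real diffusion models (e.g. via influence functions) is far harder than this suggests.

```python
# Toy sketch of "the AI citing its sources": record which hidden units fire for each
# training image, then rank training images by overlap with a generated image's units.
# The "model" here is a random projection used purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
N_TRAIN, N_PIXELS, N_HIDDEN, TOP_K = 100, 64, 32, 5

# Stand-in for a trained model's hidden layer (random weights, illustration only).
W = rng.normal(size=(N_PIXELS, N_HIDDEN))

def activation_signature(image):
    """Indices of the TOP_K most active hidden units for an image."""
    hidden = np.maximum(image @ W, 0.0)        # ReLU activations
    return set(np.argsort(hidden)[-TOP_K:])    # strongest units

# "Training-time" bookkeeping: store one signature per training image with the model.
train_images = rng.normal(size=(N_TRAIN, N_PIXELS))
signatures = {i: activation_signature(img) for i, img in enumerate(train_images)}

# "Citation time": rank training images by signature overlap with the generated image.
generated = train_images[42] + 0.1 * rng.normal(size=N_PIXELS)  # near a known source
gen_sig = activation_signature(generated)
ranked = sorted(signatures, key=lambda i: len(signatures[i] & gen_sig), reverse=True)
print("Closest latent origins:", ranked[:3])
```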

    • @allanatiers9261
      @allanatiers9261 Год назад

      Mass internet surveillance, and surveillance in general, will come anyway. What do people think CBDCs, digital ID, and basic income are for... Don't pin that on artists. If artists are not used as the scapegoat, they will find another group of people for that. Governments and big tech are the villains here, not artists.

    • @michaelhallett2842
      @michaelhallett2842 Год назад

      @@RyanStrobelVFX Am I wrong in thinking of training data as inspiration? You shouldn't be required to cite your inspiration.

    • @fnytnqsladcgqlefzcqxlzlcgj9220
      @fnytnqsladcgqlefzcqxlzlcgj9220 Год назад

      Hey man, I'm not sure about you saying over and over that the Adobe AI is copyright-free. The images ARE copyrighted, they just own the copyright, so it's completely privately copyrighted. That means there will be competitors that 'license' artists' art to go into AI models, so you'll have bots canvassing en masse for small artists to sign the rights to their art away for a tiny amount of money, forever, and then that company will sell licenses for their "copyright-free" AI that has been built on exploiting small artists who don't understand that they are the product.
      It's not copyright-free! That is a misnomer, it's actually misleading; it is PRIVATELY copyrighted.

    • @allanatiers9261
      @allanatiers9261 Год назад +1

      @@fnytnqsladcgqlefzcqxlzlcgj9220 Plus, Adobe's AI is not trained on their own images. They used the same LAION-5B database, and only AFTER that did they "fine-tune" it with their own stuff.

  • @Starius2
    @Starius2 Год назад +401

    Whenever Disney or Adobe wants something, I always assume it's going to mess with our rights

    • @absolstoryoffiction6615
      @absolstoryoffiction6615 Год назад +1

      It's a terrible idea that only benefits the corporations...
      This is not the answer when the solution WAS CLEAR AS GLASS...
      Honestly... it's not a debate, but the line was crossed and America burns for it.

    • @Starius2
      @Starius2 Год назад +2

      @absolstoryoffiction6615 Can you clarify a bit so I can make sure I am on the same page with what you mean?

    • @absolstoryoffiction6615
      @absolstoryoffiction6615 Год назад +3

      @@Starius2
      It's as this video stated. Their proposal is both in contradiction with US copyright law and highly concerning for privacy reasons.
      Who owns the data?... This is not how legislation should be done, for and only for corporate interests.
      The government will not hold onto that data. Corporations will instead.

  • @madcatandrew
    @madcatandrew Год назад +152

    They always have a great excuse to take away:
    Encryption
    Data privacy
    Personal privacy
    Personal property rights

    • @johanbrandstedt9570
      @johanbrandstedt9570 Год назад +6

      That’s what Stability, Midjourney and OpenAI have been doing.

    • @ko-Daegu
      @ko-Daegu Год назад +12

      @@johanbrandstedt9570 How did Stability take your personal privacy with open-source software? Please elaborate.

    • @johanbrandstedt9570
      @johanbrandstedt9570 Год назад +2

      @@ko-Daegu Since they don’t care to filter properly, current-gen sludge models contain a lot of private material, including medical x-rays.
      As for OpenAI, Meta and Anthropic, they all train on The Pile, which contains Books3, a collection of 197,000 pirated e-books with their DRM removed.

    • @ko-Daegu
      @ko-Daegu Год назад +8

      @@johanbrandstedt9570 "a lot of private material, including medical x-rays." would you please showcase source were they used unlawfully a private x-ray image ?
      how did they optain it from the get it go ?
      thus being said none of what you say is attack on privacy (it's an attack on other things like IP this is outside my field of speciality so I will take your world on it) however, I fail to see what StabilityAI did was anti-privacy
      the rest yeah i agree ..

    • @ultimamage3
      @ultimamage3 Год назад +7

      @@johanbrandstedt9570 and how did the "private material including medical x-rays" become available to public scraping? answer quickly.

  • @NopWorks
    @NopWorks Год назад +29

    Remember years ago when people found out that computers in North Korea apply digital signatures to every file they encounter, allowing them not only to trace where a file originated from, but also which computers have viewed it?
    Yep, we're apparently heading in that direction.

    • @silbus8670
      @silbus8670 9 месяцев назад +1

      And it's all thanks to the greedy bastards from big companies stealing other people's rights.
      So yeah, you're right :3

  • @betterthantrash111
    @betterthantrash111 Год назад +93

    What's funny is that C2PA would require all of that metadata to be signed by a device, but bad actors could change what the device thinks that metadata is, and could frame people.

    • @ko-Daegu
      @ko-Daegu Год назад +20

      This is what happens when a group of people led by emotion try to play the role of an engineer or a lawyer, trying to solve political, legal, or engineering problems by introducing feelings-based solutions.

    • @gabrielc7861
      @gabrielc7861 Год назад +7

      @CertifiedAbyssGazer Sounds like musicians pushing for stricter copyright laws.

    • @MattGoldenberg
      @MattGoldenberg Год назад

      Couldn't you just sign it with a private key?
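
As a hedged illustration of the sign/verify step this thread is debating (not the actual C2PA implementation, and with a made-up manifest layout), the sketch below uses the third-party `cryptography` package to show what "the device signs the metadata with a private key" amounts to, and why a valid signature still cannot vouch for the truth of the claims it covers.

```python
# Minimal sketch of device-signed provenance metadata. The manifest layout is invented
# for illustration; it is not the real C2PA schema. Requires: pip install cryptography
import json
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

device_key = Ed25519PrivateKey.generate()   # stand-in for a key baked into a camera/app

def sign_manifest(asset_bytes, claims):
    """Bind a set of provenance claims to the asset's hash and sign both."""
    manifest = {"asset_sha256": hashlib.sha256(asset_bytes).hexdigest(), "claims": claims}
    payload = json.dumps(manifest, sort_keys=True).encode()
    return manifest, device_key.sign(payload)

def verify_manifest(asset_bytes, manifest, signature, public_key):
    """Check the signature and that the manifest refers to this exact asset."""
    payload = json.dumps(manifest, sort_keys=True).encode()
    try:
        public_key.verify(signature, payload)
    except InvalidSignature:
        return False
    return manifest["asset_sha256"] == hashlib.sha256(asset_bytes).hexdigest()

image = b"...raw image bytes..."
manifest, sig = sign_manifest(image, {"tool": "camera", "ai_generated": False})
print(verify_manifest(image, manifest, sig, device_key.public_key()))  # True

# The catch raised above: the signature only proves *which key* signed, not that the
# claims are true. Feed the device doctored data and it will happily sign false claims;
# strip the manifest entirely and there is nothing left to verify at all.
```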

  • @AlistairKarim
    @AlistairKarim Год назад +325

    It's similar to how musicians inadvertently harmed themselves with copyright issues. They were afraid of piracy, so they rallied for various legislation. However, as a result, they didn't earn any more money and now they themselves are fearful of copyright infringement.

    • @asterlofts1565
      @asterlofts1565 Год назад +45

      TRUE. They are very similar cases because in both, they did not seek a balance between the modern era and its technology and the traditional, classical methods of making art... they just wanted to destroy the other side without thinking about the real present and future consequences of those actions, because it affected them financially and their egos (they feel unique and special for doing what they do). In addition, they became very irrational and inflexible on these issues; they saw technology as Satan in person and did not stop to think about the good possibilities of using it or the bad consequences of not using it. That does not mean they should directly embrace everything new that is coming, but as I said, they did not stop to think about how to find the best solution to this problem, and it went badly for both music artists and visual artists of all kinds.

    • @AlistairKarim
      @AlistairKarim Год назад +35

      @@asterlofts1565 Well, I agree, but I think that it can be expressed in a simpler manner - they preferred to be right, instead of being effective. It's a common mistake. And that's how we're being ruled as well, being dragged by our emotions.

    • @Puppetmaster2005
      @Puppetmaster2005 Год назад

      I for one totally agree with legislating AI, but the way they're going about doing it is completely wrong. They totally missed the mark.
      I mean look, right now, access to AI, its various APIs, training data, etc. is all over the internet. It's in the hands of the average joe, and that's really dangerous because, as we all know, there ARE malicious actors out there - scammers, money launderers, fake tech gurus, etc. There are A LOT of them, in fact. But instead of legislating in favor of preventing these bad actors from misusing AI, they've instead chosen to focus more on the copyright infringement angle...
      It's so mind-bogglingly obvious that the people involved in the legislating have absolutely no idea what kinds of AI tech are available in the wild right now, or the long-term implications, and they're either just being lobbied by big tech (with a profit bias) or are just plain guessing.

    • @trashviewer3521
      @trashviewer3521 Год назад +4

      Don't really see where musicians are being harmed. They do their usual stuff just fine. Yes, maybe there could've been more job opportunities if the music industry's copyright were looser. But the most important part, aka being able to do art, is mostly intact for now. I suppose it's hard for AI hustle-bros to understand, but for a lot of artists the ability to create & own their art can be equally or even more important than the ability to make a living with it.

    • @oo--7714
      @oo--7714 Год назад +23

      @@trashviewer3521 They barely get any money now, but ok

  • @maximumspoil2500
    @maximumspoil2500 Год назад +141

    So, let's recap: If this is implemented, AI image generators will still exist and will still put artists out of business. But governments will need to finance a huge machinery (paid for with taxpayer money) to track all images on the internet. People will lose even more privacy.

    • @johanbrandstedt9570
      @johanbrandstedt9570 Год назад +3

      LOL no, the AI art generators would finance the thing.

    • @ko-Daegu
      @ko-Daegu Год назад +21

      This is what happens when a group of people led by emotion try to play the role of an engineer or a lawyer, trying to solve political, legal, or engineering problems by introducing feelings-based solutions.

    • @ko-Daegu
      @ko-Daegu Год назад +13

      @@johanbrandstedt9570 I've been seeing your comments; I highly doubt you work in this field or understand most of these things. If not, start by asking questions.

    • @johanbrandstedt9570
      @johanbrandstedt9570 Год назад +11

      @@ko-Daegu I’ve worked all sides of international content licensing and digital strategy, and work closely with legal and tech experts. The original lawsuit is weak, admittedly. Let me guess: you “rationally” believe that property rights shouldn’t exist and “rationally” believe algorithms are “inspired” just like people? Have you listened to the utter hogwash that is the Stability and Adobe testimonies? “Think of it like a human brain.” “Like visiting the library.” These people deliberately mislead techies with a tenuous grasp of IP law.

    • @ko-Daegu
      @ko-Daegu Год назад +12

      @@johanbrandstedt9570 With all due respect, by "work" I meant as an ML engineer.
      And thank you for further proving my point.

  • @dandy445
    @dandy445 Год назад +89

    There's no chance they will be able to define what a "style" is. It has to be quantifiable to be enforceable. AND the artist somehow needs to demonstrate that their style is in fact theirs. Even if internetartbro has a massive profile of digital artwork that he has made, how does he prove that he made it? There's just no way to enforce this.

    • @BinaryDood
      @BinaryDood Год назад +4

      indeed

    • @augustus4832
      @augustus4832 Год назад +6

      Megacorps could.

    • @generalawareness101
      @generalawareness101 Год назад +30

      There is such a thing as the Disney style, but for Jane Doe to say her style is hers alone is laughable at best. Did you just sit in a vacuum in deep space, arrive on this planet, and awaken 6 million years later having your style? No, you were influenced, even if at the subconscious level, and that is 100% scientific fact. Connections.

    • @deadlyrobot5179
      @deadlyrobot5179 Год назад

      Artists are arrogant little assholes, I'm an artist myself (music producer) and coming up with a completely new style is literally impossible, the only thing that can do such a thing is an advanced A.I.

    • @dibbidydoo4318
      @dibbidydoo4318 Год назад +23

      Yeah! Have you ever created an artwork with no style? It's impossible; it's like creating an artwork with no color.
      In that way, saying "you're stealing my style" to someone is like saying "you're stealing my color."

  • @ieye5608
    @ieye5608 Год назад +51

    A company or big entity being "ethical" is the most unbelievable thing to me.

    • @ko-Daegu
      @ko-Daegu Год назад +1

      This is what happens when a group of people led by emotion try to play the role of an engineer or a lawyer, trying to solve political, legal, or engineering problems by introducing feelings-based solutions.

  • @markemad1986
    @markemad1986 Год назад +44

    At this point I am not as angry at the CCP-like companies anymore as I am at the people who are just sitting there accepting this stuff. Why does everyone seem to be hungry for the government to regulate their existence right now??

    • @jari5230
      @jari5230 Год назад +5

      I am not a fan of this C2PA solution for AI art, but why would I want for-profit mega corporations to regulate most of the internet and technology right now? A little regulation of technology and the internet could go a long way, imo. With all the fast developments at the moment, it's really hard to say how to go about it the right way, but I'm sure not regulating has a lot of risks too.

    • @VodkaPandas
      @VodkaPandas 11 месяцев назад

      Who cares? I'm an artist and I'm loving it. I personally don't care about your concern, since people like you also don't care about our concerns, mocking us, stealing from us. They should've done AI the correct way like what Adobe did in the first place; it might be slow, but it's safe progress, and no artist can complain about it either, since Adobe only uses images that they have the rights to use. You made your bed, now you sleep in it.

    • @markemad1986
      @markemad1986 11 месяцев назад +1

      @@VodkaPandas There is a cutoff for how much an artist owns their art. Imagine if you needed a license to talk about a piece with someone: you literally could not talk about what you see, and if you were caught doing it you would be fined. That doesn't even sound real, like a dystopian world or something. It's natural for people to gather what has been presented to them and integrate it into themselves; it becomes part of their person, and the artist no longer holds the rights to it. If that is true for people, why not for a computer? The copyright system we have is extremely silly; it does not work in the digital age. I love art. I think any actual artist who cares about the art itself cannot be sad that an AI will help with it. Our current stuff is really not good enough to replace artists, but it will probably get there with or without full rights to training data. Cheap parasitic artists will "lose their jobs", but anyone presenting an actually new thing to the world will probably be rewarded handsomely and his work will have a lot more value. This is just how things will be.
      However, this is a whole different thing. It is made with mass surveillance by mega governments that will impose their whims on the population because "that's the only way we can help the artists 😇". This is very dangerous for everyone; no one wants what this will lead to. They were just waiting for an excuse.

    • @Kaucukovnik666
      @Kaucukovnik666 9 месяцев назад

      @@VodkaPandas If AI "stealing" art makes you angry, you don't love art itself, but the idea of being good at it, being praised and getting paid for it. If you actually loved art, you'd be happy for any advancement that lets more people turn their ideas into pictures (or other creations) even if it diminishes your own importance.
      We've had this already. Ignorant people have called photography, digital painting or CGI "just letting machines do the work for you."
      If you actually explored it instead of whining, you'd find out that getting anything more than lolrandom sometimes cool looking crap out of an AI actually takes quite a bit of effort. The more specific, unique and consistent results you want, the more work it takes. Not pencil or brush work, but imagination, skill and effort nonetheless.
      Sure, most AI generated art has very little artistic value. The majority of anything sucks. That's just how it is.

    • @VodkaPandas
      @VodkaPandas 9 месяцев назад

      @@Kaucukovnik666 I would love to support AI art technology, but the advancement of AI art is being built from stolen artwork that is under copyright. The artists deserve compensation; those are the rules written by the laws of ownership and respect for copyrighted property.

  • @MistWit
    @MistWit Год назад +216

    Some anti-ai art fears are legitimate, but I really don’t see a way to enforce any type of protections without huge invasions of privacy and freedom.

    • @VioFax
      @VioFax Год назад +74

      I'm an artist. I think other artists are being pretentious. I've had all my work stolen. It's all been used, so what? I still make money on it. People just want to draw like 5 things and have them profit forever. It's just not a passionate way to do your work, it's a greedy and attention-seeking one. Imo, at least. I got over myself and my "art" as soon as this stuff dropped. I always KNEW where this was going when all these social media sites were hosting everyone's work for "free". I've quite enjoyed moving from painstakingly drawing each line to AI art. And I don't need credit for EVERY stupid thing I make, ya know? I do okay. It's not worth it to me to have all this privacy invasion JUST over art... I think it's a psyop to give the big boys even MORE control over our rights and creativity. But what do I know, I'm just an artist.

    • @BlackTakGolD
      @BlackTakGolD Год назад +26

      @@VioFax It is quite refreshing to see a visual artist who does not complain about an exclusionary copyright system that prevents others from utilizing their works.
      It is a rare quality to find in certain members of the artistic community. Why do they feel they deserve the authority to imprison others for using their art, even if manipulated significantly? This mindset is hardly seen in any other group, such as engineers, mathematicians, physicists, and architects. Many engineers would willingly sacrifice their positions for efficiency's sake, mathematicians seek to uncover the code of reality regardless of who utilizes it, physicists appreciate being referenced and having their works utilized, and architects focus on designing structures that can accommodate others, leading to interlinked design concepts. The whining is virtually non-existent, even though they are all creators in their own right.

    • @no_player_commentary
      @no_player_commentary Год назад +8

      @@VioFax As an artist I also somewhat agree. The only issue I really had originally was it making it harder for newer artists to get their first jobs. But what were all the big artists doing? They were celebrating it. Then, when they were allegedly having their art stolen (which I still don't believe is how it works), they got all pissy about it, especially Karla Ortiz; I doubt she's lost any money from AI. Now don't get me wrong, I'm a fan of hers and she even gave me a nice hug when I met her once at a convention, but that doesn't stop me from thinking she's wrong. If you never even made it, how is it stealing? (Now if your DNA gets cloned, that's different.) Acting like AI is only coming for art and art alone, and that it's all being stolen (is it really theft if it's a thousand different pieces of art, or has photographs mixed in too? I have made some really unique stuff that I'm still sitting on), just to have even less freedom and privacy, is crazy.

    • @memegazer
      @memegazer Год назад

      You could argue that C2PA invades privacy, but it depends... I am sure lots of people would like to upload images to the internet and remain anonymous, maybe even for legitimate reasons rather than being nefarious, but imo people in general will not really care about that kind of privacy that much, especially not artists who are trying to protect their IP; they are going to want people to know who the image belongs to.
      For example, I would take no issue if child abusers could not easily traffic images of child abuse online.
      As another example, it would be bad if whistleblowers, activists, journalists, etc. could not share image data without fear of losing privacy that could endanger their lives.

    • @foxdog9332
      @foxdog9332 Год назад +3

      @@VioFax This... artists are just lazy and need to push the limits, but they'd rather sue everyone.

  • @adrianm7203
    @adrianm7203 Год назад +89

    If AI art is not copyrightable then that means human artists will always have a niche in being able to create art which can be owned. Disney and other large companies are not going to use AI if it means they no longer own the rights to that work. At the same time small creators which can't afford to hire an artist can use AI tools to level the playing field. C2PA looks to me to be solving a problem that doesn't exist in order to enforce the monopolies of large companies like Adobe.

    • @urgyenrigdzin3775
      @urgyenrigdzin3775 Год назад +28

      Exactly what I'm thinking. In addition, these anti-AI artists are using AI as a facade for their anti-competitiveness. I've always been saying, if an artist is a decent person with even average skill, that artist wouldn't have to worry about anybody taking their job, because clients would love the artist not just for the work, but also for the personality, being easy and fun to deal with.
      So if an artist is worrying that "AI generative art would steal my job", then I'd suggest reflecting on one's own skill, and/or personality, and/or clientele, and the raison d'etre for being an artist.
      To me, being an artist is more about exploration. So all these new tools and medium are new adventures, not a threat at all.

    • @djan71
      @djan71 Год назад +4

      If humans design the characters and other assets and then Disney just uses AI for the animation, can it still not be copyrighted? Genuine question.

    • @AlistairKarim
      @AlistairKarim Год назад +10

      ​@@djan71 As of now, a bunch of splines animating a rig in Maya or Blender are not copyrightable. In the future, who the hell knows? I guess in-between frames would count as still images.

    • @urgyenrigdzin3775
      @urgyenrigdzin3775 Год назад +10

      @@djan71 good point. And when it comes to that, I'm willing to bet that Disney and other big corps lawyers would just find a way to circumvent it so it can be copyrighted.
      Speaking of copyright, can style actually be copyrighted?
      That's the anti AI argument, isn't it?
      Using their works to train AI without permission = copyright violation.
      I don't in any way have a legal education, so most probably I'm misunderstanding their argument. The way I understand it, it seems like they're trying to copyright style.
      If that is the case, then how would they quantify style? With music, it's similarities of more than 7 or 8 musical notes in sequence. How would they do that with visual style?
      Because without any clear quantifying factor, then any painting of swirling wave or grass could be an infringement of Van Gogh's copyright.
      Any sculpture or visual representation of a human male in idealized form with curly hair would infringe Michelangelo's copyright, no?
      (Had copyright existed during their period and had they copyrighted their styles.)

    • @paulaumentado1588
      @paulaumentado1588 Год назад +2

      @@urgyenrigdzin3775 It's both a force for good and a threat. When people want something instant that they haven't earned, there's something very wrong with that. There's a reason why old and new art are very different.

  • @theaudiocrat
    @theaudiocrat Год назад +15

    @6:35 "they discuss how it could literally ruin our lives..." believe me, thats a price the US govt is more than willing to pay

  • @JackSilverGamingOcean
    @JackSilverGamingOcean Год назад +27

    I think many people forget that the 'cure' should never be worse than the 'disease'.

    • @MTSP-pe2fn
      @MTSP-pe2fn Год назад +9

      Sadly, people think a cure is effective because it's loud and impactful, not because it actually cures the issue it was meant to.

    • @ko-Daegu
      @ko-Daegu Год назад +6

      This is what happens when a group of people led by emotion try to play the role of an engineer or a lawyer, trying to solve political, legal, or engineering problems by introducing feelings-based solutions.

    • @thinkbetter5286
      @thinkbetter5286 Год назад +2

      @@ko-Daegu Lol, spamming the cope

  • @metatron3942
    @metatron3942 Год назад +12

    This is what would happen if it went into effect. After celebrating, digital artists are suddenly reminded that they have to sign contracts with big companies and cannot create their own artwork because they need a certificate from C2PA to sell it.

    • @VodkaPandas
      @VodkaPandas 11 месяцев назад +2

      Who cares? I'm an artist and I'm loving it. I personally don't care about your concern, since people like you also don't care about our concerns, mocking us, stealing from us. They should've done AI the correct way like what Adobe did in the first place; it might be slow, but it's safe progress, and no artist can complain about it either, since Adobe only uses images that they have the rights to use. You made your bed, now you sleep in it.

  • @nilaier1430
    @nilaier1430 Год назад +24

    Literally 1984

  • @oldsoul3539
    @oldsoul3539 Год назад +30

    Adobe is just good old fashioned fearmongering to try and scare you into paying through the nose for it's software instead of using one of the many high quality AIs available for free. Anyone else old enough to remember "You wouldn't steal a car would you?" This is that bs all over again. How in the world would anyone be able to reverse-trace any of the millions of source images your AI art used? The only way is if you flat out told them what set you used. Steam greatly overstepped it's bounds claiming their game "contains art assets generated by artificial intelligence that appears to be relying on copyrighted material owned by third parties." How the hell can an AI image "appear to be relying on copyrighted material"? That is some powerful mindreading.

    • @JorgetePanete
      @JorgetePanete Год назад

      its*

    • @silbus8670
      @silbus8670 9 месяцев назад +2

      And what is the artificial intelligence trained on, smart guy? Maybe on the works of people who weren't even asked for permission? No? 2 + 2 = 3?

  • @jonp8015
    @jonp8015 Год назад +51

    "In order to keep you safe from the thing you currently fear, you must give the non-elected governing bureaucracies more power."
    I have little faith that the majority of artists will see the danger in this course.

    • @VodkaPandas
      @VodkaPandas 11 месяцев назад +1

      Who cares? I'm an artist and I'm loving it. I personally don't care about your concern, since people like you also don't care about our concerns, mocking us, stealing from us. They should've done AI the correct way like what Adobe did in the first place; it might be slow, but it's safe progress, and no artist can complain about it either, since Adobe only uses images that they have the rights to use. You made your bed, now you sleep in it.

  • @sez5435
    @sez5435 Год назад +27

    "Some people just don't understand the dangers of indiscriminate surveillance"

  • @oldsoul3539
    @oldsoul3539 Год назад +23

    I looked up the Steam message given to that game dev. It doesn't say "unless you can prove", it says "unless you can affirmatively confirm that you own the rights to all of the IP used in the data set that trained the AI to create the assets in your game."
    So write back and tell them it's your own dataset, so Steam can cover its ass and say you claimed it was your own dataset. They just want to be able to say you told them it was yours.

    • @ULTRATERM
      @ULTRATERM Год назад +6

      Fair enough and thanks for pointing that out to me! The truth is that there is currently no way to prove your AI model was only trained on media you own the rights to. All they can do right now is collect extra written affirmation as sufficient proof. C2PA is intended to track both what media an AI model has been trained on, as well as its output media. This is so that AI model providers, such as Adobe, can certify their models as being copyright respecting/compliant, and users can show platforms the "provenance" of their AI generated media if required (in order to enforce copyright law and help identify "disinformation").
      The fact that a platform like Steam is singling out developers for suspicion of using AI art tools and getting them to pinkie swear they only used a copyright respecting AI model is pretty grim. If this is their policy now before there are even laws in place, who knows how nightmarish it could become after hard regulations are established and C2PA is accepted as the standard for solving the "problem."
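
To make the provenance chain described above concrete, here is a purely hypothetical sketch (the field names, IDs, and provider are invented; this is not the real C2PA manifest schema) of how an output image's record could point back to the model that produced it and to a training-set claim that a provider would certify.

```python
# Hypothetical provenance chain: output asset -> model -> training-set claim.
training_claim = {
    "id": "claim:training-set-001",
    "assertion": "all items licensed or owned by the model provider",
    "items": ["stock:12345", "stock:67890"],       # invented asset IDs
}

model_record = {
    "id": "model:image-gen-v1",
    "provider": "ExampleCorp",                      # invented provider
    "trained_on": training_claim["id"],
}

output_record = {
    "id": "asset:output-42.png",
    "generated_by": model_record["id"],
    "ai_generated": True,
}

def provenance_chain(asset, model, claim):
    """Walk from an output asset back to the training-set assertion it relies on."""
    assert asset["generated_by"] == model["id"] and model["trained_on"] == claim["id"]
    return [asset["id"], model["id"], claim["id"]]

print(provenance_chain(output_record, model_record, training_claim))
```

Note that every link in such a chain is only as trustworthy as whoever signs it: the scheme certifies who made a claim, not whether the claim is true.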

    • @DajesOfficial
      @DajesOfficial Год назад +5

      But it is an extremely unrealistic claim, so they are still taking a risk by accepting it. It's just impossible for a small company or a single person to train anything close to usable. Stable Diffusion required hundreds of thousands of dollars and billions of image-text pairs, and it's still not as good as its commercial competitors.

  • @SingularityZ3ro1
    @SingularityZ3ro1 Год назад +23

    With all due respect to Adobe, I think my first reaction watching the C2PA presentation would be: excuse me, are you high? Also, very trust-evoking having this controlled by the mega corps. On the other hand, they can already shut down many devices you have actually fully paid for, for reasons... On a more constructive level: was this not what blockchain is supposed to solve, marking images on the chain but in an anonymous way, giving artists a kickback of x percent in the event of commercialization?
    I know many artists also hate blockchains, even the CO2-neutral / offset, prove of stake ones, for reasons like - well, not knowing much about blockchain besides echo chambers. Saying this as someone who also uses Adobe on a daily basis...

    • @kevChess
      @kevChess Год назад

      If you asked many artists, you would hear that Adobe doesn’t deserve respect.

    • @SingularityZ3ro1
      @SingularityZ3ro1 Год назад +1

      @@kevChess Depends on what you are doing. I think you are hardly able to get around it in most production pipelines. For artists just doing solo art, there are many more tools, but I guess that will not matter much if they really get this surveillance system through.
      It would almost be hilarious to start something from scratch and get an instant ban while drawing because you are getting too close to a trademarked style - which means every style on earth, since the corporations have trademarked every style under the sun. But for 20 USD per month for non-commercial use, you can unlock it again, and are allowed to draw it and share it :-D
      I imagine it like a buzzer with a popup once you put the pencil down on the tablet and draw the first outline of anything :-D
      I agree, though, liking Adobe is another thing. ;-) In the end, they do what most large, quasi-monopoly holders do: take care that it stays that way, and make unpopular decisions / screw their customer base over, as long as their projections tell them that is the most profitable, even with a backlash. That said, despite some questionable things they did, I think there is not much to complain about in the software, and it deserves its top spot feature-wise. When it comes to Adobe Premiere, well, maybe not so much anymore.

    • @JorgetePanete
      @JorgetePanete Год назад

      proof*

  • @Veptis
    @Veptis Год назад +13

    Corporate gatekeeping of art is a great summary of what this is.

  • @kien9350
    @kien9350 Год назад +13

    No matter what, AI has a powerful impact on the digital art world... Even with these protection laws now, they might only protect the strong and punish the weak. A lot of styles are inspired/learned/influenced/taught by other artists. So how do they decide what style is theirs and theirs alone?

    • @VodkaPandas
      @VodkaPandas 11 месяцев назад +1

      Who cares? I'm an artist and I'm loving it. I personally don't care about your concern, since people like you also don't care about our concerns, mocking us, stealing from us. They should've done AI the correct way like what Adobe did in the first place; it might be slow, but it's safe progress, and no artist can complain about it either, since Adobe only uses images that they have the rights to use. You made your bed, now you sleep in it.

    • @silbus8670
      @silbus8670 9 месяцев назад

      What does this have to do with artificial intelligence?... Are you equating a thing with a human being? Is everything alright with your head?

  • @VincentNeemie
    @VincentNeemie Год назад +36

    This is absolutely mind-boggling! How on earth can we effectively regulate cutting-edge technologies like LoRA, textual inversions, hypernetworks, and all the intricate fine-tuning techniques involved? The concern here is that prominent entities will exploit these regulations as a weapon in the future, just as they have done with various legal milestones they've set up. It seems like the goal of protecting artists might actually end up shielding corporations that exploit genuine creators. The video did a great job of highlighting how this could have a far-reaching impact on the entire learning landscape, but the underlying issue runs even deeper than meets the eye.
    If this law passes, artists may be presumed guilty of stealing if they can't prove their work wasn't created using AI. Some companies already ask about AI use, and this burden of proof could impact the creative process and lead to legal disputes over AI-generated content, raising privacy concerns. The assumption of guilt without evidence could complicate the artistic landscape.

    • @Schmidtcreations
      @Schmidtcreations Год назад +2

      That's very well worded. I agree with you even though I am an artist myself and am more inclined toward wanting protection. Your points make sense 👍

    • @13RedCorpse
      @13RedCorpse Год назад +1

      And that's even with the fact that AI can improve artistic endeavors by suggesting new ideas or helping to find creative diversity faster and more efficiently. But now, as an artist, I have to worry about using it, thus limiting myself?
      And don't get me started on the topic of one artist "stealing" from another artist, when you never even saw the picture that someone says you stole the idea from.

    • @cendresaphoenix1974
      @cendresaphoenix1974 Год назад +6

      @@Schmidtcreations It's not protection if you are fending off the wolves by blowing up your house.

    • @Schmidtcreations
      @Schmidtcreations Год назад +2

      @@cendresaphoenix1974 Which is why I agree with not supporting the current direction of legislation. But there has to be another way, because I believe zero regulation is going to be just as bad.

    • @t1t1t62
      @t1t1t62 Год назад +1

      @@Schmidtcreations No law should be put on AI content; as soon as that's controlled, it's abused.
      Sites like Twitter and Reddit can stop image scraping; argue to them to stop bots from mass-collecting images.

  • @Crazylittlehobbit
    @Crazylittlehobbit 11 месяцев назад +8

    I've been saying this for months now: the anti-AI group did not realize the effect their protest would create. The very fact that so many wanted to copyright style is dangerous.

    • @lexa7250
      @lexa7250 8 месяцев назад

      The average person is just as evil as big corpo

    • @Hollowed2wiz
      @Hollowed2wiz 6 месяцев назад

      @@lexa7250 Do not blame on evil what can be explained by stupidity or ignorance.

  • @hellmarmel2660
    @hellmarmel2660 Год назад +37

    I hope that C2PA won't be a thing.

    • @iresineherb7
      @iresineherb7 Год назад +2

      +2

    • @GrorfnOnTheLoose550
      @GrorfnOnTheLoose550 Год назад +2

      +1

    • @glint6070
      @glint6070 Год назад +1

      Screw hope, get persistently belligerent with your representatives regarding this nonsense.
      If the noise is loud enough, even the most paid-off government official won't cross the line.

    • @diadetediotedio6918
      @diadetediotedio6918 Год назад

      .

  • @ULTRATERM
    @ULTRATERM Год назад +6

    launch c2pa into the sun

  • @themore-you-know
    @themore-you-know Год назад +11

    It's a really dumb, doomed pursuit.
    A) Open-source models have already been released, so there's no way to put the genie back in the bottle. They should have learned that from The Pirate Bay.
    B) If the corporations are annoying with this regulation thing, then they'd essentially give people a reason to crowd-source a competitor, almost in the vein of a Kickstarter or WikiCommons project.

    • @ronnetgrazer362
      @ronnetgrazer362 Год назад

      That's where the governments come in. Sure, anyone can start their own platform, but if you want people to use it in our union/country/state, you have to play by our rules - which we're not knowledgeable enough to come up with ourselves, so here's a ready-made solution by Adobe.
      To combat disinformation, child abuse imagery and terrorism, of course.

  • @winkletter
    @winkletter Год назад +14

    Adobe is doing a great job confusing people about the source of their training images. "Openly licensed content" is not open source or public domain. They vacuumed up all the images on Adobe Stock without notifying or getting consent from the artists who were selling their art there. They scraped their own users' images and have managed to convince the media that their model is "ethically sourced."
    Straight from Adobe's mouth: "Stock content is covered by our Stock Contributor license agreement. We are developing a compensation model for Stock contributors and will share details once Firefly is out of beta."
    In other words, they are setting the terms for compensation without any input from the artists who created the images. Does this really sound any more ethical than what any other company is doing in this space?

  • @venturelord32
    @venturelord32 Год назад +13

    Ironic, because the artist community railed against NFTs last year and is now turning to corporate- and government-approved NFTs to "beat AI." Artists are being cast as useful idiots.

  • @conditionednarrator7126
    @conditionednarrator7126 Год назад +40

    Legal and consensual AI is going to have a bigger harmful impact on society and culture than illegal AI, because most people will be frequently affected by legal AI but very rarely affected by illegal AI. Also, governments are much scarier than a random company of criminals doing shady stuff.

    • @kusog3
      @kusog3 Год назад +4

      Imagine giving this much power to the government while people can't afford to call an ambulance; makes you question if it's really a first-world country of freedom.

  • @mariokotlar303
    @mariokotlar303 Год назад +17

    Thank you for spreading awareness on such an important issue!

  • @Schmidtcreations
    @Schmidtcreations Год назад +31

    Damn, I am an artist and am more "anti-AI", but these are some very good points. It just makes it clear how difficult this topic is.

    • @cendresaphoenix1974
      @cendresaphoenix1974 Год назад +4

      Why are you anti-ai?

    • @Schmidtcreations
      @Schmidtcreations Год назад +15

      @@cendresaphoenix1974 Well, I've put the "anti-ai" in quotation marks because I don't think AI art is purely bad; as a matter of fact, I actually see the many pros of AI art. I see it as a two-edged sword. On one hand it can immensely support creativity, and on the other it can inhibit it. Mainly it's about artwork being used as training material for the AI (which wasn't consented to). It is very different from a human taking reference, because the AI will just impersonate the artist (humans do too sometimes, but that's besides the point). Long story short, I am just more concerned about the negative sides of AI, which need to be addressed - not by restricting small creatives, but by making sure big corporations (ahem, Disney/Marvel) can't further exploit (VFX and other) artists. Also, I of course have to mention my personal attachment to art and the journey I've been on, but I try to keep that to a minimum.

    • @herp_derpingson
      @herp_derpingson Год назад +10

      @@Schmidtcreations Let's imagine a scenario where someone literally scraped all the artwork from all the artists illegally, made a zip file, and put it on the internet. That person "stole" all the artwork. But literally no artist would care; it was already out there.
      Artists are not really afraid of their art being stolen, they are afraid of their source of income and livelihood being stolen.

    • @dibbidydoo4318
      @dibbidydoo4318 Год назад +5

      @@Schmidtcreations "humans do to sometimes but thats also bad and besides the point." why is that considered bad? many of the art styles of famous artists looked nearly identical to each other; they were differentiated in the content they portrayed rather than the art style itself.

    • @Schmidtcreations
      @Schmidtcreations Год назад

      @@dibbidydoo4318 That is true, I am going to redact that part.

  • @I-ONLY-BUILD-MECHS-AND-DUSTERS
    @I-ONLY-BUILD-MECHS-AND-DUSTERS Год назад +7

    "Indie devs have to hire artists... does this whole situation sound any better?" Yes, it does. Do you not think they'll start having AIs pump out those very games, or rip off your very own videos, eventually, or just bury them under an endless stream of ai generated sludge? This whole thing is bad and you should not be arguing for it. The very best case is the internet gets flooded with absolute crap, and you're not able to tell what's real anymore.

  • @greendsnow
    @greendsnow Год назад +45

    Holy cow! I was a defender of artists' rights... But now... some people can lose their jobs for the good of humanity overall.

    • @bazookaman1353
      @bazookaman1353 Год назад +17

      Same.
      They are starting to look like a selfish bunch now.

    • @Y0UT0PIA
      @Y0UT0PIA Год назад +4

      @@bazookaman1353 'Actually, I don't like some of those people, so screw them all lol'
      nice principles bro

    • @VioFax
      @VioFax Год назад +8

      ​@@bazookaman1353 They are. They don't mind stealing your work either. I've lost all sympathy. Capture and control all creativity. Disguise it as protection.

    • @starstorm9198
      @starstorm9198 Год назад +9

      @@Y0UT0PIA If a minority of artists campaign to worsen our lives for their selfish gains and the other majority of artists only stand-by and watch, then the majority is just as culpable if the minority manages to achieve their goals. So yes, screw them all if this is what we'll end up with otherwise.

    • @diadetediotedio6918
      @diadetediotedio6918 Год назад +1

      @@Y0UT0PIA
      This is literally what is behind the artists who are fighting against AI

  • @CMatt007
    @CMatt007 Год назад +11

    I don't think we should be enforcing laws based on the fears of people who don't understand the thing, and of those who just want to benefit from it.

  • @Kaucukovnik666
    @Kaucukovnik666 9 месяцев назад +1

    The end goal is pretty clear - anything that can be seen, listened to, watched or experienced in any way needs to be "protected from intellectual theft" and paid for every second of enjoyment or usefulness from it.

  • @Matt-st1tt
    @Matt-st1tt Год назад +15

    I think all the current copyright laws are good enough. Your unique characters and individual pieces of art are already protected from copying and redistribution. What more do you want? Your style isn't a thing.

    • @Casadriss
      @Casadriss 3 месяца назад

      Artists just don't want AI to be trained on their works. That's how the "art style" is replicated in the first place. It's not just about the art style; it's about not stealing someone's property to put in the database.

  • @reindunkelheit
    @reindunkelheit Год назад +3

    This video is really important. Systems like C2PA harm everyone, from artists to ordinary users; the only ones who benefit from it are the corporations. AI-generated art is bad, but a system like C2PA is so much worse. Everyone on the internet should be aware of something like this happening, to stop it before it's too late.

  • @artman40
    @artman40 Год назад +7

    I'd also like to point out that many artists, let alone companies, don't have one single static "style". Moreover, some even have evolving styles. Does that mean you can "own" all of them?

    • @pokepress
      @pokepress Год назад +3

      This is an important point regarding style. Recall that copyright generally requires something to be in a “fixed medium”. Since style changes as new techniques are introduced, the artist’s motor skills change, art movements develop, style is not fixed. As a result, attempting to protect it is a different ball game from protecting individual works.

    • @nightynightlayla374
      @nightynightlayla374 Год назад +1

      That’s what I’m curious about. How are they going to define what is or isn't an art style??? Anime has tons of different art styles within that genre, but some average joe would just think that anime all looks the same. Are they going by artist, or by company, or something else? If I make a piece of art with thin line weight, but make another piece that's completely the same except with slightly thicker lines, would that make a difference? This is an extreme example, but I can see how potentially nuanced this could be.

    • @I-ONLY-BUILD-MECHS-AND-DUSTERS
      @I-ONLY-BUILD-MECHS-AND-DUSTERS Год назад +2

      @@nightynightlayla374 Arguing about what a style is is kind of a red herring... just don't feed the models images you don't own. Pay the artists to come in and give their art, or pay them to allow their existing art to be used in the model, if they so wish. Hell, even pay an art impersonator to replicate that person's style, even though I doubt it will be enough. I don't think you can fully avoid this type of thing, but at least don't just let some international multibillion dollar company come in and steal everyone's stuff.

    • @shogeki
      @shogeki Год назад

      @@I-ONLY-BUILD-MECHS-AND-DUSTERS I don't think human artists should be able to look at images I've created to train themselves to create better art. Those artists don't own any of the images I created, and I don't want them impersonating my style. From now on, anyone who is an artist needs to pay me to learn from the art I made.

  • @sotomonte_
    @sotomonte_ Год назад +7

    I don't like how you say that removing indie developers' access to AI would be the ramification, and how you imply the "ethics" don't matter. The same way that indie developers exist, indie artists also exist, and they are getting displaced by AI trained on their own art without permission. Of course Adobe's solution isn't great, but pushing for these legislations to be delayed only hurts the artists, and instead of one big corporation (Adobe) benefiting from it and getting an upper hand vs its competition, it would be another one (OpenAI, Stability AI, etc.) doing the same, but faster and with copyrighted material, without permission from the artists.
    Edit: Indie developers can buy assets cheaply, made by artists who sell them, or they can create their own, collab with artists and split the profit, etc.

    • @Alexis-hn1yw
      @Alexis-hn1yw Год назад +4

      It's funny, right? How they think these "tools" will allow them to "create" more, when they will be used against developers too. It's like people don't have an education nowadays; they just like their shiny techy things.

    • @gondoravalon7540
      @gondoravalon7540 Год назад

      @@Alexis-hn1yw > *It's funny, right? How they think these "tools" will allow them to "create" more, when they will be used against developers too.*
      Er ... that's a non-sequitur if I ever saw one - people using a tool to abuse others doesn't mean the potential benefits stop existing (not to mention that it is an issue with how people use it, clearly)

  • @junofall
    @junofall Год назад +7

    I don't think C2PA will fly. Let's just hope it's another DoA standard.

  • @Hannah_from_iterationscrafts
    @Hannah_from_iterationscrafts Год назад +14

    I definitely have issues with the way in which Adobe is marketing itself as the only “ethical” AI image generator. While it seems to be true that they had a legal right to all of the images they used in training their models, there still seem to be a lot of questions about whether the creators of those images gave meaningful consent for them to be used in this way (by which I mean, they opted in to this use knowing what it would be used for). We all give passive consent all the time for corps to use our data/images for various things we don’t know or understand (soooo many lengthy Terms of Service we have to constantly click Agree on!) there’s Terms of Service when we make a website, or maybe we’re bound by the terms of service agreed to by someone when they created the website we just put our art on… not to mention the fact that I’m not sure anyone has a very clear idea of what legal claims Meta thinks they have to all of the images we’ve been putting on Facebook and Instagram all this time. And they are in the generative AI game too with the legal/political firepower only a big corp can have. So basically, yeah. I agree that it’s a huge problem when the solutions to these problems get created by the big corps. Especially given that they are the ones who designed the legislation that allows all of the internet scraping to begin with. I don’t know how to give protection to independent artists without new regulations. But in general the regulations we get around copyright and art (and especially tech) don’t tend to benefit individuals in the fine print

    • @BinaryDood
      @BinaryDood Год назад

      even if you have a complete copyright free database, you can't "ethically" replace/devalue/drown out artists. This whole debacle is being tackled at the wrong angle.

    • @13RedCorpse
      @13RedCorpse Год назад

      Making AI "legal" just makes it more limited to those companies, who already have a lot of power and influence.

  • @williamkuroki7683
    @williamkuroki7683 Год назад +5

    Well, as an art student, I'm really discouraged to continue. It is not easy to follow this scenario. I guess I'm not the only one unmotivated. What I've been studying for a while is done at almost instantaneous speed by a prompt typist who doesn't even need to study design, color theory, composition, etc.
    The most incredible thing is some senior artists saying that nothing will change. Sounds like a joke... but it's true. Of course, for artists already established and inserted into the industry (probably born in the US or some rich country), everything is easier; nothing is going to change for them. But for beginning artists, and especially in underdeveloped countries, it is logical that it could be the end of the line. Anyway... I think that from now on, capitalizing on artistic work will become more difficult. Art as a profession will also be something going extinct.

  • @melindawolfUS
    @melindawolfUS Год назад +3

    The government getting involved is the WORST idea.
    I don't mind the tracking data as an option. Some websites might require something like that while the internet will also give us rebel social media spaces that WON'T. We should let people choose how much they want to value the little badge. The big problem is definitely when big companies try to create advantageous laws for themselves and provide additional methods of control and manipulation for our government (who NEVER has our best interests above their own wallets and who have proven over and over again they can't be trusted with this level of power and access). The whole thing reeks of corruption and makes me feel sick

  • @Lucien_M
    @Lucien_M Год назад +33

    C2PA is just a massive monkey's paw. The problem was that artists were getting their art used in the machines without their consent (which could easily be solved by making it opt-in), but now the "solution" is to track everyone who's ever manipulated an image.
    I can't make an accurate comparison, but businesses and schools still ask for everyone's consent for their faces to be shown in a picture online. It's not as if they shouldn't listen to people's concerns because "it's just like when other people look at you. If you didn't want me to photograph you, you shouldn't have come outside."

    • @kusog3
      @kusog3 Год назад +6

      The thing is, it was never really a problem for artists - that is, until the AI got good enough. Meaning, AI generators will still be a thorn in artists' sides even if new models are made entirely from "morally ethical" datasets (even though courts are ruling fair use).

    • @Lucien_M
      @Lucien_M Год назад +4

      @@kusog3 I'd argue that it is a big problem, especially since some popular artists(i.e. Greg Rutkowski) literally have their own style stolen from them(since his name is a popular prompt).
      That has led to confusion about the validity of his own art, and now he technically has to fight against himself when making high-quality art.
      This definitely won't affect smaller artists in the same way, but I'm sure the concerns about AI stealing jobs would be lessened if artists actively chose to feed their art into the algorithm

    • @Lucien_M
      @Lucien_M Год назад +10

      @darukineo Like I said, it could've been solved in a non-dystopian way (using opt-in). Acting like it's your way or the highway is what led to some artists praising this dumb solution.
      The faster artists and AI artists cooperate, the harder it is for corporations to fearmonger and implement dystopian solutions.

    • @IronFire116
      @IronFire116 Год назад +3

      The problem isn't taking a photograph, it's reproducing that photograph.
      An AI trained on images is just like someone looking at you. It's just a fact. Do you realize that these AIs do not have databases of images? They contain no images inside of them. They genuinely learn to make images just as humans learn to make images.
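
A rough back-of-the-envelope calculation supports the "no database of images inside" point. The figures below are approximate public numbers and should be treated as assumptions; only the orders of magnitude matter.

```python
# Approximate figures (assumptions): a Stable Diffusion 1.x checkpoint is a few GB,
# and its training set is on the order of billions of image-text pairs.
checkpoint_bytes = 4e9      # ~4 GB of weights
training_images = 2e9       # ~2 billion images

bytes_per_image = checkpoint_bytes / training_images
print(f"~{bytes_per_image:.1f} bytes of weights per training image")  # ~2.0 bytes

# A single small JPEG is tens of kilobytes, so the weights cannot store the training
# images themselves; at best they encode statistical patterns learned from them.
```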

    • @Lucien_M
      @Lucien_M Год назад +1

      @@IronFire116 It doesn't matter how it learns; the fact is that it can still choose what it learns from (or to be more specific, the AI devs can choose what gets inputted into it), just as you can choose which images from your camera roll get posted online.
      DeviantArt took a step closer to making art opt-in, by making any published art opt-out (basically, click a button before publishing to prevent AI from learning from your art).

  • @boroborable
    @boroborable Год назад +4

    You can protect art or characters, for example, however you like, but you cannot own a style. That's simply like trying to patent a genre like 'techno' in music, because every style or genre in existence is already derived from previous ones and would have broken those copyright laws the moment the law was accepted.

  • @Y0UT0PIA
    @Y0UT0PIA Год назад +9

    I'm definitely with you on 'disinformation is out of control so let the government tell you what to think' being pretty dystopian.
    If disinfo does become that much of a problem, the only tenable solution I can think of would be 'personal information firewall' services that give end-users full control over what is filtered and how. Ideally it'd have to be the kind of AI assistant you can just run locally. The more decentralized the better.

  • @themore-you-know
    @themore-you-know Год назад +16

    Art is water: readily available from the tap.
    There is little moral high ground upon which artists may stand: they are now held to the same standards as the sanitation workers on whom they've depended for entire decades. They produce a commodity, except they don't have to work knee-deep in raw human waste.

    • @GerardMenvussa
      @GerardMenvussa Год назад +4

      "they don't need to have to work knee-deep in raw human waste."
      Except for those who decide to post their art on Twitter, of course.

  • @shamrock5725
    @shamrock5725 Год назад +3

    Whose style would be considered the primary style? How does anyone learn anything if not by copying previous artists in order to learn how to do something? This is only going to go sideways and ruin things for everyone, instead of new technologies giving new artists new opportunities.

  • @trickybarrel444
    @trickybarrel444 Год назад +5

    Never let a good crisis go to waste

  • @samthesomniator
    @samthesomniator Год назад +3

    Making to copy "someones artstyle" illegal is the most disturbing thing I have heart in the entire debate. 😃 That is plain against the core principle of art history itself. 😅 Just imagine the legal shitshows that will follow.
    Who owns the style of some music genres after that?! Who owns metal? Iron Maiden? Metallica? Some other OGs? 😃
    People say all about power metal is to sound like Powerwolf. 😂That bands would all be going behind bars nows?
    Whatever dystopia you can think of generative AI has ready for us. It seems to be nothing against the art dystopia those anti ai dudes have in mind here. 😂

  • @TheStrandedAlliance
    @TheStrandedAlliance Год назад +6

    Can't we just let go of copyright?

    • @succubiwishes
      @succubiwishes Год назад

      PLEASE

    • @TychonAchae
      @TychonAchae Год назад +5

      Why stop there? Let's just let go of all private property rights.
      Can I have your stuff?

    • @timooothy1234
      @timooothy1234 Год назад +1

      @@TychonAchae but isn't AI using art as inspiration like every other artist?

    • @TychonAchae
      @TychonAchae Год назад +3

      @@timooothy1234 - No, it isn't. Why would it be?

    • @APaleDot
      @APaleDot Год назад +1

      ​@@TychonAchae
      Physical property is not like intellectual property. You pay for everything you own, but no one owns culture. When I take an idea from someone else in order to make something new, no one has lost anything and our culture is that much richer. We shouldn't let private corporations completely enclose the culture that rightfully belongs to everyone, so they can pretend that it's a scarce resource you must pay for.

  • @minidreschi2
    @minidreschi2 Год назад +3

    remember:
    big corps DON'T want to give you anything for free.
    They make something to protect your art? You can bet it's going to cost you one way or another.

  • @savorsauce
    @savorsauce 10 месяцев назад +2

    I would compare this new C2PA legislation to the Patriot Act. Both were advocated by governments and corporations, and both exploited irrational fears and complex problems to implement mass surveillance. Let's not make the same mistake we made back then. There is a better solution if we just look for ourselves.❤❤❤

  • @skynetAU
    @skynetAU Год назад +1

    We can already edit metadata, so why wouldn't we be able to edit C2PA data?

  • @pokepress
    @pokepress Год назад +2

    I think there’s also a freedom of the press issue (probably several) here. Suppose I’m an entertainment journalist, and a casting/directing announcement is made. Rather than using an unrelated stock photo, you could use AI to generate a hypothetical image of what that actor might look like as that character, or what that franchise might look like in a certain director’s style. As long as an image is disclosed as generated by AI, I think that’s fine. I’ve done something similar to this for certain Pokémon characters, attempting to figure out what they would look like in a live-action movie, or what it would look like if other animated Pokémon movies got a CGI remake (see some shorts I’ve published recently). This would also be useful in framing discussions around scenes cut from a book->movie adaptation, alternative endings, etc.

  • @Ryuuko3
    @Ryuuko3 Год назад +2

    C2PA makes fan art illegal, right? Artists can't sign off on copyrighted characters they don't own.

  • @gustinian
    @gustinian Год назад +9

    Artists' styles are drawn from (or are a reaction to, a negative prompt of) prior styles or random accidents, which is exactly what AI is doing. We are all influenced by our environment. Artists are really bemoaning how inefficiently they learn compared to AI: "it took me 20 years to get this good", etc.
    The photography debate covered these dilemmas 150 years ago.

    • @I-ONLY-BUILD-MECHS-AND-DUSTERS
      @I-ONLY-BUILD-MECHS-AND-DUSTERS Год назад +3

      Whether something learns "like a human" is simply an opinion. No one is certain of all the details of how exactly human minds learn at all, so it's impossible to claim that as fact. You could as easily say a camera just remembers stuff like a human, and then record a bunch of movies from a theatre and sell them as your own. The facts remain: the models were fed mountains of images without the owners' consent.

    • @QdnbdbIeuegr
      @QdnbdbIeuegr Год назад

      @@I-ONLY-BUILD-MECHS-AND-DUSTERS it's just an excuse they like to use to justify stealing. It doesn't work like a human, and that's a fact. They'll always pull these desperate pseudo-science comparisons.
      Let them be... they're only devaluing themselves this way

  • @dyoanima
    @dyoanima Год назад +15

    I wholeheartedly hope that these anti-AI initiatives fail miserably; the only thing they will achieve is the loss of people's freedom.

    • @paulaumentado1588
      @paulaumentado1588 Год назад +1

      I support them, but we the people in general should force the government to actually use its brain, not the money of corpos

  • @TexanMiror2
    @TexanMiror2 Год назад +3

    It's striking how this regulation is hyper-optimized for destroying competition in the AI field, rather than actually protecting anyone from the consequences of AI.
    You could totally create a digital authenticity system without requiring big companies to verify all your data on the cloud. Decentralized and community-built solutions could be created (hate to point it out, but crypto is perfect for this application, by the way).
    But nobody can make a profit from that. So, centralized big brother "helping" it is. Lots of profit in that.

  • @skyscraperfan
    @skyscraperfan 9 месяцев назад +1

    It already annoys me how much data is embedded into an MS Office document or a PDF without your explicit consent.

  • @dandevti5560
    @dandevti5560 Год назад +2

    In the end, they will realize that the only way to truly guarantee copyright is to remove civil rights. Copyright persists over an author's intellectual property even once a copy sits in a customer's home, which means the author would need the power to see what his client/consumer does with it in order to guarantee there is no infringement. In trying to achieve that, he ends up doing exactly what the government also wants: watching the people. Imagine being able to see what everyone does with their digital media. The future of C2PA is digital media that double as trackers. Just as we install adblockers in our browsers today, we will need programs to block the tracking and surveillance baked into media files that end up behaving like trojans. It reminds me of how the government's interests run against cybersecurity: the same vulnerabilities it has big tech put into products end up in the hands of hackers. Most of the bad things hackers do were government weapons first. Even TOR was created by the government, zero-days were government creations, viruses that infect industry came from cyber armies. They create the problems and then offer a false solution that violates privacy. I would rather they do away with the TOR network than with privacy. Privacy is a human right; without it, democracy cannot survive. But that's the pattern: create a problem, then convince society that the most authoritarian solution is the best one, with no guarantee that anything improves, just more restrictions.
    About humanity: why is it that there are always anti-progress people? Seriously, always. Anti-Bitcoin, anti-AI, anti-electricity (yes, there were people against electricity in the past), anti-nuclear energy (ironically, many people make a living from it today), anti-democracy (which many once thought was normal), anti-freedom. Humans being human: the purest expression of humanity is stupid people fearing the unknown and screwing things up. Back to the Middle Ages.
    About AI art: C2PA was clearly conceived by people who understand neither AI art nor cybersecurity. AI art is usually visually distinguishable, so it would make more sense to build software that looks for the glitches and artifacts AI generators produce than a certificate that works like the digital certificates used on websites, which are the first thing hackers attack and tamper with when they run a phishing campaign. Even NFTs are not secure, because an NFT is a mere address: the content itself is not registered on the blockchain. Only a blockchain has the robustness to actually serve as a universal certificate, so from a technical point of view C2PA is more political than practical. It will still be a lawless land for hackers who forge their certificates, and since it does not use a blockchain, it is possible to use keygen-style tools, or pentesting and OSINT against the certifying companies themselves. We will end up with what we already have, countless pieces of private data on the dark web: C2PA is a hackers' paradise with no real value in cybersecurity. What would have technical value? Blockchain certification. But real solutions aren't magic: a blockchain has processing costs even for a single number and a wallet, and an image would need every pixel, or its full binary, recorded, so you would need a kind of immutable blockchain storage. Here comes the contradiction: these people don't like the idea of storage that is immutable and resistant to destruction, because that is what the free-internet side likes (just look at Odysee). Since the real solution contradicts their ideology of control, they will stubbornly keep pushing authoritarian, vulnerability-ridden solutions, because the objective is power. It is about control and power, not security. There are two groups and two interests here: copyright and surveillance. A technology that actually guaranteed the authenticity of an image would have to have no individual in control; when individuals control the media, that is where the vulnerability is. The hacker can do everything you can do, which is why sometimes the best form of security is restriction. That's how Bitcoin operates: its security is the absence of a human in control; nobody controls it, so it can't be manipulated. If a human controls it, it is already manipulable. Just look at the news about Ethereum, hackers and cataclysms: that's what happens with centralized things that have humans in control.
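
    To make the comment's NFT point concrete: an NFT typically records only an address, while a content hash actually commits to the bytes themselves. The following is just a generic hashing sketch in Python, not any particular chain's implementation, and the file name and URL are made up.

        import hashlib

        image_bytes = open("artwork.png", "rb").read()

        # Content addressing: the identifier is derived from the bytes themselves,
        # so changing even one pixel changes the hash and the mismatch is detectable.
        content_id = hashlib.sha256(image_bytes).hexdigest()

        # Location addressing (what most NFTs record): just a pointer; the bytes it
        # points to can be swapped out without the pointer ever changing.
        location_id = "https://example.com/artwork.png"

        print(content_id)
        print(location_id)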

  • @tbk2010
    @tbk2010 Год назад +5

    Not sure about C2PA, but anti-impersonation right is the dumbest thing I have ever heard. It would hurt artists waaaaay more than it would help them. And of course it's completely impractical.

  • @lelouch9609
    @lelouch9609 Год назад +6

    Training and copying are different things; inspiration and copying are different things.

  • @BinaryDood
    @BinaryDood Год назад +9

    Karla Ortiz's focus and method of tackling the "AI art vs artists" issue has been erroneous from the get-go, and I've been saying so since last year. That she and people who think like her became the forefront of this debacle has obscured the true danger of AI as disruptive tech in the creative/subjective sector of jobs, and for humanity overall. There is no such thing as an "ethical database", because you can't "ethically" devalue, replace, or drown out artists' and creatives' lives and jobs. Art styles are not by themselves copyrightable, and copyright is not your saviour: enforcing it would always require mass authoritarian measures such as this one if it were ever to have a serious chance of succeeding. Adobe Firefly thanks you, though, and so do the shared interests of Sam Altman's ilk, who are desperately trying to destroy every last bit of internet anonymity to "save us" from the problem they created. It may not work now, while the iron is hot, but in a few years, when the extraordinary has become ordinary, it's naive to think a massive shift won't be enforced slowly but surely on the web to "combat" the inevitable chaos of a bot-filled web layered over a bot-filled job market.
    If you use AI art generators, you are a customer, not a creator, at least in terms of labour value. In the abstract value system society relies on, there really is no single right way to deal with the disruption this tech will bring without massive enforcement. But we can at least be cognisant that if something was created with AI, it belongs to NO ONE. We can't play at being artists and musicians (because a robot has done it for us) when there are real artists and musicians out there who will suffer directly, as a consequence of a system that encourages the lax end user to believe that through consumption he "is" what he "has". That is the path of least resistance, and it leads to the outsourcing and commodification of all creative and subjective thought and practice. Down the line, why would you pick up a pencil when the world, with the decentralized self-scrutiny/oppression it sells to its occupants, keeps pushing it out of your hand and steering you toward the machines that "do it for you"? The answer: intrinsic motivation, something we barely cultivate in individuals.
    But of course that isn't the angle any of this happened from. What's on the surface of this debacle will be used against artists, while Adobe and the like walk away with clean hands because they are "ethical". Most clients don't care that much: "Oh sure, we used Adobe Firefly to generate all of our storyboards. But hey, the database used only copyright-free images, so that's fine, that was the worry, right?" to justify layoffs. Artists really are the worst defenders of their own interests.

  • @connorwaldman8124
    @connorwaldman8124 Год назад +2

    As an artist of 3 years I do feel strongly AGAINST AI DIGESTING MY ART. I support something to track and protect what I post online. Maybe just make applying it optional.

    • @SuperSigma69
      @SuperSigma69 Год назад +3

      Your art would probably get flagged as "style infringement" if all these get implemented. Most artists go for a look that's based on a style that's already out there.

    • @connorwaldman8124
      @connorwaldman8124 Год назад +1

      @@SuperSigma69 yeah. I don't claim that my style is unique. I just don't want my art being studied by AI to improve it. There just needs to be a way to prove that I was the sole creator. To protect me against both AI and others trying to claim it as their own.

    • @QdnbdbIeuegr
      @QdnbdbIeuegr Год назад +1

      @@SuperSigma69 there won't be any style infringement. That was a dumb statement by Adobe; they aren't foolish enough to implement it.
      AI training, on the other hand, will be regulated: data transparency and disclosure of how works were produced will be required.

    • @QdnbdbIeuegr
      @QdnbdbIeuegr Год назад

      @@connorwaldman8124 AI doesn't study your work. If they put your art into the dataset, their program regenerates imagery based on probability; it's more of an image compiler. "Studying" is a more complex matter. Don't give them credit: they try to equate AI with humans to justify taking your art.

  • @dayleywhaley2420
    @dayleywhaley2420 Год назад +7

    I can understand both sides in the war over AI image generation, but as a digital artist, Stable Diffusion user and conspiracy theorist, this is not what I want at all 😂 C2PA sounds like a way to push for more censorship and hiding of information. I understand it's being done in the noble cause of protecting creators, but it absolutely will be used for other agendas.
    And as a digital artist I don't want to have to sign a certificate every time I change an iteration of the work I'm creating for a client, since iterations of projects are sometimes sent back and forth dozens of times before being decided upon

  • @stephenthumb2912
    @stephenthumb2912 Год назад +3

    THE critical issue to me with C2PA is the CAs, aka the certificate authorities. The CAs are completely centralized. Basically you, your device, your app or your org digitally sign the metadata attached to your file; anyone making changes must then do the same, and it's all recorded in a "manifest". On its own there's nothing wrong here. The issue is that in order to sign, you must have a certificate which basically proves your identity, exactly the same way domains are handled. The certificate of course comes from a certificate authority, and therein lies the problem: they become the arbiters of truth. If they decide not to issue you a certificate, you cannot validly sign your content and be approved according to C2PA standards. For approved apps, devices and companies, getting a certificate is no problem, but anybody not "approved", well... From a dev's standpoint I actually like the cryptographic signing aspect: you can prove cryptographically that you made a change, or at least are an originator of a file. It's the certificates that are a black hole.
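
    To make the signing part of that comment concrete, here is a minimal sketch of the sign-a-manifest-then-verify idea in Python with the cryptography package. It is not the real C2PA format: the manifest fields, file name and key handling are made up for illustration, and in the actual scheme the key would be bound to a CA-issued certificate rather than generated on the spot.

        import json, hashlib
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
        from cryptography.exceptions import InvalidSignature

        # In C2PA the signing key would be tied to a certificate from a CA;
        # here we just generate a throwaway key pair.
        private_key = Ed25519PrivateKey.generate()
        public_key = private_key.public_key()

        image_bytes = open("photo.jpg", "rb").read()
        manifest = {
            "asset_sha256": hashlib.sha256(image_bytes).hexdigest(),  # binds the claim to this exact file
            "creator": "example-photographer",
            "edits": ["crop", "exposure"],
        }
        payload = json.dumps(manifest, sort_keys=True).encode()
        signature = private_key.sign(payload)  # the manifest is signed, not the pixels directly

        try:
            public_key.verify(signature, payload)  # any change to the manifest or the file hash fails here
            print("manifest verified")
        except InvalidSignature:
            print("manifest was tampered with")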

    • @Storygospel533
      @Storygospel533 Год назад

      Sounds like a billion dollar issue that Microsoft will have conveniently already thought of

  • @HarpaAI
    @HarpaAI Год назад +6

    🎯 Key Takeaways for quick navigation:
    00:00 🎨 AI-generated art has sparked controversies, with concerns about job loss for artists and unauthorized use of their work to train AI models.
    01:08 🛡️ Artists are seeking ways to protect their artwork, with a popular solution being to ban the use of artists' work in training AI.
    02:02 💼 Industry professional artist Karla Ortiz is leading the anti-AI art movement, advocating for data privacy laws and ethical AI rules.
    02:59 📜 A proposed "anti-impersonation right" aims to protect artists from someone imitating their style or likeness using AI tools.
    05:05 📊 Adobe proposes the c2pa metadata system to track the origins of AI-generated media, potentially leading to centralized control and censorship of content.
    07:37 🚫 Critics fear that c2pa could erode trust in online content, jeopardize privacy, and create a system of mass surveillance.
    09:15 🤝 Adobe's involvement in proposing c2pa raises suspicions of corporate interests leveraging copyright concerns to their advantage.
    11:23 🔄 Rejecting c2pa might not offer a perfect solution, but it could prevent the negative consequences of centralized control and censorship.
    Made with HARPA AI

  • @limo_was_here
    @limo_was_here Год назад +5

    During the hearing, an opt-in mechanism was discussed. There's also been talk of invisible watermarks that would disrupt AI's ability to learn from images (effectively creating an arms race, though it could be regulated to mandate that protection and to stop companies from trying to break through it, practically enforcing an opt-in mechanism, since only unprotected images would function properly in AI training); a toy sketch of that perturbation idea is below.
    I feel like this video is a bit alarmist. Sure, C2PA might not be the way to go, and I share some of your concerns with it (I don't want it to be implemented). But that doesn't mean the anti-AI-art movement is "on that side"; it's just a proposed solution from the corporate side.
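
    For what it's worth, here is a toy illustration of what a bounded, near-invisible perturbation looks like mechanically. It is only a sketch: real cloaking tools such as Glaze compute targeted adversarial perturbations against specific feature extractors, whereas this just adds small random noise, which would not actually disrupt training; the file names are made up.

        import numpy as np
        from PIL import Image

        img = np.asarray(Image.open("artwork.png").convert("RGB"), dtype=np.int16)

        rng = np.random.default_rng(seed=42)
        epsilon = 3  # maximum per-pixel change; tiny relative to the 0-255 range, so effectively invisible
        perturbation = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)

        # Clip back into the valid 8-bit range and save the "cloaked" copy.
        cloaked = np.clip(img + perturbation, 0, 255).astype(np.uint8)
        Image.fromarray(cloaked).save("artwork_cloaked.png")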

    • @cendresaphoenix1974
      @cendresaphoenix1974 Год назад +3

      I would rather people freak out than not care at all

    • @pine_13
      @pine_13 Год назад +1

      @@cendresaphoenix1974 This is actually a dangerous sentiment to have in general. Fear mongering only provokes people to action, which can lead to major consequences if the information they are acting upon is incorrect. It's a tactic people with power use to manipulate others to do what they want. Staying neutral at least doesn't lead to any direct harm. It also allows someone to be more receptive to more informed and beneficial solutions.

    • @gondoravalon7540
      @gondoravalon7540 Год назад

      @@cendresaphoenix1974 That's how we got the DMCA, patriot act, and the TSA ... no, people need to react with their brains, not their emotions. THINK ffs.

  • @dantepearl4186
    @dantepearl4186 Год назад +2

    In the late '90s and early 2000s, companies freaked out over music MP3s instead of trying to capitalize on them. They whined about Napster and screamed "piracy, piracy!!" Windows tried to lock up music with WMA DRM tracking crap, just like C2PA now.
    Then Apple sold digital music -- instead of bitching about it.

  • @matthewboyd8689
    @matthewboyd8689 10 месяцев назад +1

    Why can't these AI companies just buy a share of the rights to use an artist's kind of style, so that any time a prompt invokes that style the artist gets a share of the profit?

  • @MegaTang1234
    @MegaTang1234 Год назад +1

    3:38 fan art is already illegal; it just looks bad when a company goes after an artist for drawing fan art.

  • @carrotman
    @carrotman Год назад +2

    But if we can sign our own images... couldn't we just sign whatever images we want as our own?
    It's one more step, sure, but it still seems like it's relying on people not faking things?
    Which is unrealistic, because that's exactly the issue they're trying to solve.

  • @normietwiceremoved
    @normietwiceremoved 6 месяцев назад +1

    They're giving us a solution to the problem they created.

  • @free4fire
    @free4fire Год назад +4

    Goddammit, the only thing I (and many other artists) want is for it not to be allowed to just take our work and put it into those training datasets; all that other extra shit that's being snuck in to further screw up the internet isn't needed -_-

    • @cendresaphoenix1974
      @cendresaphoenix1974 Год назад +1

      That's what you get for trusting the government. Chances are corporations were going to use less of your art anyway, but y'all kicked up the dust, got the man involved, and now you, me, and everyone else can go get fecked. The same thing happened in the music industry, and now musicians are constantly terrified of copyright because they didn't like that someone used a little bit of their song in a YouTube video.

    • @ULTRATERM
      @ULTRATERM Год назад +1

      The problem is that there is no way to enforce what an ai model is trained on without a technology like C2PA.

    • @free4fire
      @free4fire Год назад +4

      @@cendresaphoenix1974 Wrong, this is what we get thanks to those a-holes who ignore other people's rights and use their work without permission.
      You're basically the guy who blames the shop owner for calling the police while being held at gunpoint if somebody gets shot during the arrest, ignoring that it was the guy who decided to rob the store who actually started the situation. Victim blaming at its finest.
      Also, don't compare us visual artists to the music industry: they've got billions of dollars behind them, while (aside from a few "artists" who help with the modern "art" money-laundering scheme) most of us don't have that much money, despite needing to sink thousands of hours into practice to get to an acceptable level (yes, despite popular belief, being able to paint well isn't all just "talent" but a massive amount of work, and probably 98% of all artists start out with the same stick figures everyone else can draw).
      And it's not just big companies using AI that would be a problem: many artists also rely heavily on private commissions for things like D&D campaigns, fan art, etc., which are now basically gone thanks to AI. Yeah, we can't stop people from using available technology or force them to pay us for commissions instead, but at the very least our work shouldn't be used without our consent to basically build our replacement.
      Plus, not all of us ran to the government, since many of us (me included) aren't even from the US, let alone trust the government or big corpos, so I've got zero say in this and can basically only sit down and get f*cked over from every direction here.

    • @gondoravalon7540
      @gondoravalon7540 Год назад +1

      @@free4fire I'm not sorry, that's horseshit: it implies 1) that the people who make laws can't be lobbied into passing laws written with specific interests in mind, and 2) that the govt, despite being the one making the laws, has no agency in making sure they're sensible.

    • @PaleFolklore
      @PaleFolklore Год назад

      You are so naive

  • @obi-wan-afro
    @obi-wan-afro Год назад +6

    There are people who appear optimistic about this situation, stating that AI will be beneficial for art. I believe it will be detrimental; beginner artists who are just starting to learn will face many more barriers to continue pursuing this. All professionals were beginners who couldn't draw hands at some point

    • @goncalocarneiro3043
      @goncalocarneiro3043 Год назад +7

      It is inevitable that some people are lost along the way. But human determination is great, and there will always be someone who pushes through adversity and creates, even if they have little support.

    • @DajesOfficial
      @DajesOfficial Год назад +4

      @@goncalocarneiro3043 this is not a good argument, because an increased difficulty of entry will certainly reduce the number of artists. But I would argue that the difficulty will actually decrease, because nothing prevents (or at least nothing should prevent) artists from using the AI tools themselves. And if they do, they will be more motivated to improve, because they get pretty good results from the start and can learn to produce a good drawing step by step, instead of going from zero to a full painting the way it was done before.

    • @ultimamage3
      @ultimamage3 Год назад

      All the people claiming to be "professionals" now are _still_ beginners that can't draw hands.

    • @thinkbetter5286
      @thinkbetter5286 Год назад

      This comment and replies are why nobody will listen to the people trying to stop this law.

    • @gondoravalon7540
      @gondoravalon7540 Год назад +3

      @@thinkbetter5286 What do you mean?

  • @gaspachoo5046
    @gaspachoo5046 Год назад +1

    the exact same people who caused the problem are now offering the solution.

  • @Arthur-jg4ji
    @Arthur-jg4ji Год назад +3

    The problem is that the big corporations have enough money to pay artists to make art for their AI models and use that to be more productive. In contrast, small artists are going to struggle against these big corpos and their AI, because the corpos will be faster and more productive and won't need many artists. Then the price of a model trained in a way that "respects artists" will be extremely costly, like 200 dollars or more, so artists will buy these licences and use them, but they will lose a lot of money to these corporations for AI models that are, on top of that, censored.
    But hey, at least nobody stole their art! Their dream will come true! If the regulation they want gets passed, small artists will be poorer than before!

  • @00000a0009
    @00000a0009 Год назад +10

    Let's say they approve this new sort of nightmare law. Then someone could just take a camera with this new signature system, point it at a screen that cycles through all possible combinations of pixels, and that person would then own every possible image that hasn't already been taken. Then no one would be able to generate new art.

    • @herp_derpingson
      @herp_derpingson Год назад +4

      "All possible combinations of pixels" you might be underestimating the complexity

    • @edxotattoo
      @edxotattoo Год назад +5

      @@herp_derpingson yep, the number of possible pictures surpasses the number of atoms in the universe
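
      A quick back-of-the-envelope check of that scale, using an arbitrarily small 64x64 RGB image with 8 bits per channel; the observable universe is usually estimated at around 10^80 atoms.

          import math

          pixels = 64 * 64 * 3                        # a tiny 64x64 image, 3 colour channels
          digits = pixels * math.log10(256)           # log10 of 256**pixels possible images
          print(f"~10^{digits:.0f} possible images")  # ~10^29592, dwarfing ~10^80 atoms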

    • @enkvadrat_
      @enkvadrat_ Год назад +1

      it is the same with music and in that case someone has already done that

    • @00000a0009
      @00000a0009 Год назад +1

      @@enkvadrat_ yes, I saw the interview about the guys who did it. Interesting project

    • @00000a0009
      @00000a0009 Год назад +1

      @@herp_derpingson you would first have to see what the law defines as "different".

  • @marek_tarnawski
    @marek_tarnawski 11 месяцев назад

    - The video makes it sound like the whole idea is coming from artists. The C2PA initiative is Adobe's idea, not Karla's.
    - I watched the whole explanation from Adobe engineers, and the idea was to focus on tracking authentic content for news sites. The copyright thing is secondary here.
    - From what I know, you won't have to authenticate images unless your job is to deliver photography for news. It's for specific cases.
    - Image metadata carrying all this information inside the file already exists; it just gets lost quickly when you upload to certain platforms or the image gets reworked (see the sketch after this comment).
    The main thing that bothers me, though, is that it's coming from Adobe, since they are really into keeping their monopoly.
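
    As a small illustration of that existing-metadata point, standard EXIF tags can already be read straight out of an image file with Pillow; the file name here is made up, and, as the comment notes, many platforms strip these tags on upload.

        from PIL import Image
        from PIL.ExifTags import TAGS

        with Image.open("photo.jpg") as img:
            exif = img.getexif()  # plain, unsigned metadata already embedded in the file
            for tag_id, value in exif.items():
                print(f"{TAGS.get(tag_id, tag_id)}: {value}")  # e.g. Make, Model, DateTime, Software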

  • @Sasha444luvs
    @Sasha444luvs 11 месяцев назад +2

    Most artists, including Karla, don't want to copyright style. Karla said she was against that during that court hearing. What we want is consent, compensation, credit and transparency for the use of our work and data. You're making it sound like artists are pushing to copyright style when that's not what we are trying to do at all. Idk if you actually watched the full hearing, but she was clearly against that idea. Either you're misunderstanding our position, or you're trying to frame it in a dishonest way.

    • @Casadriss
      @Casadriss 3 месяца назад +1

      EXACTLY!!! Why isn't this comment more popular???

  • @zellator
    @zellator Год назад +3

    Thanks for the insightful video!! But the gamedev argument was really undercooked. "Think of the indie devs! They will not be able to use any AI assets in their games!!!"... imagine having to learn a craft to use it properly... the horror

    • @lip3gate
      @lip3gate 11 месяцев назад

      Or Just... idk... pay someone to draw it😂

  • @sorijin
    @sorijin Год назад +1

    Adobe sucks so bad; they have been slow-motion gatekeeping design and art for decades now

  • @Phrismo_Vekanandre
    @Phrismo_Vekanandre Год назад +1

    I can't stand the fact that this all would have been absurdly simpler if people had been transparent about their image training sources from the beginning

  • @wisemage0
    @wisemage0 Год назад +1

    "Raising funds to protect human artists" might actually be the sketchiest crowdfunding campaign I've ever seen.
    It actually boggles my mind that people were willing to give over 100k to something so rife for abuse.
    If they just take the money and run that would likely be the best course of action for both artists and proompters.

  • @assasinsbear
    @assasinsbear Год назад +1

    Great reporting!

  • @solid3259
    @solid3259 Год назад

    This is a letter to the man who sold the world:
    I can be good at math: give me the formula and a calculator. I can be good at art: give me a computer and prompts. I can be rich: give me connections to people and access to their dreams.
    That is the reason there are a lot of people who watch soccer but never play it.
    People who like to critique art, but not create it.
    If a movie, game or comic uses more AI than humans to make it, I won't buy it. I won't even see it. As far as I'm concerned, they can sell their "art" to other robots.
    The most precious resource art has is called... passion. Yes, AI can be incredible, nice, and great-looking. But in the end, it's just a calculator and a formula.
    I'm still an art student after 8 years, and I always will be. Art is a path I've taken. People around me see this situation as an economic disaster for my life. You can create a replica, but no one will forget the original. That, you can't erase.
    My hope is that in 20 years robots will begin to buy human art again. It seems like they will "feel" more than we humans do right now.

  • @skymessiah1
    @skymessiah1 Год назад +2

    I think a lot of this image-ownership drama will evaporate over time, since it seems likely that it will become increasingly easy to access models trained only on image sets where usage agreements have already been negotiated, and you, as the customer looking to generate some images, can buy into that agreement for a price. Businesses looking to use AI will find that appealing because they are often risk-averse and will be keen to simply buy their way out of any future legal trouble relating to asset ownership. Smaller, cheaper and likely less general models will probably also become available for things like texture generation that are within reach of indie devs etc., and costs there could be modest and not unreasonable if they come with a credible promise that future legal issues won't be a thing (and maybe some "paperwork" you can hand over to Valve et al.). There will also still be plenty of "wild" models being developed and released by enthusiasts who aren't worried about the provenance of their training data, but those probably just won't be appealing for commercial use. Some artists may still be sore about generated images of their characters etc. floating around the internet for free, but if broad commercial use of "derivative" generation is discouraged (because commercially palatable alternatives are available), I think that will more or less settle the matter, at least with respect to new laws being passed etc. Those are my guesses anyway!

    • @TheOriginalOrkdoop
      @TheOriginalOrkdoop 11 месяцев назад

      Thank you for this comment. I sincerely hope you are right. I love making traditional art, digital art, and AI art. I would hate it if a giant gatekeeping corporation started taking my work off the internet because it looks too similar to a style they "own". It's honestly frightening. Your comment gave me some hope that it will be OK in the end. I'm crossing my fingers for sure.

  • @f.eckert
    @f.eckert Год назад +1

    And when C2PA is established, artists will have to pay a monthly fee to have their work protected. Hm, protection money, that reminds me of something... ah, the mafia. But let's call it PaaS - protection as a service.

  • @look2wardsdatrees
    @look2wardsdatrees 11 месяцев назад

    The bicycling game clip shown is called Descenders. It's fire

  • @edxotattoo
    @edxotattoo Год назад +4

    Only digital artists should be concerned; traditional artists whining about AI is just pathetic

    • @mayankprasoontirkey369
      @mayankprasoontirkey369 Год назад +2

      It's because you're illiterate and don't understand that most traditional artists sell prints and other merchandise, not just the paintings themselves. Obviously they will complain about it.
      Regardless, why wouldn't a traditional artist complain about AI art? It's affecting the community in general, not just the digital space.

    • @paulaumentado1588
      @paulaumentado1588 Год назад

      No, they can be angry. Look at da Vinci and van Gogh, for example: their work is being bastardized by machine.

  • @NielsDewitte
    @NielsDewitte Год назад +5

    Adobe's whole deal is to offer, while pretending to be on artists' side, a solution so dumb that it weakens anti-AI arguments. Pretty much no artist thinks Adobe's suggestions are reasonable. In fact, most artists just want model makers to be open about which data they use and to be restricted (by law, not by automation) from using copyrighted work for commercial purposes. It's also ridiculous to blame artists for the current situation, seeing as it was model makers who felt the need to exploit flaws in copyright law in the first place, and as the proposed solutions come from a massive corporation charging digital artists way too much for its stagnant products...

    • @ULTRATERM
      @ULTRATERM Год назад +1

      How can you enforce creation and use of "copyright respecting" AI models without something like C2PA? How do you verify which media was used to train the AI, and which AI model generated an image?

    • @allanatiers9261
      @allanatiers9261 Год назад +1

      @@ULTRATERM you can identify which "art" was made by AI and just ban AI works from being used commercially in general. That's it. The 95% of "fast buck grifters" will fall right away, and for the rest of the people AI might actually become a useful tool rather than a replacement. No C2PA needed.

    • @NielsDewitte
      @NielsDewitte Год назад

      @@ULTRATERM The AI model itself does not need to respect copyright, only the data used for training. Due to the way diffusion works, if copyrighted material isn't in its dataset it's unlikely to produce copyright-infringing work. In the rare cases where it does, it can be dealt with case by case between the copyright holder and the infringer (a settlement, or the infringer agreeing to make changes to their work).
      As for how to enforce copyright-free datasets: for smaller datasets, like those used for LoRAs etc., we can simply expect the model creator to do due diligence, with existing enforcement like copyright takedown requests to deal with infringers. For larger datasets, we can build datasets meant for commercial purposes using public-domain images as well as artist contributions (if they opt in), and disallow the use of research datasets for commercial models.

    • @ULTRATERM
      @ULTRATERM Год назад

      ​@@allanatiers9261 there's no way to reliably detect ai generated media, and any attempt to do so would end up banning non-ai art as false positives.

    • @ULTRATERM
      @ULTRATERM Год назад

      @@NielsDewitte so how exactly do you prove someone trained a Lora or diffusion model etc. on copyrighted images?

  • @lexasusopra8704
    @lexasusopra8704 Год назад +2

    When Uber replaced taxi drivers, no one was fighting for them.

  • @espectroultravioleta
    @espectroultravioleta Год назад +1

    All the alternatives sound pretty dystopian and sad to people in the art world. I'm sure the conclusion we'll come to is that it would have been better if these technologies weren't implemented in the first place.

  • @scottschweitzer9774
    @scottschweitzer9774 Год назад +1

    What are the odds that C2PA gets implemented, and can we have some more follow up coverage on this please Mr. Bycloud sir???

  • @maxponce1668
    @maxponce1668 Год назад +2

    Once again, the CORPOS FEAR OPEN SOURCE

  • @radradder
    @radradder Год назад +2

    Fan art is already illegal..

    • @dbt4869
      @dbt4869 6 месяцев назад

      Is that a joke?

  • @UncleRay420
    @UncleRay420 Год назад +1

    Can't people just draw art similar to the original pieces? It's not like every art style is patented, right? These artists act like they own every art style. Ain't nothing stopping AI image generation.