Generative AI is Not Theft

  • Published: 20 Oct 2024

Comments • 63

  • @dicebringer
    @dicebringer 13 days ago +1

    Met an old-school oil paint artist who worked as a painter in the RPG and gaming industry back in the day. Claimed digital art wasn't "real art" and that it was stealing jobs away from real artists, etc. Sans the "stealing" rhetoric, it was eerily similar.

  • @Zelflix
    @Zelflix 1 day ago +1

    I agree with you. It allows for so much creativity at a greater pace.

  • @HuygensOptics
    @HuygensOptics 5 months ago +3

    I mostly agree with you on this. Legal issues arise when something generated is too similar to an existing copyrighted work. In that case the "creator" can be sued for plagiarism, especially if it's used in a commercial setting. But that is basically unrelated to the method of generation. It's the end result of this process and what you do with it that determines whether it is intellectual theft or not.

    • @AGI-Bingo
      @AGI-Bingo 5 months ago +1

      Yo, it's Huygens! The video where you showed how there's no quanta, that a "particle" is just an interpolation of all the waves in a normal distribution in a region of the spectrum... It's waves all the way down. That was so enlightening ❤

  • @creuzedeterosicleidegomesd4290
    @creuzedeterosicleidegomesd4290 2 days ago

    I strongly agree with your point.
    AI tools have democratized the production of art, allowing those with an artistic soul to express themselves without requiring the technical mastery that, due to various factors, they may not have been able to acquire during their lifetime (which is truly unfortunate).
    Despite this, the ethical and moral debates raised by the artistic community are merely attempts to sabotage technology (which can be used to their advantage) solely to prevent the devaluation of their work. This closely parallels the situation you highlighted in the video comparing Uber drivers and taxi drivers about ten years ago.
    Regarding the ethical debates, I seriously doubt that these same artists who bring up this issue didn't download music online during the Napster era!
    Ultimately, I believe that artists who are unwilling to evolve or adapt to the new reality are trying to use a false moral stance to suppress a technology that has only positive contributions to offer society. Just as with the Uber x Taxi Drivers conflict in the past, I don't see them winning this battle.
    Congratulations on the video!

  • @spark300c
    @spark300c 5 months ago +3

    Generative AI is theft because it has to make a copy of said work and then chop it up. Copyright makes it illegal for any machine to make a copy without authorization.

    • @MarkEndsley
      @MarkEndsley  5 months ago +2

      @spark300c that is simply not true. The data is not stored anywhere in the model. What AI does instead is look, listen and learn, much like people do.

    • @spark300c
      @spark300c 5 months ago +1

      @@MarkEndsley Then it's scraping off the internet, which is worse. It uses images it isn't allowed to use. In fact, people have caught A.I. copying and pasting someone else's music given the right keywords. It may learn the way people do, but there is no law against a human holding a copy of something in their brain, while there are laws against machines making copies. If the A.I. were disconnected, it would require a database to draw upon. This is where the idea comes from that generative A.I. may break copyright.

    • @MarkEndsley
      @MarkEndsley  5 months ago +2

      It doesn't actually store the data! You could disconnect it from the internet and run it on a local network and it could still work!
      AI will never copy-paste; it can't, because it doesn't have the data on hand. It gives you things it thinks you want based on what it has witnessed in the past.
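      To make the "runs offline" point concrete, here is a minimal sketch (not from the video or this thread) of generating an image from a locally stored checkpoint. It assumes the open-source diffusers library and a model folder that was downloaded ahead of time; the folder path and prompt are hypothetical, and once the weights are on disk no network access is needed:

      ```python
      # Minimal sketch: offline image generation from a local checkpoint.
      # Assumes "torch" and "diffusers" are installed, a CUDA GPU is available,
      # and the weights were already downloaded to the hypothetical folder
      # ./models/stable-diffusion-v1-5. Nothing is fetched at run time.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "./models/stable-diffusion-v1-5",  # local files only
          torch_dtype=torch.float16,
      )
      pipe = pipe.to("cuda")

      # The model is not looking anything up; it samples from its learned weights.
      image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
      image.save("lighthouse.png")
      ```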

    • @spark300c
      @spark300c 5 months ago

      @@MarkEndsley Generative AI has been caught copying and pasting: people have gotten stills from movies that were essentially unedited.
      So basically, if you ask it to give you a picture of Batman, it goes and searches the internet, finds an image of Batman, and shows it to you. It broke copyright law because it created a new image of a copyrighted character.
      Offline generative AI has its model data stored locally, which limits what it can do. So far there are no generative AI programs that I know of that you can use offline.
      I use a program that uses AI, but not generative AI. It's called Synthesizer V, and it stores voice samples which it uses to recreate the voice.

    • @gibbonbasher8171
      @gibbonbasher8171 2 months ago +1

      I'm honestly not buying this point, that AI is "stealing" others' art by basing the art it generates on human-authored art. Yes, AI isn't human, but it's identical to the concept of inspiration so...

  • @bonecircuit9123
    @bonecircuit9123 5 months ago +12

    The tool is fine, the illegally obtained data is not.

    • @MarkEndsley
      @MarkEndsley  5 months ago

      Hi @bonecircuit9123 this is an interesting concern, what illegal data collection activities are you referring to? If someone did breach someone else's security to obtain this data that would certainly change things. I'm not advocating AI for black hat hacking here.

    • @bonecircuit9123
      @bonecircuit9123 5 months ago +1

      @@MarkEndsley I posted a comment with the link to the LAION-5B dataset and the message might have been muted. To reiterate in case it was: that data and other datasets were obtained illegally, without permission, and used in commercial works. If the datasets are vetted, that's fine; otherwise it's contentious and a liability.

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      Oh yes! I agree that case literally would be theft! Generative AI itself is not theft, but you still can't steal in order to train your model!

    • @bonecircuit9123
      @bonecircuit9123 5 months ago +1

      @@MarkEndsley Commercial work being used to produce another commercial work is a no-brainer and is currently under litigation. Additionally, you obtain no copyright from producing said works. Cite: the Zarya of the Dawn copyright decision.

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      I think this is a point I will bring up in a future video. In this video I mostly ignored the fact that corporations do shady and very illegal things. These AI companies could already be crooked, or could turn crooked at some point. Certainly some people are accusing some of them already.
      I will go ahead and state now that just because I think AI is creating work the way humans do, that doesn't mean I believe these companies are above the law. I don't want the law changed unfairly to curtail them from their work, but using stolen data could certainly be a big issue with all of this in the future.
      I'm going to do another video about this in the future, probably soon. Thank you for your insight @bonecircuit9123

  • @rbus
    @rbus 5 months ago +5

    Also, fun fact I learned some time ago: sometime in the '80s in England there was an organization formed to try to stop people from using synths and samplers in recordings instead of real musicians. They had a name, but they faded into obscurity like every ridiculous attempt to halt technological development.

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      That is another good point, rbus. Computers have been mimicking real sounds for a long time; this next step isn't really that new. There are already tons of artists who use a computer as their only instrument.

    • @SF7PAKISTAN
      @SF7PAKISTAN 5 months ago

      Sampled artists get a cut when their music or voice is used in a song. Are AI companies planning to give royalties to the actual people who made that stuff, or to remove that data from their models if they ask?

    • @rbus
      @rbus 5 months ago

      @@SF7PAKISTAN You can trace a digitally sampled sound to the exact audio source; AI instead learns the characteristics of a sound from a number of recordings and recreates a new, original sound with those characteristics. Unless the sound of a saxophone is copyrightable, there is no claim.

    • @SF7PAKISTAN
      @SF7PAKISTAN 5 months ago

      @@rbus Copyright claims aren't just limited to actual content; they can also be applied to the style and essence of the artwork. If the AI-generated material is recognizable as the style of a human artist, then yes, there is a case of plagiarism and theft to be made there. And these claims will never go away unless AI can imitate "inspiration", which carries the essence of the work that inspires new art but is distinguishable as not simply being the original content rehashed.

    • @MarkEndsley
      @MarkEndsley  5 months ago +2

      I would disagree with that @SF7PAKISTAN. Generative AI's creation is true inspiration, it just isn't human inspiration. The point you mentioned about style also catches human artists sometimes: they thought the inspiration they had was "unique", but it turns out someone else was inspired in the same or a similar way and goes after them. I don't know any way, as a human artist, to be sure my inspiration isn't similar to someone else's, and I can't listen to every track ever copyrighted. So with both AI and human artistry you simply have to take the risk, the way the law is written today.

  • @ambi.music_
    @ambi.music_ 5 months ago +3

    I completely disagree with your initial statement in the first minute and believe the image is correct. Humans don't need references to imagine new things; the very fact that we as a civilization have built what we have from nothing is proof of our ability to create and imagine beyond what already exists.
    AI in its current form is just weightings based on previous art and knowledge. It might create something "new" by mixing and matching weightings of something old, like a half-dog/half-horse amalgamation, but it is not creating something new; it is just repackaging what already existed. It might even identify a pattern in the data that was previously missed and appears novel, but it is not.
    The easiest proof of this is "AI needs humans to create something of value to humans, humans don't need AI, or other humans to create something of value because they inherently understand the human condition". i.e. they can create something they perceive as valuable to themselves, out of thin air, and there is an immeasurably higher probability another human will also find it valuable, even if the original creator has had no prior interaction with another human.
    We can see this clearly when AI trains on its own generated data.
    "Whenever we are training an AI model on data which is generated by other AI models, it is essentially learning from a distorted reflection of itself. Just like a game of “telephone,” each iteration of the AI-generated data becomes more corrupted and disconnected from reality. Researchers have found that introducing a relatively small amount of AI-generated content in the training data can be “poisonous” to the model, causing its outputs to rapidly degrade into nonsensical gibberish within just a few training cycles. This is because the errors and biases inherent in the synthetic data get amplified as the model learns from its own generated outputs.
    The problem of model collapse has been observed across different types of AI models, from language models to image generators. Larger, more powerful models may be slightly more resistant, but there is little evidence that they are immune to this issue. As AI-generated content proliferates across the internet and in standard training datasets, future AI models will likely be trained on a mixture of real and synthetic data. This forms an “autophagous” or self-consuming loop that can steadily degrade the quality and diversity of the model’s outputs over successive generations."
    AI models in their current form are 100% dependent on the creativity and knowledge of humans. We know this because we know they need human-generated data to build the initial model; we also know that, given they have no inherent understanding of the human condition, they will devolve to garbage without consistent feedback from humans when training future iterations (a toy simulation of this loss of diversity is sketched after this comment).
    We also know humans are completely different from this in their creative ability as evidenced by the fact we live in a world full of stuff, and ideas that previously never existed. Algebra, Calculus, the Wheel, Antibiotics all novel discoveries possibly inspired by prior knowledge or accident, but completely new ideas to humanity in their own right.
    Reaching high proficiency in an art can take decades of training. How can it be rationalised that it's OK for big tech to commercialise someone else's art, stealing the data, fractionalizing/laundering it, and then reselling it as something "new", when we know its dependence on prior data is absolute?
    It's theft, they steal human creativity and understanding of the human condition, taking 0.001% from thousands of different sources, not creating anything new at all, repackage it and sell it back to us without compensating the original sources of the ideas, knowledge, or art.
    The original image is correct: AI in its current form depends entirely on humans, whereas humans do not depend on AI or other humans. What a ridiculous thing, to try to create an equivalency between LLMs and human creativity.
    To be clear, I’m not Anti AI, it’s amazing and will increase productivity, but it's wrong to discount the necessity of the collective human contribution required to make it work such that big tech gets to repackage and resell everyone’s hard work just so Jeff Bezos and Mark Zuckerberg can buy another 200m superyacht. If all the human creators stop creating, because they can’t make a living from it, and they know ultimately it will just get fractionalized and resold by big tech without their consent then the AI models in their current form will at best stay where they are, at worst will devolve as they end up training on more and more AI generated data. That should be telling.
    People act like these LLMs are going to bring about the next general relativity, when they can't even return insight for 2024 because they were only trained on data up to 2023. They are advanced search engines, but at least with search engines the website content providers could sell ad space to pay for their work. Now big tech doesn't need to pay anyone, it seems, and only big tech bootlickers and technophiles can't seem to see the problem here.
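    As a purely illustrative aside (not from this thread or the quoted research), here is a toy sketch of one mechanism behind the "model collapse" effect described above: when each generation is fit only to a finite sample of the previous generation's output, rare categories drop out and never come back, so diversity shrinks. The setup and numbers below are assumptions chosen just to make the effect visible:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_styles, n_samples = 100, 100  # assumed: 100 distinct "styles", small synthetic training sets

    # Generation 0: "human" data, spread uniformly over all styles.
    probs = np.full(n_styles, 1.0 / n_styles)

    for gen in range(1, 31):
        # Train the next model only on samples drawn from the previous model,
        # then refit its probabilities to that synthetic sample.
        samples = rng.choice(n_styles, size=n_samples, p=probs)
        counts = np.bincount(samples, minlength=n_styles)
        probs = counts / counts.sum()
        if gen % 10 == 0:
            print(f"generation {gen:2d}: distinct styles left = {(probs > 0).sum()}")

    # Any style that gets zero samples in a generation is gone for good,
    # so the distribution narrows with every self-training cycle.
    ```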

    • @MarkEndsley
      @MarkEndsley  5 months ago +2

      Hi @ambi.music, thank you for sharing your thoughts here!
      It sounds like you have the view that humans can create work without reference. I understand how this could be true, but I don't see any evidence of this. Humans began creating things in order to interact with the natural world. Theoretically, we could train an AI on only things from the natural world and it could do... something with that. But in our modern day those things wouldn't be very useful or interesting to us as most of them have already been done by humans.
      Could a human born today and raised alone in a cave create something similar to Mozart? I certainly cannot prove that they couldn't, but I have yet to see any evidence that backs this up. All evidence points to humans making things based on what they've been exposed to in their lives, including creative works from others, copyrighted or not. Humans are also repackaging what already exists.
      Humans aren't even the only species on this earth that creates works of art.
      Both humans and AI create things from the environment they are exposed to. Both a human and AI could create something new without any external influences (nature only), but it would generally only be of value to those who also live without external influences (nature only), which is nobody today.
      I also don't think that feeding the AI back its own output leading to poor results shows much of anything. If you isolate a human with their own thoughts, that also leads to poor results.
      I also disagree that AI cannot do anything novel or new. Without the limitations of physical instruments, AI does things in songs that a human could not do with a physical instrument. This alone shows it is capable of doing new things in the very same way humans do. When it finds a way to remove a limitation, it runs with it.
      As far as how the future will play out, that may be more important than the actual facts. I think where you and I differ here is that I see resistance to Generative AI art as helping billionaires in tech. They are going to push this technology through with brute force. When artists resist by filing lawsuits, they just set the barrier that high for entry in using these AI tools. In other words, if you don't have the capital to take on artist copyright lawsuits, you can't participate in the AI space. But guess what group of people have that amount of capital.
      I want to let you know though that I respect your opinion. I realize that there are many that feel that humans are special and we simply cannot be replaced in certain ways. I hope in a way that you are right, but I don't see evidence for that in the scientific world. I do however think if you are correct, AI simply won't be a problem for artists. If the AI can't truly create something new, then it won't, and nobody will bother with AI generated art.

  • @jayschankman214
    @jayschankman214 5 months ago +3

    If one goes to a museum with a sketchpad, stares at a Van Gogh and then proceeds to make an illustration that puts his own take on the Van Gogh, that is called a transformative work. All generative AI is transformative. Gatekeepers will find another hill to conquer, but this isn't the one that's going to give them the result they want. It's too late, and it's not going anywhere. And it's a cheaper alternative to human labor.
    Big business has a lot to gain by employing AI and firing human resources. Big business always wins here. Those who learn to work with AI will be at an advantage. Those who don't will be left in the dust with the horse and buggy.

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      I strongly agree, @jayschankman214. It doesn't make me happy, but I think you are correct.

  • @0MVR_0
    @0MVR_0 5 months ago +3

    Academia already has a mechanism to prevent intellectual theft, namely referencing.
    Just get the machine to spit out where this information was sourced,
    to better allow end users to evaluate the credibility.

    • @MarkEndsley
      @MarkEndsley  5 months ago

      I agree @0MVR_0, many of us who have been on the smaller-creator side already feel like copyright law is completely broken. While the academic model isn't perfect, and probably couldn't be used 1:1, I think it is a good way to start thinking about reforming it.

    • @0MVR_0
      @0MVR_0 5 months ago

      @@MarkEndsley This may sound simple, yet the process of encoding information disregards the original format. The best one can do is commit to an ad hoc reinvestigation of plausible citations during decoding.

    • @MarkEndsley
      @MarkEndsley  5 months ago

      I feel like today we could come up with a great digital solution for linking the data. If we had an internal database of everything that was copyrighted, we could link things referentially on the internet.
      ...I mean... we COULD do that; I highly doubt we would.

    • @0MVR_0
      @0MVR_0 5 months ago +1

      @@MarkEndsley hardly up to us.
      Thieves do exist

  • @user-zx8tv3ug5j
    @user-zx8tv3ug5j 5 months ago +4

    Not even gonna watch the video, but I will pose this question: if the art it uses was not preexisting, could it make art? Because if it has to use what has already been made, it's theft. I would like to make it clear that I don't care, but it's absolutely theft.

    • @craigmhall
      @craigmhall 5 months ago +3

      Can we not apply exactly the same reasoning to humans? Art does not spring forth from nothing, it is based on all of the art that has come before.

    • @MarkEndsley
      @MarkEndsley  5 months ago

      I completely agree with @craigmhall, Humans can only really do something they have learned to do from influences outside of themselves.

    • @avalerionbass
      @avalerionbass 16 days ago

      @@craigmhall People ALWAYS forget this part when making that argument. Every creative artist stands on the shoulders of giants.

  • @rbus
    @rbus 5 months ago +1

    If AI "copies" work, it must be the most efficient data compression known to humans. I've compared the size of the training data to the size of the resulting models, and it works out to a few bits per image (a rough back-of-the-envelope version of this arithmetic is sketched below), not even enough to store a file name. Yet you can kind of call generative AI a form of data compression. It trains by learning concepts by form and/or aesthetic, so that, given a concept and noise, it can produce an image by iteratively enhancing the concepts/aesthetics deemed positive in the noise while obscuring those deemed negative. It learns the most common relationship between two objects, which is why 'man' and 'horse' is drawn as a man on or standing near a horse, not a horse on a man. It will also likely draw a man with a cowboy hat, but not a man and a horse inside a house (unless explicitly prompted as a positive concept).
    The fact is neural nets are modeled on biological brains, and we've finally come to the point where the theory that large enough networks of analog weighted nodes can do human tasks is being borne out. It was just a matter of time until the training became possible, and people are freaking out because things are "moving too fast."
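    A rough back-of-the-envelope version of that bits-per-image arithmetic. The figures below are illustrative assumptions, not numbers from this thread: a model with about a billion parameters stored at 16 bits each, trained on roughly two billion images:

    ```python
    # Back-of-the-envelope capacity check (all numbers are assumptions).
    params = 1_000_000_000            # ~1B parameters in the image model
    bits_per_param = 16               # fp16 weights
    training_images = 2_000_000_000   # ~2B images in the training set

    model_bits = params * bits_per_param
    bits_per_image = model_bits / training_images

    print(f"model size: {model_bits / 8 / 1e9:.1f} GB")               # ~2.0 GB
    print(f"capacity per training image: {bits_per_image:.0f} bits")  # ~8 bits
    # A few bits per image is nowhere near enough to store the image itself,
    # or even its file name.
    ```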

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      Thanks for your insight! The part about file size is particularly good proof that AI doesn't sample content. I feel like the fact that it could not possibly contain the sampled material acts as excellent scientific evidence.
      And yeah, I agree it's strange that most people don't know most of this was already in the works. I remember talking about AI in college 8 years ago.

    • @rbus
      @rbus 5 months ago +1

      Neural nets have been around for about half a century; one friend got completely obsessed with them in the late '80s and created some gesture-recognition software for the 68000, written in assembly. The remarkable thing is how simple the code to implement them is. And now $9 dev boards come with incredibly powerful AI models for image classification, face recognition, license plate detection and pose detection, running at a trillion ops per second. Just around the corner, Microsoft's amazing 2.3GB Phi-3 LLM could be running on a single embedded chip to give devices natural communication abilities.

    • @MarkEndsley
      @MarkEndsley  5 months ago

      I figured the work in the '80s is what led to science fiction referencing neural nets for androids (thinking of Data, of course).
      At some point this may become ubiquitous. I still worry the cost of training a model may slow things down, but it seems to be getting cheaper and cheaper.

  • @jayhalloween
    @jayhalloween 5 months ago +5

    Literally click bait, and not worth watching more than 1 second. Came to let you know you are wrong.

    • @KrapTacu1ar
      @KrapTacu1ar 5 months ago

      Thanks

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      Thanks @jayhalloween, care to elaborate as to how?

    • @avalerionbass
      @avalerionbass 16 days ago

      How? Explain. Use your big boy words and tell us how much you ACTUALLY know on this subject.

  • @AGI-Bingo
    @AGI-Bingo 5 months ago +2

    Happy to find fellow agentic developers & wholesome humans
    Subscribed! ❤ Great points

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      Thanks AGI-Bingo, I'm seeing people organizing to spread misinformation out there, so I figure we'd better organize to spread the real information.

    • @AGI-Bingo
      @AGI-Bingo 5 months ago +1

      @@MarkEndsley Godspeed!
      I also wish we'd organize better; we need better coordination. I worry less about the whole artistic aspect, copyrights, and other legal propaganda. I'm much more interested in global access to intelligence. We need a decentralized, free and open source AGI for the people, by the people. #WholesomeAGI

    • @MarkEndsley
      @MarkEndsley  5 months ago

      I am 100% with you on that, @AGI-Bingo. The corporate turn AI has taken has not been for the better. I feel, though, that it all happens in the same space. Copyright law could be altered to disallow AI works, and that is something I feel I have to fight against. The law should reflect the reality of the technology.

  • @opelfrost
    @opelfrost 5 months ago +1

    Precisely.
    Copyright was meant to promote creativity; it's disgusting how people are using it to hinder creativity instead.
    Those who say it will make programmers obsolete are being silly. An LLM is like a higher-level language where we can finally code in English. It's true that your average Joe can 'code' now, but how would they fix bugs without the technical knowledge? Coding isn't just making it work on the happy path. The same applies to music/art/videos/etc.

    • @MarkEndsley
      @MarkEndsley  5 months ago +1

      @opelfrost, I agree, and to take it even further, I do not foresee a world where AI runs everything. With both coding and art (or are they one and the same?) managers would have to specify very clearly what they are looking for. That will never happen, therefore AI will never take over :)