>"A woman sleeping on the grass"
oh this is targeted XD
xdddddddddd
SD 3 licensing was a big mistake.
It's a commercial product now... sad.
But I'm sure the AI community will keep promoting it for them.
I think it was high time. They need money to train their models. I don't think donations would cover it. And there were plenty of services raking in big money from the work they did, while they didn't see any of it. I wish the license was more permissive for personal use, but people need to stop being so entitled.
@@NevelWong It's not an entitlement. The models they release are borderline unusable in their raw state.
Thanks to countless hours, GPU compute, work and money, the community develops additional models, ControlNets, IC-Light and whatnot, and that's what makes Stability special.
They single-handedly made anyone who had an incentive to develop and fix their product for FREE lose interest immediately.
In the end, why do I have to lose sleep, money and time developing your product, only to have to pay YOU for that privilege at the end?
Sounds dumb, doesn't it?
@@PAEz... The AI community is pretty pissed because of the extreme censorship in the model.
@@stratos7755 I know.
I love SD3. I've always wanted to see Picasso art with realistic humans.
Picasso does realism now
Laughs maniacally… I loved the comparison to Stable Diffusion 3… and the careful review of their license. I was worried everyone would shrug and take the SD3 licensing like they did with Cascade.
YouTubers are a big part of the community around these projects, and draw a lot of people that eventually fine-tune, train LoRAs, ControlNets, etc.
I'm not sure requiring a separate license for creators was a good idea.
Thanks for taking the time to prepare this comparison. It's really good to understand where these other foundation models stand in terms of performance.
They're both really quite good out of the box too! I really hope the community rallies behind these models.
Absolutely splendid video, as always! :D
If I didn't know your channel I wouldn't even know these models exist, thank you ever so much for your work! ^^
The empty prompt test is a pretty cool idea - it immediately shows the average samples the model was trained on. When they look too similar to each other and too far from what you're going to create with it, it usually means the model is bad for your use case.
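For anyone wanting to try that themselves, here's a minimal sketch of the empty-prompt test using the Hugging Face diffusers library. The SDXL checkpoint and the step/batch numbers are just illustrative choices, not anything from the video; any text-to-image pipeline can be probed the same way.

```python
# A minimal sketch of the "empty prompt" test with Hugging Face diffusers.
# The SDXL checkpoint here is only an example; other pipelines (HunYuan DiT,
# PixArt Sigma, etc.) can be probed in the same way.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Generate a small batch with no prompt at all; the outputs roughly reflect
# the "average" imagery the model was trained on.
images = pipe(prompt="", num_images_per_prompt=4, num_inference_steps=30).images
for i, img in enumerate(images):
    img.save(f"empty_prompt_{i}.png")
```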
I’m not even downloading SD3. Absolutely no incentive. I'd rather stick with SDXL and the community models/LoRAs.
My god, not even using SD3 in the video is actually a very good reaction to the new license... so sad that all we can do is cancel the new model, because it's too censored and has this horrible license attached! 😢
Thanks for the great overview! HunYuan is better at following most of the prompts you gave it, better at composition, and better at proper human faces and hands. But I sadly think its name will hold back its popularity. Its name doesn't sound cool. It sounds confusing. 😅 Edit: Oh, it's by Tencent, now the quality makes sense. They are some of the best in the world at this stuff.
IP adapter soon! 😃
@@NerdyRodent Oh yeah true, Tencent are great at creating IPAdapter stuff, which I hope helps the popularity of this model. I can't even remember the name right now to write it again in this comment. Yuanhun something? That's a problem for its popularity. 😅 There's only one or two fine-tunes for it right now. Really hoping it gets more popular soon.
@@NerdyRodent The difficult name strikes again. I couldn't remember its name and literally had to check here again. I don't think the name was a good idea. 😁
Great video! It would have been awesome to test typography and text. Also scene composition: “next to”, “on top of”, “in the background”, etc. E.g. a woman holding a sign that says “Freedom” standing next to a police officer with a speech bubble above saying “Pay a license”. 😢
Thanks! I did some tests like that in the previous videos, and they’re quite good at composition, not so much at English text. The latest HunYuan is also much better as they’ve fixed the colour issues. Whilst its license is also not truly open source, I think it’s reasonable enough for most people to use.
When comparing apples to apples, SD3 appears to have really fallen on its face here, at least for me. HunYuan took the gold overall. This was a fun and interactive experience. Nice work.
Exactly the video I was searching for, thanks
Could you make a HunYuan installation tutorial?
There’s always last week’s video 😉 HunYuan DiT - Open Source & Better Than Stable Diffusion 3?
ruclips.net/video/oDK0-KesWQo/видео.html
In my very unscientific and subjective test, SD3 only won with prompt #13 and it was virtually a tie with HunYuan DiT.
Good to know, thanks!
@@NerdyRodent Thank you for your work on this! I'm in the middle of doing a three-way test on 30 prompts. I modified your workflow slightly to also do the SDXL refinement on PixArt Sigma like your other workflow. I'll be posting the results either on Twitter or Reddit.
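For anyone wanting to try something similar outside ComfyUI, here's a rough sketch of that kind of PixArt Sigma generation followed by a light SDXL refinement pass, written with diffusers. This is not the creator's actual workflow; the model IDs and the strength value are assumptions for illustration.

```python
# A rough sketch (not the commenter's ComfyUI workflow) of generating with
# PixArt Sigma and then running a light SDXL img2img pass as a "refiner".
# Model IDs and the strength value are assumed for illustration.
import torch
from diffusers import PixArtSigmaPipeline, StableDiffusionXLImg2ImgPipeline

base = PixArtSigmaPipeline.from_pretrained(
    "PixArt-alpha/PixArt-Sigma-XL-2-1024-MS", torch_dtype=torch.float16
).to("cuda")
refiner = StableDiffusionXLImg2ImgPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-refiner-1.0", torch_dtype=torch.float16
).to("cuda")

prompt = "a woman sleeping on the grass"
image = base(prompt=prompt).images[0]

# Low strength keeps the composition and only refines textures and details.
refined = refiner(prompt=prompt, image=image, strength=0.3).images[0]
refined.save("pixart_sigma_refined.png")
```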
Damn, PixArt gives some beautiful images.
This isn’t even its final form…
That really is a lovely Cathulhu
Thanks
And thank you! ☺️
Hahahaha Nerdy Rodent brings us AI stuff in a very British way!!!
Wow top level snark :)
That was too far, Rodent! Or was it? ☺
It's weird that you can't even feature the model's results in a video.
Sadly, without a comparison with SD3 Medium
Would've cost him $20
@@WatchNoah He's not using this model commercially, so it's free.
@@bolon667 If he makes any money off this video, he is using it commercially.
@@Cara.314 Just release a new short video of SD3 pics and demonetize it
Bonus meme: if he stops paying them, he would have to delete his video if he used a derivative model of SD3.
Why? Only outputs created by the core model (aka SD3) are immune; anything outside that must be destroyed. I think this is their "we _really_ don't want people undoing our censorship" clause.
Do you have a Discord server?
Why did you flip the sides?
It's pronounced /hʊnˈjuːæn/ or Hun-you-wen
👋
You can for sure show SD3 without the licence; it is not meant for you.
People are acting like this is such a terrible licence, even in use cases it isn't meant for.
It is meant for people making money DIRECTLY from SD3 models.
Your job is indirect.
By that logic, a creator who teaches stuff like Unity or Unreal Engine on YouTube should also pay them, right? Because they are making money?
Absolutely not, but if they make a game with Unreal or Unity, then they need to pay for USING the product.
People like you are deliberately making the situation worse.
Cascade had a similar licence.
Did people pay for it before making videos about it?
I'm not a lawyer, so I don't know for sure. However, to be safe, I think it's best to be cautious unless you're 100% certain. Also, asking Stability AI for clarification on this part of the license could be helpful for everyone.
It was billed as open source
I asked in the official SAI Discord server and they confirmed that you can't use SD3 in a monetized YouTube video (or with Patreon), since you are making money using SD3. In that case you would need to purchase the $20/month license.
Trusting a corporation not to abuse vague licenses is like trusting a dog with a pack of raw steaks.
Quit speaking on issues you have zero knowledge of.