Olivio, thank you so much for the overview of the StableProjectorz!
Multiprojection and Inpaint-Autofill were among the suggestions from our community; it was awesome to implement them.
I'll be adding even more features, such as albedo maps for physically based rendering, fixing the shadows, etc.
Thank you for your awesome help :)
This is really nice work, well done and thank you for creating this tool!
The addition of de-lit PBR maps would really let it challenge Substance Painter etc.
So basically, your software renders textures onto 3D models from the camera's view? And there's no way to automate this process in real time using your software?
Hi Igor, how do we get access to version 2.0.2? The all-in-one installer is installing 1.1.2
@@BerkErkul Hello! That's strange, which button did you click on the website? Maybe I gave it the wrong link.
The installer (EXE) should fetch 2.0.2.
PS: I have just purged the cache in Cloudflare; try to download again and let me know if that fixed it.
What mesh file types does it take in? I've saved out an OBJ from Blender and it doesn't show up in the list.
Olivio, remember: "lightning" is the electrical discharge coming from the sky, and "lighting" is illumination in a scene.
Exactly 👍
that's more my dyslexia than my lack of English knowledge ;)
I've been using this software for a while and wondering why it was so underrated, but thanks to your video the Discord is now popping with over 500 people. I hope it stays free. Big studios have this sort of stuff developed internally, and if things keep going the way they are, the only way to survive in the industry is being able to use new AI tools like this to boost productivity and push video games' visual complexity an order of magnitude higher. The guy developing this is the real GOAT.
There are some amazing people out there creating incredible things.
Nice, I played with the early versions quite some time ago; amazing to see how sophisticated it has become. Gonna try out the new version.
Skinning is why I gave up on Blender. Well… back to it! Thanks for letting us know about this :)
This has nothing to do with skinning.
The UI reminds me a lot of the 90s, it's so cool.
Nice, now we need it as a Blender add-on
There are already projection-mapping plugins for Blender
@@marilynlucas5128 Name? I want front and side projections matching seamlessly
@@marilynlucas5128 can you tell me more about this? Does blender have plug-ins to generate the texture/PBR? Say I made something simple like a donut, is there a plug-in that lets me use stable diffusion to apply a donut texture to it based on prompt? If you can point me towards a tutorial for that, that'd be awesome.
Physics-based 3D modeling is where continuity will emerge across generations. We are almost there.
Uh oh! It's another GAME CHANGER!!! 😆
@@97BuckeyeGuy lollllll
YouTubers with probably no idea about an industry be like
*This "__________" will change "_________" FOREVER!!*
Since it renders the 3D model and lets you move around it in the viewport, it must be using something like vertex painting or texture baking. I think it would be even more useful for a 3D artist concepting if the program offered options to either (a) export the model with the vertex painting or (b) bake the projections into 2D texture maps.
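Since this thread speculates about how the tool works, here's a minimal sketch of projective texture mapping, the general technique behind camera-based projection painting: each vertex is projected through the camera, and its screen position becomes a lookup coordinate into the generated image. All names here (`project_uvs`, etc.) are illustrative assumptions, not StableProjectorz's actual API.

```python
import numpy as np

def project_uvs(vertices, view_matrix, fov_deg=60.0, aspect=1.0):
    """Project 3D vertices through a pinhole camera and return UV
    coordinates in [0, 1] for sampling the camera's image."""
    f = 1.0 / np.tan(np.radians(fov_deg) / 2.0)  # focal scale from FOV
    # transform world-space vertices into camera space (homogeneous coords)
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])
    cam = homo @ view_matrix.T
    # perspective divide: normalized device coords in [-1, 1]
    x = (f / aspect) * cam[:, 0] / -cam[:, 2]
    y = f * cam[:, 1] / -cam[:, 2]
    # remap NDC to UV space [0, 1]
    return np.stack([(x + 1) / 2, (y + 1) / 2], axis=1)

# a vertex on the camera's optical axis maps to the image center (0.5, 0.5)
verts = np.array([[0.0, 0.0, -2.0]])
uv = project_uvs(verts, np.eye(4))
```

Baking then means writing the sampled colors into the model's own UV layout; faces the camera can't see get no projection, which is why multiple views (multiprojection) are needed.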
This is gonna be bread and butter for indies.
Finally! I can continue some of my game projects.
This looks like it can be a handy tool, thanks for showcasing it.
This is the kind of tools indie game devs are looking for 😍👍
Not really. We actually enjoy making stuff ourselves. This is the kind of tool non game devs want.
@@mehface Uses "we" for a subjective opinion.
@@mehface Not really. I enjoy doing some of the stuff, but not all of it. The same goes for coding: I love writing some smart methods myself, but for the simple and boring ones I let ChatGPT do them. This can help complete solo-dev projects that would otherwise fall into oblivion; it's like having a junior helping you. At the least, you can use it for placeholders while you finish texturing the final models.
@@mehface Not all solo indies are capable of wearing all the hats at once. I personally enjoy composing music, storytelling, level design and sound design, but I can't code and can't do 3D. So rather than spending another 10 years trying to master skills I don't even enjoy, I'd rather find a tool to fill in the gaps for me :)
@@IvanTeslenko Why not find a HUMAN to do those things with you?
Wow, making games with this is so fast. I'm very excited about AI's future.
Awesome, Stable ProjectorZ is perfect for game designers. Thank you ❤
Maybe actually say what the tool does in your intro, brother...
For people listening on headphones etc., I had no idea what this tool does without pausing and reading the description, as it came up in my headphones while I was in transit!
Just some constructive criticism though! :) This looks cool now that I've looked into it!
This is exactly what I've been looking for for so long.
Definitely want to see more of this tool
Quite amazing, thanks for sharing! I’ll definitely check it out!
this looks amazing. will have to try.
This is amazing thank for sharing this with us.
Yeah it's neat; I wish the author had a dedicated video on using the full power of ControlNet, image2image, etc. It seems like all you can do is generate entire pictures, mask off the parts you want to regenerate, and hope the result is closer to what you want. He does mention image2image is done via the art (bg) tab, but it's still vague and limited to me.
TBH this does a lot of the work, and finer touch-ups could always be done traditionally. AI doesn't have to do 100% of the job. This is a tool for people who more than likely already know how to edit a texture in the first place.
You can select individual parts of the model and do masking and inpainting
Now when will we get software that renders actual geometry from a prompt?
I've used it and it was veeeery handy!
Looks superb!
If you figure out a way to export for 3D printing, do let us know. Great video, thumbs up!
Exactly what I came to find out too
What about retopology? And the number of triangles?
I'm curious: does this then transfer the new AI images to the 3D model's UV locations?
That's cool, but I wonder when AI will get to the point of generating entire 3D models. Generating the textures is a neat step.
@@scetchmonkey007 There's no problem generating the textures; the main problems are the UV mapping and generating clean quad topology with correct coordinates and connections. Those are not easy to achieve.
@@xyzxyz324 Yeah, I get that. 3D modeling and textures are not really my forte (I'm an animator), but I do understand how annoying creating UV maps can be. I'm just waiting for more tools to help me out. Being able to create a 3D character with textures the way we do 2D images would be huge. Auto-riggers already exist, so you could have a ready-to-use unique character right away. But maybe that's a pipe dream; making 3D images is not how Stable Diffusion works. You would need Stable Diffusion to make a 2D image and have it converted into a 3D model in the same package, and that sounds possible, if very tricky.
AI can already generate entire 3D models; they are still crude, but they do exist (both as open source and as online services).
Ty for the tutorial, LFG, I'm using it now!! ❤😊
Are the models Steam approved?
I've been using it for ages! 🥰
what do you use to generate low poly models to use with this texturing tool?
Can it export the model to a STL or OBJ format?
It textures the model. It doesn't create the model.
@@OlivioSarikas Cheers. Do you know of anything like this that would produce 3D objects?
I'd rather use Substance; it is a lot more artistic and you have finer control
They are completely different things. And Substance is paid; this is free.
@@chrisgreenwell3404 They do the same thing. Plus, Substance is a professional tool. If you want free, then Blender is the way to go. These AI tools are super gimmicky, with little artistic input for anyone using them.
@@chrisgreenwell3404 Not really, they do the same thing. Substance is the standard though, and is a lot more satisfying to work with, especially as an artist.
@@hectorescobar9450 In what way can you texture a model with a prompt in Substance Painter? That's absolutely ridiculous. Have you even used StableProjectorz at all? I have both and know how to use both very well; they are completely different :)
You’re comparing apples and oranges
What is this tool? What uses does it have? What exactly did you show us? An intro giving us some context would have gone a long way
It's not working at all! It always gives me an error when installing (neuro net) and I can't generate anything! It's not connecting or anything. And what is the "black window"?
The black window is the command window and needs to stay open. For the rest, please ask in their Discord.
@@OlivioSarikas ok thank you
Plus you need to mention the usage rights when you make a game and sell it on Steam
I am a bit eEeEeHhHhH...? about Stable ProjectorZ's ability to "create" 3D meshes, and even its 3D texture/material generator for larger studios, for the same reasons everyone loathes AI and everything it stands for.
_Its 3D material generator for solo or tiny-team devs, on the other hand..._
If this could create PBR textures, that would be great.
An AI tool with a simple EXE install!? No way XD
Their website has been offline for like a day now. What happened?
Weird, when I go to this website I get a completely different page. It almost looks like a placeholder web page. Is anyone else getting this?
beautiful!
So this is for when you already have a 3D model, right?
Yes, but you can get tons of free models online, and it can still help you create textures and views for designs :)
@@OlivioSarikas I'm just asking because I don't have those 3D modeling skills. Thanks!
You never explain what problem this solves 😂. It seems to render 2D images from a 3D object that has no texture maps on it. I mean, what are the minimum requirements for our base model? Do we need our own UV maps? Is it only outputting 2D?
Olivio! I can't believe you haven't made a video on how to locally install Pyramid Flow. You are the only person I can trust to teach me how
Unfortunately they are still trying to get it optimized for lower-VRAM GPUs. I've even heard the 720p model only works on GPUs with a whopping 40 GB of VRAM
Waiting for this tool to come to 3D viz apps.
It's cool to generate 3D stuff, but can I port anything to, say, UE5?
It exports textures on your model so you can put them in anything that takes a diffuse map.
Games like DnD and WoW are heading for obsolescence real fast.
I really don't like that it's not open source; this won't be helpful for AI in the long term.
@@Captain_Wet_Beard wdym
3 minutes into the video, I'm seeing details of the UI, installation, etc., and I still have no idea whatsoever what this tool is for. Maybe start by introducing the basic subject before going into tiny details? I'm piecing together that it generates textures, but you never told me, and I have no idea what kind of results I can expect from it.
At second 22 it clearly says: "A free tool for making textures" 😆
@@OlivioSarikas There are lots of ways to make textures, and many aspects to texturing; I still had no idea what I was watching.
@@fammud1192 I'm not a 3D tutorials channel, so all I can tell you is that it makes textures.
I agree with this guy, we kind of jumped in without any major explanation or introduction. Very useful tool but the video could have given some kind of initial idea of what it was first
I need to test that
Your videos are the best! 😄
Phenomenal
Nice tool, but tyFlow in 3ds Max already does that
Nsfw capable?
Get some help
@Phobos11 oh like making a fake OF for money is so wrong?!
Does it work on AMD cards?
Wrong question; you must ask if AMD cards work with it
Looks slow, and with tons of stuff you need to fix and tweak until it's usable. Not sure I see a great advantage over just creating hi-res AI images and then applying them in Blender. The final UV map here is not something you can tweak or swap out either. That's an immense disadvantage if you love tweaking textures or trying new ones (which I do), or if at some point you decide it's not the style you wanted, or something needs higher res, or it needs to fit better with the rest, etc.
@@Vurt72 you are going to lose your job soon
@@proudz And what job is that? I haven't worked for almost 25 years now, and I'm very happy with that, I will add.
Make a web version
Does it come with a license to use all the work of the artists involved in its development?
I suppose that depends on the model you use. The tool doesn't have anything to do with the AI model licenses
You know the answer: no artists were involved in these models; it's just stealing their work without any consent or compensation
Whatever licenses apply are those of the Stable Diffusion models you use; it's nothing to do with the application
@@beanbeater So, when an artist visits a museum and gets inspired by what he sees there, that's stealing too?
Great, but 5min in I still have no actual clue what this “AI tool” really does.
@@Naundob Right, 4 minutes in and I can't tell if it generates the model or just the textures?
For me it doesn't generate anything; it doesn't connect, and there is some kind of problem with a "black window" connection 🤔 Installed it twice and always got an error with a neuro-net or something.
Way too complicated.
Lately I keep seeing videos where people talk and talk and explain things, and it is not clear at all what we are even looking at.
Why can't you take the first 39 seconds of the video to actually tell us what we are going to watch? What does this thing do? Tell us from the start. People who are not interested will downvote it because you waste their time. This isn't a personal attack; I will not downvote it. Personally, I didn't like that this tool doesn't generate texture on places it doesn't see in the camera view. That is not how this tool should work. It looks like someone adapted a 2D image generator to slap texture on a 3D model, but the AI doesn't actually see and understand 3D models, so it can't wrap the models.
Stealing from artists is a great thing, isn't it??
Idiot
Learning isn't stealing, Mr. Luddite; go back to your cave in the Stone Age 😅