#### Links from the Video ####
www.worldlabs.ai/blog
skybox.blockadelabs.com/
www.voia.com/
huggingface.co/spaces/Kwai-Kolors/Kolors-Virtual-Try-On
klingai.com/release-notes
x.com/Yokohara_h/status/1862573053051642080
x.com/azed_ai/status/1863267839735664743
x.com/Ror_Fly/status/1863658092526792760
github.com/lllyasviel/IC-Light
huggingface.co/spaces/lllyasviel/IC-Light
👋 hi
Wow! Kling is killing it. 🤩
Thanks for something new, for example the excellent 7-month-old IC-Light 🙂
AI will do true 3D generation once they figure out how to do vector embeddings for geometry. It's amazing to me that nobody has trained AI for this yet. I know it's hard, but the rewards are there waiting. Once they figure out the 3D stuff, they can do virtual Gaussian splatting with existing diffusion generation.
Yeah, 3D environments will be one huge step toward AI-driven Holodeck VR, which will basically change entertainment/gaming/training/education/etc. forever.
I wouldn't be surprised if that first bit with the limited exploration worlds was AI Gaussian splatting. 🤔 The limits are likely there to prevent you from breaking the illusion, like what happens when you move too far from a splat's center.
Just as a side note, there is a v2 of IC-Light with better quality, though it's under a different license for commercial use.
We really needed a better IC-Light for that purpose.
This is all pretty nuts. Great video. Thank you
Glad you like it
Kling Try-On looks impressive. Not necessarily something I would use, but it looks very convincing.
Yes, mind-blowing. I had to watch it 5 times to check if it is really AI.
Some cool looking tech on display there :D Nice new outro!
Thank you :)
In World Labs, hitting Q or E gives you yaw rotation; R and F raise and lower your camera.
Cool, good to know :) You can also use your Xbox gamepad.
If a world can be prompted and then Gaussian splatted, then when you hit the end of the world, or are about to, an agent can prompt more world.
Like a Dungeon Master agent.
That is the dream :)
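That Dungeon-Master loop is easy to sketch even though no released API offers it yet. Here is a minimal Python sketch where every name (`generate_splat`, `llm_extend_story`, `merge_splats`, `distance`, the `player` object) is hypothetical, standing in for a text-to-world model, a story-extending LLM, and a splat merger:

```python
# Hypothetical sketch of the "Dungeon Master agent" loop described above.
# None of these functions exist in any released API; they are stand-ins.

def dungeon_master_loop(seed_prompt: str, player) -> None:
    scene = generate_splat(seed_prompt)  # prompt -> Gaussian splat (hypothetical)
    while player.is_active():
        pos = player.position()
        # Expand the world before the player reaches the splat's reliable
        # radius, i.e. before the illusion breaks down at the edges.
        if distance(pos, scene.center) > 0.8 * scene.safe_radius:
            frontier = scene.describe_frontier(pos)               # what's at the edge
            next_prompt = llm_extend_story(seed_prompt, frontier)  # the DM improvises
            scene = merge_splats(scene, generate_splat(next_prompt))
        player.render(scene)
```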
Please make a workflow/tutorial for BG-remove + IC-Light as soon as it is available locally.
Thank you, i will check that out :)
IC-Light is old; I helped some redditors find it a month or so ago. I didn't realize not everyone is watching local AI models like a hawk...
Mind Boggling!
100%
07:40 Somehow I expected Apple would introduce this first…
Wake me when we have VR Holodecks. Shouldn't be long now.
Olivio, can you please make a video on how to install Try-On locally for ComfyUI? I tried but failed every time. And thank you, you are a savior, man, we appreciate you.
How do you get the generated videos to transition flawlessly into the next video? Do you use a screenshot of the end of the old vid? What should I put in the "input image" (video to video)?
You mean the rap video? I cut that in CapCut after the AI generation.
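For anyone trying the same trick: the transition only needs the final frame of the previous clip, which you can grab with OpenCV and feed into the next generation's "input image" slot. A minimal sketch, assuming your generator accepts a PNG as the image-to-video start frame (file names are placeholders):

```python
import cv2

def last_frame(video_path: str, out_path: str = "last_frame.png") -> str:
    """Save the final frame of a clip, to use as the next clip's start image."""
    cap = cv2.VideoCapture(video_path)
    count = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    # Seeking by frame index is unreliable with some codecs; reading through
    # the whole file frame by frame is the robust fallback.
    cap.set(cv2.CAP_PROP_POS_FRAMES, count - 1)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError(f"could not read the final frame of {video_path}")
    cv2.imwrite(out_path, frame)
    return out_path

# Usage: feed the saved PNG into the "input image" slot of the next generation.
print(last_frame("clip_01.mp4"))
```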
What is the closest AI we can use locally that gets quality as good as Kling Try-On?
Try Mochi. It's one of the best local options, but Kling is way better. Video is just a different beast, and the models that can run locally are too small to get good quality. For now, at least.
Is IC-Light something I can run inside of Forge, or is it its own program? Has anyone had any luck running it on 6GB of VRAM? Seems like a great tool for getting consistent lighting conditions over multiple renders. That's historically been a big challenge for kitbashing together cohesive scenes, where you need to set the denoising level high enough to get the stylization you want, but then the color balance tends to shift from one generation to the next. With this, it seems like you could render one frame and then reuse that lighting for subsequent shots in that same environment.
Not sure. Haven't tried it local yet
Isn't it in Forge under the Spaces tab?
@@fixelheimer3726
It is, but the Spaces in Forge s*ck. Though there isn't much there at the moment, I immediately had incompatibilities between two entries. I really don't like it when developers launch a lot of different new tools and functions without sufficiently stabilizing the current stuff.
Yes. IC-Light has been in Forge and Comfy for a while. Forge has a "Space" called "relight" or something like that. You have to install Spaces for Forge.
And yes, I run it in Forge with 6GB VRAM.
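The consistency workflow from the question above is simple to express, even though IC-Light itself ships as Gradio demos and Forge/ComfyUI integrations rather than a stable Python API. A sketch where `relight` is a purely hypothetical wrapper around an IC-Light pipeline: fix one lighting prompt and seed, then apply it to every kitbashed shot so the color balance stops drifting between generations:

```python
# `relight` is a hypothetical stand-in for an IC-Light pipeline call;
# the real tool runs as Gradio demos and Forge/ComfyUI integrations.

LIGHTING_PROMPT = "warm sunset light from the left, soft shadows"
SEED = 42  # a fixed seed keeps the lighting solution repeatable

def relight(image_path: str, prompt: str, seed: int) -> str:
    raise NotImplementedError("stand-in for an IC-Light relighting call")

# One lighting condition applied to every shot in the same environment,
# instead of letting each generation drift to its own color balance.
shots = ["shot_01.png", "shot_02.png", "shot_03.png"]
relit = [relight(shot, LIGHTING_PROMPT, SEED) for shot in shots]
```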
wow
I got Skybox, but I was getting better results using Flux with a 360 panorama LoRA. It turned out to be a waste, since doing the 3D effects in Blender yourself is quite easy.
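For anyone wanting to reproduce that Blender route: the core of it is loading the equirectangular Flux render as the world's environment texture, which a few lines of `bpy` can do (the image path is a placeholder):

```python
import bpy

# Load a 360 panorama (e.g. a Flux render made with a panorama LoRA) as the
# world background, so camera moves happen "inside" the image.
world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

env = nodes.new("ShaderNodeTexEnvironment")  # equirectangular mapping
env.image = bpy.data.images.load("/path/to/flux_panorama.png")  # placeholder path
links.new(env.outputs["Color"], nodes["Background"].inputs["Color"])
```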
In the video game industry, including among indie developers, effects such as dolly zoom, blur, and object tracking along terrain or surfaces have been standard features in 3D engine libraries for years, making AI unnecessary for these tasks. That said, these techniques can significantly streamline video editing workflows, saving hours of effort, even though depth of field effects are already widely available in many video editing tools and are a staple in both the film and gaming industries.
Yes and no. AI requires new ways of working and creates new ways to be creative. This is especially true for people who are not game developers. From my short period of trying to be an indie dev, I can tell you that nothing is easy or quick when it comes to making a game. So every bit of help is appreciated.
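For reference, the dolly zoom mentioned above really is just one line of trigonometry: the visible width at distance d with horizontal FOV f is w = 2·d·tan(f/2), so holding w fixed while the camera moves gives f = 2·atan(w/2d). A minimal Python sketch of that relationship:

```python
import math

def dolly_zoom_fov(initial_fov_deg: float, initial_dist: float,
                   current_dist: float) -> float:
    """FOV (degrees) keeping the subject's framed width constant as the camera moves.

    Visible width at distance d with horizontal FOV f is w = 2*d*tan(f/2);
    holding w fixed while d changes gives f = 2*atan(w / (2*d)).
    """
    width = 2 * initial_dist * math.tan(math.radians(initial_fov_deg) / 2)
    return math.degrees(2 * math.atan(width / (2 * current_dist)))

# Example: camera starts 10 m away at 60 deg FOV and pulls back to 20 m.
print(dolly_zoom_fov(60, 10, 20))  # ~32.2 deg; narrower FOV offsets the retreat
```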
IC-Light v2: it's still to be seen whether it will be open or not. Guess not.
Where's the usable version of World Labs? ❤😂🎉
Not bad 😅
this is ta ta tight tight tight ta tight !
tight tight tiiiiiight! :)
Haha! 😂😂
Wow
wow indeed :)
Need these AI worlds to be generated for VR headsets on the fly. Google could do this, or any company with massive GPU computing power.
Yes, totally. You can already do that with the 360 worlds from Skybox, but just looking around. I think in Unity or Unreal you could make it a bit walkable too ;)