Finally, I can be normal....
You win the comments section.
Lmao that's a good one
💀💀
I didn’t expect to see you here
I don't understand the joke :(
The results look a bit plastic, but it's definitely promising. It could be useful in post-production CGI lighting correction. Its application in fan editing could be interesting.
That's mostly my shoddy roughness map. Just wait until SwitchLight generates them.
Well, you could add a cartoon effect to the actor. Thinking about that, it could be amazing.
@JoshuaMKerr I think it's also a lack of subsurface scattering. Light doesn't just bounce off of our skin, it travels through it and scatters throughout the inside.
@GANONdork123 Yeah, Relight's results look good with that, but the Unreal Engine ones don't quite get it.
That says so much about realism. The final result does look plastic, almost like a video game character, but that doesn't make sense because it's an actual person being recorded. Maybe the key to realism is, after all, a good roughness map.
For making a roughness map, use Materialize; it allows for batch processing too. The only reason the skin looks plasticky is that skin in real life absorbs light, so whatever you're applying the video material to, you should use a mask to make the skin portion a subsurface scattering material. Then, with the roughness map, the lighting will look photorealistic.
I'm working on this right now :)
@@JoshuaMKerr Sounds exciting, can't wait to see the video!
@@JoshuaMKerr Any update on this? I really want to integrate SwitchLight into our workflow, but every time I revisit it, I come out looking as plasticky as ever, even with SwitchLight now generating roughness maps.
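For anyone who wants to experiment with the batch-processing idea above before SwitchLight's own roughness maps mature, here is a minimal Python/NumPy sketch that derives a crude roughness map from each albedo frame. The inverted-luminance heuristic, folder names, and file layout are assumptions for illustration only; this is not what Materialize or SwitchLight actually compute.

```python
# Minimal sketch: derive a crude per-frame roughness map from albedo frames.
# The inverted-luminance heuristic and the file layout are assumptions, not
# what Materialize or SwitchLight actually do.
from pathlib import Path

import numpy as np
from PIL import Image

IN_DIR = Path("albedo_frames")     # hypothetical folder of PNG albedo frames
OUT_DIR = Path("roughness_frames")
OUT_DIR.mkdir(exist_ok=True)

for frame in sorted(IN_DIR.glob("*.png")):
    rgb = np.asarray(Image.open(frame).convert("RGB"), dtype=np.float32) / 255.0

    # Perceptual luminance; darker areas get higher roughness in this guess.
    lum = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    roughness = np.clip(1.0 - lum, 0.0, 1.0) ** 0.5  # soften the falloff

    Image.fromarray((roughness * 255).astype(np.uint8)).save(OUT_DIR / frame.name)
```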
Doing a night scene inside a moving car would be cool to see, since the lighting is always changing as the car moves. Awesome work, man.
That's where I'm heading with this
@@JoshuaMKerr Sick, can't wait to see what you come up with 🔥
This is crazy impressive. I could see this being a huge piece of the arsenal for compositing and CGI artists in the near future.
It certainly won't be long before the results from this improve. Glad you enjoyed the video.
"who needs meta human when you have human?" has to be the most awesome thing I have ever heard
probably you'll need a 5th element multipass for that...
Haha thanks.
That's crazy! And you edited and relit PNG sequences one by one, which is even crazier!
Amazing! While it's still early days, pushing the technology to its limits like you've done here will surely help accelerate the development of this groundbreaking tech. Imagine where we'll be in just a year!
Totally agree. These guys need to be shouted about in my opinion.
@@JoshuaMKerr DaVinci Resolve, a freemium software, is also bringing this feature for video in its next update, and the beta tests work great!!
Is it better than their current relighting tool? The current tool seemed mainly useful for making subtle changes rather than totally relighting footage.
@@agent_op Are you talking about the one in 18.5? That’s not nearly as good as this.
@@johanfolke2971 Yes, I am talking about the one in 18.5, and I think it's good enough to be used. I know it's still not perfect, but it works for me!
Honestly, I kind of hope that Cyan Inc does this for the Riven remake. The FMV characters are what made Riven feel so real... and I think it's nuts that this could allow FMV-style moments in games again.
Interesting. I never thought about that
If I had seen your final footage entirely out of context, I would have never guessed it was shot with entirely flat lighting. This technology is amazing and shows great promise. As is right now, I can tell it would work wonders for a highly stylized film along the lines of Sin City.
Give it a little time (probably months) and the results should be even better than this. Glad you're excited too.
@@JoshuaMKerr When I was in film school, this would have been firmly the stuff of science fiction. Hell, everyone was still laughing off the prospect of studio movies shooting on digital media. I'm loving the proliferation of cheaper high-quality cameras and the wealth of amazing software available to the average indie filmmaker. The future is indeed bright.
I'm already using it in UE5. It gives me better results with the environment lights, and it saves you a few hours in compositing. Simply incredible. Thank you very much 😎
Glad this was helpful mate. It's a lot of fun and it's still early days.
This is obviously not perfect, but far, far better than I expected. I'm especially intrigued by how this tech might work on something like an iPhone that can capture LiDAR depth data alongside images.
It's always so amazing, but taunting in a sense, that the interfaces for these ground-breaking tools are so simple. Before, it would have been a challenging process that took lots of time and planning, but now here's this one button that can do it in seconds.
I feel that.
The potential for this is huge. Definitely going to be keeping a sharp eye on this software. Thanks for the video!
You're welcome.
Hugely promising for a fix up or two. Obviously better to get it in camera - but I can't wait to see how far this can be pushed, the number of times I've shot something and just gone "Wish I had a little more light" or "wish I'd put a rim light here" - even just basic stuff like that.
I think we are going to be very surprised at the speed at which this develops.
@@JoshuaMKerr I think you're right. Generally speaking this kinda stuff is developing REALLY quickly.
I've been expecting software like this to be invented, so here we go.
Just the beginning
Honestly, the roughness map would complete this. The only thing I noticed is that at 6:20 it looked a bit off, like it didn't quite match the environment. But it's a lot better than it would have been without this. Really cool tech.
Next version should blow this out of the water.
Yeah, this technique is super cool. I also use it every time I put real footage into a 3D scene.
That's great. Do you use the same or different software?
@@JoshuaMKerr One time I used Photoshop to create a normal map, then used EbSynth to make a normal map of the whole video and used it in Blender. The second time, I found an easier way: there's a node in DaVinci Resolve's Fusion, called the Bump node, which converts your video into a normal map.
The quality from DaVinci wasn't good enough for me. I also tried Photoshop, but its normal map isn't in world space; it's in tangent space.
@@JoshuaMKerr Yeah, this method is better than both of them. Thank you 😄
No worries
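To illustrate the tangent-space vs world-space distinction mentioned above: a tangent-space normal map (like Photoshop's) can be rotated into world space for a single flat video plane by multiplying each pixel's normal by the plane's tangent/bitangent/normal basis. The sketch below assumes a camera-facing plane; the basis vectors and file names are placeholders, and in practice you would take the basis from the plane's transform in the engine.

```python
# Minimal sketch: convert a tangent-space normal map into world space for one
# flat video plane. The basis vectors below are assumptions for illustration.
import numpy as np
from PIL import Image

# World-space tangent, bitangent, normal of the plane (assumed orientation).
T = np.array([0.0, 1.0, 0.0])   # image right
B = np.array([0.0, 0.0, 1.0])   # image up
N = np.array([-1.0, 0.0, 0.0])  # plane facing the camera

tex = np.asarray(Image.open("normal_tangent.png").convert("RGB"), dtype=np.float32)
n_ts = tex / 255.0 * 2.0 - 1.0                      # unpack [0,255] -> [-1,1]

tbn = np.stack([T, B, N], axis=1)                   # 3x3 basis matrix
n_ws = n_ts @ tbn.T                                 # rotate each pixel's normal
n_ws /= np.linalg.norm(n_ws, axis=-1, keepdims=True)

packed = ((n_ws * 0.5 + 0.5) * 255).astype(np.uint8)
Image.fromarray(packed).save("normal_world.png")
```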
This is precisely the method I used in my own video three months ago, but for me I used Blender and DaVinci Resolve. Cool stuff!
Feel free to drop a link here. I'd like to see your results
Hey, I remember watching Dr. Károly Zsolnai-Fehér's videos covering the papers on this technology! Neat to see it deployed in an actual tool people can use.
Love it! The filmmaking industry is heading towards exciting times 😍😍
It truly is.
Honestly, all digital virtual production looks like shit; it's just that your perception of what looks real is distorted.
I don't remember mentioning realism. But I am very excited about the potential of this technology. You don't have to share that enthusiasm.
Dude, you're gonna grow massive, God willing ❤
Here's hoping
One thing I noticed is that it doesn't take skin's translucency into account, like the redness from blood showing through the skin when it's lit; hence the "plasticky" look.
Yeah, there's no subsurface map for sure
Relight is amazing ✨👌just like you
this is insane 🤯 can't wait to see what the future will bring 🙏
Right there with you
That's spectacularly awesome! I can't wait until these kinds of tools are just built into compositing software~
You and me both!
Relight is in the DaVinci Resolve 18.5 beta, which came out three months ago, and there are lots of demos on YouTube.
Davinci is great for subtle relighting but their bump map is not detailed enough for complete relighting like this.
This is amazing. You lost me when you said you had to process each frame individually, but I'm sure that won't be a problem in a few months. Great stuff, mate!
Yeah their video normal map feature is desperately needed but it's coming.
thank you for the show and letting us know!
No problem, glad you enjoyed it.
Great to see this video getting the attention it deserves!
cheers mate
You can go from real life to next-gen graphics. Imagine what it will be like in three years.
That's what I keep telling people. It's just the beginning for all of this tech, and it's advancing very fast.
@@JoshuaMKerr We just need to find a way for the adult entertainment industry to start using it. Time and time again they have advanced the way in which we produce and consume media.
That final lighting render looks beautiful
Thanks man. Not perfect but interesting for sure
Your YouTube channel is absolutely amazing! The content you create is both entertaining and informative, keeping me hooked with every video. The production quality, creativity, and passion you put into your videos are evident in every frame. Keep up the fantastic work! 😊👍
Thank you so much, that means a lot to me. I hope to keep bringing you videos you'll enjoy.
@@JoshuaMKerr 🫰🫰🫰🫰
You deserve a sub for this. I just randomly had your video recommended and I'm glad it was.
Welcome aboard! Glad you enjoyed it
This is so exciting for more stylised live action.
Unreal Engine is the future of filmmaking.
What really needs to happen in parallel to this development is the ability to add shadows and reflections without motion capture. For shadows, one can imagine an algorithm that morphs a keyed video of an actor into an animated 3D mesh. It wouldn't look pretty, but the mesh wouldn't be visible; it would still cast nuanced shadows and reflections to add realism.
Specular maps and depth maps have been developed by SwitchLight. It's just a question of me making an update video to showcase them :)
@@JoshuaMKerr Does that allow shadows, reflections from any angle?
Early days, but thats the idea
Thanks for sharing this information, Joshua :) I absolutely love the final result.
Thanks for saying. I'm going to do another video and see if I can push it further.
Asset gathering performances... mind blown
Very interesting! You should do a test with much more dramatic lighting instead of soft lighting; based on how you were initially dragging those lights around, it almost feels like the result could look a lot more believable with deep shadows.
Worth a try for sure.
Man, that relit version of yourself seems to have it rough
but I really like the idea and how well it works, very nice video too :)
Glad you enjoyed it. It's very interesting stuff for sure.
They need to use this on so many movies from the last decade.
In its current state, I don’t think it’s very useful for relighting an entire scene, but it’s definitely useful in short bursts, like for example if there’s light from an explosion or gunshot that you couldn’t film practically
That's probably true. But I'd be surprised if it didn't improve very quickly.
@@JoshuaMKerr I completely agree!
This is mind blowing!
Really amazing. I can think of a few shots I could have used this on before. Will try it out 👍 Thanks for the video.
You're welcome. Glad you found it useful.
That is amazing! Imagine this tech with a higher budget; it could be incredible!! I wonder if Filoni and Favreau with ILM are already testing this, combined with the Volume, for the next Mandalorian or some other Star Wars project.
Still, it feels 3D scanned, that uncanny valley; it reminded me of LA Noire for some reason. But it is amazing!!!
I never found LA Noire that strange (but then again, I had just come from finishing GTA 5).
Wow! Please make a detailed video, maybe in Shorts format, on how to set up these maps in UE's timeline.
I should be doing this soon enough but might be on patreon.
Cool! I remember seeing something from Disney research a while back, but it just focused on faces and skin.
Yeah I saw that. Very interesting stuff
It's kinda like a budget volume set, looks really good
Thanks.
It's very cool, and relighting in post with normals isn't new, but it has limits. Because you're working with a flat 2D plate, you can't cast shadows.
Relighting people with normals generated from monocular video is the new part.
Regarding shadows the plane does cast onto the environment and there are self shadows. What am I missing?
The concept behind the tech is amazing, but the results make you look like a videogame character. It kinda smooths all the surfaces a bit too much, I think.
For now. I'm waiting for 4k normals and roughness to get us there.
Glass, lens, gaffer. 1080p, and even an iPhone camera with all its "imagination" and supersampling, is still a more flexible tool for capturing the right scene in camera. Old-school VFX with background replacement and simple foregrounds can still set up fantastic locations just fine.
There are many approaches and all have their merits.
It's the poor man's version of the screen stage used in The Mandalorian. One thing that I noticed as a 3D artist is the lack of backlighting around the alpha edges. When it's not reflecting the colors of the background, that's how you end up with the cutout look. I'd also love to see it use an environment map for the fill lighting to more closely match the color and brightness.
By environment map, do you mean an hdri?
Exactly, that’s a great example. That’s what I typically use but most image formats will work. I’m not familiar with Switch Light so I don’t know how it would use an environment map, if at all. The most likely way, I would think, is to create a fake reflective texture that overlays the image which is how old video games created metallic surfaces. It’s much less computationally expensive than rendering light rays. Keep up the good work!
@Chuntise Oh, it does a great job with HDRIs. You have to try it. I've been testing their standalone desktop software, and the results are 10x what you see here.
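The "fake reflective texture" idea described above is essentially sphere mapping (a matcap): you look up a small environment image using the normal's X and Y, the way old games faked metal. A minimal sketch, assuming view-space normals and placeholder file names; this is not how SwitchLight actually applies HDRIs.

```python
# Minimal sketch of matcap / sphere-map shading driven by a normal map.
# File names are assumptions for illustration.
import numpy as np
from PIL import Image

normal = np.asarray(Image.open("normal_frame.png").convert("RGB"), np.float32) / 255.0 * 2 - 1
matcap = np.asarray(Image.open("matcap.png").convert("RGB"), np.float32) / 255.0
h, w = matcap.shape[:2]

# Map the normal's XY (view-space) into matcap pixel coordinates.
u = np.clip(((normal[..., 0] * 0.5 + 0.5) * (w - 1)).astype(int), 0, w - 1)
v = np.clip(((-normal[..., 1] * 0.5 + 0.5) * (h - 1)).astype(int), 0, h - 1)

reflection = matcap[v, u]  # per-pixel lookup into the sphere map
Image.fromarray((reflection * 255).astype(np.uint8)).save("reflection_overlay.png")
```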
This was in After Effects over 10 years ago. But normal map and heightmap creation has seen a huge bump in the last year; there are many good new methods.
Which part was in Ae 10 years ago?
Could you make an in-depth tutorial on how to get your footage inside Unreal and connect your normals and albedo, please? I'm new at this.
Planning on doing this soon
They made a plugin for Unreal; it's so easy now! @@JoshuaMKerr
I'm surprised how great that normal map looks. :) The final lighting/comp results are 90% of the way to realism but still have that unplanned green screen comp look. Since it's a computer vision process, did you do any tests on distance from camera, other less common objects, etc., to see if it became confused about depth estimation?
Yeah, the normal is amazing and will improve with time. I was actually surprised how well it handles full body shots, objects and hands. All seemed quite promising.
@@JoshuaMKerr Amazing. It has no business being that good.
Great video. I subscribed.
Amazing piece of tech. Looking forward to seeing how this develops!
Welcome! I'm keen to see how it develops too. I'll definitely be making more videos as it does.
Being able to extract normal and albedo maps is going to be huge for indie game development too. It will make it easy to generate base materials for 3D models.
In Unreal, using Opacity Masks is always super harsh. Have you figured out how to use the translucency blend mode while also not losing the spec/norm/rough channels?
The alpha composite blend mode works nicely. Surface Forward Shading should be enabled.
I really look forward to their after effects integration!
Wow this technique looks amazing!
So much potential for improvement too.
It is truly a new age for indie filmmakers.
I think this is just the beginning
That is way better than the Stable Diffusion method!
I've tried literally everything by this point. Switchlight for the win
Impressive, but I wouldn't use it as a crutch. If I have a couple of stage lights and a green screen, I wouldn't necessarily light the actor completely flat every time. I would try to match the environment the best I can, and then use relighting for the details.
Which is a totally valid approach. But the flat lighting was just to try and push the tech as far as possible and also to aid the SwitchLight algorithm.
THIS IS INSANE!
Really cool stuff!
Can I ask how you managed to set all the maps on the plane? I don't know how you've done it, since the maps are a video 🤔 Did you set up a material, or what? I'm new to UE, so thanks for helping 🙏
Waiting for part 2 of this
Really interesting stuff! Great walkthrough, this can definitely be a game changer for filmmaking 🤯
It's going to be very exciting as it develops
This is amazing. Obviously, I had no way of doing it but this is so cool.
I hope it's a big hit
It wasn't too difficult. I think most people could achieve this.
Thank You for Your work man. Subscribing right away!
Awesome, welcome!
Great progress. Ironically, you look like a MetaHuman. I would try putting an SSS skin material on one instance of the regular plane and keying out the skin portion to reveal the SSS material version beneath.
It's become so much better since then too.
How to turn perfectly natural looking live action characters into uncanny valley mannequins in bad CGI. INSANE!
haha :)
How do you set this up in a level sequence correctly? Where did the roughness layer come from?
I'll definitely be making a video on this for patreon. Not sure what areas of this I'll be covering on the channel next
Yo Josh, how did you manage not to get motion trails in your image sequence when rendering???? Thank you so much, broddy
Could you make a tutorial on how you got the image onto unreal to actually be able to edit it?
Working on it
Amazing technology
Please make an update video on the workflow using this for unreal virtual production
On the list
@@JoshuaMKerr Thanks, you're a legend 🙏
I feel like we're stepping into a different kind of uncanny valley. Regardless, this is huge. Just the normal maps alone are an amazing feature. Let's see what they bring to the table next.
Can't wait to see how it progresses.
I'm not knowledgeable where visual effects and virtual production are concerned, but is SwitchLight basically baking a newly relit image using the HDRI so that it can be added to a scene in Unreal?
It can do that. Or it can export the normal, specular, roughness, depth and albedo maps for import into Unreal so your character can react to the virtual environment.
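As a rough picture of what an engine does with those exported maps, the sketch below relights a frame using nothing but the albedo, the world-space normals, and a single directional light (Lambert N·L plus a flat ambient term). Unreal's real shading model adds specular, roughness, shadows, and GI on top of this; the light direction, colours, and file names here are assumptions.

```python
# Minimal sketch: Lambert-only relighting from albedo + world-space normals
# and one directional light. Not Unreal's shading model; illustration only.
import numpy as np
from PIL import Image

albedo = np.asarray(Image.open("albedo_frame.png").convert("RGB"), np.float32) / 255.0
normal = np.asarray(Image.open("normal_frame.png").convert("RGB"), np.float32) / 255.0 * 2 - 1
normal /= np.linalg.norm(normal, axis=-1, keepdims=True)

light_dir = np.array([0.4, 0.6, 0.7])          # assumed key-light direction
light_dir /= np.linalg.norm(light_dir)
light_color = np.array([1.0, 0.9, 0.8])        # warm key
ambient = np.array([0.08, 0.09, 0.12])         # cool fill so shadows aren't pure black

ndotl = np.clip(normal @ light_dir, 0.0, 1.0)[..., None]
relit = albedo * (ndotl * light_color + ambient)

Image.fromarray((np.clip(relit, 0, 1) * 255).astype(np.uint8)).save("relit_frame.png")
```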
As someone who has been searching for some kind of normal map generation, AI or otherwise, I’m very excited about the applications this has beyond just video.
What do you imagine using it for :)
@@JoshuaMKerr I’ve had some ideas cooking involving 3d printing and was looking for something that could quickly turn photos into bas reliefs. Davinci has a similar relight function but the depth map is lower in fidelity
seriously impressive!
This is amazing!!!!!!
Thanks man. Good to see you
Well, this is handy. Two days ago I started learning Unreal Engine, and now I need to finish my project. Before I saw this video I was learning Unreal Engine, and now somebody made a three-day video so I can do it too. I know nothing about Unreal Engine; I needed to learn everything. See ya ❤
I hope my videos will be helpful as you learn
That's cool! I believe DaVinci Resolve Studio also has something similar. I don't think it can use HDRs yet, but I've seen demos where it generates normals and you can relight scenes using point lights.
Davinci's relight feature isn't really designed for this type of relighting.
@@JoshuaMKerr What is it designed for? If I remember correctly, the way it worked was that it'd give you a greyscale image that you could use as a mask for color correction. So if you lightened the image, it'd look like it was lighter. I could be wrong, but I imagine you could string together a bunch of those nodes for multicolor light. Or are you just talking about how it can't use HDRs as a lighting input?
The bump map it generates doesn't have enough detail for you to do complete relighting like this. It's a tool for more subtle touch ups but can't be pushed very far.
@@JoshuaMKerr Ah, I see. That's cool! Thanks for taking the time to explain the differences.
very impressive!
Wow, thanks, the video was great. I was just thinking about something like this a few days ago while editing a video texture in Unreal; it's insane seeing this.
I'm super glad it's finally here.
Do you have a link to SwitchLight? I did a search and couldn't find them.
www.switchlight.beeble.ai/#/
This is crazy. I had this idea so many years ago. Obviously, I had no way of doing it but this is so cool.
I'm excited too
Honestly incredible
I do somewhat wish this didn't use credits so it could ACTUALLY be accessible to most people.
I think their eventual pricing is going to be very reasonable.
@@JoshuaMKerr as long as it doesn't become a subscription service
It might have a subscription option but also pay as you go. I guess we will have to wait and see.
@@JoshuaMKerr If it does end up being a subscription, then it's an instant fuck no from me and many other people.
The rise in subscription-service popularity has been one of the worst things to ever happen to humanity market-wise, comparable to the invention of the pop-up ad.
Would LOVE a more in-depth tutorial on how to recreate this effect! I followed what you did and got a quite flat result in UE5 (I probably did something wrong). Either way this is incredible and very exciting!
Did you flip the green channel of the normal map?
You can also mess with the intensity
@@JoshuaMKerr I am clearly way too new at UE since I have no clue what that means LMAO. Will look into this, thanks for the response!
Don't worry. There was a time not too long ago I didn't know these things. I'm sure I'll get to make some tutorials soon.
@@JoshuaMKerr That would be awesome!
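For reference, the green-channel flip mentioned above (the OpenGL vs DirectX normal-map convention) is just as easy to do outside the engine. A minimal sketch with placeholder file names:

```python
# Minimal sketch: invert the green channel of a normal map so the Y axis of the
# normals points the other way (OpenGL vs DirectX convention).
import numpy as np
from PIL import Image

img = np.asarray(Image.open("normal_frame.png").convert("RGB")).copy()
img[..., 1] = 255 - img[..., 1]   # flip G
Image.fromarray(img).save("normal_frame_flipped.png")
```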
How clean and free from artifacts are the mask edges with moving actors against green screen?
It's impressive but not perfect. I still key in AE or DaVinci.
So the character (you) gets better lighting in the engine because of the normal map? But the downside is that you need to do each frame?
Well at the moment I had to process it in frames rather than as a video. But it shouldn't be long before this changes.
Not only does it look bad, now all the fun is taken out of filmmaking
It's just an emerging method of filmmaking and doesn't negate traditional filmmaking by any stretch. I had a lot of fun with it.
I didn't know there was an uncanny valley for CGI light.
I'm learning a lot of new things by trying this.
@@JoshuaMKerr It looks weird on the image now, but I imagine in a few months, maybe less, it will look as natural as real footage. Excellent work, by the way.
Thanks. Yeah it's a process. We'll get there.
yo this is insane
Thanks man
This makes me think you could turn the normal maps into depth maps and feed that into the pixel offset in the material, to get different fog density/depth of field/projected shadows/AO based on this different depth.
Depth maps are on SwitchLight's road map. This is going to be my next experiment.
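On the normals-to-depth idea: a normal map can be integrated into a relative height map (not metric depth) with the classic Frankot-Chellappa method, which solves in the frequency domain for the surface whose gradients best match the normals. A minimal sketch with placeholder file names; the sign convention depends on how the normals are packed.

```python
# Minimal sketch: Frankot-Chellappa integration of a normal map into a relative
# height map. Output is a normalized height image, not metric depth.
import numpy as np
from PIL import Image

n = np.asarray(Image.open("normal_frame.png").convert("RGB"), np.float32) / 255.0 * 2 - 1
nx, ny, nz = n[..., 0], n[..., 1], np.maximum(n[..., 2], 1e-3)

# Surface gradients implied by the normals.
p = -nx / nz
q = -ny / nz

h, w = p.shape
wx = np.fft.fftfreq(w) * 2 * np.pi
wy = np.fft.fftfreq(h) * 2 * np.pi
WX, WY = np.meshgrid(wx, wy)

denom = WX ** 2 + WY ** 2
denom[0, 0] = 1.0  # avoid division by zero at the DC term

Z = np.real(np.fft.ifft2((-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom))
Z -= Z.min()
Z /= max(Z.max(), 1e-6)

Image.fromarray((Z * 255).astype(np.uint8)).save("height_from_normals.png")
```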
DaVinci Resolve Studio actually has a similar feature they just added. It allows for relighting within your grading workflow and can output normal maps for videos.
Have you seen the relative quality of Resolve's surface map?
@@JoshuaMKerr Resolve grades the whole scene after the fact, but relighting in UE relights the whole scene during the fact.
Although the words used to describe them seem really close, those corrections and changes open very different dimensions for the storytelling.
While Resolve's surface/relighting improves the visual narrative, relighting with Unreal allows you to create another 'visual' narrative.
what a game changer!
Cheers man
This is insane 🎉🎉🎉
Thanks