This reminds me so much of the guy who made a working camera in blender, I love the solution "build it as real life" and bam, you get realistic result, what a surprise :D
@@ЭДЭ Nope, because you simulate the bending of light rays as they do in real life, and that affects how the elements are perceived by the camera, not just the colors, but the different "layers" like background, midground and foreground, plus you get accurate light effects when hitting a lense. All of that are impossible to get on post (for a video at least)
Oh cool. I only saw this after I asked if this would be possible in Blender. Is this camera build something that can be purchased as an add-on or something?
@@eladbari damn if only we were on something almost like a search engine where you type into a search bar.. such a shame.. But because you can't do this yourself apparently "Make Your Renders Unnecessarily Complicated" & "Achieving True Photorealism With Lens Simulation" are the two videos I ASSUME people are referring to p.s copy and paste those titles into the magical search bar
@@eladbariit's just a RUclips video. They've seen it years ago and would have to search for it as well, just like you. It's not an addon because it's completely impractical. Fun experiment, yes, but the added render time is enough to learn all about lenses and then about post processing and then create a realistic result that way and still have time left.
"Hey look i made a system to get lense aberrations!" "but we already have that, that's what all the options and sliders, etc are for ??" "yes but i made "virtual lens" system that has the same effects, with more steps, and it renders slower too!" "this is beneficial to me HOW exactly ??"
@@pieppy6058 "incredibly fast". ... so just to get a standard 24fps video, it takes 360 seconds, that's 6 MINUTES, per second of 24fps video. 360 minutes (6 hours) to render 1 minute lol. a 10 minute video would take 60 hours, that's two and a half days ... "incredibly fast" 🤣🤣🤣
I like how artificial imperfections seems more "natural" to us when consuming entertainment. Just like 60 FPS movies aren't a thing because we are too used to the lower FPS formats, and the higher FPS ones feel like a bad TV show. There are a ton of examples like this, like lenses flares added to game/movies artificially, even if it's something that appears on a camera and doesn't happen on an eye. So bascially "realism" is just recreating something that we are used to, not really getting close to the ground truth when it comes to light.
As a filmmaker / cinematographer, this is BRILLIANT. Such a smart, creative solution. If we're making virtual worlds, why not use a virtual camera with a truly virtual lens? Love it.
@@86Corvus And what do our eyes have to help us see the world outside? dan-da-dahhh ...Lenses ! Cameras are simply mechanical adaptations of what nature created for us.
@@Taiwanhiphopexactly!! We watch media not in person, but in a screen! From an image that’s been graded, edited, manipulated, etc. videographers and cinematographers are adding artificial imperfections on top of their footage all the time!
Not going to lie, this is absolutely genius!!! I'd love to know more about how you came up with the idea and the science behind all of this. You should basically make a doc about how you made this, without revealing your trade secrets of course. I got the 85 and am playing with it now. Will definitely be buying the set! I just want to encourage you to keep making videos and keep making these lenses. ANAMORPHIC WOULD BE A DREAM!! Also please keep blessing the community with brilliant ideas like this!!
Bearing in mind this is a channel about filmmaking and virtual production, I think realistic is exactly right. Images captured by completely perfect imaging systems dont exist outside of games and vfx software. So, to bring a sense of realism, the images need dirtying up.
It's funny how the camera manufacturers spend loads of money on minimizing artifacts that prevent the captured image from looking like what our eyes naturally see, yet we race to saddle our "perfect" virtual cameras with those same artifacts. If we'd had the technology in the early 20th century to perfectly re-create in a camera system what our eyes actually see, we wouldn't be having this discussion. We're trying to re-create imperfections and technological limitations that we've been forced to endure for decades instead of fine-tuning the virtual lens to more closely match the human lens. Other than shot-matching CGI to in-camera footage, or creating abstract art, why would we do that?
Yeah, I'm of the same opinion. Honestly wish they pursued actually looking like real life more. I don't care about all those garbage fancy effects. If I take a picture I want to take a picture that's as close as can be to what I'm seeing with my own 2 eyes.
It’s similar (identical) to how audio producers spend a lot of their time recreating analog artefacts and subtle distortion while tech companies work on making their equipment more and more high fidelity - digital “perfection” feels sterile and cold and the imperfections we grew up with and have grown used to evoke a certain feeling. Sometimes it’s nice to have the option of a perfectly clean starting point and be able to work in the imperfections for artistic flair/tone/taste, rather than the other way around where you feel like you’re fighting against the equipment and always falling short of perfection.
for the people who love photography and who do VFX compositing... unreal images are usually not perfect for them.. or to say are too perfect to be real. so this is brilliant work you've done, BRAVO! 💘
You can see there are a lot more noise that takes much longer to go away when using the lens, but I am sure denoising will be strong enough to remove them in the future
@@JoshuaMKerr so 3 days of render time for 1 minute of footage. That would somewhat dissipate the main advantage of rendering in Unreal which is faster render times.
That's all great, but it just doesn't make sense. Using stouch glass won't help to calculate a scene in real-time in Unreal Engine, and since UE is only used for backdrops in movies, it doesn't really make sense either. In Blender, 3DS Max, Cinema 4D, Houdini, and others, there's a function to adjust the lens, as if the image passes through glass. It's not a post-effect; it's simply the lens emulation, and it works without an add-on. Another issue is whether you want to simulate a specific lens or glass with unique characteristics, but in that case, you can't be sure that your material settings in a 3D program will correspond to the physical properties of the real object 100%. So, again, the idea doesn't lead anywhere. What would really be useful and worth doing would be to create LUTs for old films, which could, if possible, not just reduce colors to the final result but also emulate chemical reactions in film, producing the "real color".
What always amuses me is how hard we strive to make the thing you're looking at look like it's shot through a lens even when it isn't. Like, games where when you're in the rain the drops are running down your eyeballs, or you get lens flairs, even when the character isn't wearing glasses.
Agreed: In all seriousness, the amount of science that goes into making high end glass for cine is far beyond looking up a couple "simple" lens configs on the web or in a book for this or that focal length, creating a glass like virtual material (which is an entire OTHER ball of scientific wax), and slapping them in front of the virtual cam...Optical Engineers most of us are not...I'll take my chances using VFX programs and compositing things together for filmic pieces; and, anyway, gaming doesn't "need" such photorealistic effects, which would likely take away from the feeling of "presence" that is unique to that medium.
It's just a clickbait title for an interesting experiment. But it still looks fake and it doesn't solve the issue the creator claims it solves. And it only really makes it look closer to real photography, it doesn't work for moving images and especially not for videogames. The one important thing he's forgetting is that real life doesn't need any kind of filter to look like real life. Real life without filters looks perfectly realistic. You can use filters like lenses, photos, etc to highlight stuff or do it slightly different and it'll still look realistic. Not the case here where the input simply does not look realistic. All he's doing is removing details. It's kinda like that far cry looks like real life image. With a small low-resolution image it's hard to tell the difference. Of course when you're playing the game in full-screen you see perfectly well that it does not look anywhere close to real life.
I don't understand? Why use a realtime rendering system just to make it no longer real time? Why not use a traditional renderer and get better results? This makes no sense
@@pyit.876 but then why not use something like Blender or Cinema4D? Why use Unreal, which the whole point is that it's real-time, to do fancy film effects which take multiple minutes per frame?
Lens Manufacturers - Spend decades of development, & vast sums of money to create lenses with as little distortion as possible. CG artists - Your goals look fake. I'mma go back & recreate early versions of your work, & make the rendering process even more inefficient & noisy. :p I've had a think about doing this. Well done on actually making it work. It does look good! 2D comping kinda produces the "poor quality" images we are used to seeing, but not as well as playing with actual lenses. Real lenses are as close to perfectly smooth curves as it's possible to get. CGI models are reated using planes, which produces errors. Sure, subdividing helps, but doesn't remove the error, & adds computational difficulty. Nurbs/curves model? I assume they are fine, as there's no distortion of the model needed. I don't know if it's a practical solution for most productions, but it does look great!~ Post processing has one major advantage though. If you don't like the distortion, it doesn't take long to make & process changes. If you have to re-render the scene because you dind't like a lans choice made before hand... Yikes! Obviously, film makers all know the stuff I'm spouting already. ;)
That is so cool. When I did that in Modo about 10 years ago it took forever to render and I didn't even have the lens dirt and grit. It wasn't manageable for anything but still shots.
Well, all general CG application cameras are same as here - they are all clean, abberationless and no physical bloom. And nobody was complainin 'bout that from Houdini or Max users.This thing must be really super heavy to render. But I must admit it - the result is very sweet looking. Maybe too much for most cases, but lovely, aspecially softness of the image.
Love - we've done this in the past in Lightwave, Blender Cycles, etc. So cool that Unreal can do it as well - and love the range of effects you're getting. Cant wait to have a look!
Randomly ran into this and know it's far beyond my knowledge. And I hope my amazement at what you as a single persin has done here is warranted. Amazing.
This is AWESOME! I will defintiely buy those, thanks for making one free to test. Anamorphic definitely wanted as well. What is your Graphics card and does it slow down your render time?
I too am interested in ANAMORPHICS and ZOOM ... can your lens blueprints be applied on top of the anamorphics in ue5 (I'm sure not, but I had to ask) ,,, maybe in Jetset Pro?
Amazing video 🎉❤ My master's in computer science was on cameras in computer graphics and the main drawbacks that current systems in modern game engines have, which mostly boil down to an over reliance on post-processing effects to make up for inadequate lighting systems and overly abstract cameras. Your video is a huge step in the right direction in bringing awareness to how game engine cameras need an overhaul if we want better-looking scenes. Great job 👍👍👍
Great effect, nice work! Really interesting to see how the "feel" of the resulting image is similar to the result I got with my virtual camera in blender, no doubt because the lenses you defined were similar to the triplet design I used. I'm surprised with how fast it seemed to render. How long did it take (and how many samples) to get a reasonable looking result?
Looks cool, but it's going to be hell for sampling! You'll get a much grainier image, or spend much more time rendering. The best place to do this stuff is in a projection shader (or whatever name unreal uses for that step), so the bokeh and lens roughness can be importance sampled. Also if the engine treats camera rays as special (many do), it would preserve that classification.
Thats probably true. Im more interested in the possibilities across the next two version of unreal. adaptive sampling in 5.4 and an excellent new denoiser in 5.5 is coming.
Hate to break this to you, but all these effects can be achieved a lot more efficiently in compositing. What would be really cool would be to figure out how to achieve them with a post process material that runs in realtime.
Love what you've done here Josh! But I have a question. It may be just me, but I think the focus should still be able to become tack sharp, like real cinema lenses, but that's not what I'm seeing. Also, in your examples did you show the frames with Path Tracing in mid-render? Or is that the final rendered frame (for example the frame at 5:49). If it is mid-render it would be nice to see the final render. If it is the final render it's super noisy. Maybe I'm just not understanding the examples? Thanks, Gary 🙃
Hi man, they're, all mid-render. I show final rendered images on the site with before and after examples. The lenses can absolutely become sharp, but my focus (pun intended) of this project was to try and get the aberrations first and foremost and a lot of that is driven by materials on the lenses and the aperture planes.
Joshua, I admire your approach and result looks phenomenal. I have few questions as "conservative CGI artist" who used to DaVinci and Fusion and all that "2d" filter stuff. Why not Post Processing Material? It might be less performance hungry and can do the same effects. Yes it is much harder to make proper and need to know blueprinting or sometimes even C++. How much samples would you need for this to render without fireflies? So it all boils down to: how much performance is affected with this approach? Keep doing great stuff man!
@@JoshuaMKerr, I wish you had compared it to an image that included the post-processing that can be added within UE5. You can approximate much of what you are doing with pretty minimal overhead. Of course, the through-the-lens shot will look more real than the unfiltered 'perfect' Cinema camera. That's why the other stuff exists.
@@JoshuaMKerr You should really look into a way to do this inside the pathtracer itself instead of a 3D mesh. 3D meshes aren't perfect, and to make them perfectly round, you need a lot of triangles, which is very expensive.
Photographers spend tens of thousands of dollars getting great lenses that minimize these effects. This guy simulates the optics of a crappy disposable kit lens stored in a dust bin and is thrilled.
Once lens technology reached its peak and we had nearly eliminated all perceived flaws, it allowed us the freedom to use various lens characteristics as deliberate artistic choices. But because of the general attitude, it's more common to see such choices in filmmaking and in movie industry. Great work!
These lenses were built to work with Unreal's perspective based cine camera. To do what you're suggesting, you would need an orthographic camera to act as a sensor plane. This is possible but would alter the focal length of these lenses.
There are about three people (that I know of) who are doing awesome things like this but their work wasn't part of my process of learning and building this, and I'm creating for different software.
Good for him. Maybe this guy's not in a position to give this away for free. Either way, it was a ton of work, and looks like it has real value. If you don't want to buy it, that's fine. But don't complain that it's not free. Go make your own
This is insanely cool, I clicked on this out of curiosity even though I have never even touched unreal engine but as a vintage lens nerd who obsesses over these sort of characteristics in real glass I am just so impressed by how cool this is! Unreal should buy licenses for these from you and make it part of the application.
Excellent work, I thought about that myself every time I try and render a cinematic out of unreal. Glad you found the answer!! Definitely going to purchase this.
I would say that the title is a bit misleading. Very interesting experiment and result, but the reason is because UE pathtracer is comparatively new and limited in features compared to an offline renderer like Octane that has all of this functionality built into the camera already. If you're forced to path trace these effects in UE then I don't see why you wouldn't use a proper path tracer in a DCC? Nevertheless, I admire the passion and the achievement.
i wonder how far you could control for this with just a post-process material and render targets. like others have said, it looks like this will affect render times by a lot and create a much grainier image
Had a headache trying to make my own lenses in blender, CAN I GIVE YOU A KISS! EDIT AND IT'S ONLY 15 GOOFY DOLLARS, ok you can get a hug for that one. Edit 2: Where is the cart? It's pissing me off because I'm so excited.
This is really cool, I work on games and nothing in realism so I wasn't aware of this issue nor have I gotten into anything that would require the solution, but the idea of creating real world objects to better simulate reality in the simulation is very cool. Props to you on coming up with this, and I get some vicarious joy out of how stoked you are about your discovery and solution here.
Chromatic aberration and lens dirt effects are the first things I always remove in games settings. My eyes don't have those things. But it is surely possible to recreate eye lens effects no? Some fatigue, dryness, floating dirt in the eyes or on the surface, astigmatism..! Now that would be real 😉 Still amazing virtual camera lens you have there to transform light paths! Crazy tools for ultimate cinematics!
Octane Render has all the things you have reated just built in from custom aperture shape, bokeh bias, all sorts of lens distortions, exponential glow, spectral dispersion glare, chromatic abberation, ACES tonemapping etc. I have never tested Octane in UE but since it´s free, it´s likely having less features, But if those feature are existent in Unreal, you will find them at octane camera settings and octane render post settings. Render settings is global. Camera setting override global individually But anyway, creating all of this from scratch is crazy awesome. Kudos!
What he's doing is a little on the nerdy cinematographer side, so i can see how some would say its not needed considering UE already has camera effects. What's cool about it is that it opens the door for a similar process that cinematographers use which is choosing vintage, flawed or just hyper-specfic glass to achieve a certain look. And it's not just increased chromatic aberration or a certain boke shape, it's usually a mix of things that create a "secret sauce", a complex mix of variables (not just specs) that shape a unique look. What you get, and what cinematographers tend to look for, are those unexpected effects that are natural artifacts of the lens. Someone can engineer these types of lenses for other users to download. Like a Helios 44-2 Russian lens or Ultra Panavision 70 anamorphic from the 1970's.
This is INCREDIBLE!!! I really love seening the push in technology to go beyond the old limits of 3d work. Going from trying desperately to get a human looking shape on screen to being able to virtually replicate how a modern camera lens affects small details which drastically change the image.
All of these things you describe as being (potentially) great are fine, as long as you are making Interactive Movies/Stories. They are in fact garbage if you are trying to make Video Games or Simulations, in which the player/user is _not_ looking through a camera, but only out of his own eyeballs.
not everyone's vision is crystal clear 20/20. We have floaters, focus issues, near sightedness, far sightedness, uneven strength in each eyeball, so next would be to model a pair of eyeballs to look through
Actually, a lot of new games that people think look very realistic are achieving that effect in part by employing visual artifacts associated with cameras and imperfect lenses. Look at a game like Unrecord for example.
@@JoshuaMKerr My apologies... I didn't realize that you weren't speaking with respect to video game development. P.S. I had been watching videos on video game development and commented too hastily.
Joshua, this is so cool! As you may or may not know I made a video a while back about real life anamorphic lenses in Unreal Engine 5, but this is just taking that to next level creativity - I love it. Also as mentioned, it reminds me of the camera video from Blender haha. Shame it's not for Lumen which is what I use 99% of the time. 🤔
@@chelo111 whether it is in editor or rendered is irrelevant - it's cinematic preview, if anything it's just a lower quality preview of the final render.
@@y0blue thank you bro, i like that, i just wanna see in a real render, i wanna see the asset in action if you know what i mean....that's all. josh a real one we all know that
This is impressive. Well bloody done! I don't have a use for them yet, but if I ever do, I'll be supporting you. And I'll be telling everyone about this.
Also in VFX we delens plates in Nuke.....often in CG we dont render with lens distortion and apply the lens distortion in Nuke. Also in VFX we do NOT use Bloom. You might want to try manual focus in your camera settings. Your Bloom looks horrific...and comp sup in VFX would slay you. Your example of Bloom looks like someone licked the lens LOL. You might want to rent an Arri, Sony or Red camera.
Should've stopped at ARRI, in the context of filmic looks. Sony & RED are well known for their "too digital" looks. Hell, even an ARRI can be made to look like digital trash, just look at how Marvel used them... And the bloom does not look lens-licked, unless you do enjoy the Marvel look. It very much looks akin to what a lot of vintage lenses would output on film. But pedantry and the fear of finding new ways to do things and iterating upon them, vs the status quo, are art in themselves, I guess.
@@justaguyfromeurope I have no problems with new techniques. I love and use Unreal proffesionally in VFX.....im just dont agree with his look and methodology ...hes not getting the most out of UE. You dont have to fudge a camera..you can get all camera and mathematical lens specs on Cinematographers guild. And yes I have worked with Red, Sony, Arri from some of the world's top cinematographers. But if you would be happy to list your IMDB credits Id be happy to take your thoughts on Movie cinematography seriously.
@@justaguyfromeurope also with fudging lens....means he wont get plate line up.....cameras have to match what was shot on set....adding cheap 2D effects doesnt make in camera DOF....Also all VFX CG needs to be undistorted for comp. I would also recommend using a proffesional compositing program like Nuke
Great video, when Cyan created their masterpiece 'Riven' they used real-world textures to map onto their geometry, they are currently remaking this in Unreal and the textures don't look anywhere as realistic as the original. Great advice here, love it.
I just bought the pack! What setting in the blueprints or component of the lenses make them specifically compatible with 16:9? Can we adjust something to our liking so they work with other filmbacks?
They were modlled to cover a 16:9 Digital Film sensor. They don't automatically scale fro different filmbacks at the moment. Deffinitely something I'd like to implement.
I love this. It may not be efficient for workflow, but it takes the finickyness and mysticism out of how to get from point a to point b When you're searching for a look and you don't know how to get there, and you don't have real life vintage camera's and lenses to use for reference. You also get an incredible result in a way that moves several layers of abstraction. This is such a great tool, even if it's just used for learning, or a keyframe diagnostic/reference when you send renders down the pipeline.
As a VFx artist we use cameras that the production shot with. You can get camera data out of Nuke.... There are also cameras you can get cinematographers guild with exact camera specs with lens packages from the academy... No offense your home made cameras are trash.
I love how you had an idea, tried to solve it and it works. maybe even better than you expected. you can see it from start to end on your face. Love it! Great job.
This is exactly what I've been doing in Blender for the last few weeks - glad to see others loving lenses and having success with virtual optics in other software
@@hanktremainI saw a video on that topic which was super interesting. But it affects render times significantly right? Is it even plausible to render out animations with it in Blender!
@@JuanGoreTex Yeah, absolutely adds significant time to your renders. I've been rendering animations with success, but it's all subjective as to what you're willing to accept for render times and how complicated your scene is to begin with.
This is so amazing! I have been trying to recreate a lense on Adobe Substance to some degree i made it work but it was not perfect. This takes it to another level. It's so bizarre and unbelievable to know that a lens has been virtually created and has working functions just like in real life. Awesome job!
how does this effect how you approach virtual production? Do the artifacts and virtual focal lengths make it harder to realistically composite your live action footage?
From my own experience, if we speak about realism in animation, a main problem for Unreal and any other realtime rendering (and even for most software renderers) is a lack of good antialiasing and motion blur. Blooming can be simulated mostly in screen space. Lense flares also, with some tricks like rendering a second pass with a larger FOV. What cant be done efficiently is good antialiasing and motion blur. This type of physical lenses can be simulated accurately without a need of using raytrace. It's enough to use multistep rendering.
Impressive work, although this product is not for me. I prefer to have the msot control until the end of production, so I do not want to bake these imperfections in engine. I also think that these shots are way too soft - I use vintage glass daily in my film work and I usually shoot with 50+ years old lenses and they still have less halation / blur and are much sharper than your examples, so I hope it is possible to regain some sharpness even with these virtual lenses. What makes lenses truly unique is not only imperfection part but color rendition and contrast. Some lenses are better than others in this field. Nonetheless - good work, hope you will continue to develop this. :)
In game development you'd usually achieve this using shaders. Either by adjusting/rewriting the standard surface shaders or adding different shaders for certain objects. The standard surface shader in unreal does have a cartoon look to it.
This is so cool! I don't even use Unreal but love the concept and execution of this. I also totally understood that laugh at 2:06 🤣 spending 6 months on something for it to work properly is quite the dedication hahaha
This reminds me so much of the guy who made a working camera in blender, I love the solution "build it as real life" and bam, you get realistic result, what a surprise :D
isn`t it just putting a post effect just right before camera, making basically re-inventing bicycle?
@@ЭДЭ Nope, because you simulate the bending of light rays as they do in real life, and that affects how the elements are perceived by the camera, not just the colors, but the different "layers" like background, midground and foreground, plus you get accurate light effects when hitting a lense. All of that are impossible to get on post (for a video at least)
Oh cool. I only saw this after I asked if this would be possible in Blender. Is this camera build something that can be purchased as an add-on or something?
That was so cool!
Sirrrandalot I think
I did the same in blender years ago. It really makes a difference even though it added tons of render time back then
Wow I would love to see a tutorial for that
Many comment say there's a same tool for blender. Zero links or name mentions for such tool, though.... :\
@@eladbari damn if only we were on something almost like a search engine where you type into a search bar.. such a shame.. But because you can't do this yourself apparently "Make Your Renders Unnecessarily Complicated" & "Achieving True Photorealism With Lens Simulation" are the two videos I ASSUME people are referring to
p.s copy and paste those titles into the magical search bar
@@eladbariRUclips removes comments with links, but the name of the channel is "sirrandalot". Hope it helps!
@@eladbariit's just a RUclips video. They've seen it years ago and would have to search for it as well, just like you. It's not an addon because it's completely impractical. Fun experiment, yes, but the added render time is enough to learn all about lenses and then about post processing and then create a realistic result that way and still have time left.
Bro started selling virtual lenses
@@FoxInSocks23 Yes he did
@JoshuaMKerr nice
The metaverse is becoming real 😂
00:13 at this moment I paused the video and asked myself if I really want to keep watching. Sometimes, ignorance is bliss
Maybe it's just not that important to you
Ignorance may be bliss, but that's a luxury the artist can't afford.
@@brettharrison4740 Wise words
Render Time x1000
"Hey look i made a system to get lense aberrations!"
"but we already have that, that's what all the options and sliders, etc are for ??"
"yes but i made "virtual lens" system that has the same effects, with more steps, and it renders slower too!"
"this is beneficial to me HOW exactly ??"
Exactly, scale this up to render entire animation frames and we're looking at slightly blurrier result in... Several months instead of a day or so
@@cjjuszczak such backwards thinking. This will probably be the norm in the near future.
1000x render time is like 15 seconds. Per frame.
That’s like, incredibly fast
@@pieppy6058
"incredibly fast". ... so just to get a standard 24fps video, it takes 360 seconds, that's 6 MINUTES, per second of 24fps video.
360 minutes (6 hours) to render 1 minute lol.
a 10 minute video would take 60 hours, that's two and a half days ... "incredibly fast" 🤣🤣🤣
I like how artificial imperfections seems more "natural" to us when consuming entertainment. Just like 60 FPS movies aren't a thing because we are too used to the lower FPS formats, and the higher FPS ones feel like a bad TV show. There are a ton of examples like this, like lenses flares added to game/movies artificially, even if it's something that appears on a camera and doesn't happen on an eye. So bascially "realism" is just recreating something that we are used to, not really getting close to the ground truth when it comes to light.
Lens flares are a real part of every day life for anyone who wears glasses 🤓
I hate camera effects in games and I love my videos with high FPS/interpolated frames lol. It looks more real to me
Once you watch movies with 120Hz interpolation you can never go back.
Yeah, that's why filmmakers like old lenses. The modern lenses are too sharp, too perfect, it often feels sterile and boring.
anything above.. say... 25 fps, cannot seriously be considered cinema
Can you output some denoised images tho, it's really hard to tell being so noisy.
Anamorphic lenses YES PLEASE!!👍🏼👍🏼
👏🏼👏🏼👏🏼👏🏼 Bravo Joshua!!
literally came here for this comment
Thinking Selling my collection of Anamorphic adapters and lenses I collected over the last 15 years. Any interest in them still?
Theres already Jackimorphics for blender they do exactly this
@@enilenis that does nothing for the accurate depth of field, no?
I know a guy who make the anamorphic bokeh in Unreal, just saying
As a filmmaker / cinematographer, this is BRILLIANT. Such a smart, creative solution. If we're making virtual worlds, why not use a virtual camera with a truly virtual lens? Love it.
Because we have eyes, not cameras
@@86Corvus And what do our eyes have to help us see the world outside? dan-da-dahhh ...Lenses ! Cameras are simply mechanical adaptations of what nature created for us.
@@Taiwanhiphopexactly!! We watch media not in person, but in a screen! From an image that’s been graded, edited, manipulated, etc. videographers and cinematographers are adding artificial imperfections on top of their footage all the time!
Not going to lie, this is absolutely genius!!! I'd love to know more about how you came up with the idea and the science behind all of this. You should basically make a doc about how you made this, without revealing your trade secrets of course. I got the 85 and am playing with it now. Will definitely be buying the set! I just want to encourage you to keep making videos and keep making these lenses. ANAMORPHIC WOULD BE A DREAM!! Also please keep blessing the community with brilliant ideas like this!!
This reminds me of the "Make your renders unnecessarily complicated" video by sirrandalot, absolutely awesome work
I'm not sure if adding more camera imperfections can be called "more realistic". It looks more cinematic, not realistic.
Bearing in mind this is a channel about filmmaking and virtual production, I think realistic is exactly right. Images captured by completely perfect imaging systems dont exist outside of games and vfx software. So, to bring a sense of realism, the images need dirtying up.
It's funny how the camera manufacturers spend loads of money on minimizing artifacts that prevent the captured image from looking like what our eyes naturally see, yet we race to saddle our "perfect" virtual cameras with those same artifacts. If we'd had the technology in the early 20th century to perfectly re-create in a camera system what our eyes actually see, we wouldn't be having this discussion. We're trying to re-create imperfections and technological limitations that we've been forced to endure for decades instead of fine-tuning the virtual lens to more closely match the human lens. Other than shot-matching CGI to in-camera footage, or creating abstract art, why would we do that?
Yeah, I'm of the same opinion. Honestly wish they pursued actually looking like real life more. I don't care about all those garbage fancy effects. If I take a picture I want to take a picture that's as close as can be to what I'm seeing with my own 2 eyes.
It’s similar (identical) to how audio producers spend a lot of their time recreating analog artefacts and subtle distortion while tech companies work on making their equipment more and more high fidelity - digital “perfection” feels sterile and cold and the imperfections we grew up with and have grown used to evoke a certain feeling. Sometimes it’s nice to have the option of a perfectly clean starting point and be able to work in the imperfections for artistic flair/tone/taste, rather than the other way around where you feel like you’re fighting against the equipment and always falling short of perfection.
for the people who love photography and who do VFX compositing... unreal images are usually not perfect for them.. or to say are too perfect to be real. so this is brilliant work you've done, BRAVO! 💘
How much does it affect render times?
You don't want to know.
You can see there are a lot more noise that takes much longer to go away when using the lens, but I am sure denoising will be strong enough to remove them in the future
you need three 4090 gpus to ask this question
I can render these on a 3080 at 3mins per frame. Others in the comments get very fast results with a 4090.
@@JoshuaMKerr so 3 days of render time for 1 minute of footage. That would somewhat dissipate the main advantage of rendering in Unreal which is faster render times.
That's all great, but it just doesn't make sense. Using stouch glass won't help to calculate a scene in real-time in Unreal Engine, and since UE is only used for backdrops in movies, it doesn't really make sense either. In Blender, 3DS Max, Cinema 4D, Houdini, and others, there's a function to adjust the lens, as if the image passes through glass. It's not a post-effect; it's simply the lens emulation, and it works without an add-on. Another issue is whether you want to simulate a specific lens or glass with unique characteristics, but in that case, you can't be sure that your material settings in a 3D program will correspond to the physical properties of the real object 100%. So, again, the idea doesn't lead anywhere. What would really be useful and worth doing would be to create LUTs for old films, which could, if possible, not just reduce colors to the final result but also emulate chemical reactions in film, producing the "real color".
What always amuses me is how hard we strive to make the thing you're looking at look like it's shot through a lens even when it isn't. Like, games where when you're in the rain the drops are running down your eyeballs, or you get lens flairs, even when the character isn't wearing glasses.
@@darrennew8211 I hate all such photorealistic effects in games. They are just immersion breaking
Agreed: In all seriousness, the amount of science that goes into making high end glass for cine is far beyond looking up a couple "simple" lens configs on the web or in a book for this or that focal length, creating a glass like virtual material (which is an entire OTHER ball of scientific wax), and slapping them in front of the virtual cam...Optical Engineers most of us are not...I'll take my chances using VFX programs and compositing things together for filmic pieces; and, anyway, gaming doesn't "need" such photorealistic effects, which would likely take away from the feeling of "presence" that is unique to that medium.
It's just a clickbait title for an interesting experiment. But it still looks fake and it doesn't solve the issue the creator claims it solves. And it only really makes it look closer to real photography, it doesn't work for moving images and especially not for videogames.
The one important thing he's forgetting is that real life doesn't need any kind of filter to look like real life. Real life without filters looks perfectly realistic. You can use filters like lenses, photos, etc to highlight stuff or do it slightly different and it'll still look realistic. Not the case here where the input simply does not look realistic. All he's doing is removing details.
It's kinda like that far cry looks like real life image. With a small low-resolution image it's hard to tell the difference. Of course when you're playing the game in full-screen you see perfectly well that it does not look anywhere close to real life.
I don't understand? Why use a realtime rendering system just to make it no longer real time? Why not use a traditional renderer and get better results? This makes no sense
Content
He had an idea and pursued it.
Well you pretty much can't do stuffs like spherical bokas in post processing
you do realise unreal engine is an offline render engine as well right... and good luck moving around a 3d scene in blender at 200fps 😂😂
@@pyit.876 but then why not use something like Blender or Cinema4D? Why use Unreal, which the whole point is that it's real-time, to do fancy film effects which take multiple minutes per frame?
Lens Manufacturers -
Spend decades of development, & vast sums of money to create lenses with as little distortion as possible.
CG artists -
Your goals look fake.
I'mma go back & recreate early versions of your work, & make the rendering process even more inefficient & noisy. :p
I've had a think about doing this. Well done on actually making it work. It does look good!
2D comping kinda produces the "poor quality" images we are used to seeing, but not as well as playing with actual lenses.
Real lenses are as close to perfectly smooth curves as it's possible to get. CGI models are reated using planes, which produces errors. Sure, subdividing helps, but doesn't remove the error, & adds computational difficulty. Nurbs/curves model? I assume they are fine, as there's no distortion of the model needed.
I don't know if it's a practical solution for most productions, but it does look great!~
Post processing has one major advantage though. If you don't like the distortion, it doesn't take long to make & process changes.
If you have to re-render the scene because you dind't like a lans choice made before hand... Yikes!
Obviously, film makers all know the stuff I'm spouting already. ;)
That is so cool. When I did that in Modo about 10 years ago it took forever to render and I didn't even have the lens dirt and grit. It wasn't manageable for anything but still shots.
Well, all general CG application cameras are same as here - they are all clean, abberationless and no physical bloom. And nobody was complainin 'bout that from Houdini or Max users.This thing must be really super heavy to render. But I must admit it - the result is very sweet looking. Maybe too much for most cases, but lovely, aspecially softness of the image.
Do you have more rendered examples? Like a car, nature, etc? Thank you!
Hey, how do u import the camera to an unreal engine project from the downloaded virtual lens file?
Love - we've done this in the past in Lightwave, Blender Cycles, etc. So cool that Unreal can do it as well - and love the range of effects you're getting. Cant wait to have a look!
every piece of content you produce is a joy to consume! ️
@TuroBozo Hey thanks! Glad you're enjoying my work
This is beautiful and so creative! Nicely done Joshua, I look forward to getting stuck into these. Ill be sure to recommend this moving forward!
Hey man! That's awesome. How are you these days?
@@JoshuaMKerr Keeping well Joshua thank you for asking.
Been loving your virtual production stuff lately.
Can you explain how you made the lenses, because they should be quite complicated to calculate and also have dynamic focus?
I modelled them in blender using the Opti-Core plugin. I made the so they would work with a perspective camera's focus.
@@JoshuaMKerr lovely thanks! Would be fun to see a video how you did it. Lenses are so complicated 😱 I see why you say it took 6 months
I have a new video coming out that shows a bit of the process
Got an Unreal Engine advertisment before this video talking about how *Real it looks* 😂
Love this comment
Randomly ran into this and know it's far beyond my knowledge.
And I hope my amazement at what you as a single persin has done here is warranted.
Amazing.
This is AWESOME! I will defintiely buy those, thanks for making one free to test. Anamorphic definitely wanted as well. What is your Graphics card and does it slow down your render time?
I too am interested in ANAMORPHICS and ZOOM ... can your lens blueprints be applied on top of the anamorphics in ue5 (I'm sure not, but I had to ask) ,,, maybe in Jetset Pro?
Camera men in UE: wish that camera was not perfect.
Meanwhile me in real life: Wish that my camera was perfect.
Amazing video 🎉❤
My master's in computer science was on cameras in computer graphics and the main drawbacks that current systems in modern game engines have, which mostly boil down to an over reliance on post-processing effects to make up for inadequate lighting systems and overly abstract cameras.
Your video is a huge step in the right direction in bringing awareness to how game engine cameras need an overhaul if we want better-looking scenes. Great job 👍👍👍
this is gonna be the next step of virtual film making, good work Joshua !
Thanks so much man, It's been a long road i'm actually shaking with nerves putting this video out.
A leap backwards
Great effect, nice work! Really interesting to see how the "feel" of the resulting image is similar to the result I got with my virtual camera in blender, no doubt because the lenses you defined were similar to the triplet design I used.
I'm surprised with how fast it seemed to render. How long did it take (and how many samples) to get a reasonable looking result?
I tried it but sadly it doesn't work with the HDRI Backdrop in pathtracing.
Probably cos any backdrop doesn't have actual depth?
It should work. let me test
First video I watched. I love your geeky enthusiasm. I can totally imagine myself wasting days on doing this. Keep it up!
Glad you enjoyed the video. I love a bit of geeky enthusiasm:)
Looks cool, but it's going to be hell for sampling! You'll get a much grainier image, or spend much more time rendering.
The best place to do this stuff is in a projection shader (or whatever name unreal uses for that step), so the bokeh and lens roughness can be importance sampled.
Also if the engine treats camera rays as special (many do), it would preserve that classification.
Thats probably true. Im more interested in the possibilities across the next two version of unreal. adaptive sampling in 5.4 and an excellent new denoiser in 5.5 is coming.
Nice video and good technique, like that Blender guy did some time ago! But not a word on render time is kinda weird.
I think I covered it on a follow up video
@@JoshuaMKerr Good to know! Thank you
Hate to break this to you, but all these effects can be achieved a lot more efficiently in compositing. What would be really cool would be to figure out how to achieve them with a post process material that runs in realtime.
Can you change bokeh shape in post? If you pull this out your work will change the industry 😂
@@TomLee-lv8ql you can change the bokeh in engine. With a PPM
@@AnthonyDeCrocethank you but this was not my question. Can you change the bokeh in post.
@@TomLee-lv8ql true.
@@AnthonyDeCroce 🤗
Is there a way to use lenses like 16mm, 120mm, 200mm etc for example?
This is awesome. Was curious if you were gonna make any anamorphics right before you said it. Super excited to try these out.
Love what you've done here Josh!
But I have a question. It may be just me, but I think the focus should still be able to become tack sharp, like real cinema lenses, but that's not what I'm seeing. Also, in your examples did you show the frames with Path Tracing in mid-render? Or is that the final rendered frame (for example the frame at 5:49). If it is mid-render it would be nice to see the final render. If it is the final render it's super noisy. Maybe I'm just not understanding the examples? Thanks, Gary 🙃
Hi man, they're, all mid-render. I show final rendered images on the site with before and after examples. The lenses can absolutely become sharp, but my focus (pun intended) of this project was to try and get the aberrations first and foremost and a lot of that is driven by materials on the lenses and the aperture planes.
@@JoshuaMKerr Got it, that's what I thought, thanks. My brain just needs the before and after 🙃
Joshua, I admire your approach and result looks phenomenal. I have few questions as "conservative CGI artist" who used to DaVinci and Fusion and all that "2d" filter stuff. Why not Post Processing Material? It might be less performance hungry and can do the same effects. Yes it is much harder to make proper and need to know blueprinting or sometimes even C++.
How much samples would you need for this to render without fireflies? So it all boils down to: how much performance is affected with this approach?
Keep doing great stuff man!
The way he smiled when he revealed the models of the lenses ... tells me he just wanted to do it that way.
I'm also on the same question square with Postprocessed: Why not Post processing Material?
The point of the lenses is to simulate their effects with physical accuracy.
@@JoshuaMKerr, I wish you had compared it to an image that included the post-processing that can be added within UE5. You can approximate much of what you are doing with pretty minimal overhead. Of course, the through-the-lens shot will look more real than the unfiltered 'perfect' Cinema camera. That's why the other stuff exists.
@@JoshuaMKerr You should really look into a way to do this inside the pathtracer itself instead of a 3D mesh. 3D meshes aren't perfect, and to make them perfectly round, you need a lot of triangles, which is very expensive.
This looks amazing. Is there any chance of a realtime implementation?
I'm afraid pathtracer is the only way
Photographers spend tens of thousands of dollars getting great lenses that minimize these effects. This guy simulates the optics of a crappy disposable kit lens stored in a dust bin and is thrilled.
You have to start somewhere.
Because it s fun
He is a 3d modeler not a photographer
Photographers are not Videographers. Never wondered why there are so many cine lenses and seperated PHOTO lenses?
Once lens technology reached its peak and we had nearly eliminated all perceived flaws, it allowed us the freedom to use various lens characteristics as deliberate artistic choices. But because of the general attitude, it's more common to see such choices in filmmaking and in movie industry.
Great work!
Would there be a way to actually add the lenses where the lenses start? (Focal point) not just added on top of the final lens?
These lenses were built to work with Unreal's perspective based cine camera. To do what you're suggesting, you would need an orthographic camera to act as a sensor plane. This is possible but would alter the focal length of these lenses.
@@JoshuaMKerr Cool Thanks for the reply. I know very little about UR so thanks :) Cool vid btw!
This is a hilarious solution.
Hey Joshua, any idea why this is broken for 5.5? something has broken it
I need to look into 5.5. It may be something to do with substrate materials. Are you in the discord?
@@JoshuaMKerr I'm not but will join now
You really should have mentioned the guy who made a working camera in blender. AND he gives away his camera for free!
There are about three people (that I know of) who are doing awesome things like this but their work wasn't part of my process of learning and building this, and I'm creating for different software.
@@JoshuaMKerr you still got the idea from
Actually, that's not the case.
If man wants to charge for his work and time spent then let him do it. You scrubs want everything for free these days. Make your own.
Good for him. Maybe this guy's not in a position to give this away for free. Either way, it was a ton of work, and looks like it has real value.
If you don't want to buy it, that's fine. But don't complain that it's not free. Go make your own
This is insanely cool, I clicked on this out of curiosity even though I have never even touched unreal engine but as a vintage lens nerd who obsesses over these sort of characteristics in real glass I am just so impressed by how cool this is! Unreal should buy licenses for these from you and make it part of the application.
Haha that would he nice. But there's even cooler things coming.
Excellent work, I thought about that myself every time I try and render a cinematic out of unreal. Glad you found the answer!! Definitely going to purchase this.
Awesome, thank you!
I would say that the title is a bit misleading. Very interesting experiment and result, but the reason is because UE pathtracer is comparatively new and limited in features compared to an offline renderer like Octane that has all of this functionality built into the camera already.
If you're forced to path trace these effects in UE then I don't see why you wouldn't use a proper path tracer in a DCC? Nevertheless, I admire the passion and the achievement.
i wonder how far you could control for this with just a post-process material and render targets. like others have said, it looks like this will affect render times by a lot and create a much grainier image
You can't really fake bokeh in post processing
Excellent job. Amazing to see how proud you are of your creation.
Had a headache trying to make my own lenses in blender, CAN I GIVE YOU A KISS!
EDIT AND IT'S ONLY 15 GOOFY DOLLARS, ok you can get a hug for that one.
Edit 2: Where is the cart? It's pissing me off because I'm so excited.
This is really cool, I work on games and nothing in realism so I wasn't aware of this issue nor have I gotten into anything that would require the solution, but the idea of creating real world objects to better simulate reality in the simulation is very cool. Props to you on coming up with this, and I get some vicarious joy out of how stoked you are about your discovery and solution here.
Why do people want lens glare? Surely the end goal is to make everything look as it would with your naked eyes, right?
Unless you want to make something look like it was filmed with a camera
We don't shoot movies with our eyes.
It's what we're used to seeing in photos and films, and also, this effect actually does happen with our vision. It's called glare.
Chromatic aberration and lens dirt effects are the first things I always remove in games settings. My eyes don't have those things.
But it is surely possible to recreate eye lens effects no? Some fatigue, dryness, floating dirt in the eyes or on the surface, astigmatism..! Now that would be real 😉
Still amazing virtual camera lens you have there to transform light paths! Crazy tools for ultimate cinematics!
Yeah, I really don't like all those glare/blur/motion blur/distortion effects by default turned in every game. It's wild
Octane Render has all the things you have reated just built in from custom aperture shape, bokeh bias, all sorts of lens distortions, exponential glow, spectral dispersion glare, chromatic abberation, ACES tonemapping etc. I have never tested Octane in UE but since it´s free, it´s likely having less features, But if those feature are existent in Unreal, you will find them at octane camera settings and octane render post settings. Render settings is global. Camera setting override global individually
But anyway, creating all of this from scratch is crazy awesome. Kudos!
The things you listed are also in the Unreal cine camera. They're probably just tweaked wrong by default.
@StainlessPot The effects I'm making are physically based, thats the point of them. Not to say people can't use postprocess filters.
What he's doing is a little on the nerdy cinematographer side, so i can see how some would say its not needed considering UE already has camera effects. What's cool about it is that it opens the door for a similar process that cinematographers use which is choosing vintage, flawed or just hyper-specfic glass to achieve a certain look. And it's not just increased chromatic aberration or a certain boke shape, it's usually a mix of things that create a "secret sauce", a complex mix of variables (not just specs) that shape a unique look. What you get, and what cinematographers tend to look for, are those unexpected effects that are natural artifacts of the lens. Someone can engineer these types of lenses for other users to download. Like a Helios 44-2 Russian lens or Ultra Panavision 70 anamorphic from the 1970's.
Making "ies" spotlights like in real life, before the option exists in your renderer was a fun and epic game in the past. Well done man!
Thank you, I never thought of that as an equivalent.
@@JoshuaMKerr Basically, reproducing light behavior under certain beloved conditions with modeling or shaders etc tricks 😉
It is called unreal for a reason😂
Haha, I originally wanted to call the video Unreal...istic
This is INCREDIBLE!!! I really love seening the push in technology to go beyond the old limits of 3d work.
Going from trying desperately to get a human looking shape on screen to being able to virtually replicate how a modern camera lens affects small details which drastically change the image.
All of these things you describe as being (potentially) great are fine, as long as you are making Interactive Movies/Stories. They are in fact garbage if you are trying to make Video Games or Simulations, in which the player/user is _not_ looking through a camera, but only out of his own eyeballs.
not everyone's vision is crystal clear 20/20. We have floaters, focus issues, near sightedness, far sightedness, uneven strength in each eyeball, so next would be to model a pair of eyeballs to look through
@@Sajith050683 With respect, I honestly can't tell whether or not you're serious.
Actually, it's quite a good idea for a horror game where the main character has poor sight. Damn I need to get into game dev now haha
Actually, a lot of new games that people think look very realistic are achieving that effect in part by employing visual artifacts associated with cameras and imperfect lenses. Look at a game like Unrecord for example.
@@JoshuaMKerr My apologies... I didn't realize that you weren't speaking with respect to video game development.
P.S. I had been watching videos on video game development and commented too hastily.
Joshua, this is so cool! As you may or may not know I made a video a while back about real life anamorphic lenses in Unreal Engine 5, but this is just taking that to next level creativity - I love it. Also as mentioned, it reminds me of the camera video from Blender haha. Shame it's not for Lumen which is what I use 99% of the time. 🤔
Hi, I don't think i've seen that video, could you pop me a link here. Would love a look.
@@JoshuaMKerr Sure, it's this one: ruclips.net/video/3n-lBb_DCl4/видео.htmlsi=SEc9C9OQTYQUwCz5 😄
bro show us a clip were you use the lenses in a video please, your selling something but you are not showing us a final result 🤕
Did you even watch the video LMAO
@@y0blue yes I did those samples were in the editor not a real render sample
@@chelo111 whether it is in editor or rendered is irrelevant - it's cinematic preview, if anything it's just a lower quality preview of the final render.
@@chelo111 also wtf do you think 2:20 is
@@y0blue thank you bro, i like that, i just wanna see in a real render, i wanna see the asset in action if you know what i mean....that's all. josh a real one we all know that
Why are you using previews for your demo? It would be much more interesting to see the final render.
If you want to see a really good render, there's some at the end of this video.
ruclips.net/video/lUgUt5IsMZU/видео.htmlsi=7KzaB27LHKcMr4Fy
This is lovely work and a clever solution!
This is impressive. Well bloody done! I don't have a use for them yet, but if I ever do, I'll be supporting you. And I'll be telling everyone about this.
Thanks so much! Glad you like the idea.
Also in VFX we delens plates in Nuke.....often in CG we dont render with lens distortion and apply the lens distortion in Nuke. Also in VFX we do NOT use Bloom. You might want to try manual focus in your camera settings. Your Bloom looks horrific...and comp sup in VFX would slay you. Your example of Bloom looks like someone licked the lens LOL. You might want to rent an Arri, Sony or Red camera.
Should've stopped at ARRI, in the context of filmic looks. Sony & RED are well known for their "too digital" looks. Hell, even an ARRI can be made to look like digital trash, just look at how Marvel used them...
And the bloom does not look lens-licked, unless you do enjoy the Marvel look. It very much looks akin to what a lot of vintage lenses would output on film.
But pedantry and the fear of finding new ways to do things and iterating upon them, vs the status quo, are art in themselves, I guess.
@@justaguyfromeurope I have no problems with new techniques. I love and use Unreal proffesionally in VFX.....im just dont agree with his look and methodology ...hes not getting the most out of UE. You dont have to fudge a camera..you can get all camera and mathematical lens specs on Cinematographers guild. And yes I have worked with Red, Sony, Arri from some of the world's top cinematographers. But if you would be happy to list your IMDB credits Id be happy to take your thoughts on Movie cinematography seriously.
@@justaguyfromeurope also with fudging lens....means he wont get plate line up.....cameras have to match what was shot on set....adding cheap 2D effects doesnt make in camera DOF....Also all VFX CG needs to be undistorted for comp. I would also recommend using a proffesional compositing program like Nuke
Great video, when Cyan created their masterpiece 'Riven' they used real-world textures to map onto their geometry, they are currently remaking this in Unreal and the textures don't look anywhere as realistic as the original. Great advice here, love it.
I'm already 20 seconds in and I'm already asking: "how much is this gonna cost me".
I just bought the pack! What setting in the blueprints or component of the lenses make them specifically compatible with 16:9? Can we adjust something to our liking so they work with other filmbacks?
They were modlled to cover a 16:9 Digital Film sensor. They don't automatically scale fro different filmbacks at the moment. Deffinitely something I'd like to implement.
Is this stolen content from “I made a real camera inside of Blender [or Unreal engine]”?
definitely not stolen content
@user-mn8lz7gf6d Definitely not
Love it! Will you be putting these on the Epic Marketplace??
That's the plan!
phD in unreal engine for sure
too kind ;)
I love this. It may not be efficient for workflow, but it takes the finickyness and mysticism out of how to get from point a to point b When you're searching for a look and you don't know how to get there, and you don't have real life vintage camera's and lenses to use for reference. You also get an incredible result in a way that moves several layers of abstraction. This is such a great tool, even if it's just used for learning, or a keyframe diagnostic/reference when you send renders down the pipeline.
Honestly, the renders without the lens are way better to me.
It's always going to be down to your personal choice.
This is a must have man! I am starting my Virtual Production journey on UE so I'll take this right now! Much love
Hey man, thanks for saying. dont forget to join the discord too
As a VFx artist we use cameras that the production shot with. You can get camera data out of Nuke.... There are also cameras you can get cinematographers guild with exact camera specs with lens packages from the academy... No offense your home made cameras are trash.
did you map the glass of the lens's internals the same as how lens are in reality?
Lenses come in many different types. These are based on cooke triplet lenses.
I plan to branch out into other lens desgins.
You saw a video of a random guy on RUclips that explained how to create lens, and you thought: "why not?"
😂👍🏻
the cycle continues
I love how you had an idea, tried to solve it and it works. maybe even better than you expected. you can see it from start to end on your face. Love it! Great job.
We absolutely need anamorphic lenses. 🔥🔥 Beautiful work!
Beautiful! Like the Blender plug in. Great job 🙌
which blender plugin?
With upcoming new Denoiser with Unreal 5.5 and more future improvements in this field... I'd say this man is Ahead of his time. Great Work Man. ❤
YES! I've been saying this in my replies to comments. Well done for being the only one who mentioned it. Also 5.4 has adaptive sampling.
Amazing!! Can 4 lenses be arbitrarily attached to lenses such as 18mm and 16mm or to lenses such as 120mm and 200mm?
This is exactly what I've been doing in Blender for the last few weeks - glad to see others loving lenses and having success with virtual optics in other software
What's the link/name of that Blender add-on?
@@eladbari I've been building the lenses myself from spec - no add-on
@@hanktremainI saw a video on that topic which was super interesting. But it affects render times significantly right? Is it even plausible to render out animations with it in Blender!
@@JuanGoreTex Yeah, absolutely adds significant time to your renders. I've been rendering animations with success, but it's all subjective as to what you're willing to accept for render times and how complicated your scene is to begin with.
@@hanktremain Got it, thanks for explaining. I may give it a shot for fun
This is so amazing! I have been trying to recreate a lense on Adobe Substance to some degree i made it work but it was not perfect. This takes it to another level.
It's so bizarre and unbelievable to know that a lens has been virtually created and has working functions just like in real life. Awesome job!
Really glad you enjoyed the video and the results. It's been a long time coming and it's only the beginning.
when i press render it unattaches from the camera. how to fix this?
Have you attached the lens in sequencer?
omg this is amazing. will it work with megalights? thanks!!
Its pathtracing only I'm afraid
Could you do. tutorial on placing the lenses? I can't get my images to focus when using them. I really want this to work!
There's a tutorial with the download. Don't forget to join the discord too
how does this effect how you approach virtual production? Do the artifacts and virtual focal lengths make it harder to realistically composite your live action footage?
Early days, but these are the questions im working on right now.
@@JoshuaMKerr awesome man, can't wait too see what you come up with
I’ll be honest, it looks like you just turned on a tonne of image blur.
Want. Where can I buy?
Downloading and looking forward to checking out with more purchases
From my own experience, if we speak about realism in animation, a main problem for Unreal and any other realtime rendering (and even for most software renderers) is a lack of good antialiasing and motion blur. Blooming can be simulated mostly in screen space. Lense flares also, with some tricks like rendering a second pass with a larger FOV. What cant be done efficiently is good antialiasing and motion blur. This type of physical lenses can be simulated accurately without a need of using raytrace. It's enough to use multistep rendering.
no you can't get this with traditional screen space effects, this is physically accurate LOL
Impressive work, although this product is not for me. I prefer to have the msot control until the end of production, so I do not want to bake these imperfections in engine. I also think that these shots are way too soft - I use vintage glass daily in my film work and I usually shoot with 50+ years old lenses and they still have less halation / blur and are much sharper than your examples, so I hope it is possible to regain some sharpness even with these virtual lenses. What makes lenses truly unique is not only imperfection part but color rendition and contrast. Some lenses are better than others in this field. Nonetheless - good work, hope you will continue to develop this. :)
Really cool stuff, Joshua! Looks great 🔥
I adore finding people who do things that not only do i need actively, but to see them solve it in a way i'm far too stupid to do myself.
Too much time on my hands probably
In game development you'd usually achieve this using shaders. Either by adjusting/rewriting the standard surface shaders or adding different shaders for certain objects. The standard surface shader in unreal does have a cartoon look to it.
Thats fair. This is a filmmaking and virtual production channel though. I use Unreal as a renderer quite a lot.
This is so cool! I don't even use Unreal but love the concept and execution of this. I also totally understood that laugh at 2:06 🤣 spending 6 months on something for it to work properly is quite the dedication hahaha
Really nice work Joshua!! You're such an inspiration! Keep it up brother 🔥🔥🔥