At lower sample rates you're always going to see some noise; that's just the nature of Monte Carlo rendering. Large studios don't need to worry about render times as much as small studios and freelancers do, since they render on farms and can go into the thousands of samples even in IPR. This comparison was mainly meant for people working from home, enthusiasts and small studios. Cheers :)
Min 64, Max 4096 is the production preset for RenderMan, but as most people who have worked on big productions will tell you, there is never one set number. You will always need to tweak the numbers based on characters, environments, interior vs exterior, glass, reflections, etc.
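The min/max idea can be sketched in a few lines. This is a minimal, illustrative adaptive sampler, not RenderMan's actual algorithm, and the two "shader" functions are invented for the example: every pixel gets at least the minimum sample count, and sampling stops early once the estimate is stable.

```python
import random
import statistics

def adaptive_pixel(shade, min_s=64, max_s=4096, threshold=0.01, seed=0):
    # Always take min_s samples first, then keep sampling until the
    # estimated error of the mean drops below the threshold or max_s hits.
    rng = random.Random(seed)
    vals = [shade(rng) for _ in range(min_s)]
    while len(vals) < max_s:
        if statistics.stdev(vals) / len(vals) ** 0.5 < threshold:
            break
        vals.append(shade(rng))
    return sum(vals) / len(vals), len(vals)

# Two invented "shaders": a flat wall converges early, while a
# high-contrast glossy area keeps sampling until the cap.
smooth = lambda rng: 0.7 + 0.01 * rng.random()
noisy = lambda rng: rng.choice([0.0, 1.5])

_, n_smooth = adaptive_pixel(smooth)
_, n_noisy = adaptive_pixel(noisy)
print(n_smooth, n_noisy)  # sample counts adapt to scene content
```

This is why no single preset works everywhere: the threshold and bounds interact with how noisy each material and lighting situation is.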
The main issue in your approach to Redshift was treating it like Renderman in sample settings. Instead of cranking up the scene samples during the test, you only needed to increase the Brute Force ray count. A min/max samples of 4/16 would've covered all geometry related aliasing, and any blurred reflection noise could've been controlled with the overall reflection samples. Also trying the 'auto samples' feature would've been worth the trouble. The other was using Optix. Optix is amazing for WIP rendering and generally terrible for final quality rendering. A better test would've been a dual Altus approach. You can use low sample counts without the issue you saw on the wallpaper appearing even with low samples. When PRman gets fully ported to the GPU it will once again be worth considering for smaller studios. Until that time, it's just not worth the investment in the additional hardware that's required to get through frames.
They do; look at the drawers of the desk at 3:04. The effect is minimised by the strong key light, however. The angle of the blinds is similar to the key's as well, so there's only a small amount of shadow that can be cast.
@@SmallRobotStudio Perfect thank you, so essentially the blind blades are relatively parallel to the key's light paths, and whatever shadows are left get washed out by the multiple bounces. Thanks for clearing that up!
9:54 Compare the wood grain on the lower drawer, and also some of the detail in the painting on the wall -- there is a definite softness when you go down to 128 samples.
Editing the video plus V/O and graphics was probably about a day's work; converting the scenes was about 5 hours each, and the initial modelling of the scene was about 2 years ago so I can't recall; I did it over the course of a month in my free time. Rendering time was spread out throughout a week with testing as well. In all, I spent way too much time on it! ;)
More light rays per pixel are calculated. For a 100% glossy material it makes no difference, as all rays get reflected at the same angle, but for every diffuse material light is scattered in different directions, so with 1 sample you get extreme noise. With every additional sample more information is calculated, and adjacent pixels differ less from each other. So no "extra" calculations are done; the same one is just repeated more often to get a smoother, "more accurate" result.
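The averaging described above is easy to demonstrate numerically. This is a toy sketch, with the `shade_once` radiance function invented for illustration: the frame-to-frame spread of a pixel's value (the visible noise) shrinks as the per-pixel sample count grows, roughly as one over the square root of the sample count.

```python
import random
import statistics

def shade_once(rng):
    # One random light path: fake diffuse radiance with true mean 0.75.
    return 0.5 + 0.5 * rng.random()

def render_pixel(samples, seed=0):
    # A pixel is the average of its per-sample radiance estimates.
    rng = random.Random(seed)
    return sum(shade_once(rng) for _ in range(samples)) / samples

def noise(samples, trials=200):
    # Spread of the pixel value across independent renders = visible noise.
    return statistics.stdev(render_pixel(samples, seed=s) for s in range(trials))

print(noise(4) > noise(64) > noise(1024))  # True: noise falls as samples rise
```

The square-root law is also why doubling samples often looks like "little difference": halving the noise takes four times the samples.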
Thanks for sharing this info, though imo the denoising masks all your efforts to show the real picture. The image also suffers from it; it's better to have some noise but keep all the details.
Hi! Can you share how you do the post effect? I mean, how do you do the volumetric lighting in compositing? I'm a Redshift user and I know rendering volumetrics takes a lot of time. BTW, this is a very good comparison. Thanks for sharing!
Did you use bump to roughness in Renderman? I think that really improves the surface details. Also, Renderman being available in Blender now is fantastic.
At BCon 22 there was a talk about how and why Blender struggles with this... I think it could have been the talk about the new upcoming BSDF, the one about light importance sampling, or the one about path guiding; sorry, I don't remember exactly... too many good, interesting talks this year. Maybe you can talk about that in the future again. P.S. Thanks.
@@SmallRobotStudio That, and because they can scale up easily while us mortals cannot :) And Renderman still has some of the best SSS solutions next to Arnold, and produces amazing micro-surface detail if you brute-force it without denoising. Quality-wise, Renderman and Arnold are easily the best out there at this point.
Idk, this didn't convince me 100%... It seems that Redshift is the absolute winner in terms of quality/time overall... In Renderman and in Blender a lot of texture detail is gone. Imho, with some different texture settings and some light adjustments the other render engines could reach the same quality. It's strange that the Cycles denoiser destroyed the wood texture so much. And looking at the images, I noticed that Cycles has more texture in the wall wood and less in the desk, and the opposite in Redshift. So I'd say the problems come down to the settings.
Comrade, in the first segment you mentioned that it took you 3 days to render, but when you talk about Renderman and Redshift at the end you don't mention how long they took to render. Help us out on that.
Eevee is a real-time renderer. It can't really compare on a technical basis, though I wouldn't be surprised if you could get it looking pretty close with a lot of tweaking.
In Blender, for the portal light, you could have changed the ray visibility of the walls you added so they don't contribute bounces. I would have preferred this video without any denoising. I hate denoisers most of the time; they look so mushy, especially in this scene. Perhaps YouTube compression is adding to that. Would have loved to see the 2048-sample render with no denoise out of curiosity. Thanks for sharing. It felt like more of a denoiser comparison than a renderer comparison, even though each used Optix. Also would have loved to see an Eevee version using the Eevee G.I. addon. Noticed the chrome on the chair legs in Renderman looks nasty; is that a rust texture?
At 512 samples you're getting negligible benefits in Redshift, but you can still see the stepped antialiasing on all the specular highlights (the desk drawer handle is egregiously stepped), even when you reduce the threshold amount to 0.0001 to try to address it. There is no control specifically for antialiasing specular highlights in RS, and there should be, as its method for resolving burned-out pixels is useless compared to a spectral renderer with floating point color precision. Antialiasing in Redshift is wack, Optix is wack, and it's these two things that make me switch to Octane at the start of projects.
You need to learn to use the Altus denoiser. Optix sucks and requires many more samples to function correctly... Altus is slightly tedious at first, but you can quickly tune it to work even at terrible sample rates.
Better lights but bad shiny reflections (Cycles); better quality of details but artifacts on diagonal lines (Redshift); smoother image but too flat (Renderman).
GPUs are still nowhere near where they need to be to handle production scenes steadily. CPU still leads in that aspect, and thus XPU should be thought of by users as an excellent look-dev tool.
@@iceman10129 Depends on the GPU cluster, what the cluster is doing, and whether the scene fits in VRAM. CPUs are slow but have access to massive amounts of memory and scale well. I have access to multiple DGX systems, so XPU will scream once released. Look dev will be great for sure, but Octane, Redshift, Maxwell (somewhat), etc. can already do final-frame whole scenes, and XPU should be no different.
@@GoatDirt While Redshift is really good for production, especially smaller-scale projects, it will definitely have problems on massive projects where you have mad artists using textures a couple of hundred megs in size each and other such crazy stuff. A good example would be the Moana beach example scene, which I believe is also on the Renderman site for download somewhere. GPU will struggle with that amount of texture and mesh data compared to CPU, but it really all comes down to that. For everything else you are better off with Redshift right now.
If you're thinking in terms of render times, keep in mind that my GPU is 4 years newer than my CPU in this video (check the specs at the start). Renderman is really well integrated into most DCC pipelines, so a lot of studios use it, and since they're rendering on farms, CPU load times aren't as much of an issue. If you're a freelancer (like me) you'd likely go with something like Redshift or a similar competitor, as it scales a bit better.
@@manojlogulic4234 Hey mate, thanks! I'd love to do more, but it was an absolute mission to translate this scene 3 times and run all the renders! I think my computer (and my power bill) might not like any more. Also, I don't own V-Ray, so that might be difficult (though there's some version available in Blender apparently; I haven't looked into it properly yet...)
Regardless of render engines, Blender is a bad boy.... I mean AWESOME!!
E-Cycles is better than the standard Cycles and, since version 2.93, has a new AI denoiser based on the Intel denoiser; it's really great with 16 samples ;) Good video to see the comparison.
Yes, there is a thing called "portals" in Blender; however, it doesn't act the way portal lights act in Maya. It doesn't restrict light rays from an environment to ONLY be shot through the portal; it merely defines the importance of those rays, as I said in the video.
It's been 2 years; I think an updated video on this one would be very useful, because we could also see and compare how much they have improved and changed since this video.
Redshift looked the best to me here. I mean, look at the wood textures: in Redshift you can see the details of the wood texture, while in the other render engines it's sort of blurred out.
Blender does one thing the others may not do: texture mipmapping (it will deliberately load a lower-res version of the original texture depending on distance from the camera).
That's something usually used in video games, actually.
If you are concerned about looks, you should use Arnold. I use both RS and Arnold. I've used other renderers and my favourite is definitely Arnold. But I'm poor, and using it for pro work is too slow... especially when compared with Redshift, which is insanely fast.
I don't think it's the denoiser's fault; look at the wood textures... Blender is worse on the desk but better on the plank wall, and Redshift is the opposite. Imho it's mainly because the texture and light settings work differently. Maybe he didn't spend much time trying to match the material looks.
@@Mima_the_vengeful_spirit this should be a technique regardless, considering distant objects have no reason to be high resolution, right?
@@captureinsidethesound Mipmaps only lower the texture quality. A more accurate way to simplify an object depending on distance would be through LoD (Level of Detail), which is simply lowering the polygon count as the distance increases.
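For reference, the mipmapping side of this exchange can be sketched in a few lines. This is a rough illustration, with the scene numbers invented: a mip chain halves the texture resolution per level, and the standard selection rule picks level = log2 of the number of texels covered by one screen pixel (so distant surfaces sample smaller, pre-filtered copies).

```python
import math

def build_mip_sizes(base):
    # Mip chain: each level halves the resolution until 1x1.
    sizes = [base]
    while sizes[-1] > 1:
        sizes.append(max(1, sizes[-1] // 2))
    return sizes

def mip_level(texels_per_pixel):
    # Standard rule: level = log2 of the texture footprint per screen pixel,
    # clamped at level 0 for magnified (close-up) textures.
    return max(0.0, math.log2(max(texels_per_pixel, 1e-6)))

print(build_mip_sizes(1024))  # [1024, 512, 256, ..., 1]
print(round(mip_level(8.0)))  # distant surface, 8 texels per pixel: level 3
print(round(mip_level(0.5)))  # close-up, magnified: level 0
```

Note that mipmapping is as much about avoiding shimmering/aliasing (pre-filtered minification) as about saving memory, which is why it complements rather than replaces geometric LoD.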
Wow! The amount of work you must have put into this.. Awesome work and really nice and balanced!
Thanks!
For the blender denoising issues, have you got animated noise turned on? Because in my experience that is what causes most of the flickering like the floor in your scene. If you are using OPTIX denoise I would recommend turning animated noise off so you can retain the same noise pattern between frames. This will massively help with your flickering.
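The seed behaviour described here can be illustrated abstractly (a toy sketch, not Blender code): an animated seed gives every frame a different residual noise pattern, which a denoiser turns into visible flicker, while a fixed seed keeps the pattern stable between frames.

```python
import random

def render_noise(frame, animated_seed):
    # Stand-in for a render's residual noise: with an animated seed the
    # pattern changes every frame (flicker after denoising); with a fixed
    # seed the same pattern repeats, which temporal denoising handles well.
    seed = frame if animated_seed else 0
    rng = random.Random(seed)
    return [round(rng.random(), 3) for _ in range(4)]

assert render_noise(1, False) == render_noise(2, False)  # stable pattern
assert render_noise(1, True) != render_noise(2, True)    # flickering pattern
```

The trade-off is that a fixed pattern can read as a static "film grain" stuck to the screen, which is why animated seeds exist in the first place.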
The Redshift one is the best looking, showing the most texture detail, leather shine, and realistic light rays going everywhere.
It has jaggies/aliasing on edges that the other renders don't have
Rather than increasing the sample count, I would increase the image resolution to fight aliasing. Then you do the compositing on the high resolution and finally scale it down - noise and aliasing goes away.
Yes. Sometimes doubling the samples makes little difference.
how do you increase image resolution in blender?
@@afrosymphony8207 Just render it at a higher resolution.
This is one of the easiest ways. The problem is that for the final result to come out in 4K, it has to be rendered in 8K...
@@blur-r5n Instead of rendering at native 8K, which is pretty much impossible for any conventional PC (unless you are using server-grade hardware), I recommend just using an AI upscale from 4K at that point.
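The render-big-then-scale-down trick in this thread is essentially supersampling. A minimal sketch (pure Python, with a synthetic black/white "render") of a 2x box-filter downscale shows where the smoothing comes from: each output pixel averages four rendered samples instead of one.

```python
def downscale_2x(img):
    # Box-filter a double-resolution grayscale image down: each output
    # pixel is the mean of a 2x2 block, i.e. 4 samples instead of 1.
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard black/white diagonal edge "rendered" at 2x resolution...
hi = [[1.0 if x > y else 0.0 for x in range(8)] for y in range(8)]
lo = downscale_2x(hi)
# ...comes out with intermediate grey values along the edge: antialiasing.
print(lo)
```

The same averaging also suppresses noise, which is why this helps both problems at once, at four times the render cost.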
I just built a desktop with an RTX3070 card and finally have enough horsepower to actually render bad animations with annoying fireflies in them. I was frustrated with my lack of info but this video makes me realize how difficult these issues are to overcome completely with the current Cycles in some setups.
This is an excellent tutorial and the first time in about a year that I've bothered to read YouTube comments - everyone is contributing excellent, thoughtful information and our host is doing a great job replying. Subscribed. Thanks!
Thank you for taking the time to make these videos.
For blender, are you using sRGB or Filmic colourspace?
Filmic
With Cycles you have to keep your albedos and speculars under control, as you mention in the pinned comment, but you also have to have control over other things, like light intensity, clamping and exposure all together.
Also, the denoise result you are getting is not the best one you can extract; you can use an advanced denoise technique in the compositor using passes, and you will be amazed at how much texture detail you will recover :)
Keep in mind that 0.6 in value is white, not 0.8; 0.8 means white snow, so a SUPER HIGH albedo. The same with specularity: the usual value should be around 0.2, 0.5 at most; 0.8 is veeeeeeeeeery bright.
But your glasses are metallic I think (I may be wrong); you should NOT be using the Specular value at all, it should be at 0.0 and Metallic at 1.0. Not sure if that's what you have there, just wanted to mention it.
If you want some help with Cycles and some collaboration, just tell me, I'll be glad to help if you think I may be of help :)
BTW, caustics are not the only thing that can cause fireflies. We use them, but there are ways to "diffuse" them if you need to; there is a value in Cycles precisely for that. Caustics are needed if you want proper light behaviour, though; otherwise you are cutting out one big part of the light's journey :)
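The clamping idea mentioned here (Cycles exposes it as the Clamp Indirect setting) is easy to show in miniature. The sample values below are invented for illustration: one rare, extremely bright path among otherwise well-behaved samples dominates the pixel average, and clamping trades a little energy for a stable result.

```python
def average(samples, clamp=None):
    # Averaging raw per-pixel samples; optionally clamp each sample first,
    # which is roughly what an indirect-light clamp setting does.
    if clamp is not None:
        samples = [min(s, clamp) for s in samples]
    return sum(samples) / len(samples)

# 63 plausible samples plus one "firefly" from a rare caustic path:
samples = [0.8] * 63 + [500.0]

print(average(samples))             # ~8.6: one bad path ruins the pixel
print(average(samples, clamp=10))   # ~0.94: clamped, close to the true 0.8
```

Because clamping discards real energy, caustics and bright highlights dim if the clamp is set too aggressively, which is the trade-off the comment alludes to.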
Would love to see this revisited with Cycles 3.x and Intel Open Image Denoise. Such a good denoiser.
You could have made the walls and just disconnected the BSDF shader so they appear black and reflect no colour.
This is a really good tip, thanks!
Blender uses a colour profile by default called Filmic, not sRGB, so that could be why the colour shift is different in Blender/Cycles.
Yeah, I added ACES to Blender and it's a game changer.
@@Keyboard_ink yeah how so?
@@AshleyGlover This is the tutorial I followed ruclips.net/video/B7FWNNDXBl0/видео.html
I'd love to hear your take on Luxcore!
Me too!
Personally, I like Luxcore for its caustics, so I'd recommend it if there is a lot of glass or water in your scene, but for everything else Cycles seems to perform better, with equal or slightly better results. This is coming from someone who needed to compare these render engines professionally for a client.
@@ImaSneke I think no one will disagree that Cycles performs better at this point - especially with Cycles X! However, I don't think many would argue that Cycles is a more beautiful or realistic renderer than Luxcore. It's held by many to be in the top tier of beauty and realism, with Corona etc. Cycles is super flexible and awesome at what it does, but it's not that inherently pretty, imo.
At the end of the day, the image is the product. The tools you use will change depending on the job and what it requires. You can do all kinds of machinations to try to get certain results from certain software, but that's not efficient if you know of a better tool. You can usually tell who the noobs and amateurs in the room are when they go apoplectic about software choices ("Blender isn't used by professionals!!") ... while the pros realize that the software used can either make the job harder or easier (depending on the job) - but the end result is the image ... and yes, the chances of finding jobs that use your ffffavorite software might be good or bad, depending ... but also be wary of places that demand software loyalty. Pixar, for example, lets their modelers use whatever they want - as long as the output can port to their pipeline.
For fireflies, try playing around with the Clamping settings in Blender. You can also set up geometry to cast shadows, but not be considered for GI bounces, etc - but you are correct in that Blender's portal lights are a bit weird (compared to Arnold and Redshift).
I've used a lot of different software (starting with Crystal 3d on SGI) and a lot of different renderers (like 10+) over the past 29 years. They all have weaknesses or things they just suck at - and in those cases you have to work around the limitations or possibly composite in a fix - again, the image is the product.
Thanks for putting this together - very informative. It would be nice to see the comparison if you get the fireflies under control (again, clamping is your friend - as are tiny adjustments in Roughness and watching if you're getting too much Specular vs Metallic overlap). Cycles is still evolving, and it's actually nice to see that even though it's got some holes (cough - caustics - cough) it's not THAT far behind. I use it every day to make a living ;)
Redshift looks really crisp like it always does, has some slight AA issues on some edges but the speed alone and the low amount of hassle to go through to make it work should be reason enough to use it over the other two.
Renderman will most likely get a big boost when XPU gets production ready, but that could very well be a few years off. Been waiting for XPU since 2018 and if Pixar is bad at one thing it is the speed at which they roll out things.
A detailed and enjoyable video. We need a new version of it, just for the eye candy.
Interesting analysis. Still, the Redshift render has bad aliasing on some very bright edges; this is a problem I often encounter too.
Yeah I actually didn't notice that until you mentioned it as I was so focused on noise, worth pointing out!
Afaik this is totally natural and "realistic"; it's just that real cameras end up adding a small glow to these areas, so you don't really notice them. That's the explanation I've seen given a couple of times by V-Ray devs/team in their forums, at least. Anyway, if you still want to clamp them down (as I suspect some renderers do to save rendering time), Redshift has sampling controls for it in the sample filtering section; this will make it render faster as well. Check out the docs, they show some pictures: docs.redshift3d.com/display/RSDOCS/Output?product=maya#Output-MaxSubsampleIntensity
@@jangrashei1752 Hey Jan, thanks for the advice, I didn't know about the sub-sampling option to fix aliasing. I might do a bit more research into it and make a video so others are more aware :)
These edges also look better with a tonemap, like an ACES ODT (ACES to sRGB, for example).
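A rough illustration of why a tonemap helps these edges, using a simple Reinhard curve as a stand-in for a proper ACES ODT (the pixel values are invented): with a hard clip to display range, an antialiased edge pixel saturates to the same value as the highlight itself, so the smoothing gradient is destroyed and the edge reads as stepped; a tonemap keeps the edge pixel at an intermediate value.

```python
def reinhard(x):
    # Simple Reinhard tonemap: compresses unbounded HDR radiance into [0, 1).
    return x / (1.0 + x)

hot, bg = 5.0, 0.1           # blown-out highlight next to a dim background
half = (hot + bg) / 2.0      # antialiased edge pixel: 50% coverage average

clipped = [min(v, 1.0) for v in (hot, half, bg)]
mapped = [reinhard(v) for v in (hot, half, bg)]

# Clipped linear: the half-covered pixel clips to the same 1.0 as the full
# highlight, so the AA gradient vanishes and the edge looks jaggy.
print(clipped)  # [1.0, 1.0, 0.1]
# Tonemapped: the edge pixel keeps an intermediate value, so the gradient
# survives and the edge reads as smooth.
print(mapped)
```

This is also why clamping bright samples before the AA filter (as in the Redshift setting above) has a similar visual effect.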
That was a massive amount of work you put into this, thanks man really appreciated the info. Peace out.
This test was done before the release of Cycles X, which brought a whole kernel update; Cycles now has features like path guiding.
4:56 Those sample numbers seem small to me. I’m accustomed to specifying several times that to get good-quality renders out of Cycles.
I have the same problem with Blender; increasing the resolution eliminates that effect.
Redshift in this shot looks way better than Renderman, which is surprising to me. I really thought that since Renderman is a Pixar thing, it would be the best.
This was a very informative video. Subbed instantly!
Interesting test. Thanks for sharing. I have production experience using Renderman (Reyes), Mental Ray, VRay, Arnold, Redshift, 3Delight, and even Maxwell. I just wanted to point out that I've never worked at a studio that actually used post process denoisers. Is this commonplace with using RIS? Even when I use redshift, these types of denoisers typically introduce artifacts in animations. Is there a new use case for them that I'm not aware of...
Hey Raphael thanks for your comment. That's a valid point, I imagine that's the case with most studios. For freelance, which is what I work in mostly, I use denoising as I'm restricted in my computing power unless I choose to farm it out (project dependent). With Redshift I will usually use Atlas as it tends to provide a cleaner result, though longer denoising time. With RiS I believe Disney talked about using the denoiser back when it was introduced for Big Hero 6 (I think, I'm stretching my memory here) but don't quote me on that. I know people using RiS at major studios at the moment and will ask them and try to get back to you as to whether denoising has in fact become practice. Cheers!
@@SmallRobotStudio I see. Yeah, I would love to know what Pixar's intentions were with post-process denoising. I mean, if it works out well with things like motion blur, tiny details, and soft gradations in values over frame sequences, it could be an edge. And if you are already having some good results, I sure would like to learn.
Thanks for this video! It was super informative. I'm on the border of using Blender's Cycles or sticking to my Maya workflow and just getting a Redshift subscription.
I'm curious, which did you decide on?
?
@@jbdh6510 I'd rather go with Redshift and Arnold for Maya. Arnold's new GPU render option works great, but I don't like that some shaders still aren't working on it. The fog shader, for example, doesn't work right now, but if they fix all the shaders I'll end up using Arnold's GPU option as my main, then Redshift, and then Cycles as my third option.
@@jbdh6510 I also tried both Cycles GPU and Arnold GPU rendering on commercial projects so final product choice was pretty important for me
@@EvelynHernandez Thank you for the answer! I am now focusing on game development and I use Unity. But if I do a side project I might try Redshift!
I've noticed some bad aliasing in redshift. I've never seen any jaggies in cycles, it just gets a little blurrier with lower resolution
Wow, there was some quite surprising information in this one! Thanks for the work! I actually like certain aspects of every image, so I can't really say that one is better than the other. Blender has had some significant updates since this video, and I guess the other software has as well, so I'm wondering how much has changed since then.
which gpu was used to render this? or Did u use CPU to render?
Great video! I tend to prefer Altus as a denoiser though. Results are much crisper and without much in the way of flickering, even with lower sample rates. Optix produces mushy garbage results in my opinion. But everyone has their own way of working. I’m sure you also have RT option ticked on in the experimental options tab too. That can make a huge difference on some scenes.
I also find better results using brute force for primary and secondary GI which seem to render better results in my experience than irradiance cache which can contribute to flickering renders.
I bounce back and forth between Redshift, Octane and Cycles4D depending on the project, but for a few years Redshift has been my go-to renderer.
2:42 I’ve never used RenderMan, but I recognize the same sort of denoising blotchiness from Blender Cycles.
They all seem to have problems with low samples. How many samples are usually used in big CGI productions?
At lower sample rates you're always going to see some noise; that's just the nature of Monte Carlo rendering. Large studios don't need to worry about render times as much as small studios/freelancers, since they take care of their rendering on farms, so they can go into the thousands of samples even in IPR. This comparison was mainly meant for people working from home, enthusiasts and small studios. Cheers :)
Min 64, Max 4096 is the production preset for RenderMan, but like most will tell you that have worked in big productions, there is no set number ever. You will always need to tweak numbers based on characters, environments, interior vs exterior, glass, reflections, etc.
The main issue in your approach to Redshift was treating it like Renderman in sample settings. Instead of cranking up the scene samples during the test, you only needed to increase the Brute Force ray count. A min/max samples of 4/16 would've covered all geometry related aliasing, and any blurred reflection noise could've been controlled with the overall reflection samples. Also trying the 'auto samples' feature would've been worth the trouble.
The other was using Optix. Optix is amazing for WIP rendering and generally terrible for final quality rendering. A better test would've been a dual Altus approach. You can use low sample counts without the issue you saw on the wallpaper appearing even with low samples.
When PRman gets fully ported to the GPU it will once again be worth considering for smaller studios. Until that time, it's just not worth the investment in the additional hardware that's required to get through frames.
Why don't the blinds cast shadows through the portal? Great video nonetheless. How did the adaptive sampling turn out?
They do; look at the drawers of the desk at 3:04. The effect is minimised by the strong key light, however. The angle of the blinds is similar to the key as well, so there's only a fine amount of shadow that can be cast.
@@SmallRobotStudio Perfect thank you, so essentially the blind blades are relatively parallel to the key's light paths, and whatever shadows are left get washed out by the multiple bounces. Thanks for clearing that up!
@@GoatDirt Exactly :)
Make a tutorial on those god rays and the complete compositing
ruclips.net/video/CL6BdLAZuSk/видео.html
@@SmallRobotStudio thanks ❤️
9:54 Compare the wood grain on the lower drawer, and also some of the detail in the painting on the wall -- there is a definite softness when you go down to 128 samples.
Any tips on how to get rid of those fireflies? My biggest issue and it's so damn annoying.
Use another render engine! There are tons out there.
Very in-depth and a lot of effort, thx mate!
You did great! I've done this stuff a few times, never easy. I prefer Renderman because it gives a noise-free animation. Next please add Octane..
Very insightful video. One can see that you put a lot of effort into it.
If I may ask how much time did it take you to make it?
Editing the video plus V/O and graphics was probably about a day's work, converting the scenes was about 5 hours each, and the initial modelling of the scene was about 2 years ago so I can't recall; I did it over the course of a month in my free time. Rendering time was spread out throughout a week with testing as well.
In all, I spent way too much time on it! ;)
@@SmallRobotStudio You truly have my deepest respect for it. Keep it up! ^^
@@reym388 Thanks mate :)
18:20 Another option that can help with simple cases is the Despeckle filter in Blender’s compositor.
To be fair, that would break the test slightly as you are introducing something outside of the base render though.
@@iceman10129 So is the denoiser.
RedshiftRT is out now :D
newbie here, what exactly happens when you increase the sample rate? what extra calculations are being done?
More light rays per pixel are calculated. For a 100% glossy material it makes no difference, as all rays get reflected at the same angle, but for every diffuse material light is scattered in different directions, so with 1 sample you get extreme noise. With every additional sample more information is gathered and neighbouring pixels differ less. So no "extra" calculations are done; the same one is repeated more often to get a smoother, more accurate result.
@@thefynn thanks appreciate it
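The averaging idea described above can be sketched in plain Python. This is a toy Monte Carlo estimator, not any renderer's actual code: each "pixel" averages N random light contributions, and the noise (spread of per-pixel estimates around the true value) shrinks roughly with the square root of the sample count.

```python
import random

def estimate_pixel(samples, rng):
    """Average `samples` random light contributions for one pixel.
    The true value here is 0.5 (uniform light over [0, 1])."""
    return sum(rng.random() for _ in range(samples)) / samples

def noise_level(samples, pixels=2000, seed=0):
    """Standard deviation of per-pixel estimates across many pixels,
    i.e. how 'grainy' an image rendered at this sample count looks."""
    rng = random.Random(seed)
    estimates = [estimate_pixel(samples, rng) for _ in range(pixels)]
    mean = sum(estimates) / pixels
    var = sum((e - mean) ** 2 for e in estimates) / pixels
    return var ** 0.5

# Noise drops with the square root of the sample count:
# 16x the samples -> roughly a quarter of the noise.
low = noise_level(16)
high = noise_level(256)
```

This is also why doubling samples never doubles the quality: going from 16 to 256 samples costs 16x the work but only cuts the noise to about a quarter.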
Thanks for sharing this info, though imo the anti-noise thing masks all your efforts to show the real picture. The image also suffers from it; it's better to have some noise but keep all the details.
What do you mean by the anti noise thing, covering it up in post?
Could the noise be fixed for specific AOVs, not for the beauty shot?
Wonderful! Thank you!
Be interesting to see how things have changed in a couple months time
Cycles X dropped and it's really good; I can render 1024 samples in a minute for a fairly complex scene.
Omg I can't even assimilate what all these configs do, imagine nailing them
Redshift kind of looking good right now
Thanks for this!
Is Cycles still this flickering in animation?
Redshift shows some highlights on the shelf supports, the ones near the corner of the room. Why is that?
14:05 If you made those extra walls 100% black, you wouldn’t get any light bouncing off them.
19:23 I’m thinking, at this point, things might look better without denoising.
You should use SID for denoising in Cycles.
great video. very helpful thank you
Hi! Can you share that how to do the post effect? I mean how to do the volumetric lighting in compositing. I am a redshift user and I know render the volumetric will take a lots of time. BTW This is very good comparison. Thanks for the sharing!
ruclips.net/video/CL6BdLAZuSk/видео.html made a tutorial a few years ago on how to do it, hope it helps!
@@SmallRobotStudio Thank you so much!
Did you use bump to roughness in renderman? I think that really improves the surface details. Also renderman being available in Blender now is fantastic
At BCon 22 there were talks about how and why Blender struggles... I think it could be the talk about the new upcoming BSDF, or the one about importance light sampling, or the one about path guiding that covered the topic... sorry, I don't know exactly; too many good, interesting talks this year...
maybe you can talk about that in the future again...
p.s. thx..
Cool, I'll try to check it out sometime, thanks for the heads up!
That's why you don't use optix for production... I can't believe they are actually removing the default denoiser.
Question: why do Pixar and the big movies make everything with Renderman only?
Why wouldn't they use their proprietary software?
@@SmallRobotStudio That, and because they can scale up easily while us mortals cannot :) And Renderman still has some of the best SSS solutions next to Arnold and produces amazing micro surface detail if you brute-force it without denoising. Quality-wise, Renderman and Arnold are easily the best out there at this point.
Is this cycles? or Cycles X?
The chair could be desaturated because it is not linear by default.
I think Blender's denoiser needs more resolution rather than more samples, and I think Redshift is the best for render time.
Idk, this hasn't convinced me 100%... It seems that Redshift is the absolute winner in terms of quality/time overall... In Renderman and in Blender a lot of texture detail is gone. Imho with some different texture settings and some light adjustments the other render engines can reach the same quality. It's strange that the Cycles denoiser destroyed the wood texture so much. And looking at the images I noticed that Cycles has more texture in the wall wood and less in the desk, the opposite of Redshift. So I'd say the problems come down to the settings.
But, after all, I can't understand: why are the images (textures and details) so different between all three? 😵
Comrade, in the first segment you mentioned that it took you 3 days to render, but when you talk about Renderman and Redshift at the end you don't mention how long they took to render. Help us out on that...
There are graphs throughout the video showing the render time for each renderer.
I'm interested in how LuxCore would handle the task; it should be great imo.
If I had it I'd have a look; it's not in the budget unfortunately.
Think colors in Blender cycles would be closer if you use ACEScg color space 🙂
All scenes were using linear colour space, switching all to another colour space would have yielded the same results
This can be done with Eevee, right?
And try clamping caustics in Cycles.
Eevee is a real-time renderer. It can't really compare on a technical basis, though I wouldn't be surprised if you could get it looking pretty close with a lot of tweaking.
@@ashrude1071 : O, I know, I've been trying ^_^
It's going to get easier with coming changes, though (2.93 is a big improvement already!)
you should try luxcore render
A few people have said that. If I had the spare bones I'd like to compare other renderers, but I'm restricted by what I have access to, unfortunately.
Is the HDRI from Guanajuato?
Yes!
@@SmallRobotStudio My hometown!
Redshift for sure
Redshift easily takes the cake. The other 2 are fine, but far from great.
Funny, in Cycles you put 3 in Total bounces; that means nothing will go above 3 even if you put 6 in Transmission.
In Blender, for the portal light, you could have changed the ray visibility of the walls you added so they don't contribute bounces.
Would have preferred this video without any denoising. I hate denoisers most of the time. They look so mushy, especially in this scene. Perhaps YouTube compression is adding to that.
Would have loved to see to 2048 sample render with no denoise out of curiosity.
Thanks for sharing. It felt more like a denoiser comparison than a renderer comparison, even though each used Optix.
Also would have loved to see an Eevee version using the Eevee G.I. addon. Noticed the chrome on the chair legs in Renderman looks nasty, is that a rust texture?
Redshift FTW
Please don't use Optix for animation. You can get a clean render with 128 samples in Redshift and the Altus denoiser.
9:13 Seems like you really want to be able to render different parts of your scene at different sample rates.
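That wish is essentially what adaptive sampling does. Here is a minimal, renderer-agnostic sketch of the logic in Python (the `shade` function and its noise levels are made up for illustration): keep firing samples at a pixel until the estimated error of its mean drops below a threshold, so smooth regions finish early and noisy regions get the extra budget.

```python
import random

def shade(pixel_is_noisy, rng):
    """One light sample: a toy stand-in for tracing a ray.
    'Noisy' pixels (e.g. glossy reflections) vary a lot per sample."""
    spread = 0.4 if pixel_is_noisy else 0.01
    return 0.5 + rng.uniform(-spread, spread)

def adaptive_pixel(pixel_is_noisy, rng, min_s=4, max_s=256, threshold=0.005):
    """Keep sampling until the standard error of the mean drops below
    `threshold`, or the per-pixel sample budget runs out."""
    values = [shade(pixel_is_noisy, rng) for _ in range(min_s)]
    while len(values) < max_s:
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        if (var / n) ** 0.5 < threshold:  # standard error of the mean
            break
        values.append(shade(pixel_is_noisy, rng))
    return sum(values) / len(values), len(values)

rng = random.Random(0)
_, smooth_samples = adaptive_pixel(False, rng)  # converges almost instantly
_, noisy_samples = adaptive_pixel(True, rng)    # burns its whole budget
```

Cycles' adaptive sampling and Redshift's unified sampling both work in this spirit (a noise threshold plus min/max samples), though their actual error estimators are more sophisticated than a plain standard error.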
Hey, this video should be updated; it is currently popping up in the YouTube algorithm...
Once I have rm24 running I will
@@SmallRobotStudio great ;-) btw your stuff is really cool
Cheers mate :)
Try this again with cycles x
At 512 samples you're getting negligible benefits in Redshift, but you can see the stair-stepped antialiasing on all the specular highlights (the desk drawer handle is egregiously stepped), and that's when you reduce the threshold amount to 0.0001 to try to address it. There is no control specifically for antialiasing specular highlights in RS and there should be, as its method for resolving burned-out pixels is useless compared to a spectral renderer with floating-point color precision. Antialiasing in Redshift is wack and Optix is wack, and it's these two things that make me switch to Octane at the start of projects.
You need to learn to use the Altus denoiser. Optix sucks and requires many more samples to function correctly... Altus is slightly tedious at first, but you can quickly tune it to work even at terrible sample rates.
Better lighting but bad shiny reflections (Cycles); better detail quality but artifacts on diagonal lines (Redshift); a smoother image but too flat (Renderman).
nice information thank u
Cheers, you're welcome :)
Can't wait until Pixar gets their XPU released. My GPU cluster will wipe my CPU cluster out easily.
Same here
GPUs are still nowhere near where they need to be to handle production scenes steadily. CPU still leads in that aspect, and thus XPU should be thought of by users as an excellent look dev tool.
@@iceman10129 Depends on the GPU cluster, what the cluster is doing, and whether the scene fits in VRAM. CPUs are slow but have access to massive amounts of memory and scalability. I have access to multiple DGX systems, so XPU will scream once released. Look dev will be great for sure, but Octane, Redshift, Maxwell (somewhat), etc. can already do final-frame whole scenes, and XPU should be no different.
@@GoatDirt While Redshift is really good for production, especially smaller-scale projects, it will definitely have problems on massive projects where you have mad artists using textures of a couple hundred megs each and other such crazy stuff. A good example would be the Moana beach scene, which I believe is also on the Renderman site for download somewhere. GPUs will struggle with such a high amount of texture and mesh data compared to CPUs, but it really all comes down to that. For everything else you are better off with Redshift right now.
So... why would anyone use Renderman?
If you're thinking in terms of render times keep in mind that my gpu is 4 years newer than my cpu in this video (check the specs at the start). Renderman is really well integrated into most DCC pipelines so a lot of studios use it, and since they're rendering on farms CPU load times aren't as much of an issue. If you're a freelancer (like me) you'd likely go something like Redshift or a similar competitor as it scales a bit better
Octane where
nowhere
In Blender I’ve gotten better results with NLM denoiser for final renders
I'll be doing a full optimising video with this scene for blender and I'll make sure to compare the results of NLM as well
@@SmallRobotStudio please include vray and arnold in this testing. Btw, awesome analysis, love it. Thanks 🍻
@@manojlogulic4234 Hey mate, thanks! I'd love to do more but it was an absolute mission to translate this scene 3 times and run all the renders! I think my computer (and my power bill) might not like any more. Also, I don't own Vray so that might be difficult (though there's some version available for Blender apparently, I haven't looked into it properly yet...)
sound like a podcast
update it for cycles x
Looks like all renders were made with Unreal Engine.
For anyone having trouble with artifacts and noise in Blender: render at 4K with 100-200 samples and it'll start looking much better =)
The only downside being that the render will take 10,000 years.
@@julienceaser4018 1 minute max for me per frame
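The supersample-then-downscale trick suggested above works because box-filtering a 2x render averages four independent noisy pixels into one, halving the noise. A toy demonstration in Python (the flat 0.5 "image" and uniform noise are stand-ins for a real render):

```python
import random

def render(width, height, rng):
    """Fake render: true value 0.5 per pixel plus heavy per-sample noise."""
    return [[0.5 + rng.uniform(-0.25, 0.25) for _ in range(width)]
            for _ in range(height)]

def downscale_2x(img):
    """Average each 2x2 block into one pixel (simple box filter)."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def noise(img):
    """RMS deviation from the true value 0.5."""
    px = [p for row in img for p in row]
    return (sum((p - 0.5) ** 2 for p in px) / len(px)) ** 0.5

rng = random.Random(1)
full = render(200, 200, rng)   # oversized "4K" pass
small = downscale_2x(full)     # delivered resolution
# Averaging 4 independent pixels cuts the noise roughly in half (1/sqrt(4)).
```

The same averaging also kills aliasing on edges, which is why this doubles as the supersampling approach to antialiasing; the trade-off is that a 2x-resolution render costs about 4x the render time at the same sample count.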
mm blender is not that bad, cool!
Renderman is the best.
Yes, much better quality than the crappy Cycles, which is no different from Eevee.
boss lvl mic
Regardless of render engines, Blender is a bad boy.... I mean AWESOME!!
E-Cycles is better than the standard Cycles and has, since version 2.93, a new AI denoiser based on the Intel denoiser, and it's really great with 16 samples ;) Good video to see the comparison.
In my opinion Altus is way better than Optix.
13:30 I think there is a setting called Portals in Blender Cycles.
6:44 in this video ruclips.net/video/8gSyEpt4-60/видео.html
Yes there is a thing called "portals" in Blender however it doesn't act as portal lights act in Maya. It doesn't restrict light rays from an environment to ONLY be shot through the portal, it merely defines the importance of those rays as I said in the video.
Redshift won here..