watch this in 1.5x speed to render 4,492 times faster in cycles
lol nice
lmfao
watching this with 635x speed in blender cycles so it could render 1,901,825 times faster
Recorded the tutorial at half framerate and forgot to slow down 💀
2:06
"Fast means fast"
-CGMatter 2024
Lmao
2:05
"Fast means faster"
~CGMatter 2024
Corrected 👍
a better explanation: it gets rid of global illumination, so there's no light bouncing, kinda like Eevee.
There's a case to be made for INCREASING the size of the final image rather than reducing it. If you render at 200% there are 4 times the pixels, so you reduce the samples by a factor of 4. This keeps the number of "samples per image" constant, so in theory it should take the same amount of time; but the additional render passes used by the denoiser are almost free, since they take just one sample for the most part. That is to say, the denoiser has more data for the same amount of render time, so it performs better and doesn't take longer. Also, always use multi-pass denoising.
this is amazing info - it adds some minor extra seconds with layers of compositing, but compared to the quality increase it's worth it. I'll add that adding grain/noise in post diffuses pixels and means you can get away with really low samples and denoising, then add that grain back in at a higher res.
i rendered a video at 4k and 86 samples came out pretty good i think.. but also 60fps.. fast denoise.. it's pretty basic scene though i guess.. my first real one with blender.. it's on my youtube chan main page if anyone interested
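The constant "samples per image" trade described above is simple arithmetic, sketched here in plain Python (the resolution and sample numbers are made-up examples, not taken from the video):

```python
# Doubling the render resolution quadruples the pixel count, so dividing
# the sample count by four keeps the total sample budget (pixels * spp)
# constant. Example numbers below are hypothetical.

def scaled_samples(base_samples: int, scale: float) -> int:
    """Samples per pixel to use when resolution is multiplied by `scale`."""
    return max(1, round(base_samples / (scale * scale)))

base_w, base_h, base_samples = 1920, 1080, 256

for scale in (1.0, 2.0):
    w, h = int(base_w * scale), int(base_h * scale)
    spp = scaled_samples(base_samples, scale)
    print(f"{w}x{h} @ {spp} spp -> total samples {w * h * spp:,}")
```

Both lines print the same total (530,841,600 samples), which is why the render time stays roughly constant while the denoiser gets four times the pixels to work with.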
I'm glad no geo nodes were used, just a classic CGMatter adventure
Dude, that brings back old traumas from my studies. Because that's exactly what the math lectures were like in the first semester. The professor wrote on the board at the same speed as he spoke. He was like a machine gun: talk, write, erase the board, talk, write... 😂😂😂
That unexpected " No difference :( " cracked me up for some reason
one of the best cycles optimization vids but you actually never want to use random noise seed plus denoiser since it ends up with more flickering on your final animation
the static noise seed does the same noise flickering if your meshes have an animation, no?
@@br11bronc34 Not sure tbh, I don't usually use the default denoisers. It makes some sense though: even small sampling differences from frame to frame can cause a flickering issue. Given that, I'd rather have this problem in an area with a deforming mesh than across my whole frame.
Use a temporal denoiser; combined with a relatively clean image, it gives the best results from what I've seen.
brainrot speed
🗣🗣🗣🗣🗣🗣
And unfortunately it's working on me 😭
I watched this on 2x speed
Lol 😂 how broh 😮
I'm usually watching his videos on 1.5, but 2x is hardcore
@@VitaliiThe I'm built different bruh
to make it 5990x faster...I see
Howdy, Chuck?
Important note: superscale in DaVinci requires studio version. Don’t down-scale your resolution unless you have a tool that will upscale at the end.
and even then, superscale won't be as good as exporting it at 200% resolution..
Rendering via terminal is really good for computers with low amounts of RAM like 16gb
would you elaborate it more?
Huh?
Not having the Blender software running and taking up RAM leaves more RAM for your render @@qulzam685
@@helloworld_2472I think they mean “can you talk more about why rendering via terminal is good for computers with low amounts of RAM? What specifically makes that technique so good?”
@@qulzam685 When you render via terminal, it doesn't have to open the project in a Blender window with the UI, editor, tools, etc., which take up memory.
It just opens the renderer and renders with the project's render settings. It also doesn't have to show you the image preview, which takes memory too.
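As a rough sketch of what "rendering via terminal" looks like, here is a hypothetical launcher built around Blender's standard CLI flags (`-b` background/no UI, `-o` output pattern, `-F` format, `-s`/`-e` frame range, `-a` render animation); the file name and frame range are placeholders:

```python
# Headless (no-UI) render launcher sketch; assumes `blender` is on PATH
# and that scene.blend is your project file (placeholder name).
import subprocess

cmd = [
    "blender",
    "-b", "scene.blend",          # -b: run in background, UI never loads
    "-o", "//frames/frame_####",  # output pattern; #### = frame number
    "-F", "PNG",                  # output image format
    "-s", "1", "-e", "250",       # start/end frame (set before -a)
    "-a",                         # render the full animation
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually start the render
```

Because no window, viewport, or image preview is created, the memory that would normally go to the UI is left for the render itself.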
The speed takes me back to the fast tutorials, it's been a while
And I did not forget the rap one
honey , cgmatter dropped a new vid
honey: "he already sent me last night lol"
@@sicfxmusic oh my... 🤭
Probably the most useful video about render optimization on YouTube. Literally saved my life!
It’s incredibly helpful! I’m cutting my frame render times down by huge amounts! I don’t need super-duper renders while I’m learning and building showreels of text animations; most people wouldn’t know the difference if you’re putting up 5-sec samples on social media. It gets around hardware limits.
It's a pretty good tutorial, but I think you should render with 2 or 3 different settings and show us the difference between them at the end.
Also, maybe show us how to optimize render settings for a scene with volumetrics, as volumetrics are very heavy to render.
Thanks for the video tho.
Agree! These are great tips and I've used most of them, but I usually get way worse quality, especially when rendering something organic.
He seems very confident in these methods (ik ik, he said at the end that "this is an extreme example"), but the confidence portrays that the quality is not affected that much, so I wanna know: what's the catch? How much is the quality reduced, etc.?
If you have finer details on your textures and if your camera is further away from your models, I don't recommend downscaling the render resolution because the denoiser will get rid of the details as it has less pixels to work with. Just a heads up.
There isnt a single channel where the learning is equal to the laughs. Thank you for everything you do
It can all be done faster, but if you want to make a photorealistic render, you still want to keep the quality high. It would be ideal if EEVEE rendering were used as a template for Cycles rendering: calculations would be made in the background and put into a database, and if there were any changes to your visual, it would be updated in the background.
NGL I had to double check if I had my speed setting in 2x. Bro is transcending.
These tips reduced a 4.5hr cycles render to about 28 minutes with very close to the same image quality! Thank you!
Applying the subdivision and decimate modifiers makes the build faster in some scenes, in my experience. Even with Persistent Data activated, it consumes less RAM, and it's useful if you're testing the settings for your scene.
Three things I really appreciate; the very useful content, the speed of delivery, and the advertisement at the end which I watched because it was all worth it.
What a good time to be a live. Thanks for making this possible CGMatter
All of the computation power went into making the fastest render for this tutorial, so that almost nothing was left for the microphone export.
No, but awesome tutorial as always! A gem every time
This is the kind of tutorial that solves problems effectively, thank you very much
I guess persistent data does not work as I hoped with the current hair curves.
I'm making characters with fur; it takes like 1.5 minutes of preprocessing/applying modifiers before anything gets sent to the GPU for rendering.
Which makes sense; most of the time the character is animated, so the hair system needs to be rebuilt from scratch for the current mesh deformation.
Tip: Optix is actually better for denoising in the viewport compared to the final render. imo open image denoise is better for final renders.
Bro is indirectly challenging unreal engine
Found this at 4am instead of sleeping, worth it 😆
I appreciate the speed of this info dump. ❤❤
I'll never forget the day I heard a movie effects guy reference you in an interview. I can't remember where I saw it, but I see why they did lol.
what guy
@wydua2049 I have no idea; searching "cgmatter referenced in interview" doesn't bring up much.
I just remember he was working on an animated movie and saying how a video he did really helped him solve an issue he had.
@@jayhawk184 blender Bob?
@rennightmare I'm not sure but I like the idea of people helping find him so CG himself can see
Normally I watch YouTube at 1.25-1.5x speed, but yours I have to slow right down. You are the Zero Punctuation of Blender (if the Zero Punctuation guy had actual talent)
What laptop do you use? I've been on Mac for so long I'm reluctant to switch back to PC, but even the latest M3 render stats are so far behind that I'm thinking it might be time to bite the bullet and just get a gaming computer or something with an Nvidia GPU.
For sampling and tiling I'd suggest using a number lower than the number of cores in your GPU; that way all the cores can work as hard as they can. This only applies to older models, as you'd have to use a ridiculous amount of samples and huge resolutions.
Is this available in normal speed? The audio is glitching, and if we drop it to 0.5 (which I suppose is normal speed) it's glitching even more.
Just watch in .075
@@matbob56 Still glitchy sound
If you have a realistic scene like this, using camera culling is not a good thing, because if you have reflective materials but nothing behind the camera to reflect, the result will be ugly... Or even worse, objects that were previously hidden may appear as the camera rotates or moves, so things pop in out of nowhere. In an interior, the right approach is to keep the things that are needed turned on and turn off the things in other rooms that can't be seen, for example.
And be careful, because if you lower the samples too much and use denoising you can get artifacts...
It's a guaranteed-click video. To do even better, you could render with the CPU at 10000 samples, then switch to the GPU at 128 samples with denoising, so you can change the title of the video to "render 10000x faster" :)
I've been following you for a long time and I respect you, and I know that you are a master and know all these things; in fact I wonder why you made a video like this, which in my opinion is clearly useless and without explanation...
If a beginner who wants to learn Blender for the first time followed all the steps in your video, they would never want to use this software again, because the result would not be nice to look at.
But oh well... do as you want.
Hey, I was wondering if you could put out a tutorial about tracking objects with markers that occasionally go off screen. For example a hand: when it rotates, a lot of the markers can't be tracked. Is there an efficient way to track an object like that?
when you were referring to programs like davinci resolve for upscaling...is that like AI upscaling or...?
Thank you so much you helped me a lot!! I was so frustrated that my animation took so long to render.
I always thought there should be an option to not render objects that are outside the camera boundary, but today I got to know it already exists... I feel stoopid!!
you are a maniac (in the best way) 👏
Getting some flickering shadows after using some of this methods, what do you reckon would be the issue? is it maybe seed?
The only fast render tutorial you will ever need.
Perma Bookmarked
Increasing the noise threshold to 1 is what you need.
I actually can't believe I could've been using Cycles this whole time. I thought my PC just wasn't good enough, but I realized the one thing making it so slow in Cycles was that I hadn't set up my GPU render settings correctly.
Saw Cycles, ended up taking notes for Eevee as well. Many thanks.
This video is the future of youtube..
3:20 under rated Jedi tip!
So pure.
Bruh, I had to watch at 0.75 as I was missing some stuff. Dude goes relentlessly
Or, you know, for this kind of scene you could just bake all the lighting into textures and then render almost instantaneously at full resolution with Workbench.
You're the best:) Thanks for video, very informative.
great tut..thanks as always
Is it possible to use EEVEE Next from Blender 4.2 in an earlier version, such as 4.0?
i watched the video in 2x and i love it :D
thank you, this was informative and directly to the point.
the culling settings are not showing up with boxes for me to check?
Thank you for making a no-nonsense tutorial where you don't talk about your cat's diet and other random things that some other people do in tutorials
familiar feeling of walking into the wrong room
This looks like some initial knowledge next to deleting the default cube. 😮
This video is also 2995x faster than an average Blender video.
Thanks ❤
What version of blender is this? I started in v4.1 and don’t know the earlier versions!😂
best way is to go 4K and use low samples.
I prefer the results I get with openimagedenoise over Optix denoise. But Optix is faster. I don’t mind the slightly slower denoising.
CGMatter except every time he optimizes render speed the video gets faster
Yea thanks for tutorial mate, Now I can see future render frames while opening blender 👍
Amazing ❤❤❤ thank you!!! EVEE please too ! ❤❤❤
Wow caseoh I didn’t know you animated
thanks a LOT !!
Temporal savings is why blender should have optical flow built in
dude, how does DaVinci recognize a sequence although it's missing every second number? Best video ever about this topic, though!
Can you make a video about EEVEE? is all of this still necessary?
Fast GI Approximation is making my frosty glass dark grey (black glass) with no light passing through it. Why is that?
bro made me slow the vid to speed up the render. worth.
We need a checklist of this hahaha
Using the command line is longer than using Blender xD haha. I'm on 4.2; anyway, the difference is about 2 seconds, from 16 to 17.5 seconds. It looks like persistent data doesn't work under CMD. // EDIT: same speed, I did more tests.
Thank you for this video
I disagree with a few things, but the majority are helpful
With an RTX 3050 laptop GPU, should I use the CUDA or OptiX setting in Blender?
OptiX makes the most out of RTX
@@syaoranli7869 Okay, cool. I did set it right. Thanks and cheers.
👍🏻👍🏻👍🏻👏🏻👏🏻👏🏻👏🏻👏🏻thank you!!!
I found out by doubling the playback speed of this video I basically doubled your render speed too.
2:54 love this, this is great, but unless you wanna become a VFX master you can say goodbye to EXRs...
OpenImageDenoise: Quality but slower
Optix: Faster but worse quality
I was always wondering: if you converted your meshes into simple shapes and used parallax occlusion mapping on the faces, how different would it look, and how much faster would it be? (like on the default cube tutorial, just for every face)
Just set play speed at 0.75 and good to go
Ninja technique for making a long video short
Good stuff!👏
Why not increase the noise threshold from 0.001 to 0.1? It speeds up the render 1000 times
Change the White to blue denoise ! You can decrease the samples better !
huh ?
You mean use blue noise, not blue denoise; that's a new feature from 4.2, which released this week
1:47 What texture limit should I keep for movie renders, 2048 or 4096?
Before AI was all about stealing art, I took a video I rendered of a spinning Suzanne coin at 15 fps and... 480... and used Topaz to upscale to 60 fps and 1080. Over 90% of the end product was extrapolated. I then rendered out the same video. There were very remarkable differences - the AI botched the reflection something awful. But one took less than an hour, and the other an entire week. Definitely worth it if it's not meant to be a finished product.
BUT you now have to contend with negative AI sentiment and AI hallucination will trigger that in a hurry.
Thank you
Step 1: switch samples to 1
Step 2: win
The result is full of artifacts though (at least the "times faster" shots we are seeing)
Thank god I have GTX 1650 & AMD 5500 and after those updates I can render so much faster and better now
Yo i didn't know they added camera culling to render, that's gonna save me more time.
My brain hurts so much, why you playing us like this 😭😭😭😩
Hey, I have a problem: I have an area light with some node groups inside, with some drivers that depend on the area light's parameters. Problem is, I can't figure out how to duplicate them so the drivers are updated to use the duplicated area light's parameters and not the original's.