For those complaining that I used OptiX on the 3070 instead of CUDA for a more proper comparison: I'd say this is the more realistic use case. I mean, if you're working and need the fastest viewport, why would you use CUDA in the first place when you paid for RT cores? By the same token, you can't turn on OptiX on 10XX cards.
OptiX is a game changer. I compared results between my desktop GTX 750 Ti, which took ages; my laptop GTX 1650, which took over a minute; and my new RTX 3080, which took almost 1/10th of the time the 1650 did!
@@cyberghost7777 optix is still missing some features sadly. like bevel on rendertime
@@PascalWiemers you mean bevel node? It will support it in blender 2.92 if I remember right.
@@BertWhite nice to hear!
@@PascalWiemers @Bert I can confirm it's implemented and working right now in the 2.92 Release Candidate (should come out this week); I didn't know that, so thank you for letting me know. CPU+GPU has also been implemented (it also uses RAM in addition to VRAM, preventing crashes from "out of memory" errors, provided your scene isn't so complex that it takes even more memory than your system + GPU can provide). All that is left to implement is baking support and maybe branched path tracing, but that might be removed entirely in favor of more options for regular path tracing.
Holy shit. This is exceptional. Due to the microchip shortage, I'll probably be unable to upgrade for another year, but damn, this is extraordinary.
upgraded yet?
@@kratoziscool2501 Yup! Just ordered mine the other day!
@@ChernobylTaco How is your 1060 holding up? I got one already but Im just wondering what you think of it
Impressive! Thank you for sharing. I have a 1060 too and am waiting for an upgrade 😅
I have a 750Ti on desktop man 😂
Just got the 1060 in a used PC package. It's super fast. The 3090 is better, but damn is this fast!
@@ratisok best gpu for this price, agree
@@cyberghost7777 i have amd radeon hd 5800 2 giga😀 not cuda 😀
@@DREAMTEFO Man you gotta upgrade! I just got my RTX 3080 and I must say the optix viewport denoising is crazy fast!
To be honest, it only takes like 2 seconds for the details to be clear in the gtx 1060
And i am a very patient guy
So i think it's a good deal
Dude, how??? I am using the same exact card. Idk what I'm missing: when I hit render in Cycles it takes YEARS to render one frame, and on top of that the device heats up like crazy and turns into a jet. Do you have any advice to optimize it?
how much is ur vram?
@@OtherPeople159 6gb vram
This is where ampere architecture shines
Yeah, and it utilizes the special rt cores design for Ray tracing acceleration
So gud
Even a basic 2060 shines with OptiX, in OctaneBench it can be as fast as a Titan RTX rendering with CUDA (granted, the amount of memory is a bit different, but still).
That's super fast, so just imagine the 3090!
u will not see any visual differences between 3070 and 3090 (excluding rendering speed)
@@ivananetdoma8382 actually my friend got the 3090 and he's saying you can't tell the difference between viewport and render since it's all real time
@@TheShinyMooN So it all comes down to rendering performance, hence the price of the 3090 I suppose compared to the 3060 Ti
@@blenderbreath7119 I'll get myself a 3060ti for sure no need for extra fancy things
@@TheShinyMooN Doesn't it come down to CUDA count which makes the rendering time difference? and the 3090 has significantly more CUDA cores.
Bruh why use evee when cycles becomes real time? Jeezus...
The 3090 does not give real time in Cycles... we probably won't live to see such times.
@@666Azmodan666 And even if it does on this current version of Cycles, in the future, Cycles will be upgraded even more.
@@666Azmodan666 technically it's not real time, but you see a clear image in less than half a second, so it's pretty fast.
@@creper9000 It's generally called interactive real-time preview, which is what threw many people off course when Turing was released and they kept saying "Radeon Rays has had real-time RT for X years! Nvidia is scamming everybody!", but of course it's not the same as producing 60 fps.
@@666Azmodan666 Yeah right wait until 2023 I'll come back to this comment
I cannot wait for my 3060 to arrive, this is a 3d modeller treat
Have you bought one
Mom, i need a 3090. To work.
That is SO true.
That's 1:1 the upgrade I'm about to make. Thanks!
Compared to my GTX 970, the GTX 1060 looks good enough for me... thanks for the comparison.
For me, after changing to a 3070, it's insanely fast in the viewport.
Whats your cpu?
@@atf56 Ryzen 7 1700X. I don't think the CPU matters much for Blender.
Same applies for maya too?
I just bought a 3070 FE, I was on 1060 right before. I confirm: it's impressive.
how much faster do you find your renders taking?
@@lv-cxs approximately 4x for my works
This is exactly what I was looking for. Thank you.
I have the exact same GTX and I was planning to buy an RTX 3070! Thanks for the video, really enlightening. Can you share your other machine specs too?
32 gb ram, ryzen 3700x
Ampere shines in out of stock technology
imagine a CPU compute beside this
I can imagine; it's already faster than human computation ability. If not, it will be by around Blender 3.0, as its code continues to improve at crazy speed.
Did you make that? It looks scary. Awesome work btw.
I have a gtx970 for cycles and Vray...
I think is time to upgrade ahah 😁
Laptop with a GTX 860M here lol. Fortunately my daily work only needs rendering simple objects. Still saving money to build a new PC 🙂
Thanks for comparison!
that's some serious RAW power
Great improvement.
Yes, there's a difference)) Cool model.
In comparison, and if you take energy consumption into account, the 1060 is not so bad... better than my card; I would be happy with the 1060 too :-). Can you make a render example as well, maybe? Not only viewport.
The energy consumption is crazy. I just got my 3080, and my UPS would shut down every time while trying to play a game (tested CyberPunk 2077). Had to order a bigger one
That is one awesome 3d work!
Just built a bang for buck 5600x workstation for Photoshop and it still runs my old 1060 6gb.
Boy this sold me!
Hey! would you share specs cause I want to build a 5600x pc too with a 3070 so just wondering
I have an RTX 3080 and my viewport is really slow for some reason, the final render from f12 is fast but the viewport didnt improve a bit, it sucks :/
This shouldn't be happening. Are you using the Optix Denoiser?
@@xayzer Yeah, of course, but it must be something else; who knows what. I just don't want to format and reinstall everything again xD
@@Scultronic Do you also have integrated graphics on your computer? I've seen cases where Blender uses the integrated graphics for the viewport and the proper, discrete graphics card for the F12 render. To see if that might be happening in your case, run Blender and check in the Task Manager which GPU it's using. If your computer has integrated graphics, GPU 0 is that one, and GPU 1 should be your 3080. Another thing you should do is make sure Blender is set to use the 3080 in the Nvidia Control Panel. Yet another thing to try is a different version of Blender.
change viewport cycles to gpu instead of cpu
You just have too many verts and particles in your scene; Blender's viewport is very slow and has an unoptimized dependency graph.
Hi! I am using the GTX 1060, but once I hit render in Cycles the computer just freezes.
we in the future now son
I don't get it why people have the start sample set to 1
Exactly
Can't wait to try this once my 3090 arrives!
I will upgrade my entire PC from an old one with a pathetic GT 740 to a gaming beast with an RTX 3060 Ti. I bought it just today :)
I haven't been impressed by tech this much since I got my G-Sync monitor 2 years ago. This is really impressive. I've only used this program for small files, never an episode.
These results are crazy good, considering how bad this show and others look at normal quality.
RX Radeon... crying at this moment. Ironically, Macs only have Radeon GPUs, and the Mac was born just for design...
Only productivity guys will get it 😀😀
Is the GTX 1650 faster than the GTX 1060?
*eevee* real time engine
*cycles* almost real time engine
IF only you can find one RTX 3070, of course...
Nice video. I'm doing the exact same upgrade.
I have a 1060 and want to upgrade to a 3070, and holy fuck thats a big difference
Just got the 1060. For me it isn't that much considering the price.
Is a GTX 1060 enough to make 3D animation?
This makes me want to cry
Those who are complaining about 1060, I have GT 710
Well, quite expected. There are about 4 years between the cards.
Show us some complicated GI transport
Man, this makes me feel so bad for having a GTX 1060 3GB... it's not really keeping up anymore.
I have GTX 960 : P Living with it :D
Im making an animation with a 750ti
It's the skill that matters, you can be loaded and have the best tech, but you need the skill
GT 1030 here, rendering on cpu. Going to upgrade to 3060TI when it's in stock at a reasonable price.
@@chrisp.401 yeah 3060 is that sweet spot of performance to price, i hope i can cop one as well
One question: the RTX 3070 is fast as hell, but after baking and rendering, will we get the same quality result with both GPUs?
You will. It depends on the render settings.
What about two 1060?
Cool.
What's your cpu?
What about maya viewport?
How much ram are you using....my viewport is consuming all 16gb and is still worse than yours.
I mean, it highly depends on your scene. The amount of textures, particle systems, fog, etc. is going to affect RAM usage. And sorry in advance, but maybe you forgot to change your render device in the render settings? By default it's CPU, and when you render with the CPU, all textures and other data are stored in your RAM; but when you use the GPU as the render device, they're stored in your VRAM. (I have 32 GB of RAM btw)
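To illustrate the point about textures above, here's a quick back-of-the-envelope sketch (the texture sizes are made-up examples, not from the video): an uncompressed RGBA texture costs roughly width × height × channels × bytes per channel, which is why a handful of 4K textures can eat a serious chunk of a 6 GB card's VRAM.

```python
def texture_mem_mb(width, height, channels=4, bytes_per_channel=1):
    """Approximate in-memory size of one uncompressed texture, in MB."""
    return width * height * channels * bytes_per_channel / (1024 * 1024)

# A single 4K RGBA texture at 8 bits per channel:
print(round(texture_mem_mb(4096, 4096)))       # 64 (MB)

# Ten such textures already take 640 MB of VRAM before
# geometry, BVH, and render buffers are even counted.
print(round(10 * texture_mem_mb(4096, 4096)))  # 640 (MB)
```

Actual usage in Blender will differ (Cycles may use float textures, mipmaps, or compression), so treat this only as an order-of-magnitude estimate.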
Thanks for the info...great work !!
Please do another one with the Amd cards when they release
Only Nvidia cards support optix viewport denoiser 🥺
Forget about Amd when it comes to content creation and 3d especially.
How many polygons?
It was lazily ZRemeshed from ZBrush. About 100k tris.
What sense does it make to activate denoise at each sampling step...
It looks better in the viewport. Yeah, it's slower and you waste your PC's resources, but I think it's just a matter of preference.
For this particular case (the demonstration), as a viewer I'd rather have the denoiser turned on for each sample.
it's great if you set the start sample to 2
Thanks
Ok...
I am currently using the 2070 Super (OptiX render). I would like to make the Cycles viewport render faster. Which of these is the fastest? 1. 3080 2. 2x 3060 Ti?
2x 3060 Ti, as long as your scene fits in memory. Although I would go with the 3080: the difference would be insignificant (I expect it to be next to none), but considering that other applications can't utilize 2 cards, the 3080 is pretty much a no-brainer.
The viewport will only use 1 GPU, and if you want to render animations, the faster memory in the 3080 will beat the 3060s, since the data needs to be copied to VRAM every frame.
@@clementkirton No. Completely wrong. You know nothing about how the Blender's renderer works, so stop spreading this disinformation.
@@yrussq And you're a Blender expert, I guess? Multi-GPU will only work when using CUDA/OptiX (Eevee and OpenGL do not work with multi-GPU; one Google search and you'll find this out): blenderartists.org/t/multiple-gpu-viewport-performance/632168 and, as for animation: blenderartists.org/t/increasing-render-speed-by-skipping-sync-cycles/584002
@@clementkirton Nobody uses EeVee with Multi-GPU. Go back to your homework, boi
I NEED IT
What about rendering times?
Well, I only tried the BMW scene to test it. My 1060 rendered it with CUDA in 7 min 40 sec, and the 3070 with OptiX rendered it in 18 seconds. But remember that my 1060 is quite old and has a poor cooling system; other benchmarks I saw were around 4-6 minutes.
@@BertWhite that's 25x faster, omg omg
@@BertWhite 18 seconds!!! Holy sh*t
@@cyberghost7777 The BMW takes 18 seconds on my 3070 too. 11 seconds with adaptive sampling enabled!
@@BertWhite bmw 16 sec on my zotac3070 paired with 3900x
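For what it's worth, the "25x" claim above checks out against the quoted times (a trivial sanity-check script):

```python
# Times quoted in the thread for the BMW benchmark scene:
# GTX 1060 (CUDA): 7 min 40 s; RTX 3070 (OptiX): 18 s.
cuda_1060_s = 7 * 60 + 40   # 460 seconds
optix_3070_s = 18

speedup = cuda_1060_s / optix_3070_s
print(f"{speedup:.1f}x")  # 25.6x
```

Against the more typical 4-6 minute 1060 results also mentioned above, the speedup would be closer to 13-20x.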
Dreaming of the RTX 4XXX: no waiting time.
You must use OptiX instead of CUDA on the GTX 1060 if you want to compare properly.
I wanted to write this too... but both of them are labelled OptiX :) The label describes what they run on, not what is actually used: the GTX works on CUDA because even though OptiX is turned on, the card doesn't have the hardware for it.
I don't know if you'll see much of a difference on a 1060, since it doesn't have RT cores.
Pretty fast.
How do you activate OptiX on a GTX 1060?
Since Blender 2.90, OptiX is available on all NVIDIA GPUs that support it, which is Maxwell or higher (GeForce 700, 800, 900, 1000 series).
@@ernestojosevargas9493 I use Blender 2.91 with a GTX 1080 on Ubuntu, and Blender can't find the OptiX GPU in the preferences. Is there any step that I'm missing?
EDIT: apparently I didn't have CUDA installed on my computer; after I installed it, OptiX works like it should. I'm so happy :D
@@johantri7482 Install the proprietary driver from Nvidia.
@@typingcat that's what i did, thanks! :)
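Following up on the missing-CUDA fix above: here's one quick, portable way to check whether the CUDA driver library is even visible to your system, using only Python's standard library. Note this is just a hint, not a definitive test; `find_library` can return None on some Linux setups even with a working driver, depending on the linker cache.

```python
# Look for the CUDA driver library: libcuda.so on Linux, nvcuda.dll on Windows.
# Blender's OptiX backend needs the proprietary NVIDIA driver, which ships it.
from ctypes.util import find_library

cuda = find_library("cuda")
if cuda:
    print(f"CUDA driver library found: {cuda}")
else:
    print("No CUDA driver library found; install the proprietary NVIDIA driver.")
```

If this prints nothing found and you do have an NVIDIA card, installing the proprietary driver (as suggested above) is the usual fix.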
i have a 320m :))))))
can u upload that blend file so i can test on my gpu?
meh... will wait super models maybe they will have more vram
wtf?!
Why use OptiX denoising? It slows your render... If you do a final render, there is a denoiser in the compositing tab, which is much faster. I say this video is pointless.
You may never have worked with 3D, because then you'd know that OptiX denoising is RTX-accelerated and therefore super fast... I tested it with my 3090 and the Blender classroom test.
With OptiX it takes 22 seconds.
Also, this video is about the viewport.
Having an almost real-time view of your work while being able to change materials or even model is an incredible timesaver.
Why are you cheating here? The 1060 was running on CUDA, not OptiX... Of course OptiX is quite a bit faster, even on my old-ass 960 lol. Sorry, but this is a shit video.
GTX doesn't have optix
@@666Azmodan666 they have enabled optix for gtx cards too
@@nabarunroy998 Yes, I know you can turn it on, but as you can see it doesn't work... But if you have two GPUs in the computer, a GTX and an RTX, you don't have to limit the RTX to CUDA (slowing it down, which was a problem) when you want to render on both. I guess that's what this option is for.
@@666Azmodan666 Yeah, I was just pointing out that the option is there. I tried OptiX on my GTX 1060; technically it works, but I got no speed improvement. So I think it requires the RT cores for the speed improvement.
Hot daym. Making me really look forward to a new 3080ti PC when I can get one
Edit: I got a 3090 instead lol