GTX 1060 vs RTX 3070 Cycles viewport comparison (Blender 2.9)

  • Published: 3 Oct 2024
  • Nothing special, just showing the difference.

Comments • 185

  • @BertWhite  3 years ago  +92

    For those complaining that I used OptiX for the 3070 instead of CUDA for a more "proper" comparison: I'd say this is the more realistic use case. If you're working and need the fastest viewport, why would you use CUDA in the first place when you paid for RT cores? And for the same reason, you can't enable OptiX on 10XX cards.
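    For reference, this is roughly what that device choice looks like in Blender's Python API (a minimal sketch against the standard 2.9x Cycles preferences; verify the names on your build):

    ```python
    import bpy

    # Pick the Cycles backend: "OPTIX" uses the RT cores on RTX cards,
    # while "CUDA" is the fallback for GTX 10XX and older
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "OPTIX"
    bpy.context.scene.cycles.device = "GPU"
    ```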

    • @cyberghost7777  3 years ago  +6

      OptiX is a game changer. I compared results between my desktop GTX 750 Ti, which took ages, my laptop's GTX 1650, which took over a minute, and my new RTX 3080, which took almost 1/10th of the time the 1650 needed!

    • @PascalWiemers  3 years ago

      @@cyberghost7777 Sadly, OptiX is still missing some features, like render-time bevel.

    • @BertWhite  3 years ago  +2

      @@PascalWiemers You mean the Bevel node? OptiX will support it in Blender 2.92, if I remember right.

    • @PascalWiemers  3 years ago

      @@BertWhite Nice to hear!

    • @Hayreddin  3 years ago  +1

      @@PascalWiemers @Bert I can confirm it's implemented and working right now in the 2.92 Release Candidate (should come out this week); I didn't know that, so thank you for letting me know. CPU+GPU has also been implemented (it uses system RAM in addition to VRAM, preventing "out of memory" crashes, provided your scene isn't so complex that it needs even more memory than your system + GPU can provide). All that's left to implement is baking support and maybe branched path tracing, though that might be removed entirely in favor of more options for regular path tracing.
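      A minimal sketch of what enabling the hybrid CPU+GPU mode looks like from Python (same 2.9x preferences API as above; illustrative, not authoritative):

      ```python
      import bpy

      prefs = bpy.context.preferences.addons["cycles"].preferences
      prefs.compute_device_type = "OPTIX"  # or "CUDA" on pre-RTX cards
      prefs.get_devices()  # refresh the detected device list

      # Enable every detected device, CPU included, for hybrid rendering
      for device in prefs.devices:
          device.use = True

      bpy.context.scene.cycles.device = "GPU"
      ```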

  • @ChernobylTaco  3 years ago  +27

    Holy shit. This is exceptional. Due to the microchip shortage, I'll probably be unable to upgrade for another year, but damn, this is extraordinary.

    • @kratoziscool2501  2 years ago  +3

      Upgraded yet?

    • @ChernobylTaco  2 years ago  +1

      @@kratoziscool2501 Yup! Just ordered mine the other day!

    • @ZepAnimations  1 year ago

      @@ChernobylTaco How is your 1060 holding up? I got one already, but I'm just wondering what you think of it.

  • @СергейКиселев-э6у  3 years ago  +26

    Impressive! Thank you for sharing. I have a 1060 too and am waiting for an upgrade 😅

    • @cyberghost7777  3 years ago  +2

      I have a 750 Ti on my desktop, man 😂

    • @ratisok  3 years ago

      Just got a 1060 in a used PC bundle. It's super fast. A 3090 is better, but damn, this is fast!

    • @СергейКиселев-э6у  3 years ago  +1

      @@ratisok Agreed, best GPU for the price.

    • @DREAMTEFO  3 years ago  +2

      @@cyberghost7777 I have an AMD Radeon HD 5800 with 2 GB 😀 no CUDA 😀

    • @cyberghost7777  3 years ago  +1

      @@DREAMTEFO Man, you gotta upgrade! I just got my RTX 3080, and I must say the OptiX viewport denoising is crazy fast!

  • @zinc845  2 years ago  +10

    To be honest, it only takes like 2 seconds for the details to become clear on the GTX 1060.
    And I am a very patient guy,
    so I think it's a good deal.

    • @darkcdev4906  4 months ago  +1

      Dude, how??? I am using the exact same card. IDK what I'm missing. When I hit render in Cycles, it takes YEARS to render one frame; on top of that, the device heats up like crazy and turns into a jet. Do you have any advice for optimizing it?

    • @OtherPeople159  2 months ago

      How much VRAM do you have?

    • @darkcdev4906  2 months ago

      @@OtherPeople159 6 GB of VRAM.

  • @Droid3455  3 years ago  +74

    This is where the Ampere architecture shines.

    • @ClintochX  3 years ago  +2

      Yeah, and it utilizes the dedicated RT cores designed for ray-tracing acceleration.

    • @doppiak2227  3 years ago

      So gud

    • @Hayreddin  3 years ago

      Even a basic 2060 shines with OptiX; in OctaneBench it can be as fast as a Titan RTX rendering with CUDA (granted, the amount of memory is a bit different, but still).

  • @blenderbreath7119  3 years ago  +85

    That's super fast, so just imagine the 3090!

    • @ivananetdoma8382  3 years ago  +7

      You won't see any visual difference between the 3070 and the 3090 (excluding rendering speed).

    • @TheShinyMooN  3 years ago  +12

      @@ivananetdoma8382 Actually, my friend got the 3090, and he says you can't tell the difference between the viewport and the render since it's all real time.

    • @blenderbreath7119  3 years ago  +2

      @@TheShinyMooN So it all comes down to rendering performance; hence the price of the 3090 compared to the 3060 Ti, I suppose.

    • @TheShinyMooN  3 years ago  +3

      @@blenderbreath7119 I'll get myself a 3060 Ti for sure; no need for extra fancy things.

    • @blenderbreath7119  3 years ago  +4

      @@TheShinyMooN Doesn't the rendering-time difference come down to CUDA core count? And the 3090 has significantly more CUDA cores.

  • @-saca-1297  3 years ago  +110

    Bruh, why use Eevee when Cycles becomes real time? Jeezus...

    • @666Azmodan666  3 years ago  +5

      A 3090 does not give you real time in Cycles... we probably won't live to see such times.

    • @Skullbrothers  3 years ago  +3

      @@666Azmodan666 And even if it does on the current version of Cycles, in the future Cycles will be upgraded even more.

    • @creper9000  3 years ago  +25

      @@666Azmodan666 Technically it's not real time, but you see a clear image in less than half a second, so it's pretty fast.

    • @Hayreddin  3 years ago  +2

      @@creper9000 It's generally called an interactive real-time preview, which is what threw many people off course when Turing was released and they kept saying "Radeon Rays has had real-time RT for X years! Nvidia is scamming everybody!" But of course it's not the same as producing 60 fps.

    • @DeZiio  3 years ago  +3

      @@666Azmodan666 Yeah, right. Wait until 2023; I'll come back to this comment.

  • @RetroPlus  2 years ago  +2

    I cannot wait for my 3060 to arrive; this is a 3D modeller's treat.

  • @jefm32  3 years ago  +17

    Mom, I need a 3090. To work.

    • @typingcat  3 years ago  +1

      That is SO true.

  • @familsproductions1711  3 years ago  +6

    That's 1:1 the upgrade I'm about to make. Thanks!

  • @rg1112  3 years ago  +4

    Compared to my GTX 970, the GTX 1060 looks good enough for me... thanks for the comparison.

  • @turquoisekamlyn9618  3 years ago  +13

    For me, after changing to a 3070, the viewport is insanely fast.

    • @atf56  3 years ago

      What's your CPU?

    • @turquoisekamlyn9618  3 years ago

      @@atf56 Ryzen 7 1700X. I don't think the CPU matters much for Blender.

    • @atf56  3 years ago

      Does the same apply to Maya too?

  • @PatriceFERLET  3 years ago  +8

    I just bought a 3070 FE; I was on a 1060 right before. I can confirm: it's impressive.

    • @lv-cxs  1 year ago  +1

      How much faster are your renders now?

    • @PatriceFERLET  1 year ago  +1

      @@lv-cxs Approximately 4x for my work.

  • @ChunkyJo  3 years ago  +2

    This is exactly what I was looking for. Thank you.

  • @lucas-correa  3 years ago  +8

    I have the exact same GTX, and I was planning to buy an RTX 3070! Thanks for the video, really enlightening. Can you share your other machine specs too?

    • @BertWhite  3 years ago  +2

      32 GB RAM, Ryzen 3700X.

  • @johnny2598  3 years ago  +2

    Ampere shines in out-of-stock technology.

  • @mathusuthanvenkatesan  3 years ago  +7

    Imagine CPU compute beside this.

    • @namonaite  3 years ago  +2

      I can imagine; it's faster than human computation ability, and if not, it will be by around Blender 3.0, as its code continues to improve at crazy speeds.

  • @3dhotshot  1 year ago

    Did you make that? It looks scary. Awesome work, btw.

  • @sugy747  3 years ago  +5

    I have a GTX 970 for Cycles and V-Ray...
    I think it's time to upgrade, ahah 😁

    • @rizamoriza9931  2 years ago

      Laptop with a GTX 860 here, lol. Fortunately, my daily work only needs simple objects rendered. Still need to save money to build a new PC 🙂

  • @Flowvful  5 months ago

    Thanks for the comparison!

  • @k-vandan  3 years ago  +1

    That's some serious RAW power.

  • @SpaceSTG  3 years ago  +2

    Great improvement.

  • @zloiser  3 years ago  +2

    Yes, there's a difference)) Cool model.

  • @3deoskill  3 years ago  +2

    In comparison, and if you take energy consumption into account, the 1060 is not so bad... better than my card; I would be happy with the 1060 too :-). Can you maybe make a render example as well, not only the viewport?

    • @cyberghost7777  3 years ago

      The energy consumption is crazy. I just got my 3080, and my UPS would shut down every time I tried to play a game (tested Cyberpunk 2077). Had to order a bigger one.

  • @alancooke9357  2 years ago

    That is one awesome piece of 3D work!

  • @teahousereloaded  3 years ago  +1

    Just built a bang-for-buck 5600X workstation for Photoshop, and it still runs my old 1060 6GB.
    Boy, this sold me!

    • @hairanbhvats592  3 years ago

      Hey! Would you share your specs? I want to build a 5600X PC with a 3070 too, so I'm just wondering.

  • @Scultronic  3 years ago  +7

    I have an RTX 3080 and my viewport is really slow for some reason. The final render from F12 is fast, but the viewport didn't improve a bit. It sucks :/

    • @xayzer  3 years ago  +1

      This shouldn't be happening. Are you using the OptiX denoiser?

    • @Scultronic  3 years ago

      @@xayzer Yeah, of course, but it must be something. Who knows what it could be; I just don't want to format and reinstall everything xD

    • @xayzer  3 years ago  +4

      @@Scultronic Do you also have integrated graphics on your computer? I've seen cases where Blender uses the integrated graphics for the viewport and the proper, discrete graphics card for the F12 render. To see if that might be happening in your case, run Blender and check in the Task Manager which GPU it's using. If your computer has integrated graphics, GPU 0 is that, and GPU 1 should be your 3080. Another thing you should do is make sure Blender is set to use the 3080 in the Nvidia Control Panel. Yet another thing to try is a different version of Blender.
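      If you want to check from inside Blender which devices Cycles actually sees, a small diagnostic sketch like this works (standard Cycles preferences in the 2.9x Python API; the output format is just illustrative):

      ```python
      import bpy

      prefs = bpy.context.preferences.addons["cycles"].preferences
      prefs.get_devices()  # refresh the detected device list

      print("Compute device type:", prefs.compute_device_type)
      for device in prefs.devices:
          # device.type is e.g. 'OPTIX', 'CUDA', or 'CPU'
          print(f"{device.name} ({device.type}) enabled={device.use}")
      print("Scene render device:", bpy.context.scene.cycles.device)
      ```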

    •  3 years ago  +2

      Change the Cycles viewport device to GPU instead of CPU.

    • @goku21youtub  3 years ago

      You just have too many verts and particles in your scene; Blender's viewport is very slow and has an unoptimized dependency graph.

  • @darkcdev4906  5 months ago  +1

    Hi! I am using a GTX 1060, but once I hit render in Cycles, the computer just freezes.

  • @Cornellie  3 years ago  +1

    we in the future now son

  • @kokoze  3 years ago

    I don't get why people have the start sample set to 1.

  • @DustinJones511  3 years ago  +2

    Can't wait to try this once my 3090 arrives!

  • @digital_ferret7263  3 years ago  +2

    I will upgrade my entire PC from an old one with a pathetic GT 740 to a gaming beast with an RTX 3060 Ti. I bought it just today :)

  • @cmoneytheman  3 years ago  +1

    I haven't been impressed by tech this much since I got my G-Sync monitor 2 years ago; this is really impressive. I have only used this program for small files, never an episode.
    These results are crazy good, considering how bad this show and others look at normal quality.

  • @iaia5368  1 year ago

    Radeon RX owners... crying at this moment. Ironically, Macs only come with Radeon GPUs, and the Mac was born just for design...

  • @wirelessguy2211  3 years ago

    Only the productivity guys will get it 😀😀

  • @Azhar.49  1 year ago  +1

    Is the GTX 1650 faster than the GTX 1060?

  • @BirBilgin  3 years ago  +1

    *eevee* real time engine
    *cycles* almost real time engine

  • @Cyber_Kriss  3 years ago

    If only you could find an RTX 3070, of course...

  • @esbjornmonteliusrisberg6834  3 years ago

    Nice video. I'm doing the exact same upgrade.

  • @Yaya0001  3 years ago

    I have a 1060 and want to upgrade to a 3070, and holy fuck, that's a big difference.

    • @ratisok  3 years ago

      Just got the 1060. For me it isn't that much considering the price.

  • @hassanabdullah-jm3lu  11 months ago

    Is a GTX 1060 enough to make 3D animation?

  • @stiky5972  3 years ago

    This makes me want to cry

  • @arjunaryan9425  3 years ago

    To those who are complaining about the 1060: I have a GT 710.

  • @АркадийАрнольдовичШницель

    Well, quite expected. There are about 4 years between these cards.

  • @Brejla  3 years ago

    Show us some complicated GI transport

  • @izackpaz  3 years ago  +3

    Man, this makes me feel so bad for having a GTX 1060 3GB... it's not really keeping up anymore.

    • @mcan-piano4718  3 years ago  +1

      I have a GTX 960 :P Living with it :D

    • @ultraboy99x  3 years ago  +4

      I'm making an animation with a 750 Ti.

    • @karandras2854  3 years ago  +4

      It's the skill that matters; you can be loaded and have the best tech, but you still need the skill.

    • @chrisp.401  3 years ago  +2

      GT 1030 here, rendering on the CPU. Going to upgrade to a 3060 Ti when it's in stock at a reasonable price.

    • @izackpaz  3 years ago

      @@chrisp.401 Yeah, the 3060 is that sweet spot of performance to price. I hope I can cop one as well.

  • @mr.answer6084  3 years ago

    One question: the RTX 3070 is fast as hell, but after baking and rendering, will we get the same quality result with both GPUs?

    • @BertWhite  3 years ago  +1

      You will; the quality depends on the render settings, not the card.

  • @zian.2493  3 years ago

    What about two 1060s?

  • @sinzy1067  3 years ago

    Cool.

  • @atf56  3 years ago

    What's your CPU?

  • @atf56  3 years ago

    What about the Maya viewport?

  • @priyatoshnayak8017  3 years ago

    How much RAM are you using? My viewport is consuming all 16 GB and is still worse than yours.

    • @BertWhite  3 years ago

      I mean, it highly depends on your scene: the amount of textures, particle systems, fog, etc. will affect RAM usage. And sorry in advance, but maybe you forgot to change your render device in the render settings? By default it's CPU, and when you render with the CPU, all textures and other data are stored in your RAM; when you use the GPU as the render device, they're stored in your VRAM. (I have 32 GB of RAM, btw.)
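      A quick way to sanity-check this from Blender's Python console (standard Cycles properties; a sketch, not a guaranteed diagnosis of the RAM usage above):

      ```python
      import bpy

      scene = bpy.context.scene
      print("Render engine:", scene.render.engine)  # should be 'CYCLES'
      print("Render device:", scene.cycles.device)  # 'CPU' (the default) or 'GPU'

      # Switch to GPU rendering so scene data lives in VRAM instead of system RAM
      scene.cycles.device = "GPU"
      ```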

    • @priyatoshnayak8017  3 years ago

      Thanks for the info... great work!!

  • @saileelahospital9354  3 years ago  +4

    Please do another one with the AMD cards when they release.

    • @mr.lightwhite6431  3 years ago

      Only Nvidia cards support the OptiX viewport denoiser 🥺

    • @yrussq  3 years ago  +2

      Forget about AMD when it comes to content creation, and 3D especially.

  • @peckpech8655  3 years ago

    How many polygons?

    • @BertWhite  3 years ago

      It was lazily ZRemeshed from ZBrush. About 100k tris.

  • @Gary_Hun  2 years ago

    What sense does it make to activate denoising at each sampling step...

    • @Sychyov  1 year ago

      It looks better in the viewport. Yeah, it's slower and you waste your PC's resources, but I think it's just a matter of preference.
      For this particular case (a demonstration), as a viewer I'd rather have the denoiser turned on for each sample.

    • @Denomote  7 months ago

      It's great if you set the start sample to 2.
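      For reference, the viewport-denoising settings discussed in this thread look like this in Python (property names as exposed in Blender 2.90+ builds; treat them as assumptions and verify on your version):

      ```python
      import bpy

      cycles = bpy.context.scene.cycles
      cycles.use_preview_denoising = True        # denoise the viewport render
      cycles.preview_denoiser = "OPTIX"          # RTX-accelerated denoiser
      cycles.preview_denoising_start_sample = 2  # don't denoise the very first sample
      ```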

  • @manojlogulic4234  3 years ago

    Thanks

  • @GodexOficialyt  2 years ago

    Ok...

  • @jonggueshin5542  3 years ago

    I am currently using a 2070 Super (OptiX render). I would like to make the Cycles viewport render faster. Which of these is fastest: 1. a 3080, or 2. two 3060 Tis?

    • @yrussq  3 years ago

      2x 3060 Ti, as long as your scene fits in memory. Although I would go 3080: the difference would be insignificant (I expect it to be next to none), but considering that other applications can't utilize 2 cards, the 3080 is pretty much a no-brainer.

    • @clementkirton  3 years ago  +1

      The viewport will only use 1 GPU, and if you want to render animations, the 3080's faster memory will beat the 3060s', as the data has to be copied to VRAM every frame.

    • @yrussq  3 years ago

      @@clementkirton No. Completely wrong. You know nothing about how Blender's renderer works, so stop spreading this disinformation.

    • @clementkirton  3 years ago  +1

      @@yrussq And you are a Blender expert, I guess. Multi-GPU only works when using CUDA/OptiX (Eevee and OpenGL do not work with multi-GPU; one Google search and you will find this out): blenderartists.org/t/multiple-gpu-viewport-performance/632168. And as for animation: blenderartists.org/t/increasing-render-speed-by-skipping-sync-cycles/584002

    • @yrussq  3 years ago

      @@clementkirton Nobody uses Eevee with multi-GPU. Go back to your homework, boi.

  • @andreaspadaccini948  3 years ago

    I NEED IT

  • @-_-Alex2k23-_-  3 years ago  +1

    What about rendering times?

    • @BertWhite  3 years ago  +13

      Well, I only tried the BMW scene as a test. My 1060 rendered it with CUDA in 7 min 40 sec, and the 3070 with OptiX rendered it in 18 seconds. But remember that my 1060 is quite old and has a poor cooling system; other 1060 benchmarks I saw were around 4-6 minutes.
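      (Checking the arithmetic: 7 min 40 sec = 460 s, and 460 / 18 ≈ 25.6, which matches the "25x" quoted below. Against the more typical 4-6 minute 1060 results, the speedup would be roughly 13-20x.)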

    • @Lumitex  3 years ago  +5

      @@BertWhite that's 25x faster, omg omg

    • @cyberghost7777  3 years ago  +3

      @@BertWhite 18 seconds!!! Holy sh*t

    • @DECODEDVFX  3 years ago

      @@cyberghost7777 The BMW takes 18 seconds on my 3070 too. 11 seconds with adaptive sampling enabled!

    • @atf56  3 years ago

      @@BertWhite BMW in 16 sec on my Zotac 3070 paired with a 3900X.

  • @MRVI-qp6on  3 years ago  +1

    Dreaming of RTX 4XXX: no waiting time.

  • @fatihmtlm  3 years ago

    You should use OptiX instead of CUDA on the GTX 1060 if you want to compare properly.

    • @666Azmodan666  3 years ago

      I wanted to write this too... but both labels describe what the cards actually run on, not what is switched on... The GTX works on CUDA because even though OptiX is turned on, it doesn't have the hardware for it.

    • @user-eu5ol7mx8y  3 years ago

      I don't know if you'll see much of a difference on a 1060, since it doesn't have RT cores.

  • @papich_52  3 years ago

    Enter text to search here

  • @esstx  3 years ago

    Pretty fast indeed.

  • @johantri7482  3 years ago

    How do you activate OptiX on a GTX 1060?

    • @ernestojosevargas9493  3 years ago

      As of Blender 2.9, OptiX is available on all NVIDIA GPUs that support it, which means Maxwell or higher (GeForce 700, 800, 900, 1000 series).
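      A quick way to check whether your build exposes OptiX for your card (a sketch against the Blender 2.9x Python API; verify that get_devices_for_type exists in your version):

      ```python
      import bpy

      prefs = bpy.context.preferences.addons["cycles"].preferences
      optix_devices = prefs.get_devices_for_type("OPTIX")
      if optix_devices:
          for device in optix_devices:
              print("OptiX-capable device:", device.name)
      else:
          print("No OptiX-capable device found (check your NVIDIA driver)")
      ```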

    • @johantri7482  3 years ago

      @@ernestojosevargas9493 I use Blender 2.91 with a GTX 1080 on Ubuntu, and Blender can't find the OptiX GPU in the preferences. Is there any step I'm missing?
      EDIT: apparently I didn't have CUDA installed on my computer; after I installed it, OptiX works like it should. I'm so happy :D

    • @typingcat  3 years ago  +1

      @@johantri7482 Install the proprietary driver from Nvidia.

    • @johantri7482  3 years ago

      @@typingcat That's what I did, thanks! :)

  • @andrewp796  3 years ago

    I have a 320M :))))))

  • @hervains2869  3 years ago

    Can you upload that .blend file so I can test it on my GPU?

  • @Grom84  3 years ago

    Meh... I'll wait for the Super models; maybe they will have more VRAM.

  • @drancosqui  3 years ago

    wtf?!

  • @GameBacardi  3 years ago

    Why use OptiX denoising? It slows your render... If you're doing a final render, there's a dedicated denoiser in the compositing tab, which is much faster. I'd say this video is pointless.

    • @tobiaskuberka2213  3 years ago  +1

      You may have never worked with 3D, because then you'd know that OptiX denoising is RTX-accelerated and therefore super fast... I tested it with my 3090 and the Blender classroom test: with OptiX it takes 22 seconds.
      Also, this video is about the viewport, and having an almost real-time view of your work while being able to change materials or even model is an incredible timesaver.

  • @goku21youtub  3 years ago  +1

    Why the scheming here? The 1060 was running CUDA, not OptiX... Of course OptiX is quite a bit faster, even on my old-ass 960, lol. Sorry, but this is a shit video.

    • @666Azmodan666  3 years ago  +1

      GTX cards don't have OptiX.

    • @nabarunroy998  3 years ago  +1

      @@666Azmodan666 They have enabled OptiX for GTX cards too.

    • @666Azmodan666  3 years ago  +1

      @@nabarunroy998 Yes, I know you can turn it on, but as you can see, it doesn't help... But with two GPUs in the computer, a GTX and an RTX, you don't have to limit the RTX to CUDA (slowing it down, which was a problem) when you want to render on both. I guess that's what this option is for.

    • @nabarunroy998  3 years ago

      @@666Azmodan666 Yeah, I was just pointing out that the option is there. I tried OptiX on my GTX 1060; technically it works, but I got no speed improvement. So I think the RT cores are required for the speed improvement.

  • @suckmysilencer747  3 years ago  +11

    Hot daym. Making me really look forward to a new 3080 Ti PC when I can get one.
    Edit: I got a 3090 instead lol