AMD vs. NVIDIA GPU Rendering Performance: Blender, Octane, Redshift & More

  • Published: 28 Aug 2024

Comments • 258

  • @Techgage
    @Techgage  3 years ago +54

    This video is a 1080p reupload, as our original 4K version was stuck in a never-ending "processing" encode state. We're going to stick to 1080p going forward to prevent issues. Apologies for the reupload, and potential double notification.

    • @Nathan__A
      @Nathan__A 3 years ago +2

      Gem of a video mate. Thanks for the upload and insight

    • @krazyfrog
      @krazyfrog 3 years ago +2

      Weird, it played fine at 4K on our end.

    • @Techgage
      @Techgage  3 years ago +3

      Aye, that's what's confusing about the whole thing. To us, it looked like every resolution was fine, but in the backend, it was stuck in a never-ending processing state. We had a feeling this could have been tanking the potential traffic to the video, so we cut our losses. YouTube really shouldn't even let you upload 4K if it's going to take a week to finish and get a fair shake on the platform.

    • @beritakilat77
      @beritakilat77 3 years ago +1

      @@Techgage Upload 2K then

    • @leucome
      @leucome 3 years ago +6

      @@Techgage You just had to wait longer; it's normal. The first quick encode is done in avc1... then it continues to process and swaps the video for a higher-quality VP9 version in 24 to 48 hours, sometimes more... If your channel is big enough, it keeps processing even longer to make an ultra-high-quality av01 video, and it seems your channel got the av01 encoding, so long waits are to be expected.
      One of my videos took over a week of processing to get the VP9 version. Smaller channels don't have high encoding priority. But yeah, the video was available during that time, just with slightly worse encoding quality.

  • @tanmoymridha5002
    @tanmoymridha5002 2 years ago +337

    AMD seriously needs to come up with good professional software support. Focusing only on gaming can't keep them competitive.

    • @siddharth-wy2kp
      @siddharth-wy2kp 2 years ago +5

      how to render with an AMD GPU?? help..

    • @w0lf149
      @w0lf149 2 years ago +47

      Well, if they're aiming for just gaming and they're cheaper, they attract people like me. I don't want to pay extra for Nvidia's features.

    • @AngeI_PC
      @AngeI_PC 1 year ago +15

      @@w0lf149 Facts. For people who don't render and just want to play video games, AMD is the best option, especially since it performs better than Nvidia and is still cheaper.

    • @izayoiigames3467
      @izayoiigames3467 1 year ago +3

      @@AngeI_PC A console can do better and is cheaper than a whole PC.

    • @AngeI_PC
      @AngeI_PC 1 year ago +37

      @@izayoiigames3467 Can you get 200 fps in Warzone at max settings? Can you even change your FOV 💀💀

  • @DavoGalavotti
    @DavoGalavotti 2 years ago +47

    I would love to see the same testing with Blender 3.1. Thanks for making such an amazing and comprehensive comparison.

  • @karlisstigis
    @karlisstigis 2 years ago +8

    Thank you, it's not easy to find these kinds of comparisons.

  • @chungusbek9636
    @chungusbek9636 1 year ago +3

    It's been nearly 2 years. I think many of us are waiting for new benchmark comparisons. Where are you?

  • @bgtubber
    @bgtubber 3 years ago +19

    I really appreciate these tests. Thank you!

  • @max-lee
    @max-lee 2 years ago +3

    Love it. It's hard to find this kind of test on YouTube.

  • @eray784
    @eray784 1 year ago +2

    Thanks for your support

  • @karlosmartos4646
    @karlosmartos4646 3 years ago +10

    It's so insane how fast these GPUs get. I still have a very old GTX 670, from 2012 I think, because I haven't been gaming at all lately. But if I upgrade I will see an insane improvement, so let's hope there will be some in stock this year that aren't $1000 more expensive than they should be lol

  • @matthewgross4068
    @matthewgross4068 2 years ago +4

    Providing rendering benchmarks is god's work, bless you. But why no evaluations of the Quadro/RTX A class?

  • @vortexsophia
    @vortexsophia 3 years ago +6

    Thank you for a real video on rendering. The only thing left to find now is MATLAB and ArcGIS performance with these non-Quadro cards.

  • @xxRuleZzZxx
    @xxRuleZzZxx 3 years ago +10

    Thank you for this! Really appreciate the time and effort it took to make this video.
    What would you say is the render time difference between rendering on the Threadripper by itself vs. the Nvidia 3090 / 3080 Ti / 3080, on renderers that support both CPU and GPU rendering?

  • @MatteoCeccarini
    @MatteoCeccarini 3 years ago +11

    Octane and Redshift can actually use AMD GPUs under macOS. It would be nice to see that comparison in the mix too.

    • @Techgage
      @Techgage  3 years ago +1

      We don't have access to Apple hardware, but if anyone does, they could run either OctaneBench or Redshift Benchmark and directly compare to our results.

  • @tomroeder7348
    @tomroeder7348 3 years ago +5

    Nice work, quality content as always!!

  • @VickriFahrurozi
    @VickriFahrurozi 2 years ago

    Hey, I just bought the 6600 XT, and everything is going OK.
    But every time I render a video or play back footage in Premiere Pro, there is a sound that comes from inside the GPU. It's not the fan; it's a different sound.
    I don't know whether a sound like this is normal or not.
    What confuses me is that the sound only appears when I render or play back in Premiere Pro; when I'm playing games or even mining, it doesn't.
    Do you have any idea?

  • @benedekfodor269
    @benedekfodor269 9 months ago

    It would be a blessing if you could update this video!

  • @lemmylemonade8933
    @lemmylemonade8933 1 year ago +3

    Thank you so much for this information. I'm probably gonna stick with getting a 3070, since it's a little more affordable in my region, especially with how prices have been dipping for these cards.

  • @zoqin
    @zoqin 2 years ago +1

    You just gained a subscriber; incredibly useful video.

  • @webinatic216
    @webinatic216 3 years ago +19

    I have a 3060 with 12GB. Several times I've been over 10GB of GPU memory.
    Every time it went over 12GB and spilled into system RAM, the renders got much slower.
    Also, the prices of these cards are crazy. The 3060 should have been around 400 euros, but I had to buy a whole PC just because it had that card inside: 1200 euros for that PC. Any card faster than that would cost 100%-600% more just to get 30%-250% faster results.
    I even limited the voltage on my card because I don't need it to burn an extra 70 watts for 10% better performance. So the 3060 uses 100 watts max and is practically silent, as it doesn't generate that much heat.

    • @Techgage
      @Techgage  3 years ago +5

      Out of curiosity, what kind of creative work are you doing? If you're regularly hitting 12GB, is it fair to say you're doing some really high-end work? We haven't heard from too many that hit 12GB that regularly.

    • @webinatic216
      @webinatic216 3 years ago +5

      @@Techgage When I use apps that use CUDA and I go to really high resolutions. I work with print on demand, so I need to render over 10K pixels and work with documents that are over 10K. Working at 4K isn't any problem; it's when I get to those huge resolutions and micro-displacements/subdivisions in Blender.

    • @Techgage
      @Techgage  3 years ago

      Ahh, yup, that sounds pretty intensive! It's great though that you can get it done on the RTX 3060. Hopefully frame buffers double next generation...

    • @Gasconauxolives
      @Gasconauxolives 2 years ago +1

      Hello and thanks for sharing: how did you limit the voltage of your GPU, please? I've never done this but actually find the idea really good. It also makes the hardware last longer... I plan to get an RTX 3060 with 12GB as well, because hitting the VRAM limit is a real issue. Thank you in advance, regards.

    • @webinatic216
      @webinatic216 2 years ago +2

      @@Gasconauxolives I installed MSI Afterburner and lowered the Power Limit to 58%. In theory it uses 40% less wattage, but it still gives 90-95% of the performance in the apps I use.
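
      For anyone who prefers scripting this instead of Afterburner, a minimal sketch using nvidia-smi's power-limit switch (NVIDIA cards only, needs admin/root rights; the 100 W target is just an example mirroring the ~58% limit on a 170 W RTX 3060):

          import subprocess

          # Show the current and default board power limits.
          print(subprocess.run(
              ["nvidia-smi", "--query-gpu=power.limit,power.default_limit",
               "--format=csv"],
              capture_output=True, text=True, check=True).stdout)

          # Cap the board at ~100 W. Requires administrator/root privileges,
          # and resets on reboot unless persistence mode is enabled.
          subprocess.run(["nvidia-smi", "-pl", "100"], check=True)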

  • @banmuanhacom
    @banmuanhacom 2 years ago

    Thanks for all the testing, with a full lineup of cards in 2022.

  • @hippoatwork7261
    @hippoatwork7261 2 years ago +2

    A very detailed comparison and explanation, thank you for providing it. I gave you a like.

  • @HuntaKiller91
    @HuntaKiller91 3 years ago +3

    I use Adobe Premiere mostly, and HandBrake to compress my videos.
    Do you think a 6700 XT is a good replacement for my current 570? I hate to buy Nvidia since my monitor only has FreeSync 1 at 75Hz.

    • @Techgage
      @Techgage  3 years ago +1

      Whoops, I missed this comment entirely. For Premiere, it's really hard to say right now, because we're actually in the middle of updating some of our tests to get fresh testing done for the W6800 review. Generally speaking, you should be fine, but since it's been a while, I need to retest to be sure.

    • @RetroBleck
      @RetroBleck 2 years ago

      Get a different monitor, that's the easy answer. Not sure why you're still using a monitor like that. By the way, modern freesync monitors fully support Nvidia and vice versa. I bought a Dell 1440p Freesync monitor about 4 years ago and it works flawlessly with my old 1080 Ti. It's not locked to brand anymore, I believe it was patched in driver updates many years ago, but the oldest freesync monitors might not work.

  • @BarAlexC
    @BarAlexC 1 year ago +1

    Can we hope for you to compare Lovelace and RDNA 3 cards to Ampere?
    This video is by far the best comparison for my needs, out of what I've seen on YouTube.

    • @Techgage
      @Techgage  1 year ago +1

      Thanks for the nice comment. We do hope to have a video up for Ada Lovelace. Video is realllly not my forte, so it's hard to get going sometimes, but I think it's clear that an update is long overdue.

  • @squarelemon5251
    @squarelemon5251 3 years ago +1

    Great video! Thank you.

  • @jono8326
    @jono8326 2 years ago +5

    Thank you very much for the info. I want to support AMD but Nvidia really excels at rendering.........

  • @bruce4551
    @bruce4551 2 years ago +3

    Appreciate the performance test. Now I can consider reserving a lower budget for a GPU and getting the RTX 3080 Ti instead of the 3090, as they both look pretty much the same in rendering time and quality.

    • @siddharth-wy2kp
      @siddharth-wy2kp 2 years ago

      how to render with an AMD GPU?? help..

    • @RetroBleck
      @RetroBleck 2 years ago +1

      @@siddharth-wy2kp If you don't even understand what rendering is, there's quite a few questions you need to be asking before asking that one.

    • @DerekDavis213
      @DerekDavis213 2 years ago +1

      @@RetroBleck He is asking how to get GOOD performance when rendering with an AMD GPU. Something closer to NVidia.

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 1 year ago

      @@DerekDavis213 No, he is asking him "how to render with an AMD GPU". That means "how can he render just about anything in general": games, productivity, photos, word documents. It's a pretty stupid question if you ask me, and it's not even worded right. Wait...
      Maybe he is on to something... I wonder just how different the two uarchs are, and how modern RDNA renders things differently compared to NVIDIA. Maybe AMD could elaborate on that someday...

    • @DerekDavis213
      @DerekDavis213 1 year ago

      @@MarcABrown-tt1fp When doing 3D modeling with Blender, NVidia definitely gets much higher performance than similarly-priced AMD cards. NVidia CUDA and Optix are superior, in Blender.
      Also, ray-tracing in games is still better on the NVidia cards. AMD is improving though.

  • @NikolasKarampelas
    @NikolasKarampelas 1 year ago +1

    Thank you for this video; you really demonstrate the benefits of each GPU, and you cleared some things up about using AMD GPUs. Is there any follow-up? Redshift supports AMD now, and although I'm not sure it can manage much with my 5500 XT 4GB, I would love to know where AMD GPUs stand with it.
    Also, are you OK? I see you haven't uploaded anything in a year.

    • @manuel8123
      @manuel8123 1 year ago +1

      They seem to be fine; they just stopped uploading videos to YT. Their page has been updated at least once a month, and the articles carry the same author name, Rob Williams.
      They just uploaded a new article yesterday. I hope they do a Blender Cycles AMD HIP RT vs. Cycles NVIDIA CUDA RT comparison now that Blender 3.6 is out.

  • @vmafarah9473
    @vmafarah9473 1 year ago +3

    What about the viewport performance of these cards?

    • @Techgage
      @Techgage  1 year ago +1

      Not a video, sorry, but this might help:
      techgage.com/article/blender-3-2-the-fastest-gpus-for-cycles-eevee-viewport/
      techgage.com/article/specviewperf-2020-v3-linux-windows/

    • @vmafarah9473
      @vmafarah9473 1 year ago

      @@Techgage Thanks for your reply. There are people who care more about viewport performance than render speed.

    • @vmafarah9473
      @vmafarah9473 1 year ago

      @@Techgage I'm an architect who cares about the viewport performance of the render engine Lumion, and also about video walkthrough render time; it takes hours to render. Last week I came across a video about the AMD RX 6800M laptop (the Asus G15), which was as cheap as an RTX 3060 laptop but was 2.1x faster than the 3060 in video walkthrough rendering. I use SketchUp as my main modeling software and AutoCAD for drawing, and their viewport performance is fine for me.

  • @jubayeralam5252
    @jubayeralam5252 2 years ago

    I am so excited to see the performance of your Radeon Pro W6800. Can you please make a video on it?

  • @torontovo
    @torontovo 2 years ago +3

    So is my 3060 still going to be viable in a couple of years?

    • @suorsodavit7421
      @suorsodavit7421 2 years ago +1

      Looking to get a 3060 as well, because it has 12GB of VRAM. Is it good?

  • @Turgineer
    @Turgineer 5 months ago

    ProRender and HIP RT are great things from AMD for Blender users.

  • @NNG_yt
    @NNG_yt 2 years ago

    I made a lot of comments for a first-time viewer, but I wanted to say: I generally subscribe only if I need more of a content topic or have a pressing question, but you get an easy sub here, even though my doubts were already cleared up. For your knowledge, work, and effort... oof. Thanks! 🙏

  • @iuriharacemko
    @iuriharacemko 1 year ago +1

    Thank you. We need an update of this video, because I pray that Blender started working better with AMD 6000-series cards as of Blender 3.0. I would really like to see if this is true, as I intend to do a GPU upgrade.

  • @Home_Rich
    @Home_Rich 1 year ago +8

    Even though I came here for the 6700, I just can't stop looking at the 3060 Ti.
    Highly appreciate you including the 590, which is my current card. It really shows the contrast in, well, everything!

  • @chungusbek9636
    @chungusbek9636 1 year ago +1

    Did the AMD & Blender Cycles X updates change anything?

  • @Livingston3d
    @Livingston3d 1 year ago +1

    Amazing, great sir! This is the best review for us, thanks for that. Can you please help me: will these GPUs (Radeon RX 6800 XT, Radeon RX 6600 XT, Radeon RX 7000 XT) be good for Maya, 3ds Max, ZBrush, Painter, Blender, etc.? Are they good for modeling, heavy-file viewport navigation, and rendering purposes?

  • @twistedwizardry5153
    @twistedwizardry5153 2 years ago +1

    I have an older laptop with a 1060 6GB. When I switch from CUDA to OptiX, it actually takes more time for me. Is that normal?

  • @yearofthegarden
    @yearofthegarden 1 year ago +2

    Just picked up a 3060 12GB for $220. I'd like to try the 5700 XT, but at $60 more and 100 watts thirstier, I think the 3060 is perfect for my Blender education and casual gaming.

    • @AJ18634
      @AJ18634 1 year ago

      Did you end up getting the 3060? If so, how is it holding up?

    • @yearofthegarden
      @yearofthegarden 1 year ago +2

      @AJ18634 Yes. Got it for $220 shipped, and now they seem to be that price consistently. Great card; there is nothing negative. It plays every game I've got maxed out at 1080p and renders fast in Blender.

    • @AJ18634
      @AJ18634 1 year ago

      @@yearofthegarden cheers!

    • @articatsvao7317
      @articatsvao7317 1 year ago

      Which GPU vendor did you buy? And from which store, if it's not a secret?

    • @articatsvao7317
      @articatsvao7317 1 year ago +1

      And as I understand it, the 3060 is better than the 5700 XT for rendering (3ds Max, Blender, etc.), right?

  • @ItsAkile
    @ItsAkile 2 years ago +1

    Informative for sure, thanks for the video. I'll try to refer to the article, because I get the 3060 vs. 3060 Ti question a lot; the 12GB really throws everyone off for productivity. Do you have a good answer to that question for, say, a Blender animator/video editor?

    • @MLWJ1993
      @MLWJ1993 2 years ago +2

      The moment you *need* more VRAM, it'll be a bit faster (large projects). For anything else, a 3060 Ti is going to yield better results.
      Do note that for large projects, 12GB of plain VRAM may not hold up versus the same amount of VRAM with ECC; errors can massively increase the render times of larger projects.

    • @ItsAkile
      @ItsAkile 2 years ago +1

      @@MLWJ1993 Yeah, definitely; the 3060 Ti is the better choice by a mile. I was curious what it takes to pass 8GB. Thanks for the ECC tips; I guess that's partly why the A series are so much faster.

    • @Gasconauxolives
      @Gasconauxolives 2 years ago +1

      @@ItsAkile Hello, what do you mean by A series, please?

    • @ItsAkile
      @ItsAkile 2 years ago +3

      @@Gasconauxolives NVIDIA rebranded the Quadro series of GPUs to the A series. The NVIDIA A4000, for example, is a 3070-class GPU but with ECC memory and 16GB of VRAM, plus production driver optimizations.

    • @Gasconauxolives
      @Gasconauxolives 2 years ago +2

      @@ItsAkile Thanks man, I thought you were talking about some rare consumer products: all clear and obvious now. I'll be happy enough with a modest GeForce card :)

  • @avirtualdesigner6396
    @avirtualdesigner6396 3 years ago +3

    Hello, I want to buy a new PC to render fast in Blender (Cycles GPU). Which of these three computers do you think would render fastest in Blender using the GPU?
    Option A) Intel Core i9-11900K, 32 GB RAM, NVIDIA GeForce RTX 3080 10GB
    Option B) Intel Core i9-11900F, 64 GB RAM, MSI Geforce RTX 3080 Ti Gaming X trio 12G
    Option C) Intel Core i7-11700K, 32 GB RAM, NVIDIA GeForce RTX 3090 24GB

    • @Techgage
      @Techgage  3 years ago +4

      There are different ways to look at this. If you opt for the 3090, that means you have an enormous frame buffer that you're likely not going to exceed any time soon. On the other hand, if you are not likely to exceed 12GB, then the 3080 Ti almost keeps up to the 3090 in performance. You'd then gain the even faster single-thread 11900F to use. So it really boils down to your needs, but I feel like your best overall value would be with option B. I have sadly been abysmally slow at getting a video done, but you can find a LOT more Blender performance information here: techgage.com/article/blender-2-93-performance-best-cpus-gpus/

  • @MarekNowakowski
    @MarekNowakowski 2 years ago +2

    Would love to see how the 6900 does now that HIP rendering has been added, and whether it changed anything.

  • @randomscooterrider5334
    @randomscooterrider5334 3 years ago +4

    Looks like the 3060 Ti or 3070 is the best price/performance card for rendering and casual gaming.

  • @VedranKlemen
    @VedranKlemen 2 years ago

    Hello. So if I got it correctly, ProRender is around 3x slower than Cycles?

  • @cryptok9185
    @cryptok9185 3 years ago +2

    Hey, I was wanting your advice please, I just subbed btw :)
    I am about to pull the trigger on my build, but I can't decide what GPU to go with.
    I play mainly FPS games like Warzone and Battlefield and am going with an AMD 5800X CPU,
    and I want either a 3080 or a 6800 XT. In reviews both come out fairly similar, but should I go with the AMD card since I got the AMD CPU?
    I will be playing at 1440p 144Hz, for reference.
    Thanks!

    • @Techgage
      @Techgage  3 years ago +4

      Just to make sure, are you also doing creation work, or are you strictly a gamer? If you do any creator work, this video probably proves that you'll want to go with NVIDIA for the best performance and probably the best experience (we're even having issues with the newest Radeon driver in Blender...). If you don't do any creation work, then either of those GPUs is a great choice, but NVIDIA has DLSS and has faster RT performance. We didn't do a video on this, but you can see current benchmarks here:
      techgage.com/article/nvidia-geforce-rtx-3070-ti-ultrawide-4k-performance/
      I assume you are using resolutions lower than these since you mentioned 144Hz, but the results here should help. Your target CPU will be up to the task. Thanks for the sub!

  • @justme8675
    @justme8675 2 years ago

    Hey Techgage, is it safe to assume a 3D studio based on the Unity3D engine gets around the same performance as the Blender test you did?

    • @bharat7188
      @bharat7188 1 year ago

      I wanted to know the same, did you find any answer?

  • @MrZZooh
    @MrZZooh 2 years ago

    Hi. I commented on your channel when I bought my first high-end GPU almost 4 years ago. It was a Vega 64, and I soon realized that I was barking up the wrong tree, cuz Nvidia is the ONLY manufacturer that gets proper support for GPU rendering. Anyway, I just bought a 3070 FE at MSRP and am kinda doubting it because of the 8GB of memory, and because of the GDDR6X on the 3080 as well as its 30 percent higher performance. I plan on using this for After Effects, which is still mostly CPU-bound absent some of the plug-ins, and probably for some Arnold GPU rendering. But it's no studio production work. Should I ditch the 3070 and go straight for the 3080? I would appreciate a response.

  • @sumo479
    @sumo479 2 years ago

    Good job mate.

  • @rglmmo
    @rglmmo 2 years ago +1

    Hello, could you tell me if the RX 6600 XT is compatible with V-Ray? Thank you very much

    • @Techgage
      @Techgage  2 years ago +2

      Unfortunately, no. V-Ray is completely glued to NVIDIA CUDA/OptiX for the moment. I hope that changes down-the-road. Redshift is soon to add Radeon support, so it'd be nice to see that become commonplace.

    • @rglmmo
      @rglmmo 2 years ago

      @@Techgage Thanks !!

  • @Unreal2424
    @Unreal2424 3 years ago +1

    Thank you for the video. I'm planning to learn 3D motion design. If I understand correctly, Nvidia's GPUs are more potent than AMD's? With my budget I can get a 3080 or a 6800 XT.

    • @Techgage
      @Techgage  3 years ago +9

      It of course depends on the workload you're involved in ultimately, but NVIDIA excels a lot in rendering, so it's important to pay attention to that. NVIDIA wins where Blender is concerned.

  • @TheStickofWar
    @TheStickofWar 1 year ago +1

    To render the BMW scene in Blender 2.3 with Cycles GPU Compute on an AMD 6800 XT takes me about 2-3 minutes. I am not sure why I can't get Blender to utilize my GPU, even though it recognizes it and the drivers are up to date.

    • @cazmatism
      @cazmatism 5 months ago

      turn on HIP in preferences
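
      For reference, a minimal sketch of that same toggle through Blender's Python API, assuming Blender 3.0+ and a HIP-capable Radeon (run it from the Scripting tab or via blender --python):

          import bpy

          # Point Cycles at the HIP backend (Blender 3.0+, supported Radeons).
          prefs = bpy.context.preferences.addons["cycles"].preferences
          prefs.compute_device_type = "HIP"

          # Refresh the device list and enable every HIP device found.
          prefs.get_devices()
          for device in prefs.devices:
              device.use = (device.type == "HIP")

          # Make the current scene render on the GPU.
          bpy.context.scene.cycles.device = "GPU"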

  • @HARIsubha1
    @HARIsubha1 3 years ago +2

    Hi, can you please compare the RTX 30 series with the RTX A4000 & A5000?

    • @Techgage
      @Techgage  3 years ago

      Will do if we can ever secure cards from NVIDIA. The global chip shortage is not helping with this right now ;-)

  • @ryanlaseter7626
    @ryanlaseter7626 3 years ago +4

    Appreciate this video. I happen to have three 6700 XTs and a 3070 right now for mining, but I want to get into 3D art again. Thinking about having the 3070 in my main rig and setting up a render machine to use the three 6700 XTs for full renders. Good to see that the 6700 XTs are decent, and three of them could beat out a single higher-end Nvidia card in theory!

    • @DerekDavis213
      @DerekDavis213 2 years ago

      Three 6700 XTs will be much more expensive than a single RTX 3070! How come Blender doesn't properly support Radeon cards?

  • @vinidoidopixel
    @vinidoidopixel 1 year ago +2

    I would love to build my new desktop with a Radeon, but I'm an architect and I need good performance in these programs in addition to gaming. So Nvidia it will be.

  • @mohamedrafik9271
    @mohamedrafik9271 1 year ago +1

    Thank you for the video, bro) I would be so happy if you saw my comment and answered my question! I'm confused and I want to buy the right GPU. I want a GPU for 1440p gaming, but I also do some After Effects and I'm trying to learn Blender. I found the RX 6700 XT and the 3060 Ti in the same price range, though the 3060 Ti is 25 euros more expensive. I am heavily inclined towards the RX 6700 XT for its 12GB of VRAM, because having 50% more VRAM feels safer for the future; I've seen many games using at least 5 or 6GB of VRAM. But I'm also a bit inclined towards the 3060 Ti, because everyone is saying the RX 6700 XT is garbage for After Effects, Premiere, and also Blender. Is the difference in performance and productivity so huge that it's worth sacrificing the RX 6700 XT and its 12GB of VRAM for the 3060 Ti with its 8GB? I also want to know which card holds the better reputation, value, and future-proofing in your opinion. Thank you so much! P.S. These are the only cards available to me rn.

    • @ragnarthomsen3914
      @ragnarthomsen3914 1 year ago +2

      I have also taken a close look at both cards, and it's worth noting that overall Nvidia tends to offer much better performance and support for creative media applications like After Effects, Premiere, and Blender. In tasks like video editing, rendering, and other creative work, AMD GPUs have been known to struggle compared to their Nvidia counterparts. The biggest reason for this is that AMD's driver support mainly focuses on gaming; they seem to have forgotten creative media completely.
      As for 1440p gaming, they both have really similar performance. The VRAM difference will not make a big enough performance impact to justify getting the 6700 XT. If you're actually worried about the difference in VRAM down the road, I would rather wait for the 4060 Ti 16GB, which would give you a 100% increase in VRAM.
      The 3060 Ti would be the best choice, considering you will use it for creative programs as well as gaming.

  • @NipahAllDay
    @NipahAllDay 3 years ago +9

    I have a 6900 XT. While Nvidia has better support, I really don't want to touch their products. I just hope that by the time I get serious about Blender, AMD has better support for Radeon GPUs (or at the very least for their future GPUs).

  • @juanocybe
    @juanocybe 2 years ago

    Does the LHR version decrease rendering performance?
    I have an ASUS TUF 3080 OC LHR.

  • @polynightingale3969
    @polynightingale3969 3 years ago

    Hi, can I get your advice on which GPU I should select for rendering in Blender? I was recently rendering in Blender using an RTX 3080 with a 5900X as my processor, and my card went faulty during rendering; I don't understand why this happened. This is the second time a card has stopped working, specifically in Blender, and now I am planning to change my GPU to an AMD 6800 XT. I had an MSI Gaming X Trio and after that an Inno3D card, and both failed. Should I stick with the RTX 3080 or change to AMD? Or should I just downgrade my rig to a 5700 XT? Any advice at all? Please help!

    • @Techgage
      @Techgage  3 years ago

      Apologies for the slow response! Is it only Blender that you're finding issues with? Is there a particular instance when the issue will arise? We've honestly had no problems with NVIDIA in all of our testing, but run into issues with AMD often. The most current Radeon driver crashes during some of our tests. You can read more about that here:
      techgage.com/article/blender-2-93-performance-best-cpus-gpus/
      We plan on doing a video, but have been running behind. AMD right now is less stable (from our testing), and doesn't run the viewport as fast as NVIDIA. It's really too bad you are seeing issues; I feel like there is something else at fault that's causing the GPU to screw up. Just to make sure, are you using two separate power cables to plug the card in (to the adapter), or are you using a single two-headed cable? We've had issues in the past using only a single daisy-chained cable, so I am curious if this is your issue.

    • @polynightingale3969
      @polynightingale3969 3 years ago

      @@Techgage Hi, thanks for the reply!! I am actually using a single two-headed cable to power the GPU. The cards crashed mostly when I was using the OptiX denoiser in Blender during rendering, so the single cable might be the issue here. Please update your review on AMD cards; I would very much like to see the reviews. And one thing more: is the RTX A4000 worth it over the RTX 3080 for rendering purposes?

    • @Techgage
      @Techgage  3 years ago

      If you happen to move to two separate cables, and the issue goes away, I'd like to hear about it. As for A4000 vs. 3080, the A4000 is closer in spec to the 3070 Ti, whereas the 3080 is close to A5000. The A5000 would be good for those who can benefit from Quadro-esque optimizations, but both A5000 and 3080 will perform similarly in rendering.

  • @ivanerast
    @ivanerast 1 year ago

    Thanks for the useful tests!

  • @alteredmeshyt
    @alteredmeshyt 1 year ago

    Amazing video

  • @irfanayari1333
    @irfanayari1333 1 year ago

    Have you retested this with AMD HIP support in Blender?

  • @OwnedByChristian
    @OwnedByChristian 9 months ago

    So I am choosing between the RTX 3060 12GB and the RX 6700 XT; in gaming, the 6700 XT wins. Does this video show that it is also the same speed in rendering, if not faster, than the 3060? In conclusion, if I want to do gaming but mostly want to use it for editing and rendering, should I go for the 3060 (306 dollars in my country at the moment) or the RX 6700 XT, going for 349 at the moment?

  • @lhbbq
    @lhbbq 2 years ago

    @Techgage - Is there any reason the Nvidia 3090 and 3080 Ti are almost identical in performance? If you wanted a dual-3XXX setup, why the 3090 over the 3080 Ti?

    • @bobanders6672
      @bobanders6672 2 years ago

      Because memory allocation in SLI does not double, unlike compute performance. So with two 3080 Tis you'd still be stuck with 12GB max, whilst with two 3090s you'd have 24GB.

  • @carlosleite1215
    @carlosleite1215 1 year ago

    Recently I created a mining rig with 9 AMD GPUs. As mining may not work anymore, I was thinking about getting five 6800 XTs and making a render farm. Do you have any idea how I could do that using the same rig? I'm open to making it a project.

  • @alenalen2237
    @alenalen2237 1 year ago

    Hi, I want to buy a desktop for 3D modeling (I will use Rhino Matrix), and I'm torn between
    two GPUs, the RX 6800 and the RTX 3070/3070 Ti, but there is a huge difference in price. I will use that desktop for learning, so I want to know whether it's worth paying that much more for the 3070/3070 Ti.

  • @SoylentGamer
    @SoylentGamer 1 year ago +1

    As much as I see reviewers cheering that Nvidia is dead, I really don't see its hegemony in the productivity space budging. Even if AMD ups their performance a lot, Nvidia has always had more software engineering resources.

    • @Techgage
      @Techgage  1 year ago +2

      Some people might think NVIDIA is "dead" from a gaming perspective, because if AMD delivers performant Radeons at better prices, then team red is going to look attractive. But creator workloads are different... if NVIDIA ends up halving rendering performance again with the new generation, then at least one crowd is going to be stoked about it. At least on paper, it looks like that could happen. Real testing might show a different story. Time will soon tell.

    • @SoylentGamer
      @SoylentGamer 1 year ago +1

      @@Techgage Nvidia's software suite is also easier to use, and more fully featured from my experience. Nvidia will still command some premium brand mindshare at least for this generation. However the cracks are definitely showing for gamers. Most gamers on a budget are definitely going to reach for AMD this gen for sure.

  • @rickybustillos
    @rickybustillos 2 years ago

    Can you remake these tests with newer drivers?

  • @anonime2932
    @anonime2932 2 years ago

    Please write down which video cards you compared in the scene renders.

  • @tusharkaidal
    @tusharkaidal 1 year ago

    I recently purchased the AMD Radeon™ RX 6700 XT graphics card. My workflow is basically divided between Blender, V-Ray, and Lumion. Did I make a mistake in selecting the card? Or do any new updates to Blender speed up processing on AMD GPUs?

    • @iaia5368
      @iaia5368 1 year ago

      No, it's a market-politics problem of the company. AMD is good, but... not enough. Buy an RTX 3070 for this instead.

    • @herroberbesserwisser7331
      @herroberbesserwisser7331 1 year ago

      I have seen AMD cards perform up to 2 times better under Linux, and multiple people have said the same thing. The main difference is that under Linux you use a completely different compiler, which translates the code better and faster. AMD can gain good ground there, and even if you use WSL to run the Linux version of Blender under Windows, it runs better. I would say that if you want to be productive, Linux can help a lot if you use AMD.

  • @Nathan__A
    @Nathan__A 3 years ago +1

    The 3080 Ti benchmarks just below the 3090, which, yeah, is good; however, the price is pretty much exactly the same.
    Why even bother to release the 3080 Ti at the same price as the 3090?
    Why would anyone even bother?

    • @Techgage
      @Techgage  3 years ago +1

      It could be that the RTX 3090 supply will dry out sooner than we think. In some ways, the RTX 3090 probably should have been a TITAN card, and the 3080 Ti the top-end GeForce.

  • @mikekorsunov3261
    @mikekorsunov3261 1 year ago

    Hello, thank you for the great video. Could you please make a comparison of how Radeon and Nvidia perform in Substance Painter texturing?
    (Texturing a single object with one UV shell, baking, and texturing with several UV shells (UV tile mode).)
    It would be really great to see how memory use scales. I'm pretty sure Nvidia will win the baking mode; I'm interested in
    texturing, or just rotating the scene to see freezes.

  • @Mo-fq6kx
    @Mo-fq6kx 3 years ago +3

    Well, AMD is being stupid in neglecting video encoding.

  • @MooseX2012
    @MooseX2012 3 years ago

    Will you ever test the MacBook Pro for these?

  • @CaspianWolf26
    @CaspianWolf26 2 years ago

    Is there an AMD HIP test comparable to these?

  • @suryono7219
    @suryono7219 2 years ago

    For the next one, can you add a Maverick Render benchmark?

  • @Lerdes
    @Lerdes 2 years ago

    Hi sir, great video! I am building a rig right now for Lumion and I can't choose the GPU. I can buy a 3080 Ti or a W6800, but I can't afford a 3090. I have to finish the build, so I can't wait for better prices. Can you help me, please?

  • @DeathCoreGuitar
    @DeathCoreGuitar 1 year ago

    Well, now you must do AMD HIP benchmarks in Blender 3.0+.

  • @Prajwal____
    @Prajwal____ 2 years ago

    Will a Quadro T600 be good for me, paired with a Ryzen 5 3600?

  • @fransmurdani83
    @fransmurdani83 2 years ago

    Can you do price-to-performance and performance-per-watt? Many do it, but only for gaming.

  • @NNG_yt
    @NNG_yt 2 years ago

    I am a beginner in a game design course, a student who hasn't started learning yet but will this year. Currently I have absolutely no experience in 3D, and I don't understand what I will need, or what will be sufficient without difficulties, when purchasing a gaming laptop. For the coming 4 to 5 years, will a Ryzen 7 5800H with a 3060 work for me, or is it better to go with a 3070? Both at max TGP. Also, the CPU is fine, right? I wouldn't need Intel-level single-core performance, right?

  • @AK-qp2pz
    @AK-qp2pz 2 years ago

    RTX 3050 vs. RTX 2060: which is best for Maya 3D rendering?
    These 2 graphics cards are in my budget. Please advise 🙏

    • @SHUTENSEPC
      @SHUTENSEPC 2 years ago +2

      ofc 2060 lol

    • @AK-qp2pz
      @AK-qp2pz 2 years ago

      @@SHUTENSEPC thanks man

  • @trevor4664
    @trevor4664 2 years ago

    You have a Radeon Pro sitting on top of your PC, but you don't include it? It's a $700 card, cheaper than most cards you used, especially since this is a rendering video. Most people will prefer workstation hardware over gaming hardware anyway, so why not include the pro models?

  • @juanocybe
    @juanocybe 2 years ago

    Just got my ASUS TUF 3080 OC, I'm beyond happy!

  • @mahmoudmowafy2872
    @mahmoudmowafy2872 2 years ago

    thank you bro

  • @clausbohm9807
    @clausbohm9807 2 years ago

    Great stuff. I just replaced my two 1080 Ti hybrids with a 3090/3090 Ti combo... a 4x improvement in the Blender demo files (using OptiX). When running the Barbershop demo for 50 frames, I get a 45 sec/frame render time when undervolting my two GPUs. When running them full out (I wouldn't suggest that) I get an average of around 41-43 sec/frame (single-frame rendering doesn't give you a true picture). As for this video, things have changed by the time you get to Blender 3.2, so for those of you with team red cards, this is just one comparison. One more comparison: I ran the Gooseberry demo for a few frames in undervolt mode (0.875V-1455MHz / 0.925V-1950MHz) and got an average of 33.5 sec/frame. Full out there's not much difference, but undervolting saved over 200W (allowing me to use my 1000W PSU) and several degrees C (FYI).

  • @arypramudito2963
    @arypramudito2963 2 years ago +1

    Please add Twinmotion in the next video, and include a performance-per-dollar ratio. Also check how big a model has to be to eat 8GB of VRAM; a lot of 3070s and 3060s are sold with 8GB. Put another way: is 8GB of VRAM sufficient?

  • @gianlucapx
    @gianlucapx 1 year ago

    Time to make an update. Blender 3.0 ditched OpenCL, and AMD now uses a new API called HIP. It can only be used if you install a custom AMD driver, and it supports only current-gen cards, not older ones like the RX series. HIP is slower than CUDA.
    AMD is useless if someone uses certain programs, and it is significantly slower than NVIDIA in the programs that support both manufacturers.

  • @tomaszwaszka3394
    @tomaszwaszka3394 2 years ago

    15:12 Final Thoughts: hmmm... a little out of sync (voice and video)... but the content is very interesting, while others only test games...

  • @nanezferrer3565
    @nanezferrer3565 2 years ago

    AMD RX GPUs are not made for 3D rendering???

  • @banmuanhacom
    @banmuanhacom 2 years ago

    Why am I only seeing this test in 2022 :))))

  • @chadyonfire7878
    @chadyonfire7878 1 year ago

    Thanks, useful.

  • @k7rie
    @k7rie 1 year ago

    Currently the 6700 XT in my country is $100-200 cheaper than the 3060 Ti,
    so I think I'll get the 6700 XT for now..

  • @shoopdawhoop
    @shoopdawhoop 3 years ago

    Just to cut to the chase: do AMD CPUs like the Ryzen 2700 or 5950X support OptiX ray tracers? Someone told me no, so I want to double-check.

    • @Techgage
      @Techgage  3 years ago +1

      You just need an NVIDIA GPU for OptiX. The CPU installed doesn't matter. Even GeForce cards without RT cores can use OptiX, but it's obviously much better on the cards that do have them. I've never seen a huge benefit from it, but OptiX does allow GPU + CPU rendering, which would still work on those AMD chips.
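
      To illustrate, a minimal sketch of switching Cycles to OptiX and enabling the hybrid GPU + CPU rendering mentioned above, via Blender's Python API (assumes an NVIDIA GPU and a recent Blender; the names follow the stock Cycles add-on):

          import bpy

          # Select the OptiX backend; this needs an NVIDIA GPU, while the
          # installed CPU doesn't matter.
          prefs = bpy.context.preferences.addons["cycles"].preferences
          prefs.compute_device_type = "OPTIX"

          # Enable the OptiX device(s) *and* the CPU for hybrid rendering.
          prefs.get_devices()
          for device in prefs.devices:
              device.use = device.type in {"OPTIX", "CPU"}

          bpy.context.scene.cycles.device = "GPU"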

  • @leucome
    @leucome 3 years ago +2

    The Eevee demo is supposed to be 64 samples; 1000 is too much. The demo renders really fast at 64 samples and looks fantastic. So I can't be sure the final ranking is realistic, because the settings are weird. I tried 1000 samples for fun, and my 5700 XT matched your 6800 XT result. So AMD GPUs are indeed faster than that with Eevee... on Linux/Mesa OpenGL. I did not expect the Windows OpenGL driver to be that slow. I knew Mesa was better, but 30 to 40% faster is crazy.
    Other than that, I got the same numbers as your test for Cycles with ROCm OpenCL on Linux.

    • @Techgage
      @Techgage  3 years ago +1

      The 1,000 samples for EEVEE is just for the sake of easier benchmarking. I've run the same render as an animation encode with 64 or so samples per frame, and the scaling came out the same. It's just cleaner output (in results files) when it's a single frame. Also, rendering through the GUI and CLI will result in slightly different results, and with Radeon in particular, driver version can impact a lot.
      Ultimately, you may not be able to directly compare in all cases, but all of our GPUs are tested with the exact same settings.
      Just to be clear, you are saying EEVEE performs better in Linux? With 2.92, I tested Linux and Windows both, but for GPU, I was seeing identical results, so I didn't bother doing GPU in Linux. But that said, I am not sure I compared the performance from EEVEE (only Cycles), so I am going to have to do that at some point soon.
      techgage.com/article/blender-2-92-linux-windows-performance/
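
      As a rough illustration of the CLI benchmarking mentioned above, a minimal sketch that times a headless render of a hypothetical scene.blend (the path, engine, and frame number are placeholders; Eevee also needs an OpenGL context, so this assumes a desktop session):

          import subprocess, time

          # Render frame 1 headlessly; swap BLENDER_EEVEE for CYCLES as needed.
          start = time.perf_counter()
          subprocess.run(
              ["blender", "-b", "scene.blend", "-E", "BLENDER_EEVEE", "-f", "1"],
              check=True)
          print(f"Render took {time.perf_counter() - start:.1f} s")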

    • @leucome
      @leucome 3 years ago +1

      @@Techgage Yes, with Eevee at 1000 samples and Blender 2.93. For Cycles I get the exact same results as you, so like you said, Linux and Windows are about the same with Cycles. The difference is with Eevee. I retried Mr Elephant at 1000 samples on Linux: with the AMD Radeon driver I get 88 sec, and with the Mesa driver I get 72 sec.

  • @Mohammad_Imran_
    @Mohammad_Imran_ 2 years ago

    Help: previously I had C4D R19 and R20 with Redshift 2.6.41 (all pirated) working just fine.
    Now I just bought an RTX 3060, and my Redshift is not working.
    I even tried to install Redshift 3.0, and it also didn't work.
    Does anybody know what versions of C4D and Redshift will work with an RTX 3060 card?
    Obviously I don't have much knowledge about these.
    Thanks in advance.

    • @siddharth-wy2kp
      @siddharth-wy2kp 2 years ago

      how to render with an AMD GPU?? help..

    • @NNG_yt
      @NNG_yt 2 years ago

      @@siddharth-wy2kp aww

  • @kuunami
    @kuunami 2 years ago

    Is it possible to use Redshift with a GTX 970?

    • @Techgage
      @Techgage  2 years ago

      Yes. According to Redshift's tech documents, it requires a CUDA 5.0+ GPU. GTX 970 supports 5.2, so it should still be good to roll. If you've not used Redshift before, there's a trial you can take for a spin.
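
      A quick way to check a card against that CUDA 5.0+ requirement, sketched with nvidia-smi (the compute_cap query field needs a fairly recent driver; a GTX 970 should report 5.2):

          import subprocess

          # Query the GPU's CUDA compute capability; Redshift wants 5.0+.
          print(subprocess.run(
              ["nvidia-smi", "--query-gpu=name,compute_cap", "--format=csv"],
              capture_output=True, text=True, check=True).stdout)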

    • @siddharth-wy2kp
      @siddharth-wy2kp 2 years ago

      how to render with an AMD GPU?? help..

  • @willianpareira1930
    @willianpareira1930 2 years ago

    The 1080 Ti is still badass in games, but now it melts in rendering.

  • @AlphaKilo09
    @AlphaKilo09 3 years ago

    Anyone who can answer, quick question: do Quadro GPUs require workstation motherboards only, or do they work just fine on any decent gaming motherboard as well? I mean, if the card has a PCIe interface, it should work fine, right? Or is there anything more to this? I have an R9 5900X CPU and a MAG X570 Tomahawk motherboard. I use Siemens NX. Should I go for the RTX 3060 12GB, the RTX 3060 Ti 8GB, or the Quadro RTX 4000 8GB?? Please help.

    • @Techgage
      @Techgage  3 years ago +2

      You can use a gaming motherboard just fine! If you can swing for the Quadro RTX 4000 for SNX, I'd recommend going that route. That's one CAD suite that will actually notably perform better on a workstation card over a gaming one:
      techgage.com/article/best-gpus-for-workstations-viewport-performance-of-catia-solidworks-siemens-nx-blender-more/

    • @AlphaKilo09
      @AlphaKilo09 3 years ago +1

      @@Techgage Thank you for replying, Rob. After seeing your article, it's crystal clear now. Thanks for the awesome article; it was exactly what I was looking for. Appreciate it a lot, and thanks again.

  • @abc00888
    @abc00888 1 year ago

    I wanted to try an AMD GPU before watching this video, but it seems OptiX rendering is more powerful. (Hope AMD can beat Nvidia within a decade.)

  • @DerekDavis213
    @DerekDavis213 2 years ago

    Now that Blender 3.2 is out, how about another AMD vs. NVIDIA video? AMD is much stronger now.

    • @godofdeath8785
      @godofdeath8785 2 years ago

      I'm thinking about maybe trying to buy an RX 6800 XT in the future and using it for games and montage, rendering, and 3D. Do you think this GPU will be good enough for that?

    • @DerekDavis213
      @DerekDavis213 2 years ago

      @@godofdeath8785 The 6800 XT is a great card for gaming, 3D, etc.
      Wait a few months, though; the next-generation cards from AMD and NVidia are releasing then.
      The older 6800 XT will get a big price reduction for sure.

  • @romulocastilho6506
    @romulocastilho6506 2 years ago +1

    My disadvantage is being Brazilian; I can't even buy a TV hehehehe

    • @NNG_yt
      @NNG_yt 2 years ago

      Damn, why bro? That's horrible.
      Not that you don't have a TV, but that you can't buy one.

    • @romulocastilho6506
      @romulocastilho6506 2 years ago +1

      @@NNG_yt Because here in Brazil everything is expensive; it is very difficult to buy an RTX.