@@Slickstaff_Stainpants Interesting! For these benchmarks, I normally just keep the same settings across tests, and I don't always check out the result. I will keep this in mind for final renders! Thanks!
Hello, you can activate a Denoiser for AMD ProRender too. If you go to View Layer Properties (2 Blocks under the Render Settings Tab) you can activate the RPR Denoiser, i recommend leaving it on Machine Learning. This will help to get rid of the white noise in your scene.
for ProRender, if you want speed (and less noise) use the RPR Uber shader - conversion eats some time, plus stuff like glossiness usually has to be fine-tuned. also you need to adapt the world settings for the sky. as for textures, it depends how you set them up; some procedurals need baking (there's a button)
Awesome! Thank you for this. I will have to re-try ProRender with these considerations. There is a lot of interest in ProRender so I am glad to hear there are ways to make it run better.
Well, at least for me, the only option for GPU rendering is ProRender configured to use OpenCL. It's unlikely that HIP will ever get support for older GPUs, and even if it did, the latest version of ROCm requires PCIe atomics, which means you have to have PCIe 3.0 in your motherboard and possibly your card as well. This is extremely frustrating considering I could take the puny 1 GB GT 710 I have in a box, put it into my PC, and it supports the latest version of Cycles X without anything having to be done about it.
Yes it is super frustrating. People say "it is not AMD to blame, it's XXX". But if you want people to buy your hardware, maybe help support it on the software side too. What GPU from AMD are you using? I have a few RX 580s. I had not thought to try those in ProRender yet. Wonder if that would work.
@@ContradictionDesign I have an RX 550 which I now have set up using the Adrenalin Pro drivers. It works well, even though rendering times can feel slower since it renders in sample multiples rather than one at a time. Currently using the latest version of ProRender (3.6). I don't know if using HIP would yield more performance, because ROCm 4.x dropped support for GFX8 GPUs, which includes Polaris.
@@BrunodeSouzaLino Very interesting! Thank you for the information. I will try a 580 in ProRender now I think. May as well use them. I also really doubt HIP will ever support Polaris.
@@BrunodeSouzaLino sampling in multiples is faster, as the biggest time loss is in memory copies and bus transfers, which you have to do in order to update the preview - this stalls the whole rendering pipe the most. Fewer updates = faster compute on Polaris.
I saw some Blender 3.5 render results from the Linus Tech channel (the video on the 4060 Ti, I guess) where the RX 6700 XT is faster than the RTX 3060 12GB (3m 59s vs 3m 32s). BUT, and I think this is some dirty game, according to the Blender Benchmark the RX 6700 XT scores 1493 while the RTX 3060 12GB does 2432. This is weird and frustrating. I may be wrong, as there are all those HIPs, CUDAs, OptiXes, etc.
Yeah generally Nvidia on Optix will beat AMD on HIP. If they test Nvidia on CUDA, they say it "levels the playing field". But there is no reason to use CUDA instead of Optix. So it can be misleading. The reason the gaming guys do this is to try to compare the hardware on a raster performance level. Rendering in Blender is more about RT performance now, so HIP vs Optix is the only good Blender test for AMD vs Nvidia. So if you look at benchmark lists for rendering, look for Optix and HIP. 6700 XT is a great card in general though. HIP-RT is supposed to come to Blender in 3.6, so that may give AMD cards a slight boost as well.
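To make that kind of list-reading concrete, here is a tiny sketch that turns benchmark scores into a score-per-dollar figure. The scores are the Blender Benchmark numbers quoted above; the prices are placeholder assumptions for illustration, not current market data.

```python
def score_per_dollar(score: float, price: float) -> float:
    """Benchmark score (samples per minute) per dollar spent."""
    return score / price

# Blender Benchmark scores quoted above; prices are assumed placeholders.
cards = {
    "RX 6700 XT (HIP)": (1493, 350.0),
    "RTX 3060 12GB (OptiX)": (2432, 330.0),
}

for name, (score, price) in cards.items():
    print(f"{name}: {score_per_dollar(score, price):.2f} points per dollar")
```

Swap in real street prices before drawing conclusions; the point is only that the comparison should use OptiX and HIP scores, not CUDA.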
The cores on both GPUs are quite different. There might actually be an architectural advantage on AMD's side in this case. Nvidia are very focused on software, AI and server GPUs. They do not care much about gaming and have strong software productivity support already.
I just started using Blender on my AMD gaming rig and this demonstration + my own experiences show me that HIP is not as nice as CUDA or OptiX but definitely more usable than ProRender at this moment. I tested it on an AMD Ryzen 9 3900X + AMD Radeon RX 7900 XTX.
Hey there is some good news though. In an upcoming release of Blender they are adding HIP-RT support, which will make AMD GPUs at least 25% faster, according to the rumors. So there is still hope that AMD software support will make its products competitive with NVIDIA. How is your 7900 XTX doing for you otherwise?
@@ContradictionDesign Gaming performance is of course crazy fast. It's hard not to create a CPU bottleneck when combined with a 3900X. Raytracing performance is nicely improved compared to the 6000-series cards. Power consumption on desktop and during video playback was fixed 2-3 months ago - now it only takes 13 watts at idle and 40 watts during video playback. The improved Xilinx-based video encoder is amazing. You can get up to 1000 fps in H.264 or H.265 using Handbrake. I wasn't able to test hardware-accelerated AV1 encoding as Handbrake doesn't support it yet. And the Cycles HIP performance is good enough if you don't have a direct comparison to OptiX. I remember using software raytracers on single-core CPUs and letting them render overnight to create one single 480p frame. Seeing pathtracing done in seconds feels unreal.
@@edgartheface Oh yeah having any modern card is still so much faster than CPU work of the old days. So really it doesn't matter which GPU you have because they are all "fast". I have used AV1 encode for recording my content, and I will stream sooner or later. When you get to use it you will love it. Smaller files, less frame drops and 0 performance hits. Absolutely fantastic. Thanks for sharing about your system!
Can you compare the new HIP-RT option of Blender 3.6 on an AMD card vs. using OptiX on an RTX card? I'm curious if it finally means choosing an AMD card doesn't mean missing out a ton on Cycles performance at an equivalent price.
My AMD GPUs are up to 6700 XT. I do not have any 7000 Series yet. When I tested the new HIP-RT on that the other day, I got very similar render times to HIP. I am wondering if the 7000 series benefit more, or if the drivers have not been optimized for HIP-RT yet. I will continue to test until I can make a conclusion for sure though. Also, I have a friend on here that has seen a 25% increase in render speed on a 7900 XT from this update. So there is some hope, but it will likely not make AMD comparable to NVIDIA for the $ per frame. More like, it is a free speed boost, but not life changing. Anything helps though
@@ContradictionDesign Thanks for the reply! I think there's a reasonable question of balancing multiple aspects between choosing Radeon and GeForce beyond "is it just as good at this one thing?" For example, if I prioritize gaming, I wonder how much I'd feel ripped off if I get superior Cycles render times but my gaming experience is significantly worse at the same price. So I'm looking for "is the Cycles performance difference acceptable considering I'm getting more out of the card in the other aspects I care about?" The balance between getting a card for gaming vs Blender is ultimately subjective, but it'd be nice to have numbers to base a decision on.
@@rhoharane alright sure! I would say, they are fairly even in gaming between Nvidia and AMD. But cycles is still much faster on Nvidia for the same price of GPU. So if you prioritize gaming, then an AMD GPU can be a good value at the right price. But I would say for now, Nvidia is the better choice if you prioritize Blender.
I was just trying to get rid of those white dots in 3ds Max AMD ProRender, but the documentation is quite lacking. Ended up installing Blender, importing the FBX file into it, and getting new materials from BlenderKit. I'm using Cycles HIP right now to render an animation. If you decide to redo the tests with a newer graphics card, you can send me the Blender file and I can test it with a 6950 XT and 5800X3D and give you the results. I'm curious what the performance difference is.
I have checked the ProRender website to see if they have new versions, but it looks like they don't update it much. It definitely doesn't use RT cores properly yet. But I can always retest just to be sure.
@@ContradictionDesign So it's an RT problem? In the 3ds Max plugin, there's an option for AI denoising, which does get rid of the dots, but produces terrible artifacts. Although... I can't really blame them for not updating it - especially for 3ds Max. If you have enough money to go with Autodesk software instead of Blender, you probably have money for Nvidia hardware. Oddly enough, there are people still working on AMD ProRender plugins. On GitHub, the plugin repository for Blender was updated 2 days ago, while the 3ds Max one was last updated Feb 5, 2021 (weekly dev build).
@@SapereAude1490 Well I mostly just mean I'm not convinced they update ProRender much. So I think it's just out of date. Blender's HIP-RT looks good and runs fine. So it makes sense to just use that in Blender.
Thank you for informative video! I have a question. Are AMD GPUs like RX 6700XT or 6750XT okay for 3d applications like Blender, Substance Painter, Marmoset Toolbag, etc.?
Hi! You are welcome. AMD GPUs will work just fine in most software. Check the documentation before you buy the program or the GPU, just to be sure. With Blender, there are no rendering engines that properly utilize AMD RT hardware, and AMD is generally slower than Nvidia for Blender. They will work fine, they just are not as fast currently. For Blender at least, I recommend buying Nvidia if Blender is your primary use case. If you are just casually learning Blender for fun, then Nvidia or AMD will do just fine. Let me know if you have other questions! Oh, and I use Blender, Substance Painter, Unreal Engine, and some other addons in Blender. So for the other software you mentioned, I am not sure how fast AMD hardware is.
Do you know an alternative to OpenImageDenoise that uses the GPU, or if something is planned for GPU denoising while using HIP? I was curious and installed Blender on my laptop with a GeForce RTX 3070 Ti, and OptiX is already very nice... but having a denoiser that doesn't clog up the CPU and works fine in the viewport is absolutely mindblowing. What is your preferred denoise workflow? OptiX, OpenImageDenoise, or is it even possible to render out frames without denoising and do that in a hotfolder from an external workstation?
I use OptiX denoise for rendering. My rendering is done almost entirely on Nvidia GPUs for now. I test them to keep up with hardware in the hopes that AMD will get it figured out. But yeah, the OptiX renderer with OptiX denoise is my go-to.
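For anyone who wants to script that workflow, here is a minimal sketch for Blender's Python console, assuming the stock Cycles add-on and an Nvidia GPU with OptiX available (property names as in recent Blender releases):

```python
import bpy  # Blender's Python API; run this inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# Denoise final renders with OptiX ('OPENIMAGEDENOISE' is the CPU/OIDN alternative)
scene.cycles.use_denoising = True
scene.cycles.denoiser = 'OPTIX'
```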
I tried ProRender with an RX 570 8 GB and it has a major crash issue. It crashes even when I just enter the materials tab - instant crash. Tried on 3.6, 2.9 and 2.8 with no luck.
Yeah the RX 500 series is not supported in Blender 3.0+, but I think it was in Blender 2.8 and 2.9. I have always had a hard time with them, even when they were supposed to be usable.
@@ContradictionDesign is there some way I can use my GPU for rendering? ProRender is free but has compatibility issues. I used to work in C4D but was thinking of jumping to Blender because of its ease of use and tools; however, I can't use my GPU in Cycles because, again, it's not supported by HIP due to the old architecture.
@@HaelstormGaming I think as far as blender goes, you probably won't be able to use the 570 anymore unfortunately. I don't know about other software though.
@@ContradictionDesign yeah it kinda sucks. The 570 is old but still kicks in gaming; sadly, rendering has a lot of compatibility issues. Alright, I'll see what I can make of my GPU 👍🏻
Any help? I have an RX 6700 XT, and as soon as I select the ProRender engine my GPU memory fully loads the 12 GB of VRAM it has and then my screen freezes. Does anyone know what I can do and which setting could be wrong? Thanks in advance
Technically yes, but when I tested it for this video it was slower than HIP in Blender anyway. But maybe I should retest it with Blender 3.6 or 4.0, and on the 7800 XT, because this video is pretty old now.
I am watching for a 7800 XT so I can test it. I have seen that, in general, AMD is still not as fast in Blender per dollar. So I expect the 4070 will still beat the 7800 XT by quite a bit more than the price difference. But we will see!
@@ContradictionDesign I just wanted to offer to help with testing AMD and Nvidia cards if it helps. I'm launching a new channel next month relating to gear for content creation and currently have an Arc A770 16GB, 3060 12GB, 4070, 6800 XT, 7800 XT and 7900 XTX ready to run on a 13900K with 192GB of RAM. If you can put whatever assets/tests together that you want data on, I'd be happy to do a collaboration and send you the testing data. Also, I have a lot of PugetBench results for DaVinci Resolve collected in a spreadsheet and would be happy to share that. The main findings are that when it comes to H.264/H.265 performance, the 7800 XT saw a huge improvement over the 6800 XT in some applications (even if the GPU FX processing was more flattering to the 6800 XT). For video editing, the 7800 XT outperforms the 4070 in PugetBench extended for Resolve and Premiere - and in DaVinci Resolve Fusion the AMD x800 cards I tested outperformed the 4070 by 38-46%. I'm very curious as to whether AMD can see similar gains in Blender eventually. Right now there are some other 3D content creation applications (like Particle Illusion Standalone particle emitters in 3D mode) where the 7800 XT already has shorter render times than the 4070, apparently mainly due to decreased CPU overhead for the drivers compared to Nvidia cards (since particle emitters can get CPU limited easily in some applications). Overcoming the Nvidia gap isn't outside the realm of possibility - it just depends on the respective software support.
@@perlichtman1562 awesome! Thank you for this! I do have a fluid test for CPU baking. My recent videos have the info. Otherwise I use the Blender benchmark scenes for rendering. But in the future I will have more tests. Thanks for reaching out!
Awesome! 😃 I've never checked out or used AMD ProRender before. I've always wondered about using it. Thanks so much for sharing. I wonder if AMD ProRender is better optimized for use with AMD Radeon graphics cards. It would be interesting to see the render time on an AMD GPU. A 2x increase in render time over Cycles on an NVIDIA graphics card is huge! I know you can't use the AMD HIP setting in Blender since you have an NVIDIA graphics card, so the only way to denoise would be to use OpenImageDenoise with ProRender? It was really cool to see the comparison between Cycles and ProRender. I noticed the texture of the table the glass is sitting on is different in ProRender. Is this due to the texture settings not working with ProRender? Great video my friend! 😃
This test was with two AMD RX 6600 GPUs. ProRender was slower for them than Cycles with HIP. I have not tested an Nvidia GPU on ProRender yet. And yeah, the texture issues were something to do with ProRender.
@@ContradictionDesign Oh man. I totally missed the part where you mentioned this is actually on AMD GPUs. There's even text in the video showing that. 😂 .....and you even mentioned the issues with the textures in the video as well. Sorry about that. Thanks again for the video! It was great to see. 😃
Hello sir. More than a comment on this video, I am here finding some answers - answers of life. I am going to build a PC for gaming, Blender, and all the college stuff. Thinking about going with a Ryzen 5 7600X and RX 6750 XT for gaming and Blender, or maybe DaVinci Resolve. As you are a professional, I want your opinion(s) if possible. Guide me to the right path sir, because the 4060 Ti with 8 GB makes me question a lot of things.
How serious will your Blender use be, relatively? I can say with gaming as one of the use cases, something with at least 12 GB of VRAM is a good idea. I think the RX 6750 XT will run very well for you, and there are some good sales on them now. This card will lag behind Nvidia alternatives in pure Blender rendering speed, but the question is, does that even matter for you? The 6750 XT will still render frames quickly enough if you are just learning or starting out. Unless you are a studio that makes more money with the most optimized GPU for rendering, you will do great with most modern GPUs. I would compare to an RTX 4070 and see what you think, because they have lower power draw and smaller coolers. AMD GPUs can compete in games, but not yet in Blender or other 3D rendering apps. For Resolve, I have only ever used Nvidia to edit videos. I know they are blazing fast in Resolve, but I am not able to comment on how well AMD performs. Maybe I can test this in the future. If your price limit is closer to the $400 US mark, then the 6750 XT might be the best value. I would probably avoid the 8 GB GPUs from now on unless they really check all the boxes for you, though. The 7600X CPU will be fantastic at these tasks. BTW, I am self-taught on everything I am showing you here on YouTube, so not technically a pro, because I also do not do this type of work full-time. But I will take that as a huge compliment. Hope this helps!
@@ContradictionDesign Honestly never expected such in-depth reply And I will be starting to work on blender for the very first time with this This will be my learning curve And one more thing Thank You for catering us so well You earned a sub 🙇♂️
Technically it should, but it is not supported in Blender 3+. When I try to run it in ProRender it causes instant crashes. It's definitely not a good idea to buy them for Blender at this point.
@@badmingos that seems to be the case. I can get ProRender to show the RX 580 as the render device, but as soon as I click render, it crashes. So I think even though ProRender supports it, Blender has a problem and can't run it. It is very unfortunate.
Hello, thanks for this. But what do you mean by "I do not recommend buying AMD GPUs just for Blender, but if you have them, these tests may help you know what to expect."? Does it mean you're unhappy with the performance you get for the price of the graphics cards?
Hi! So if you bought a GPU from AMD for gaming then feel free to use it for Blender. I would not buy one over Nvidia if your use case is Blender. The driver support and speed on software and hardware levels are much better with Nvidia than AMD for now. I hope AMD will catch up but they have not yet. So yeah. For gaming, AMD is great. For Blender, it's just not as good.
@@ContradictionDesign Thank you very much sir, you're a saint. I've been asking around and nobody has replied like this before, bless you. I hope you have a nice day
@@ContradictionDesign, it's not AMD that needs to catch up. Blender programmers are responsible for natively supporting AMD's newer architecture and even Apple's new M-series. Programmers for Blender figured out a way to use NVIDIA's code on AMD cards, but the code overhead kills performance. The same happens with any DirectX 11 title with ray tracing enabled on AMD RDNA 2+ cards, because the API is old and programmed to favor NVIDIA. Unreal Engine 5 natively supports AMD on an API level (DirectX 12) for Lumen and hardware RT. In the same way, Blender would need to support AMD by programming a renderer specifically for AMD GCN/RDNA. Sure, AMD and Apple could do it for Blender, but that would also give NVIDIA an ego boost.
@@captureinsidethesound I guess we can look at it either way, because someone has to make the render engine run well and nobody has yet. Basically, for the end user, AMD is not the best for Blender yet. So maybe I mean "AMD hardware needs more software support", and not just that AMD needs to catch up. Not trying to assign any fault. But AMD driver software is even a pain to use still. And I have had horrible luck with more than 1 or 2 GPUs on a motherboard with AMD. Could be my relative lack of experience with AMD software, but if I want to use the hardware and the software is not agreeable, it is just an extra roadblock. And the other part is hardware. AMD is just about to catch Nvidia and I hope they do. Dedicated cores for RT and AI in Nvidia GPUs have made them better for ray tracing rendering software up til now. The 7000 series is fantastic for gaming, and I am excited to see what happens next. I might need to look into testing GPUs on UE5. Very interesting. I bet AMD GPUs do a bit better there than in Blender? Anyway, thanks for the comments! Great stuff
Yeah when I made this video, it had very outdated info on the website to download it. So I started off suspicious haha. But I think Blender's native Cycles HIP-RT is way better now for sure. I do not think they continued to update ProRender
Yeah not yet. It just sucks because the 6800 xt would be fast with good software support and it has 16 GB VRAM. But the RTX 3060 murders it in rendering. Even the newer 7000 series AMD are slower than lower 30 series Nvidia in rendering. Which is weird, because the 7900 XTX is great for gaming, which is just a form of rendering haha.
@@Wingnut353 yeah that's what I've been reading. Maybe I will try the experimental one. Thanks for the idea! I would like to test some newer 7000 series AMD but I can't justify buying one just for Blender yet.
Hey! Let me know what you think. I want to do more tests like this soon. I have quite a few machines to play with so there are plenty of options.
Never render an animation with a noise threshold set - it causes flickering between frames. Better to set a fixed number of samples.
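In Cycles settings, that tip corresponds to disabling adaptive sampling and fixing the sample count. A minimal sketch for Blender's Python console, assuming the stock Cycles add-on (the sample count is just an example value):

```python
import bpy  # Blender's Python API; run this inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Turn off the noise threshold (adaptive sampling) so every frame
# gets the same number of samples and the animation does not flicker
scene.cycles.use_adaptive_sampling = False
scene.cycles.samples = 256  # fixed samples per frame
```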
Ohhh very nice, thank you. I will try it out and compare results again. I appreciate the tip!
@@SapereAude1490 Well, maybe I'll try it again soon.
Thank you for informative video! I have a question. Are AMD GPUs like RX 6700XT or 6750XT okay for 3d applications like Blender, Substance Painter, Marmoset Toolbag, etc.?
Hi! You are welcome. AMD GPUs will work in most software just fine. Check their documentation to make sure before you buy the program or the GPU, just to be sure. With Blender, there are not rendering engines that properly utilize AMD RT hardware, and AMD is generally slower than Nvidia for Blender. They will work fine, they just are not as fast currently. I recommend for Blender at least, buy Nvidia if Blender is your primary use case. If you are just casually learning Blender for fun, then Nvidia or AMD will do just fine. Let me know if you have other questions!
Oh, and I use Blender, Substance Painter, Unreal Engine, and some other addons in Blender. So for the other programs you mentioned, I am not sure how fast AMD hardware is.
@@ContradictionDesign Much appreciated for the quick reply!
@@CorvusSpiritus yep! And hey, I can test a 6700 XT on my test scenes. Maybe I'll run 4070 vs 6700 XT for frames per dollar value.
@@ContradictionDesign It's gonna be David vs Goliath 😄
@@CorvusSpiritus Maybe in total speed, but the 6700 XT is now almost half the price. So value-wise, I guess we shall see.
Congratulations on the channel. I'm thinking about purchasing an AMD Pro WX 3200 4 GB. Is it compatible with Blender? My computer is a Ryzen 5 5600GT with 1 TB of storage and 32 GB of RAM. Thank you for your attention.
Thanks for being here! The WX 3200 is Polaris/Vega era, so it is not compatible with current releases of Blender.
Do you know an alternative to OpenImageDenoise that uses the GPU, or if something is planned for GPU denoising while using HIP? I was curious and installed Blender on my laptop with a GeForce 3070 Ti, and OptiX is already very nice... but having a denoiser that doesn't clog up the CPU and works fine in the viewport is absolutely mindblowing. What is your preferred denoise workflow? OptiX, OpenImageDenoise, or is it even possible to render out frames without denoising and do that in a hot folder from an external workstation?
I use OptiX denoising for rendering. My rendering is done almost entirely on Nvidia GPUs for now. I test AMD cards to keep up with hardware in the hopes that AMD will get it figured out. But yeah, the OptiX renderer with OptiX denoising is my go-to.
I have an RX 6600 and I enabled Cycles HIP for AMD GPU rendering, but Blender uses the CPU too and the GPU is at 10-20%. Why?
@@MrMariozzz78 make sure your CPU is not checked as a render device. After that, it is probably working. Task Manager does not show ray tracing workloads properly sometimes, so if your render times seem like they make sense, it is just a reporting bug from Task Manager.
If you want more accurate reporting, try the HWiNFO app. Just be careful to download it safely.
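For what it's worth, the render-device checkboxes mentioned above can also be inspected and set from Blender's built-in Python console. This is just a sketch assuming a Blender build with HIP support; `bpy` is Blender's embedded module, so it only runs inside Blender, not in a normal Python interpreter:

```python
# Sketch: make sure only the GPU is enabled as a Cycles render device,
# so the CPU checkbox isn't silently splitting the render.
import bpy  # available only inside Blender

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "HIP"   # or "CUDA" / "OPTIX" on Nvidia
prefs.get_devices()                 # refresh the detected device list

for dev in prefs.devices:
    # Enable only HIP GPUs; leave the CPU device unchecked.
    dev.use = (dev.type == "HIP")
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")

bpy.context.scene.cycles.device = "GPU"
```

The printout shows exactly which devices Cycles will use, which is more reliable than guessing from Task Manager's utilization graphs.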
I tried ProRender with an RX 570 8 GB and it has a major crash issue. It crashes even when I just enter the materials tab, an instant crash. Tried on 3.6, 2.9, and 2.8 with no luck.
Yeah, the RX 500 series is not supported in Blender 3.0+, but I think it was in Blender 2.8 and 2.9. I have always had a hard time with those cards, even when they were supposed to be usable.
@@ContradictionDesign Is there some way I can use my GPU for rendering? ProRender is free but has compatibility issues. I used to work in C4D but was thinking of jumping to Blender because of its ease of use and tools. However, I can't use my GPU in Cycles because, again, it's not supported by HIP due to the old architecture.
@@HaelstormGaming I think as far as Blender goes, you probably won't be able to use the 570 anymore, unfortunately. I don't know about other software though.
@@ContradictionDesign Yeah, it kind of sucks. The 570 is old but still kicks in gaming; sadly rendering has a lot of compatibility issues. Alright, I'll see what I can make of my GPU 👍🏻
@@HaelstormGaming yeah sorry to deliver bad news. Good luck with it all!
Any help? I have an RX 6700 XT, and as soon as I select the ProRender engine my GPU memory fully loads all 12 GB of VRAM and then my screen freezes. Does anyone know what I can do and which setting could be wrong? Thanks in advance.
Does it have this problem in Cycles with HIP-RT? Are you sure it's not just a big scene?
Does AMD ProRender work in Blender?
Technically yes, but when I tested it for this video it was slower than HIP in Blender anyway. But maybe I should retest it with Blender 3.6 or 4.0, and on the 7800 XT, because this video is pretty old now.
Hello, I want to buy an RX 7800 XT. Is it good for 3D in Blender? The price is good compared with Nvidia GPUs.
I am watching for a 7800 XT so I can test it. I have seen that, in general, AMD is still not as fast in Blender per dollar. So I expect the 4070 will still beat the 7800 XT by quite a bit more than the price difference. But we will see!
@@ContradictionDesign I can wait 5 months and after that I'll decide what to buy. Thanks for replying to me, I appreciate that.
@@kero9691 no problem! I'll have everything tested eventually!
@@ContradictionDesign I just wanted to offer to help with testing AMD and Nvidia cards if it helps. I'm launching a new channel next month relating to gear for content creation, and I currently have an Arc A770 16GB, 3060 12GB, 4070, 6800 XT, 7800 XT, and 7900 XTX ready to run on a 13900K with 192GB of RAM.
If you can put whatever assets/tests together that you want data on, I’d be happy to do a collaboration and send you the testing data.
Also, I have a lot of PugetBench results for DaVinci Resolve collected in a spreadsheet and would be happy to share that. The main findings are that when it comes to H.264/H.265 performance, the 7800 XT saw a huge improvement over the 6800 XT in some applications (even if the GPU FX processing was more flattering to the 6800 XT). For video editing, the 7800 XT outperforms the 4070 in PugetBench Extended for Resolve and Premiere, and in DaVinci Resolve Fusion the AMD x800 cards I tested outperformed the 4070 by 38-46%. I'm very curious as to whether AMD can eventually see similar gains in Blender.
Right now there are some other 3D content creation applications (like Particle Illusion Standalone particle emitters in 3D mode) where the 7800 XT already has shorter render times than the 4070, apparently mainly due to decreased CPU overhead for the drivers compared to Nvidia cards (since particle emitters can easily get CPU-limited in some applications). Overcoming the Nvidia gap isn't outside the realm of possibility; it just depends on the respective software support.
@@perlichtman1562 Awesome! Thank you for this! I do have a fluid test for CPU baking; my recent videos have the info. Otherwise, I use the Blender benchmark scenes for rendering. In the future I will have more tests. Thanks for reaching out!
Hey, should I buy an RTX 4070, RX 6800 XT, or RX 7800 XT for using Blender?
The 4070 should be a bit faster, so it's probably the best option. It is also more power-efficient than the 7800 XT.
thank you
@@stephen285 you are welcome!
Awesome! 😃
I've never checked out or used AMD ProRender before. I've always wondered about using it. Thanks so much for sharing.
I wonder if AMD ProRender is better optimized for use with AMD Radeon graphics cards. It would be interesting to see the render time on an AMD GPU. A 2x increase in render time over Cycles on an NVIDIA graphics card is huge!
I know you can't use the AMD HIP setting in Blender since you have an NVIDIA graphics card, so the only way to denoise would be to use OpenImageDenoise with ProRender?
It was really cool to see the comparison between Cycles and ProRender. I noticed the texture of the table the glass is sitting on is different for ProRender. Is this due to the texture settings not working with ProRender?
Great video my friend! 😃
This test was with two AMD RX 6600 GPUs. ProRender was slower for them than HIP Cycles. I have not tested an Nvidia GPU on ProRender yet.
And yeah, the texture issues were something to do with ProRender.
@@ContradictionDesign Oh man. I totally missed the part where you mentioned this is actually on AMD GPUs.
There's even text in the video showing that. 😂
.....and you even mentioned the issues with the textures in the video as well.
Sorry about that.
Thanks again for the video! It was great to see. 😃
@@MRUStudios Haha, you are good, man. Yeah, so AMD is definitely not worth it yet.
@@eye776 I just ran a 6800 XT last night and the results are looking good. I am excited for HIP-RT for sure
@@eye776 Awesome! Thanks for the information my friend. I hope you have a great rest of your week! 😃
Hello sir
More than a comment on this video, I am here to find some answers, answers of life
That is
I am going to build a PC for gaming, Blender, and all the college stuff. Thinking about going with a Ryzen 5 7600X and RX 6750 XT for gaming and Blender, or maybe DaVinci Resolve
As you are a professional, I want your opinion(s) if possible
Guide me to the right path, sir, because the 4060 Ti with 8GB makes me question a lot of things
How serious will your Blender use be, relatively speaking?
I can say with gaming as one of the use cases, something with at least 12 GB of VRAM is a good idea. I think the RX 6750 XT will run very well for you, and there are some good sales on them now. This card will lag behind Nvidia alternatives in pure Blender rendering speed, but the question is, does that even matter for you? The 6750 XT will still render frames quickly enough if you are just learning or starting out. Unless you are a studio that makes more money with the most optimized GPU for rendering, you will do great with most modern GPUs.
I would compare to an RTX 4070 and see what you think, because they have lower power draw and small sized coolers. AMD GPUs can compete in games, but not yet in Blender or other 3D rendering apps.
For Resolve, I have only ever used Nvidia to edit videos. I know they are blazing fast in Resolve, but I am not able to comment on how well AMD performs. Maybe I can test this in the future.
If your price limit is closer to the $400 US price, then the 6750 XT might be the best value.
I would probably avoid the 8 GB GPUs from now on unless they really check all the boxes for you though.
The 7600X CPU will be fantastic at these tasks.
BTW, I am self-taught on everything I am showing you here on YouTube. So not technically a pro, because I also do not do this type of work full-time. But I will take that as a huge compliment.
Hope this helps!
@@ContradictionDesign Honestly, I never expected such an in-depth reply
And
I will be starting to work in Blender for the very first time with this
This will be my learning curve
And one more thing
Thank you for catering to us so well
You earned a sub 🙇♂️
@@sumit1607 Well I am glad this is helpful! I do enjoy helping in comments so thank you for your compliments! Welcome to the channel 😁
Does the RX 580 work with AMD ProRender?
Technically it should, but it is not supported in Blender 3+. When I try to run it in ProRender, it causes instant crashes. It's definitely not a good idea to buy one for Blender at this point.
@@ContradictionDesign that's actually really sad. So now the only hope is hoping AMD adds support?
@@badmingos That seems to be the case. I can get ProRender to show the RX 580 as the render device, but as soon as I click render, it crashes. So I think even though ProRender supports it, Blender has a problem and can't run it. It is very unfortunate.
@@ContradictionDesign It is. But XT and Vega cards are OK?
@@badmingos RX 5000+ and Vega 56, Vega 64, and Radeon VII all work fine. Just nothing from Polaris and earlier.
Hello, thanks for this. But what do you mean by "I do not recommend buying AMD GPUs just for Blender, but if you have them, these tests may help you know what to expect."? Does it mean you're unhappy with the performance you get for the price of the graphics cards?
Hi! So if you bought a GPU from AMD for gaming then feel free to use it for Blender. I would not buy one over Nvidia if your use case is Blender. The driver support and speed on software and hardware levels are much better with Nvidia than AMD for now. I hope AMD will catch up but they have not yet. So yeah. For gaming, AMD is great. For Blender, it's just not as good.
@@ContradictionDesign Thank you very much sir, you're a saint. I've been asking around and nobody has replied like this before, bless you. I hope you have a nice day
@@noirlavender6409 You are so welcome! Thank you for the kind words, it means a ton.
@@ContradictionDesign, it's not AMD that needs to catch up. Blender's programmers are responsible for natively supporting AMD's newer architecture and even Apple's new M-series. Blender's programmers figured out a way to use NVIDIA's code on AMD cards, but the code overhead kills performance. The same happens with any DirectX 11 title with ray tracing enabled on AMD RDNA 2+ cards, because the API is old and programmed to favor NVIDIA. Unreal Engine 5 natively supports AMD on an API level (DirectX 12) for Lumen and hardware RT. In the same way, Blender would need to support AMD by programming a renderer specifically for AMD GCN/RDNA. Sure, AMD and Apple could do it for Blender, but that would also give NVIDIA an ego boost.
@@captureinsidethesound I guess we can look at it either way, because someone has to make the render engine run well and nobody has yet. Basically, for the end user, AMD is not the best for Blender yet. So maybe I mean "AMD hardware needs more software support". And not just that AMD needs to catch up. Not trying to assign any fault. But AMD driver software is even a pain to use still. And I have had horrible luck with more than 1 or 2 GPUs on a motherboard with AMD. Could be my relative lack of experience with AMD software, but if I want to use the hardware, but the software is not agreeable, it is just an extra roadblock.
And the other part is hardware. AMD is just about to catch Nvidia, and I hope they do. Dedicated cores for RT and AI in Nvidia GPUs have made them better for ray tracing rendering software until now. The 7000 series is fantastic for gaming, and I am excited to see what happens next.
I might need to look into testing GPUs on UE5. Very interesting. I bet AMD GPUs do a bit better there than in Blender?
Anyway, thanks for the comments! Great stuff
ProRender seems to be a bit of a disappointment.
Yeah, when I made this video, the download page had very outdated info, so I started off suspicious haha. But I think Blender's native Cycles HIP-RT is way better now for sure. I do not think they continued to update ProRender.
Yeah, I don't think AMD would be a good choice for rendering. At least not yet.
Yeah, not yet. It just sucks because the 6800 XT would be fast with good software support, and it has 16 GB of VRAM. But the RTX 3060 murders it in rendering. Even the newer 7000-series AMD cards are slower than lower 30-series Nvidia cards in rendering, which is weird, because the 7900 XTX is great for gaming, which is just a form of rendering haha.
@@Wingnut353 Yeah, that's what I've been reading. Maybe I will try the experimental one. Thanks for the idea! I would like to test some newer 7000-series AMD cards, but I can't justify buying one just for Blender yet.