Daz 3D IRay GPU Render Speed Test RTX 3090 VS RTX 2070
- Published: 14 Oct 2024
- Hardware
CPU AMD Ryzen 7 3700X
RAM 64GB DDR4 3200MHz
GPU
1. MSI RTX 2070 Gaming Z 8GB
2. MSI RTX 3090 Gaming X Trio 24GB
Software
Daz3D 4.12.2 Beta
Nvidia Studio Driver 456.38
Music
Gibbous
by Michael FK
Weird that I liked this so much. Maybe the music and choice of scenes to test with.
3090 is a beast of a card. Thx for the video
I am getting an RTX card soon
upgrading from a GTX card
are there RTX-specific settings that show up when using an RTX card?
like upscaling settings with DLSS?
or do the extra features just accelerate the render?
Thank you for showing us, now I realize that even with an RTX 3090 Iray is still very slow =(
Yeah. Wait for 4090 or 5090.
Will a graphics card's TDP affect its rendering time… like a 3060 with 130 W vs a 3060 with 100 W? Or is it solely dependent on CUDA cores?
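One rough way to reason about the question above: Iray render time scales mostly with CUDA core count times sustained clock, and a lower power limit mainly lowers the clock the card can sustain under load. A toy model, where the core count is the real 3060 spec but both clock figures are illustrative assumptions, not measured values:

```python
# Toy model: render time ~ 1 / (CUDA cores x sustained clock).
# The clock numbers below are assumptions for illustration only.

def relative_render_time(cores: int, sustained_clock_mhz: float) -> float:
    """Render time relative to a baseline card (lower is faster)."""
    baseline = 3584 * 1800  # hypothetical full-power 3060 baseline
    return baseline / (cores * sustained_clock_mhz)

# Same 3060 die, two power limits: the 100 W card cannot hold boost
# clocks as long, so its *sustained* clock ends up a bit lower.
full_power = relative_render_time(3584, 1800)  # 130 W variant
limited = relative_render_time(3584, 1650)     # 100 W variant (assumed clock)

print(f"130 W: {full_power:.2f}x, 100 W: {limited:.2f}x")
```

Under these assumptions the power-limited card is only about 9% slower, which is why CUDA core count is usually the bigger factor between two different cards.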
Hello, thanks for the demonstration!
I also have an RTX 2070 and an RTX 3090 but Daz Studio doesn't want to use the RTX 3090 for rendering, it only uses the RTX 2070.
When I try, it works for 15 seconds and then finishes with a black rendered image.
I'm on the Daz Studio 4.12.2 Beta version and the NVIDIA Studio Driver v.456.71 with AMD Ryzen Threadripper 1920X 12-core and 64 GB RAM.
If you have any advice for me, I'm interested.
This is weird. All I can think of is to update DAZ Beta. I am using 4.12.2.51. Make sure it is the version after October this year. If you can’t solve your problem, I’m sorry.
You have to download MSI Afterburner and set the frequency at least 20 MHz higher, because when the GPUs reach a certain temperature the new ones crash; it's a problem with NVIDIA's 30-series GPUs.
@@noahgraphicz5251 Thanks for your help. I finally installed the latest beta version of Daz Studio and it works! The RTX 3090 is very powerful! For a big scene with 5 characters at 1920x1080px my render time is now 2:30 min vs 15 min with my RTX 2070!!! I do not regret at all the investment of my one and a half years of savings!
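For reference, the speedup reported in the comment above works out to a clean 6x. A quick check, using only the two times quoted there:

```python
# Render times reported above: 15 min on the RTX 2070, 2 min 30 s on the RTX 3090.
rtx2070_s = 15 * 60      # 900 s
rtx3090_s = 2 * 60 + 30  # 150 s

speedup = rtx2070_s / rtx3090_s
print(f"{speedup:.1f}x faster")  # → 6.0x faster
```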
@@creagraph2903 You can also save more time if you activate ray tracing. Go to the configuration and you will see that ray tracing is set to Auto; set it to Active and you will see the amount of VRAM drop and rendering get faster, as the RT cores will render your illumination.
@@kaskuoart I also encountered the same problem. Maybe the version is too low. Where can I get the latest version?
Thank you for this!!!
I have a 3080 and no matter what is in my viewport, DAZ never uses the GPU for rendering. It only uses the CPU.
Even if there is just 1 figure and 1 light source, it still doesn't use the GPU. Do you know how to fix it?
1: Make sure your card's drivers are up to date.
2: In the "Advanced" tab in the Render Settings, uncheck the CPU and make sure the card is the only thing selected.
@@mordredsillence It was like that when I was using a 1070 and it didn't work. Right now with the 3080 and the new Daz version it's working.
Disappointing compared to the render times in Blender, Arnold GPU in Maya and Unreal Engine 4. Daz will never catch up it seems.
What are the Blender times, if you don't mind me asking? I'm looking at getting a 3080 or 3080 Ti soon.
The real question is how fast it will render clips for film. I'm an animation-short-film kind of guy.
Another YouTuber made a short film called Peaches. He said before the 3090, it was around 4 mins per frame. With the 3090 it was 30-second renders.
@@redone823 Oh my! Will be getting thing this card someday
@@HeLIEl yeah. I like buying used hardware and rendering in a farm. Tech is like the car market and can lose value after a few generations or usage.
@@redone823 good idea
Hi, I have a 3090 and am using beta 4.14 version but when I hit render it doesn't use GPU... how do I make it use GPU?
Same question here 🙋🏼♂️ Any answers on this?
@@runidjurhuus3937 i solved it by disabling all cpu in render settings. also i upgraded from 4.12 to 4.14
@@Enigmo1 thanks will try this 🙌
update the drivers
Me with my 1660 waiting 2 hours straight
The 2070 is still doing pretty well. A 3090 is a want if you already have a 2070, not a need.
500 samples? I couldn't imagine getting a good render at that level. Then again, I only deal with realism, and the eye and skin settings require 3000 minimum to look decent; most of the time I take it to 10000 at sub 6. I have two 2080 Tis and they make short work of it.
I suppose the render times are in seconds?
What resolution were these images rendered at?
1920 × 1080
When will the beta become public?
www.daz3d.com/daz-studio-beta
Does this card still only work in BETA?
Yeah, and it costs 5 times more than my 2070 Super.
i'm still using gtx 1060 6GB =D
How fast are your render times?
Teach me, Senpai!
Iray is slow AF
No thanks, I don't need to waste $1500 on this, lol. I'd rather build a powerful PC with that amount of money if I were in the USA. I'm still happy with my beast, the RTX 2070S 8GB.
For DAZ, it is a waste of money; 8GB~10GB of GDDR6 is enough. I use Marmoset Toolbag for my work, so I decided to buy the 3090. Under normal circumstances, I would not recommend spending US$1500.
@@kaskuoart To be fair, $1500 is far, far more reasonable than the bloody $2000 some sellers were asking for the 2080 Ti before the 3000 series was announced. Plus I won't hit a bottleneck when I use more than 2 characters and a quality environment; I can just queue my renders and leave without fearing that one scene runs over the VRAM limit and renders on the CPU.
@@kaskuoart I find that 8GB of VRAM is often insufficient for larger scenes w/ >3 characters. Of course, this depends on the assets you use. Some assets come w/ ridiculous 8K or even 16K textures by default. Nonetheless, that 24GB of VRAM would mean never having to compromise your scene design or having to stop and do optimization.
A 16GB option like AMD offers would be ideal IMO. It's ridiculous that the 3070 is at 8GB, which is the same as my 1070 from two generations ago!
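The 8K-texture point above is easy to quantify: an uncompressed 8-bit RGBA texture takes width × height × 4 bytes of VRAM (more once mipmaps are added). A rough estimate, ignoring compression and mip chains; the per-character map count at the end is an assumption for illustration:

```python
def texture_mb(side_px: int, channels: int = 4, bytes_per_channel: int = 1) -> float:
    """Approximate uncompressed VRAM footprint of a square texture, in MB."""
    return side_px * side_px * channels * bytes_per_channel / 1024**2

print(texture_mb(4096))  # 4K map: 64.0 MB
print(texture_mb(8192))  # 8K map: 256.0 MB

# A character with ~10 8K maps (diffuse, normal, roughness, ...) can thus
# approach 2.5 GB of texture memory before geometry even enters the picture.
```

This is why tools that downscale textures (like the Scene Optimizer mentioned below in the thread) free up so much VRAM: halving the resolution of a map cuts its footprint by 4x.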
@@tianxiangxiong8223 You can use "Scene Optimizer", it's good:
www.daz3d.com/scene-optimizer
@@kaskuoart I do use it, but it takes time to process a scene. I often find myself wishing for 1-2 GB more VRAM. 16 would be plenty and quite future-proof for my purposes.