So happy to have you back! And thank you very much for your detailed benchmarks and also your personal thoughts on the matter. Much appreciated!
Thank you so much! hcgul06
Man was waiting on this.
Back after almost a year, hope more content is on its way.
GL guys.
I hope to make more in the next few weeks. thanks for watching
I moved from a 3090 to a 4090, mostly working with Substance Painter and Marmoset. I was very disappointed with the 'perceived' speed improvement in Substance Painter in terms of basic activities, like changing scene texture resolution on a complex scene with many layers and smart materials. That program really needs a standardized benchmark. I would love to hear from any other artists who upgraded similarly and would give their input. Mediaman, if you ever have an interest in trying to develop some unofficial benchmarks or scenes for Substance Painter, I would be down to brainstorm and maybe figure out some scenes to set up. I just don't know enough about benchmarks and controls.
I don't imagine that Substance tools would be as GPU intensive in a situation like that, but rather RAM and CPU intensive.
@@jacksonsingleton yep, that's exactly why I would love to try and standardize some sort of benchmark/scene/set of operations so we could do some real controlled comparisons.
As someone who has experienced both the 3090 and 4090, do you think the 5000 (90) series is worth the wait?
I'm currently running a 3070 Ti, so I think any 90-series card would be a huge upgrade for me. I'm using Unreal and Houdini the most these days and both are struggling with my current build.
@shredmajor700 Houdini is a different beast from Substance, I'm sure it could make use of all the GPU speed. Whether it's worth the wait, I can't say, but personally I would wait because the 50 series should have a properly sorted power connector, and the 4090 is being discontinued, which could make warranty claims/issues more difficult if you have a problem in a year or whatever.
So happy to see you guys back!
thanks rehanrajput7506. It's good to be back. I hope to start making many more videos
thanks rehanrajput7506, good to be back
Thank you dear Mediaman for the video. I am happy you are back now. Hope to see more from you. Best.
thanks @zafertopaloglu2020
I just bought a used 3090 from a miner four months ago and find it really increased what I can do!
I can only imagine how much more a 4090 could help me!
With the 3090 I am able to do much more complex stuff than I could before. I find myself making much more complex animations now, so my artwork has increased in value.
As far as concerns with the 4090 power usage, I think it might actually use less power per unit of work done than the 3090, just by virtue of finishing almost twice as fast!
Also, I often run jobs that might take my 3090 18 hours! This means the GPU will likely still be processing in the late afternoon/early evening, when we in California face higher rates in an effort to reduce peak-time usage. A 4090 might get the job done before then and actually SAVE money on electricity.
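Rough back-of-the-envelope version of that idea in Python; every wattage, runtime and rate below is an assumed placeholder, not a measurement:

```python
# Rough electricity-cost sketch for one long render job.
# All numbers below are placeholder assumptions, not measured values.

def job_cost_usd(watts, hours, rate_per_kwh):
    """Energy cost of a render job: power (W) * time (h) -> kWh * rate."""
    kwh = watts * hours / 1000.0
    return kwh * rate_per_kwh

# Assumed figures: a 3090 drawing ~350 W for 18 h total vs a 4090 drawing
# ~450 W but finishing in roughly half the time, before peak pricing starts.
off_peak, peak = 0.30, 0.50  # assumed $/kWh for off-peak vs California peak window

cost_3090 = job_cost_usd(350, 12, off_peak) + job_cost_usd(350, 6, peak)
cost_4090 = job_cost_usd(450, 9, off_peak)

print(f"3090 job: ${cost_3090:.2f}")
print(f"4090 job: ${cost_4090:.2f}")
```

Obviously the real answer depends on your card's actual draw under load and your utility's rate schedule, but the shape of the math is the same.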
That may be a great idea for a video, to see what the cost of rendering would be for the two GPUs. Thanks for the idea and for watching
These are great videos. Glad to see you're still posting them
Thanks for this and other videos :)
I'm waiting for material dedicated to VRAM and its limitations in large 3D scenes. And also one comparing sets of two cards, including 2x 3090, also in terms of large 3D scenes and the VRAM shared across two graphics cards.
I am currently planning to build a new computer for 3D rendering in V-Ray (an expansion based on an old board and a first-generation 1950X Threadripper processor). I need at least two graphics cards to increase the VRAM capacity. Is the 4090 even an option for me to consider? If I had:
1. 2x 4090 - will V-Ray use 48GB of VRAM, or will it only use the 24GB from one graphics card?
2. 1x 4090 + 1x 3090 - as above, will I have 48GB of VRAM available despite the lack of an NVLink connection? Or will I only use 24?
3. 2x 3090 - here, as I understand it, there will be no problem with using the entire VRAM, connecting the cards directly via NVLink.
To use the entire VRAM from two or more cards, do I have to have NVLink, or is it also possible without it?
Speed is important, but it is more important to have enough memory for large scenes.
I was thinking about two cards and maybe buying a third one in the future (also for additional memory; I have a board that will allow this, although the first two cards will be x16 and the additional one would be on eight lanes).
@MediamanStudioServices
@steve55619
@spontanp
@dreagea
Can any of you tell me whether V-Ray, in the variants under consideration, will be able to use all the VRAM in each card, if the 3D scene requires it?
Thanks for the video. I'm a huge Blender fan. The RTX 4090 was way out of my affordability range. I got an RTX 4080 for $1050 from Best Buy early on Black Friday. $1050 is out of my affordability range too, but I bought it anyway.
Great to have you back! This video was super helpful, explained nicely too. Thank you.
Glad it was helpful! RafiAnimates
If the combined specs were comparable, are there any advantages to running 2 graphics cards as opposed to 1 killer graphics card? It can certainly be much cheaper to buy 2 lesser gods. I was just cruising the CPU benchmarks site looking at value ratings and was thinking that 2 Radeon RX 6800s might be almost as good as the 4090 at a fraction of the price. Any thoughts on this matter?
how was it for you?
I have yet to try either but the guy at the shop was pushing 1 high end card over multiple lesser ones.
Thanks for your time. Great to get a sober, matter-of-fact benchmark review from someone in the industry.
Great video. However the 3090 has some other considerations too:
- used prices are even lower on eBay; a used 3090 goes for about $750, while a used 4090 is still $1400+
- the 3090 has NVLink
- the 3090 does not require the new power connector (which requires a new PSU)
- More VRAM actually does equate to more computing power. There are multiple applications which simply won't train large models or render complex scenes without enough VRAM. His analogy suggesting otherwise simply made no sense.
@@mrquicky you just repeated what he said in the video? So you're just saying what he said lol
@@WantedForTwerking He said that more VRAM didn't equate to more computing power. But in fact, it does. If your application crashes due to memory constraints while executing fine on a card with more memory, then you have actually gained additional computing power that you didn't have before.
@@mrquicky I experienced issues with VRAM in V-Ray that crashed during the calculation stage. V-Ray swapped the calculation stage from CPU to GPU and that caused an issue; was it the driver or a shortage of memory? I managed to finish the animation by optimizing it. My guess is it was VRAM related.
Yay new video! Thanks for coming back
good to be back. Thanks blackstar_1069, for watching
THX for sharing this... very educational...
Thanks for the video! Would like to see tests in Substance Painter, as it's more relevant for me as a 3D artist
These videos are great. This is exactly the kind of information I've been looking for.
thanks Jon
I have a question and would love to see a video about it: usually when I upgrade my rig I go for the high-end GPU, but when the GPU gets outdated I can't sell it because it's worth nothing, and I don't have great experiences selling my used PC parts; it's just not worth it for me.
So my question is, is there a way to build a workstation where I can add my outdated cards, like my 2080 and my current 3080 when I upgrade to the 5080, to work as a render node supporting my main machine (I use V-Ray and Blender)? It's like a retirement plan for my old GPUs.
If this build is possible, what CPU should I get? A Threadripper because they offer 64 lanes? Or just a Ryzen and use the CPU in an 8x4 config? Linux or Windows? A WS motherboard or a server rack for GPUs?
I just came across your channel, thank you for all the videos you made, so helpful and to the point.
Great video!! Could you do the same video with an RTX 4090 vs 2x RTX 3090?? Really curious about that outcome. Also curious what it does with DaVinci Resolve. Keep it up 😃👍
Yes I can. Once I get my hands on a second 3090, I will do these tests. Thanks for watching
I was wondering if you could test dual GPU in 1 system vs dual machines for rendering. Your Dual X8 vs Single X16 video was very helpful, but I haven't been able to find anyone who has tested this. I'm wondering if it is better to render off 2 GPUs in the same machine vs buying a low- to mid-range motherboard, CPU, etc., moving the 2nd GPU into that machine, and network rendering over LAN. Thanks in advance, appreciate the videos/tests you make.
from Malaysia to Vancouver? Wow, good luck with that : ) nice video!
I am using a 4070 in my film editing rig. I find that system memory and M.2 drives are boss when it comes to video editing with Davinci Resolve Studio. I have dual 22 Core Intel Xeons and 256GB DDR4 Ram in my rig.
Sheesh, I'm using 64GB RAM, a 14900K and a 3070. I want to upgrade to at least a 3090
WOW! This is a pleasant surprise. Welcome back Mike! We missed you 😮
Thank you! 😃 Eric1935, glad to be back
I'm happy to see you back ... best channel for comparing GPUs in Blender. Thnx a lot man! If I get the money I'm gonna get 2x 4090 of course ^^.
happy to see you again🙂🙂
happy to be back, thanks howitworks101
like your videos, man. Hope to see more videos
Can you also give Unreal Engine 5 scores in future videos?
Oh nice! I just ordered a 4090 cuz I use a quad monitor setup and the 3090 is crying hard whenever I connect my 4K monitor and have 3-4 Adobe apps open. Can't wait to see how much it improves things..
Very nice! thanks for watching
Hi Sir Mike, I'd like to ask you a question, please guide me sir. I assembled a workstation computer for myself, but the problem is that booting is very slow and I have to wait around 1 minute 10 seconds until my computer boots. Please tell me what I should do. This is my PC info: ASUS WRX80E motherboard, AMD Threadripper Pro 5975WX CPU, 256GB Corsair DDR4 3200MHz, Samsung 980 Pro SSD, and an RTX 4090. In the BIOS I set the RAM to DOCP, but that made the boot take longer than 1 minute, so I set it to Auto and set the RAM speed manually to 3200MHz.
Good to see you're back. Would it make sense for Dual 3090, if the cost was about the same? You'd have 2x the vram (say for motion design or rendering) and perhaps almost the same performance? Ofc, the power use would be much higher.
Possibly! But I have never been able to get two 3090s and NVLink them together. Let me see what I can do, and if I can get my hands on a second blower-style 3090 I will do the testing. Thanks for watching
@@MediamanStudioServices That would be awesome. Np, you're one of the few that really tests for practical usage!
I think with DaVinci Resolve the memory won't stack, so one would still have 24GB VRAM.
I've gotten mine nearly pegged out before at about 23GB. But that's fairly rare, mine runs at about 15GB usage when running Resolve.
We had that question with dual 3090s before we moved to dual 4090s.
A 4090 was approx 2.1x faster in the Blender benchmark at the time we upgraded.
4090s have faster video encoding.
4090s are far more power efficient (rough perf-per-watt sketch below).
Any investment we made into 3090s just didn't scale or make sense. I.e. if we wanted four 3090s we would need water cooling etc., and then time, effort and money were better invested in 4090s going forwards.
I suspect driver and software updates now strongly favour 4090s. HTH
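Rough check of the efficiency point, using the ~2.1x Blender figure above; the wattages here are assumed round numbers, not measurements:

```python
# Rough check of the "far more power efficient" point, using the ~2.1x
# Blender speedup mentioned above; the wattages are assumed, not measured.
speedup_4090 = 2.1
watts_3090, watts_4090 = 350, 450   # assumed board power while rendering

perf_per_watt_3090 = 1.0 / watts_3090
perf_per_watt_4090 = speedup_4090 / watts_4090

print(f"4090 does ~{perf_per_watt_4090 / perf_per_watt_3090:.2f}x "
      f"the work per watt of a 3090 under these assumptions")
```

With those assumptions it comes out to roughly 1.6x the work per watt, which is why scaling up with more 3090s stopped making sense for us.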
@legendhasit2568 ahh, great feedback!!! How much faster is the encoding? I do a lot of video editing and 3d, so I think I'll have to move to the 4090.
Hello, welcome back, I am very happy and very excited about your future videos
good to be back. Thanks for watching
I wish you would do an update on how to run LLMs locally and on what machines
Hi there. The 5090 is about to launch. Would be nice to see you back on that 😂
Good to see you back Mike.
thanks Adam
Thanks a ton for your videos !! Miss them !! How to combine 4 x 4090 Sir? :)
The first thing to do when getting a GPU this power hungry is to undervolt it: it takes a couple of minutes with MSI Afterburner and can reduce the heat, the noise and the power consumption (a rough power-cap sketch is below for those who can't use Afterburner).
Worth mentioning that the RTX 3090 can be run as a pair with NVLink, and programs like Blender will see a single stack of 48GB of VRAM.
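Not a substitute for a real undervolt, but if Afterburner isn't an option (e.g. on Linux), a simple power cap gets a lot of the same benefit. A minimal sketch via nvidia-smi; the 300W figure is just an assumed example, and setting the limit needs admin rights:

```python
# Not true undervolting (that still needs a curve editor like MSI Afterburner),
# but capping the board power limit recovers much of the same efficiency.
# Assumes nvidia-smi is on PATH; setting the limit usually needs admin/root.
import subprocess

def show_power(gpu_index=0):
    """Print the name, current draw, and configured power limit for one GPU."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=name,power.draw,power.limit",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True)
    print(out.stdout.strip())

def set_power_cap(watts, gpu_index=0):
    """Cap the GPU's power limit (e.g. an assumed 300 W on a 4090)."""
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
                   check=True)

if __name__ == "__main__":
    show_power()
    # set_power_cap(300)  # uncomment to apply an assumed 300 W cap
```

A proper undervolt shifts the voltage/frequency curve, while a power cap just stops the board from boosting past the limit; in practice both trade a little performance for a big drop in heat and noise.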
amazing as always very useful information in all of your videos
Glad you think so! jibranbaig8206 , thanks for watching
Happy to see you back again
Excellent videos always Thanks
I'm actually quite curious about the power draw on the 4090 when rendering. It is extremely power efficient in games, I'm curious if it ever reaches the power limit when rendering.
Looking forward to your VRAM video. The environments I'm building don't fit on a 24GB card in Cycles. The heat is bad. ASRock WRX80 Creator MB, 3090, 4090, A5000, 256GB RAM, open case, 1600W PSU.
Thanks! Any Radeon gpu is interesting for rendering? 🎄🥂🇮🇹
Welcome to Vancouver! Great choice! Although I guess it could be Van in Washington.
*Canada, very nice!
Thanks! 😃 FluxStage
At last, a new video 😊, thank you already before I watch it.
thanks for watching arslantezel9772
Used 3090 EVGA FTW for $700 here. Had it for 4 ish months👍
Would love to see your thoughts and tests on blower-style cards like the ASUS or Gigabyte RTX 3090 Turbo cards, and also the upcoming AFOX RTX 4090 cards. How much performance is being left on the table vs the usual FE coolers and three-fan designs?
I have one of the Gigabyte RTX 3090 Turbo GPUs. It works great for me. The fan is loud, but that is expected for blower-style cards. It does not run hot, though. I have also tested a 4090 blower-style GPU, which ran 10 degrees cooler than the three-fan models I was testing.
You can get 4x used 3090 or 1x new 4090 for basically the same money. You can NVLink 3090s so you get 48GB VRAM total, whereas with a 4090 you will always be limited to 24GB (no NVLink). I'm going with 4x 3090.
NVLink bridges are expensive too and right now they might be hard to find. I agree with you though, two 3090s is the better option if energy is cheap where you live
fantastic review!
great video. thanks
I use an RTX 4090 ASUS TUF plus an EKWB water block + active backplate, with gooood results
What about RTX 4070 for blender learning?
The 4070 is still a good GPU; if you don't do things that require the extra VRAM, you are fine!
Holy moly... he is back :D :D
He’s back! 🎉
The real test I wanted to see is 2x RTX 3090 on NVLink (48GB) vs 1x 4090 (24GB), on a 50GB+ scene
🔥
Very informative ❤
Welcome back 😊
I would like to see you do comparison with some AMD cards like the 7900XTX
I would like to, but there are many videos on YouTube that have shown AMD just isn't performing as well as Nvidia for most creative workflows. Most applications take advantage of Nvidia's CUDA tech, and AMD is just not there in this market. That's not to say AMD GPUs don't work in creative apps, but there is a reason that 90% of the GPUs used in the creative market are Nvidia. Thanks for watching.
"If you can afford the best, buy the 4090; it is outperforming the 3090, in some cases, by 50%"
Small but important nitpick: If the 4090's performance is, in some cases, twice that of the 3090, then the 4090 is outperforming the 3090 by *100%*. Depending on which position you make your baseline, the correct description of the difference in performance changes. It would be correct to say the 3090 is underperforming the 4090 by 50%, but the correct way to describe the performance of the 4090 is to say it is outperforming the 3090 by 100%.
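Putting numbers on it with assumed render times (100 and 50 time units, i.e. the 4090 finishing in half the time):

```python
# Tiny sanity check of the baseline point: the same two render times give
# two different (and both "correct") percentage statements.
t_3090, t_4090 = 100.0, 50.0  # assumed render times, 4090 twice as fast

faster_pct = (t_3090 / t_4090 - 1) * 100   # 4090 measured against the 3090 baseline
slower_pct = (1 - t_4090 / t_3090) * 100   # 3090 measured against the 4090 baseline

print(f"4090 outperforms 3090 by {faster_pct:.0f}%")    # 100%
print(f"3090 underperforms 4090 by {slower_pct:.0f}%")  # 50%
```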
Indeed ... !
And From the quote, surely, if money is not a problem, always buy the latest, strongest and most expensive ...
I think that is quite obvious ... But I guess the matter is to help people, for whom money is not overflowing, to think about the "sacrifice / trade off" ... faster vs Electricity bill
I knew 4090s were quite a bit better, but didn't think it was that much better. I still don't regret getting my 3090; if I need more processing power I will get a second 3090 with an NVLink bridge. When you run Blender in SLI mode, Blender gives you the option to stack memory, not just run it in parallel.
So if you have 2x 4090, the application sees and will only use the 24GB from one graphics card?
Is this only in Blender or also in V-Ray? If you can't use the full VRAM using two 4XXX cards, it doesn't look good for 3D applications.
@@marceliszpak3718 Unfortunately that won't work with two 4090s because you can't NVLink them. 3090s have an extra connector on the outside that allows an NVLink SLI bracket to connect both cards and achieve shared memory. To be able to stack VRAM with the 4000 series you need to get the workstation version of those cards. Those cards are called NVIDIA RTX 6000 and can stack VRAM like the 3090s, but they are extremely expensive.
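If anyone wants to see what their own machine reports, here is a small sketch using pynvml (assuming the nvidia-ml-py package is installed). Each card shows up as its own separate pool, which is the point: two 4090s appear as two 24GB pools, not one 48GB pool.

```python
# Quick check of what each installed card actually exposes: every GPU reports
# its own separate memory pool, and without NVLink/memory pooling in the
# renderer, a scene still has to fit inside a single card's VRAM.
# Requires the nvidia-ml-py package (import name: pynvml).
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):          # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: {name} - {mem.total / 2**30:.1f} GiB total, "
              f"{mem.free / 2**30:.1f} GiB free")
finally:
    pynvml.nvmlShutdown()
```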
RTX 4090 vs RTX 6000 Ada vs RTX A6000 Ampere... cool.
What about RTX 4070ti ?
nice video
miss you
LOL Welcome back🤣
Why not the 3090 Ti? It's so obvious.. Also (almost) nobody would buy a "new" 3090 now, so a used 3090 Ti from $600+ vs a new or used 4090 for $1400+..
The 4090 is faster but the RAM amount did not change. If my 3090 decided to die, then I would get a 4090 or something better with more than 24GB
Well, the price is way higher when you jump to these higher-VRAM GPUs, but yes, some workflows really need that amount of memory. Thanks for watching
I am waiting for the 6090 Ti in 2027
I wish the 4090 had more VRAM.
you are being missed
You have returned
yes, thanks for watching
Personally, I don't care about either Nvidia card. I bought an AMD RX 6900 XT for $550 USD, and thus far it's more than enough for my gaming and content creation needs.
That's fine, but for us people in the professional industries, Nvidia is unfortunately the only real option for 3D rendering. AMD isn't really useful except for basic video editing using Resolve etc. Even then CUDA is much better.
@@jacobstanley7089 Besides gaming and video editing, I do a lot of AutoCAD and my 6900 XT handles it just fine.