RTX3090 vs RTX4090 for creative workflows.

  • Published: 27 Sep 2024
  • #Nvidia #RTX #WorkstationGPU
    Testing the RTX 3090 and the RTX 4090 in Blender, Vray, Redshift and SPECviewperf.
    Redshift render:
    www.redshift3d...
    OctaneBench: render.otoy.co...
    SpecviewPerf: www.spec.org/g...
    Blender demo files: www.blender.or...
    Soba Noodle House:
    www.behance.ne...
    Visit:
    Discord: / discord
    Facebook: / mediamanstudioreview

Comments • 117

  • @hcgul06
    @hcgul06 1 year ago +13

    So happy to have you back! And thank you very much for your detailed benchmarks and also your personal thoughts on the matter. Much appreciated!

  • @ask_carbon
    @ask_carbon 1 year ago +8

    Man was waiting on this.
    Back after almost a year, hope more content is on its way.
    GL guys.

  • @rehanlive2
    @rehanlive2 1 year ago +7

    So happy to see you guys back!

  • @talk2zafer
    @talk2zafer 1 year ago +5

    Thank you dear Mediaman for the video. I am happy you are back now. Hope to see more from you. Best.

  • @DanielJoseph-w4p
    @DanielJoseph-w4p 7 months ago +2

    These are great videos. Glad to see you're still posting them

  • @aintnomeaning
    @aintnomeaning 1 year ago +29

    I moved from a 3090 to a 4090, mostly working with Substance Painter and Marmoset. I was very disappointed with the 'perceived' speed improvement in Substance Painter in terms of basic activities, like changing scene texture resolution on a complex scene with many layers and smart materials. That program really needs a standardized benchmark. I would love to hear from any other artists who upgraded similarly and would give their input. Mediaman, if you ever have an interest in trying to develop some unofficial benchmarks or scenes for Substance Painter, I would be down to brainstorm and maybe figure out some scenes to set up. I just don't know enough about benchmarks and controls.

    • @jacksonsingleton
      @jacksonsingleton 1 year ago +4

      I don't imagine that Substance tools would be as GPU intensive in a situation like that, but rather RAM and CPU intensive.

    • @aintnomeaning
      @aintnomeaning 1 year ago +1

      @@jacksonsingleton yep, that's exactly why I would love to try and standardize some sort of benchmark/scene/set of operations so we could do some real controlled comparisons.

    • @shredmajor700
      @shredmajor700 10 days ago

      As someone who has experienced both the 3090 and 4090, do you think the 5000 (90) series is worth the wait?
      I'm currently running a 3070 Ti, so I think any 90-series would be a huge upgrade for me. I'm using Unreal and Houdini the most these days and both are struggling with my current build.

    • @aintnomeaning
      @aintnomeaning 10 days ago

      @shredmajor700 Houdini is a different beast from Substance, I'm sure it could make use of all the GPU speed. Whether it's worth the wait, I can't say, but personally I would wait because the 50 series should have guaranteed good power cable connector, and the 4090 is being discontinued making warranty claims/issues potentially more difficult if you have a problem in a year or whatever.
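The standardized-benchmark idea floated earlier in this thread could start as something as simple as a timing harness: run a fixed list of operations, record wall-clock times, and emit a machine-readable report so results from different GPUs can be compared. Everything below is hypothetical scaffolding (Substance Painter exposes no official benchmark API of this shape); the step names and placeholder workloads are stand-ins for whatever scripted scene operations you settle on.

```python
import json
import platform
import time
from contextlib import contextmanager

@contextmanager
def timed(results, name):
    """Record the wall-clock time of one benchmark step."""
    start = time.perf_counter()
    yield
    results[name] = round(time.perf_counter() - start, 4)

def run_benchmark():
    results = {}
    # Stand-in steps: in a real harness these would be scripted operations
    # on a shared reference scene (e.g. switching texture resolution).
    with timed(results, "resize_textures"):
        sum(i * i for i in range(100_000))  # placeholder workload
    with timed(results, "bake_smart_materials"):
        sum(i * i for i in range(100_000))  # placeholder workload
    return {
        "machine": platform.machine(),
        "python": platform.python_version(),
        "results": results,
    }

if __name__ == "__main__":
    print(json.dumps(run_benchmark(), indent=2))
```

The controlled part is the fixed scene and step list; the JSON report is what makes runs from different people comparable.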

  • @ambientnaturally
    @ambientnaturally 1 year ago +6

    I just bought a used 3090 from a miner four months ago and find it really increased what I can do!
    I can only imagine how much more a 4090 could help me!
    With the 3090 I am able to do much more complex stuff than I could before. I find myself making much more complex animations now, so my artwork has increased in value.
    As far as concerns with the 4090 power usage, I think it might actually use less power per unit of work done than the 3090, just by virtue of finishing almost twice as fast!
    Also, I often run jobs that might take my 3090 18 hours! This means the GPU will likely be processing in the late afternoon/early evening, when we in California face higher rates in an effort to reduce peak-time usage. A 4090 might get the job done before then and actually SAVE money on electricity.

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago +5

      That may be a great idea for a video: to see what the cost of rendering would be for the two GPUs. Thanks for the idea and for watching.
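A back-of-the-envelope version of that cost comparison, using the 18-hour job mentioned above, a roughly 2x speedup, and assumed (not measured) power draws and utility rate; swap in your own card wattage and tariff:

```python
def render_cost(power_watts, hours, rate_per_kwh):
    """Electricity cost of one render job: energy (kWh) times rate."""
    return power_watts / 1000.0 * hours * rate_per_kwh

RATE = 0.30  # $/kWh, assumed California-ish rate

# Assumed sustained draws; the 4090 finishes the same job in half the time.
cost_3090 = render_cost(350, 18.0, RATE)  # ~350 W for 18 h
cost_4090 = render_cost(450, 9.0, RATE)   # higher draw, half the time

print(f"3090: ${cost_3090:.2f}  4090: ${cost_4090:.2f}")
```

Even with a higher instantaneous draw, the shorter runtime can make the 4090 cheaper per job, before even considering peak-time rates.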

  • @RafiAnimates
    @RafiAnimates 1 year ago +1

    Great to have you back! This video was super helpful, explained nicely too. Thank you.

  • @Badutspringer
    @Badutspringer 1 year ago

    Thanks for your time. Great to get a sober, matter-of-fact benchmark review from someone in the industry.

  • @blackstar_1069
    @blackstar_1069 1 year ago +1

    Yay new video! Thanks for coming back

  • @Eric1935
    @Eric1935 1 year ago

    WOW! This is a pleasant surprise. Welcome back Mike! We missed you 😮

  • @zzzzzz-p2z
    @zzzzzz-p2z 2 months ago

    from Malaysia to Vancouver? Wow, good luck with that : ) nice video!

  • @steve55619
    @steve55619 1 year ago +3

    Great video. However, the 3090 has some other considerations too:
    - prices are even lower used on eBay; a used 3090 goes for about $750, while a used 4090 is still $1400+
    - 3090 has NVLink
    - 3090 does not require the new power connector (which requires a new PSU)

    • @mrquicky
      @mrquicky 11 months ago

      - More VRAM actually does equate to more computing power. There are multiple applications which simply won't train large models or render complex scenes without enough VRAM. His analogy suggesting otherwise simply made no sense.

    • @WantedForTwerking
      @WantedForTwerking 11 months ago

      @@mrquicky you just repeated what he said in the video? So you're just saying what he said lol

    • @mrquicky
      @mrquicky 11 months ago +1

      @@WantedForTwerking He said that more VRAM didn't equate to more computing power. But in fact, it does. If your application crashes due to memory constraints while executing fine on a card with more memory, then you have actually gained additional computing power that you didn't have before.

    • @branislavdjordjevic7226
      @branislavdjordjevic7226 22 days ago +1

      @@mrquicky I experienced issues with VRAM in Vray that crashed during the calculation stage. Vray swapped the calculation stage from CPU to GPU and that caused an issue; was it the driver or a shortage of memory? I managed to finish the animation by optimizing it. My guess is it was VRAM related.

  • @abdelkarimkeraressi1418
    @abdelkarimkeraressi1418 5 months ago

    I'm happy to see you back ... best channel for comparing GPUs with Blender. Thanks a lot man! If I get the money, of course I'm gonna get 2x 4090 ^^

  • @howitworks101
    @howitworks101 1 year ago +2

    happy to see you again🙂🙂

  • @zoltanmaszlag2982
    @zoltanmaszlag2982 1 year ago

    Hello, welcome back, I am very happy and very excited about your future videos

  • @Max-kf7on
    @Max-kf7on 1 year ago +1

    Thanks for the video! Would like to see tests in Substance Painter, as it's more relevant for me as a 3D artist.

  • @zavavastudio
    @zavavastudio 11 months ago

    Good to see you back Mike.

  • @gianlucapx
    @gianlucapx 6 months ago

    The first thing to do when getting a GPU this power hungry is to undervolt it: it takes a couple of minutes with MSI Afterburner and can reduce the heat, the noise and the power consumption.
    Worth mentioning that the RTX 3090 can be run in a pair with NVLink, and programs like Blender will see a single stack of 48 GB of VRAM.

  • @Atsolok
    @Atsolok 1 year ago +2

    Great video!! Could you do the same video with an RTX 4090 vs 2x RTX 3090?? Really curious about that outcome. Also curious what it does with DaVinci Resolve. Keep it up 😃👍

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago +2

      Yes I can. Once I get my hands on a second 3090, I will do these tests. Thanks for watching.

  • @arslantezel9772
    @arslantezel9772 1 year ago +1

    At last, a new video 😊, thank you already before I watch it.

  • @bernardsantos210
    @bernardsantos210 1 year ago +1

    Oh nice! I just ordered a 4090 cuz I use a quad monitor setup and the 3090 is crying hard whenever I connect my 4K monitor and have 3-4 Adobe apps open. Can't wait to see how much it improves things..

  • @clausbohm9807
    @clausbohm9807 3 days ago

    With two of these MSI Hybrid bad boy 4090's (powered by the G3 thermaltake 1650W PS) I will cut through my Blender renders like a hot knife thru butter!

  • @IstiakAhmed_UE5_VR-Dev
    @IstiakAhmed_UE5_VR-Dev 1 year ago +1

    Like your videos, man. Hope to see more videos.
    Can you also give Unreal Engine 5 scores in future videos?

  • @jibranbaig3d
    @jibranbaig3d 1 year ago

    Amazing as always, very useful information in all of your videos.

  • @marceliszpak3718
    @marceliszpak3718 10 months ago +2

    Thanks for this and other videos :)
    I'm waiting for material dedicated to Vram and limitations in large 3D scenes. And also the one comparing sets of two cards, including the 3090x2, also in terms of large 3D scenes and VRAM shared by a total of two graphics cards.
    I am currently planning to build a new computer for 3D rendering in Vray (expansion based on an old board and a first-generation 1950X TR processor). I need at least two graphics cards to increase the VRam capacity. Is the 4090 even an option for me to consider or is it not an option? If I had:
    1. 2x 4090 will Vray use 48GB of Vram or will it only use 24Gb from one graphics card?
    2. 1x4090 + 1x3090 - as above, will I have 48GB of VRAM available despite the lack of NVlink connection? will I only use 24?
    3. 2x 3090 - here, as I understand it, there will be no problem with using the entire VRAM - connecting the cards directly via NVLink
    To use the entire VRAM from two or more cards, do I have to have NVlink or is it also possible without it?
    Speed is important, but it is more important to have enough memory for large scenes.
    I was thinking about two cards and maybe buying a third one in the future (also for additional memory. I have a board that will allow this, although the first two cards will be x16 and the additional one would be on eight lines.)
    @MediamanStudioServices
    @steve55619
    @spontanp
    @dreagea
    Can any of you tell me whether Vray in the variants under consideration will be able to use all the VRam in each card - if only the 3D scene requires it?

  • @kkramlogan
    @kkramlogan 1 year ago +1

    Good to see you're back. Would it make sense for Dual 3090, if the cost was about the same? You'd have 2x the vram (say for motion design or rendering) and perhaps almost the same performance? Ofc, the power use would be much higher.

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago +1

      Possibly! But I have never been able to get two 3090s to NVLink together. Let me see what I can do; if I can get my hands on a second blower-style 3090, I will do the testing. Thanks for watching.

    • @kkramlogan
      @kkramlogan 1 year ago

      @@MediamanStudioServices That would be awesome. Np, you're one of the few that really tests for practical usage!

    • @ambientnaturally
      @ambientnaturally 1 year ago

      I think with DaVinci Resolve the memory won't stack, so one would still have 24GB VRAM.
      I've gotten mine nearly pegged out before at about 23GB. But that's fairly rare, mine runs at about 15GB usage when running Resolve.

    • @legendhasit2568
      @legendhasit2568 1 year ago +1

      We had that question with dual 3090’s before we moved to dual 4090’s.
      A 4090 was approx 2.1x faster in the Blender benchmark at the time we upgraded.
      4090’s have faster video encoding.
      4090’s are far more power efficient.
      Any investment we made into 3090’s just didn’t scale or make sense. I.e. if we wanted four 3090’s we would need water cooling etc, and then time, effort and money were better invested in 4090’s going forwards.
      I suspect driver and software updates now strongly favour 4090’s. HTH

    • @kkramlogan
      @kkramlogan 1 year ago

      @legendhasit2568 ahh, great feedback!!! How much faster is the encoding? I do a lot of video editing and 3d, so I think I'll have to move to the 4090.

  • @FluxStage
    @FluxStage 1 year ago

    Welcome to Vancouver! Great choice! Although I guess it could be Van in Washington.
    *Canada, very nice!

  • @dinisdesigncorner332
    @dinisdesigncorner332 1 year ago

    fantastic review!

  • @a_1236
    @a_1236 1 year ago

    Holy moly... he is back :D :D

  • @johnreiland9180
    @johnreiland9180 6 months ago

    "If you can afford the best, buy the 4090; it is outperforming the 3090, in some cases, by 50%"
    Small but important nitpick: If the 4090's performance is, in some cases, twice that of the 3090, then the 4090 is outperforming the 3090 by *100%*. Depending on which position you make your baseline, the correct description of the difference in performance changes. It would be correct to say the 3090 is underperforming the 4090 by 50%, but the correct way to describe the performance of the 4090 is to say it is outperforming the 3090 by 100%.

    • @anianait
      @anianait 22 days ago

      Indeed ... !
      And from the quote, surely, if money is not a problem, always buy the latest, strongest and most expensive ...
      I think that is quite obvious ... But I guess the point is to help people, for whom money is not overflowing, to think about the trade-off ... faster renders vs the electricity bill
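The baseline point in this sub-thread can be made concrete with two lines of arithmetic: the sign and size of a "percent faster" figure depend entirely on which card you divide by.

```python
perf_3090, perf_4090 = 1.0, 2.0  # relative scores: 4090 at twice the 3090

# 4090 measured against the 3090 as baseline: +100%
faster = (perf_4090 - perf_3090) / perf_3090 * 100

# 3090 measured against the 4090 as baseline: -50%
slower = (perf_3090 - perf_4090) / perf_4090 * 100

print(faster, slower)  # 100.0 -50.0
```

Same two numbers, two different percentages: "outperforms by 100%" and "underperforms by 50%" describe the identical gap.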

  • @raw_pc
    @raw_pc 11 months ago

    You can get 4x used 3090 or 1x new 4090 for basically the same money. You can nvlink 3090 so you get 48GB VRAM total where with 4090 you will be limited to 24GB always (no nvlink). I'm going with 4x3090.

    • @Unizuka
      @Unizuka 8 months ago +1

      NVLink bridges are expensive too, and right now it might be hard to find them. I agree with you though, two 3090s is the better option if energy is cheap where you live.

  • @ubertcoolie8694
    @ubertcoolie8694 1 year ago

    great video. thanks

  • @brushrunner
    @brushrunner 1 year ago +3

    The real test I wanted to see is 2x RTX 3090 on NVLink (48 GB) vs 1x 4090 (24 GB), on a 50 GB+ scene.

  • @legendhasit2568
    @legendhasit2568 1 year ago

    He’s back! 🎉

  • @whitecrow5672
    @whitecrow5672 8 months ago

    Very informative ❤

  • @bullit199
    @bullit199 1 year ago

    Used 3090 EVGA FTW for $700 here. Had it for 4 ish months👍

  • @lockdown_therapy6915
    @lockdown_therapy6915 1 year ago +1

    LOL Welcome back🤣

  • @Enjoyfun02
    @Enjoyfun02 6 months ago

    nice video

  • @dreagea
    @dreagea 1 year ago +1

    Looking forward to your VRAM video. The environments I'm building don't fit in a 24 GB card in Cycles. The heat is bad. ASRock WRX80 Creator MB, 3090, 4090, A5000, 256 GB RAM, open case, 1600 W PSU.

  • @DoggoCatto-xz2jg
    @DoggoCatto-xz2jg 10 months ago

    I was wondering if you could test dual GPUs in one system vs dual machines for rendering. Your Dual X8 vs Single X16 video was very helpful, but I haven't been able to find anyone who has tested this. I'm wondering: is it better to render off 2 GPUs in the same machine, vs buying a low-to-mid-end motherboard, CPU, etc., putting the 2nd GPU into that machine, and network rendering over LAN? Thanks in advance, appreciate the videos/tests you make.

  • @burnit7489
    @burnit7489 1 year ago

    I have a question and would love to see a video about it. Usually when I upgrade my rig I go for the high-end GPU, but when the GPU gets outdated I can't sell it because it's worth nothing, and I don't have great experiences selling my used PC parts; it's just not worth it for me.
    So my question is: is there a way to build a workstation that takes my outdated cards, like my 2080 and my current 3080 when I upgrade to the 5080, to work as a render node supporting my main machine (I use Vray and Blender)? It's like a retirement plan for my old GPUs.
    If this build is possible, what CPU should I get? A Threadripper because they offer 64 lanes, or just a Ryzen and run the CPU in 8x4 config? Linux or Windows? A WS motherboard or a server rack for GPUs?
    I just came across your channel; thank you for all the videos you've made, so helpful and to the point.

  • @sir1junior
    @sir1junior 1 year ago

    I'm actually quite curious about the power draw on the 4090 when rendering. It is extremely power efficient in games, I'm curious if it ever reaches the power limit when rendering.

  • @sirchet555
    @sirchet555 1 year ago

    miss you

  • @ricbattaglia6976
    @ricbattaglia6976 9 months ago

    Thanks! Is any Radeon GPU interesting for rendering? 🎄🥂🇮🇹

  • @a.tevetoglu3366
    @a.tevetoglu3366 4 months ago

    RTX 4090 vs RTX 6000 Ada vs RTX A6000 (Ampere) ... cool.

  • @williamsideas
    @williamsideas 1 year ago

    I knew 4090's were quite a bit better, but didn't think it was that much better. I still don't regret getting my 3090; if I need more processing power I will get a second 3090 with an NVLink bridge. When you run Blender in SLI mode, Blender gives you the option to stack memory, not just run it in parallel.

    • @marceliszpak3718
      @marceliszpak3718 10 months ago +1

      So if you have 2x 4090, the application sees and will only use 24 GB from one graphics card?
      Is this only in Blender or also in Vray? If you can't use the full VRAM memory using two 4xxx cards, it doesn't look good for 3D applications.

    • @williamsideas
      @williamsideas 10 months ago

      @@marceliszpak3718 Unfortunately that won't work with two 4090s because you can't NVLink them. 3090s have an extra connector on the outside that allows an NVLink SLI bracket to connect both cards to achieve shared memory. To be able to stack VRAM on the 4000 series you need to get the workstation version of those cards, called NVIDIA RTX 6000, which can stack VRAM like the 3090s, but they are extremely expensive.

  • @xplorer5694
    @xplorer5694 1 year ago

    What about RTX 4070 for blender learning?

    • @shadow777explorer
      @shadow777explorer 8 months ago +1

      The 4070 is still a good GPU; if you don't do things that require the extra VRAM, you are fine!

  • @rehanlive2
    @rehanlive2 1 year ago

    Would love to see your thoughts and tests on blower-style cards like the ASUS or Gigabyte RTX 3090 Turbo cards, and also the upcoming AFOX RTX 4090 cards. How much performance is being left on the table vs the usual FE coolers and three-fan designs?

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago +1

      I have one of the Gigabyte RTX 3090 Turbo GPUs. It works great for me. The fan is loud, but that is expected for blower-style cards; it does not run hot, though. I have also tested the 4090 blower-style GPU, which ran 10 degrees cooler than the three-fan styles I was testing.

  • @Hartley94
    @Hartley94 1 year ago

    🔥

  • @miladbehnam7481
    @miladbehnam7481 10 months ago

    Hi Sir Mike, I'd like to ask you a question, please guide me. I assembled a workstation computer for myself, but the problem is that booting is very slow; I have to wait around 1 minute 10 seconds for my computer to boot. Please tell me what I should do. This is my PC info: motherboard ASUS WRX80E, CPU AMD Threadripper Pro 5975, 256 GB DDR4 Corsair 3200 MHz, Samsung SSD 980 Pro, and RTX 4090. In the BIOS settings I set the RAM to DOCP, but that makes boot take longer than 1 min, so I set it to auto and put the RAM speed manually at 3200 MHz.

  • @majidio8585
    @majidio8585 1 year ago

    What about RTX 4070ti ?

  • @3Dgamespot
    @3Dgamespot 3 months ago

    I am waiting for the 6090 Ti in 2027.

  • @airun5362
    @airun5362 3 months ago

    Why not the 3090 Ti? It's so obvious.. Also (almost) nobody now would buy a "new" 3090, so it's a used 3090 Ti from $600+ vs a new or used 4090 for $1400+..

  • @furynotes
    @furynotes 1 year ago

    The 4090 is faster but the RAM amount did not change. If my 3090 decides to die, then I would get a 4090 or something better with more than 24 GB.

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago

      Well, the price is way more when you jump to these higher-VRAM GPUs, but yes, some workflows really need this amount of memory. Thanks for watching.

  • @Anti-FreedomD.P.R.ofSouthKorea
    @Anti-FreedomD.P.R.ofSouthKorea 1 year ago +2

    You have retUrned

  • @OlettaLiano
    @OlettaLiano 1 year ago

    Personally, I don't care about either Nvidia card. I bought an AMD RX 6900 XT for $550 USD, and thus far it's more than enough for my gaming and content creation needs.

    • @jacobstanley7089
      @jacobstanley7089 1 year ago +2

      That's fine, but for those of us in the professional industries, Nvidia is unfortunately the only real option for 3D rendering. AMD isn't really useful except for basic video editing using Resolve etc. Even then, CUDA is much better.

    • @OlettaLiano
      @OlettaLiano 1 year ago

      @@jacobstanley7089 Besides gaming and video editing, I do a lot of AutoCAD and my 6900 XT handles it just fine.

  • @ubertcoolie8694
    @ubertcoolie8694 11 months ago +2

    Thanks for the video. I'm a huge Blender fan. The RTX 4090 was way out of my affordability range. I got an RTX 4080 for $1050 from Best Buy early on Black Friday. $1050 is out of my affordability range too, but I bought it anyway.

  • @miladbehnam7481
    @miladbehnam7481 1 year ago +1

    I use an ASUS TUF RTX 4090 plus an EKWB water block + active backplate, with good results.

  • @KellyCocc
    @KellyCocc 1 year ago +2

    If the combined specs were comparable, are there any advantages to running 2 graphics cards as opposed to 1 killer graphics card? It can certainly be much cheaper to buy 2 lesser gods. I was just cruising the CPU Benchmarks site looking at value ratings and was thinking that 2 Radeon RX 6800's might be almost as good as the 4090 at a fraction of the price. Any thoughts on this matter?

    • @obebcervantesperez4754
      @obebcervantesperez4754 9 months ago

      how was it for you?

    • @KellyCocc
      @KellyCocc 9 months ago

      I have yet to try either but the guy at the shop was pushing 1 high end card over multiple lesser ones.

  • @kszanika7782
    @kszanika7782 1 year ago +2

    I am using a 4070 in my film editing rig. I find that system memory and M.2 drives are boss when it comes to video editing with DaVinci Resolve Studio. I have dual 22-core Intel Xeons and 256GB DDR4 RAM in my rig.

    • @liceafilms
      @liceafilms 3 months ago

      Sheesh, I’m using 64GB RAM, a 14900K and a 3070. I want to upgrade to at least a 3090.

  • @rusticagenerica
    @rusticagenerica 6 months ago

    Thanks a ton for your videos!! Miss them!! How do you combine 4x 4090, Sir? :)

  • @mcroman-superfeat
    @mcroman-superfeat 2 months ago

    THX for sharing this... very instructive...

  • @Unizuka
    @Unizuka 5 months ago

    I wish the 4090 had more VRAM.

  • @sudubaii
    @sudubaii 5 months ago

    you are being missed

  • @anpowersoftpowersoft
    @anpowersoftpowersoft 4 months ago

    Excellent videos as always, thanks.

  • @jonva13
    @jonva13 11 months ago

    These videos are great. This is exactly the kind of information I've been looking for.

  • @LeandroJack
    @LeandroJack 1 year ago

    I would like to see you do comparison with some AMD cards like the 7900XTX

    • @MediamanStudioServices
      @MediamanStudioServices 1 year ago

      I would like to, but there are many videos on YouTube that have proven that AMD is just not performing as well as Nvidia for most creative workflows. Most applications take advantage of the CUDA tech from Nvidia, and AMD is just not there in this market. It's not to say that AMD GPUs don't work in creative apps, but there is a reason that 90% of the GPUs used in the creative market are Nvidia. Thanks for watching.

  • @santiag0pictures
    @santiag0pictures 5 months ago

    Happy to see you back again

  • @sir1junior
    @sir1junior 1 year ago

    Welcome back 😊