Why people are WRONG about NVIDIA RTX 4090! [Asus TUF RTX 4090 Review]

  • Published: 22 Oct 2024

Comments • 513

  • @theTechNotice
    @theTechNotice  2 years ago +25

    What do you think about the 4090 board partner designs? What's your favourite? 👇👇

  • @georgioszampoukis1966
    @georgioszampoukis1966 1 year ago +65

    I upgraded to a 4090 from a 3070 Ti for deep learning applications and also gaming. Gaming-wise, I have set the card to a 60% power target, at which it only loses about 6-8% of performance. To give some perspective, I play Apex Legends at medium settings on a 4K 144Hz monitor; the game is locked at a constant 144 fps, with the card working at around 40% and consuming 120-150W on average. The 3070 Ti almost maintained 144 fps at 4K low, but overclocked, and it consumed around 300W on average. Insane efficiency.

    • @jcdenton7914
      @jcdenton7914 1 year ago

      That 60% power limit thing was done with the Founders Edition, which has a better PCB than most of the AIBs. Basically it has 70A power phases, and a larger count of them than reference boards; most AIBs with custom PCBs aren't matching it. There are maybe 5 total 4090s, not including liquid editions, that exceed the FE. This matters because it allows both greater overclocks and the ability to drop the power limits lower without performance throttling.
      When I did crypto mining with the 10 series when those were new, I could power-limit a good EVGA card to 60%, meanwhile some PNY that used a reference PCB would throttle hard at 70%. The reference PCB would perform fewer hashes at 80% than the custom EVGA PCB at 60%.

    • @Bubble1989
      @Bubble1989 1 year ago

      Hello, how are you? If I may ask, is your 4090 connector still fine, or has it melted?

    • @georgioszampoukis1966
      @georgioszampoukis1966 1 year ago +2

      @@Bubble1989 it has been running 24/7 at full load (training deep learning networks) for almost 4 months now and everything is perfectly fine. I am using the official Corsair cable though, not the adapter.

    • @Bubble1989
      @Bubble1989 1 year ago

      @@georgioszampoukis1966 Thank you for your reply. The reason I'm asking is because I have ordered a 4090 as well, and I was wondering if I made the right choice.

    • @OrigIswed
      @OrigIswed 1 year ago

      @@georgioszampoukis1966 Hi Georgios, I've been using a 2080 but have a 4090 on its way. How does it compare with the 3070 Ti for training models? Does it reduce training time substantially?
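The efficiency claim in this thread can be sanity-checked with a little arithmetic. A minimal sketch, using the commenter's own estimates (60% power target, ~6-8% performance loss on a 450W card) rather than measured values:

```python
# Rough perf-per-watt check for the power-target figures quoted above.
# All inputs are the commenter's estimates, not measured values.

STOCK_POWER_W = 450          # 4090 stock board power
power_target = 0.60          # 60% power limit
perf_retained = 0.93         # ~6-8% performance loss -> ~93% retained

limited_power_w = STOCK_POWER_W * power_target   # watts at the 60% target
efficiency_gain = perf_retained / power_target   # perf-per-watt vs stock

print(f"Power at 60% target: {limited_power_w:.0f} W")
print(f"Relative perf-per-watt vs stock: {efficiency_gain:.2f}x")
```

Under these assumptions the card draws 270 W and delivers roughly 1.5x the performance per watt of stock, which is consistent with the "insane efficiency" observation above.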

  • @freshwater1059
    @freshwater1059 2 years ago +79

    Love the fact you include creator export and rendering times here. They're becoming increasingly relevant.

    • @theTechNotice
      @theTechNotice  2 years ago +1

      Although I just realised I mixed up the 3090 and 4090 screen grabs, the stats are still the same :)

    • @nubnubbud
      @nubnubbud 2 years ago +2

      @@theTechNotice The issue is that the Adobe suite is useless as a benchmark, especially After Effects. It was made before multi-core computers were really a thing, let alone discrete GPUs. This is coming from a professional VFX editor's standpoint. Only 4 or 5 of the effects in AE are actually GPU-ready, many aren't even 16-bit, and none are multithreaded; that was removed because they broke it during an update to the decrepit 30-year-old program. As you can see here, in most Adobe software, better hardware gets you diminishing returns at best, and many generations of hardware actually made it worse. *It's just not made to run on today's hardware, and is poorly optimized. There's only so much that throwing more watts and cores at it can do.*

  • @chebron4984
    @chebron4984 2 years ago +49

    This is exactly the kind of stuff that needs to be talked about. Too many people are looking at these cards from the perspective of casual gaming; of course it's not going to be worth it for that unless you have deep pockets. The real value of a card like this is the time savings to be had for creators and professionals, where time is literally money. In such scenarios, the card literally pays for itself. I don't think anyone using these cards professionally cares at all about how much power it draws, but rather how quickly it can get the job done. And that's another thing: if it can complete heavy workloads quickly, then how much total power is it actually drawing at the end of the day?

    • @brolly8667
      @brolly8667 2 years ago +10

      Seems you're not living in Europe. Energy prices are something to consider here. Also the global warming thing; energy efficiency is the easiest way to help out on that, each one of us bit by bit.

    • @chebron4984
      @chebron4984 2 years ago +10

      @@brolly8667 you completely lost me at global warming.

    • @brolly8667
      @brolly8667 2 years ago

      @@chebron4984 Or climate change, whatever you call it. It's a thing, probably the biggest problem yet to be solved by humanity, whether you accept it or not. I know it is highly politicized, in the US especially, which it shouldn't be. Science should be free of that.

    • @Fin365
      @Fin365 2 years ago +12

      @@brolly8667 this video is literally all about how you'll probably use LESS energy creating & rendering on a 4090 than a 3090.

    • @brolly8667
      @brolly8667 2 years ago +5

      @@Fin365 Sure, which is a good thing. But I was reacting to the comment from Chebron, which said that nobody cares about energy consumption in the professional field, and I said why they also should. It's a good development that efficiency now also seems to be in focus, and not just pure speed and workload. But I guess that's mostly out of necessity, for cost or climate reasons, because you also have to get rid of all the energy you put in by dissipating the heat again.

  • @viktordimitrov9353
    @viktordimitrov9353 2 years ago +53

    Ok, I admit I wasn't expecting such good results in some benchmarks, and apparently I was wrong about how powerful the new 4090 could be. Thanks for the comprehensive video, it was quite helpful.

    • @theTechNotice
      @theTechNotice  2 years ago

      haha, yeah, did you see your comment in the video ;)

    • @theTechNotice
      @theTechNotice  2 years ago +1

      Honestly, I was amongst the ones who were wrong!

    • @viktordimitrov9353
      @viktordimitrov9353 2 years ago

      @@theTechNotice 😄😄😄 yes

    • @memeityy
      @memeityy 2 years ago

      It's due to an older chip, OS, RAM clock speed, among other things. Benchmarking a 4090 requires a 7950x/13900k, 6000MT/s DDR5, Win11, Z790 Mobo, etc. He used (a) garbage CPU, RAM, and Mobo. Watch the Linus Tech Tips video.

    • @carloscruz7317
      @carloscruz7317 2 years ago

      We knew how powerful it would be months ago; explore other channels.

  • @BaldGuyTalks
    @BaldGuyTalks 2 years ago +26

    For me, DaVinci Resolve performance is more important than gaming; that's why I appreciate your work.

    • @krishnabaro6036
      @krishnabaro6036 2 years ago

      Getting these cards just for gaming is ridiculous; a 3080 is more than enough, I think. Unless you're loaded and just want the best available.

    •  3 months ago

      For me, gaming is more important than DaVinci Resolve; that's why I appreciate your work also.

  • @erictang28
    @erictang28 2 years ago +7

    This channel is underrated, one of the best channels for a creator's PC build.
    Rendering time is more important to me than fps.
    When I find some creator PC build or comparison on a search engine, I just come back to this channel for the answers every single time.
    Love your videos so much, good job.
    Btw, I think you could also add Win 10 to the next comparison.

  • @originalityisdead.9513
    @originalityisdead.9513 2 years ago +12

    You don't play, you put the work in. Subscribed.

    • @theTechNotice
      @theTechNotice  2 years ago +1

      Thanks dude, and you're right, I don't play at all! :)

  • @b6s4shelter
    @b6s4shelter 1 year ago +6

    The TUF 4090 can draw up to 600W, compared to other cards at 450W. It's a pretty good AIB card.

  • @1981AdamGs
    @1981AdamGs 1 year ago +2

    Thank you for putting this in a different perspective. The truth is these cards aren't targeted at casual gamers. Most of the people complaining about the power consumption and the price of these cards were never going to buy one to begin with. The graphics card landscape has shifted some. It's become obvious that the 4000 series of cards aren't meant for casuals. They're meant for hardcore gamers chasing the most performance possible and professionals for work. If you're a casual gamer and want an entry level or midrange card you need to look at last generation.

  • @menteasoqquadro
    @menteasoqquadro 2 years ago +4

    Thanks for always being focused on creator stuff, among all the mass of reviewers who think PCs are just for gaming. Are you planning to also review the Intel Arc A770? Is Daz Studio performance comparable to Blender in benchmarks? Nobody tests Daz Studio, so it could be a nice thing to see on your channel.
    I also really like the technical quality of your videos: the colour palette, the accurate use of lights and the savvy camera work. Clearly your videos are very well planned and all have a consistent style.

    • @theTechNotice
      @theTechNotice  2 years ago +2

      Thanks dude, yeah I'm in the works with Intel about their GPUs :)

  • @noobsaibot2255
    @noobsaibot2255 2 years ago +6

    I definitely think the TUF feels more premium than a Strix. Yeah, they've got those race-car decals and RGB, but the TUF just looks much better; that's why I got the TUF 3080. The design and the temps were great. I really hoped ASUS was going to continue this line, and they made exactly the 4090 card I wanted. Ordered the moment it was listed :)

    • @The1Mischief
      @The1Mischief 2 years ago +3

      Not to mention this non-OC TUF is MSRP ($1600).

  • @enntense
    @enntense 2 years ago +9

    What's interesting is that across several channels, testing multiple card manufacturers, none of them are any better than the FE. So as of now it appears NVIDIA has saddled its partners with producing higher-cost, larger (non-liquid) cards that offer... nothing in terms of performance. EVGA's withdrawal becomes clearer.

  • @herculesmare4209
    @herculesmare4209 2 years ago +9

    Really good video. Everyone is looking at games and FPS, and it's hard to justify an upgrade from a really solid card like the 3090 or 3080, which will keep you playing games for some years still. In 3D production the 3090 was a godsend. The 4090 will enable smaller studios to achieve results faster and increase the scope of their projects. I agree most board partner cards are the worst examples of industrial design. Spending so much money on something that looks like a bad Christmas tree baffles my mind.

    • @jefferybucci30
      @jefferybucci30 2 years ago +1

      It's only hard to justify if you don't have a high-refresh-rate 4K monitor. I got a 4K 240Hz monitor that's $1500 by itself. I think I can justify a $1600 GPU to push that amount of 4K frames to that beast of a monitor!

    • @herculesmare4209
      @herculesmare4209 2 years ago

      @@jefferybucci30 Sadly, the pro or colour-accurate monitors for production that go above 60 Hz are so crazy expensive.

    • @jefferybucci30
      @jefferybucci30 2 years ago

      @@herculesmare4209 True. I got mine for gaming and I'm not regretting it. I currently have a 3080 and it's okay, but it's definitely not able to push the amount of frames at 4K that I want.

    • @Monty16v
      @Monty16v 2 years ago

      @@jefferybucci30 Except they screwed us with DisplayPort 1.4a and HDMI 2.1,
      so we won't be gaming at 240Hz 4K without chroma subsampling.
      That's the one thing that annoyed me about this launch.

    • @amrishpatel3501
      @amrishpatel3501 1 year ago

      I'm going for a 4070 Ti instead, since I can't afford a 4090, and also I'd rather build a new system if I wanted a 4090.
      I'm going to pair that 4070 Ti with a Ryzen 9 5950X after I upgrade from my 3950X.

  • @alexandrevaliquette1941
    @alexandrevaliquette1941 2 years ago +1

    PLEASE HELP ME CHOOSE A BUDGET OPTION:
    3060 (12GB), 3060 Ti (8GB) or 3070 (8GB)?
    I'm only doing DaVinci Resolve 4K with the H.264 codec from my Pixel 3 XL phone.
    My setup: 1060 (6GB), 64GB RAM, 6-core/12-thread CPU.
    Thanks for any comment here!

  • @Gr13fM4ch1n3
    @Gr13fM4ch1n3 2 years ago +17

    As a Blender artist, this video was so reassuring for me. Thank you for putting your time and effort into making a thorough creator review.

    • @stuartfury3390
      @stuartfury3390 2 years ago

      It's wrong though; since when can you use DLSS with Blender? It's not applicable.

  • @winterfireofficial2101
    @winterfireofficial2101 2 years ago +23

    Very impressive for 3D workloads. I'm hoping AMD brings something impressive to the table with RDNA3, especially for creators. I'm building a new workstation for my daughter soon and it would be great to have something that can compete, ideally at a slightly lower price point. Additionally, I really don't want to give NVIDIA my money; they have a great product but very questionable business practices, and I just don't like the way they treat the consumers of their product (and AIB partners for that matter, *cough* EVGA).

    • @djsahilking3807
      @djsahilking3807 2 years ago

      I don't want to pay tax, I don't like my government and the way they treat people, then *cough* inflation 8.1.

  • @ArtemDzhadzha
    @ArtemDzhadzha 1 year ago +2

    I bought precisely this one (ASUS TUF 4090 OC) about 2 months ago, and I'm extremely happy I didn't go with the Strix (I had the money, they were just out of stock). I like the design, I like the minimal RGB, which I set to reflect GPU temp. It never runs higher than 65°C, even in the most demanding games with 4K and RT on.

    • @damienfiche2592
      @damienfiche2592 1 year ago

      Do you get any coil whine? If not, I'm curious: what PSU are you using?

    • @sequayach
      @sequayach 1 year ago

      @@damienfiche2592
      My TUF 4080 whines a lot.

  • @Real_MisterSir
    @Real_MisterSir 1 year ago +5

    With a little undervolting, this card should still perform well above the 3090 Ti, but at the TDP of my current 2070S. Now that is quite an achievement! I saw some tests (der8auer, I believe) that found the efficiency curve of the 4090 could easily retain most of its performance at far lower wattage than its stock TDP rating. And you'd have an ultra-cool GPU on top, due to the overbuilt coolers on this generation of cards. I truly like everything about it; only the price is quite spicy, but I'd much rather get this than any of the AMD cards or the 4080.
    My only personal drawback is that my render software of choice, Corona Renderer, relies on CPU rendering and only uses the GPU for denoising 😭 At least my 13900K should take care of the rendering, I hope.

    • @Astelch
      @Astelch 1 year ago +1

      LinusTechTips did the undervolt experiment where they tried to undervolt the GPU to make it work with a 550W PSU. Even at like 60% of its power usage, the GPU only lost roughly 10 frames at 4K Ultra in Cyberpunk.

  • @rf-cinematic1576
    @rf-cinematic1576 2 years ago +3

    Thank you for doing a creator's perspective review of this card. I built a new PC last year with a 12900K and 128GB of DDR5 RAM, but with my old graphics card (GTX 1080 Ti). An RTX 3090 cost about €3000 at that time. My most important program is DaVinci Resolve, and mostly Fusion, so this card will completely change everything for me.
    I directly ordered the MSI 4090 Suprim Liquid X. I like the design... unlike the other 4090s, as you said in the video. And with a liquid cooler, the card is way smaller than the others.

    • @mct8888
      @mct8888 2 years ago

      $1749? RIP.

    • @Xyz_Litty
      @Xyz_Litty 2 years ago +1

      @@mct8888 I just ordered the Zotac 4090 AMP Extreme AIRO... RIP my $1800.

    • @rf-cinematic1576
      @rf-cinematic1576 2 years ago

      @@Xyz_Litty I'm living in Germany... it's so much more expensive here... I paid €2329.

    • @Icenfyre
      @Icenfyre 2 years ago +1

      @@rf-cinematic1576 Same price in Portugal, and the Strix is €2549. We're getting fucked here.

  • @nenickvu8807
    @nenickvu8807 2 years ago +14

    Great stuff! Glad you focused on a creator oriented review! Love your channel!

  • @sanaksanandan
    @sanaksanandan 2 years ago +6

    Best review of the 4090. For Resolve and Blender it's a no-brainer, if you need that kind of performance.

  • @LRNZ09
    @LRNZ09 2 years ago +3

    You're the only reviewer, among the videos I watched about the 4090s, who noticed the typo on the card 😅. And I totally agree with you about the overall design of the cards; they all look so damn cheap, other than a few exceptions.

  • @MatthewLienMusic
    @MatthewLienMusic 10 months ago

    Can you recommend the quietest version of this card? Is the ASUS TUF version quiet with 3 fans? Thanks.

  • @angelaguilera600
    @angelaguilera600 2 years ago +1

    If I buy a 3080 Ti or 3090 for 1440p gaming, how many years will it be good for before I need to upgrade to another GPU to be able to play new AAA games in the future?

    • @StevieCEmpireofUnitedGaming
      @StevieCEmpireofUnitedGaming 2 years ago +1

      5 years or more, maybe 10 max.

    • @angelaguilera600
      @angelaguilera600 2 years ago +1

      @@StevieCEmpireofUnitedGaming thanks

    • @StevieCEmpireofUnitedGaming
      @StevieCEmpireofUnitedGaming 2 years ago

      @@angelaguilera600 It's cool xD I have a ROG Strix 3090 OC for that very reason, and plan to stick with 3440x1440p at highest settings. Hope you find a good deal; my 3090 was a limited edition at 2200 pounds.

  • @uriel-rl4zy
    @uriel-rl4zy 1 year ago

    Hi, at 3:55 you show the encoder results for both the RTX 3090 and 4090, showing a 26.5% increase for the RTX 4090. Are you sure this is correct? Isn't that a lower-is-better time, as it seems to measure the time it took for the encoder to finish the task, so less time should be better, no?
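The question above comes down to how a percentage is computed for a lower-is-better metric like export time. A minimal sketch with hypothetical times (not figures from the video) showing the two common readings:

```python
# For a lower-is-better metric like export time, "X% faster" can mean two
# different things. Hypothetical times for illustration only:
t_3090 = 100.0   # seconds on the older card
t_4090 = 79.0    # seconds on the newer card

time_saved = (t_3090 - t_4090) / t_3090   # fraction of the time removed
speedup = t_3090 / t_4090 - 1.0           # throughput (jobs/hour) increase

print(f"Time reduced by {time_saved:.1%}")   # 21.0%
print(f"Throughput up by {speedup:.1%}")     # 26.6%
```

So a ~26.5% "increase" can be a perfectly valid throughput figure even though the raw time is lower; the same run expressed as time saved would read as ~21%.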

  • @DannyNilsson
    @DannyNilsson 2 years ago

    I hoped it was a review of the TUF card. I think you missed a lot of details, like idle wattage, where the 4090 will use a bit more than the 3090; also that ASUS is the only card with 2 HDMI ports; and, for the last part, whether it uses 3 or 4 power connectors, as this depends on whether it goes beyond 450W in overclocking.
    What is the difference between the TUF and the TUF OC model? All the cards look very much the same, and looking at the INNO3D OC model, it was not even able to overclock. So what does this OC mean in ASUS terms?

  • @TheGameBench
    @TheGameBench 2 years ago +1

    Keep in mind that DLSS 3 is just DLSS 2 with Reflex and frame generation. So it's using AI to create frames that DO NOT exist, which I could see as being problematic for a content creator as well. And I take it that it has dual ENCODERS, but not dual DECODERS? I'm not as interested in encoding support as I am in decode support. I was super excited about this, but less so if it's just the encoders. Since I work in x264, I was hoping for big gains in timeline performance.
    Regardless... these were the benchmarks I was interested in. It'll be interesting to see where prices fall once NVIDIA burns through 30-series cards and runs out of less price-sensitive people.

    • @theTechNotice
      @theTechNotice  2 years ago

      Yeah, just 1 decoder but 2 encoders...

  • @ChatGTA345
    @ChatGTA345 1 year ago

    Did you experience any coil whine on your TUF, and if so, has it gone away over time?

  • @lewiswright4038
    @lewiswright4038 2 years ago

    Is this a 3-slot card? I want to put it in the NR200; I've always liked the TUF cards.

  • @torom86
    @torom86 2 years ago +4

    I totally agree, the Founders Edition has the best design, but this Asus TUF looks nice too. Most other designs are for gamers as that's the majority of people who will buy these cards, much more than professionals. I'm a gamer too but I really prefer a more professional looking product over a flashy RGB one.

    • @rpospeedwagon
      @rpospeedwagon 2 years ago

      Same. I really want the FE or MSI Suprim X. I have a Gigabyte OC coming in, but I'm thinking of cancelling.

  • @vidiveniviciDCLXVI
    @vidiveniviciDCLXVI 1 year ago +1

    I just got the MSI Suprim Liquid X RTX 4090 in my 13700KF, 570, DDR5-6400 system, hit the top 120 systems in the world in benchmarking (Superposition), and can get 140 FPS in Cyberpunk 2077 at 4K with everything maxed out. Loving it.

    • @vidiveniviciDCLXVI
      @vidiveniviciDCLXVI 1 year ago +1

      Also, please vertical-mount your 4090: it looks badass and it protects it from damage. The HYTE Y60 is a small, beautiful case; I got the white one with red RGB and black accents and it looks out of this world. Got my 4090 in push-pull, with a 360mm rad in the top cooling the CPU. The build cost me £3859, and I've already had a mate offer me £5000 to buy it off me. I refused, of course.

  • @bing8893
    @bing8893 2 years ago

    What case would you recommend for the 4090 then? I've seen a mix of recommendations, like the P600S, O11 Dynamic XL, etc.

    • @theTechNotice
      @theTechNotice  2 years ago +1

      Yeah, both of those work. I guess all full towers work; you just have to be careful with the mid-towers :)

  • @ArielBrener
    @ArielBrener 2 years ago +1

    Can't wait for a 13900K + 4090 vs Mac Studio Ultra extensive video editing comparison!!

  • @joescalon541
    @joescalon541 2 years ago +3

    The main problem with the 4090 is the 3090. Unless you only game at 4K, get a 3090 or 6900 XT. Also, many 3090s are going on sale between $799-980 new, and lower used. You can build an entire 3090 gaming computer for the price of a 4090, especially if you can't find a 4090 FE.

    • @alexandrevaliquette1941
      @alexandrevaliquette1941 2 years ago +4

      You know this channel is about content creators, right?

    • @Real_MisterSir
      @Real_MisterSir 1 year ago

      I still prefer seeing 60-70% better ray-tracing performance in games, as running max graphics settings is the sole reason to get a card like this for gaming anyway. It's also why I am not fond of the AMD 7900 XTX, as it doesn't care about anything but raster, which is quite short-sighted imo. The 3090 would have to be a really good deal for me to consider it; it's over 2 years old now and has its share of problems, some cooling hotspot issues etc., and overall it's just not as high quality as the 4090, without even considering the performance upgrade.
      If the 3090 performance point is what's relevant, then I'd rather wait for 4080 cards to get in stock and drop in price a bit.

  • @justinreinsma9772
    @justinreinsma9772 2 years ago +1

    Performance-wise and thermally, is there a gap between the FE and the partner designs? I absolutely hate this generation of designs and the massive prices on the AIB cards, but because it's for work, performance is king for me.

    • @theTechNotice
      @theTechNotice  2 years ago +1

      No, not a huge difference, if any at all :)

  • @asryan7582
    @asryan7582 2 years ago +1

    Just ordered an Asus TUF 4090 (non-OC).
    Is it the same as the OC one? Can I just do it myself? I read it was possible on the old TUF to flash the BIOS of the Strix; is that safe and doable on this card as well?
    I saw a test of the TUF; performance and temps look amazing, 62°C vs 74°C on the Founders at load 😊 Will get it by Saturday, can't wait!

    • @lawyerlawyer1215
      @lawyerlawyer1215 2 years ago

      I'm not sure if the non-OC card allows overclocking.
      I understood that OC stands for "it allows overclocking" and not so much for "it comes overclocked".
      That said, you won't really need to overclock this card 😅

    • @blacklevo9989
      @blacklevo9989 2 years ago

      It's not even 50 MHz of OC, so no difference :)

  • @JohnSmith-vd8nn
    @JohnSmith-vd8nn 2 years ago

    I don't trust your wattage numbers for the 4090. Look at the differential being pulled by the whole system from the wall and see if it lines up (after adjusting for the difference in idle wattage).

  • @leopardchermar8556
    @leopardchermar8556 2 years ago +1

    An i9-9900K with a 4090: will there be a big loss of performance at 1440p? A bottleneck?

  • @Static-MT
    @Static-MT 2 years ago +3

    I'd like to see the 4090 downclocked to a 350W max and see how it compares to a 3090. The NVIDIA driver's command-line utility should let you do it.

    • @DannyNilsson
      @DannyNilsson 2 years ago

      Yeah, when I buy the 4090 I will lock it to a max FPS so I get some energy savings in return. The downside of the 4090 seems to be quite a high idle watt usage; it tops the list.

    • @Static-MT
      @Static-MT 2 years ago +1

      I'd like to actually see the test results at the low end and at idle for 2D productivity work. I'm not sure how low it can actually be throttled. Maybe... just maybe, the card actually gives people options to either run at bleeding edge speeds or scale down for work and save on electricity. I haven't seen any PC press look at it this way, yet. Everyone's been "OMG so much POWER!" - but it's actually completely optional that the card runs at these speeds.
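The command-line power limiting mentioned in this thread is done with `nvidia-smi` and its `-pl` / `--power-limit` flag. A minimal sketch that builds the command; the 350 W value and GPU index 0 are placeholders, applying a limit requires administrator/root rights, and the allowed range varies by card (query it with `nvidia-smi -q -d POWER`):

```python
import subprocess


def power_limit_cmd(watts: int, gpu_index: int = 0) -> list[str]:
    """Build the nvidia-smi invocation that caps a GPU's board power.

    The limit only applies if it falls inside the range the driver
    reports, and running it needs elevated privileges.
    """
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]


cmd = power_limit_cmd(350)   # cap the first GPU at 350 W (placeholder value)
print(" ".join(cmd))         # nvidia-smi -i 0 -pl 350

# To actually apply it on a machine with an NVIDIA GPU and driver installed:
# subprocess.run(cmd, check=True)
```

The limit persists until reboot unless persistence mode is enabled, so it is easy to experiment with different caps and re-run a benchmark after each one.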

  • @JayzBeerz
    @JayzBeerz 2 years ago +1

    Nice, but needing a more powerful power supply and a bigger case is extra money.

  • @m0nztam0nk
    @m0nztam0nk 2 years ago +1

    No H.265 4:2:2 10-bit decoding... 😂 ...in 2022. Thank you for pointing that out. This cements the Mac as THE platform for video editing for me. Sure, Intel's iGPU does it kinda decently, but not nearly as smoothly and quickly as a MacBook Pro Max does. So thanks, you saved me a TON of money!

  • @Marcus_R
    @Marcus_R 2 years ago +1

    Nice comparison of this crazy GPU.
    Do you think that in a Ryzen 5900X 12-core system the RTX 4090 can use its full potential in DaVinci Resolve?
    Or should I wait until the 3090 prices drop some more and stick to that card?

  • @NewTestamentDoc
    @NewTestamentDoc 2 years ago

    when doing email/browsing, what are the power uses of a 3090 and 4090? Are they the same when just cruising along? Or are we going to see 225 watts even when idling away on the 4090?

    • @theTechNotice
      @theTechNotice  2 years ago +2

      No no, it'll idle at a few watts; I saw it at like 8W at its lowest, if I remember correctly.

    • @NewTestamentDoc
      @NewTestamentDoc 2 years ago

      @@theTechNotice I wonder why no other reviewers ever say this? So thankful you took the time to share it! Surprised how low the wattage is when it's only idling!

  • @TR-707
    @TR-707 1 year ago

    Did NVIDIA state that the 5000 series will be double the performance of the 4000 series, or was it speculation? That definitely affects my decision to jump.

  • @davidbenini8144
    @davidbenini8144 2 years ago +1

    I hope and believe that the performance in Resolve, in PugetBench, is tremendously limited by the platform and the CPU... in fact there is superior performance with a 7950X and RTX 3090. I would have expected an overall score above 3000 pts.

    • @theTechNotice
      @theTechNotice  2 years ago

      Yeah, hugely CPU-limited and RAM-bottlenecked.

    • @davidbenini8144
      @davidbenini8144 2 years ago

      @@theTechNotice PugetBench released the results for the 4090. From a quick search for exactly identical systems, between the 3090 Ti and the 4090, examining only the GPU effects score, there is an increase of 36% (7950X, overall score 2770). Let's see what they will do with the 13900.

  • @PaR2020
    @PaR2020 2 years ago +2

    To get the actual wattage it is better to use hardware, as the Gamers Nexus channel did. There it was drawing almost 500 watts! This means that in your tests it could have done the same; it's just that software can be inaccurate...

  • @JoseFernandez-cd7ew
    @JoseFernandez-cd7ew 1 year ago

    Is this card good for DaVinci Resolve? Better or worse than the Radeon 7000 series?

  • @DizConnected
    @DizConnected 2 years ago

    I wanted an RTX 4090 FE, but they were sold out within 40 minutes when I checked their website. ALL the RTX 4090s are sold out at retailers atm.

    • @Monty16v
      @Monty16v 2 years ago

      Sold out in less than 1 minute, you mean.

  • @ojtothemax
    @ojtothemax 2 years ago +1

    Still, a 3090 Ti vs 4090 comparison would be great, to know which one is the better deal now.

  • @tek1645
    @tek1645 2 years ago +3

    Never trust mid-sized channels like this one, as they want to satisfy the big three to get free parts. Only watch niche or big TechTubers.

  • @anfproduction72
    @anfproduction72 1 year ago

    How good is it in Unreal Engine and the Aximmetry software?

  • @TEXASMOFO
    @TEXASMOFO 1 month ago

    Excellent analysis! Just what I wanted to hear for use with DaVinci Resolve. 👍 Do you plan to conduct any local AI rendering (i.e., Stable Diffusion, Flux.1 Dev, Mangio, etc.) comparisons in future videos? 🤔

  • @daicaghund3587
    @daicaghund3587 1 year ago +1

    It's interesting to worry about the appearance of the card.
    I guess if I spent my time looking at the GPU instead of using it for its intended purpose, I might care about aesthetics. My concern for the GPU's appearance lasts as long as it takes to get it installed. Once I'm using it, I couldn't care less.

  • @mshea5906
    @mshea5906 2 years ago +1

    With you 100% on the partner designs of the cards. The only 2 I would consider are the ASUS TUF Gaming and the Founders Edition.

  • @romzen
    @romzen 2 years ago +1

    The 4090 is a sick piece of tech. People were already ungrateful for the 20 and 30 series, bitching despite the ridiculous performance increases over the years. Hell, people still try to get their hands on a PS5, which has the GPU power of a 3060/2080. And in the end everyone tries to get their hands on these new cards. Hypocrites.
    Much love to NVIDIA for this next huge step.

  • @vertigo1721
    @vertigo1721 2 years ago

    Is a GPU a necessity for streaming?

  • @sergiosanchez2012
    @sergiosanchez2012 2 years ago +2

    The Asus TUF is $1799 plus taxes for a 40% increase in video editing performance. You can get a 3090 (I just got one yesterday) for $830, taxes included.
    Less than half the price.
    It's a great card, don't get me wrong, but since for NVIDIA Moore's Law is dead, they will charge you for it. Meaning in a couple of months we will get a 4080 12GB with less performance than a last-gen 3090 for the same money.

    • @beastslayer9153
      @beastslayer9153 1 year ago +1

      Your comment was spot on! Though it's the same performance for MORE money! haha

  • @AnatoleBranch
    @AnatoleBranch 2 years ago

    Waited outside Best Buy at 4:30 this morning, only to be told they had nothing in stock. Rushed to a Micro Center and got the Gigabyte OC, as the TUF was already taken. Now I see a bunch of scalped cards on eBay with $1k+ markups, insane. Best Buy and NVIDIA seem to have no interest in selling directly to consumers. Grateful I got one; very frustrating with Best Buy and the FE edition.

  • @alpenfoxvideo7255
    @alpenfoxvideo7255 2 years ago

    Does it feel smoother in the Resolve timeline? I'm tempted by this, but also by a new Intel chip with the iGPU.

    • @theTechNotice
      @theTechNotice  2 years ago

      Well, the decoders are the same, so the only boost you get is in effects, colour grading, Fusion etc., and also debayering on R3D and other stuff; a purely H.264 and H.265 timeline is very similar.

    • @alpenfoxvideo7255
      @alpenfoxvideo7255 2 years ago

      @@theTechNotice thanks man!

  • @davidfidler8761
    @davidfidler8761 1 year ago +1

    And then we look at "performance-per-£". Here in the UK I can pick up an RTX 3090 on eBay for ~£725 (though some, on rare occasions, go for £650). Compare that to the prices from any vendor in the UK: the LOWEST price I can get an RTX 4090 for is £1800... but they go up to £2300. (Note: the MSRP you see in the US does not include sales tax, so add another 5-10% to the price tag. It's not $1599, it's closer to $1700 USD, which, btw, is more than the cost of every other bleeding-edge component I could put into a new build.)
    So... that means (here in the UK) the RTX 4090 is 2.5x the price of the RTX 3090 (using the numbers most favourable to the 4090 - I could have made it look worse). So even taking the best score comparisons you showed (which fluctuated between 1.9-2.3x the performance), it comes down to...
    1. The RTX 4090 consumes more power - other channels have measured from the wall, not from inside Windows, as you have
    2. The 4090 gets better performance-per-watt
    3. The 3090 gets better performance-per-£
    So the real answer is: what can you afford? I've seen sites that say "most will just spend a tiny bit more to get the 4090" - but at nearly £2000, most of us can't drop that kind of cash every couple of years on a GPU. It's utterly ridiculous.
    Basically, you would only buy the 4090 if you are going to make money from it (or if you just have the spare cash lying around).
    In my eyes, it's a golden opportunity to get into the 3090 if you are still on a 20-series card. Because buying that 4090 is going to cost you £1000+ to rent it for a year (if you flip it just before the 50-series comes out), or if you like to keep your cards forever, you're going to absorb nearly the full cost of that card - ouch.
    I honestly think that the best solution for all of "us normal folk" is to NOT buy any 40-series stock. Let Nvidia control the stock levels... I would love to see them sitting on surplus inventory, drowning in pools of their own sweat. Sure, let the content creators and professionals buy it - they have a reason to - it's their livelihood.
    But the rest of us? I think we should vote for lower-priced GPUs by starving team green (and red). I have 4 TVs in my house and they - collectively - cost me £3000, and my family gets way more use out of them than most of us would ever get out of a GPU - and they'll last 5+ years.
    The GPU market just doesn't add up. People are wasting cash on these GPUs because videos like this make them seem like a reasonable purchase. Please stop supporting this lunacy with positive messaging like this.
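The performance-per-£ argument above is easy to sanity-check. A small sketch using the commenter's UK street prices and the 1.9-2.3x benchmark range they quote from the video (none of these figures are independently verified):

```python
# Commenter's UK street prices (GBP), as stated in the comment above.
price_3090_used = 725
price_4090_new = 1800   # lowest UK retail price cited

price_ratio = price_4090_new / price_3090_used    # ~2.48x the cost
for speedup in (1.9, 2.3):                        # benchmark range cited from the video
    value = speedup / price_ratio                 # perf-per-pound relative to the 3090
    print(f"{speedup}x faster -> {value:.2f}x the perf-per-pound")
# Prints ~0.77x and ~0.93x: even at its best-case speedup, the 4090
# delivers less performance per pound than a used 3090 at these prices.
```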

    • @ofensatul
      @ofensatul 1 year ago +2

      @David Fidler this is the sanest take on the RTX 4090 I've read all week. Everyone on Reddit is just pushing the 4090 for gaming, not workload/cash flow IRL, and they don't take into account the actual cost of electricity.
      I live in Romania. It's not a third-world country anymore, but the salaries are still some of the worst compared to the rest of Europe; we pay 26-27 cents per kilowatt-hour in a household and our salaries are stuck at 700-800 EURO max.
      People have become so ignorant today, it's crazy.

  • @ninbura
    @ninbura 2 years ago

    Do all 5 video outputs work at once? Or is it still limited to 4 monitors even with the extra output?

  • @DouglasDoNascimento
    @DouglasDoNascimento 2 years ago

    Very nice review. Is the 4090 supported by the ASUS Z690 Creator Pro MB? Will it play nice with other components like RAM, i9 12900KS, etc?

    • @theTechNotice
      @theTechNotice  2 years ago

      Yes, it's supported on all motherboards that support Gen 4 PCIe x16 slots :)

  • @GySgt_USMC_Ret.
    @GySgt_USMC_Ret. 1 year ago

    Outstanding video! Just received my 4090 TUF OC and waiting on Cablemod order before I install it.
    Fair winds and following seas to all.

  • @valhallasashes4354
    @valhallasashes4354 1 year ago

    3:59 - I could be wrong, but I think you've accidentally swapped the encode results. You clearly say the 4090 was faster and even write it in green in the frame, but the results screenshots have the better-performing numbers (both a higher number of frames processed per second and a shorter encode time) on the left under the "3090" designation, and the worse-performing results on the right under the "4090 +26.5%" designation.

  • @davidlyons5108
    @davidlyons5108 2 years ago

    Nvidia also stated it's 2-3 hours of measurement for the heat percentage, not initial heat measurements.

  • @yekyawaung6195
    @yekyawaung6195 2 years ago

    Please mention the TUF version's temps; we have seen that 4090s run cooler and use less power, with eye-catching card designs (MSI Suprim, TUF, and Founders).

  • @The-Weekend-Warrior
    @The-Weekend-Warrior 2 years ago

    If a CPU temp is lower, shouldn't that be a PLUS rather than a MINUS? In your chart at 9:30, the baseline is green (100%) and the BETTER temp (92.19% and 98.19% less) is shown in RED... a bit counter-intuitive.

  • @ShadowCatGambit
    @ShadowCatGambit 1 year ago

    I'm looking to find ways to make money off of my current GPU (RTX3070). What would be some good things for me to do?

  • @viktordimitrov9353
    @viktordimitrov9353 2 years ago +1

    I honestly wonder if it is possible to have an SFF build with a 13900K or R9 7950X combined with an RTX 4090. It will be quite interesting to see whether it will be possible to adequately cool the CPU.

    • @domenicpolsoni8370
      @domenicpolsoni8370 2 years ago

      LOL. No. It isn't possible.

    • @Mr.Plutonium
      @Mr.Plutonium 2 years ago

      ITX will probably need a waterblock, 99% sure.

    • @theTechNotice
      @theTechNotice  2 years ago

      that's a good challenge!

    • @viktordimitrov9353
      @viktordimitrov9353 2 years ago

      @@theTechNotice I personally plan to build a mini-ITX configuration with a 13900K and RTX 3080 Ti. The goal is to be able to use an 850W PSU. The case is the ASUS Prime AP201, the mobo a ROG STRIX Z690-G GAMING WIFI. Time will tell what will happen. 🙂

  • @Tiesto940
    @Tiesto940 2 years ago +1

    Alright, I take my words back; the days when every generation stayed within a 250W power limit are gone :(

  • @junhadiatmadja9603
    @junhadiatmadja9603 1 year ago

    Hi, could I check if the CPU you used for the benchmark testing was a 12th-gen Intel i7?

    • @theTechNotice
      @theTechNotice  1 year ago

      Yes

    • @junhadiatmadja9603
      @junhadiatmadja9603 1 year ago

      Hi, thank you. Could I ask, would there be a significant improvement if the CPU were an Intel i9? Is it worth the extra cost for 3D modelling and rendering?

  • @richardweddle3408
    @richardweddle3408 2 years ago +1

    Very helpful. Concise and well-organized review. You're getting high performance results on a gen4 mobo and a gen4 CPU. Although I wonder if you'd get the same results with a Ryzen 5950x.

  • @sudd3660
    @sudd3660 2 years ago

    I'm a Blender user, but 2x performance is irrelevant if the price is too high.
    And then for all that money I still have to deal with goddamn LEDs.

  • @aurianschoune3736
    @aurianschoune3736 2 years ago

    Thanks for all that you do for your community :)
    I'd like to know if the GPU has an impact on playback in general? In Premiere Pro?
    For now I'm using a GTX 1660, but sometimes the effects on the timeline are a bit buggy. I don't really care about exporting faster, but by changing the GPU, will playback improve? Even a bit?
    Technical question, I know.
    Thanks in advance!

    • @baoquoc3710
      @baoquoc3710 2 years ago

      I'm not really into creator stuff, you know, but you can get an RTX 3060 to reduce and minimize stutters in Premiere Pro.

    • @WINT3R_94
      @WINT3R_94 1 year ago

      It does help. You still need to use proxies if editing 4k or 8k content for smooth playback.

  • @clausbohm9807
    @clausbohm9807 1 year ago

    Why don't you use DaVinci instead of Premiere?

  • @GG-gr2px
    @GG-gr2px 2 years ago

    Can someone explain why this card in the USA costs $1600 and in all over Europe the cheapest one costs €2100?

  • @toddpeterson5904
    @toddpeterson5904 2 years ago +1

    I see used 3090s coming down to $700 now, which makes them a better value than a $1600 (plus tax) card. Same amount of VRAM. In some cases you'll get 2x performance with the 4090, or you could get 2x 3090 and have several hundred dollars left over. In apps like Resolve, 40% faster is not worth 2x the price unless your paid work will make you more money with that increase.

    • @JedHurricane
      @JedHurricane 2 years ago

      Are you forgetting the power draw? The 4090 only pulls 450W; with 3090s it's 350W each.

    • @toddpeterson5904
      @toddpeterson5904 2 years ago

      @@JedHurricane It depends on how much the power cost matters to you. Running 2x 3090 would be 250W more at full load. Assuming full load for 2000 hours a year at 10 cents/kWh, the difference is $50 a year - again, assuming you run at full load 8 hours a day, 5 days a week, for 50 weeks a year.
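That $50-a-year figure checks out; here is a quick sketch of the arithmetic, using the wattages and assumptions from this thread:

```python
# Extra yearly electricity cost of running 2x RTX 3090 (350W each) vs 1x RTX 4090 (450W).
# All figures are the commenters' assumptions, not measured values.
extra_watts = 2 * 350 - 450            # 250 W more at full load
hours_per_year = 2000                  # 8 h/day * 5 days/week * 50 weeks
price_per_kwh = 0.10                   # USD per kWh

extra_kwh = extra_watts / 1000 * hours_per_year   # 500 kWh per year
extra_cost = extra_kwh * price_per_kwh
print(f"~${extra_cost:.0f} per year")             # ~$50 per year
```

At higher electricity prices (say the 26-27 cents/kWh the Romanian commenter mentions) the same 500 kWh difference is roughly $130 a year, so the gap matters more in Europe.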

  • @petrsasek6077
    @petrsasek6077 2 years ago +1

    I just love how people say it's not worth it... I mean, if you are on a budget, don't look at the 4090... And if you use it for work, then the money you save on time is always worth it. And if you are just rich, why even bother thinking about it... Buy the best :D

  • @BaldGuyTalks
    @BaldGuyTalks 2 years ago +1

    I think relying on the HWiNFO power consumption readings produces inaccurate results. Der8auer and GamersNexus measured the wattage of the 4090 FE and came up with much higher numbers during regular gaming.

    • @theTechNotice
      @theTechNotice  2 years ago

      Well, if I do a FurMark test I can see 450W pulled as well. It depends on the application and which parts of the GPU it uses :)
      I'd like to do that test the way they've done it, but I don't have such equipment. Der8auer had a cool wattage meter indeed! :)

    • @BaldGuyTalks
      @BaldGuyTalks 2 years ago

      @@theTechNotice
      Der8auer is apparently ready to offer his meter for sale soon.
      Regardless, I truly appreciate the fact that you do productivity benchmarking and tests. Most high-end "gaming" cards are overkill for gaming, however they offer great value for creators on a budget.

  • @radwoc
    @radwoc 1 year ago

    13:13 I'm not sure what test platform you used for PugetBench for DaVinci Resolve, but looking at user results on the PugetBench page, most standard-test results for the 3090 are around 2000 and most for the 4090 are around 3000.

  • @Netsuko
    @Netsuko 1 year ago

    Pretty sure the DLSS3 part is nonsense. DLSS works by using pre-trained data. They use 16K resolution images from the game to train DLSS. In a 3D modeling application, there is no data to train on yet. So this bit makes no sense. DLSS can’t magically upscale everything without being trained on it first.

  • @mayorc
    @mayorc 2 years ago

    Well, the process node is 4nm vs 8nm, so it has to be very efficient on power consumption; the 4090 probably also has better launch drivers, and things can still improve.
    So far it's the most efficient and performant card out there, though we must wait for the suggested retail price to actually be respected (currently it is not).

  • @erenixereinx5409
    @erenixereinx5409 2 years ago +1

    Wow, 268W in Octane is crazy. This new generation of CUDA cores is unbelievable.

  • @davidpringuer
    @davidpringuer 2 years ago

    Specious reasoning. In terms of productivity more will get done, but the card will still be cranking away the watts for the working day.

  • @carloscruz7317
    @carloscruz7317 2 years ago

    DLSS 3.0 is possible with the 3000 series, and I think you'll get it when the inventory issue corrects.

  • @blueskys6265
    @blueskys6265 2 years ago

    This thing launches today, right? How and where do I buy it? Nothing appears available.

    • @azyndragon89
      @azyndragon89 2 years ago

      It launched a little over an hour ago. Availability obviously depends on where you're from. Assuming you're from America they went live on newegg and Best Buy this morning and are now sold out.

    • @blueskys6265
      @blueskys6265 2 years ago

      Oh thanks bud. I’ve been looking all morning on those two sites and even selected to auto notify when available. Such pain in the butt

    • @azyndragon89
      @azyndragon89 2 years ago

      @@blueskys6265 Damn, I'm sure you'll be able to get one in the next round if not this round. They sure seem to have a lot more stock and without crypto/scalpers it should be easier.

    • @blueskys6265
      @blueskys6265 2 years ago

      @@azyndragon89 Thanks a bunch. I’ll just keep looking. One will show up on the apps at some point.

  • @NarekAvetisyan
    @NarekAvetisyan 2 years ago

    For the record, the ray tracing speed-up in Blender is not as impressive as Ampere's was over Turing. Ampere was 3 times faster, due to new motion blur acceleration instructions in the RT cores. The 4090's efficiency in ray tracing, though, is incredibly impressive!
    For me, I'm gonna go with 2x RTX 3090 with NVLink, which the 4090 doesn't support. That will double my VRAM to 48GB in Blender in OptiX mode. I will get about the same performance as the 4090 with double the VRAM, although efficiency will be out the window. But more VRAM is more important.

    • @theTechNotice
      @theTechNotice  2 years ago

      And it's still gonna be cheaper than 1x a6000 so I get it :)

  • @Kaizen_Films
    @Kaizen_Films 1 year ago

    Excellent video! I have a question: could you make a video showing the differences between a system using an RTX 4090 and the same system using two RTX 4080s? Would there be much of a difference in performance within the programs?
    Thanks for the content! Keep up the great work you've been doing =D

  • @TechTusiast
    @TechTusiast 2 years ago

    Same thing I have said/asked on other 4090 benchmarks: wouldn't the 4090 consume more if you paired it with a CPU that can push it to full speed? At least a 7000-series, if not 13th-gen Intel.

  • @AngryOscar
    @AngryOscar 2 years ago

    Will the Asus 4090 TUF fit in the Lian Li O11 Evo case? Some people say it might hit the side panel.

    • @nbk2134
      @nbk2134 2 years ago

      My 3090 used to hit the side panel; I ended up getting a PCIe 4 riser cable and all is good with the world. Even if you manage to get this in, you'll never be able to fit the power cables.

  • @TruExFlame
    @TruExFlame 2 years ago +1

    I sincerely hope that GPUs will start getting smaller from this point on (and actually look good); the 4090 is way too chunky and will not fit in your average case. I also hope that the prices will be more reasonable with the 5000 series. In Europe these prices are actually even higher, which is awful considering how much bills have gone up since February.
    If you're not aware, using the UK prices here as an example, the 4080 FE non-imposter is 1269 GBP.
    Last gen, the 3080 MSRP was 649 GBP.
    For the 2080, the MSRP was also 649 GBP.
    That is almost DOUBLE the price.
    As for the 4090, the FE model is 1699 GBP.
    Last gen, the 3090 FE was 1399 GBP.
    That's a 20% price increase.
    The 4090 performs really well, and I'm sure the 4080 non-imposter will also look really good performance-wise, but the prices are just absolutely awful, which makes these cards unattractive. Sure, the 3090 was already ridiculous, but the 4090 went even steeper at a time like this. People earn the same kind of money over here, even though the cost of living has gone up about as much as Nvidia claimed the 4090 is faster than the 3090. It would be nice if AMD brings the same kind of performance to the table for a significantly lower price.
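The generation-on-generation jumps quoted above are easy to verify from the GBP MSRPs given in the comment:

```python
# UK Founders Edition MSRPs (GBP) as quoted in the comment; not independently verified.
msrp = {"3080": 649, "4080 16GB": 1269, "3090": 1399, "4090": 1699}

ratio_80 = msrp["4080 16GB"] / msrp["3080"]   # ~1.96x: "almost DOUBLE the price"
ratio_90 = msrp["4090"] / msrp["3090"]        # ~1.21x: the "20% price increase"
print(f"4080 class: {ratio_80:.2f}x, 4090 class: {ratio_90:.2f}x")
```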

    • @javiej
      @javiej 2 years ago +2

      Don't worry, prices will fall after Christmas. The new models are artificially overpriced so that they can sell the big stock of previous gen. They will fall afterwards

    • @TruExFlame
      @TruExFlame 2 years ago

      I've never seen a card fall below its MSRP well before the next gen's launch. If they sell their 30-series stock and aren't left with 40-series overstock at the 50-series launch, then maybe the 50 series will fall back to "normal" MSRP prices. But then, Nvidia has a reputation for being greedy.

    • @ryanthompson3737
      @ryanthompson3737 2 years ago

      First, you mentioned the highest-priced 4080 while comparing it to the lowest-priced 3080... it doesn't help your cause to play with numbers.
      Secondly, everyone expects a new card every couple of years with at least a 50% performance boost at the same price... this isn't realistic. The computer age is coming to an end for traditional computing parts, and it's becoming increasingly harder and more expensive to make massive improvements without literally doubling up what the previous card has. There have been some technological improvements (4nm vs 8nm before), but they're not going to magically print money for the companies inventing these technologies.

    • @TruExFlame
      @TruExFlame 2 years ago

      @@ryanthompson3737 Fair point, although from my perspective the one playing with numbers here is Nvidia. I can mention the 4080 imposter too and the price will still be significantly higher. The 12GB variant of the 3080 does not have a Founders Edition card, and I reserve the 3080 Ti price comparisons for when the 4080 Ti launches. I used the 4080 16GB model here, as the 12GB variant has significantly lower specs. I completely agree with Hardware Unboxed, who said that the 4080 12GB is a terribly named product.
      I do not expect a 50% improvement for the same price, but it sure is nice to have something like going from the 2080 to the 3080 at the same MSRP (if it weren't for all the stuff that went down that year).
      If it's unrealistic for prices to be more reasonable for the average consumer, what do you think will happen if prices continue to increase like this with future generations? I don't think anyone wants that. If it really does cost Nvidia significantly more to make these improvements, then they will have to figure out a way to make it cheaper for themselves.
      I have no data on how much it costs Nvidia to make these products, or how that compares to previous gens (for example Pascal 16nm going to Turing 12nm and Ampere 8nm), but I know that prices cannot just keep going up like that.

    • @ryanthompson3737
      @ryanthompson3737 2 years ago

      @@TruExFlame For your first section, that's why you can't compare prices to previous generations. If we were to compare the price to performance between generation, the 4000 series is actually cheaper per performance metric than last generation, even if it's more expensive overall. My point was that people don't cheer for a 10-20% improvement and are reluctant to spend money on miniscule performance increases. Nvidia knows this and knows more people will pay for 50-100% performance improvements, especially when the new gen cards are nearly identical in performance and price to a higher last gen card. Compare a 4080 to a 3090, and it's pretty worth it.
      The issue with saying "just make it cheaper" is that you can't make physics cheaper. When you're bordering on physical limitation, you can't just make it cheaper. This new generation is really a relic of the past. They used to just add more hardware and use some new advancements to make it a bit smaller just to get that increase, and now Nvidia is doing the same thing. The only difference is 20-30 years ago it was a lack of knowledge and advancement, now it's the universal limitations to the designs graphic card companies have boxed themselves into. I'd give it a few more decades before any real improvements are done with quantum computers or even some other design that isn't the standard used today.

  • @lolowfi
    @lolowfi 2 years ago

    All your calculations assume you can actually buy a 4090 at MSRP, which isn't true at all.
    The cheapest card in Germany is €2100, with most decent brands starting at €2500.

  • @txmits507
    @txmits507 2 years ago

    Every YouTuber says these use similar power to the 3090 Ti but are way more powerful. Two weeks ago, before every channel actually got their hands on one, is not comparable to this week.

  • @icionreagis
    @icionreagis 2 years ago +3

    We had the same effect when they went from the 980 Ti to the 1080 Ti. The change was enormous, and at the time people were shocked; double the performance with the 1080 Ti was amazing.

  • @inspiration7
    @inspiration7 2 years ago

    Nice! Can you please tell me what music you played at the beginning of the video? I loved it and want to hear the full version.

    • @theTechNotice
      @theTechNotice  2 years ago

      It was from Artlist, not sure what the title was called again :)

  • @thedunkshot929
    @thedunkshot929 1 year ago

    Nice to see that AV1 was used to encode this video. Probably why 8K was possible on my internet for this video!

  • @MistyKathrine
    @MistyKathrine 2 years ago +1

    One of my favorite designs is probably the Colorful Vulcan, that one is nice looking and comes with a mini OLED screen that can show temps and other information.

  • @ivanr9408
    @ivanr9408 2 years ago

    DLSS is good for single-player games like Cyberpunk, but it's useless in FPS and multiplayer games. It increases response latency by quite a bit in FPS games, but the card is so damn strong and efficient you don't need to run it for most games.

  • @Bazooka407
    @Bazooka407 2 years ago +1

    13900k 4090 monster build