AMD Says You’re Doing it Wrong. - Best settings for AMD GPUs

  • Published: Jan 28, 2025

Comments • 3.4K

  • @koizumiizumi5426 2 years ago +11603

    kinda nice seeing a company push the knowledge to get better performance with their equipment instead of just telling us to upgrade to better hardware so they get more money

    • @chessprince1164 2 years ago +245

      It's simply smarter

    • @theunkownviolinist 2 years ago +1

      let's be real, those of us who can afford to upgrade to better tech will do so when the new gen drops because we're sluts for fresh new tech

    • @MinkSquared 2 years ago +340

      well, a majority of people won't buy their products if they say "buy this for gooder", because *no one* wants to mess with Vulkan.
      they're getting much more marketing value if they just seem like the cool older brother company, as opposed to Nvidia giving a big L to Linux

    • @m_sedziwoj 2 years ago +34

      yes, but not mentioning that most of the video is FSR 1.0 makes it bad as an educational video, because 2.0 will be different.

    • @gabrieldhondt6432 2 years ago +103

      That was exactly the same thing I was thinking when I saw the title. That's the problem with a lot of new tech, people just don't understand it. People buy by names. One time I had someone say his GPU needed to be an RTX, and when I asked why, he couldn't respond.

  • @semibreve 2 years ago +791

    this is actually so cool to see AMD give this advice to end users: optimising and tweaking has always been the most slept-on and most confusing part of PC building, so it's awesome to see Linus covering this

    • @madangel71 1 year ago +10

      Just activating SAM in the BIOS, I got 20% more FPS in the Division 2 benchmark. On my RX 580 8GB, it went from 90 FPS average to 108 FPS :D. On my Ryzen 9 5900X system, this GPU is a huge bottleneck and activating SAM really helped.

    • @umamifan 1 year ago +10

      It’s because “optimizing” is seen as a low spec gamer kind of thing, and is thus something a person who buys a mid range card doesn’t want to do. They expect out of the box settings, and don’t want any of the “low spec inconveniences” just because they think they’re too good for it, or because they “paid too much” to have to optimize!

    • @nlldiamond 1 year ago

      @@madangel71 how did you manage to get SAM on? I also have an RX 580 8GB. Setting Resize BAR to On doesn't seem to work for me, because after I save settings and restart, it says the VGA card is not supported.

    • @relax9086 1 year ago

      @@madangel71 SAM doesn't work on the RX 580, it's not compatible - I know cause I have a 580 card!

    • @madangel71 1 year ago

      @@nlldiamond Hi. I have an ASUS TUF GAMING B550-PLUS motherboard bought in February 2022, and when I activated SAM with the RX 580, it worked fine on the first attempt without any issue. I replaced the RX 580 with an RX 6700 XT in February 2023, and with the more recent GPU I got a new issue where one of the SSDs stopped being recognized. So it seems it depends on the mainboard/GPU combo.

  • @jakedill1304 2 years ago +1394

    I love how Gearbox has managed to turn every single advantage that you should be getting with cel shading into a disadvantage...

    • @coffee7180 2 years ago +228

      It's just insane. I always thought more devs should use cel shading for the great art style, fast development, great performance, great aging, and easy-to-build assets and content, but Gearbox showed me how incompetents can destroy everything.

    • @ishiddddd4783 2 years ago +171

      indeed lmao, close to zero graphical improvement, but the game is not only so terribly optimized that it's stupidly demanding, it also looks like an early 2010s game

    • @Just_RJ 2 years ago +34

      They should take some lessons from Arc System Works.

    • @Rov-Nihil 2 years ago +16

      @@ishiddddd4783 probably Randy Pitchford's middle finger to gamers

    • @otnima 2 years ago +43

      @@Rov-Nihil Oh he already did that when he accepted that huge check from Epic Games to make the Borderlands franchise an Epic Games store exclusive.

  • @TheMx5Channel 2 years ago +1911

    It would help a lot if developers started optimizing programs instead of relying on increasing raw system power.

    • @burkino7046 2 years ago +228

      especially looking at you, Windows; there's basically no reason why you still need code that's 20 years old

    • @clipsedrag13 2 years ago +86

      why optimize when you can just keep releasing new hardware

    • @CarbonPanther 2 years ago +276

      Imagine if all games were as optimized as Doom Eternal, we wouldn't even need ANY of the tech shown here... What a dream to have.

    • @SiniSael 2 years ago +26

      You can't do that - that's common sense...
      Like it's obvious, so OFC they won't do that - it's all about profit! ;)

    • @moistjohn 2 years ago +106

      The time honored tradition of throwing hardware at a software problem

  • @dnakatomiuk 2 years ago +225

    "If GPUs get any more power hungry" - Nvidia's 4000 series launches a few months later

    • @attmrcmailik9653 2 years ago +3

      Such a TDP isn't really high. Check out AMD's top R9 models (450W stock, in 2013)

    • @OsGere 2 years ago +10

      @@attmrcmailik9653 yeah, that was in 2013 so it's irrelevant now😂 it's 2023 now buddy

    • @AndyTomT 2 years ago

      Electric suppliers wringing their hands

    • @frostedflakes3168 11 months ago +5

      40 series uses a lot less power than the previous gen of GPUs

    • @iamhardwell2844 8 months ago

      The 4070 only uses one 8-pin PCIe connector LOL

  • @NoLongo 2 years ago +864

    Game developers doing everything they can to make the most unoptimized games imaginable because resolution upscaling and other features give them more headroom.

    • @ShiroCh_ID 2 years ago +62

      lol i thought the same thing
      we made a solution to a problem that shouldn't exist in the first place
      and seeing ages-old Nintendo 64 games or PS1 games or even some PS2 games, I can see that they sure worked around the hardware limitations nicely. Heck, even C64 and MS-DOS games had the same clever workarounds for their limitations; nowadays they just care about graphics and sometimes forget about space

    • @nogoat 2 years ago +121

      The best example of this is the GTA Trilogy.
      Like, how can a remaster of a 20-year-old game have higher minimum requirements than your own title from 2018 (RDR2)?

    • @DailyCorvid 2 years ago +43

      If you ignore shitty triple-A titles then it's the opposite! Indie devs are rocking it currently.

    • @EnigmaticGentleman 2 years ago +55

      Yep, I miss the 90s, when game developers actually tried to optimize their games so that as many people as possible could game. With computers having gotten as powerful as they have, there is simply no excuse for games to struggle on cards like the 1060 (at medium settings).

    • @CaptPatrick01 2 years ago +13

      @@ShiroCh_ID Squeezing water from stone, as someone put it. It is a lost art.

  • @ThatNerdChris 2 years ago +389

    Thanks for the FPS Papa Linus & AMD!

  • @dei_stroyer 2 years ago +459

    3:00 - Quantum tunneling has been an issue for 5nm for a long time; it's only recently that TSMC has been able to overcome it. 3nm is where it really starts to get difficult though - we've made almost no progress on a 3nm node, and 4nm is a while away. I feel 5nm will be here to stay for quite a while.

    • @Jane-qh2yd 2 years ago +8

      I'm pretty sure Nvidia's upcoming Lovelace will be built on TSMC's 4nm platform

    • @IschChef 2 years ago +52

      @@Jane-qh2yd TSMC "4nm" is not actually truly 4nm, just like "Intel 7" is not 7nm. The numbers are only marketing and completely decoupled from real transistor size.
      On top of that quantifying three dimensional transistor structures like we have since the start of the FinFET era with a single number is a bit questionable anyway.

    • @Jane-qh2yd 2 years ago +9

      @@IschChef The comment above is specifically talking about TSMC chips, so TSMC 4nm is relevant here

    • @yacquubfarah6038 2 years ago +11

      @@IschChef Nah they are 4nm lmao.

    • @Ignacio.Romero 2 years ago +8

      @@IschChef TSMC nm sizes are real, unlike Intel, but that doesn't mean it's inherently better than Intel's process just because the number is smaller.
      That being said, mobile chips are already using TSMC's 4nm process (Snapdragon 8 gen 1, upcoming Apple A16), so your point about 5nm sticking for a while isn't true, or at least only true for computer CPUs

  • @AdistiantoYuwono 2 years ago +538

    It would be nice if there were an "efficiency war" for desktop components too, as fierce as the one in mobile devices. New Intel CPUs and Nvidia GPUs are no joke in sucking down those electrons. It almost looks like instead of improving processing techniques or architecture, adding more power is the easier way out to gain marketable performance. In reality, however, electricity is very rarely free (in terms of money and environmental impact). Apple and other ARM vendors, as well as AMD in the laptop market, have already shown us that this is actually possible in the consumer market.

    • @83hjf 2 years ago +27

      the problem with Apple's M1 performance is that it's only good for specific things, because the efficiency comes from having dedicated hardware for common tasks - for example, compressing one specific video codec, or encrypting with one specific algorithm. GPUs are general-purpose devices, because what they do changes with every game. Apple (and ARM) are no gods with secret knowledge, they just optimize for 99% of use cases. An M1 machine, as powerful as it is right now, is probably not going to fare that well in 10 years when we have moved away from H.265, for example, and it has to do software decoding on the CPU.

    • @Hybris51129 2 years ago +8

      @SomeoneOnlyWeKnow Honestly I think this is more a sign that the years of "You don't need a 600, 700, 800, 1000W PSU" advice are coming back to bite gamers hard. Bad advice has now driven people to add another $150-300 to their upgrades because they didn't build their rigs to handle the march of technology.
      While power efficiency is nice, just looking at how desktops have gone from a 250W PSU in the late 90s/early 2000s to realistically only 1000W for exponentially more processing power, we have achieved a lot with a relatively little increase in power usage.

    • @custard131 2 years ago +12

      i think if you look beyond the headline power ratings there have been huge steps forward in efficiency too. it's just that with top-end chips the increased efficiency is always used to get more performance, because there isn't really any reason to reduce the power budget of high-end desktop components just for the sake of drawing less power, like there is in laptops where battery life is a big selling point.
      if you wanted to, you could look at the product list and see how much performance you could get for a given power budget, and i'm fairly confident you would still see improvements from generation to generation.
      kind of related to that: in past GPU generations it wasn't that uncommon to have multi-GPU SLI setups with up to 4 GPUs. With Ampere, only the 3090 even supports that, and then only 2-way, but it can still outperform even the most overkill of past-gen setups.
      another thing to keep in mind is that manufacturers have also been making improvements to turbo boost / variable clock rates, so while the headline figure for a shiny new GPU might say it can draw 400W, that doesn't mean it always draws that much; it means that if you give it a workload that pushes it to its max, that's how much it has to give.

    • @poipoi300 2 years ago +4

      @@custard131 Thank you for that comment, I was about to write something similar haha. People focus on the power right now and act like it's always more power for more performance, while you could probably take any one generation over the last at the same wattage and it would outperform the last gen. An example of that is the 1080 Ti vs 2080 Ti: the 2080 Ti is significantly faster while consuming the exact same amount of power out of the box. Using more power is entirely a matter of choice at that point, but I don't know many who would give up part of their performance to save not even a few cents an hour. Personally, if all it takes to get more performance is increased power consumption, I'll do it. After all, I'm not constrained by a battery like I would be on a laptop or phone - those are extremely different markets and sets of considerations. People who complain about wattage like that should seriously calculate the amounts they're complaining about. Of course it depends where you live, but electricity is cheap, and you're not using your gaming rig all year round either. Where I live, if you run something like a 2080 all year long (mining), it costs you about $40 CAD, if I recall correctly.

    • @joshjlmgproductions3313 2 years ago +8

      @@custard131 Many reviewers state average and peak power draw separately.
      Also, comparing the 2060 Super to the 3060, it looks bad.
      RTX 2060 Super: 180W
      3DMark score: 8739
      MSRP: $399
      RTX 3060: 210W+
      3DMark score: 8766
      MSRP: $329
      That's why reviewers almost never compare cards to the previous generation.

  • @sirspamalot4014 2 years ago +1070

    We're finally coming back to a point where developers have to optimize their games to run on hardware for longer. Remember, the Xbox 360 can run GTA V even though it has no right to be able to - that's the kind of optimization I want to see, where 10-plus-year-old hardware can still run a game REALLY well.

    •  2 years ago +50

      And to this day it is the only version of GTA V that I have played, I really liked it but not enough to give it another go when the other versions came out, also I think it was the last game I played in my 360 so it had a good send off.

    • @jordanwardle11 2 years ago +8

      @@SimonBauer7 yes... 4 years later

    • @chriswright8074 2 years ago +1

      @@SimonBauer7 not really

    • @bolland83 2 years ago +6

      @ I was in the same boat, spent the $99 on the special edition bundle when it came out on 360. I ended up getting it again on steam though, it was on sale for $10 since I never play on consoles anymore. Anything I want now just goes in my wishlist and I get a notification if it goes on sale, if it's a good deal I get it.

    • @cozza819 2 years ago

      I don't really want this because it hinders game development progression

  • @goolash1000 1 year ago +12

    FYI: most US outlet boxes are fed by two phases. That means in many cases a two-outlet wall fixture can be changed by a qualified electrician into a single-outlet 220V fixture for half the amp draw, and most PC power supplies are compatible with 220V out of the box.
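
    A quick worked version of that claim (the 900 W load is my own illustrative number, not from the comment):

        # Same wall load at 120 V vs 240 V: doubling the voltage halves the current.
        LOAD_W = 900
        for volts in (120, 240):
            print(f"{LOAD_W} W at {volts} V draws {LOAD_W / volts:.1f} A")
        # 900 W at 120 V draws 7.5 A
        # 900 W at 240 V draws 3.8 A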

  • @natec1 2 years ago +293

    I love how computers are getting so small that modern physics is an actual issue

    • @citizenkane2349 2 years ago +23

      I got to admit it is pretty cool.

    • @kz03jd 2 years ago +61

      But isn't physics just plain.....physics? I mean it's the same 10,000 years ago as it is today. Physics doesn't change, our understanding of it does.

    • @Boringpenguin 2 years ago +10

      Even cooler, harnessing those "issues" properly will lead to the next breakthrough in computing. I hope I'll see the day when general quantum computers enter the consumer market.

    • @BixbyConsequence 2 years ago +29

      @@kz03jd "modern" as in quantum physics being a modern field of study.

    • @natec1 2 years ago +26

      @@kz03jd That’s true but the whole principle of modern physics is that we don’t understand it fully yet. Physics at our scale behaves differently than it does at the extremes.

  • @seank4148 2 years ago +187

    Just FYI: if your boot drive uses an MBR partition table, enabling Resizable BAR disables CSM support, which means you can't boot off your MBR drive. MBR boot needs CSM enabled, but CSM can't be enabled at the same time as Resizable BAR. You'll have to first convert your MBR boot drive to GPT to enable Resizable BAR.
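
    A minimal sketch of that conversion using Microsoft's built-in mbr2gpt tool (Windows 10 1703+), wrapped in Python only for illustration; disk 0 is an assumption - run it from an elevated prompt and back up first:

        import subprocess

        # Read-only check: can disk 0 be converted from MBR to GPT?
        subprocess.run(["mbr2gpt", "/validate", "/disk:0", "/allowFullOS"], check=True)

        # The actual conversion; run only after validation succeeds.
        subprocess.run(["mbr2gpt", "/convert", "/disk:0", "/allowFullOS"], check=True)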

    • @mikebrownphotography2784 2 years ago +8

      This might have been my issue, as I had to reset my BIOS after attempting to enable SAM. My PC wouldn't boot.

    • @jonasedvardsson7612 2 years ago +3

      @@mikebrownphotography2784 converting the system disk to GPT isn't hard though🙂

    • @5avvproductions41 1 year ago +2

      Great, now I'm stuck in the BIOS fml 😂

    • @joshua7015 1 year ago

      What if my boot drive is GPT partition? Are there any precautions I should know beforehand?

    • @seank4148 1 year ago +1

      @@joshua7015 the only issue is with MBR, which is a much older way to set up a boot drive. GPT has no issues I'm aware of. 👍

  • @diederickvermeulen8927 2 years ago +7

    This really helped. Thank you so much. In Assassin's Creed Valhalla I was getting only 40-50 FPS average at 1440p. After enabling SAM in the BIOS and RSR, it has increased to over 70 FPS. Really cool.

  • @thestig007 2 years ago +1090

    I think we also desperately need game developers to optimize their game engines, and we need to come up with better-performing game engines. Some games just run like crap even with top-tier hardware!

    • @sphygo 2 years ago +113

      Like Minecraft, running one core at max and the rest barely at all...

    • @HMSNeptun 2 years ago +37

      @@sphygo fabric+sodium/iris will fix that

    • @sphygo 2 years ago +104

      @@HMSNeptun Yes but that doesn’t change the fact that it’s poorly optimized for modern processors. I wish the devs would focus on updating the base code to be more in line with the times. Movable tile entities are also something that could be fixed with a code rework, rather than relying on mods for all the performance optimizations.

    • @SirDragonClaw 2 years ago +68

      Most modern game engines are very optimised. The real issue is game developers not making proper use of engines and leaning too hard on scripting languages.

    • @CanIHasThisName 2 years ago +33

      The real problem here is people playing games on the absolute highest settings. Those aren't meant for top-tier hardware of today; those settings are there for several generations later, when the game runs great even on modern low end. And don't get me started on people expecting high FPS with RT enabled.
      Most games that hit the market are well optimised and run well, you just have to take your time to set the details right for your HW. Nowadays you often can't tell the visual difference between Ultra and High except for some absolutely minor details, and even Medium often doesn't bring any notable graphical downgrade. The problem is that people still think they're just gonna put everything on Ultra and call it a day. Or worse, they're just using the presets without going through the individual settings.

  • @thatsgottahurt 2 years ago +528

    Interested to see what Smart Access Storage or Direct Storage will bring to the table once we finally see it in games.

    • @DailyCorvid 2 years ago +45

      Probably a marginal gain for most people, I expect 2-3% at most.
      Software isn't really advanced enough to take advantage of it yet. Developers need to write code in a different way to make use of it.

    • @vith4553 2 years ago +1

      check out forspoken's result

    • @LiveType 2 years ago +26

      Probably just stutter free high speed open world exploration. So performance in that aspect will see a massive boost. Otherwise, negligible difference as that doesn't touch the render pipeline.

    • @Archmage1809 2 years ago +17

      DirectStorage helps asset streaming. But most likely it's going to reduce game size (compression implementation, using GPU horsepower to decompress), improve load times, and maybe improve the 0.1% and 1% lows.

    • @ThejusRao 2 years ago +7

      @@DailyCorvid It doesn't offer any framerate increase, but it's supposed to improve loading times by at least 30% if implemented right.

  • @reyalPRON 2 years ago +32

    This is also a great way to save power. Heat in your system will surely be less of a problem too, and you might get away with less fan noise :)

  • @Steve30x 2 years ago +142

    2:01 here in Ireland my electric bill due in two weeks is €133. The same bill last year was €80. That's a €53 increase despite the fact I've used less electricity this time around.

    • @MalcolmCrabbe 2 years ago +10

      You got off lightly. My £100 direct debit for electricity has jumped to £265 pm from July 1st!!!

    • @emanuelmayer 2 years ago +2

      My next meter reading is in June 2022, and a week later the bill arrives. I plan not to buy anything until I have paid the increase -.-

    • @welkfor1753 2 years ago +3

      Jesus, why is it so expensive over there? I've never seen a bill higher than 30 bucks for a month

    • @emanuelmayer 2 years ago +8

      @@welkfor1753 a large portion of electricity costs are taxes and stuff like "we have to pay nuclear plant owners because we forced plants to shut down, but we have contracts to let them run until... let's say 2050". Also ecological taxes (which should have been / have been invested in renewable energies).
      I am usually at 1500-1700 kilowatt-hours a year.

    • @roqeyt3566 2 years ago +9

      I switched to solar, a Steam Deck, and eating more cold foods. Showering at the gym too. Even so, my bill went up...

  • @digiscream 2 years ago +88

    Yeah...here in the UK, I had a noticeable reduction in my power bill by downgrading my daily driver to a Quadro K2200 (60W) a couple of months ago. Madness.

    • @tuckerhiggins4336 2 years ago +9

      That's why I use a laptop, gives me the most efficient hardware

    • @Rov-Nihil 2 years ago +3

      I'd rather pay 80 bucks over MSRP to get a 6600 that draws 70 watts less for 30-70% more performance than stick with my RX 580, even though it's seen as an entry-tier card. Plus selling the old card will net me 150, so win-win!!!

    • @digiscream 2 years ago +4

      @@tuckerhiggins4336 - sadly, I can't do that. Fan whine is a sensory nightmare for me, and quite a lot of what I do is fairly CPU-intensive stuff.

    • @tuckerhiggins4336 2 years ago

      @@digiscream ah, the headphones must not do it for you then

  • @JesusOfTheJungle 1 year ago +4

    I run a 2080 Ti and moved to the Blue Mountains (Australia) in the last year. Through winter the house temp wouldn't get much above 8 degrees Celsius, but man, my gaming/entertainment room was - I mean, I wouldn't say warm, but by comparison to the rest of the house it was comfortable!

  • @Melchirobin 2 years ago +35

    This is a great way to do a sponsored video. I just expected weekend videos to be the huge ones, the ones you usually have to watch, so getting used to this is a change

    • @vincent67239 2 years ago

      I didn’t skip the sponsor this time because I didn’t notice until the opening song started playing lmao

  • @GABN0MAD 2 years ago +96

    They actually gave us help so that we can enjoy our stuff; after this I have a lot more respect for them

    • @cozza819 2 years ago +3

      Absolutely 👍

    • @zodwraith5745 2 years ago +2

      Well, technically they simply copied Nvidia, which did it years earlier. Nvidia even has an RSR-type feature that they never talk about that doesn't require RTX cores. This is literally just an AMD infomercial. I'd rather have them spend the money actually getting it implemented into real freaking games instead of just crying "me too!"

    • @GABN0MAD 2 years ago +2

      @@zodwraith5745 I don't disagree, but we do have to give them credit as they didn't need to do it. It's still cool they did it; if anything, it helps create a new standard

  • @Tobser 2 years ago +21

    Thanks for the tip. It really improved the VR performance of my 6600 XT, from unplayable to playable.

  • @JordonAM 2 years ago +189

    FSR's softness at 1080p can always be counteracted using Radeon Image Sharpening, and RSR just added a sharpening slider, so you can also use that for games that don't support FSR. RIS isn't magic, but it'll definitely help some.
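
    A toy illustration of the general idea of spatial upscaling plus sharpening (NOT AMD's actual EASU/RCAS kernels; filenames are placeholders):

        from PIL import Image, ImageFilter

        frame = Image.open("frame_1080p.png")                 # low-res source frame
        upscaled = frame.resize((3840, 2160), Image.LANCZOS)  # spatial upscale to 4K
        sharpened = upscaled.filter(                          # recover perceived detail
            ImageFilter.UnsharpMask(radius=2, percent=80, threshold=2)
        )
        sharpened.save("frame_4k_sharpened.png")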

    • @johndoh5182 2 years ago +5

      AND AMD will improve RSR over time.
      Frankly, I think AMD is doing a great job right now - better than other companies - at getting consumers through a period where money is tight. I don't like it when some reviewers spend an entire video bashing them over a specific product.

    • @AntonioNoack 2 years ago

      @@johndoh5182 RSR will never match FSR 2.0 though. It can't, because they don't have the motion vector information from every engine.
      They might be able to do some driver magic, but I doubt it.

    • @zodwraith5745 2 years ago +1

      DSR has literally been available on Nvidia since 2014 as well as a sharpener. Why do you think Nvidia hasn't said anything about RSR?

    • @JordonAM 2 years ago

      @@zodwraith5745 No one mentioned NVIDIA at all in this thread.

    • @zodwraith5745 2 years ago

      @@JordonAM My point was that Nvidia did the exact same tech long ago. They don't promote it _because_ it sucks.
      You can't create data that's simply not there without AI having information to fill in the blanks. It's like asking a robot to drive you to the store without explaining what a car is or how to drive. It often ends up looking even worse than just dropping the resolution.
      The only way you could have high-quality upscaling on the driver side is building a frame buffer to analyze multiple frames. That works great for video, where adding 100ms of lag goes unnoticed as long as the sound is synced, but it's completely unusable for gaming, where adding just 20ms is highly noticeable.
      It's a boondoggle for marketing. Nothing more. Nvidia got their marketing in back in 2014; AMD's using it for marketing now.

  • @faizmumtazramadhan 2 years ago +38

    1:08 Now we know why Alex has been obsessed with so many crazy cooling solutions lately 🤣🤣

  • @SergiusXVII 4 months ago

    This is exactly how sponsored content should be! Not overly exaggerating any features, but presenting them in a manner that accurately reflects what the technology is capable of. ReBAR (SAM) is absolutely crucial and should be enabled on any system that supports it. Very cool, reinvigorated my oddly specific love for _computer components_ …

  • @ravenclawgamer6367 2 years ago +41

    Warning to Linus: a huge number of $5,000 cheques from Intel coming at high speed.

    • @nadir4562 2 years ago +3

      lmao

    • @NaoyaYami 2 years ago +2

      He'll just rip them again.

  • @BReal-10EC 2 years ago +64

    I think these temporal upscaling techniques will become more and more important in the future as part of normal video compression. A 4K movie is 100 GB now.

    • @Sabbra 2 years ago +1

      It is 25 GB, c'mon

    • @Lubinetsm 2 years ago +14

      @@Sabbra if it's 25 GB, you got scammed and the compression is a bit too lossy.

    • @sermerlin1 2 years ago +8

      A 4K movie is about 50-60 GB; rarely you'll see 70-80 GB at this time, and only extremely rarely (like Lord of the Rings) will it exceed 100 GB.

    • @formdoggie5 2 years ago

      @@sermerlin1 that's not at 60 to 120 frames though.
      Getting 24-30 frames kinda defeats the whole point of all the tech you buy.

    • @Dave5gun 2 years ago +9

      @@formdoggie5 for movies, 24 frames are enough - actually even better than 60 frames. HDTVTest once explained that 24 frames look more dreamlike, not like a cheap smartphone camera. However, there has to be a certain amount of motion blur, which is not really the case with OLED anymore, so 24 frames can look pretty choppy sometimes. I still think it's better than 60 frames.

  • @fraxesz1598 2 years ago +15

    This video is indeed very helpful for most people who aren't into PC hardware, helping them get the most out of AMD hardware :D Keep it up, guys!

    • @iikatinggangsengii2471 1 year ago

      they just need to spend more time tinkering with and testing graphics settings; some settings are just not meant to be enabled/maxed on Radeons. OC etc. isn't exactly necessary (more FPS is always welcome, but try to find the correct settings first before OCing)

  • @CaptKornDog 2 years ago +6

    Wish there were more tips like this in general out there.

    • @elbowsout6301 2 years ago

      check out the lowspecgamer channels for how to get more from low to mid tier gear.

  • @josuad6890 2 years ago +21

    GPU power draw has been insane lately, so much so that I decided to just buy the 3070 even though I could easily afford the 3080. Going 3080 would also require me to ditch my PSU, and I just don't feel like doing any more cable work ever again.

  • @ChandlerUSMC 2 years ago +72

    Thank you, LTT and AMD. Someone finally said what I've been thinking since the 3xxx/69xx series launches. My wife and I play on two different computers in the same room - frequently together - it's a pastime we share. In the same room means we're on the same 15-amp circuit (U.S.). With the way things are going, how are we supposed to have 2 gaming machines on the same circuit? Are we both going to have a 4xxx Ti or Radeon 7xxx with modern CPUs? How, exactly? Are we going to be stringing wires from the study to the game room? What about smaller homes that don't have another room to extend from? Eventually, I'm going to need to install custom electrical circuits (20+ amp) and treat a game room like a kitchen? That just buys some time, but how much time before we blow past that?
    I was also wondering what apartment dwellers do. Is the gaming market going to "power their way" past the people living in apartments because they can't power their graphics cards and CPUs? Then there's the cost in Europe.
    In other words, won't the idea of "more power!" winning over "more efficient" eventually shrink the available market for these devices?

    • @TheSliderW 2 years ago +1

      That's my concern as well, and my wife and I do the same. I don't know about America or Canada, but in France (OK, we're on 230V, but still), we were always able to have LAN parties with 5+ players running from extensions off the same room's breaker, including file servers and other additional devices. My take is that your PCs are not going to be pushed to their maximum consumption during gameplay - only if you stress test them with specialised benchmarks. So you and your wife should be fine anyway.

    • @Tn5421Me 2 years ago +1

      Yes.

    • @Rushtallica 2 years ago

      A good quality extension cable from another room.

    • @juliendesrosiers3177 2 years ago

      I'm happy to live in Québec, where energy costs nothing.

  • @omgawesomeomg 2 years ago +90

    Props to AMD for allowing Linus to talk about the shortcomings of FSR 1.0

    • @ShadowGirl- 2 years ago +7

      I believe AMD sponsored this to force them to not talk about Nvidia's use of ResizeBar, which does exactly the same thing and they gave it to us over a year ago. Also it was figured out that not all titles gain advantages from accessing more of a GPU's memory, so not all titles even make use of the technology. Normally these things would be mentioned by Linus, but this is the "cost" of sponsorship.

    • @chriswright8074 2 years ago +25

      @@ShadowGirl- but AMD actually put in the work, unlike Nvidia, whose version doesn't work or requires more effort than it should

    • @lauchkillah 2 years ago +13

      @@ShadowGirl- Nvidia did not support Resizable BAR at launch; AMD was the first to ship this tech. Early 3000 series GPUs require a vBIOS update to even support it

    • @ValentineC137 2 years ago +18

      @@ShadowGirl- Nvidia's Resizable BAR literally just doesn't give the same performance improvement as AMD's SAM.
      Comparing Hardware Unboxed videos on the two systems, the average gains are:
      Nvidia: no change at 1080p, 1% at 1440p and 4K
      AMD: 3% at 1080p, 1440p and 4K
      The peak gains are:
      Nvidia: 11%, 9% and 6% at 1080p, 1440p and 4K
      AMD: 20%, 19% and 17% at 1080p, 1440p and 4K

    • @santmc12 2 years ago

      @@ShadowGirl- NIS? It's trash

  • @bluehatbrit 2 years ago +18

    My AMD graphics card just hit 10 years old and it's still my main driver, with no tweaks to the settings, running alongside a Ryzen 7. Perhaps it'll make it to 15 years by the time prices have finally come down!

    • @ShadowDragoon004 2 years ago +3

      Not sure if you've been out of the loop, but GPU prices are getting much more reasonable. I've seen some 3060s damn near MSRP very recently. Definitely much more affordable than at the beginning of the year. And with the crypto crash going on, there's a very decent chance stock of the 40 series will be reasonable enough to make a purchase then as well.

    • @10unnecessary01 2 years ago +2

      What card do you have, and what games do you play that you're still able to use it? I upgraded from my 290X in 2019 and even then the difference was night and day

    • @vanilla4064 2 years ago

      unless there is too much supply and not enough demand, those prices aren't coming down.

    • @alishabab3 2 years ago

      Covid killed the pricing of cards... that and the conflict between China and Taiwan

    • @kidthebilly7766 1 year ago +1

      if your card is 10 years old, then getting a 1080 Ti for $200 is probably worth it lol

  • @JUMPJUMP2TIMES 2 years ago +1

    The resize bar helped tremendously. Thank you so much!

  • @jackbenimblejack1 2 years ago +35

    the best way to get more performance is by purchasing a $69.99 screwdriver from LTT

    • @Ghost-hs6qq 2 years ago +3

      Linus? is that you?

    • @SaucerX 2 years ago +1

      It comes with RGB, or is that a separate purchase?

    • @DailyCorvid 2 years ago +4

      @@Ghost-hs6qq Is that you? Find out after this short sponsor message...

    • @frozenturbo8623 2 years ago +2

      @@DailyCorvid Today's sponsor is Linus Tech Tips. Buy our $69.99 screwdriver for better performance from the RGB and 78.28% fewer kernel crashes.

    • @jackbenimblejack1 2 years ago +1

      @@Ghost-hs6qq yes, it is me, Linus... buy the LTT screwdriver and get your friends to buy it too... I have to pay for a really really really expensive house

  • @joshualopez9045 2 years ago +16

    For those who can't find DOCP: it's going to be EOCP or XMP depending on the motherboard you have. These are overclocking profiles for your RAM.

    • @antebellum1776 2 years ago

      I still can't find it... does it depend on your motherboard/ram/cpu?

    • @joshualopez9045 2 years ago

      What motherboard do you have?

    • @antebellum1776 2 years ago

      @@joshualopez9045 GA-AB350M-DS3H V2 (rev. 1.1)

    • @joshualopez9045 2 years ago +1

      In the BIOS, go to the M.I.T. tab. Move down to Advanced Memory Settings, then to Extreme Memory Profile (X.M.P.). Enable it by selecting Profile 1. Look at pages 23 and 24 of the user manual that came with your mobo.

    • @antebellum1776 2 years ago

      @@joshualopez9045 I'll double check this tomorrow when I get home but I'm pretty sure I don't have that setting anywhere.

  • @Eugomphodus 2 years ago +8

    There's also a toggle (once it's enabled in the BIOS) to switch SAM on/off in the Radeon software. Some games, I've found, REALLY don't like this feature and perform really badly with it on, but I'm sure it'll become more useful as time goes on.
    An extreme example I've found so far is "Element TD 2". I've set a cap of 144 fps, and normally it's around 90-144 fps @1440p max settings during intense moments with a 5700 XT. But with SAM toggled on, the fps drops as low as 20 fps and stutters a lot, including the audio.

  • @LegendaryGodKing 2 years ago +13

    I've been with AMD since I was 12. I'm 26 now, and I have to say, AMD, your price-to-performance is what I've always needed: FM2+ Athlon X2, AM3+ FX-8350 Black, AM4 Ryzen 3700X, and now I'm eyeing your new chips. AMD is the only thing pushing Intel to compete. AMD started from a clone, but they have worked hard to compete in the market. I'm a huge fan; they will ALWAYS be my first option for CPU choice, and forever have a spot in my heart from all the memories of gaming and overclocking. The GPUs are great too, just not for mining!

  • @johnrehak 2 years ago +7

    AMD's free performance gains are so simple, even without standard overclocking. The highest jump in my case came from simply enabling Smart Access Memory (SAM). Another free performance gain, which turned my Gigabyte Aorus RX 6800 XT into a reference RX 6900 XT and beyond, came from undervolting the GPU from 1150mV to 1100mV, increasing the power limit by +15%, and switching the memory timing to fast timings. I haven't touched the GPU clock or memory clock. The card boosts up to 2474MHz from factory settings and, as long as temps allow, it stays at that speed. FPS gains: in CP 2077 at 1440p ultra, from 84fps avg / 54fps min to 92fps avg / 60fps min. In Horizon Zero Dawn, from 135fps avg / 60fps min to 159fps avg / 78fps min (GPU FPS 168, 95% - 152fps, 99% - 132fps). In The Division 2, from 128fps avg to 137fps avg.

    • @salxncy5998 10 months ago

      Can you help me overclock mine? I have a Ryzen 5 5600G with an RX 6600 Eagle and 32 GB of RAM

  • @Spartan536 5 months ago +2

    2 years into the future, and I am using FSR 3.1 in Cyberpunk 2077 (modded in) with an RX 6900 XT and a 5800X3D, MAX GRAPHICS SETTINGS, 4K texture mods - in fact over 1200 mods via the "City of Dreams" modpack that makes everything look hyper-realistic. I play at 1440p.
    Normally something like that would melt any PC and cause framerates in the sub-30s; however, without FSR, just using SAM, I get 58-60 fps when locked to 60 fps. Without the framerate lock it goes as high as 68 fps and as low as 45 fps; the frame locking helps maintain a nice framerate. Pretty impressive stuff...
    Turn on any kind of raytracing and it's down into the 20s.
    Turn on FSR 3.1 with frame generation... oh boy, I hit 163-171 frames per second, yeah, at 1440p in a heavily modded Cyberpunk 2077. In fact, with those settings I can turn ALL of the raytracing stuff up to PSYCHO levels (no path tracing) and still get 90-93 FPS! Raytracing at PSYCHO levels in Cyberpunk on a 6900 XT AMD GPU!
    FSR 3.1 and frame generation are just NUTS.
    What I am doing right now, however, is turning raytracing off entirely, and I have my system locked to 120 fps (my monitor supports 120Hz at 10-bit color or 144Hz at 8-bit color). What I have done is set FSR into overdrive, where it's upscaling to 4K while frame generating, and I get a solid locked 120 fps with a 0.1% low of 118 fps!
    Impressive stuff, and it looks like it's only going to get better from here!

  • @PixelShade 2 years ago +13

    I actually thought you guys would do more of a deep dive into all the tech AMD has available in their software. Sure, you talked about SAM and FSR/RSR, but there are other great technologies which are almost never mentioned by tech media and that people often don't have a clue about. I am thinking of:
    - Radeon Chill (something for Alex's hot room)
    - Radeon Boost
    - Radeon Image Sharpening
    Radeon Chill & Radeon Boost are both movement-based framerate compensations. Chill basically lowers the framerate when you're standing still (to your lowest set value), instantly raises it when you move forward (to an in-between value of your min. & max.), and when you are turning, when you need maximum framerate, you get just that. This greatly reduces power consumption, and with a FreeSync monitor you barely notice a difference from running full tilt (unless it's a highly competitive title like CS:GO where split-millisecond reactions are king).
    Radeon Boost, on the other hand, reads mouse movement and, depending on how fast you are turning, downscales the resolution accordingly. Which is awesome in theory, as you don't have the same perception of detail in motion. This, however, is not available in all games. It does work in Cyberpunk and Metro Exodus, to name a few. Unfortunately Radeon Boost is a bit harsh with its nearest-neighbor scaling (I wish it were tightly integrated with FSR), and you can't use it together with Radeon Chill, which seems like a missed opportunity.
    As for Radeon Image Sharpening (although it's part of FSR/RSR), it can be activated individually for a next-to-nonexistent performance impact. It almost acts as a resolution bump, especially in soft TAA games (textures at 1080p with Radeon Sharpening can almost appear more detailed than 1440p without sharpening), yet it doesn't produce ringing artifacts. This is great when you want to increase perceived sharpness without affecting performance much.
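
    A conceptual toy of the Chill behaviour described above (my own simplification, not AMD's implementation): pick a framerate cap from recent input activity.

        # The FPS bounds are example values a user might set in Radeon Software.
        FPS_MIN, FPS_MID, FPS_MAX = 48, 65, 96

        def chill_target(moving: bool, turning: bool) -> int:
            if turning:        # fast camera motion: maximum framerate
                return FPS_MAX
            if moving:         # walking forward: in-between value
                return FPS_MID
            return FPS_MIN     # standing still: minimum cap, saves power

        print(chill_target(moving=False, turning=False))  # 48
        print(chill_target(moving=True, turning=True))    # 96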

    • @lake5044 2 years ago +1

      Quite interesting! Especially for people who don't own an AMD gpu and don't know such things even exist. Thanks!

    • @Dark88Dragon 2 years ago +2

      Yeah you are right...from my observations people with AMD-ware are a lil bit more into the tech than the average Intel-, Nvidia- or Apple-User

    • @PixelShade 2 years ago +2

      @@lake5044 It's actually kind of crazy that AMD isn't promoting these unique features more, and also investing more time in developing them. Like I mentioned, Radeon Chill alongside Radeon Boost are fantastic features, but unfortunately Radeon Boost still feels a bit half-baked. Right now it can't be activated together with Radeon Chill, and it uses "nearest neighbor" scaling, which is a bit jarring. If it were connected to the in-game FSR pipeline (away from HUD and post-processing) it would be such a "killer feature": you would get the FPS where you absolutely need them (when things get chaotic and you are turning a lot), and you would save on power/heat + gain fidelity when watching cutscenes, standing still watching scenery, or just moving forward (where you don't have a lot of motion). :)
      A fully baked, smart solution where you could combine Radeon Chill, Radeon Boost and FSR 2.0 would honestly be a REAL killer feature.

    • @CameraObscure 2 years ago +3

      I use Chill in most games I play and have done since it came out. It saves power and keeps my GPU cooler without affecting gameplay in the games I play; even better, it's not tied to any specific game/engine. Also, you can set features on or off for each game in your library separately from the GPU's global settings, meaning you have no need to reset them each time you play that game, as it's stored in the driver already.

    • @PixelShade 2 years ago +2

      @@CameraObscure Me too, awesome to hear that more people are using it! It saves me around 60W on a Ryzen 2700 and a 6600 XT (which isn't a very power-hungry combo in the first place). Taking Witcher 3 as an example, I have opted for a spectrum of 48fps to 96fps. When standing still (if I watch a cutscene, get a phone call, fetch a drink or snacks, make some lunch) the computer only draws 63W (48fps). When walking forward (which you do for most of the game) the computer draws 91W (65fps), and when I turn the camera it draws 144W (96fps). I honestly don't "feel" any difference on a FreeSync display using this framerate spectrum compared to a locked 96fps, and the power saving is around 60W. I mean, I could go full tilt at 120fps+, in which case the computer would draw 177W... So really the savings are about 90-100W, which is actually substantial for room heat, and in the case of the European electricity market it's a substantial cost saving too.

  • @Fredjikrang 2 years ago +61

    A quick FYI, most breakers are only actually rated to run 80% of their rated capacity continuously, so a 15A circuit is actually only rated to power 1,440W for longer periods of time.
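
    The arithmetic behind that figure, worked out (same numbers as the comment above):

        # 80% continuous-load derating on a North American 15 A branch circuit.
        volts, breaker_amps, derate = 120, 15, 0.80
        print(volts * breaker_amps * derate)  # 1440.0 W continuous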

    • @Phynellius 2 years ago

      That, and if it's your own house, a 15 or 20 amp 240 volt circuit is easy to set up; you just should get it done by someone qualified, or have it inspected. Unfortunately many consumer UPS systems are 120V only

    • @m0r73n 2 years ago +12

      Laughs in 230V

    • @Jaker788 2 years ago +5

      That's why every device you buy that's intended for more continuous use is max 13 amps. Corded lawn mowers and space heaters are 13A max, but blenders are 15A or 1875W max.

    • @Fadeese 2 years ago +3

      But a 20 amp circuit is fine @ 16 amps or approx 1920W -> making 1800 W range just fine all day long.

    • @serdarcam99 2 years ago +4

      Imagine you can't turn on your computer cuz of 110 freedom volts

  • @nintendowiids12 2 years ago +2

    Played God of War with FSR 2.0 on a 6700XT. HUUGE visual / performance uplift over 1.0. I want FSR 2.0 in all the games!!

  • @IncognitoX8 2 years ago +26

    1:47 It's actually 230V and not 240V; that is the standard in Europe. It can usually fluctuate between 225V-235V. So from a standard 10A fuse you can expect an output of 2300W. If you have a newer house (or an electrically renovated one) with confirmed cabling approved for this, you can, in my country, install 13A fuses instead, which takes the wattage up to 2990W for a single-phase group.

    • @GreenCinco12Official 2 years ago +7

      It's actually both, and more.
      There is everything between 220V and 240V...
      There is also everything between 110V and 127V...

    • @GrimK77 2 years ago +3

      The standard for a TN-C single-phase circuit in an EU household is rather 16A breakers (fuses are going out of fashion), so 3.6kW. And RCCBs, too, for additional protection. Single-phase household electric stoves are usually 2.9-3kW.

    • @iaadsi 2 years ago +3

      @@GreenCinco12Official Up to 253 V in Czech Republic (230 V ±10%). If you're close to the branch transformer, you'll be seeing over 250 V even when the street behind you is pulling heavily.
      Plus we have 400V three-phase nearly everywhere, even in old commie block apartments, for really heavy loads like induction cooking or car charging.

    • @jort93z 2 years ago +3

      Here in Germany we usually use 16A fuses for outlets, and 6A-10A fuses for lights (normal light switches are rated 10A, so using any higher value would be risky).

    • @p_mouse8676 2 years ago +3

      Since P = I² × R, it also goes to show how much bigger the power losses are in 115/120V countries, or how much bigger the wires need to be.
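
      A small sketch of that comparison (the wiring resistance is a made-up illustrative number):

          # Same delivered power at two supply voltages; wiring loss is I^2 * R.
          P, R = 1500, 0.1  # watts delivered, ohms of house wiring (example values)
          for volts in (120, 230):
              amps = P / volts
              print(f"{volts} V: {amps:.1f} A, ~{amps**2 * R:.1f} W lost in the wires")
          # 120 V: 12.5 A, ~15.6 W lost; 230 V: 6.5 A, ~4.3 W lost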

  • @Bobbias 2 years ago +20

    People here need to understand a few things about optimization:
    1: you typically want to avoid premature optimization, because you may waste considerable time on code that has little to no impact on performance.
    2: optimized code is quite often much more difficult to write and reason about, so whenever possible it's better to write slow but clear code if anyone is going to have to read that code in the future (unless it absolutely must be optimized).
    3: optimizing everything will make releasing products take significantly longer than it already does.
    4: any bugs in optimized code will be much harder to track down and fix, due to the nature of optimized code.
    5: games simply cannot be optimized for PCs the same way they can be for consoles, due to the fixed hardware in consoles compared to the very large variation in hardware across PCs.
    That's not to say that things cannot be better optimized than they are now; there are cases where little effort could yield reasonable gains. But it does mean that if we want our code to be better optimized in the future, we need to be willing to accept the increased production time and costs associated with optimizing things more heavily.
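
    Point 1 in practice, as a minimal sketch (hot_loop is a stand-in for real game code): profile first, then optimize only what the data says is slow.

        import cProfile
        import pstats

        def hot_loop():
            # Placeholder workload standing in for a real hotspot.
            return sum(i * i for i in range(1_000_000))

        cProfile.run("hot_loop()", "profile.out")
        pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)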

    • @Sightbain. 2 years ago +4

      I don't disagree with anything you said, but part of what people mean when they say "it should be optimized!" is the lack of compression for files; the insane size some of these games require is just bonkers. Remember when Titanfall launched without compressed audio and was like 60GB, and everyone lost their minds, and they went back and compressed it and did some other "optimization", like downloading only local languages etc.? That is the sort of thing people can see and understand, as well as some obvious things like having better occlusion, or terrain that isn't actually giant blocks mashed together that stick through the ground and eat up resources. Most gamers aren't talking about very specific code implementation but rather less subtle general laziness, or the biggest one of all: utilizing more than 1-2 cores, which can be either a dev issue or the engine they are using. So I agree it is not so black and white, but some things are.

    • @chuzzle44 2 years ago

      Very good points. Though I do feel the need to point out that I would never expect developers to optimize games on PC as well as they do on consoles. For me, it's not so much about optimizing as it is about experience. I don't personally care if a game runs a few fps slower than it's console counterpart. However, if enabling vsync causes the game to stutter badly, or if performance drastically drops upon loading a new area, or if the game is just completely broken on release, I'm going to complain. With a plethora of mature engines and techniques available, there is no excuse for not only releasing games in an unfinished state, but leaving them that way for years.

    • @peterw1534 2 years ago +3

      So basically it's hard. See Doom Eternal - they did it somehow. (Twice really, 2016 was also polished af)

    • @frozenturbo8623 2 years ago

      @@peterw1534 Doom has way smaller maps, and graphics don't really matter as much there.

    • @ashlyy1341 2 years ago +1

      i for one would be happier w/ fewer, but higher quality (and better optimised), releases. the same w/ tv shows and films tbh. AAA games, Hollywood films etc. are treated as disposable. the reason why is obvious: profit (make the flashiest, most hyped thing to get the most sales, then release the next one quickly to counteract dropoff)
      anywho, your points are thoughtful and i agree with them overall. people forget that one can't optimise for every circumstance (such as hardware/platform) all at once - that's just a general-purpose application. however, making use of standards that exist (especially as consoles are now x86) is always good. other things include: better data compression (pick an algorithm that balances speed and ratio), intelligent asset streaming (including handling what to keep in VRAM), smarter level design (e.g. stop making everything an entity when it can be static and decorative), consideration of lighting and rendering - how best to employ different types of culling, etc.

  • @SkarTisu 4 months ago

    Thank you! I just made these changes to my system, and my FPS doubled. I can finally turn on some details in iRacing!

  • @hunn20004 2 years ago +7

    The Fury X is perhaps my favourite GPU that I wish I owned.
    Radeon's blower-style coolers have always appealed to me; I wish they'd actually paired them with low-wattage components.

    • @iikatinggangsengii2471 1 year ago +1

      yeah, the Fury and Vega are interesting, mainly because they were probably only available in limited numbers

  • @randykran6699 2 years ago +6

    There needs to be a warning that unless your Windows is installed on a GPT drive, enabling Resize BAR support makes your drive unbootable. Just spent an hour troubleshooting to disable it and switch back to CSM in the BIOS.

  • @izraelurbane 2 years ago

    i like the return to talking about the ecosystem and electricity - the cost of the utility in amps and watts, the money, and the wider repercussions for the climate

  • @SeabooUsMultimedia 2 years ago +19

    Some AMD 300 series motherboards actually got Smart Access Memory support via a BIOS update. My X370 board got it with the update that introduced compatibility for 5000 series CPUs.

    • @ertai222 2 years ago +1

      Yeah, the MSI X370 Gaming Pro Carbon got it

    • @TechKnightBrit 2 years ago

      @@ertai222 mine didn't let me update; I opened M-Flash, then it said 'Entering Flash Mode' and then there was a blank screen. Did yours update fine?

    • @ertai222 2 years ago

      @@TechKnightBrit yeah, I've been updated since the BIOS came out.

    • @TechKnightBrit 2 years ago

      @@ertai222 Just curious, but does your system have an M.2 SSD? I've read that it could be causing my issue.

    • @ertai222 2 years ago

      @@TechKnightBrit yes, this board has 2. But if you use the second slot you get half speeds or something.

  • @waynemiller3187 2 years ago +9

    This video really needs a warning about MBR partitioning and disabling CSM compatibility mode, because if you're not careful you can easily render your boot disk unbootable, and the only way to recover is to reset your CMOS

  • @thecandyman9308 7 months ago +1

    This is a hugely informative video for average folks on getting more out of their existing setups. Props to AMD and Linus for making this!

  • @argon6520
    @argon6520 2 года назад +4

    I think it should also be added that tricks like these can work wonders for VR, especially with high-end headsets whose resolution puts 4K monitors to shame.

  • @depravedrogue
    @depravedrogue 2 года назад +6

    Thanks for the SAM tip. I was already using Sapphire's version of -RSR- Radeon Boost, Trixxx Boost (works better and allows more customization in my opinion, but limited to Sapphire cards I assume), but wasn't aware my 3600x and 5700xt could use SAM by simply tweaking the BIOS settings.

    • @sbrader97
      @sbrader97 2 года назад +2

      Trixx Boost isn't the same as RSR afaik; it's more like their version of Radeon Boost, and it only uses standard GPU upscaling from the lowered render resolution

    • @sbrader97
      @sbrader97 2 года назад +1

      @John-Paul Hunt idk if they could do FSR 2.0 at a driver level; it needs deeper integration into a game to get access to the colour and depth buffers, and it has to be done earlier in the render pipeline. RSR 1.0 works more like a post-processing filter, stretching and sharpening the final image

    • @depravedrogue
      @depravedrogue 2 года назад +1

      @@sbrader97 Yeah, I messed that up. It's Radeon Boost that I dislike - it makes the image look terrible during movement in the FPS games I play, whereas Trixxx Boost just decreases the resolution and upscales it the entire time, so it's at least consistent.

  • @ilovelamplongtime
    @ilovelamplongtime 2 года назад +2

    This worked well for me. 20-25ish frame increase in MWII and getting slightly cooler GPU temps

  • @sudl5346
    @sudl5346 2 года назад +10

    For those having problems with "No bootable device found" after activating ReBAR:
    Make sure that Secure Boot is disabled (restart after disabling it), then enable CSM, and it should be fine again.

    • @Poweroffunky
      @Poweroffunky 2 года назад

      I just straight up activated it, got a black screen, and then "no input detected" on my monitor. Had to clear the BIOS to get the picture back. Gonna give this method a try next. Thanks!

    • @sudl5346
      @sudl5346 2 года назад +1

      @@Poweroffunky Tbh, I'm not sure if it will fix the problem you encountered.
      The problem I mentioned occurs when the system drive is in the wrong "format": ReBAR and/or 4G Decoding (not sure if both or just one of them) needs the drive to be GPT instead of MBR to be detected as bootable.
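
      For anyone hitting this thread later, here's a rough checklist; menu names vary by motherboard vendor, so treat this as a sketch rather than gospel:

          1. Confirm Windows is installed in UEFI mode on a GPT disk (run msinfo32 and check that "BIOS Mode" says UEFI).
          2. If it says Legacy, convert the disk to GPT first (see the mbr2gpt commands further down the comments).
          3. In the BIOS, disable CSM, then enable Above 4G Decoding and Re-Size BAR / Smart Access Memory.
          4. Back in Windows, the Adrenalin software should report Smart Access Memory as enabled (where exactly it shows this depends on the driver version).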

  • @kedarsharma487
    @kedarsharma487 2 года назад +16

    I don't remember the last time AMD sponsored an LTT video. It's been so long

    • @DailyCorvid
      @DailyCorvid 2 года назад

      AMD barely sponsor anything lol they spend almost their entire budget on R&D.

    • @ShadowGirl-
      @ShadowGirl- 2 года назад

      I believe AMD sponsored this to force them to not talk about Nvidia's use of ResizeBar, which does exactly the same thing and they gave it to us over a year ago. Also it was figured out that not all titles gain advantages from accessing more of a GPU's memory, so not all titles even make use of the technology. Normally these things would be mentioned by Linus, but this is the "cost" of sponsorship.

    • @DailyCorvid
      @DailyCorvid 2 года назад

      @@ShadowGirl- Lol you should rethink thinking

  • @marcasswellbmd6922
    @marcasswellbmd6922 2 года назад

    It's nice to see you go back to your roots, which was showing people how to get the most out of their tech. Just a shame someone has to sponsor the vid for you to do it.

  • @LucasDaRonco
    @LucasDaRonco 2 года назад +12

    Btw, this works exactly the same way on Nvidia: SAM is just called Resizable BAR, so go ahead and activate that too, and you also have DLSS for AI supersampling (or upscaling) with anti-aliasing included. You also get image sharpening if you want it.

    • @YaBoiBigNutz
      @YaBoiBigNutz 2 года назад +2

      Works for Nvidia 30 series only

    • @small_pc_gaming
      @small_pc_gaming 2 года назад

      Do you know why I don't see an "Above 4G Decoding" option?
      The one I have instead is called "Above 4G Memory/Crypto Currency Mining"

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Год назад

      no, I can enable ReBAR too; try updating your BIOS

  • @Cheesebread42
    @Cheesebread42 2 года назад +13

    I bought the 6950 XT last week for my RDNA2 system, so this was a nice reminder.
    I tried to enable it as described, but the Compatibility Support Module must still be enabled for my system to boot - apparently my boot drive was not set up as UEFI when I built it the other year. Linus missed an opportunity to catch this hiccup. I now need to convert the Master Boot Record/Legacy drive to UEFI.... >_>

    • @AdaaDK
      @AdaaDK Год назад +1

      Your issue is not linked to SAM (or anything AMD-related, really). It's a Windows security thing (required for Win11) that you're running into, from what you're saying.

  • @apebabel
    @apebabel 6 месяцев назад

    I planned on getting an AMD build because of the CPU/GPU compatibility angle. It's nice to see that they have software that actually intertwines the two. It should get better as time goes on. (This video is also 2 years old, so it should already be better.)

  • @jaredvillhelm2002
    @jaredvillhelm2002 2 года назад +26

    AMD going all in to be top dog, really excited to see where this pushes computing!

  • @landoishisname
    @landoishisname 2 года назад +4

    important to note that SAM can actually reduce performance in some cases

  • @onetunepauly1194
    @onetunepauly1194 2 года назад

    Thanks so much for this video. Recently upgraded my GPU to a 6700 XT and have a 3600X. Downloaded a BIOS update and turned these settings on

  • @imopapa6680
    @imopapa6680 2 года назад +59

    Undervolted my GPU because it was getting too hot. It also increased the FPS in some of my games. I feel like undervolting is seriously underrated.

    • @OmniUni
      @OmniUni 2 года назад +15

      Yup. I actually wish they'd delve more into the various options that Adrenalin gives you. AMD's control center is REALLY nice, and it makes it very simple to both undervolt AND overclock if you so choose.
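
      For what it's worth, a rough sketch of where that lives - menu names shift between Adrenalin versions, so take the exact path as an assumption: Performance → Tuning → Custom/Manual → GPU Tuning → enable advanced voltage control, then drop the voltage in small steps (say 25 mV at a time), stress test, and back off one step once it becomes unstable.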

    • @RetroPlus
      @RetroPlus 2 года назад +2

      Yeah it's fantastic

    • @thetruthisoutthere5173
      @thetruthisoutthere5173 2 года назад +4

      Extremely easy and underrated unfortunately

    • @ffwast
      @ffwast 2 года назад +6

      The manufacturers really configure these things for e-peen instead of reasonable power-to-performance.
      The top 150 watts of the 3090 Ti's power spec only buys about the top 10% of its performance - roughly 2/3 of the power for 9/10 of the FPS in benchmarks, using the power settings from the equivalent professional Quadro-type card.

    • @Catinthehackmatrix
      @Catinthehackmatrix 2 года назад +1

      It works better for me too to only use the mild MSI overclock profile 1 - maybe profile 2's system RAM overclock wasn't stable, or maybe it was the AMD drivers that got better over the last two years. With this PC I'm always learning something.

  • @tntTom174
    @tntTom174 2 года назад +4

    I built a new system again after like a decade just a couple of days ago, and now I learn about this? Wonderful. I mean it, because I let myself be talked into a 5600X and 6700 XT instead of a 12400F and 3060 Ti, and I'm actually really happy that I did.

    • @blee-bleep3906
      @blee-bleep3906 2 года назад

      congrats on your upgrade! i also got talked into joining team red, and honestly, after building mine (r5 5600g x A580), i don't think i'm going team blue anytime soon - the value is too good

    • @tntTom174
      @tntTom174 2 года назад +1

      @@blee-bleep3906 Thx. I've had them all over time - team blue, green, and red - and I was always happy with what I got. Because as long as you do your homework and shop smart for your own needs instead of blindly fanboying for this or that brand, you can't go wrong with either. True though - over time they sometimes out-race one another and one can lead the field for a bit, but that's normal.

  • @fyntop
    @fyntop 6 месяцев назад +2

    rare footage of linus tech tips actually giving tech tips

  • @djsnowpdx
    @djsnowpdx 2 года назад +6

    You’re not the only one, Alex! I haven't attempted undervolting yet, but I slash my CPU and GPU power limits to their minimum values (98-watt 5700XT and 87-watt 5950X) when it's warmer outside than inside.

    • @joel3399
      @joel3399 2 года назад

      I just have an RX 570, but the heat bothers me so much that in summer I usually play on my laptop's integrated graphics

    • @rrekki9320
      @rrekki9320 2 года назад

      @@joel3399 Have you tried repasting it and freeing it of dust? Otherwise I can recommend taking off the shroud and zip-tying two 120mm Arctic P12/F12 fans onto it, and you're done :D

    • @CarbonPanther
      @CarbonPanther 2 года назад +1

      A smart tip for the 5700XT is to reduce the clock slider instead of the power limit.
      You'll save about as much on power consumption, but you'll keep much more of the performance that you'd otherwise lose by hampering the card's power limit!
      Try running the card at 1600 MHz, look at the FPS before and after, and please tell me the results!

    • @-eMpTy-
      @-eMpTy- 2 года назад

      @@rrekki9320
      lowering the temps by repasting and/or adding fans doesn't decrease the heat output to your room

    • @CarbonPanther
      @CarbonPanther 2 года назад

      @@-eMpTy- I know this will be heavily debated but i firmly believe that: The cooler the Chip runs and the lower the surface temperature of the heatsink is, the colder the air will be that is expelled from the case into the ambient room.

  • @SoniasWay
    @SoniasWay 2 года назад +6

    It’s been so long since the last time AMD sponsored LTT

    • @makisekurisu4674
      @makisekurisu4674 2 года назад

      Goes to show how rich they got off miners and Ryzen.

    • @conorabc
      @conorabc 2 года назад

      @@makisekurisu4674 I’m going to assume that they made much, much more money in the datacenter. Either way it’s good to see them put this newfound cash flow into R&D and sponsorships like this one. They have a mindshare problem

    • @ShadowGirl-
      @ShadowGirl- 2 года назад

      I believe AMD sponsored this to force them to not talk about Nvidia's use of ResizeBar, which does exactly the same thing and they gave it to us over a year ago. Also it was figured out that not all titles gain advantages from accessing more of a GPU's memory, so not all titles even make use of the technology. Normally these things would be mentioned by Linus, but this is the "cost" of sponsorship.

  • @fw.phxtom
    @fw.phxtom Год назад

    Years later and that intro animation still leaves me amazed

  • @NFG-Nero
    @NFG-Nero 2 года назад +9

    "You wanna faster gpu?"
    No Linus
    i just want a gpu

    • @lvl5monk297
      @lvl5monk297 2 года назад +1

      GPUs are basically back at msrp on ebay. Get with the times lmao the shortage is essentially over

    • @NFG-Nero
      @NFG-Nero 2 года назад +1

      @@lvl5monk297 not here, where inflation exceeds 12% and a +23% tax has to be paid :/

  • @-_-_-_-_
    @-_-_-_-_ 2 года назад +11

    It's really amazing that we've developed technology so far that the universe itself is posing hard limitations on us. Imagine telling someone in the 1930s that by 2022 we will have developed technology so far that the laws of physics start to break down (well not really but it sure feels like it) and we can't really progress any further.

    • @philmccracken2012
      @philmccracken2012 2 года назад +3

      Mina......What are you babbling on about? What are you even referring to?

    • @coolghoul9
      @coolghoul9 2 года назад +4

      @@philmccracken2012 I think they are referring to the 120v outlet the universe gave to us, if only there was a bigger outlet to be discovered

  • @Mandaeus
    @Mandaeus 2 года назад

    If you look at the top right of the BIOS menu bar @4:13, you will see the option "Resize BAR". This is the option you need. It was there on mine ;)

  • @gamiseus
    @gamiseus 2 года назад +6

    I had an AMD card in my last PC, and literally just opening the Radeon center program allowed you to tweak settings for much better performance. It was awesome how easy they made it. Stay awesome amd!

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Год назад

      yeah, AMD's basically more customizable; Nvidia and Intel are more plug-and-play, great at defaults

    • @leanlifter1
      @leanlifter1 Год назад

      ​@@iikatinggangsengii2471 'Cept on Linux, where AMD is plug-and-play and Nvidia is a big L. Meanwhile, the Radeon control panel on Windows is decades ahead of Nvidia's, as they haven't updated their outdated console in almost two decades lol.

  • @why_though
    @why_though 2 года назад +10

    The basic idea of quantum tunneling is actually quite simple: all particles are waves. Waves are inherently not point-like. This manifests itself in the form of a probability density function... a fancy way of saying that electrons are by definition not located at an exact point in space, according to our best physics theory. Their location is fuzzy, in a way. This means that at short distances they can effectively teleport through any barrier. We call this quantum tunneling, as it is not really teleportation - their location wasn't exact to begin with. This is just how the universe works; location is not exact, and we just have to accept that. For chips this means that electrons can randomly escape barriers, which breaks the fundamental idea of chip computing, which is based on trapping these electrons with barriers.
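
    For anyone who wants the single formula behind this: in the textbook rectangular-barrier model (a simplification, not specific to any real transistor), the probability T that an electron of energy E leaks through a barrier of height V and width L falls off exponentially with the width:

        T \approx e^{-2\kappa L}, \qquad \kappa = \sqrt{2m(V - E)} / \hbar

    So shrinking the barrier doesn't increase leakage linearly - halving L takes the square root of T, which can turn a negligible leak into a serious one. That's the squeeze chipmakers face as features get smaller.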

    • @ydfhlx5923
      @ydfhlx5923 2 года назад +1

      Let's be real, there's like a zero percent chance anyone without a physics background understands quantum tunneling.

    • @conorabc
      @conorabc 2 года назад +8

      @@ydfhlx5923 idk OPs explanation helped me understand it a little and I’m a certified idiot

    • @why_though
      @why_though 2 года назад

      @@ydfhlx5923 Not really. But you are right in the sense that this topic is extremely complex and to understand it on a deep level you do need a physics background. Not to mention that one could argue even physicists don't really understand it, they just know the theory, which we know is still incomplete... However, a basic understanding is not as hard to acquire as you may think. The internet and even YouTube has loads of easy to grasp resources on physics phenomena like this. You just have to dive in and you'll see it can be learned.

    • @why_though
      @why_though 2 года назад

      @@ydfhlx5923 And quantum physics being so mind-blowing and eye-opening is what makes me want to keep learning more... and no, I'm not a physicist by trade.

    • @Priyajit_Ghosh
      @Priyajit_Ghosh 2 года назад +1

      Here's a like from me, Alex - you've explained it beautifully. I'm an electronics engineering student and will have to study this for the next couple of years (or throughout my life). This stuff is damn interesting and complex, though.

  • @OxllSunslayer
    @OxllSunslayer 2 года назад

    Linus is changing manufacturers' marketing and business conduct. You, sir, are a legend. To think I was watching Linus back when I used to buy stuff at NCIX. Love how I still watch Linus, and NCIX, well.... :)

  • @GasolineNl
    @GasolineNl 2 года назад +13

    With my new 6700XT at 1440p ultrawide I gain 2 to 5 FPS in Assetto Corsa. Playing at around 110-120 FPS.

    • @NotJulius44
      @NotJulius44 2 года назад

      that's it?

    • @10vingers
      @10vingers 2 года назад

      @@NotJulius44 It's marginal. But it's more, so he did not lie ;)

    • @Skelterbane69
      @Skelterbane69 2 года назад

      6700XT gang

  • @aschen
    @aschen 2 года назад +22

    Just here to say I love the design of that new ASRock 6950xt. I keep seeing it on Newegg and other places and it's so simple. I love it. Not the bladed gamery nonsense most other brands go with. And it's sad because I'd prefer Asus. Good on them though

    • @ShiroCh_ID
      @ShiroCh_ID 2 года назад +1

      Aren't ASRock and Asus the same?
      I mean, "technically"

    • @Ravenousjoe
      @Ravenousjoe 2 года назад +1

      ASRock is also new to the GPU game, so their cards may have some pretty subpar components. Their first RX 580 (only 2 generations ago) performed worse than AMD's reference blower cooler and was just a bad product. ASRock is known for cost-cutting, so imo they are a risk to buy until we see plenty of reviews of numerous products from them.

    • @aschen
      @aschen 2 года назад

      @@ShiroCh_ID that is true. I didn't even know that lol. Just looked it up

    • @CarbonPanther
      @CarbonPanther 2 года назад

      @@Ravenousjoe It's true that their low-end cards (including the low-end coolers on mid-to-high-end cards) usually aren't on par (I myself had that 580 and it blew my ears out lol).
      But jokes on you though: the ASRock RX 6900/6950 XT OC Formula are some of the finest cards you can buy right now; MSI and Gigabyte don't stand a glimmer of a chance against their phenomenal performance!

    • @Ravenousjoe
      @Ravenousjoe 2 года назад

      @@CarbonPanther who has done a review of them?

  • @BogDog9
    @BogDog9 2 года назад +2

    i have this working with a B350 board, a 5800X3D, and a 5700XT - sweet compatibility!

  • @hawaiian_143
    @hawaiian_143 2 года назад +3

    this is awesome - how the heck is AMD not crushing Nvidia? also, Linus might have gone over how to turn FSR on/off, just saying

    • @X3Andy
      @X3Andy 2 года назад

      In game settings.

  • @probablyyourneighbororsome8412
    @probablyyourneighbororsome8412 2 года назад +6

    r9 380x user here: I don't want a faster gpu, I want a supported one that doesn't have dozens of graphics errors all the time!
    THE DAMN THING IS ONLY 6 YEARS OLD!

    • @sayacee5813
      @sayacee5813 2 года назад +3

      nimez driver

    • @chestermc9954
      @chestermc9954 2 года назад

      There are unofficial drivers for the 380x that keep it supported. Also I'm pretty sure that the Linux drivers are still fully supported.

  • @bricefleckenstein9666
    @bricefleckenstein9666 Год назад

    1:49
    The USA actually runs on 240-volt power - but "split phase", which gives many of our outlets 120.
    If you want 240, it's fairly trivial as long as you can run the cable from the breaker box - and most computer power supplies for the last couple of DECADES have been designed to handle anything from around 100 to 250 volts with the right power cord.
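
    The arithmetic here is worth spelling out, since it's just P = V × I: the same load draws half the current at 240 V, and a circuit's wattage ceiling scales with voltage.

        1000 W ÷ 120 V ≈ 8.3 A        1000 W ÷ 240 V ≈ 4.2 A
        15 A × 120 V = 1800 W         15 A × 240 V = 3600 W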

  • @12coco100
    @12coco100 2 года назад +6

    FSR 2.0 looks really good; I recommend watching MULTIPLE videos on it for Deathloop. Console-level GPUs will be able to do 4K 60 FPS gaming if FSR 2.0 lands in more games.

  • @zkatt3238
    @zkatt3238 2 года назад +4

    FSR works on Nvidia GPUs as well, since it isn't tied to specialized hardware like Nvidia's DLSS

    • @Ishan.khanna
      @Ishan.khanna 2 года назад

      Yep.
      Sadly, the two can't be used together lol.
      At least I haven't seen it in action

    • @janisir4529
      @janisir4529 2 года назад

      @@Ishan.khanna Just use a bigger upscaling for DLSS, duh.

  • @kingdon9690
    @kingdon9690 2 года назад +1

    I'm running a 1080p Sceptre 165 Hz 24-inch monitor with RSR on - not bad - and AMD FreeSync Premium enabled.

  • @Ray13star
    @Ray13star 2 года назад +5

    RSR on my XFX RX 5700 XT THICC III is great at 1080p. The image quality that comes out is similar to native 2K in most games that I play. While some shimmering occurs, it isn't much of a distraction (since I don't play competitive esports games).

    • @gotworc
      @gotworc Год назад

      ​@elcactuar3354 it's not - 1080p is HD, 1440p is 2K, and 3840 is 4K

  • @KalebSDay
    @KalebSDay 2 года назад +17

    You really should have mentioned that SAM can have negative effects in some games. Hardware Unboxed made a few charts outlining this. SOME games see improvements, but it is NOT universal, and it can hurt performance depending on the title. Hope you add an edit to the video to clarify this.

    • @CarbonPanther
      @CarbonPanther 2 года назад

      Weren't the deficits with SAM minimal to none, though? And wasn't Nvidia's implementation the one that actually degrades performance noticeably more often than not?

    • @KalebSDay
      @KalebSDay 2 года назад

      ​@@CarbonPanther There were definitely bigger negative outliers, such as Apex Legends taking a -10% performance hit at 1080p with SAM enabled. My main point is that SAM is not universally positive, and LTT should have mentioned this in their review/highlight video just so people are aware SAM can hurt performance too. Users should check whether SAM negatively affects the games they intend to play on a consistent basis before enabling it.
      For those interested, check out Hardware Unboxed's video titled 'AMD's Killer Feature? SAM, 36 Game Benchmark [1080, 1440p & 4K]'. I would link the video, but I've had my comments hidden before for doing that. The 1080p, 1440p, and 4K results are all around the 12-minute mark.

  • @drizzle8763
    @drizzle8763 7 месяцев назад

    I have a Radeon 570; it's my first PC and I've had it for a year now, and I still love it

  • @TheIrishAlchemist205
    @TheIrishAlchemist205 2 года назад +16

    Would love to see more about how tech can be more sustainable, Linus. You're right - we're at a point where there's no real reason for the average consumer to have much that's more powerful than what we've got.

    • @TheIrishAlchemist205
      @TheIrishAlchemist205 2 года назад +2

      @Christopher Grant for real. if AMD's (for instance) gear from 2, 3, or 4 years ago can still compete with modern things, why are we wasting silicon and cobalt like this?

  • @durillongaming
    @durillongaming 2 года назад +17

    i like how right after intel sponsors ltt, AMD instantly realizes that they need to as well

    • @zodwraith5745
      @zodwraith5745 2 года назад

      To be fair, that massive glut of Intel-sponsored videos wasn't straight-up infomercials like this. Hell, most of them barely even mentioned Intel existed.

    • @durillongaming
      @durillongaming 2 года назад

      @@zodwraith5745 good point. well, I mean, at least AMD had LTT make a useful video; even if it literally seems like an ad, a lot of people might not know these things

    • @zodwraith5745
      @zodwraith5745 2 года назад

      @@durillongaming True, but it says a lot about Linus's integrity of late. Gamer's Nexus or HUB would have straight up told AMD to F off. And they have. But LMG is always for sale.

    • @durillongaming
      @durillongaming 2 года назад

      @@zodwraith5745 well why not? amd makes pretty good products, i have no problem with ltt getting sponsored by them

  • @frando2479
    @frando2479 2 года назад +2

    Great... I cannot even boot anything anymore. Thanks for not mentioning any of the problems one could run into, like having to convert MBR to GPT, or still not seeing the Resizable BAR option at all and what that means, etc.
    From "oh, just a few clicks" to hours wasted for nothing real quick.

  • @nukedathlonman
    @nukedathlonman 2 года назад +5

    With SAM, you/AMD forgot one little hitch - it's not quite as simple as flipping a UEFI switch IF you aren't using GPT partition tables. Yes, I'm working on that migration as I type this. :-)

    • @bigmattvo
      @bigmattvo 2 года назад

      yeah, that's what I'm struggling with. How do you migrate without losing your data? Does that mean I have to re-buy Windows?

    • @mirano15
      @mirano15 2 года назад

      I think for the majority of people it's not a major issue. Who uses MBR or NTFS these days on a regular system?

    • @nukedathlonman
      @nukedathlonman 2 года назад

      @@bigmattvo No, but make sure you have a USB boot key. YouTube doesn't allow me to post links, but there is a good walkthrough on Microsoft's site for how to go about it. It's titled "Convert an MBR disk into a GPT disk"

    • @nukedathlonman
      @nukedathlonman 2 года назад +1

      @@mirano15 Someone who has only been upgrading the same system for many, many, many years (and never needed to re-install Windows (or Linux)). ;-)

    • @mirano15
      @mirano15 2 года назад

      @@nukedathlonman My condolences. How many years has that system been running?

  • @oldowl4290
    @oldowl4290 2 года назад +3

    I have a dedicated circuit for my PC called an "isolated ground circuit". I used a 20-amp breaker and 12-2 Romex wire (which is rated for 20 amps; 14-2 is rated for 15 amps), and the breaker is installed at the position in the service panel closest to ground, with the ground wire isolated. This is the best thing you can do, short of additionally adding something like a Furman power conditioner for cleaning/stabilizing the power. Regardless, off this one circuit I run my PC, which has two video cards and drives 6 LCDs, the largest being a 32-inch 4K.

  • @Burbun
    @Burbun 7 месяцев назад +1

    You see an awful lot of people talking about Nvidia marketing terms as if you can't do something similar on AMD, so it's nice to see someone talk about AMD's features - especially the free extra frames from using all-AMD hardware. I think I heard some rumbling about Intel working on something similar.
    "I would totally use AMD, but they didn't have fleep florps, not to mention AI-powered X frames."

  • @the_dietcam
    @the_dietcam 2 года назад +6

    I wish they implemented an auto-downscaling feature with an FPS goal. Then when you're indoors, for example, you'd play at full resolution, but when you're outside it would drop the res a bit, or fluctuate with movement, to maintain your higher FPS. That kinda puts a damper on FreeSync, where lower FPS is supposed to feel smoother, so maybe just set a minimum framerate of 60 and an upper limit of your monitor's max; god forbid the game drops below 60, only then would the res lower, which should still make for a pretty smooth experience. Also, some games support dynamic quality settings after running a benchmark and analysing the system; I'm sure doing more of that would be helpful.
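
    What's being described here is a feedback controller on render scale. A toy sketch of the idea in Python - the constants and function names are made up for illustration, not taken from any real driver:

        TARGET_FPS = 60
        MIN_SCALE, MAX_SCALE = 0.5, 1.0   # render scale as a fraction of native resolution
        STEP = 0.05                        # how aggressively to react each frame

        def adjust_scale(scale: float, frame_time_ms: float) -> float:
            """Nudge the render scale toward an FPS target based on the last frame time."""
            fps = 1000.0 / frame_time_ms
            if fps < TARGET_FPS * 0.95:        # below target: shed pixels
                scale -= STEP
            elif fps > TARGET_FPS * 1.10:      # comfortable headroom: claw detail back
                scale += STEP
            return max(MIN_SCALE, min(MAX_SCALE, scale))

        # A 22 ms frame (~45 FPS) at native res drops the scale one notch:
        print(adjust_scale(1.0, 22.0))   # -> 0.95

    Real implementations smooth the measurement over many frames to avoid visible resolution "pumping", but the core loop is this simple.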

    • @GreenCinco12Official
      @GreenCinco12Official 2 года назад +2

      Apex Legends has this. Idk what it's called, but it can automatically adjust the resolution and details (I think).
      I used it with my old GTX 960 and i5-4570. It looked like sh*t but it ran smoother.

    • @TanteEmmaaa
      @TanteEmmaaa 2 года назад +2

      Some games provide auto-resolution options, and they are great; I use them all the time if available. Sadly, it seems impossible to combine this with FSR.

    • @Darkbuilderx
      @Darkbuilderx 2 года назад +1

      FSR should be possible as long as the devs want to add it, but doing it with RSR (the system wide one) really isn't feasible since it changes the actual screen resolution.

  • @ilcugginocanadese
    @ilcugginocanadese 2 года назад +13

    It's important to note that to enable the feature in the BIOS settings, the boot drive has to be set to UEFI, or it will not boot at all after changing the settings. It's a bit of legwork to convert a regular MBR Windows installation to boot from UEFI, but it can be done. I found out the hard way: after changing the settings as per the instructions, the system would simply not boot and instead cycled back to the BIOS setup page. Took me a few minutes to figure it out.

    • @rafaelduarte5875
      @rafaelduarte5875 2 года назад

      But how did you convert it? I'm running into the same problem.

    • @ilcugginocanadese
      @ilcugginocanadese 2 года назад +1

      @@rafaelduarte5875 You have to change the partition/boot type to be UEFI-enabled. There's a CLI command for it. I don't have it offhand, but you can find it on the internet.
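
      For anyone else landing here: the built-in Windows tool is mbr2gpt.exe (included since Windows 10 version 1703). From an elevated command prompt, and after backing up your data:

          mbr2gpt /validate /allowFullOS
          mbr2gpt /convert /allowFullOS

      The /allowFullOS flag lets it run from inside Windows instead of the recovery environment. After converting, switch the BIOS from CSM/Legacy to UEFI boot, or the system won't start.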

    • @rafaelduarte5875
      @rafaelduarte5875 2 года назад +1

      @@ilcugginocanadese thank you, I was able to find it after copy-pasting "convert a regular MBR Windows installation to boot from UEFI BIOS" into YouTube; there is a video for it showing where that option is. Thanks for your help - I would not have SAM active if not for your comment.

    • @jonasedvardsson7612
      @jonasedvardsson7612 2 года назад +2

      @@rafaelduarte5875 in Win 11: Disk Management, right-click the drive, convert to GPT, done 🙂

  • @user-mx2qx4rl9o
    @user-mx2qx4rl9o 2 года назад

    I love AMD. Not sure how the hardware is today, but a few gens ago I was so happy with my AMD GPU/CPU

  • @katerhamnorris3936
    @katerhamnorris3936 2 года назад +4

    here in Germany we get a max of 3650 watts out of the socket, but we have 230-volt outlets at 50 Hz
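
    That figure checks out if you assume the usual 16 A circuit behind a Schuko outlet: P = V × I = 230 V × 16 A = 3680 W - about double what a standard 120 V, 15 A North American circuit can deliver.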

  • @brothatwasepic
    @brothatwasepic 2 года назад +5

    Kind of nice that they don't just push patches that slow down their GPUs to force you to upgrade

    • @MaxIronsThird
      @MaxIronsThird 2 года назад +3

      Apple isn't in the dGPU market yet