AMD Says You’re Doing it Wrong. - Best settings for AMD GPUs

  • Published: 31 May 2024
  • Thanks to AMD for sponsoring this video! Learn more at lmg.gg/WAdzj
    Buy an AMD Ryzen 7 5800X3D: geni.us/tnn0GBY
    Buy an AMD Radeon RX 6950 XT: lmg.gg/brtnL
    Buy an AMD Ryzen 7 5700X: geni.us/iIZeLJB
    Buy an AMD Radeon RX 6750 XT: geni.us/DG0UoTU
    Buy an AMD Ryzen 5 5600: geni.us/B6xE
    Buy an AMD Radeon RX 6650 XT: geni.us/5Gz0n7
    Buy an ROG Ryujin 240: geni.us/iIoj
    Buy an ROG Thor 850W: geni.us/hm8kxQc
    Buy a Corsair MP600: geni.us/hTIiyh
    Buy a kit of Vengeance RGB 3600 16GB RAM: geni.us/epcOV
    Buy an ROG Strix X570-E Gaming Wifi: geni.us/Z6oTluT
    Buy a HYTE Y60 Mid-Tower: geni.us/8pV13t
    Purchases made through some store links may provide some compensation to Linus Media Group.
    Discuss on the forum: linustechtips.com/topic/14338...
    ► GET MERCH: lttstore.com
    ► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
    ► PODCAST GEAR: lmg.gg/podcastgear
    ► SUPPORT US ON FLOATPLANE: www.floatplane.com/
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    1:08 Undervolt
    3:28 Smart Access Memory
    4:16 Resize BAR
    4:50 RSR
    5:50 Benchmarking
    8:41 Conclusion
  • Science

Comments • 3.5K

  • @koizumiizumi5426
    @koizumiizumi5426 2 years ago +10714

    kinda nice seeing a company push the knowledge to get better performance with their equipment instead of just telling us to upgrade to better hardware so they get more money

    • @chessprince1164
      @chessprince1164 2 years ago +227

      It's simply smarter

    • @theunkownviolinist
      @theunkownviolinist 2 years ago +1

      let's be real, those of us who can afford to upgrade to better tech will do so when the new gen drops because we're sluts for fresh new tech

    • @MinkSquared
      @MinkSquared 2 years ago +325

      well, a majority of people won't buy their products if they say "buy this for gooder", because *no one* wants to mess with Vulkan.
      they're getting much more marketing value if they just seem like the cool older-brother company, as opposed to Nvidia giving a big L to Linux

    • @m_sedziwoj
      @m_sedziwoj 2 years ago +31

      yes, but not to mention that most of the video is FSR 1.0, which is bad for an educational video, because 2.0 will be different.

    • @gabrieldhondt6432
      @gabrieldhondt6432 2 years ago +96

      That was exactly the same thing I was thinking when I saw the title. That's the problem with a lot of new tech: people just don't understand it. People buy by names. One time I had someone say his GPU needed to be an RTX, and when I asked why, he couldn't respond.

  • @LonelySandwich
    @LonelySandwich 2 years ago +3416

    I don't remember the last time AMD sponsored an LTT video. It's been so long

    • @RossColborne
      @RossColborne 2 years ago +120

      Guess that's probably due to LTT being owned by Nvidia, so they didn't want to directly hand over cash to their rivals (more or less)

    • @opvask
      @opvask 2 years ago +196

      @@RossColborne yea because in this space you’re gonna make it big by putting all your eggs in one basket.. or not.
      LMG is taking everybody’s money, they don’t discriminate like that.

    • @keldon1137
      @keldon1137 2 years ago +125

      @@RossColborne I hate the internet; without facial expressions and stuff I can't really tell if you believe that or are just repeating an old joke

    • @paris6817
      @paris6817 2 years ago +89

      @@RossColborne that was an April fools joke 6 years ago… 😂

    • @riztiz
      @riztiz 2 years ago +31

      @@opvask They're referencing a joke made by Linus a few years ago

  • @semibreve
    @semibreve 1 year ago +547

    this is actually so cool to see AMD give this advice to end users: optimising and tweaking has always been the most slept-on and most confusing part of PC building, so it's awesome to see Linus covering this

    • @madangel71
      @madangel71 1 year ago +7

      Just activating SAM in the BIOS, I got 20% more FPS in the Division 2 benchmark. On my RX 580 8GB, it went from 90 FPS average to 108 FPS :D. In my Ryzen 9 5900X system, this GPU is a huge bottleneck, and activating SAM really helped.

    • @umamifan
      @umamifan 1 year ago +9

      It’s because “optimizing” is seen as a low spec gamer kind of thing, and is thus something a person who buys a mid range card doesn’t want to do. They expect out of the box settings, and don’t want any of the “low spec inconveniences” just because they think they’re too good for it, or because they “paid too much” to have to optimize!

    • @neildiamond610
      @neildiamond610 1 year ago

      @@madangel71 how did you manage to get SAM on? I also have an RX 580 8GB. Setting Resize BAR to On doesn't seem to work for me, because after I save the settings and restart, it says the VGA card is not supported.

    • @relax9086
      @relax9086 1 year ago

      @@madangel71 SAM doesn't work on the RX 580; it's not compatible. I know because I have a 580 card!

    • @madangel71
      @madangel71 1 year ago

      @@neildiamond610 Hi. I have an ASUS TUF GAMING B550-PLUS motherboard bought in February 2022, and when I activated SAM with the RX 580, it worked fine on the first attempt without any issue. I replaced the RX 580 with an RX 6700 XT in February 2023, and with the more recent GPU I got a new issue where one of the SSDs stopped being recognized. So it seems it depends on the motherboard/GPU combo.

  • @sythex92
    @sythex92 2 years ago +434

    3:00 - Quantum tunneling has been an issue for 5nm for a long time; it's only recently that TSMC has been able to overcome it. 3nm is where it really starts to get difficult though; we've made almost no progress on a 3nm node, and 4nm is a while away. I feel 5nm will be here to stay for quite a while.

    • @Jane-qh2yd
      @Jane-qh2yd 2 years ago +7

      I'm pretty sure Nvidia's upcoming Lovelace will be built on TSMC's 4nm platform

    • @IschChef
      @IschChef 2 years ago +49

      @@Jane-qh2yd TSMC "4nm" is not actually truly 4nm, just like "Intel 7" is not 7nm. The numbers are only marketing and completely decoupled from real transistor size.
      On top of that quantifying three dimensional transistor structures like we have since the start of the FinFET era with a single number is a bit questionable anyway.

    • @Jane-qh2yd
      @Jane-qh2yd 2 years ago +9

      @@IschChef The comment above is specifically talking about TSMC chips, so TSMC 4nm is relevant here

    • @yacquubfarah6038
      @yacquubfarah6038 1 year ago +11

      @@IschChef Nah they are 4nm lmao.

    • @Ignacio.Romero
      @Ignacio.Romero 1 year ago +8

      @@IschChef TSMC nm sizes are real, unlike Intel, but that doesn't mean it's inherently better than Intel's process just because the number is smaller.
      That being said, mobile chips are already using TSMC's 4nm process (Snapdragon 8 gen 1, upcoming Apple A16), so your point about 5nm sticking for a while isn't true, or at least only true for computer CPUs

  • @jakedill1304
    @jakedill1304 2 years ago +1272

    I love how Gearbox has managed to turn every single advantage that you should be getting from cel shading into a disadvantage...

    • @coffee7180
      @coffee7180 2 years ago +216

      It's just insane. I always thought more devs should use cel shading for the great art style, fast development, great performance, great aging, and easy-to-build assets and content, but Gearbox showed me how incompetents can destroy everything.

    • @ishiddddd4783
      @ishiddddd4783 2 years ago +161

      indeed lmao, next to 0 graphical improvement, but a game so terribly optimized that it's stupidly demanding while looking like an early-2010s game

    • @Just_RJ
      @Just_RJ 2 years ago +31

      They should take some lessons from Arc System Works.

    • @silverhawkroman
      @silverhawkroman 2 years ago +15

      @@ishiddddd4783 probably Randy Pitchford's middle finger to gamers

    • @sammi7971
      @sammi7971 2 years ago +41

      @@silverhawkroman Oh he already did that when he accepted that huge check from Epic Games to make the Borderlands franchise an Epic Games store exclusive.

  • @TheMx5Channel
    @TheMx5Channel 2 years ago +1741

    It would help a lot if developers started optimizing programs instead of just increasing raw system power.

    • @burkino7046
      @burkino7046 2 years ago +214

      especially looking at you, Windows; there's basically no reason why you still need code that's 20 years old

    • @clipsedrag13
      @clipsedrag13 2 years ago +80

      why optimize when you can just keep releasing new hardware

    • @CarbonPanther
      @CarbonPanther 2 years ago +263

      Imagine if all games were as optimized as Doom Eternal, we wouldn't even need ANY of the tech shown here... What a dream to have.

    • @SiniSael
      @SiniSael 2 years ago +22

      You can't do that - that's common sense...
      Like, it's obvious, so of course they won't do that - it's all about profit! ;)

    • @moistjohn
      @moistjohn 2 years ago +101

      The time honored tradition of throwing hardware at a software problem

  • @dnakatomiuk
    @dnakatomiuk 1 year ago +117

    "If GPUs get any more power hungry..." Nvidia's 4000 series launches a few months later

    • @attmrcmailik9653
      @attmrcmailik9653 1 year ago +2

      Such a TDP isn't really high. Check out AMD's top R9 models (450W stock in 2013)

    • @OsGere
      @OsGere 1 year ago +4

      @@attmrcmailik9653 yeah, that was in 2013, so it's irrelevant now 😂 it's 2023 now, buddy

    • @10siWhiz
      @10siWhiz 1 year ago

      Electric suppliers wringing their hands

    • @frostedflakes3168
      @frostedflakes3168 3 months ago +4

      40 series uses a lot less power than the previous gen of GPUs

    • @iamhardwell2844
      @iamhardwell2844 11 days ago

      The 4070 only uses a single 8-pin PCIe connector LOL

  • @goolash1000
    @goolash1000 1 year ago +8

    FYI: most US outlet boxes are fed by two phases, which means that in many cases a two-outlet wall fixture can be converted by a qualified electrician into a single 220V outlet for half the amp draw, and most PC power supplies are compatible with 220V out of the box.
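Since this comment leans on the relationship between volts and amps, here is a quick sketch of the math. The 850 W figure below is a made-up example load, not a measurement of any specific system:

```python
# Why a 220-240 V feed halves the current draw for the same load.
# system_draw_w is an assumed example figure.

def amps(watts: float, volts: float) -> float:
    """Current drawn by a load of a given power: I = P / V."""
    return watts / volts

system_draw_w = 850  # hypothetical gaming PC under full load

for volts in (120, 240):
    print(f"{system_draw_w} W at {volts} V -> {amps(system_draw_w, volts):.2f} A")

# A US 15 A branch circuit is usually limited to ~12 A continuous (80% rule),
# so two such systems on one 120 V circuit would be marginal, while the same
# load split across a 240 V circuit draws half the amps.
```

The point of the comment follows directly: doubling the voltage halves the current for the same wattage, which is what frees up headroom on the circuit.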

  • @sirspamalot4014
    @sirspamalot4014 2 years ago +1020

    We're finally coming back to a point where developers have to optimize their games to run on hardware for longer. Remember, the Xbox 360 can run GTA V even though it has no right to be able to; that's the kind of optimization I want to see, where 10-plus-year-old hardware can still run a game REALLY well.

    •  2 years ago +46

      And to this day it is the only version of GTA V that I have played. I really liked it, but not enough to give it another go when the other versions came out. I also think it was the last game I played on my 360, so it had a good send-off.

    • @SimonBauer7
      @SimonBauer7 2 years ago +16

      true. when the Ryzen laptop APUs launched, we were able to see just how slow the Xbox One and PS4 were. I mean, a Ryzen 3 APU can run GTA V at framerates the PS4 couldn't even dream of.

    • @jordanwardle11
      @jordanwardle11 2 years ago +8

      @@SimonBauer7 yes... 4 years later

    • @chriswright8074
      @chriswright8074 2 years ago +1

      @@SimonBauer7 not really

    • @bolland83
      @bolland83 2 years ago +6

      @ I was in the same boat; I spent the $99 on the special edition bundle when it came out on 360. I ended up getting it again on Steam though; it was on sale for $10, since I never play on consoles anymore. Anything I want now just goes on my wishlist and I get a notification if it goes on sale; if it's a good deal, I get it.

  • @NoLongo
    @NoLongo 2 years ago +760

    Game developers doing everything they can to make the most unoptimized games imaginable because resolution upscaling and other features give them more headroom.

    • @ShiroCh_ID
      @ShiroCh_ID 2 years ago +57

      lol, I thought the same thing
      we made a solution to a problem that shouldn't actually exist
      and seeing the ages-old Nintendo 64 games, PS1 games, or even some PS2 games, I can see that they sure worked around the hardware limitations nicely; heck, even C64 and MS-DOS games have the same clever workarounds for those limitations. Nowadays they just care about graphics and sometimes forget about space

    • @nogoat
      @nogoat 2 years ago +117

      The best example of this is the GTA Trilogy.
      Like, how can a remaster of a 20-year-old game have higher minimum requirements than your own title from 2018 (RDR2)?

    • @DailyCorvid
      @DailyCorvid 2 years ago +41

      If you ignore shitty triple-A games, then it's the opposite! Indie devs are rocking it currently.

    • @EnigmaticGentleman
      @EnigmaticGentleman 2 years ago +54

      Yep, I miss the 90s, when game developers actually tried to optimize their games so that as many people as possible could game. With computers having gotten as powerful as they have, there is simply no excuse for games to have trouble on cards like the 1060 (at medium settings).

    • @CaptPatrick01
      @CaptPatrick01 2 years ago +13

      @@ShiroCh_ID Squeezing water from stone as someone put it. It is a lost art.

  • @reyalPRON
    @reyalPRON 1 year ago +28

    This is also a great way to save power. Heat in your system will surely be less of a problem, and you might get away with less fan noise too :)

  • @Eugomphodus
    @Eugomphodus 2 years ago +8

    There's also a toggle (once enabled in the BIOS) to switch SAM on/off in the Radeon software. Some games, I've found, REALLY don't like this feature and perform really badly with it on, but I'm sure it'll become more useful as time goes on.
    An extreme example I've found so far is "Element TD 2". I've set a cap of 144 fps, and normally it's around 90-144 fps @1440p max settings during intense moments with a 5700 XT. But with SAM toggled on, the fps drops as low as 20 and it stutters a lot, including the audio.

  • @AdistiantoYuwono
    @AdistiantoYuwono 2 years ago +522

    It would be nice if there were an "efficiency war" for desktop components too, as fierce as the one in mobile devices. New Intel CPUs and Nvidia GPUs are no joke at sucking down electrons. It almost looks like, instead of improving processing techniques or architecture, adding more power is the easier way to gain marketable performance. In reality, however, electricity is very rarely free (in terms of money and environmental impact). Apple and other ARM vendors, as well as AMD in the laptop market, have already shown that this is actually possible in the consumer market.

    • @83hjf
      @83hjf 2 years ago +27

      the problem with Apple's M1 performance is that it's only good for specific things, because the efficiency comes from having dedicated hardware for common tasks - for example, compressing one specific video codec, or encrypting with one specific algorithm. GPUs are general-purpose devices, because what they do changes with every game. Apple (and ARM) are no gods with secret knowledge; they just optimize for 99% of use cases. An M1 machine, as powerful as it is right now, is probably not going to fare that well in 10 years when we have moved away from H.265, for example, and it has to do software decoding on the CPU.

    • @Hybris51129
      @Hybris51129 2 years ago +8

      @SomeoneOnlyWeKnow Honestly, I think this is more a sign of the years of "you don't need a 600, 700, 800, 1000W PSU" advice coming back to bite gamers hard. Bad advice has now driven people to add another $150-300 to their upgrades because they didn't build their rigs to handle the march of technology.
      While power efficiency is nice, just looking at how desktops have gone from a 250W PSU in the late 90s/early 2000s to realistically only 1000W for exponentially more processing power, we have achieved a lot with relatively little increase in power usage.

    • @custard131
      @custard131 2 years ago +12

      I think if you look beyond the headline power ratings, there have been huge steps forward in efficiency too. It's just that on the top-end chips, that increased efficiency is always used to get more performance, because there isn't really any reason to reduce the power budget of high-end desktop components just for the sake of drawing less power, like there is in laptops, where battery life is a big selling point.
      If you wanted to, you could look at the product list and see how much performance you could get for a given power budget, and I'm fairly confident you would still see improvements from generation to generation.
      Related to that: in past GPU generations it wasn't that uncommon to have multi-GPU SLI setups with up to 4 GPUs. With Ampere, only the 3090 even supports SLI, and then only 2-way, but it can still outperform even the most overkill of past-gen setups.
      Another thing to keep in mind is that manufacturers have also been improving turbo boost / variable clock rates, so while the headline figure for a shiny new GPU might say it can draw 400W, that doesn't mean it draws that much all the time; it means that if you give it a workload that can push it to its max, that's how much it has to give.

    • @poipoi300
      @poipoi300 2 years ago +4

      @@custard131 Thank you for that comment, I was about to write something similar haha. People focus on the power right now and act like it's always more power for more performance, while you could probably take any one generation over the last at the same wattage and it would outperform the last gen. An example of that is the 1080 Ti vs the 2080 Ti: the 2080 Ti is significantly faster while consuming the exact same amount of power out of the box. Using more power is entirely a matter of choice at that point, but I don't know many who would give up part of their performance to save not even a few cents an hour. Personally, if all it takes to get more performance is increased power consumption, I'll do it. After all, I'm not constrained by a battery like I would be on a laptop or phone; those are extremely different markets and sets of considerations. People who complain about wattage like that should seriously calculate the amounts they're complaining about. Of course it depends where you live, but electricity is cheap, and you're not using your gaming rig all year round either. Where I live, running something like a 2080 all year long (mining) costs you about $40 CAD, if I recall correctly.

    • @joshjlmgproductions3313
      @joshjlmgproductions3313 2 years ago +8

      @@custard131 Many reviewers state average and peak power draw separately.
      Also, comparing the 2060 Super to the 3060, it looks bad:
      RTX 2060 Super: 180W, 3DMark score 8739, MSRP $399
      RTX 3060: 210W+, 3DMark score 8766, MSRP $329
      That's why reviewers almost never compare cards to the previous generation.

  • @thatsgottahurt
    @thatsgottahurt 2 years ago +522

    Interested to see what Smart Access Storage or Direct Storage will bring to the table once we finally see it in games.

    • @DailyCorvid
      @DailyCorvid 2 years ago +44

      Probably a marginal gain for most people; I expect 2-3% at most.
      Software isn't really advanced enough to take advantage of it yet. Developers need to write code in a different way to make use of it.

    • @vith4553
      @vith4553 2 years ago +1

      check out Forspoken's results

    • @LiveType
      @LiveType 2 years ago +26

      Probably just stutter free high speed open world exploration. So performance in that aspect will see a massive boost. Otherwise, negligible difference as that doesn't touch the render pipeline.

    • @Archmage1809
      @Archmage1809 2 years ago +17

      DirectStorage helps asset streaming. But most likely it's gonna reduce game sizes (via a compression implementation that uses GPU horsepower to decompress), improve load times, and maybe improve the 0.1% and 1% lows.

    • @ThejusRao
      @ThejusRao 2 years ago +7

      @@DailyCorvid It doesn't offer any performance increase, but it's supposed to improve loading times by at least 30% if implemented right.

  • @JUMPJUMP2TIMES
    @JUMPJUMP2TIMES 1 year ago +1

    Resize BAR helped tremendously. Thank you so much!

  • @diederickvermeulen8927
    @diederickvermeulen8927 2 years ago +5

    This really helped. Thank you so much. In Assassin's Creed Valhalla I was getting only 40-50fps average at 1440p. After enabling SAM in the BIOS, plus RSR, it has increased to over 70fps. Really cool

  • @thestig007
    @thestig007 2 years ago +1073

    I think we also desperately need game developers to optimize their game engines, and we need to come up with better-performing game engines. Some games just run like crap even with top-tier hardware!

    • @sphygo
      @sphygo 2 years ago +109

      Like Minecraft, running one core at max and the rest barely at all...

    • @HMSNeptun
      @HMSNeptun 2 years ago +35

      @@sphygo fabric+sodium/iris will fix that

    • @sphygo
      @sphygo 2 years ago +100

      @@HMSNeptun Yes but that doesn’t change the fact that it’s poorly optimized for modern processors. I wish the devs would focus on updating the base code to be more in line with the times. Movable tile entities are also something that could be fixed with a code rework, rather than relying on mods for all the performance optimizations.

    • @SirDragonClaw
      @SirDragonClaw 2 years ago +66

      Most modern game engines are very optimised. The real issue is game developers not making proper use of engines and leaning too hard on scripting languages.

    • @CanIHasThisName
      @CanIHasThisName 2 years ago +32

      The real problem here is people playing games on the absolute highest settings. Those aren't meant for top-tier hardware of today; those settings are there for several generations later, when the game runs great even on modern low-end hardware. And don't get me started on people expecting high FPS with RT enabled.
      Most games that hit the market are well optimised and run well; you just gotta take your time to set the details right for your HW. Nowadays you often can't tell the visual difference between Ultra and High except for some absolutely minor details, and even Medium often doesn't bring any notable graphical downgrade. The problem is that people still think they're just gonna put everything on Ultra and call it a day. Or worse, they're just using the presets without going through the individual settings.

  • @seank4148
    @seank4148 2 years ago +166

    Just FYI: if your boot drive uses an MBR partition table, enabling Resize BAR disables CSM support, which means you can't boot off your MBR drive. MBR boot needs CSM enabled, but CSM can't be enabled at the same time as Resize BAR. You'll have to first convert your MBR boot drive to GPT to enable Resize BAR.
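On Windows 10/11, the conversion this comment describes can be done in place with Microsoft's built-in `mbr2gpt` tool. This is a command sketch, not a full guide: back up first, and note that `/disk:0` is an assumed disk number; verify yours (e.g. in `diskpart` with `list disk`) before running anything.

```shell
# Sketch: MBR -> GPT conversion on Windows 10/11 with the built-in
# mbr2gpt tool. Run from an elevated command prompt. /disk:0 is an
# assumption - confirm your actual system disk number first.

# Dry run: verifies the disk can be converted without modifying it
mbr2gpt /validate /disk:0 /allowFullOS

# Actual conversion (back up your data first!)
mbr2gpt /convert /disk:0 /allowFullOS

# Afterwards, switch the firmware from CSM/Legacy to UEFI boot mode,
# or Windows will no longer boot; then Resize BAR / SAM can be enabled.
```

These are Windows-only administrative commands, so they are shown as a reference fragment rather than something to paste blindly.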

    • @mikebrownphotography2784
      @mikebrownphotography2784 1 year ago +7

      This might have been my issue, as I had to reset my BIOS after attempting to enable SAM. My PC wouldn't boot.

    • @jonasedvardsson7612
      @jonasedvardsson7612 1 year ago +1

      @@mikebrownphotography2784 converting the system disk to GPT isn't hard though 🙂

    • @5avvproductions41
      @5avvproductions41 1 year ago +2

      Great, now I'm stuck in the BIOS fml 😂

    • @joshua7015
      @joshua7015 9 months ago

      What if my boot drive uses a GPT partition table? Are there any precautions I should know about beforehand?

    • @seank4148
      @seank4148 9 months ago +1

      @@joshua7015 the only issue is with MBR, which is a much older way to set up a boot drive. GPT has no issues I'm aware of. 👍

  • @ChandlerUSMC
    @ChandlerUSMC 2 years ago +71

    Thank you, LTT and AMD. Someone finally said what I've been thinking since the 3xxx/69xx series launches. My wife and I play on two different computers in the same room - frequently together - it's a pastime we share. Being in the same room means we're on the same 15-amp circuit (U.S.). With the way things are going, how are we supposed to have 2 gaming machines on the same circuit? Are we both going to have a 4xxx Ti or Radeon 7xxx with modern CPUs? How, exactly? Are we going to be stringing wires from the study to the game room? What about smaller homes that don't have another room to extend from? Eventually, am I going to need to install custom electrical circuits (20+ amp) and treat a game room like a kitchen? That just buys some time, but how much time before we blow past that?
    I was also wondering what apartment dwellers do. Is the gaming market going to "power its way" past the people living in apartments because they can't power their graphics cards and CPUs? Then there's the cost in Europe.
    In other words, won't the idea of "more power!" winning over "more efficient" eventually shrink the available market for these devices?

    • @TheSliderW
      @TheSliderW 2 years ago +1

      That's my concern as well, and my wife and I do the same. I don't know about America or Canada, but in France (OK, we're on 230V, but still), we were always able to have LAN parties with 5+ players running off extensions from the same room's breaker, including file servers and other devices. My take is that your PCs are not going to be pushed to their maximum consumption during gameplay - only if you stress test them using specialised benchmarks. So you and your wife should be fine anyway.

    • @Tn5421Me
      @Tn5421Me 2 years ago +1

      Yes.

    • @Rushtallica
      @Rushtallica 1 year ago

      A good quality extension cable from another room.

    • @juliendesrosiers3177
      @juliendesrosiers3177 1 year ago

      I'm happy to live in Québec, where energy costs nothing.

  • @fraxesz1598
    @fraxesz1598 2 years ago +16

    This video is indeed very helpful for most people who aren't into PC hardware to get the most out of AMD hardware :D Keep it up, guys!

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago

      they just need to spend more time tinkering with and testing graphics settings; some settings are just not meant to be enabled/maxed on Radeons. OC etc. isn't exactly necessary (more fps is always welcome, but try to find the correct settings first before OCing)

  • @Steve30x
    @Steve30x 2 years ago +136

    2:01 here in Ireland, my electric bill due in two weeks is €133. The same bill last year was €80. That's a €53 increase, despite the fact that I've used less electricity this time around.

    • @MalcolmCrabbe
      @MalcolmCrabbe 2 years ago +10

      You got off lightly. My £100 monthly direct debit for electricity jumped to £265 from July 1st!!!

    • @emanuelmayer
      @emanuelmayer 2 years ago +2

      My next meter reading is in June 2022, and a week later the bill arrives. I plan not to buy anything until I have paid the increase -.-

    • @welkfor1753
      @welkfor1753 2 years ago +3

      Jesus, why is it so expensive over there? I've never seen a bill higher than 30 bucks for a month

    • @emanuelmayer
      @emanuelmayer 2 years ago +6

      @@welkfor1753 a large portion of electricity costs are taxes and stuff like "we have to pay nuclear plant owners because we forced plants to shut down, but we have contracts to let them run until... let's say 2050". Also ecological taxes (which should have been invested in renewable energies).
      I am usually at 1500-1700 kilowatt-hours a year.

    • @roqeyt3566
      @roqeyt3566 2 years ago +7

      I switched to solar, a Steam Deck, and eating more cold foods. Showering at the gym too. Even so, my bill went up...

  • @minerskills
    @minerskills 2 years ago +214

    I'd love to see a comparison of FSR 2.0 to DLSS and NIS.

    • @cromefire_
      @cromefire_ 2 years ago +72

      Hardware Unboxed has some of that. They tested NIS too; it's not too far off, but notably worse than FSR 1. FSR 2, though, seems to be almost at DLSS level, and on rare occasions even better (in the one game it's in, Deathloop).

    • @awetstone56
      @awetstone56 2 years ago +3

      Digital foundry probably has something

    • @minerskills
      @minerskills 2 years ago

      @@cromefire_ Ah thanks I'll check that out!

    • @ravenclawgamer6367
      @ravenclawgamer6367 2 years ago +2

      According to the testing so far, FSR 2.0 seems neck and neck with DLSS 2.3. However, does anybody know about Unreal Engine's TSR? I'd like to know if it performs better or worse than FSR 2.0.

    • @brothatwasepic
      @brothatwasepic 2 years ago

      Gamers Nexus is awesome at these comparisons

  • @Tobser
    @Tobser 2 years ago +19

    Thanks for the tip. It really improved the VR performance of my 6600 XT from unplayable to playable.

  • @hunn20004
    @hunn20004 2 years ago +7

    The Fury X is perhaps my favourite GPU that I wish I owned.
    The blower-style coolers on Radeons have always appealed to me; I wish they had actually paired them with low-watt components.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago +1

      yeah, Fury and Vega are interesting, mainly because they were probably only available in limited numbers

  • @natec1
    @natec1 2 years ago +269

    I love how computers are getting so small that modern physics is an actual issue

    • @citizenkane2349
      @citizenkane2349 2 years ago +21

      I got to admit it is pretty cool.

    • @kz03jd
      @kz03jd 2 years ago +57

      But isn't physics just plain.....physics? I mean it's the same 10,000 years ago as it is today. Physics doesn't change, our understanding of it does.

    • @Boringpenguin
      @Boringpenguin 2 years ago +9

      Even cooler, harnessing those "issues" properly will lead to the next breakthrough in computing. I hope I could see the day when general quantum computers enter the consumer market.

    • @BixbyConsequence
      @BixbyConsequence 2 years ago +27

      @@kz03jd "modern" as in quantum physics being a modern field of study.

    • @natec1
      @natec1 2 years ago +26

      @@kz03jd That’s true but the whole principle of modern physics is that we don’t understand it fully yet. Physics at our scale behaves differently than it does at the extremes.

  • @digiscream
    @digiscream 2 years ago +89

    Yeah...here in the UK, I had a noticeable reduction in my power bill by downgrading my daily driver to a Quadro K2200 (60W) a couple of months ago. Madness.

    • @tuckerhiggins4336
      @tuckerhiggins4336 2 years ago +9

      That's why I use a laptop, gives me the most efficient hardware

    • @silverhawkroman
      @silverhawkroman 2 years ago +3

      I'd rather pay 80 bucks over MSRP to get a 6600 that draws 70 watts less for 30-70% more performance than stick with my RX 580, even though it's seen as an entry-tier card. Plus, selling the old card will net me 150, so win-win!!!

    • @digiscream
      @digiscream 2 years ago +4

      @@tuckerhiggins4336 - sadly, I can't do that. Fan whine is a sensory nightmare for me, and quite a lot of what I do is fairly CPU-intensive stuff.

    • @tuckerhiggins4336
      @tuckerhiggins4336 2 years ago

      @@digiscream ah, the headphones must not do it for you then

  • @joshualopez9045
    @joshualopez9045 2 years ago +14

    For those who can't find DOCP: it's going to be EOCP or XMP depending on the motherboard you have. These are overclocking profiles for your RAM.

    • @antebellum1776
      @antebellum1776 1 year ago

      I still can't find it... does it depend on your motherboard/ram/cpu?

    • @joshualopez9045
      @joshualopez9045 1 year ago

      What motherboard do you have?

    • @antebellum1776
      @antebellum1776 1 year ago

      @@joshualopez9045 GA-AB350M-DS3H V2 (rev. 1.1)

    • @joshualopez9045
      @joshualopez9045 1 year ago +1

      In BIOS go to the M.I.T. tab. You'll move down to the Advanced Memory Settings then to Extreme Memory Profile (X.M.P.). Enable by selecting profile 1. Look on page 23 and 24 of your user manual that came with your mobo.

    • @antebellum1776
      @antebellum1776 1 year ago

      @@joshualopez9045 I'll double check this tomorrow when I get home but I'm pretty sure I don't have that setting anywhere.

  • @Neversettle0o
    @Neversettle0o 3 months ago

    Thank you! Great info!

  • @ThatNerdChris
    @ThatNerdChris 2 years ago +377

    Thanks for the FPS Papa Linus & AMD!

  • @faizmumtazramadhan2212
    @faizmumtazramadhan2212 2 years ago +38

    1:08 Now we know why Alex has been obsessed with so many crazy cooling solutions lately 🤣🤣

  • @Davidpp-ix6zu
    @Davidpp-ix6zu 1 year ago +1

    Great video! You explain very well and clearly 👍 💪

  • @onetunepauly1194
    @onetunepauly1194 1 year ago

    Thanks so much for this video. Recently upgraded my GPU to a 6700 XT and have a 3600X. Downloaded a BIOS update and turned these settings on.

  • @JordonAM
    @JordonAM 2 years ago +185

    FSR at 1080p can always be counteracted using Radeon Image Sharpening, and RSR just added a sharpening slider so you can also use that for games that don't support FSR. RIS isn't magic, but it'll definitely help some.

    • @johndoh5182
      @johndoh5182 2 years ago +5

      AND AMD will improve RSR over time.
      Frankly I think AMD is doing a great job right now, better than other companies for trying to get consumers through a period where money is tight. I don't like it when some reviewers spend an entire video bashing them for a specific product.

    • @AntonioNoack
      @AntonioNoack 2 years ago

      @@johndoh5182 RSR will never have FSR 2.0 though. They can't, because they don't have the motion vector information on all engines.
      They might be able to do some driver magic, but I doubt it.

    • @zodwraith5745
      @zodwraith5745 2 years ago +1

      DSR has literally been available on Nvidia since 2014 as well as a sharpener. Why do you think Nvidia hasn't said anything about RSR?

    • @JordonAM
      @JordonAM 2 years ago

      @@zodwraith5745 No one mentioned NVIDIA at all in this thread.

    • @zodwraith5745
      @zodwraith5745 2 years ago

      @@JordonAM My point was Nvidia did the exact same tech long ago. They don't promote it _because_ it sucks.
      You can't create data that's simply not there without AI having information to fill in blanks. It's like asking a robot to drive you to the store without explaining what a car is or how to drive. It often ends up looking even worse than just dropping the resolution.
      The only way you could effectively have high quality upscaling on the driver side is building it into a frame buffer to analyze multiple frames. Works great for video where adding 100ms of lag goes unnoticed as long as sound is synched, but completely unusable for gaming where adding just 20ms is highly noticeable.
      It's a boondoggle for marketing. Nothing more. Nvidia got their marketing in back in 2014, AMD's using it for marketing now.

  • @ravenclawgamer6367
    @ravenclawgamer6367 2 years ago +38

    Warning to Linus: a huge number of $5000 cheques from Intel coming in at high speed.

    • @nadir4562
      @nadir4562 2 years ago +3

      lmao

    • @NaoyaYami
      @NaoyaYami 2 years ago +2

      He'll just rip them again.

  • @ilovelamplongtime
    @ilovelamplongtime 1 year ago +2

    This worked well for me. 20-25ish frame increase in MWII and getting slightly cooler GPU temps

  • @JesusOfTheJungle
    @JesusOfTheJungle 10 months ago +2

    I run a 2080 Ti and moved to the Blue Mountains (Australia) in the last year. Through winter the house temp wouldn't get much above 8 degrees Celsius, but man, my gaming/entertainment room was, I mean, I wouldn't say warm, but by comparison to the rest of the house it was comfortable!

  • @jackbenimblejack1
    @jackbenimblejack1 2 years ago +33

    the best way to get more performance is by purchasing a $69.99 screwdriver from LTT

    • @Ghost-hs6qq
      @Ghost-hs6qq 2 years ago +3

      Linus? is that you?

    • @SaucerX
      @SaucerX 2 years ago +1

      It comes with rgb, or is that a separate purchase?

    • @DailyCorvid
      @DailyCorvid 2 years ago +3

      @@Ghost-hs6qq Is that you? Find out after this short sponsor message...

    • @frozenturbo8623
      @frozenturbo8623 2 years ago +2

      @@DailyCorvid Today's sponsor is Linus Tech Tips, Buy our $69.99 screwdriver for better performance from the RGB and 78.28% less kernel crashes.

    • @jackbenimblejack1
      @jackbenimblejack1 2 years ago +1

      @@Ghost-hs6qq yes it is me, Linus ... buy the LTT Screw driver and get your friends to buy it too.... I have to pay for a really really really expensive house

  • @BReal-10EC
    @BReal-10EC 2 years ago +65

    I think these temporal upscaling techs will become more and more important in the future as part of normal video compression. A 4k movie is 100 GB now.

    • @Sabbra
      @Sabbra 2 years ago +1

      It is 25 GB, c'mon

    • @Lubinetsm
      @Lubinetsm 2 years ago +14

      @@Sabbra if it's 25G, you got scammed and compression is a bit too lossy.

    • @sermerlin1
      @sermerlin1 2 years ago +8

      A 4K movie is about 50-60 GB; rarely you'll see 70-80 GB, and only extremely rarely (like Lord of the Rings) will it exceed 100 GB.

    • @formdoggie5
      @formdoggie5 2 years ago

      @@sermerlin1 That's not at 60 to 120 frames though.
      Getting 24-30 frames kinda defeats the whole point of all the tech you buy.

    • @Dave5gun
      @Dave5gun 2 years ago +8

      @@formdoggie5 For movies, 24 frames are enough, actually even better than 60. HDTVTest once explained that 24 frames look more dreamlike, not like a cheap smartphone camera. However, there has to be a certain amount of motion blur, which is not really the case with OLED anymore, so 24 frames can sometimes look pretty choppy. I still think it's better than 60 frames.

  • @thoraero
    @thoraero 6 months ago

    Ok that smart mem access really helps. Thank you.

  • @Tantium
    @Tantium 1 year ago +2

    I have this working with a B350 board with a 5800X3D and 5700 XT, sweet compatibility!

  • @GABN0MADormarcous
    @GABN0MADormarcous 2 years ago +96

    They actually gave us help so that we can enjoy our stuff. After this I have a lot more respect for them.

    • @cozza819
      @cozza819 2 years ago +3

      Absolutely 👍

    • @zodwraith5745
      @zodwraith5745 2 years ago +2

      Well, technically they simply copied Nvidia that did it years earlier. Nvidia even has an RSR type feature that they never talk about that doesn't require RTX cores. This is literally just an AMD infomercial. I'd rather have them spend the money actually getting it implemented into any real freaking games instead of just crying "me too!"

    • @GABN0MADormarcous
      @GABN0MADormarcous 2 years ago +1

      @@zodwraith5745 I don't disagree, but we do have to give them credit, as they didn't need to do it. It's still cool they did it; if anything it helps create a new standard.

  • @Melchirobin
    @Melchirobin 2 years ago +35

    This is a great way to do a sponsored video. I just expected weekend videos to be the huge ones, usually the ones you have to watch, so getting used to this is a change.

    • @vincent67239
      @vincent67239 2 years ago

      I didn’t skip the sponsor this time because I didn’t notice until the opening song started playing lmao

  • @johnrehak
    @johnrehak 1 year ago +7

    AMD's free performance gains are so simple, even without standard overclocking. The highest jump in my case came from simply enabling Smart Access Memory (SAM). Another free performance gain, which turned my Gigabyte Aorus RX 6800 XT into a reference RX 6900 XT and beyond, came from undervolting the GPU from 1150mV to 1100mV, increasing the power limit to +15% and switching memory timing to fast timings. I haven't touched the GPU clock or memory clock. The card boosts up to 2474MHz from factory settings and, as long as temps allow, it stays at that speed. FPS gains in CP 2077 at 1440p ultra: from 84fps avg. / 54fps min. to 92fps avg. / 60fps min. In Horizon Zero Dawn: from 135fps avg. / 60fps min. to 159fps avg. / 78fps min. (GPU FPS - 168, 95% - 152fps, 99% - 132fps). The Division 2: from 128fps avg. to 137fps avg.

    • @salxncy5998
      @salxncy5998 2 months ago

      Can you help me overclock mine? I have a Ryzen 5 5600G with an RX 6600 Eagle and 32 GB RAM.

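The percentage gains quoted in the comment above are easy to verify; here's a quick sketch (the `pct_gain` helper is our own name, not anything from AMD's tools):

```python
def pct_gain(before, after):
    """Percentage FPS improvement from `before` to `after`, rounded to 1 decimal."""
    return round((after - before) / before * 100, 1)

# Average-FPS figures quoted in the comment above
print(pct_gain(84, 92))    # Cyberpunk 2077, 1440p ultra -> 9.5 (%)
print(pct_gain(135, 159))  # Horizon Zero Dawn -> 17.8
print(pct_gain(128, 137))  # The Division 2 -> 7.0
```

So the undervolt-plus-power-limit tweak described above is worth roughly 7-18% average FPS in those three titles.
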
  • @israelurbano2515
    @israelurbano2515 1 year ago

    I like the return to talking about the ecosystem and electricity, beyond the cost of the utility: from amps to watts, money, and the wider repercussions for the climate.

  • @SeabooUsMultimedia
    @SeabooUsMultimedia 2 years ago +18

    Some AMD 300-series motherboards actually got Smart Access Memory support via a BIOS update. My X370 board got it with the update that introduced compatibility for 5000-series CPUs.

    • @ertai222
      @ertai222 2 years ago +1

      Yeah MSI gaming pro carbon x370 got it

    • @ansaranduin
      @ansaranduin 2 years ago

      @@ertai222 mine didn't let me update, I opened M-Flash, then it said 'Entering Flash Mode' and then was a blank screen. Did yours update fine?

    • @ertai222
      @ertai222 2 years ago

      @@ansaranduin yeah I've been updated since the BIOS came out.

    • @ansaranduin
      @ansaranduin 2 years ago

      @@ertai222 Just curious but does your system have an M.2 SSD, I've read that it could be causing my issue.

    • @ertai222
      @ertai222 2 years ago

      @@ansaranduin yes this board has 2. But if you use the second slot you get half speeds or something.

  • @josuad6890
    @josuad6890 2 years ago +21

    GPU power draw has been insane lately, so much so that I decided to just buy the 3070 series even though I can easily afford the 3080. Going with the 3080 would also require me to ditch my PSU, and I just don't feel like doing any more cable work ever again.

  • @dapz44
    @dapz44 1 year ago

    Great video, looked like you had fun making it as well. Good job guys 😁

  • @zezba9000
    @zezba9000 2 years ago +1

    HZD has foveated rendering (if not mistaken), which is the best solution.
    However, this probably confuses AI upscalers, as upscaling should happen per foveated region, not on the final image.

  • @bluehatbrit
    @bluehatbrit 2 years ago +18

    My AMD graphics card just hit 10 years old and it's still my main driver, with no tweaks to the settings, while running alongside a Ryzen 7. Perhaps it'll make it to 15 years when the prices have eventually come down!

    • @ShadowDragoon004
      @ShadowDragoon004 2 years ago +3

      Not sure if you've been out of the loop, but GPU prices are getting much more reasonable. I've seen some 3060s damn near MSRP very recently. Definitely much more affordable than it was at the beginning of the year. And with the crypto crash going on, there's a very decent chance stock on the 40 series will be reasonable enough to make a purchase then as well.

    • @10unnecessary01
      @10unnecessary01 2 years ago +1

      What card do you have and what games do you play that you're able to still use it? I upgraded from my 290X in 2019 and even then the difference was night and day.

    • @vanilla4064
      @vanilla4064 2 years ago

      unless there is too much supply and not enough demand, those prices aren't coming down.

    • @alishabab3
      @alishabab3 1 year ago

      Covid killed the pricing of cards... that and the conflict between China and Taiwan.

    • @kidthebilly7766
      @kidthebilly7766 1 year ago

      if your card is 10 years old then getting a 1080 ti for $200 is probably worth it lol

  • @piotrj333
    @piotrj333 2 years ago +44

    5:10 FSR doesn't use AI. It uses Lanczos upscaling with a sharpening filter. In FSR 2.0 it gets a bit more complicated, but it is still not AI.

    • @sh0dn
      @sh0dn 2 years ago +7

      Even DLSS doesn't use AI directly. DLSS was made with AI, and the resulting algorithm needs some hardware acceleration to be fast enough.

    • @m_sedziwoj
      @m_sedziwoj 2 years ago +5

      As above, DLSS is a closed system and we don't know what it does. If you listen to Nvidia, they claimed it uses AI but _only_ for 1.0; for 2.x+ they only say the algorithm is optimized by AI, so no AI inside. So if AMD uses AI to optimize the parameters of FSR 2.0 (and they should, to automate it), they can claim the same things as Nvidia.

    • @grn1
      @grn1 2 years ago

      AI is used as a bit of a catch all term for anything that uses a complex algorithm. In most cases machine learning, what you could call true AI, is used to create an algorithm that runs on lighter hardware but then that algorithm itself is often labeled as AI.

    • @piotrj333
      @piotrj333 2 years ago

      @@m_sedziwoj Not really. DLSS actually uses a convolutional autoencoder, and that is not simple parameters but actually applying a weight network. FSR 2.0 still doesn't reconstruct details on textures, but DLSS does, thanks to that AI step that FSR lacks. Also, Tensor cores are good at both training and applying machine learning.

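For context on the thread above: a Lanczos filter is an ordinary windowed-sinc resampler, no neural network involved. Below is a minimal, illustrative Python sketch of the Lanczos-2 kernel and a 1-D resample; FSR 1.0's actual EASU pass is a heavily optimized shader approximation of this idea, not this code:

```python
import math

def lanczos_kernel(x, a=2):
    """Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    # sinc(x) * sinc(x/a) = a * sin(pi*x) * sin(pi*x/a) / (pi*x)^2
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def lanczos_resample_1d(samples, t, a=2):
    """Evaluate the Lanczos reconstruction of `samples` at fractional position t."""
    lo = max(0, math.floor(t) - a + 1)
    hi = min(len(samples), math.floor(t) + a + 1)
    total = weight = 0.0
    for i in range(lo, hi):
        w = lanczos_kernel(t - i, a)
        total += samples[i] * w
        weight += w
    return total / weight if weight else 0.0
```

Upscaling just means evaluating that reconstruction on a denser pixel grid (in 2-D, once per axis); FSR 1.0 then runs a separate sharpening pass (RCAS) on top.
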
  • @LucasDaRonco
    @LucasDaRonco 2 years ago +12

    Btw, this works exactly the same for Nvidia. SAM is called Resizable BAR there, so go ahead and activate that too, and you also have DLSS for AI super sampling (or upscaling) with antialiasing included. You also get image sharpening if you want it.

    • @YaBoiBigNutz
      @YaBoiBigNutz 2 years ago +2

      Works for Nvidia 30 series only

    • @small_pc_gaming
      @small_pc_gaming 2 years ago

      Do you know why I don't see an Above 4G Decoding option?
      The one I have instead is called Above 4G memory/crypto currency mining.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago

      No, I can enable ReBAR too; try updating your BIOS.

  • @teamgaming1294
    @teamgaming1294 1 year ago

    I just enabled Resizable Bar on a Z170 Motherboard with the Bios Mod and yes it does give a good performance boost!!!

  • @omgawesomeomg
    @omgawesomeomg 2 years ago +90

    Props to AMD for allowing Linus to talk about the shortcomings of FSR 1.0

    • @ShadowGirl-
      @ShadowGirl- 2 years ago +8

      I believe AMD sponsored this to force them to not talk about Nvidia's use of ResizeBar, which does exactly the same thing and they gave it to us over a year ago. Also it was figured out that not all titles gain advantages from accessing more of a GPU's memory, so not all titles even make use of the technology. Normally these things would be mentioned by Linus, but this is the "cost" of sponsorship.

    • @chriswright8074
      @chriswright8074 2 years ago +25

      @@ShadowGirl- But AMD actually put in the work, unlike Nvidia, whose version either doesn't work or requires more effort than needed.

    • @lauchkillah
      @lauchkillah 2 years ago +13

      @@ShadowGirl- Nvidia did not support Res. bar at launch, AMD was the first to drop this tech. Early 3000 series GPUs require a vbios update to even support it

    • @ValentineC137
      @ValentineC137 2 years ago +18

      @@ShadowGirl- Nvidia's Resizable BAR literally just doesn't give the same performance improvement as AMD's SAM.
      Comparing Hardware Unboxed's videos on the two systems, on average:
      Nvidia is flat while AMD gains 3% at 1080p;
      Nvidia gains 1% while AMD gains 3% at 1440p and 4K.
      The peak gains are:
      11%, 9% and 6% at 1080p, 1440p and 4K for Nvidia;
      20%, 19% and 17% at 1080p, 1440p and 4K for AMD.

    • @santmc12
      @santmc12 2 years ago

      @@ShadowGirl- NIS? Is trash

  • @LegendaryGodKing
    @LegendaryGodKing 2 years ago +13

    I've been with AMD since I was 12. I'm 26 now and I have to say: AMD, your price to performance is what I've always needed. FM2+ Athlon X2, AM3+ FX-8350 Black, AM4 Ryzen 3700X, and now I'm eyeing your new chips. AMD is the only thing pushing Intel to compete. AMD started from a clone, but they have worked hard to compete in the market. I'm a huge fan; they will ALWAYS be my first option for CPU choice, and forever have a spot in my heart from all the memories of gaming and overclocking. The GPUs are great too, just not for mining!

  • @argon6520
    @argon6520 2 years ago +4

    I think it should also be added that such tricks can make wonders for VR, especially with high-end headsets that put 4K monitors to shame with their resolution.

  • @nintendowiids12
    @nintendowiids12 2 years ago +2

    Played God of War with FSR 2.0 on a 6700XT. HUUGE visual / performance uplift over 1.0. I want FSR 2.0 in all the games!!

  • @CaptKornDog
    @CaptKornDog 2 years ago +6

    Wish there were more tips like this in general out there.

    • @elbowsout6301
      @elbowsout6301 2 years ago

      check out the lowspecgamer channels for how to get more from low to mid tier gear.

  • @kiefgringo
    @kiefgringo 2 years ago +6

    Thanks for the SAM tip. I was already using Sapphire's version of -RSR- Radeon Boost, Trixxx Boost (works better and allows more customization in my opinion, but limited to Sapphire cards I assume), but wasn't aware my 3600x and 5700xt could use SAM by simply tweaking the BIOS settings.

    • @sbrader97
      @sbrader97 2 years ago +2

      Trixx Boost isn't the same as RSR afaik; it's more like their version of Radeon Boost, and it only uses standard GPU upscaling from the lowered render res.

    • @sbrader97
      @sbrader97 2 years ago +1

      @John-Paul Hunt Idk if they could do FSR 2.0 at a driver level; it needs deeper integration into a game to have access to the colour and depth buffers, and it has to be done earlier in the render pipeline. RSR 1.0 works more like a post-processing filter, stretching and sharpening the final image.

    • @kiefgringo
      @kiefgringo 2 years ago +1

      @@sbrader97 Yeah, I messed that up. It's Radeon Boost that I dislike. It makes the image look terrible during movement in FPS games I play, whereas it seems Trixxx Boost just decreases the resolution and upscales it the entire time, so it's at least consistent.

  • @IggyBang
    @IggyBang 1 year ago

    Yoooo that S.A.M. Trick is killer. Thanks for the tech tips.

  • @kingdon9690
    @kingdon9690 1 year ago +1

    I'm running a 1080p Sceptre 165Hz 24-inch monitor with RSR on (not bad) and AMD FreeSync Premium enabled.

  • @syndicate7929
    @syndicate7929 2 years ago +174

    I tried this out the other day after seeing it in the bios for the first time (recently updated it). Went from averaging 210-230 fps with a 1660ti, to over 270. A great improvement :)

    • @Ali.Yousif
      @Ali.Yousif 2 years ago +5

      So SAM works even if you have an AMD processor and an Nvidia GPU?

    • @doggo531
      @doggo531 2 years ago +2

      @@Ali.Yousif I want to know too

    • @bl1nd_ness664
      @bl1nd_ness664 2 years ago

      @@Ali.Yousif I don't think so

    • @david._.jessup2604
      @david._.jessup2604 2 years ago

      How?

    • @donovan9783
      @donovan9783 2 years ago +9

      @@Ali.Yousif No, SAM only works with an AMD GPU and CPU. FSR, which is like Nvidia's DLSS, works with both Nvidia and AMD GPUs.

  • @IncognitoX8
    @IncognitoX8 2 years ago +26

    1:47 It's actually 230V and not 240V; that is the standard in Europe. It can usually fluctuate between 225V-235V. So for a standard 10A fuse, you can expect an output of 2300W. If you have a newer house (or an electrically renovated one), in my country, with cabling confirmed and approved for this, you can install 13A fuses instead, which takes the wattage up to 2990W for a single-phase group.

    • @GreenCinco12Official
      @GreenCinco12Official 2 years ago +7

      It's actually both, and more.
      There is everything between 220V and 240V...
      There is also everything between 110V and 127V...

    • @GrimK77
      @GrimK77 2 years ago +3

      Standard load for TN-C single phase circuit in EU household is rather 16A breakers (fuses are going out of fashion), so 3.6kW. And RCCB's, too for additional protection. Single phase household electric stoves are usually 2.9-3kW.

    • @iaadsi
      @iaadsi 2 years ago +3

      @@GreenCinco12Official Up to 253 V in Czech Republic (230 V ±10%). If you're close to the branch transformer, you'll be seeing over 250 V even when the street behind you is pulling heavily.
      Plus we have 400V three-phase nearly everywhere, even in old commie block apartments, for really heavy loads like induction cooking or car charging.

    • @jort93z
      @jort93z 2 years ago +3

      Here in Germany we usually use 16A fuses for outlets, and 6A-10A fuses for lights(normal light switches are 10A, so using any higher value would be risky).

    • @p_mouse8676
      @p_mouse8676 2 years ago +3

      Since P = I² × R, it also goes to show how much bigger the power losses are in 115/120V countries, or how much bigger the wires need to be.

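The arithmetic in this thread is just P = V × I (and I²R for wire losses). A quick sketch, with our own function names:

```python
def mains_power_watts(volts, amps):
    # P = V * I for a resistive load
    return volts * amps

def relative_line_loss(volts_low, volts_high):
    """How much more power the wiring dissipates at the lower voltage
    when delivering the same load power over the same wire (I^2 * R)."""
    # Same power means current scales as 1/V, so loss scales as (V_high / V_low)^2
    return (volts_high / volts_low) ** 2

print(mains_power_watts(230, 10))    # 2300 W on a standard 10 A fuse
print(mains_power_watts(230, 13))    # 2990 W on a 13 A fuse
print(relative_line_loss(115, 230))  # 4.0: a 115 V circuit loses ~4x more in the wires
```

That 4x factor is why lower-voltage countries need thicker wiring for the same delivered power.
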
  • @kessilrun6754
    @kessilrun6754 1 year ago

    I've done all of this but was curious as to what DOCP is. That was interesting. Never noticed that one myself, as I use an AMD CPU and a Steel Legend motherboard. I'll have to go check it out.

  • @hiimkxiv
    @hiimkxiv 5 months ago

    Thanks for this information Linus! Waiting for a 2nd-hand 6700 XT to be in stock so I can buy one. Sucks having to wait so long lol. I've been using my phone since December 23rd, I think.

  • @Cheesebread42
    @Cheesebread42 2 years ago +13

    I bought the 6950 XT last week for my RDNA2 system so this was a nice reminder.
    I tried to enable it as described though but the Compatibility Support Module must still be enabled to boot - apparently my boot drive was not automatically set up as UEFI when I built it the other year. Linus missed an opportunity to catch this hiccup. I now need to convert the Master Boot Record/Legacy drive to UEFI.... >_>

    • @AdaaDK
      @AdaaDK 1 year ago +1

      Your issue is not linked to SAM (or anything related to AMD, really). It's a Windows security thing (required for Win11) that you're running into, from what you're saying.

  • @PixelShade
    @PixelShade 2 years ago +13

    I actually thought you guys would do more of a deep dive into all the tech AMD has available in their software. Sure, you did talk about SAM, FSR/RSR, But there are other great technologies which are almost never mentioned by tech media and people often don't have a clue about. I am thinking about:
    - Radeon Chill (something for Alex hot room)
    - Radeon Boost
    - Radeon Image Sharpening.
    Radeon Chill & Radeon Boost are both movement-based framerate compensations. Chill basically lowers the framerate when you're standing still (to your lowest set value), instantly increases it when moving forward (an in-between value of your min. & max.), and when you are turning, when you need maximum framerate, you get just that. This greatly improves power consumption, and with a FreeSync monitor you barely notice a difference from running full tilt (unless it's a highly competitive title like CS:GO where split-millisecond reaction is king).
    Radeon Boost, on the other hand, reads mouse movement and downscales the resolution depending on how fast you are turning. This is awesome in theory, as you don't have the same perception of detail in motion. It is not available for all games, though; it does work in Cyberpunk and Metro Exodus, to name a few. Unfortunately Radeon Boost is a bit harsh with nearest-neighbor scaling (I wish it was tightly integrated with FSR), and you can't use it together with Radeon Chill, which seems like a missed opportunity.
    As for Image Sharpening (although part of FSR/RSR), it can be activated individually for a next-to-non-existent performance impact. It almost acts as a resolution bump, especially in soft TAA games (textures at 1080p with Radeon Sharpening can almost appear more detailed than 1440p without sharpening), yet it doesn't produce ringing artifacts. This is great when you want to increase perceived sharpness without affecting performance much.

    • @lake5044
      @lake5044 2 years ago +1

      Quite interesting! Especially for people who don't own an AMD gpu and don't know such things even exist. Thanks!

    • @Dark88Dragon
      @Dark88Dragon 2 years ago +2

      Yeah, you're right... from my observations, people with AMD hardware are a little bit more into the tech than the average Intel, Nvidia or Apple user.

    • @PixelShade
      @PixelShade 2 years ago +2

      @@lake5044 It's actually kind of crazy that AMD isn't promoting these unique features more, and investing more time in developing them. Like I mentioned, Radeon Chill alongside Radeon Boost are fantastic features, but unfortunately Radeon Boost still feels a bit half-baked. Right now it can't be activated together with Radeon Chill, and it uses nearest-neighbor scaling, which is a bit jarring. If this was connected to the in-game FSR pipeline (away from HUD and post-processing) it would be such a killer feature: you would basically get the FPS where you absolutely need them (when things get chaotic and you are turning a lot), and you would save on power/heat and gain fidelity when watching cutscenes, standing still watching scenery, or just moving forward (where you don't have a lot of motion). :)
      A fully baked, smart solution where you can combine Radeon Chill, Radeon Boost and FSR 2.0 would honestly be a REAL killer feature.

    • @CameraObscure
      @CameraObscure 2 years ago +2

      I use Chill in most games I play and have done since it came out. It saves power and keeps my GPU cooler without affecting gameplay for the games I play; even better, it's not tied to any specific game/engine. Also, you can set features on or off for each game in your library separately from the GPU's global settings, meaning you have no need to reset them each time you play that game, as it's stored in the driver already.

    • @PixelShade
      @PixelShade 2 years ago +1

      @@CameraObscure Me too, awesome to hear that more people are using it! It saves me around 60W on a Ryzen 2700 and a 6600 XT (which isn't a very power-hungry combo in the first place). Taking Witcher 3 as an example, I have opted for a spectrum of 48fps to 96fps. When standing still (if I watch a cutscene, get a phone call, fetch a drink or snacks, make some lunch) the computer only draws 63W (48fps). When walking forward (which you do for most of the game) it draws 91W (65fps), and when I turn the camera it draws 144W (96fps). I wouldn't actually "feel" any difference on a FreeSync display using this framerate spectrum compared to a locked 96fps, and the power saving is around 60W. I mean, I could go full tilt at 120fps+; in that case the computer would draw 177W... So really the power savings are about 90-100W and the gaming experience is still really awesome. Saving around 100W is actually substantial for room heat, and with the European electricity market it's a substantial cost saving too.

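The Radeon Chill behavior described in the thread above boils down to picking a framerate target from camera activity. A toy model (our own sketch of the min/mid/max logic, not AMD's actual driver code):

```python
def chill_target_fps(moving, turning, fps_min=48, fps_max=96):
    """Pick a framerate cap from input activity, Radeon-Chill style."""
    if turning:          # fast camera movement: full framerate
        return fps_max
    if moving:           # walking forward: something in between
        return (fps_min + fps_max) // 2
    return fps_min       # idle / cutscene: minimum framerate

print(chill_target_fps(False, False))  # 48
print(chill_target_fps(True, False))   # 72
print(chill_target_fps(True, True))    # 96
```

With a variable-refresh (FreeSync) display smoothing over the transitions, the card only draws full power during the turns, which is where the power savings quoted above come from.
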
  • @JustMartha07
    @JustMartha07 1 year ago

    Yeah, I've been doing undervolts for years. First I OC to check how much I can boost at lower power consumption, and then I undervolt.

  • @katerhamnorris3936
    @katerhamnorris3936 1 year ago +4

    Here in Germany we get a max of 3650 watts out of the socket, but we have 230-volt outlets at 50Hz.

  • @waynemiller3187
    @waynemiller3187 2 years ago +8

    This video really needs a warning about MBR partitioning and disabling CSM compatibility mode, because if you're not careful you can easily make your boot disk unbootable, and the only way to recover is to reset your CMOS.

  • @Bobbias
    @Bobbias 2 years ago +20

    People here need to understand a few things about optimization:
    1: you typically want to avoid premature optimization, because you may waste considerable time on code that has little to no impact on performance.
    2: optimized code is quite often much more difficult to write and reason about, so whenever possible it's better to write slow but clear code if anyone is going to have to see that code in the future (unless it absolutely must be optimized).
    3: optimizing everything will take significantly longer than it already takes to release products.
    4: any bugs in optimized code will be much harder to track down and fix due to the nature of optimized code.
    5: games simply cannot be optimized for PCs the same way they can be for consoles, due to the fixed hardware in consoles compared to the very large variation in hardware across PCs.
    That's not to say that things cannot be better optimized than they are now; there are cases where little effort could yield reasonable gains. But it does mean that if we want our code to be better optimized in the future, we need to be willing to deal with the increased production time and costs associated with taking the time to more heavily optimize things.

    • @Sightbain.
      @Sightbain. 2 years ago +4

      I don't disagree with anything you've said, but one of the things people mean when they say "it should be optimized!" is the lack of compression for files; the insane size some of these games require is just bonkers. Remember when Titanfall launched without compressed audio and it was like 60 GB and everyone lost their minds, and they went back and compressed it and did some other "optimization", like downloading only local languages etc.? That is the sort of thing people can see and understand, as well as some obvious things like having better occlusion, or terrain that isn't actually giant blocks mashed together that stick through the ground and eat up resources. Most gamers aren't talking about very specific code implementation, but rather less subtle general laziness, or the biggest one of all: utilizing more than 1-2 cores, which can be either a dev issue or the engine they're using. So I agree it's not so black and white, but some things are.

    • @chuzzle44
      @chuzzle44 2 years ago

      Very good points. Though I do feel the need to point out that I would never expect developers to optimize games on PC as well as they do on consoles. For me, it's not so much about optimizing as it is about experience. I don't personally care if a game runs a few fps slower than it's console counterpart. However, if enabling vsync causes the game to stutter badly, or if performance drastically drops upon loading a new area, or if the game is just completely broken on release, I'm going to complain. With a plethora of mature engines and techniques available, there is no excuse for not only releasing games in an unfinished state, but leaving them that way for years.

    • @peterw1534
      @peterw1534 2 years ago +3

      So basically it's hard. See doom eternal. They did it somehow. (Twice really, 2016 was also polished af)

    • @frozenturbo8623
      @frozenturbo8623 2 years ago

      @@peterw1534 Doom has way smaller maps and isn't as demanding on graphics.

    • @ashlyy1341
      @ashlyy1341 2 years ago +1

      i for one would be happier w/ fewer, but higher quality (and better optimised), releases. the same w/ tv shows and films tbh. AAA games, Hollywood films etc. are treated as disposable. the reason why is obvious: profit (make the flashiest, most hyped, thing to get the most sales, then release the next one quick as to counteract dropoff)
      anywho, your points are thoughtful and i agree with them overall. people forget that one can't optimise for every circumstance (such as hardware/platform) all at once - that's just a general purpose application. however, making use of standards that exist (especially as consoles are now x86) is always good. other things include: better data compression (pick an algorithm that balances speed and ratio), intelligent asset streaming (including handling what to keep in vram), smarter level design (eg stop making everything an entity when it can be a static and decorative), consideration of lighting and rendering - how best to employ different types of culling etc.
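The "pick an algorithm that balances speed and ratio" point above can be sketched with Python's stdlib zlib, whose compression levels expose exactly that trade-off (the payload below is a made-up, highly repetitive stand-in, not a real game asset):

```python
import zlib

# Hypothetical stand-in for game asset data.
data = b"footstep_snow_01.wav" * 5000

fast = zlib.compress(data, level=1)   # favors speed
small = zlib.compress(data, level=9)  # favors ratio, costs more CPU time

print(len(data), len(fast), len(small))
# Both levels decompress to identical bytes; only time and size differ.
assert zlib.decompress(fast) == zlib.decompress(small) == data
```

Real pipelines weigh the same trade-off with heavier codecs, but the principle is the one the comment describes.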

  • @jkingston1410
    @jkingston1410 1 year ago

    The most thorough yet entertaining tech tips. Thanks

  • @fabrb26
    @fabrb26 1 year ago +1

    Should note that with FSR enabled you keep the higher in-game resolution, and so do textures etc...
    RSR, on the other hand, wants you to set the lower in-game resolution that the algorithm will upscale from - textures and all.
    My point is: if you're limited on GPU VRAM, upscaling to 1440p or even 4K may be achievable while native 4K is a complete no-go because VRAM is maxed out.
    For me, RSR makes using my 60 Hz 4K screen a reality on my R5 5600/6500 XT (4 GB) setup

  • @imopapa6680
    @imopapa6680 2 years ago +58

    Undervolted my gpu because it was getting too hot. It also increased the fps in some of my games. I feel like undervolting is too underrated.

    • @OmniUni
      @OmniUni 2 years ago +15

      Yup. I actually wish they'd delve more into the various options that Adrenalin gives you. AMD's control center is REALLY nice, and makes it very simple to both undervolt AND overclock if you so choose.

    • @RetroPlus
      @RetroPlus 2 years ago +2

      Yeah it's fantastic

    • @thetruthisoutthere5173
      @thetruthisoutthere5173 2 years ago +4

      Extremely easy and underrated unfortunately

    • @ffwast
      @ffwast 2 years ago +6

      The manufacturers really configure these things for e-peen instead of a reasonable power-to-performance balance.
      The top 150 watts of the 3090 Ti's power spec only buys about the top 10% of the performance: 2/3 the power for 9/10 the fps in benchmarks, using the power settings from the equivalent professional Quadro-type card.
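Taking the comment's rough figures at face value (they're illustrative, not measurements; the stock numbers below are made up), the perf-per-watt arithmetic works out like this:

```python
# Commenter's rough ratios applied to hypothetical stock numbers.
full_power, full_fps = 450.0, 100.0   # assumed stock draw and framerate
capped_power = full_power * 2 / 3     # "2/3 power"
capped_fps = full_fps * 0.9           # "9/10 fps"

stock_eff = full_fps / full_power     # fps per watt at stock
capped_eff = capped_fps / capped_power

# 0.9 / (2/3) = 1.35, i.e. ~35% better perf-per-watt at the lower limit.
print(f"stock: {stock_eff:.3f} fps/W  capped: {capped_eff:.3f} fps/W")
```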

    • @jjcdrumplay
      @jjcdrumplay 2 years ago +1

      It works better for me too to only use a little bit of MSI overclock profile 1 - maybe profile 2's system RAM overclock wasn't stable, or maybe it was the AMD drivers that got better over the last two years. With this PC I'm always learning something.

  • @Ray13star
    @Ray13star 2 years ago +5

    RSR on my xfx rx 5700xt thicc iii is great at 1080p. The image quality that comes out is similar to 2k native in most games that I play. While there is shimmering that occurs, it isn't much of a distraction (since I don't play competitive esport games).

    • @gotworc
      @gotworc 9 months ago

      @@elcactuar3354 it's not: 1080p is Full HD, 1440p is 2K, and 3840×2160 is 4K

  • @Mandaeus
    @Mandaeus 2 years ago

    If you look at the top right of the top line bios menu bar @4:13 you will see the option "Resize BAR". This is the option you need. Was there on mine ;)

  • @Aleksaa944
    @Aleksaa944 1 year ago

    Thanks Linus, I couldn't find it in the BIOS until I saw your video. Best youtuber 💪💪🤠

  • @probablyyourneighbororsome8412
    @probablyyourneighbororsome8412 2 years ago +6

    r9 380x user here: I don't want a faster gpu, I want a supported one that doesn't have dozens of graphics errors all the time!
    THE DAMN THING IS ONLY 6 YEARS OLD!

    • @sayacee5813
      @sayacee5813 2 years ago +3

      nimez driver

    • @chestermc9954
      @chestermc9954 2 years ago

      There are unofficial drivers for the 380x that keep it supported. Also I'm pretty sure that the Linux drivers are still fully supported.

  • @gluteusmaximus7608
    @gluteusmaximus7608 2 years ago

    Good vid! Gonna try SAM now

  • @RIOTStyx
    @RIOTStyx 5 months ago

    Years later and that intro animation still leaves me amazed

  • @-_-_-_-_
    @-_-_-_- 2 years ago +10

    It's really amazing that we've developed technology so far that the universe itself is posing hard limitations on us. Imagine telling someone in the 1930s that by 2022 we will have developed technology so far that the laws of physics start to break down (well not really but it sure feels like it) and we can't really progress any further.

    • @philmccracken2012
      @philmccracken2012 2 years ago +3

      Mina......What are you babbling on about? What are you even referring to?

    • @coolghoul9
      @coolghoul9 1 year ago +4

      @@philmccracken2012 I think they are referring to the 120v outlet the universe gave to us, if only there was a bigger outlet to be discovered

  • @DDRWakaLaka
    @DDRWakaLaka 2 years ago +3

    0:40 I'm still running an R9 Fury today with nimez drivers -- still handles everything at 1080p like a champ (provided I'm not going over 4GB VRAM)

    • @rrekki9320
      @rrekki9320 2 years ago +1

      Aged better than the 960/70 lol

    • @DDRWakaLaka
      @DDRWakaLaka 2 years ago +1

      @@rrekki9320 wish I didn't have to rely on 3rd party drivers, but that's *definitely* true

  • @bricefleckenstein9666
    @bricefleckenstein9666 6 months ago

    1:49
    The USA actually runs on 240-volt power - but "split-phase", which gives 120 at many of our outlets.
    If you want 240, it's fairly trivial as long as you can run the cable from the breaker box - and most computer power supplies for the last couple of DECADES have been designed to handle anything from around 100 to 250 volts with the right power cord.

  • @shadowsandfire
    @shadowsandfire 2 years ago

    we get 2400 W per socket here in Aus: 230-240ish V @ 10 A per outlet :)

  • @djsnowpdx
    @djsnowpdx 2 years ago +6

    You’re not the only one, Alex! I haven’t attempted undervolting yet but I slash my CPU and GPU power limits to their minimum values (98 watt 5700XT and 87 watt 5950X) when the outside temperature gets warmer than inside.

    • @joel3399
      @joel3399 2 years ago

      I just have an RX 570, but the heat bothers me so much that in summer I usually play on my laptop's integrated graphics

    • @rrekki9320
      @rrekki9320 2 years ago

      @@joel3399 Tried repasting and freeing it of dust? Otherwise I can recommend takin off the shroud, puttin two 120mm Arctic P/F12 on it with Zipties and ya done :D

    • @CarbonPanther
      @CarbonPanther 2 years ago +1

      A smart tip for the 5700 XT is to reduce the clock slider instead of the power limit.
      You'll save about as much on power consumption, but you'll keep much more of the performance that you'd otherwise lose by hampering the card's power limit!
      Try running the card at 1600 MHz, look at the FPS before and after, and please tell me the results!

    • @-eMpTy-
      @-eMpTy- 2 years ago

      @@rrekki9320
      lowering the temps by repasting and/or adding fans doesn't decrease the heat output to your room

    • @CarbonPanther
      @CarbonPanther 2 years ago

      @@-eMpTy- I know this will be heavily debated, but I firmly believe that the cooler the chip runs, and the lower the surface temperature of the heatsink, the colder the air will be that is expelled from the case into the ambient room.

  • @sudl5346
    @sudl5346 2 years ago +11

    For those having problems with "No bootable device found" after activating ReBAR:
    Make sure that Secure Boot is disabled (restart after disabling it), then enable CSM and it should be fine again.

    • @mattgrabowski702
      @mattgrabowski702 2 years ago

      I just straight activated it, got a black screen, and then no input detected on my monitor. Had to clear the BIOS to get the picture back. Gonna give this method a try next. Thanks!

    • @sudl5346
      @sudl5346 2 years ago +1

      @@mattgrabowski702 Tbh, I'm not sure if it will fix the problem you encountered.
      The problem I mentioned occurs when the system drive is in the wrong "format": ReBAR and/or Above 4G Decoding (not sure if both or just one of them) needs the drive to be GPT instead of MBR for it to be detected as bootable.
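If you want to check which partition style a drive uses before flipping BIOS switches: GPT disks carry a "protective MBR" whose first partition entry has type 0xEE. A minimal sketch of that check (the 512-byte sector below is synthetic, not read from a real drive):

```python
def partition_style(sector0: bytes) -> str:
    """Classify a 512-byte boot sector as GPT, MBR, or unpartitioned."""
    if sector0[510:512] != b"\x55\xaa":
        return "no valid partition table"
    # Partition entries start at offset 0x1BE; the type byte sits at entry offset 4.
    types = {sector0[0x1BE + 16 * i + 4] for i in range(4)}
    return "GPT (protective MBR)" if 0xEE in types else "MBR"

# Synthetic example: a protective-MBR sector as written by GPT partitioning tools.
sector = bytearray(512)
sector[510:512] = b"\x55\xaa"
sector[0x1BE + 4] = 0xEE
print(partition_style(bytes(sector)))  # GPT (protective MBR)
```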

  • @bootyjuice87
    @bootyjuice87 5 months ago

    I'd love to see an update to this video!!

  • @JB-fh1bb
    @JB-fh1bb 2 years ago

    @6:12 you just sold me on floatplane

  • @landoishisname
    @landoishisname 2 years ago +4

    Important to note that SAM can actually reduce performance in some cases

  • @Fredjikrang
    @Fredjikrang 2 years ago +60

    A quick FYI, most breakers are only actually rated to run 80% of their rated capacity continuously, so a 15A circuit is actually only rated to power 1,440W for longer periods of time.
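The 80% figure above is the standard continuous-load derating; as arithmetic (North American 120 V assumed as the default):

```python
def continuous_watts(breaker_amps: float, volts: float = 120.0,
                     derate: float = 0.8) -> float:
    """Max continuous load for a breaker under the common 80% rule."""
    return breaker_amps * derate * volts

print(continuous_watts(15))  # 1440.0 - matches the 1,440 W figure above
print(continuous_watts(20))  # 1920.0
```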

    • @Phynellius
      @Phynellius 2 years ago

      That, and if it's your own house, a 15 or 20 amp 240-volt circuit is easy to set up - just get it done by someone qualified or have it inspected. Unfortunately many consumer UPS systems are 120 V only

    • @m0r73n
      @m0r73n 2 years ago +12

      Laughs in 230V

    • @Jaker788
      @Jaker788 2 years ago +5

      That's why every device you buy that's intended for more continuous use is max 13 amps. Corded lawn mowers and space heaters are 13 A max, but blenders are 15 A, or 1875 W, max.

    • @igobyoz
      @igobyoz 2 years ago +3

      But a 20 amp circuit is fine at 16 amps, or approx. 1920 W -> making the 1800 W range just fine all day long.

    • @serdarcam99
      @serdarcam99 2 years ago +4

      Imagine you can't open your computer cuz of 110 freedom volts

  • @Vanilla_Icecream1231
    @Vanilla_Icecream1231 2 years ago

    BTW, in Canada's electrical code circuits are rated for 80 percent of the breaker rating, so if you have a 15 amp breaker your load shouldn't be higher than 1,440 watts. There are exceptions, such as heating circuits, which can be rated at 100%

  • @Magnulus76
    @Magnulus76 2 years ago

    I try to stick to 100-130 watt card because I live in Florida and I don't need all the extra heat. Optimizing the driver and game settings is a must under those conditions.

  • @filleswe91
    @filleswe91 2 years ago +3

    "while you were sneakily watching Newgrounds on the school computers."
    You got me Linus. You got me good. This was way back in 2001 for me, 10 years old at the time. 👍

    • @Marco-pf3te
      @Marco-pf3te 2 years ago

      Same, resonated for me! Was also 10 at that point in time :)

  • @jaredvillhelm2002
    @jaredvillhelm2002 2 years ago +26

    AMD going all in to be top dog, really excited to see where this pushes computing!

  • @joannecunliffe4874
    @joannecunliffe4874 2 years ago

    You might find the ARM-based Firefly ITX-3588J board quite interesting. It's NEARLY a game changer. Any chance of a review? If only they had put a full ATX power connector on the board, like the Axzez Interceptor board (which takes the Pi CM4).

  • @marcasswellbmd6922
    @marcasswellbmd6922 2 years ago

    It's nice to see you go back to your roots, which were showing people how to get the most out of their tech. Just a shame someone has to sponsor the vid for you to do it.

  • @NFG-Nero
    @NFG-Nero 2 years ago +10

    "You wanna faster gpu?"
    No Linus
    i just want a gpu

    • @lvl5monk297
      @lvl5monk297 2 years ago +1

      GPUs are basically back at msrp on ebay. Get with the times lmao the shortage is essentially over

    • @NFG-Nero
      @NFG-Nero 2 years ago +1

      @@lvl5monk297 not here, where inflation exceeds 12% and a +23% tax has to be paid :/

  • @GasolineNl
    @GasolineNl 1 year ago +13

    With my new 6700 XT at 1440p ultrawide I gain 2 to 5 fps in Assetto Corsa. Playing at around 110/120 FPS.

    • @DeEchteZeus
      @DeEchteZeus 1 year ago

      that's it?

    • @10vingers
      @10vingers 1 year ago

      @@DeEchteZeus It's marginal. But it's more, so he did not lie ;)

    • @Skelterbane69
      @Skelterbane69 1 year ago

      6700XT gang

  • @janynep.8944
    @janynep.8944 2 years ago +1

    3:16 I really thought he was about to say “Mr. Mushroom Cloud”

  • @Roger_Diz
    @Roger_Diz 2 years ago

    In Assetto Corsa Competizione, FSR does a great job getting me great framerates on my old Vega 64 at 1440p with unnoticeable image loss.

  • @nukedathlonman
    @nukedathlonman 2 years ago +5

    With SAM, you/AMD forgot one little hitch - it's not quite as simple as flipping a UEFI switch if one isn't using GPT partition tables. Yes, I'm working on that migration as I type this. :-)

    • @bigmattvo
      @bigmattvo 2 years ago

      yeah that's what I'm struggling with. How do you migrate without losing your data? Does that mean I have to rebuy Windows?

    • @mirano15
      @mirano15 2 years ago

      I think for the majority of people its not a major issue. Who uses MBR or NTFS these days in a regular system?

    • @nukedathlonman
      @nukedathlonman 2 years ago

      @@bigmattvo No, but ensure you have a USB boot key. YouTube doesn't allow me to post links, but there is a good walkthrough on Microsoft's site for how to go about it. It's titled "Convert an MBR disk into a GPT disk"
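For anyone following that walkthrough: it centers on the MBR2GPT tool that ships with Windows 10/11. The usual sequence, run from an elevated prompt after backing up (the `/disk:0` here is a placeholder for your actual system disk number), is roughly:

```
:: Dry run - nothing is changed if validation fails.
mbr2gpt /validate /disk:0 /allowFullOS

:: Convert in place, then switch the firmware from CSM/Legacy boot to UEFI.
mbr2gpt /convert /disk:0 /allowFullOS
```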

    • @nukedathlonman
      @nukedathlonman 2 years ago +1

      @@mirano15 One who has only been upgrading the same system for many, many, many years (and has never needed to reinstall Windows (or Linux)). ;-)

    • @mirano15
      @mirano15 2 years ago

      @@nukedathlonman My condolences. How many years has that system been running?