HW News - RTX 5090 & 5080 Leaks, Valve ARM64 Experiments, Intel Arc 0% Marketshare, 4090 Price

  • Published: Dec 17, 2024

Comments • 1.7K

  • @GamersNexus
    @GamersNexus  2 months ago +144

    The fab tour was extremely educational for us! Loved working on the video. Please try out the format and let us know what you think! ruclips.net/video/IUIh0fOUcrQ/видео.html
    Or check out our Corsair i500 review: ruclips.net/video/Gqm4V-8F-7k/видео.html
    Grab the factory tour t-shirt here: store.gamersnexus.net/products/skeletal-factory-t-shirt-foil-blue-silver

    • @SLAYERSARCH
      @SLAYERSARCH 2 months ago

      Why are they wasting time on ARM64? Their client is unskinnable,
      their newly added Families code is missing features,
      and their moderation team varies from bots, to can't read, to can't understand context.
      I have been unfairly banned... for 3 years, over gang stalking alone,
      meaning I cannot edit my account or share content on Steam now for 3 years... it's unfair and they're not even trying to look into it.
      And their moderator is clearly off his bracket, and they're pissing around and playing with ARM64... what a waste!!!

    • @SLAYERSARCH
      @SLAYERSARCH 2 months ago

      people can barely afford 40XX hardware... O_O

    • @zivzulander
      @zivzulander 2 months ago +2

      That video was extremely well done. I'm glad they gave you access, even if it meant so many weeks editing and blurring out all that confidential stuff. 😅 Very cool to see the type of tech and machinery that gets used on operations of that scale.

    • @SLAYERSARCH
      @SLAYERSARCH 2 months ago

      Steam... no skinning... no moderators... people being gang stalked... Steam Families missing important features... why waste time on Proton?

  • @ScottGrammer
    @ScottGrammer 2 months ago +2432

    4:46 The 5090 will indeed be a two-slot card. The two slots will be slot one and slot five.

    • @GamersNexus
      @GamersNexus  2 months ago +533

      hahahaha. Clever.

    • @devins7
      @devins7 2 months ago +19

      @@GamersNexus no, I think this is actually gonna be what it turns out to be

    • @ΔαυίδΠριδεαυξ
      @ΔαυίδΠριδεαυξ 2 months ago +53

      @@GamersNexus Maybe the "2-slot" leaks are just the server blower style 5090s?

    • @dolpoof2335
      @dolpoof2335 2 months ago +14

      The 4090 might have been slightly overbuilt, so they could just be packing the 5090 full of fins and making it 2 slots, accepting that it would run hot.

    • @Morpheus-pt3wq
      @Morpheus-pt3wq 2 months ago +22

      I hope they will add some anti-sag brackets. Otherwise northwestrepair will have lots of hard work in front of him fixing all the broken GPUs with rejected RMAs...

  • @thatzaliasguy
    @thatzaliasguy 2 months ago +832

    Proton still uses WINE. Proton is a containerized compatibility layer that includes WINE, DXVK, and VKD3D (among other compatibility prerequisites).

    • @GamersNexus
      @GamersNexus  2 months ago +665

      Awesome. Thanks for the background on this. We don't follow the Linux world very closely -- though I will say that I've been a lot more interested in it with Windows' terrible decisions the last few years. Maybe time to start studying and deploying it in testing!

    • @thatzaliasguy
      @thatzaliasguy 2 months ago

      @@GamersNexus Man, that would be absolutely amazing if you guys started doing Linux testing! Set up a solid Arch Linux bench (check out CachyOS for a stupid-simple way to get Arch up and running without having to do the whole default setup process from the AUR) and do the normal comparisons.
      Who knows, it might make you fall in love with computing all over again!

    • @bonkerbanker
      @bonkerbanker 2 months ago +111

      @@GamersNexus Well I think it's time for "the mainstream" to at least follow along with what happens in the Linux world.

    • @ZombieJig
      @ZombieJig 2 months ago

      Yes please @@GamersNexus

    • @markclayton8977
      @markclayton8977 2 months ago

      @@GamersNexus DXVK is a DirectX-to-Vulkan translation library. You can even use it on Windows to improve performance in older DX9/10/11 games.
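
To make the DXVK-on-Windows tip concrete: the usual manual install is just placing DXVK's replacement DLLs next to the game's executable, so the game loads them instead of Microsoft's D3D runtime. A minimal sketch, assuming a 64-bit game and hypothetical paths (the DLL names match what DXVK releases ship):

```python
# Sketch of a manual DXVK install on Windows; both paths are assumptions.
import shutil
from pathlib import Path

dxvk_x64 = Path(r"C:\Downloads\dxvk-2.4\x64")  # extracted DXVK release (assumed)
game_dir = Path(r"C:\Games\SomeOldDX9Game")    # folder with the game .exe (assumed)

# DXVK ships one drop-in DLL per DirectX interface it translates to Vulkan.
for dll in ("d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"):
    src = dxvk_x64 / dll
    if src.exists():
        shutil.copy2(src, game_dir / dll)  # the game now loads DXVK's DLLs
        print(f"installed {dll}")
```

Deleting the copied DLLs reverts the game to the stock DirectX runtime.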

  • @sdovhfunlahsvisegbakshfjbs4621
    @sdovhfunlahsvisegbakshfjbs4621 2 months ago +1495

    NVIDIA should introduce rounded models so that it does not hurt them that much when putting it where it belongs...

    • @GamersNexus
      @GamersNexus  2 months ago +626

      But where does it bel--- ooooh. I get it.

    • @Pickelhaube808
      @Pickelhaube808 2 months ago +72

      Exactly! I don't want to scratch my case up with these massive cards.

    • @GeoStreber
      @GeoStreber 2 months ago +283

      Apply thermal paste beforehand. That helps. Watch that The Verge video to determine the required amount of thermal paste.

    • @devins7
      @devins7 2 months ago +37

      @@GeoStreber LMAOOOOO

    • @Chibicat2024
      @Chibicat2024 2 months ago +13

      That is savage my dude

  • @jackali5014
    @jackali5014 2 months ago +543

    Can you please investigate Gigabyte's misleading practices regarding their AM5 motherboards? Specifically, the B650 Gaming X. They sell different revisions under the same model name, but the critical difference is that rev 1.3 only supports the Ryzen 7000 series, while rev 1.5 supports the 8000 and 9000 series. When purchasing these motherboards, Gigabyte doesn’t clearly mention the revision number or clarify that different revisions exist, even though they have vastly different compatibility. This lack of transparency is misleading to consumers, as the model names are identical, and buyers might not realize they're purchasing an older revision that limits future upgrades.

    • @GamersNexus
      @GamersNexus  2 months ago +472

      Thanks for bringing this one to our attention. Will try to find time to look into it, but we are pretty backed up right now on investigations. Can't promise this one only because we really are overloaded to a point where we need to pick and choose. It's tough since a lot of the deep dives require some serious time commitment to prove irrefutably.

    • @drewnewby
      @drewnewby 2 months ago +12

      Is that board even sold still? The revisions should be separate SKUs, how is that misleading? Who did you buy it from, I'm assuming not Gigabyte directly. The current Gaming X AX and Gaming X AX V2 are always listed separately.

    • @joemarais7683
      @joemarais7683 2 months ago +48

      @@drewnewby First off, they’re talking ONLY about the Gaming X AX, not the V2. Secondly, no one buys directly from Gigabyte. Every retailer that lists that board does not list the revision. That’s the problem, and that’s Gigabyte’s problem when they release 5 B650 Gaming X AX boards, all named the same, with extremely varied specs. And yes, the board is still sold now, and if you order on Amazon, you have no way of knowing which revision you get.

    • @jeffb.6642
      @jeffb.6642 2 months ago +18

      That seems very shady. It would be normal for an earlier revision to not support future chips without a BIOS update, but yeah, you're right: I just looked at their support site, and under CPU support the 9000 series isn't even listed unless you select rev 1.5! That's what the kids would call "SUS AF", I believe...

    • @jackali5014
      @jackali5014 2 months ago +8

      @@GamersNexus Thank you

  • @Karti200
    @Karti200 2 months ago +220

    As a daily Arc user since release... I won't lie,
    it hurts to hear that

    • @P7ab
      @P7ab 2 months ago +95

      You're in the top 0% though, that's something

    • @Winnetou17
      @Winnetou17 2 months ago +25

      @@P7ab Pretty sure that the 0% is in sales, not the whole market share. A market of dozens of millions doesn't simply swing 5% from X to Y in several months, when we know there are people still running 6-7 year old cards out there.

    • @futuza
      @futuza 2 months ago +14

      The good news is that in 20 years it'll probably be worth a ton of money as a collector's item that will be extremely rare and hard to find.

    • @samuelschwager
      @samuelschwager 2 months ago +6

      hang onto it, it will be a collectible ;)

    • @Karti200
      @Karti200 2 months ago +13

      @@samuelschwager I got 3 Arcs already xD
      both the A750 and A770 in "LE"
      and a Sparkle A580 xD

  • @TheRenalicious
    @TheRenalicious 2 months ago +312

    I'm still blown away by how much cache those X3D chips have. Like that's enough to fit old 90's games onto the CPU and never have to touch main memory. Hell, that's probably enough cache to run Windows 9x without main memory!

    • @GGigabiteM
      @GGigabiteM 2 months ago +65

      The minimum requirement for Windows 95 is 4 MB, and 16 MB for 98/98SE; you could run dozens of copies of the former and a few of the latter entirely in cache. And yes, you could run games up into the late 90s on that much cache.
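
The arithmetic behind that claim is easy to check, using the 7800X3D's 96 MB of total L3 (a public spec) against the official Windows 9x memory minimums quoted above:

```python
# How many minimal Windows 9x memory footprints fit in a 7800X3D's L3 cache.
x3d_l3_mb = 96        # 7800X3D total L3: 64 MB stacked V-Cache + 32 MB on-die
win95_min_mb = 4      # Microsoft's minimum for Windows 95
win98_min_mb = 16     # Microsoft's minimum for Windows 98/98SE

print(x3d_l3_mb // win95_min_mb)  # 24 -> "dozens of copies" of Win95
print(x3d_l3_mb // win98_min_mb)  # 6  -> "a few" copies of 98/98SE
```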

    • @HolyRaincloud
      @HolyRaincloud 2 months ago +5

      Good luck getting that to work though lol

    • @pixelfairy
      @pixelfairy 2 months ago

      @@HolyRaincloud You could run FreeDOS or boot Linux to auto-start dosemu. Some distros still support rc.local so you don't have to make a systemd service.

    • @johnmclain250
      @johnmclain250 2 months ago +28

      And it's STILL beneficial to have more for some modern games. It's incredible how much of a difference cache size makes for gaming performance. I mean, the 7800X3D is actually a fairly low-performance processor, MUCH weaker than Intel CPUs. But that V-Cache has it running LAPS around anything Intel has. It's the absolute undisputed KING for gaming right now in 90% of games.

    • @jackpowell8155
      @jackpowell8155 2 months ago

      @@johnmclain250 I have one, it's amazing for CPU-dependent games/sims. I kept my 5800X3D system together too, for the nostalgia.

  • @christianvetter2906
    @christianvetter2906 2 months ago +43

    9:25 One important piece of additional information on this topic... FE cards are not officially sold in Germany. I tried to get one at the launch of the 40 series. Not a single online retailer was listing them.

  • @Noxeus1996
    @Noxeus1996 2 months ago +156

    Saying Proton is not a second-class citizen vs. Wine is misleading. Proton *is* Wine, just containerized and packaged in a user-friendly way. And while Valve has contributed to the overall ecosystem, if anyone should be given credit for the Proton 3D performance it should be people like Doitsujin who wrote the absolutely critical DXVK software for translating DirectX to Vulkan.

    • @Henrik_Holst
      @Henrik_Holst 2 months ago +25

      Don't forget that they also add a huge list of patches on top of WINE, some of which will never be included upstream.

    • @BOZ_11
      @BOZ_11 2 months ago

      @@Henrik_Holst well that's awful, especially since on a long enough timeline everybody is going to lose their Steam library

    • @Henrik_Holst
      @Henrik_Holst 2 months ago +14

      @@BOZ_11 I have no idea what you are talking about, care to elaborate? Everything in Proton is open source, so the fact that not all patches are upstream has zero impact on the future; anyone can patch their own WINE, and there are lots of other projects that apply the same patches to WINE and also apply their own patches that are not in Proton on top of that.

    • @Mr.Genesis
      @Mr.Genesis 2 months ago +12

      @@BOZ_11 It's in the Steam ToS that Valve will give people the ability to download their games to personal storage if Steam ever ceases functioning. Please read the Steam Terms of Service before you speak about things like this.

    • @BOZ_11
      @BOZ_11 2 months ago

      @@Mr.Genesis People who think any given ToS can always be legally enforced should read more before speaking about such things.

  • @Ale-ch7xx
    @Ale-ch7xx 2 months ago +210

    Cool, I am one of the zero percent market share with my Arc A580 :)

    • @luna775
      @luna775 2 months ago +14

      Same, good cards to get through these dark ages tbh

    • @trueNahobino
      @trueNahobino 2 months ago +31

      Yep I also don't exist with my A770 over here :))

    • @connivingkhajiit
      @connivingkhajiit 2 months ago +21

      Sparkle A770 here!!

    • @EnvAdam
      @EnvAdam 2 months ago +16

      Same with my A770, pretty happy with it almost 2 years on now.

    • @elecman748
      @elecman748 2 months ago +3

      o7

  • @DankbeastPaul
    @DankbeastPaul 2 months ago +167

    28:52 Motherboard seller, I am going into battle and I need your STRONGEST motherboards!

    • @fus132
      @fus132 2 months ago +37

      Noooo, travelah, my motherboards are _too strong_ for you!

    • @normified
      @normified 2 months ago +22

      You can't handle my motherboards. They're too strong for you.

    • @Somebody374-bv8cd
      @Somebody374-bv8cd 2 months ago +3

      Just use the power of friendship.

    • @leec3881
      @leec3881 2 months ago +1

      They're designed to take *most* of the weight of the 5090... Most.😏

    • @normified
      @normified 2 months ago +1

      @@leec3881 Let us all be reminded of that one clip of an exec holding up a huge GPU with an ITX mobo and watching it fall out of the slot 🙏

  • @voyagerrock1137
    @voyagerrock1137 2 months ago +25

    We don't just get better frame pacing with Proton, we just straight up do not have shader compilation stuttering at all

  • @ChristopherYeeMon
    @ChristopherYeeMon 2 months ago +22

    FYI, Proton is Wine. It's Wine plus two extension packages called DXVK and VKD3D-Proton. DXVK handles DirectX 9-11 call translation to Vulkan and VKD3D-Proton handles DirectX 12 translation to Vulkan. Valve's major innovation is those two additional packages.
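
For readers curious what "Proton is Wine plus DXVK/VKD3D-Proton" looks like in practice, here is a minimal sketch of invoking Proton's wrapper script directly, roughly the way Steam does internally. All paths and the Proton version are assumptions for illustration; PROTON_LOG and DXVK_HUD are real debug switches of Proton and DXVK respectively:

```python
# Launch a Windows executable through Proton outside of Steam (sketch).
import os
import subprocess

env = dict(os.environ)
env["STEAM_COMPAT_DATA_PATH"] = "/home/user/proton-prefix"            # Wine prefix location (assumed)
env["STEAM_COMPAT_CLIENT_INSTALL_PATH"] = "/home/user/.steam/steam"  # Steam root (assumed)
env["PROTON_LOG"] = "1"      # Proton writes a log file to $HOME for debugging
env["DXVK_HUD"] = "fps"      # DXVK's overlay, showing the D3D->Vulkan layer at work

proton = "/home/user/.steam/steam/steamapps/common/Proton 9.0/proton"  # assumed install
subprocess.run([proton, "run", "/path/to/game.exe"], env=env, check=True)
```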

    • @zabhoman3398
      @zabhoman3398 2 months ago +2

      Knowing how companies work, having Valve's/Steam's name on them also really helps solve problems, especially since they can throw their weight around to solve odd problems that are hard to troubleshoot without deeper knowledge about individual games.

  • @ryanbrennan8183
    @ryanbrennan8183 2 months ago +257

    HIKE? My 1080 is never getting upgraded bro 😭

    • @jevonp
      @jevonp 2 months ago +55

      Lmao I just went from a 7700k to a 5800x3d, 16 to 32gb ram (ddr3 to 4) and a new case and drives for less than a new video card would cost 😂

    • @dolpoof2335
      @dolpoof2335 2 months ago +9

      Just get a 3080, it can go for about 400 dollars sometimes, and it's basically the same with VRAM.

    • @JSXSProductions
      @JSXSProductions 2 months ago +5

      How I feel with my i7-8700K and my 1070 that I have to underclock to keep stable lmao. But Planet Coaster 2 is coming out in like a month sooooooo...

    • @kunka592
      @kunka592 2 months ago +1

      @KrautessendeKartoffel My GTX 1080 died (still boots but crashes randomly under high load). Rip.

    • @MechAdv
      @MechAdv 2 months ago +5

      Used 3080 Ti, bro. I’ve seen them on eBay as low as $450.

  • @porforyticbasalt
    @porforyticbasalt 2 months ago +30

    Oh boy, I can't wait for the breakthrough where they make AI-RAM and AI-RGB controllers

  • @AspectofFrost
    @AspectofFrost 2 months ago +161

    I'm not really an Intel fan, but it's sad to see the GPU side fall even more. The more competition between Nvidia, AMD, and Intel, the better.

    • @qwesx
      @qwesx 2 months ago +28

      Realistically, Intel would only be competition to AMD and fighting for the crumbs. Even though there are - admittedly less good but still viable - alternatives, people have been advocating for vendor-lock-in tech made by Nvidia (DLSS, CUDA, ...) without a second thought of the long-term effects. Other companies now simply don't have the money to keep up with Nvidia's R&D while the latter can ask ludicrous prices for even average GPUs with comparatively little VRAM.

    • @ralph4370
      @ralph4370 2 months ago +15

      I purchased an Arc GPU after I had to replace an RX 6600, and I've been happy. People have short memories: during the lockdowns and shipping issues, Intel stepped up and brought out a GPU when Nvidia and AMD were overpricing. LinusTT had a good point about people supporting the 3rd-party candidate, but if Intel does not step up it will be another AMD vs Nvidia.

    • @matt.stevick
      @matt.stevick 2 months ago +3

      NVIDIA has basically created a new industry. Jensen deserves TIME magazine person of all time and existence cover photo.

    • @kellywilson137
      @kellywilson137 2 months ago +2

      Arc was interesting, but when my brother wanted me to build a PC for him,
      I chose Nvidia. I'm not familiar with Arc, and I don't have the patience to troubleshoot a device that isn't mine.
      Nvidia will take care of them.

    • @adlibconstitution1609
      @adlibconstitution1609 2 months ago +2

      No one is competing against Nvidia. AMD can't touch them, let alone Intel.

  • @schmutz1g
    @schmutz1g 2 months ago +62

    Remember when things used to get cheaper after initial release? Pepperidge Farm remembers. *SIGH* It's gonna be a long, long time before I get an upgrade

    • @Pan_Z
      @Pan_Z 2 months ago +3

      The $200 to $500 range is decently competitive. Depending on what you currently have, there might be a good value upgrade.

    • @schmutz1g
      @schmutz1g 2 months ago +3

      @@Pan_Z Right now I'm running a 5700X with 32GB 3600 and a 3060. My setup works great as is, I just like to tinker and build for fun. Unfortunately current prices don't make it practical to upgrade.

    • @haarex.
      @haarex. 2 months ago

      @@schmutz1g I don't see why you'd need to upgrade that. I'm running something similar (i7-12650H, 3060 6GB, 16GB DDR5) and it runs even new games just fine (as in, stable 60 frames). Obviously there's room for improvement, but I feel like there's little actual reason to upgrade unless you really have money to burn.

    • @schmutz1g
      @schmutz1g 2 months ago +2

      @@haarex. It has nothing to do with NEED. It's my hobby so I enjoy spending money on it regardless of need. With current pricing though, any meaningful upgrade just doesn't make sense at this point since there isn't a ton of performance to be gained.

    • @julianchillby1024
      @julianchillby1024 2 months ago

      @@schmutz1g Yes there is, you sound dumb. There are loads of games the 4090 cannot play at 4K 144 Hz maxed out, which is the new standard for 4K TVs

  • @whatwelearned
    @whatwelearned 2 months ago +260

    What people need to understand with regard to Nvidia's price/performance is that this is no longer 2005; they don't get their profits from gamers now. They can charge silly money a) because we'll pay it, mostly and b) because their high-margin stuff is way more important to them. Things are not going back to 'old pricing' any time soon.

    • @Splarkszter
      @Splarkszter 2 months ago +64

      Not without competition

    • @RabbiKrieg
      @RabbiKrieg 2 months ago +25

      @@Splarkszter Came here to say this. Until AMD/Intel or some other company is a real threat at the highest end / machine learning / AI stuff, the market won't shift.

    • @yousigiltube
      @yousigiltube 2 months ago +57

      I thought this years and years back, but the PC market is not enthusiasts. It's a bunch of people that pretend they are enthusiasts, but they are the equivalent of a 'car expert' that will only ever buy one brand of car, or a 'phone lover' that only buys iPhones haha. It's really weird to me that the wealthier ones that upgrade every year, sometimes even more often, and could easily try other brands just for fun (something enthusiasts would normally want to do, warts and all, for the experience and enjoyment of trying tech) just don't do that.
      They're just Nvidia fanboys; they're not so much tech enthusiasts as tech snobs, and they refuse to try other stuff even if it's priced fairly for the performance etc. I didn't rush to this conclusion, but year after year it feels like it's proven by the unshifting obsession with one graphics card brand, even after they constantly put out overpriced cards and put the industry into a terrible state.

    • @ghost_stars2307
      @ghost_stars2307 2 months ago +51

      @@yousigiltube I had a guy have a meltdown on Reddit, to the point he deleted all his comments and reported me to Reddit Cares, because I said the 4090 is not good value even though it's the best performance money can buy.
      Price to performance seemed to be a foreign concept to him.

    • @JustinMcTavish14
      @JustinMcTavish14 2 months ago +8

      You are correct. Especially with AMD giving up on the high end, the only way Nvidia would ever drop prices is if AMD released a GPU with 120% of the performance for 3/4 to 1/2 the price, which will not be happening now. Get ready for higher prices, cause Nvidia has no competition at all now 😢

  • @jbrone1241
    @jbrone1241 2 months ago +136

    5090 isn't the model, it's the price... People keep getting it confused... God, I'm rooting for Intel harder than ever.

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 2 months ago +8

      *Sorry, but Intel's GPUs are already GAME OVER! Battlemage will be their next big disaster!* And they will give up after that one!

    • @bazooka712
      @bazooka712 2 months ago +12

      @@gertjanvandermeij4265 Intel sleeper agents at this time of the year?

    • @jloiben12
      @jloiben12 2 months ago +1

      Good meme. 10/10. No comments

    • @PixelatedWolf2077
      @PixelatedWolf2077 2 months ago +6

      ​@@gertjanvandermeij4265 Bro likes monopolies

    • @jbrone1241
      @jbrone1241 2 months ago

      @@gertjanvandermeij4265 Pretty sure you're right, but much like when I buy a lotto ticket, I dream of something better.

  • @KevPez-IS
    @KevPez-IS 2 months ago +55

    My 3080 died and I needed a GPU for video editing/class. Went for an A770... I love it. No, not as good at gaming, but good enough. Good gaming performance for what I play, excellent Adobe CC performance, outstanding value

    • @peterers3
      @peterers3 2 months ago +5

      How did the 3080 die??

    • @LeegallyBliindLOL
      @LeegallyBliindLOL 2 months ago +9

      @@peterers3 A lot of the 30 series use awful thermal pads to save a few cents, which causes the memory to run at over 100-120°C, way higher than allowed per Samsung's memory spec. MSI was very much guilty of this. I had to mod mine to make it run cool enough. Yes, the core temps were fine (which is what you see in overlays), but silently the memory was being destroyed. I live in Germany, so it's not like it's a very hot country (although summers can get up to 40°C...)

    • @KevPez-IS
      @KevPez-IS 2 months ago +4

      @@peterers3 Good question, not sure. Basically my computer would crash at random points, often during class or when I was using Photoshop, let alone games. My computer worked fine running on integrated graphics. I got into Arc because I need to edit video, and it's been completely stable. I sent my 3080 back to MSI for repair. They sent it back and it was still broken.

    • @ThatGuyOverThereWeird
      @ThatGuyOverThereWeird 2 months ago +1

      @@LeegallyBliindLOL I'm not a gamer, but I bought a bunch of 3080s for mining back in 2021, and the first thing I did was replace the thermal pads with top-quality ones; the temps dropped dramatically.

  • @drewnewby
    @drewnewby 2 months ago +168

    5080 $1699, 5090 $2399, just wait.

    • @JackCarsonite
      @JackCarsonite 2 months ago +79

      The more you buy, the more you save 😂

    • @zivzulander
      @zivzulander 2 months ago +21

      They won't sell many if the VRAM capacities haven't gone up. Gamers don't have the money to buy those in quantity, and AI training demand won't be there with relatively low VRAM.

    • @devilmikey00
      @devilmikey00 2 months ago +17

      I think they'll keep the 5080 at $1000. Judging by those specs it's probably going to be doing half the work of the 5090 which I suspect will be $2000. The 4080 at $1200 was a flop last time around because it was 50% less performance for only a few hundred dollars less. They need to keep that price gap bigger this time if they want to keep the performance gap so big between cards.

    • @gertjanvandermeij4265
      @gertjanvandermeij4265 2 months ago +13

      5080 $1499, 5090 $2499 (after that... the 24 GB 5080 Ti $1899)

    • @eorzorian
      @eorzorian 2 months ago +10

      The leather jacket in each presentation is a different one; it may not look like it, but it is. Those aren't gonna pay for themselves, and the jacket guy is looking at our pockets to milk them!!!

  • @Percavius
    @Percavius 2 months ago +38

    People are reading way too much into Arc's current 0% sales numbers. This doesn't have to be indicative of crashing consumer interest in Arc. I'd argue it's not at all, and the falling sales numbers are to be expected. Arc always appealed to a particular kind of buyer. You can't just walk into a Best Buy and get an Arc card-- they're not on the shelf. People buying Arc knew what they were looking for in the first place, because they're people plugged into the hardware news cycle. They're the early adopters, and just about everyone who was going to buy one already bought theirs either on release, or when Arc rolled out driver updates for the game they wanted to play.
    So yeah, when Battlemage is released, that'll be the real testament to how much enthusiasm there is for Arc.

  • @ResumedPausing
    @ResumedPausing 2 months ago +156

    Really hope that the rumored V-Cache on both CCDs comes true

    • @jonathonschott
      @jonathonschott 2 months ago +12

      Agreed. From what I saw, their 'solution' of having the system decide which CCD to use left something to be desired. They needed something similar back in the Bulldozer days and I wasn't impressed then; sure, it helped my 9120, but it was marginal, and I still suffered from random stutters that their scheduling update only made less frequent. I refuse to spend 9950X3D money on a solution that is only 'less annoying'

    • @BBWahoo
      @BBWahoo 2 months ago +20

      The final AM4 CPU being a 5950X3D with double v-cache would be a dream come true for me

    • @happygster922
      @happygster922 2 months ago +3

      I mean then you have 2 downclocked CCDs. If you want gaming, go for the single die imo.

    • @deepbludreams
      @deepbludreams 2 months ago +1

      @happygster922 The other issue with this is that basically no games are multithreaded enough to support a 16-core X3D at the moment; pretty much every title threads out to 8 cores

    • @jabroni6199
      @jabroni6199 2 months ago +4

      @@happygster922 It’s not that simple. I want both. I don’t just play games with my PC. I’ll take two slightly lower-clocked CCDs over this hybrid approach that oftentimes doesn’t work the way it should.

  • @racerex340
    @racerex340 2 months ago +11

    I don't understand the people saying the 5090 leak can't possibly be real due to the core counts being too high for a generational increase. Does no one remember every previous massive Nvidia halo-card spec increase? Titan RTX to RTX 3090 increased core counts by well over 100% (4608 to 10496), RTX 3090 to RTX 4090 was roughly a 60% increase in CUDA cores (10496 to 16384), and RTX 4090 to this rumored RTX 5090 is a 33% increase in cores (16384 to 21760). If anything, Nvidia is actually getting stingier, cutting the GPU core spec increase they've given to halo cards roughly in half each generation going back to RTX 2000: 2000 to 3000 was over 100%, then 3000 to 4000 only 60%, and 4000 to 5000 dropping to about 30%. The 6090 will likely only have 15-20% more cores than the 5090 if it keeps tracking this way, with the price justified by new DLSS and frame gen that allows a 2X performance increase over the 4090 in 4K gaming. When we went 3090 to 4090, they took away a huge part of the prior generations' 2X hardware improvement and instead gave 40% of that performance with DLSS + Frame Gen, and people were still more than happy to shell out two grand for a GPU; they'll do it again.
    My bet is Nvidia cutting their 5090 production costs by another 25% over the 4090 and giving even less hardware horsepower improvement, and I bet the $1599 4090 FE turns into a liquid-cooled 5090 FE that costs a minimum of $1999, or as much as $2499. No way they're going to sell consumer GPUs to us when those same GPU dies can be sold at triple or quadruple the profit margins to datacenters desperate for any AI hardware.
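
The generational percentages in this comment are easy to verify from the public CUDA core counts (the 21760 figure being the leak under discussion):

```python
# Halo-card CUDA core counts per generation; the last entry is the leaked figure.
gens = [("Titan RTX", 4608), ("RTX 3090", 10496),
        ("RTX 4090", 16384), ("RTX 5090 (leak)", 21760)]

for (prev, p), (curr, c) in zip(gens, gens[1:]):
    print(f"{prev} -> {curr}: +{(c - p) / p:.0%}")
# Titan RTX -> RTX 3090: +128%
# RTX 3090 -> RTX 4090: +56%
# RTX 4090 -> RTX 5090 (leak): +33%
```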

    • @abenormal3070
      @abenormal3070 2 months ago +2

      It's not worth explaining to them that they get less for more money. If somebody pays the leather jacket's exclusive margin (about 60%, btw) just to play games on that brick, it is purely for the feeling that he is something more than others because he can afford it. I could purchase a 4090 every month, but I'd have to be mentally ill to buy a card which is barely worth 50% of its price.

  • @Drakaziel
    @Drakaziel 2 months ago +26

    "A I DONT KNOW WHAT ANY OF THAT MEANS" xD got me going

  • @disasterincarnate
    @disasterincarnate 2 months ago +8

    The Valve/ARM news I imagine would be more realistic for their VR/Deckard projects; stuffing ARM tech in a headset is pretty much what everyone does anyway.

  • @thomasburchsted3287
    @thomasburchsted3287 2 months ago +67

    The elephants Steve, the elephants…

    • @GamersNexus
      @GamersNexus  2 months ago +59

      Just you wait until they finally do away with weight measurements on GPUs and replace them with fractions of one elephant.

    • @alexsmith5584
      @alexsmith5584 2 months ago +6

      @@GamersNexus with how much GPUs have grown in size lately I don’t think we’ll be measuring in fractions for long

    • @procedupixel213
      @procedupixel213 2 months ago +11

      @@GamersNexus Milliphant, microphant, nanophant, ...

    • @chronossage
      @chronossage 2 months ago

      We all need to remember the prototype 4090. ruclips.net/video/0frNP0qzxQc/видео.html

    • @turnerg
      @turnerg 2 months ago +2

      ​@@procedupixel213 😂😂😂

  • @enlightendbel
    @enlightendbel 2 months ago +37

    I have an A770 and am loving it. Clocks really well, works on all games I've tried, and sips power compared to both the Nvidia and AMD equivalent-performance models I have. Also gets driver updates extremely regularly.

    • @notrixamoris3318
      @notrixamoris3318 2 months ago +2

      Good for you, but Intel has a lot of problems that Arc is affected by, even though Arc is good now...

    • @enlightendbel
      @enlightendbel 2 months ago

      @@notrixamoris3318 The operative word is "had", not "has".
      Seriously, I'm yet to find a game where I have any issues left. In both performance and compatibility, Intel pulled off an astronomical improvement in the past year, and now it's perfectly on par with the driver stability and compatibility you'd expect from AMD and Nvidia.

    • @PixelatedWolf2077
      @PixelatedWolf2077 2 months ago

      @@notrixamoris3318 It's because of the architecture

    • @ZestyLemonSauce
      @ZestyLemonSauce 2 months ago +3

      It does not sip power; the idle draw is at least 60 W from the wall.

    • @enlightendbel
      @enlightendbel 2 months ago

      @@ZestyLemonSauce Idle is 40ish max, and my GPU is rarely ever idle, so that bit doesn't matter to me; its 100% load power usage is lower than both the 6700 XT and 3060.
      Besides, you can get it down to 8 W idle power if you set your monitor to run at 60 Hz when not in a game and follow their PCIe low-power settings recommendations.
      That seems to be a peculiarity in their vBIOS or drivers they still need to fix. The PCIe low-power settings do work, but only if you set your monitor to 60 Hz. Going higher than that instantly brings it back to 40ish.

  • @MrFallenone
    @MrFallenone 2 months ago +27

    28:47 Huge missed opportunity to call the board Godl Ai ke.

  • @CloneMalone
    @CloneMalone 2 months ago +48

    Honestly, I hope the public perception of Arc vastly improves with this next generation, especially if they continue trying to fill the hole of mid-to-low-tier GPUs that NVIDIA is too high-class or whatever for.

    • @LinusBerglund
      @LinusBerglund 2 months ago +5

      I love my Arc A750, but that is under Linux. Everything just works. Just not having to deal with Nvidia is amazing. I even play some Minecraft 😂

    • @godnamedtay
      @godnamedtay 2 months ago +1

      It won’t, but they’re still kinda cool. Not as a gaming daily driver but I’m still down.

    • @mazditzo
      @mazditzo 2 months ago +3

      Two days ago I built my PC with an A770, and it feels okay for games and productivity

    • @arenzricodexd4409
      @arenzricodexd4409 2 months ago

      I doubt Intel really wants to fill that hole. This is why Intel's market share is at 0% right now: in the last 2 quarters they shipped an extremely low quantity of Arc cards to the market. For Intel it is better not to sell them than to have to sell those cards at a very, very low price, to the point of selling at a loss.

    • @godnamedtay
      @godnamedtay 2 months ago +1

      @@arenzricodexd4409 this is a terrible take. Like I mean nonsensically bad.

  • @EhNothing
    @EhNothing 2 months ago +6

    If the 5090 is going to be a professional grade card, not a gaming focused card, I wish they'd remove it from the naming scheme of the gaming cards.

  • @krautworks
    @krautworks 2 months ago +31

    Those 5090 leaks are giving me R9 295X2 flashbacks.

    • @ggmgoodgamingmichael7706
      @ggmgoodgamingmichael7706 2 months ago +5

      What a card that was ... I can still smell the heat 😃

    • @GeeMannn
      @GeeMannn 2 months ago +4

      🎵 Chestnuts roasting on an open fire! 🎶

  • @mztik
    @mztik 2 months ago +23

    Valve working on ARM64 support for Proton also means good news for macOS devices running on Apple Silicon.

    • @neuronic85
      @neuronic85 2 months ago +3

      Wine already works on macOS, so it's not a big leap for Valve. It would certainly please us Mac users. Linux is soooo far ahead for gaming right now, but I want games on all my computers.

    • @Niosus
      @Niosus 2 months ago +13

      Ever since Apple made it impossible to run 32bit applications, Valve has pretty much given up on supporting the platform. They make sure the Steam client runs, but that's about as far as they go. They haven't even bothered to update games like TF2 on Mac. It simply doesn't run anymore. I could play it on my 2011 MacBook Pro, yet it doesn't launch on my M2 MacBook Pro. It's so much faster, but it can play so much less.
      Valve puts a lot of time and effort into making games more compatible, only for Apple to break compatibility entirely. That's maybe fine for productivity software where you always run the latest version, but a disaster for games since their active support only lasts for so long. Apple doesn't understand (PC) gaming, and they never will. Every once in a while they'll parade a few games whose ports they funded, but other than that macOS is really not a platform worth buying into either for gamers or game developers... Valve tried, and all they got was a middle finger...

    • @pilkycrc
      @pilkycrc 2 months ago +1

      It’s kind of a double-edged sword. Dropping support for 32-bit apps screws over some older games*, but keeping support slows down your machine while further complicating future progress (e.g. the transition to Apple Silicon/ARM64). There’s no ideal solution. That said, it’s long been the case that if you’re really wanting to game you’re better off getting a console or a PC, though I guess we’ll see how Apple’s current push in gaming goes. I doubt it will do much, and I say that as a life long Mac user
      * though given Apple hadn’t sold 32-bit Macs since 2009 and macOS hadn’t supported 32-bit hardware since 2011 there was little reason for more modern games to still be 32-bit only

    • @Niosus
      @Niosus 2 months ago +3

      @@pilkycrc As a software engineer I really do have to debunk that argument.
      There really isn't a good reason to forcibly drop it, not for a company that size. All 64 bit x86 processors are fully backwards compatible with 32 bit instructions. It's literally native code. You're not slowing down anything there.
      So then you have the OS level. You need to have a translation layer between the 32bit applications and your 64bit OS libraries. The thing is, Apple had this. Microsoft also has been using this since 2005 at the very latest. It's absolutely fine to not bring new features to the compatibility layer, but there is no reason for them to simply remove the existing functionality. Sure it takes some engineering time to keep it going. But Apple is a huge company with tens of thousands of engineers. It's a huge middle finger to their community of users to not spend just a few resources on keeping that old compatibility layer functional.
      I'm sure the move to ARM has something to do with it. But even then... Microsoft's much inferior x86->ARM compatibility layer supports 32 bit apps. Rosetta by comparison is a much better compatibility layer. They just didn't bother.
      FYI, I can guarantee you they will pull the same sh!t again in a few years. In a few updates they will drop support for Rosetta and you'll lose access to ALL mac software written before ~2020. They did it with their PowerPC translation layer, they'll do it with the x86 translation layer. They have been doing crap like this for decades. Yeah it takes extra work, but that's nothing compared to the millions of hours of development work spent on software that will simply no longer work. For all its flaws, at least Microsoft understands the value of that...

    • @pilkycrc
      @pilkycrc 2 months ago

      @@Niosus As a fellow software engineer (who’s been building software for the Mac for 20 years now) I do have to partially disagree. There are several good reason to drop support that are specific to the Mac, its history, and various technical decisions.
      The first is due to how Apple ships its OSes. As they have been through a lot of transitions they are well versed in the concept of FAT binaries (also known as Universal Binaries). Rather than having separate downloads for different architectures you have single download that contains them all (during the Intel transition this led to 4-way binaries for PPC, PPC64, x86, and x64). This also extends to the OS and all its libraries. This makes things much easier for the user as they don’t need to care about their architecture. You could have macOS on an external drive with universal apps and boot a Mac of any supported architecture from it entirely natively (something Windows couldn’t do, and I believe still can’t).
      The downside is you have duplicates of all the binaries. In macOS 10.14 (the last version to support 32 bit apps) this took up about 500MB of disk space. It also takes up RAM. When a library is first used by an app, macOS loads it into memory. To save on RAM a lot of these OS-level libraries are shared (so you don’t have a version per app). However, they are never **unloaded** from memory. So if you launch an app by accident and quit it immediately it will load all the libraries it needs and leave them in RAM until you reboot. This can waste a lot of valuable RAM (especially given Apple’s stingy configs, but that’s another problem 😅). This isn’t an issue for 64 bit stuff as most Mac software was already 64 bit. But it meant those few 32 bit apps had an additional burden.
      Then there is the even more macOS specific issue of the Objective-C runtime. Obj-C is the language most high level macOS (and previously NeXTSTEP) APIs were built in (and to a large degree still are). Unfortunately Obj-C was created in the early 80s (it’s even older than C++) and the runtime had a few issues that made adding new features to existing classes very problematic. With the 64-bit transition Apple could introduce a new runtime that fixed all these issues. The problem is, they needed to keep supporting 32-bit and so the old runtime. This added a LOT of engineering complexity to do pretty basic stuff, leading to features being slower to add and more prone to bugs.
      Apple does have a lot of engineers, but I think many in the industry would be surprised at how small their teams actually are. They’re often a fraction of the size of similar teams at the likes of Microsoft or Google. This lets them be a bit more nimble, but has the downside of not having as many resources to manage the growth in complexity that super long term backwards compatibility requires. From the perspective of providing better and more efficient software going forward, then dropping the legacy Obj-C runtime was a pretty big thing that would benefit the entire company, especially as it ONLY affected the Mac, with iPhone, iPad, etc only ever having the modern runtime. In fact that leads to another point which is bringing iPhone and iPad apps to the Mac is also made a lot easier by dropping 32-bit support as the frameworks there never had to deal with the legacy runtime, but would have to on the Mac.
      And then you have the ARM transition, which again plays into this. By dropping 32 bit support Apple can focus their resources entirely on x64 -> ARM64 translation. This reduces the complexity of Rosetta and allows them to optimise for one architecture switch. Given they also added CPU instructions to Apple Silicon to speed up translation of x64 apps it wouldn’t surprise me if that played a role in encouraging them to drop 32-bit.
      So in a nutshell, keeping 32-bit compatibility is a lot more complex a topic than most people would think and requires a lot more resources than one would expect. To be frank, it’s a miracle that Microsoft have kept it for so long (I know there are people at MS who frequently argue for dropping compatibility for older hardware and software). And in Apple’s case it’s not like it was a sudden or unexpected transition. As you’ve said, Apple has history of dumping old tech and architectures, but it usually takes them a while to do so. Apple’s first 64-bit capable Mac was released in 2003. macOS was fully 64-bit by 2007, the same year they last shipped 32-bit hardware. They dropped support for 32-bit hardware in macOS in 2011 and then spent 2017-2019 warning that 32 bit apps would no longer be supported before doing so in 2019. So that’s anywhere from 8 to 16 years of notice given to devs. Any software released during that time frame probably should have been built as 64 bit already (sadly I suspect the reason that some of it, especially games, wasn’t built that way is the devs were new to the Apple platform in that time and didn’t know to expect this).
      But yeah… ultimately there are pros and cons to each approach. Windows is legendary for backwards compatibility and lets you run pretty ancient software pretty well. This is a huge boon to people who enjoy older games or want to preserve older software without resorting to older machines. The downside is that it requires an increasing amount of engineering resources to pull off and limits your flexibility and nimbleness in moving forward. Having more limited backwards compatibility is why Apple has managed to pull off 5 architecture transitions (68k -> PPC, PPC -> PPC64, PPC -> x86, x86 -> x64, x64 -> ARM64) with incredible success in the time MS has struggled to pull off 2 (x86 -> x64, x64 -> ARM64). There’s no right or wrong answer here, which is why it’s good to have multiple computing platforms with differing priorities (even if that does mean you need to fork out for each of them separately to get both sets of benefits)

  • @SandyWhitmore
    @SandyWhitmore 2 months ago +34

    6:51 The 5080 is going to be such a lame "upgrade" over the 4080 (Super). Almost no increase in core count, and going from TSMC 5nm to 4nm will barely improve the clock speed. Fully expecting 1200 USD or higher pricing too 💸

    • @MrCreativeHD100
      @MrCreativeHD100 2 months ago

      Upgrading from a 3080 to a 5080, would the uplift be worth the cost? Not sure if it can be answered currently due to the lack of information.

    • @deadpool790
      @deadpool790 2 months ago

      Sadly I'm upgrading from a 2080 Super and would have loved a bigger node jump, but it will still be a sizable upgrade because of GDDR7 over GDDR6X, rather than the almost nonexistent 5nm-to-4nm improvement in the cards themselves.

    • @ZackSNetwork
      @ZackSNetwork 2 months ago +4

      @@deadpool790 If you have a 2080 Super, even a 4070 Super would be a massive upgrade. Stop being delusional.

    • @zerosam5541
      @zerosam5541 2 months ago +1

      Just wait for 6080

    • @rusTORK
      @rusTORK 2 months ago +8

      A 256-bit bus always feels like an insult.

  • @PiroteusGaming
    @PiroteusGaming 2 months ago +23

    Probably most people bought the low-end Intels for the AV1 hardware encoding feature, which is now becoming more common among AMD and Nvidia GPUs.

  • @47enterprises
    @47enterprises 2 months ago +2

    That camera mug after saying Valve picking up steam. Perfection.

  • @Oneiric_Benevolence
    @Oneiric_Benevolence 2 months ago +5

    3:32 Thanks Steve.

  • @SublimisSudes
    @SublimisSudes 2 months ago +5

    For those who don't know, last year the 4090 was at a whopping low of $1,450 on Black Friday at multiple stores like Micro Center and Best Buy. That was definitely the time to buy it, and the only time to buy it and still feel good about the wallet drain haha. Price to performance at $1,450 feels a whole lot better now, knowing it's worth 2k

    • @depthsounderdave
      @depthsounderdave 2 months ago

      I got lucky and built my once-in-a-decade PC then. I hope it lasts!

  • @JaleTechAndGaming
    @JaleTechAndGaming 2 months ago +9

    Feels odd going from being able to rule out the CPU for the most part to having to really consider it now. I remember when I only had to RMA 1 CPU maybe once a quarter? Even then it was probably the user's fault. We were building about a thousand PCs a year. I think 90%+ of our RMAs were motherboards; the rest were mostly hard drives or power supplies.

    • @louisfreema5905
      @louisfreema5905 2 months ago

      Tell me more. How many Gigabyte motherboards?

    • @JaleTechAndGaming
      @JaleTechAndGaming 2 months ago

      @@louisfreema5905 Soooooo many Gigabyte boards... >.< Soooo many leaking caps... I have too much trauma to buy the brand anymore.

  • @TheySeeMeTrollen
    @TheySeeMeTrollen 2 months ago +14

    The 5090 will be a 2-slot card. It will just be twice as tall and have an external 30 cm pedestal fan cooling it

  • @alexthething
    @alexthething 2 months ago +9

    Man I wish I could grow my hair out like Steve's lol

  • @RunescapeMeister-san
    @RunescapeMeister-san 2 months ago

    Lines like at 27:07 are exactly why I keep coming back. You don't just provide top tier news coverage, you also provide laughs. Thanks Steve.

    • @bernds6587
      @bernds6587 2 months ago

      back to you, Steve!

  • @gertjanvandermeij4265
    @gertjanvandermeij4265 2 months ago +25

    *Pretty sure Ngreedia left room for a 5080 Ti with 24 GB!* Because just 16 GB on the 5080 doesn't make sense!

    • @anreoil
      @anreoil 2 months ago +4

      Yes, but it will be called 5080 AI.

    • @simoSLJ89
      @simoSLJ89 2 months ago +4

      Enough room for 5080 Ti, Super and Ti Super.

    • @N4CR
      @N4CR 2 months ago +1

      It does, just like the 4080: it runs out of VRAM in a year or so, when people run the latest over-hyped RT and need to blurLSS it

    • @Aggrofool
      @Aggrofool 2 months ago

      Doubt. An RTX 4080 Ti also didn't happen.

    • @NameUserOf
      @NameUserOf 2 months ago

      @@simoSLJ89 AITI Super -> ATI Super -> ATI. The time has come, the circle will become complete.

  • @keith3761
    @keith3761 2 months ago +2

    Two-slot card as in only the PCB/power section is 2 slots; aftermarket manufacturers adding a cooler adds another 2-3 slots.

  • @Charaqat
    @Charaqat 2 months ago +5

    Intel really needed Battlemage, and Battlemage being cut down into a minor series on the way to Celestial is going to hurt more. We might not even get Celestial, because they'll use poor Battlemage performance as a metric against a full Celestial series

  • @JHe-f9t
    @JHe-f9t 2 months ago +3

    Living next to a quarry I can confirm that you need to get your limestone harvest in before the first frost or you'll be cooked. Limestone has the longest growing season of any of the stones.

  • @ShadySKWASHA
    @ShadySKWASHA 2 months ago

    The Green solder mat looks dope in the shot! Glad I bought one. It's been great, amazing quality and always glad to support your efforts GN!!!

  • @brlopwn
    @brlopwn 2 months ago +10

    Nvidia is definitely picking their customers now that they've successfully monopolized their markets. I think the 5090 being couched as "pro-sumer" is likely going to make the price $3k+ (and look at all these comments of people saying they would pay $5k+ no questions asked). As the middle class is being squashed and the consumer base is being split, many of these companies are finding that they can set whatever price they want once they've adequately captured their customers. I hope Lina Khan's FTC is taking a VERY close look at Nvidia.

    • @reav3rtm
      @reav3rtm 2 months ago +1

      I hope the 4090 and 5090 stop being shown in gaming benchmarks, as if they were gaming cards...

    • @johnsullivan8673
      @johnsullivan8673 2 months ago

      @@reav3rtm Nah. I’d buy it and post the benchmarks

  • @edward-vonschondorf-dev
    @edward-vonschondorf-dev 2 months ago +2

    Does that GPU market share report not include the embedded GPUs in things like the Steam Deck, ROG Ally, etc.? I imagine that would change the results significantly.

  • @arenzricodexd4409
    @arenzricodexd4409 2 months ago +3

    JPR calculates its market share based on units shipped for that quarter. Intel's market share is at 0% because for the last 2 quarters Intel shipped an extremely low quantity of Arc cards to the market. They would rather not sell those cards than have to give crazy discounts on them (and take a loss because of it). Intel's market share won't improve until they release Battlemage to the market.
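
A small illustration of why a shipment-based metric can read 0% while plenty of cards remain in use; the unit numbers below are made up purely to show the rounding:

```python
# Hypothetical quarterly shipments; only the rounding behavior is the point.
shipments = {"NVIDIA": 7_600_000, "AMD": 1_100_000, "Intel": 15_000}
total = sum(shipments.values())

for vendor, units in shipments.items():
    print(f"{vendor}: {units / total:.0%}")  # Intel's share rounds down to 0%
```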

  • @icdansheep1873
    @icdansheep1873 2 months ago

    I appreciate the timestamps & chapters!

  • @Shantara11
    @Shantara11 2 months ago +24

    600 Watts?! That’s absolutely insane!

    • @flameguy3416
      @flameguy3416 2 months ago +3

      2000 Watt PSUs 😊

    • @HybOj
      @HybOj 2 months ago +1

      Yeah, that will make your living room hot; it's totally unusable

    • @Legz_inStyle
      @Legz_inStyle 2 months ago

      I mean, the 4090 was 600 W as well

  • @urazoktay7940
    @urazoktay7940 2 months ago

    Amazing video, I thoroughly enjoyed it. Thank you Stephen, thank you Gamers Nexus. :)

  • @RandomTechChannel
    @RandomTechChannel 2 months ago +4

    I miss having a hit like the 1080 Ti in the 'RTX' era. Probably one of the best releases ever. If only NVIDIA could reproduce it with the 50 series...

    • @Gavo172
      @Gavo172 2 months ago +6

      That's a mistake they're never going to repeat again, absolutely legendary card

    • @handle32169
      @handle32169 2 months ago

      They always could have reproduced it, they don't want to and they don't have to (no competition)

  • @imjust_a
    @imjust_a 2 months ago +1

    Thanks for mentioning the microcode update! As a 13700K owner whose CPU *hasn't* had a meltdown yet, I've been waiting for updates on that story. I haven't updated my BIOS yet, though, since I remember hearing rumors that the microcode update could negatively impact performance. I'm also pretty distrustful of betas, especially betas that are pushed out as a result of panicked development.

    • @unholymerlin
      @unholymerlin 2 months ago +1

      Same here, waiting for someone to properly test it so I can update my BIOS.

  • @wellox8856
    @wellox8856 2 months ago +3

    I can confirm we are working on the phantom chipset.

  • @myleft9397
    @myleft9397 2 months ago +1

    I LIKE YOUR FACTORY TOUR VIDEOS. MORE FACTORY TOURS. XD

  • @kaupaxup
    @kaupaxup 2 months ago +8

    The Intel fab tour also helps a lot with your Tier 6 builds in Satisfactory.

  • @ItsJustElenore
    @ItsJustElenore 2 months ago

    28:54 That X870E Godlike power connector side does look pretty interesting though. Having all connectors come out one side.

  • @longjohn526
    @longjohn526 2 months ago +13

    People have such short memories they don't even remember this same guy made the same 600W claim for the 4090, and it turned out to be 450W... Why should anyone believe him now?
    Why would a 10752-CUDA-core 5080 (400W) need almost as much power as a 16384-CUDA-core 4090 (450W)? Does anyone really believe efficiency will go backwards? If it only gets 10% better efficiency it should be able to match the 320W 4080
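
A rough way to see the commenter's point, using the leaked 5080 figures against the 4090's and 4080's public specs. CUDA cores are an imperfect performance proxy, so treat this as illustrative only:

```python
# (CUDA cores, board power in watts); the 5080 row is the leak, the rest are specs.
cards = {"RTX 4080": (9728, 320),
         "RTX 4090": (16384, 450),
         "RTX 5080 (leak)": (10752, 400)}

for name, (cores, watts) in cards.items():
    print(f"{name}: {cores / watts:.1f} cores per watt")
# The leaked 5080 lands well below both Ada cards by this crude metric,
# which is exactly the efficiency regression the comment is skeptical of.
```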

    • @N4CR
      @N4CR 2 months ago +1

      Yeah, kopite is a poor source; he's 50:50 lmao

  • @caffeinecreature
    @caffeinecreature 2 months ago +2

    The ARM64 support is likely for their upcoming VR headset which is rumoured to be using an ARM processor and some version of SteamOS

  • @Rageousss
    @Rageousss 2 months ago +11

    Remember, US consumers play a big role in why these prices are so high: those who keep paying the absurd asking prices enable it.

    • @sonicboy678
      @sonicboy678 2 months ago +2

      Ironically, the 4090's sky-high price is still lower than what Nvidia wants for the not-Quadro workstation GPUs with the same specs (and even for _lower_ specs), so I'd recommend bearing that in mind regarding that GPU's demand.

    • @J-Vill
      @J-Vill 2 months ago

      No one cares, the US is the consumer capital of the world lol. Am I in the wrong for getting a 3080? (Not scalped.) I'm looking forward to the 5080, depending on price to performance of course

    • @hossosplitternacken7819
      @hossosplitternacken7819 2 months ago

      600 watts... did you know that's almost 1 HP (horsepower)? More power than an e-bike, and that at current gas prices 😅 Seriously, who buys that Nvidia crap anyway?

    • @julianchillby1024
      @julianchillby1024 2 months ago

      People like you also think food prices and restaurant prices will go down dramatically if their sales decrease a lot... inflation doesn't care, fool

  • @MikeHarris1984
    @MikeHarris1984 2 months ago

    I love when you guys do the factory tours or the in-depth documentary videos! And I love that you guys take a neutral position on everything; there is absolutely no bias in any of your videos or any of your news. You can be huge fans of Intel and Nvidia but also not be afraid to call them on their shit whenever they're having shit. And even LTT. Hopefully he didn't take it personally when you called out what needed to be called out before he ended up imploding himself. I think you guys helped save him from himself by calling out all of the inconsistencies and issues that LTT had and that had kept building up, and I think they are actually a much better channel now after the fact, focusing more on quality and not the quantity of a new video every day no matter the format. I just hope to see LTT and GN together again

  • @2012ArsenalMega
    @2012ArsenalMega 2 месяца назад +3

    For non-native speakers: set the playback speed to 0.75. You're welcome.

  • @theredwedge9446
    @theredwedge9446 Месяц назад

    Explaining the 5090 as a prosumer product is actually a cool way to think about it.

  • @Someone-lr8os
    @Someone-lr8os 2 месяца назад +38

    I just hope they don't pull a 20 series on us again 🙏

    • @joelcarson4602
      @joelcarson4602 2 месяца назад +28

      It's ALL going to be a 20 series style pulling the wool over our eyes from here on out. As in "Woo-Hoo! Lookit this AMAZING 3% uplift over the previous generation!"

    • @anitaremenarova6662
      @anitaremenarova6662 2 месяца назад +1

      Yes they will; just look at how bad the specs are for the 5080. Other than the 5090, it'll be incremental changes in raster compared to the 40 series.

    • @Javierm0n0
      @Javierm0n0 2 месяца назад +1

      Don't do that to yourself😢

    • @DeepDownInTheOcean
      @DeepDownInTheOcean 2 месяца назад +1

      ​@@joelcarson4602 I'm praying that the 5080 is 10% better than the 4090 for cheaper. If so, I'm most definitely getting it. Also, 4090 in my area is over 2300 USD for pny brand.

    • @DaGhostToast
      @DaGhostToast 2 месяца назад +1

      They already did with the 3060ti and 4060ti

  • @Anton1699
    @Anton1699 2 месяца назад

    Worth adding that "ARM64-EC" (Emulation Compatible) refers to a Windows-on-ARM ABI (Application Binary Interface) that allows emulated x86-64 applications to use native ARM64 dependencies.
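
    For the curious, here is a minimal sketch of what that ABI enables, assuming Microsoft's documented MSVC switches (/arm64EC for the compiler, /MACHINE:ARM64EC for the linker); the file name, function name, and exact invocation below are illustrative, not taken from the video:

      // plugin.c - a native ARM64EC DLL that an emulated x86-64 process can load.
      // Hypothetical build commands (per Microsoft's MSVC docs; treat as a sketch):
      //   cl /arm64EC /c plugin.c
      //   link /DLL /MACHINE:ARM64EC plugin.obj /OUT:plugin.dll
      //
      // Because ARM64EC follows an x64-compatible calling convention, the emulated
      // x64 caller can invoke this export directly while the body runs as native
      // ARM64 code, which is the whole point of the ABI.
      __declspec(dllexport) int add_native(int a, int b)
      {
          return a + b;
      }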

  • @zanbenurthadar
    @zanbenurthadar 2 месяца назад +32

    All AMD needs to do to get their market share back is release cards with much more VRAM, like they did 10 years ago. Everyone who runs AI locally just wants more VRAM capacity, to the point where independent card manufacturers are making a mint now.

    • @450AHX
      @450AHX 2 месяца назад +3

      Absolutely. Especially with the latest LLaMA versions that no longer have mid-size versions. You need at least 40 GB of RAM to run them with decent precision.
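
      Back-of-envelope (illustrative numbers, assuming a 70B-parameter model quantized to 4 bits per weight):

        $ 70 \times 10^{9}\ \mathrm{params} \times 0.5\ \mathrm{bytes/param} \approx 35\ \mathrm{GB\ of\ weights} $

      Add KV cache and activations on top of that, and you land right around the 40 GB figure.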

    • @VisturgAkter
      @VisturgAkter 2 месяца назад +6

      All AMD needs to do to get their gaming-segment market back is cater to like 2% of users?

    • @Nightykk
      @Nightykk 2 месяца назад +3

      They already do have a fair bit of VRAM vs. Nvidia, not that it makes much difference. I get what you're saying, but currently I have doubts about that changing a whole lot.
      If they wish to compete with Nvidia, they've got to sort out the performance: not just competing with Nvidia, but surpassing them. Even then, there are too many stories about the drivers; I'm not even sure better performance alone would do it, people are still scared.
      Price would do a lot. They got greedy with the 7000 series and based their pricing on Nvidia's silly prices, rather than going for market share with lower prices. AMD did exactly as Nvidia did, nearly doubling the prices of the previous generation.

    • @oneanother1
      @oneanother1 2 месяца назад +2

      Unfortunately, they are too slow on the AI side of things. They are integrating their workstation architecture into the gaming GPUs (UDNA), which will come after RDNA4. It's a bit too late; I think the AI craze has died down. I haven't seen much AI stuff making headlines, and AI video is still stuck with that slow-motion look.
      And the games aren't delivering either; they take too long to make or just aren't appealing. I don't think AMD can ever recover in the gaming GPU space.

    • @450AHX
      @450AHX 2 месяца назад

      @@oneanother1 Even if they're not as fast as Nvidia, if they're faster than running off CPU it can be worth it, depending on the price.

  • @daztora
    @daztora 2 месяца назад +1

    I love the way Steve said "low" during the pricing segment; they are so damn expensive, haha.

  • @Robert.101
    @Robert.101 2 месяца назад +28

    4090 price hike????

    • @chimpwithamachinegun
      @chimpwithamachinegun 2 месяца назад +1

      Real

    • @JollyGiant19
      @JollyGiant19 2 месяца назад +17

      It sells enough to justify it, as much as we don't like that.

    • @Robert.101
      @Robert.101 2 месяца назад

      @@JollyGiant19 I sort of feel it's a mix of scaling down manufacturing, which raises the prices, and that they're aiming to sell off their stock slowly on the assumption that there's going to be a lack of stock for the 5090 and 5080, and therefore more demand for the 4090. It's, as of right now, the top-of-the-line consumer GPU, so they still expect to get a premium for it.

    • @Danstealth
      @Danstealth 2 месяца назад +1

      They raised the price of the 3090s, then raised them again when the 4090 came out, so people just wouldn't keep buying the cheaper cards. The 3090 Ti was $1200 at one point.

    • @angooyschannel721
      @angooyschannel721 2 месяца назад

      yes, just people trying to make their last bit of money off of it while they can

  • @danielturunen7237
    @danielturunen7237 2 месяца назад

    The Intel fab video was great! Thanks! 🙂

  • @sstier48
    @sstier48 2 месяца назад +7

    I appreciate the honest reporting... way too many people in this space just go along with the 600W rumor. I don't believe it, because Nvidia wants a lot of buyers, and that number would push a lot of buyers away from upgrading, since most people use an 850W or lower PSU.

    • @greebj
      @greebj 2 месяца назад +2

      I don't go along with it because we saw exactly the same story with the 4090 rumours

  • @fomoco173
    @fomoco173 2 месяца назад

    Awesome job as always, and yes, I watched and loved the fab video!

  • @StephenMcGregor1986
    @StephenMcGregor1986 2 месяца назад +4

    600W, $5000 AUD, a few more CUDA cores, bit more memory, bit more speed, maybe another new revolutionary feature like frame generation, ooh boy, how exciting

  • @spenmac
    @spenmac 2 месяца назад

    @gamersnexus, @09:15 I think you guys used the incorrect background image, as it details info about Nintendo amongst other things.

  • @ZaberfangX
    @ZaberfangX 2 месяца назад +2

    Valve working on Proton for ARM does open the door for a RISC-V-chip handheld in the future. And Windows on ARM can use DXVK too, to get games working beyond what Microsoft is offering.

    • @dead-claudia
      @dead-claudia 2 месяца назад +1

      RISC-V provides a unique opportunity because of its variable vector size. Companies are already shipping chips with native 1024-bit vector registers, which is interesting on its own.
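
      That variable width is what makes RVV code vector-length-agnostic: the same binary adapts to whatever register size the chip implements. A minimal sketch in C, assuming a toolchain with the ratified RVV 1.0 intrinsics (riscv_vector.h); the function itself is illustrative:

        #include <riscv_vector.h>
        #include <stdint.h>
        #include <stddef.h>

        /* c[i] = a[i] + b[i] without hard-coding the hardware vector width.
           __riscv_vsetvl_e32m8 asks the CPU how many 32-bit elements fit per
           pass, so the loop runs unchanged on 128-bit or 1024-bit hardware. */
        void vec_add(const int32_t *a, const int32_t *b, int32_t *c, size_t n)
        {
            while (n > 0) {
                size_t vl = __riscv_vsetvl_e32m8(n);
                vint32m8_t va = __riscv_vle32_v_i32m8(a, vl);
                vint32m8_t vb = __riscv_vle32_v_i32m8(b, vl);
                __riscv_vse32_v_i32m8(c, __riscv_vadd_vv_i32m8(va, vb, vl), vl);
                a += vl; b += vl; c += vl; n -= vl;
            }
        }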

  • @redheadsg1
    @redheadsg1 2 месяца назад +4

    4:31 "Consumer card" that nobody of regular people will never be able to afford it.

  • @EVERON.
    @EVERON. 2 месяца назад +1

    Are the connector and cable from a 4090 going to be compatible with the 5090, given the 600W maximum? Because if it has spikes over 600W, it wouldn't be safe to use that 600W cable, right?
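
    For context, 600W is the connector's rated ceiling, not a typical draw. A sketch using the commonly cited 12VHPWR numbers (six 12V power pins, usually rated around 9.5 A each):

      $ 600\,\mathrm{W} / 12\,\mathrm{V} = 50\,\mathrm{A}, \quad 50\,\mathrm{A} / 6\ \mathrm{pins} \approx 8.3\,\mathrm{A\ per\ pin} $

    Sustained 600W already runs each pin near its rating; brief spikes above that are meant to be absorbed by PSU headroom (the ATX 3.0 spec requires tolerating short power excursions) rather than by upsizing the cable.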

  • @scaroo
    @scaroo 2 месяца назад +4

    Just wanted to add that Proton is built upon WINE (and DXVK, VKD3D...). So contrasting them, as if one were good and the other bad, doesn't make much sense. Those projects don't get the recognition they should, being hidden behind the common "Proton" name. Of course, Valve has improved them a ton through direct contributions and subcontracting.

  • @maniacalcoyote6087
    @maniacalcoyote6087 2 месяца назад

    5:15 - Probably a mix-up in communications: 1 slot for the card, 2 slots for the fans/heatsinks.

  • @traiges414
    @traiges414 2 месяца назад +4

    Haven't watched the video yet, but here is my guess regarding the leak: the RTX 5090 can only be bought bundled with a monthly NVIDIA subscription fee (for using DLSS & AI features).

  • @tacocat709
    @tacocat709 2 месяца назад

    I bought an Arc last month, hopefully at least improving the stats a little. Working great for me so far!

  • @royboysoyboy
    @royboysoyboy 2 месяца назад +8

    15:18 A few months ago I told some gamers that I thought NVIDIA is a monopoly in discrete graphics cards and graphics technologies (CUDA, upscaling, etc.), and that they basically own the entire gaming/AI market. Yet no one believed me or cared.

    • @Artimidorus
      @Artimidorus 2 месяца назад

      Nobody cared because the market is open to anyone who wants to bring a good product. Saying that discounts one competitor, and the other still isn't trusted.
      But it's not a monopoly.

    • @royboysoyboy
      @royboysoyboy 2 месяца назад +2

      @Artimidorus partners not being allowed to make graphics cards for other brands contradicts your first sentence

    • @LAndrewsChannel
      @LAndrewsChannel 2 месяца назад

      @@royboysoyboy WTF are you talking about? ASUS, Gigabyte, and MSI all have both the latest AMD and NVIDIA cards in production; there is clearly no such rule in their contracts with NVIDIA.

    • @royboysoyboy
      @royboysoyboy 2 месяца назад +1

      ​​​​@@LAndrewsChannelNVIDIA does not want partners to make Intel cards, a 3rd competitor who makes graphics chips. As of today, only Asrock, Sparkle, and Acer make Intel graphics cards, and none of the manufacturers you mentioned make graphics cards from a third competitor due to slimy, monopolistic business tactics from NVIDIA

    • @royboysoyboy
      @royboysoyboy 2 месяца назад +1

      NVIDIA is acting like Apple, and they needed an antitrust lawsuit as of 6 years ago. No wonder EVGA left the industry; I don't blame them.

  • @tech8438
    @tech8438 2 месяца назад +1

    Everyone in the Fabs should want one of these new shirts :D

  • @hankb27
    @hankb27 2 месяца назад +3

    The wattage is fn ridiculous on the new cards

  • @TheHippo-or5wi
    @TheHippo-or5wi 2 месяца назад

    Thanks, Steve, for mentioning Valve's Proton!
    Can't wait till you show the differences between Linux and Windows: framerate and frametimes.

  • @anonony9081
    @anonony9081 2 месяца назад +7

    I'm totally willing to buy a 5090 after selling my 4090, but the idea that it could use two 12 pin power cables and take up that much space in the case is making me a little concerned about the feasibility. I went and got a fancy power supply that has the native 12 pin cable just for the 4090, but now I feel like it's already outdated.

    • @papichuckle
      @papichuckle 2 месяца назад

      Same; my PSU has a 16-pin socket, and I won't be happy if I'm forced to get a new PSU already, just for one more socket. It would be really scummy doing that within a single generation.
      The 5090 should only use one 16-pin and one 8-pin, for safety.

    • @godnamedtay
      @godnamedtay 2 месяца назад

      Water cool it and you’ll be fine

    • @chronossage
      @chronossage 2 месяца назад

      Just like the prototype 4090. ruclips.net/video/0frNP0qzxQc/видео.html

  • @fourthhorseman4531
    @fourthhorseman4531 2 месяца назад

    The skeletal tech t-shirt gives off some serious Fear Factory vibes!

  • @inkredebilchina9699
    @inkredebilchina9699 2 месяца назад +5

    those "market share" percentages are the new hardware sold quarter by quarter. edit: market share is not what's reflected in those charts but a market trend. and trends change.
    for example someone who bought the intel product outside the scope of this window is not included in the chart so it doesn't mean a lot tbh because people buy a GPU once every couple of years. it just means sales are worse right now.

    • @mckinleyostvig7135
      @mckinleyostvig7135 2 месяца назад +8

      Actually in the current market nobody can buy a GPU ever

    • @inkredebilchina9699
      @inkredebilchina9699 2 месяца назад

      @@mckinleyostvig7135 lol
      But that's just it: those numbers actually mean nothing, because 88% of 100 units sold doesn't mean monopoly; it just means sales are terrible. And those charts are based on sales numbers, not the actual installed base. It's probably not far off, but it's not 0% for Intel either, nor is it 12% for AMD.

    • @GeeMannn
      @GeeMannn 2 месяца назад +1

      I got a 4080 half-off. Will make it last at least ten years lmao

    • @rednammoc
      @rednammoc 2 месяца назад +1

      Market share is a term that's well-defined and understood to be: "a company's total sales within a given period divided by the industry's total sales within that same period", not whatever you seem to believe it is.
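
      In formula form, with both figures taken over the same reporting window:

        $ \mathrm{market\ share} = \frac{\text{company's sales in the period}}{\text{industry's total sales in the same period}} $

      By that definition, these quarterly charts show shipment share for the quarter, which is a different thing from the installed base described above.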

    • @inkredebilchina9699
      @inkredebilchina9699 2 месяца назад

      @@rednammoc Yeah, okay. Then let me ask you this: what are all the people who have bought PCs thus far? Not a market, or what?

  • @marcchapleau8343
    @marcchapleau8343 2 месяца назад

    Thanks for the news recaps!

  • @vulcan4d
    @vulcan4d 2 месяца назад +14

    You mean the 4090, which costs $200 to make, is going up in price? Did someone sneeze on the Nvidia BS machine?

    • @chasingthefish9042
      @chasingthefish9042 2 месяца назад +2

      That's kinda like calling the Pacific Ocean a puddle.

    • @masterkamen371
      @masterkamen371 2 месяца назад +2

      The R&D costs are ludicrous for semiconductors, but why tf would they raise the prices at the end of its life??????? They already earned like 20x the R&D and manufacturing cost off the AI datacentre deals.

    • @chasingthefish9042
      @chasingthefish9042 2 месяца назад

      @masterkamen371 Probably because they're sacrificing profit by making the cards vs. using the dies for something else.

  • @robinett74
    @robinett74 2 месяца назад

    I love that Thermaltake keep putting out more colors.

  • @BulukEtznab
    @BulukEtznab 2 месяца назад

    Thanks so much again for your cool news; I always love watching to see what's going on in the IT world (which I'm a part of, in a way, too).
    Just a short hint for the price comparisons: in Europe, all prices usually *include* VAT, whereas in the US, I've learned, they're usually quoted *excluding* sales tax ❕☝🏻🧐💁🏻‍♂ Just FYI / a short reminder (I'm sure you probably know that already, but maybe some folks watching the channel for the first time don't)❣

  • @keith3761
    @keith3761 2 месяца назад +3

    Slowly moving toward the days where the top-end gaming GPUs are only owned by rock stars, Saudi princes, and influencers. I can't wait for when, in 5 years, I get to see a YouTube star showing off his Bugatti GTX 10090.

    • @bernds6587
      @bernds6587 2 месяца назад +1

      Can you preorder that Ferrari GTX 12090 Ti Super?

  • @PlantainSupernova
    @PlantainSupernova 2 месяца назад +1

    Now I wanna hear how that sounds: 20:30, Steve mentioning the idea of ADR in videos. Just a randomly inserted voice, recorded in the hemi-anechoic chamber, that's slightly different from the rest of the narration, but you can't put your finger on exactly why.

  • @ThunderWill2
    @ThunderWill2 2 месяца назад +23

    A 5080 with 16GB would be horrible value.

    • @HybOj
      @HybOj 2 месяца назад +1

      Depends on the price: sell it for 600 USD and you'd talk differently. But that's dreaming. It will beat the 4090, which costs, let's say, 1600 USD, so if they sell it cheaper... let's say 1200 USD, people will buy it. Unfortunately.

    • @JordanJ01
      @JordanJ01 2 месяца назад +1

      You people whine about VRAM too much; it shows how uneducated you are.

    • @fortnitemaster2114
      @fortnitemaster2114 2 месяца назад +14

      ​@@JordanJ01zip up nvidia's pants when you're done

    • @_fr4mes743
      @_fr4mes743 2 месяца назад

      Tell me how it is uneducated to ask what you're paying for ​@@JordanJ01

    • @nervsouly
      @nervsouly 2 месяца назад +2

      @@JordanJ01 You should try 4K gaming. Lots of titles already crack 10GB. Two more years and it will be 15GB. Then add on top that many games have slight memory leaks because the devs just can't be bothered: your 16GB instantly fills up, performance drops, textures become washed out, and you have to restart the app.

  • @jedward625
    @jedward625 2 месяца назад +1

    Really liked that AI joke. Was drinking water too 😂

  • @ChosenHandle
    @ChosenHandle 2 месяца назад +14

    The next press release from Intel: The chips were fine before GN visited the factory! (Probably)

  • @abetter-_-Gamer
    @abetter-_-Gamer 2 месяца назад

    Thanks for the info! Great vid. I need to get some new merch!

  • @JustHavingBlast
    @JustHavingBlast 2 месяца назад +6

    16GB of VRAM is pure planned obsolescence. Some games push 16GB even now; I cannot imagine someone paying this kind of money to get two-generations-old VRAM capacity.

    • @techkilledme
      @techkilledme 2 месяца назад

      RX 6800 sitting there like 😊

    • @sanji663
      @sanji663 2 месяца назад +4

      *unoptimized games with useless ultra settings*

    • @bernds6587
      @bernds6587 2 месяца назад

      @@sanji663 *unoptimized games ported from a console with "automatic console porter" software from 2015*