People are reading way too much into Arc's current 0% sales numbers. This doesn't have to be indicative of crashing consumer interest in Arc. I'd argue it's not at all, and the falling sales numbers are to be expected. Arc always appealed to a particular kind of buyer. You can't just walk into a Best Buy and get an Arc card-- they're not on the shelf. People buying Arc knew what they were looking for in the first place, because they're people plugged into the hardware news cycle. They're the early adopters, and just about everyone who was going to buy one already bought theirs either on release, or when Arc rolled out driver updates for the game they wanted to play. So yeah, when Battlemage is released, that'll be the real testament to how much enthusiasm there is for Arc.
Agreed. From what I saw, their 'solution' for having the system decide which CCD to use left something to be desired. They needed something similar back in the Bulldozer days and I wasn't impressed then; sure, it helped my 9120, but it was marginal and I still suffered from random stutters, and their scheduling update only made them less frequent. I refuse to spend 9950X3D money for a solution that is only 'less annoying'.
Other issue with this is that basically no games are multithreaded enough to take advantage of a 16-core X3D at the moment; pretty much every title threads out to 8 cores @happygster922
@@happygster922 It's not that simple. I want both. I don't just play games with my PC. I'd take two slightly lower-clocked CCDs over this hybrid approach that oftentimes doesn't work the way it should.
I don't understand the people saying the 5090 leak can't possibly be real because the core counts are too high for a generational increase. Does no one remember the massive spec increases of every previous Nvidia halo card? Titan RTX to RTX 3090 increased core counts by well over 100% (4608 to 10496), RTX 3090 to RTX 4090 was roughly a 60% increase in CUDA cores (10496 to 16384), and RTX 4090 to this rumored RTX 5090 is a 33% increase in cores (16384 to 21760). If anything, Nvidia is actually getting stingier, roughly halving the core-count increase it gives its halo cards each generation going back to RTX 2000: 2000 to 3000 was 100%, 3000 to 4000 only 60%, and 4000 to 5000 drops to around 30%. The 6090 will likely only have 15-20% more cores than the 5090 if it keeps tracking this way, with the price justified by new DLSS and frame gen that claims a 2X increase in 4K gaming over the 4090. When we went from 3090 to 4090, they already took away / reduced a huge part of the prior generations' 2X hardware improvement and instead gave 40% of that performance with DLSS + Frame Gen, and people were still more than happy to shell out two grand for a GPU; they'll do it again. My bet is Nvidia cuts its 5090 production costs by another 25% over the 4090, gives even less hardware horsepower improvement, and the $1,599 4090 FE turns into a liquid-cooled 5090 FE that costs a minimum of $1,999, or as much as $2,499. No way they're going to sell consumer GPUs to us when those same GPU dies can be sold at triple or quadruple the profit margins to datacenters desperate for any AI hardware.
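For anyone who wants to sanity-check the percentages in the comment above, here is a quick sketch using only the core counts quoted there (the 5090 figure is still just a rumor):

```python
# Generation-over-generation CUDA core increases, per the counts quoted above.
cores = {
    "Titan RTX": 4608,
    "RTX 3090": 10496,
    "RTX 4090": 16384,
    "RTX 5090 (rumored)": 21760,
}

names = list(cores)
for prev, curr in zip(names, names[1:]):
    gain = (cores[curr] - cores[prev]) / cores[prev] * 100
    print(f"{prev} -> {curr}: +{gain:.0f}% CUDA cores")

# Prints roughly +128%, +56%, +33%, which lines up with the
# "well over 100%", "roughly 60%", and "33%" figures in the comment.
```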
It's not worth explaining to them that they get less for more money. If somebody pays the leather jacket guy's exclusive margin (about 60%, btw) just to play games on that brick, it's purely for the feeling that he's something more than others because he can afford it. I could buy a 4090 every month, but I'd have to be mentally ill to buy a card that's barely worth 50% of its price.
The Valve/ARM news, I imagine, is more realistic for their VR/Deckard projects; stuffing ARM tech in a headset is pretty much what everyone does anyway.
I have an A770 and am loving it. It clocks really well, works in all games I've tried, and sips power compared to the equivalent-performance Nvidia and AMD models I have. It also gets driver updates extremely regularly.
@@notrixamoris3318 The operative word is "had", not "has". Seriously, I've yet to find a game where I have any issues left. In both performance and compatibility, Intel pulled off an astronomical improvement in the past year, and now it's perfectly on par with the driver stability and compatibility you'd expect from AMD and Nvidia.
@@ZestyLemonSauce Idle is 40-ish max and my GPU is rarely ever idle, so that bit doesn't matter to me; its 100%-load power usage is lower than both the 6700 XT and the 3060. Besides, you can get it down to 8 W idle power if you set your monitor to run at 60 Hz when not in a game and follow their PCIe low-power settings recommendations. That seems to be a peculiarity in their VBIOS or drivers they still need to fix: the PCIe low-power settings do work, but only if you set your monitor to 60 Hz. Going higher than that instantly brings it back to 40-ish.
Honestly, I hope the public perception of Arc vastly improves with this next generation, especially if they continue trying to fill the low-to-mid-tier GPU hole that NVIDIA is too high-class (or whatever) for.
I doubt Intel really wants to fill that hole. This is why Intel's market share is at 0% right now: for the last 2 quarters they shipped extremely low quantities of Arc to the market. For Intel it's better not to sell them at all than to have to sell those cards at such low prices that they'd be selling at a loss.
Wine already works on MacOS, so it's not a big leap for Valve. It would certainly please us Mac users. Linux is soooo far ahead for gaming right now, but I want games on all my computers.
Ever since Apple made it impossible to run 32bit applications, Valve has pretty much given up on supporting the platform. They make sure the Steam client runs, but that's about as far as they go. They haven't even bothered to update games like TF2 on Mac. It simply doesn't run anymore. I could play it on my 2011 MacBook Pro, yet it doesn't launch on my M2 MacBook Pro. It's so much faster, but it can play so much less. Valve puts a lot of time and effort into making games more compatible, only for Apple to break compatibility entirely. That's maybe fine for productivity software where you always run the latest version, but a disaster for games since their active support only lasts for so long. Apple doesn't understand (PC) gaming, and they never will. Every once in a while they'll parade a few games whose ports they funded, but other than that macOS is really not a platform worth buying into either for gamers or game developers... Valve tried, and all they got was a middle finger...
It's kind of a double-edged sword. Dropping support for 32-bit apps screws over some older games*, but keeping support slows down your machine while further complicating future progress (e.g. the transition to Apple Silicon/ARM64). There's no ideal solution. That said, it's long been the case that if you really want to game you're better off getting a console or a PC, though I guess we'll see how Apple's current push in gaming goes. I doubt it will do much, and I say that as a lifelong Mac user.
* Though given Apple hadn't sold 32-bit Macs since 2009 and macOS hadn't supported 32-bit hardware since 2011, there was little reason for more modern games to still be 32-bit only
@@pilkycrc As a software engineer I really do have to debunk that argument. There really isn't a good reason to forcibly drop it, not for a company that size. All 64 bit x86 processors are fully backwards compatible with 32 bit instructions. It's literally native code. You're not slowing down anything there. So then you have the OS level. You need to have a translation layer between the 32bit applications and your 64bit OS libraries. The thing is, Apple had this. Microsoft also has been using this since 2005 at the very latest. It's absolutely fine to not bring new features to the compatibility layer, but there is no reason for them to simply remove the existing functionality. Sure it takes some engineering time to keep it going. But Apple is a huge company with tens of thousands of engineers. It's a huge middle finger to their community of users to not spend just a few resources on keeping that old compatibility layer functional. I'm sure the move to ARM has something to do with it. But even then... Microsoft's much inferior x86->ARM compatibility layer supports 32 bit apps. Rosetta by comparison is a much better compatibility layer. They just didn't bother. FYI, I can guarantee you they will pull the same sh!t again in a few years. In a few updates they will drop support for Rosetta and you'll lose access to ALL mac software written before ~2020. They did it with their PowerPC translation layer, they'll do it with the x86 translation layer. They have been doing crap like this for decades. Yeah it takes extra work, but that's nothing compared to the millions of hours of development work spent on software that will simply no longer work. For all its flaws, at least Microsoft understands the value of that...
@@Niosus As a fellow software engineer (who’s been building software for the Mac for 20 years now) I do have to partially disagree. There are several good reason to drop support that are specific to the Mac, its history, and various technical decisions. The first is due to how Apple ships its OSes. As they have been through a lot of transitions they are well versed in the concept of FAT binaries (also known as Universal Binaries). Rather than having separate downloads for different architectures you have single download that contains them all (during the Intel transition this led to 4-way binaries for PPC, PPC64, x86, and x64). This also extends to the OS and all its libraries. This makes things much easier for the user as they don’t need to care about their architecture. You could have macOS on an external drive with universal apps and boot a Mac of any supported architecture from it entirely natively (something Windows couldn’t do, and I believe still can’t). The downside is you have duplicates of all the binaries. In macOS 10.14 (the last version to support 32 bit apps) this took up about 500MB of disk space. It also takes up RAM. When a library is first used by an app, macOS loads it into memory. To save on RAM a lot of these OS-level libraries are shared (so you don’t have a version per app). However, they are never **unloaded** from memory. So if you launch an app by accident and quit it immediately it will load all the libraries it needs and leave them in RAM until you reboot. This can waste a lot of valuable RAM (especially given Apple’s stingy configs, but that’s another problem 😅). This isn’t an issue for 64 bit stuff as most Mac software was already 64 bit. But it meant those few 32 bit apps had an additional burden. Then there is the even more macOS specific issue of the Objective-C runtime. Obj-C is the language most high level macOS (and previously NeXTSTEP) APIs were built in (and to a large degree still are). Unfortunately Obj-C was created in the early 80s (it’s even older than C++) and the runtime had a few issues that made adding new features to existing classes very problematic. With the 64-bit transition Apple could introduce a new runtime that fixed all these issues. The problem is, they needed to keep supporting 32-bit and so the old runtime. This added a LOT of engineering complexity to do pretty basic stuff, leading to features being slower to add and more prone to bugs. Apple does have a lot of engineers, but I think many in the industry would be surprised at how small their teams actually are. They’re often a fraction of the size of similar teams at the likes of Microsoft or Google. This lets them be a bit more nimble, but has the downside of not having as many resources to manage the growth in complexity that super long term backwards compatibility requires. From the perspective of providing better and more efficient software going forward, then dropping the legacy Obj-C runtime was a pretty big thing that would benefit the entire company, especially as it ONLY affected the Mac, with iPhone, iPad, etc only ever having the modern runtime. In fact that leads to another point which is bringing iPhone and iPad apps to the Mac is also made a lot easier by dropping 32-bit support as the frameworks there never had to deal with the legacy runtime, but would have to on the Mac. And then you have the ARM transition, which again plays into this. By dropping 32 bit support Apple can focus their resources entirely on x64 -> ARM64 translation. 
This reduces the complexity of Rosetta and allows them to optimise for one architecture switch. Given they also added CPU instructions to Apple Silicon to speed up translation of x64 apps it wouldn’t surprise me if that played a role in encouraging them to drop 32-bit. So in a nutshell, keeping 32-bit compatibility is a lot more complex a topic than most people would think and requires a lot more resources than one would expect. To be frank, it’s a miracle that Microsoft have kept it for so long (I know there are people at MS who frequently argue for dropping compatibility for older hardware and software). And in Apple’s case it’s not like it was a sudden or unexpected transition. As you’ve said, Apple has history of dumping old tech and architectures, but it usually takes them a while to do so. Apple’s first 64-bit capable Mac was released in 2003. macOS was fully 64-bit by 2007, the same year they last shipped 32-bit hardware. They dropped support for 32-bit hardware in macOS in 2011 and then spent 2017-2019 warning that 32-bit apps would no longer be supported before doing so in 2019. So that’s anywhere from 8 to 16 years of notice given to devs. Any software released during that time frame probably should have been built as 64 bit already (sadly I suspect the reason that some of it, especially games, wasn’t built that way is the devs were new to the Apple platform in that time and didn’t know to expect this). But yeah… ultimately there are pros and cons to each approach. Windows is legendary for backwards compatibility and lets you run pretty ancient software pretty well. This is a huge boon to people who enjoy older games or want to preserve older software without resorting to older machines. The downside is that it requires an increasing amount of engineering resources to pull off and limits your flexibility and nimbleness in moving forward. Having more limited backwards compatibility is why Apple has managed to pull off 5 architecture transitions (68k -> PPC, PPC -> PPC64, PPC -> x86, x86 -> x64, x64 -> ARM64) with incredible success in the time MS has struggled to pull off 2 (x86 -> x64, x64 -> ARM64). There’s no right or wrong answer here, which is why it’s good to have multiple computing platforms with differing priorities (even if that does mean you need to fork out for each of them separately to get both sets of benefits).
6:51 5080 is going to be such a lame "upgrade" over the 4080 (super). Almost no increase in core count and going from TSMC 5nm to 4nm will barely improve the clockspeed. Fully expecting 1200 USD or higher pricing too 💸
Sadly, I'm upgrading from a 2080 Super and would have loved a bigger node jump, but it will still be a sizable upgrade, more because of GDDR7 over GDDR6X than the almost nonexistent 5nm-to-4nm step between the cards.
For those who don't know, last year the 4090 was at a whopping low of $1,450 on Black Friday at multiple stores like Micro Center and Best Buy. That was definitely the time to buy it, and the only time to buy it and still feel good about the wallet drain haha. Price to performance at $1,450 feels a whole lot better now, knowing it's worth $2K.
Feels odd going from being able to rule out the CPU for the most part, to having to really consider it now. I remember when I only had to RMA 1 CPU maybe once a quarter? Even then it was probably the user's fault. We were building about a thousand PCs a year. I think 90%+ of our RMA's were motherboard, the rest were mostly hard drive or power supply.
Intel really needed Battlemage, and Battlemage being cut down into a minor in-between series before Celestial is going to hurt more. We might not even get Celestial, because they'll use poor Battlemage performance as the metric for a full Celestial series.
Living next to a quarry I can confirm that you need to get your limestone harvest in before the first frost or you'll be cooked. Limestone has the longest growing season of any of the stones.
Nvidia is definitely picking their customers now that they've successfully monopolized their markets. I think the 5090 being couched as "pro-sumer" is likely going to make the price $3k+ (and look at all these comments of people saying they would pay $5k+ no questions asked). As the middle class is being squashed and the consumer base is being split, many of these companies are finding that they can set whatever price they want once they've adequately captured their customers. I hope Lina Khan's FTC is taking a VERY close look at Nvidia.
Does that GPU market share report not include the embedded GPUs in things like the Steam Deck, ROG Ally, etc.? I imagine that would change the results significantly.
JPR calculates market share based on units shipped in that quarter. Intel's market share is at 0% because for the last 2 quarters Intel shipped extremely low quantities of Arc to the market. They'd rather not sell those cards at all than have to give crazy discounts on them (and take a loss because of it). Intel's market share won't improve until they release Battlemage.
Thanks for mentioning the microcode update! As a 13700K owner whose CPU *hasn't* had a meltdown yet, I've been waiting for updates on that story. I haven't updated my BIOS yet, though, since I remembered hearing rumors that the microcode update could negatively impact performance. I'm also pretty distrustful of betas, especially betas that are pushed out as a result of panicked development.
People have such short memories; they don't even remember this same guy made the same 600 W claim for the 4090 and it turned out to be 450 W... Why should anyone believe him now? Why would a 10,752-CUDA-core 5080 (400 W) need almost as much power as a 16,384-CUDA-core 4090 (450 W)? Does anyone really believe efficiency will go backwards? If it only gets 10% better efficiency, it should be able to match the 320 W 4080.
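A back-of-the-envelope check on that skepticism, using only the numbers quoted above (the 5080 core count and 400 W figure are rumors, and this ignores clock speed and memory differences entirely):

```python
# Watts per CUDA core, from the figures in the comment above.
cards = {
    "RTX 4090": (16384, 450),           # (CUDA cores, board power in watts)
    "RTX 5080 (rumored)": (10752, 400),
}

for name, (cores, watts) in cards.items():
    print(f"{name}: {watts / cores * 1000:.1f} mW per CUDA core")

# What 10752 cores would draw at the 4090's watts-per-core ratio.
cores_4090, watts_4090 = cards["RTX 4090"]
print(f"10752 cores at 4090 efficiency: ~{10752 * watts_4090 / cores_4090:.0f} W")

# Prints ~27.5 mW/core for the 4090, ~37.2 mW/core for the rumored 5080,
# and ~295 W for 10752 cores at the 4090's ratio -- which is the commenter's
# point: 400 W for that core count would be a per-core efficiency regression.
```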
Ironically, the 4090's sky-high price is still lower than what Nvidia wants for the not-Quadro GPUs for the same specs (and even for _lower_ specs), so I'd recommend bearing that in mind for that GPU's demand.
No one cares; the US is consumer central of the world lol. Am I in the wrong for getting a 3080? (Not scalped.) I'm looking forward to the 5080, depending on price to performance of course.
600 watts... did you know that's almost 1 HP (horsepower)? That's more power than an e-bike, and at current energy prices 😅 Seriously, who buys that Nvidia crap anyway?
People like you think food prices will go down and restaurant prices will go down dramatically too if their sales decrease a lot... inflation doesn't care, fool.
I love when you guys do the factory tours or the in-depth documentary videos! And I love that you guys take a neutral position on everything: there's absolutely no bias in any of your videos, your news, or your product coverage. You can be huge fans of Intel and Nvidia but you're also not afraid to call them on their shit whenever they pull it. And even LTT. Hopefully he didn't take it personally, since you called out what needed to be called out before he ended up imploding. I think you guys helped save him from himself by calling out all of the inconsistencies and issues that LTT had and that kept building up and building up, and I think they're actually a much better channel now after the fact, focusing more on quality and not the quantity of a new video every day no matter the format. I just hope to see LTT and GN together again.
It's ALL going to be a 20 series style pulling the wool over our eyes from here on out. As in "Woo-Hoo! Lookit this AMAZING 3% uplift over the previous generation!"
@@joelcarson4602 I'm praying that the 5080 is 10% better than the 4090 for cheaper. If so, I'm most definitely getting it. Also, the 4090 in my area is over 2,300 USD for the PNY model.
Worth adding that "ARM64-EC" (Emulation Compatible) refers to a Windows-on-ARM ABI (Application Binary Interface) that allows emulated x86-64 applications to use native ARM64 dependencies.
All AMD needs to do to get their market share back is release cards with much more VRAM, like they did 10 years ago. Everyone who runs AI locally just wants more VRAM capacity, to the point where independent card manufacturers are making a mint now.
Absolutely. Especially with the latest LLaMA versions that no longer have mid-size versions. You need at least 40 GB of RAM to run them with decent precision.
They already do have a fair bit of RAM vs. Nvidia, not that it makes much difference. I get what you're saying, but currently I have doubts about that changing a whole lot. If they wish to compete with Nvidia, they gotta sort out the performance. Not just competing with Nvidia, but surpassing them. Even then, there are too many stories about the drivers - I'm not even sure better performance alone would do it, people are still scared. Price would do a lot. They got greedy with the 7000 series, basing their pricing on Nvidia's silly prices rather than trying to go for market share with lower prices. AMD did exactly what Nvidia did, nearly doubling the prices of previous generations.
Unfortunately they are too slow on the AI side of things. They are integrating their workstation architecture into the gaming GPUs with UDNA, which will come after RDNA 4. It's a bit too late; I think the AI craze has died down. I haven't seen much AI stuff making headlines. AI video is still stuck in that slow-mo look. And the games aren't delivering either; they take too long to make or just aren't appealing. I don't think AMD can ever recover in the gaming GPU space.
@@JollyGiant19 I sort of feel it's a mix of scaling down manufacturing, which raises prices, and also that they're aiming to slowly sell their stock on the assumption that there's going to be a lack of 5090 and 5080 stock and therefore more demand for the 4090. It's still, as of right now, the top-of-the-line consumer GPU, so they still expect to get a premium for it.
They raised the price for the 3090s and then raised it again when the 4090 came out, so people just wouldn't keep buying the cheaper cards. The 3090 Ti was $1,200 at one point.
I appreciate the honest reporting... way too many people in this space just go along with the 600 W rumor. I don't believe it, because they want a lot of buyers, and that number would push a lot of buyers out of upgrading, since most people use an 850 W or lower PSU.
600W, $5000 AUD, a few more CUDA cores, bit more memory, bit more speed, maybe another new revolutionary feature like frame generation, ooh boy, how exciting
Valve working on Proton on Arm does open the door for a RISC-V handheld in the future. But Windows can use DXVK too; DXVK on Windows on Arm could support more working games than what Microsoft is currently offering.
RISC-V provides a unique opportunity because of its variable vector size. Companies are already shipping chips with native 1024-bit vector registers, which is interesting on its own.
Are the connector and the cable from a 4090 going to be compatible with the 5090 if it has a maximum of 600 W? Because if it has spikes over 600 W, it wouldn't be safe to use that 600 W cable, right?
Just wanted to add that Proton is built upon WINE (and DXVK, VKD3D...). So contrasting them as one being good and the other being bad doesn't make much sense. Those projects don't get the recognition they should, being hidden behind the common "Proton" name. Of course Valve, through direct contributions or subcontracting, improved them a ton.
Did not watch the video yet but here is my guess regarding the leak: The RTX 5090 can only be bought when bundled with a monthly NVIDIA subscription fee (for using DLSS & AI features).
15:18 A few months ago I told some gamers that I thought NVIDIA is a monopoly in discrete graphics cards, and graphics technologies (CUDA, upscaling, etc.) and they basically owned the entire gaming/AI market. Yet no one believed me or cared.
Nobody cared because the market is open to anyone who wants to bring a good product. Saying that just discounts one competitor, and the other still isn't trusted. But it's not a monopoly.
@@royboysoyboy WTF are you talking about? ASUS, Gigabyte, and MSI have both the latest AMD and Nvidia cards in production; there is clearly no such rule in their contracts with Nvidia.
@@LAndrewsChannel NVIDIA does not want partners to make Intel cards, a third competitor that makes graphics chips. As of today, only ASRock, Sparkle, and Acer make Intel graphics cards, and none of the manufacturers you mentioned make graphics cards from a third competitor, due to slimy, monopolistic business tactics from NVIDIA.
I'm totally willing to buy a 5090 after selling my 4090, but the idea that it could use two 12 pin power cables and take up that much space in the case is making me a little concerned about the feasibility. I went and got a fancy power supply that has the native 12 pin cable just for the 4090, but now I feel like it's already outdated.
Same; my PSU has a 16-pin socket and I won't be happy if I'm forced to get a new PSU already just for one more socket. It would be really scummy doing that in a single generation; the 5090 should only use one 16-pin and one 8-pin for safety.
those "market share" percentages are the new hardware sold quarter by quarter. edit: market share is not what's reflected in those charts but a market trend. and trends change. for example someone who bought the intel product outside the scope of this window is not included in the chart so it doesn't mean a lot tbh because people buy a GPU once every couple of years. it just means sales are worse right now.
@@mckinleyostvig7135 lol but that's just it. Those numbers actually mean nothing, because 88% of 100 units sold doesn't mean monopoly; it just means sales are terrible. And those charts are based on sales numbers, not the actual installed market share. It's probably not far off, but it's not 0% for Intel either, nor is it 12% for AMD.
Market share is a term that's well-defined and understood to be: "a company's total sales within a given period divided by the industry's total sales within that same period", not whatever you seem to believe it is.
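A minimal sketch of that definition in code, with made-up unit counts that mirror the 88% / 12% / 0% split being argued about in this thread (hypothetical vendors, not real shipment data):

```python
def market_share(units_by_vendor):
    """A vendor's unit sales divided by total industry unit sales in the same period, as a percentage."""
    total = sum(units_by_vendor.values())
    return {vendor: round(units / total * 100, 1) for vendor, units in units_by_vendor.items()}

# Hypothetical quarter with 100 units shipped in total.
print(market_share({"Vendor A": 88, "Vendor B": 12, "Vendor C": 0}))
# {'Vendor A': 88.0, 'Vendor B': 12.0, 'Vendor C': 0.0}
```

The point both sides of the thread are circling: the denominator is that quarter's shipments, not the installed base, so a vendor can show 0% in a quarter while plenty of its cards are still in use.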
The R&D costs are ludicrous for semiconductors, but why tf would they raise the prices at the end of its life??????? They already earned like 20x the R&D and manufacturing cost off the AI datacentre deals.
Thanks so much again for your cool news - always love to watch it to see what's going on in the IT world (which I'm a part of in a way, too). Just a short hint for the price comparisons: in Europe, all prices usually *include* VAT, whereas I've learned that in the US they usually *exclude* tax❕☝🏻🧐💁🏻♂ - Just f.y.i./a short reminder of that (I'm sure you probably know that already, but maybe some folks who are watching the channel for the first time don't)❣
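To make that comparison concrete, a tiny sketch; the 19% rate is Germany's standard VAT and the list price is made up, both chosen here purely for illustration:

```python
# US shelf prices typically exclude sales tax; EU shelf prices include VAT.
us_listed_price = 999.00   # hypothetical pre-tax US price
vat_rate = 0.19            # assumed German VAT rate for this example

eu_style_price = us_listed_price * (1 + vat_rate)
print(f"${us_listed_price:.2f} ex-tax is about ${eu_style_price:.2f} with 19% VAT included")
# $999.00 ex-tax is about $1188.81 with 19% VAT included
```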
Slowly moving toward the days where the top-end GPUs for gaming are only owned by rock stars, Saudi princes, and influencers. I can't wait for when, in 5 years, I get to see a YouTube star showing off his Bugatti GTX 10090.
Now I wanna see how that looks, 20:30, Steve mentioning the idea of ADR in videos. Just a random inserted voice recorded in the Hemianechoic chamber that's slightly different from the rest of the narration, but you can't put your finger on it exactly.
Depends on price - sell it for 600 USD and you would talk differently. But that's dreaming. It will beat the 4090, which costs, let's say, 1600 USD, so if they sell it cheaper... let's say 1200 USD, people will buy it. Unfortunately.
@@JordanJ01 You should try 4K gaming. Lots of titles already crack 10 GB. Two more years and it will be 15 GB. Then add on top that many games have slight memory leaks because the devs just can't be bothered - your 16 GB instantly fills up, performance drops, textures become washed out, and you have to restart the app.
16 GB of VRAM is pure planned obsolescence. Some games even now push 16 GB; I can't imagine someone paying this kind of money to get two-generations-old VRAM capacity.
The fab tour was extremely educational for us! Loved working on the video. Please try out the format and let us know what you think! ruclips.net/video/IUIh0fOUcrQ/видео.html
Or check out our Corsair i500 review: ruclips.net/video/Gqm4V-8F-7k/видео.html
Grab the factory tour t-shirt here: store.gamersnexus.net/products/skeletal-factory-t-shirt-foil-blue-silver
Can you please investigate Gigabyte's misleading practices regarding their AM5 motherboards? Specifically, the B650 Gaming X. They sell different revisions under the same model name, but the critical difference is that rev 1.3 only supports the Ryzen 7000 series, while rev 1.5 supports the 8000 and 9000 series. When purchasing these motherboards, Gigabyte doesn’t clearly mention the revision number or clarify that different revisions exist, even though they have vastly different compatibility. This lack of transparency is misleading to consumers, as the model names are identical, and buyers might not realize they're purchasing an older revision that limits future upgrades.
Why are they wasting time on ARM64? Their client is unskinnable.
Their newly added Families code is missing features.
And their moderation team varies from bots, to can't read, to can't understand context.
I have been unfairly banned... for 3 years, from gang stalking alone.
Meaning I cannot edit my account or share content on Steam now for 3 years... It's unfair and they're not even trying to look into it.
And their moderator is clearly off his bracket, and they're pissing around and playing with ARM64... what a waste!!!
people can barely afford 40XX hardware... O_O
That video was extremely well done. I'm glad they gave you access, even if it meant so many weeks editing and blurring out all that confidential stuff. 😅 Very cool to see the type of tech and machinery that gets used on operations of that scale.
Steam... no skinning... no moderators... people being gang stalked... Steam Families missing important features... why waste time on Proton?
4:46 The 5090 will indeed be a two-slot card. The two slots will be slot one and slot five.
hahahaha. Clever.
@@GamersNexus No, I think this is actually gonna be what it turns out to be
@@GamersNexus Maybe the "2-slot" leaks are just the server blower style 5090s?
The 4090 might have been slightly overbuilt so they could just be packing it full of fins and making it 2 slots, and accepting that it would run hot.
I hope they will add some anti-sag brackets. Otherwise northwestrepair will have lots of hard work in front of him fixing all the broken GPUs with rejected RMAs...
Proton still uses WINE. Proton is a containerized compatibility layer that includes WINE, DXVK, and VKD3D (among other compatibility prerequisites).
Awesome. Thanks for the background on this. We don't follow the Linux world very closely -- though I will say that I've been a lot more interested in it with Windows' terrible decisions the last few years. Maybe time to start studying and deploying it in testing!
@@GamersNexus Man, that would be absolutely amazing if you guys started doing Linux testing! Set up a solid Arch Linux bench (check out CachyOS for a stupid-simple way to get Arch up and running without having to do the whole default setup process from the AUR) and do the normal comparisons.
Who knows, it might make you fall in love with computing all over again!
@@GamersNexus Well, I think it's time for "the mainstream" to at least follow along with what happens in the Linux world.
Yes please @@GamersNexus
@@GamersNexus DXVK is a DirectX-to-Vulkan translation library. You can even use it on Windows to improve performance in older DX9/10/11 games.
NVIDIA should introduce rounded models so that it does not hurt them that much when putting it where it belongs...
But where does it bel--- ooooh. I get it.
Exactly! I don't want to scratch my case up with these massive cards.
Apply thermal paste beforehand. That helps. Watch that The Verge video to determine the required amount of thermal paste.
@@GeoStreber LMAOOOOO
That is savage my dude
Can you please investigate Gigabyte's misleading practices regarding their AM5 motherboards? Specifically, the B650 Gaming X. They sell different revisions under the same model name, but the critical difference is that rev 1.3 only supports the Ryzen 7000 series, while rev 1.5 supports the 8000 and 9000 series. When purchasing these motherboards, Gigabyte doesn’t clearly mention the revision number or clarify that different revisions exist, even though they have vastly different compatibility. This lack of transparency is misleading to consumers, as the model names are identical, and buyers might not realize they're purchasing an older revision that limits future upgrades.
Thanks for bringing this one to our attention. Will try to find time to look into it, but we are pretty backed up right now on investigations. Can't promise this one only because we really are overloaded to a point where we need to pick and choose. It's tough since a lot of the deep dives require some serious time commitment to prove irrefutably.
Is that board even sold still? The revisions should be separate SKUs, how is that misleading? Who did you buy it from, I'm assuming not Gigabyte directly. The current Gaming X AX and Gaming X AX V2 are always listed separately.
@@drewnewby first off, they’re talking ONLY about the Gaming X AX. Not the V2. Secondly, No one buys directly from gigabyte. Every retailer that lists that board does not list the revision. That’s the problem, and that’s gigabyte’s problem when they release 5 B650 Gaming X AX boards, all named the same, with extremely varied specs. And yes, the board is still sold now, and if you order on Amazon, you have no way of knowing which revision you get.
That seems very shady. It would be normal for an earlier revision to not support future chips without a BIOS update, but yeah, you're right: I just looked at their support site, and under CPU support the 9000 series isn't even listed unless you select rev 1.5! That's what the kids would call "SUS AF", I believe...
@@GamersNexus Thank you
As a daily Arc user since release... I won't lie,
it hurts to hear that.
You're in the top 0% though, that's something
@@P7ab Pretty sure that the 0% is in sales, not the whole market share. A market of dozens of millions doesn't simply swing 5% from X to Y in several months, when we know there are people still running 6-7 year old cards out there.
The good news is that in 20 years it'll probably be worth a ton of money as a collector's item that will be extremely rare and hard to find.
hang onto it, it will be a collectible ;)
@@samuelschwager I got 3 Arcs already xD
both the A750 and A770 in "LE"
and a Sparkle A580 xD
I'm still blown away by how much cache those X3D chips have. Like that's enough to fit old 90's games onto the CPU and never have to touch main memory. Hell, that's probably enough cache to run Windows 9x without main memory!
The minimum requirement for Windows 95 is 4 MB, and 16 MB for 98/98SE; you could run dozens of copies of the former and a few of the latter entirely in cache. And yes, you could run games up into the late '90s on that much cache.
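Rough numbers for that claim, as a toy calculation. It assumes the 96 MB of L3 on a 7800X3D, which is not stated in the comment; the Windows minimums are the ones quoted above:

```python
cache_mb = 96        # assumed: total L3 on a 7800X3D, added for illustration
win95_min_mb = 4     # minimum RAM for Windows 95, per the comment
win98_min_mb = 16    # minimum RAM for Windows 98/98SE, per the comment

print(cache_mb // win95_min_mb, "minimum-spec Windows 95 footprints fit in cache")  # 24
print(cache_mb // win98_min_mb, "minimum-spec Windows 98 footprints fit in cache")  # 6
```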
Good luck getting that to work though lol
@@HolyRaincloud You could run FreeDOS or boot Linux to auto-start DOSEMU. Some distros still support rc.local, so you don't have to make a systemd service.
And it's STILL beneficial to have more for some modern games. It's incredible how much of a difference cache size makes for gaming performance. I mean, the 7800X3D is actually a fairly low-performance processor, MUCH weaker than Intel CPUs. But that V-Cache has it running LAPS around anything Intel has. It's the absolute undisputed KING for gaming right now in 90% of games.
@@johnmclain250 I have one; it's amazing for CPU-dependent games/sims. I kept my 5800X3D system together too, for the nostalgia.
9:25 One important additional piece of information on this topic... FE cards are not officially sold in Germany. I tried to get one at the launch of the 40 series. Not a single online retailer was listing them.
Saying Proton is not a second-class citizen vs. Wine is misleading. Proton *is* Wine, just containerized and packaged in a user-friendly way. And while Valve has contributed to the overall ecosystem, if anyone should be given credit for the Proton 3D performance it should be people like Doitsujin who wrote the absolutely critical DXVK software for translating DirectX to Vulkan.
Don't forget that they also add a huge list of patches on top of WINE, some of which will never be included upstream.
@@Henrik_Holst well that's awful, especially since on a long enough timeline everybody is going to lose their Steam library
@@BOZ_11 I have no idea what you are talking about, care to elaborate? Everything in Proton is open source, so the fact that not all patches are upstream has zero impact on the future. Anyone can patch their own WINE, and there are lots of other projects that apply the same patches to WINE and also apply their own patches, not in Proton, on top of that.
@@BOZ_11 It's in the Steam ToS that Valve will give people the ability to download their games to personal storage if Steam ever ceases functioning. Please read the Steam Terms of Service before you speak about things like this.
@@Mr.Genesis People who think any given ToS can always be legally enforced should read more before speaking about such things.
Cool, I am one of the zero percent market share with my Arc A580 :)
Same good cards to pass these dark ages tbh
Yep I also don't exist with my A770 over here :))
Sparkle A770 here!!
same with my A770, pretty happy with it almost 2 years on now.
o7
28:52 Motherboard seller, I am going into battle and I need your STRONGEST motherboards!
Noooo, travelah, my motherboards are _too strong_ for you!
You can't handle my motherboards. They're too strong for you.
Just use the power of friendship.
They're designed to take *most* of the weight of the 5090... Most.😏
@@leec3881 Let us all be reminded of that one clip of an exec holding up a huge GPU with an ITX mobo and watching it fall out of the slot 🙏
We don't just get better frame pacing with Proton; we straight up do not have shader compilation stuttering at all.
FYI, Proton is Wine. It's Wine plus two extension packages called DXVK and VKD3D-Proton. DXVK handles DirectX 9-11 call translation to Vulkan and VKD3D-Proton handles DirectX 12 translation to Vulkan. Valve's major innovation is those two additional packages.
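To visualize the split that comment describes, here's a toy sketch of which component covers which API; this is only an illustration, not Proton's actual dispatch code:

```python
# Hypothetical routing table for a Proton-style stack, per the comment above:
# DXVK covers Direct3D 9-11, VKD3D-Proton covers Direct3D 12, both targeting Vulkan.
def translation_layer(d3d_version: int) -> str:
    if 9 <= d3d_version <= 11:
        return "DXVK (Direct3D 9-11 -> Vulkan)"
    if d3d_version == 12:
        return "VKD3D-Proton (Direct3D 12 -> Vulkan)"
    return "Wine's own implementations / other layers"

for version in (9, 11, 12):
    print(f"Direct3D {version}: {translation_layer(version)}")
```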
Knowing how companies work, having Valve's/Steam's name on them also really helps solve problems, especially since they can throw their weight around to solve odd problems that are hard to troubleshoot without deeper knowledge about individual games.
HIKE? My 1080 is never getting upgraded bro 😭
Lmao I just went from a 7700k to a 5800x3d, 16 to 32gb ram (ddr3 to 4) and a new case and drives for less than a new video card would cost 😂
Just get a 3080; it can go for about 400 dollars sometimes and it's basically the same with VRAM.
How I feel with my i7-8700k and my 1070 that I have to underclock to keep stable lmao. But Planet Coaster 2 is coming out in like a month sooooooo...
@KrautessendeKartoffel My GTX 1080 died (still boots but crashes randomly under high load). Rip.
Used 3080ti bro. I’ve seen them on eBay as low as 450$.
Oh boy, i can't wait for the breakthrough where they make ai-ram and ai-rgb controllers
I'm not really an Intel fan, but it's sad to see the GPU side fall even more. The more competition between Nvidia, AMD, and Intel, the better.
Realistically, Intel would only be competition to AMD and fighting for the crumbs. Even though there are - admittedly less good but still viable - alternatives, people have been advocating for vendor-lock-in tech made by Nvidia (DLSS, CUDA, ...) without a second thought of the long-term effects. Other companies now simply don't have the money to keep up with Nvidia's R&D while the latter can ask ludicrous prices for even average GPUs with comparatively little VRAM.
I purchased an Arc GPU and have been happy, after I had to replace an RX 6600. People have short memories. During the lockdowns and shipping issues, Intel stepped up and brought out a GPU while Nvidia and AMD were overpricing. LinusTT had a good point about people supporting the third-party candidate, but if Intel doesn't step up it will be another AMD vs Nvidia.
NVIDIA has basically created a new industry. Jensen deserves TIME magazine person of all time and existence cover photo.
Arc was interesting, but when my brother wanted me to build a PC for him,
I chose Nvidia. I'm not familiar with Arc and I don't have the patience to troubleshoot a device that isn't mine.
Nvidia will take care of them.
No one is competing against Nvidia. AMD can't touch them, let alone Intel.
Remember when things used to get cheaper after initial release? Pepperidge farms remembers. *SIGH* It's gonna be a long long time before I get an upgrade
The $200 to $500 range is decently competitive. Depending on what you currently have, there might be a good value upgrade.
@@Pan_Z Right now I'm running a 5700X with 32GB 3600 and a 3060. My set up works great as is, I just like to tinker and build for fun. Unfortunately current prices don't make it practical to upgrade.
@@schmutz1g I don't see why you'd need to upgrade that. I'm running something similar (i7-12650H, 3060 6GB, 16GB DDR5) and it runs even new games just fine (as in, stable 60 frames). Obviously there's room for improvement, but I feel like there's little actual reason to upgrade unless you really have money to burn.
@@haarex. It has nothing to do with NEED. It's my hobby, so I enjoy spending money on it regardless of need. With current pricing though, any meaningful upgrade just doesn't make sense at this point since there isn't a ton of performance to be gained.
@@schmutz1g Yes there is, you sound dumb. There are loads of games the 4090 cannot play at 4K 144 Hz maxed out, which is the new standard for 4K TVs.
What people need to understand with regard to Nvidia's price/performance is that this is no longer 2005; they don't get their profits from gamers now. They can charge silly money a) because we'll pay it, mostly and b) because their high-margin stuff is way more important to them. Things are not going back to 'old pricing' any time soon.
Not without competition
@@Splarkszter Came here to say this. Until AMD, Intel, or some other company is a real threat at the highest end / machine learning / AI stuff, the market won't shift.
I thought this years and years back, but the PC market is not enthusiasts. It's a bunch of people that pretend they are enthusiasts but are the equivalent of a 'car expert' that will only ever buy one brand of car, or a 'phone lover' that only buys iPhones haha. It's really weird to me that the wealthier ones that upgrade every year, sometimes even more often, and could easily try other brands just for fun (something enthusiasts would normally want to do, warts and all, for the experience and enjoyment of trying tech) just don't do that.
They're just Nvidia fanboys; they're not tech enthusiasts so much as tech snobs, and they refuse to try other stuff even if it's priced fairly for the performance, etc. I didn't rush to this conclusion, but year after year it feels like it's proven by the unshifting obsession with one graphics card brand, even after they constantly put out overpriced cards and put the industry into a terrible state.
@@yousigiltube I had a guy have a meltdown on Reddit, to the point that he deleted all his comments and reported me to Reddit Cares, because I said the 4090 is not good value even though it's the best performance money can buy.
Price to performance seemed to be a foreign concept to him.
You are correct. Especially with AMD giving up on the high end, the only way Nvidia would ever drop prices is if AMD released a GPU with 120% of the performance for 3/4 to 1/2 the price, which will not be happening now. Get ready for higher prices, because Nvidia has no competition at all now 😢
5090 isn't the model, it's the price... People keep getting it confused... god, I'm rooting for Intel harder than ever.
*Sorry, but Intel's GPUs are already GAME OVER! Battlemage will be their next big disaster!* And they will give up after that one!
@@gertjanvandermeij4265 Intel sleeper agents at this time of the year?
Good meme. 10/10. No comments
@@gertjanvandermeij4265 Bro likes monopolies
@@gertjanvandermeij4265 Pretty sure you're right, but much like when I buy a lotto ticket, I dream of something better.
My 3080 died and I needed a GPU for video editing/class. Went for an A770… I love it. No, not as good at gaming, but good enough. Good gaming performance for what I play, excellent Adobe CC performance, outstanding value.
How did the 3080 die??
@@peterers3 A lot of the 30 series used awful thermal pads to save a few cents, which causes the memory to run at over 100-120°C. Way higher than allowed per Samsung's memory spec. MSI was very much guilty of this. I had to mod mine to make it run cool enough. Yes, the core temps were fine (which is what you see in overlays), but the memory was silently being destroyed. I live in Germany, so it's not like it's a very hot country (although summers can get up to 40°C...).
@@peterers3 Good question, not sure. Basically my computer would crash at random points, often during class or when I was using Photoshop, let alone games. My computer worked fine running on integrated graphics. I got into Arc because I need to edit video, and it's been completely stable. I sent my 3080 back to MSI for repair. They sent it back and it was still broken.
@@LeegallyBliindLOL I'm not a gamer, but I bought a bunch of 3080s for mining back in 2021, and the first thing I did was replace the thermal pads with top-quality ones. The temps dropped dramatically.
5080 $1699, 5090 $2399, just wait.
The more you buy, the more you save 😂
They won't sell many if the VRAM capacities haven't gone up. Gamers don't have the money to buy those in quantity, and AI training demand won't be there with relatively low VRAM.
I think they'll keep the 5080 at $1000. Judging by those specs it's probably going to be doing half the work of the 5090 which I suspect will be $2000. The 4080 at $1200 was a flop last time around because it was 50% less performance for only a few hundred dollars less. They need to keep that price gap bigger this time if they want to keep the performance gap so big between cards.
5080 $1499, 5090 $2499 (after that... the 24GB 5080 Ti at $1899)
The leather jacket in each presentation is a different one; it may not look like it, but it is. Those aren't gonna pay for themselves, and the jacket guy is looking at our pockets to milk them!!!
People are reading way too much into Arc's current 0% sales numbers. This doesn't have to be indicative of crashing consumer interest in Arc. I'd argue it's not at all, and the falling sales numbers are to be expected. Arc always appealed to a particular kind of buyer. You can't just walk into a Best Buy and get an Arc card-- they're not on the shelf. People buying Arc knew what they were looking for in the first place, because they're people plugged into the hardware news cycle. They're the early adopters, and just about everyone who was going to buy one already bought theirs either on release, or when Arc rolled out driver updates for the game they wanted to play.
So yeah, when Battlemage is released, that'll be the real testament to how much enthusiasm there is for Arc.
Really hope that the rumored v-cache on both CCDs comes true
Agreed. From what I saw, their "solution" for having the system decide which CCD to use left something to be desired. They needed something similar back in the Bulldozer days, and I wasn't impressed then; sure, it helped my 9120, but it was marginal and I still suffered from random stutters, and their scheduling update only made them less frequent. I refuse to spend 9950X3D money for a solution that is only "less annoying".
The final AM4 CPU being a 5950X3D with double v-cache would be a dream come true for me
I mean then you have 2 downclocked CCDs. If you want gaming, go for the single die imo.
@happygster922 The other issue with this is that basically no games are multithreaded enough to support a 16-core X3D at the moment; pretty much every title threads out to 8 cores.
@@happygster922 It's not that simple. I want both. I don't just play games with my PC. I'll take two slightly lower-clocked CCDs over this hybrid approach that oftentimes doesn't work the way it should.
I don't understand the people saying the 5090 leak can't possibly be real because the core counts are too high for a generational increase. Does no one remember every previous Nvidia halo-card spec increase? Titan RTX to RTX 3090 increased core counts by well over 100% (4608 to 10496), RTX 3090 to RTX 4090 was roughly a 60% increase in CUDA cores (10496 to 16384), and RTX 4090 to this rumored RTX 5090 is a 33% increase in cores (16384 to 21760). If anything, Nvidia is actually getting stingier, roughly halving the generational core increase they give halo cards each generation going back to the RTX 2000 series: 2000 to 3000 was about 100%, 3000 to 4000 only about 60%, and 4000 to 5000 drops to roughly 30% (quick math check after this comment). The 6090 will likely only have 15-20% more cores than the 5090 if it keeps tracking this way, with the price justified by new DLSS and frame gen that allow a 2x increase over the 4090 in 4K gaming. Going from 3090 to 4090, they took away a huge part of the prior generation's 2x hardware improvement and instead gave back about 40% of that performance with DLSS + Frame Gen, and people were still more than happy to shell out two grand for a GPU. They'll do it again.
My bet is Nvidia cuts their 5090 production costs by another 25% over the 4090 and gives even less hardware horsepower improvement, and I bet the $1,599 4090 FE turns into a liquid-cooled 5090 FE that costs a minimum of $1,999, or as much as $2,499. There's no way they're going to sell consumer GPUs to us cheaply when those same GPU dies can be sold at triple or quadruple the profit margin to datacenters desperate for any AI hardware.
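For anyone who wants to check the core-count percentages quoted above, here is a minimal sketch (Python; the core counts are the figures cited in the comment above, including the rumored 5090 number, not independently verified specs):

# Core counts as quoted in the comment above (the 5090 figure is a rumor).
halo_cards = [
    ("Titan RTX", 4608),
    ("RTX 3090", 10496),
    ("RTX 4090", 16384),
    ("RTX 5090 (rumored)", 21760),
]
for (prev_name, prev_cores), (name, cores) in zip(halo_cards, halo_cards[1:]):
    print(f"{prev_name} -> {name}: {(cores / prev_cores - 1) * 100:.0f}% more cores")
# Prints roughly 128%, 56%, and 33% -- consistent with the shrinking jumps described above.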
It's not worth explaining that they get less for more money. If somebody pays the leather jacket's exclusive margin (about 60%, btw) just to play games on that brick, it is purely for the feeling that he is something more than others because he can afford it. I could purchase a 4090 every month, but I'd have to be mentally ill to buy a card that is barely worth 50% of its price.
"A I DONT KNOW WHAT ANY OF THAT MEANS" xD got me going
The Valve/ARM news, I imagine, would be more realistic for their VR/Deckard projects; stuffing ARM tech in a headset is pretty much what everyone does anyway.
The elephants Steve, the elephants…
Just you wait until they finally do away with weight measurements on GPUs and replace them with fractions of one elephant.
@@GamersNexus with how much GPUs have grown in size lately I don’t think we’ll be measuring in fractions for long
@@GamersNexus Milliphant, microphant, nanophant, ...
We all need to remember the prototype 4090. ruclips.net/video/0frNP0qzxQc/видео.html
@@procedupixel213 😂😂😂
I have an A770 and am loving it. Clocks really well, works on all games I've tried, and sips power compared to both the Nvidia and AMD equivalent-performance models I have. Also gets driver updates extremely regularly.
Good for you, but Intel has a lot of problems that Arc is affected by, even though Arc is good now...
@@notrixamoris3318 The operative word is "had", not "has".
Seriously, I have yet to find a game where I have any issues left. In both performance and compatibility, Intel pulled off an astronomical improvement in the past year, and now it's on par with the driver stability and compatibility you'd expect from AMD and Nvidia.
@@notrixamoris3318 It's because of the architecture
It does not sip power. The idle is at least 60W from the wall.
@@ZestyLemonSauce Idle is 40ish max, and my GPU is rarely ever idle, so that bit doesn't matter to me; its 100% load power usage is lower than both the 6700 XT and the 3060.
Besides, you can get it down to 8W idle power if you set your monitor to run at 60Hz when not in a game and follow their PCIe low-power settings recommendations.
Which seems to be a peculiarity in their VBIOS or drivers that they still need to fix. The PCIe low-power settings do work, but only if you set your monitor to 60Hz. Going higher than that instantly brings it back to 40ish.
28:47 Huge missed opportunity to call the board Godl Ai ke.
Honestly, I hope the public perception of Arc vastly improves with this next generation, especially if they keep trying to fill the mid-to-low-tier GPU hole that NVIDIA is too high-class (or whatever) to bother with.
I love my Arc A750, but that is under Linux. Everything just works. Just not having to deal with Nvidia is amazing. I even play some Minecraft 😂
It won’t, but they’re still kinda cool. Not as a gaming daily driver but I’m still down.
Two days ago I built my PC with an A770, and it feels okay for games and productivity.
I doubt Intel really wants to fill that hole. This is why Intel's market share is at 0% right now. For the last two quarters they only shipped extremely low quantities of Arc to the market. For Intel it is better not to sell them than to have to sell those cards at a very, very low price, to the point of selling at a loss.
@@arenzricodexd4409 this is a terrible take. Like I mean nonsensically bad.
If the 5090 is going to be a professional grade card, not a gaming focused card, I wish they'd remove it from the naming scheme of the gaming cards.
Those 5090 leaks are giving me R9 295X2 flashbacks.
What a card that was ... I can still smell the heat 😃
🎵 Chestnuts roasting on an open fire! 🎶
Valve working on ARM64 support for Proton also means good news for macOS devices running on Apple Silicon.
Wine already works on MacOS, so it's not a big leap for Valve. It would certainly please us Mac users. Linux is soooo far ahead for gaming right now, but I want games on all my computers.
Ever since Apple made it impossible to run 32bit applications, Valve has pretty much given up on supporting the platform. They make sure the Steam client runs, but that's about as far as they go. They haven't even bothered to update games like TF2 on Mac. It simply doesn't run anymore. I could play it on my 2011 MacBook Pro, yet it doesn't launch on my M2 MacBook Pro. It's so much faster, but it can play so much less.
Valve puts a lot of time and effort into making games more compatible, only for Apple to break compatibility entirely. That's maybe fine for productivity software where you always run the latest version, but a disaster for games since their active support only lasts for so long. Apple doesn't understand (PC) gaming, and they never will. Every once in a while they'll parade a few games whose ports they funded, but other than that macOS is really not a platform worth buying into either for gamers or game developers... Valve tried, and all they got was a middle finger...
It’s kind of a double-edged sword. Dropping support for 32-bit apps screws over some older games*, but keeping support slows down your machine while further complicating future progress (e.g. the transition to Apple Silicon/ARM64). There’s no ideal solution. That said, it’s long been the case that if you’re really wanting to game you’re better off getting a console or a PC, though I guess we’ll see how Apple’s current push in gaming goes. I doubt it will do much, and I say that as a life long Mac user
* though given Apple hadn’t sold 32-bit Macs since 2009 and macOS hadn’t supported 32-bit hardware since 2011 there was little reason for more modern games to still be 32-bit only
@@pilkycrc As a software engineer I really do have to debunk that argument.
There really isn't a good reason to forcibly drop it, not for a company that size. All 64 bit x86 processors are fully backwards compatible with 32 bit instructions. It's literally native code. You're not slowing down anything there.
So then you have the OS level. You need to have a translation layer between the 32bit applications and your 64bit OS libraries. The thing is, Apple had this. Microsoft also has been using this since 2005 at the very latest. It's absolutely fine to not bring new features to the compatibility layer, but there is no reason for them to simply remove the existing functionality. Sure it takes some engineering time to keep it going. But Apple is a huge company with tens of thousands of engineers. It's a huge middle finger to their community of users to not spend just a few resources on keeping that old compatibility layer functional.
I'm sure the move to ARM has something to do with it. But even then... Microsoft's much inferior x86->ARM compatibility layer supports 32 bit apps. Rosetta by comparison is a much better compatibility layer. They just didn't bother.
FYI, I can guarantee you they will pull the same sh!t again in a few years. In a few updates they will drop support for Rosetta and you'll lose access to ALL mac software written before ~2020. They did it with their PowerPC translation layer, they'll do it with the x86 translation layer. They have been doing crap like this for decades. Yeah it takes extra work, but that's nothing compared to the millions of hours of development work spent on software that will simply no longer work. For all its flaws, at least Microsoft understands the value of that...
@@Niosus As a fellow software engineer (who’s been building software for the Mac for 20 years now) I do have to partially disagree. There are several good reason to drop support that are specific to the Mac, its history, and various technical decisions.
The first is due to how Apple ships its OSes. As they have been through a lot of transitions they are well versed in the concept of FAT binaries (also known as Universal Binaries). Rather than having separate downloads for different architectures, you have a single download that contains them all (during the Intel transition this led to 4-way binaries for PPC, PPC64, x86, and x64; a rough sketch of the fat-header layout appears after this comment). This also extends to the OS and all its libraries. This makes things much easier for the user as they don't need to care about their architecture. You could have macOS on an external drive with universal apps and boot a Mac of any supported architecture from it entirely natively (something Windows couldn't do, and I believe still can't).
The downside is you have duplicates of all the binaries. In macOS 10.14 (the last version to support 32 bit apps) this took up about 500MB of disk space. It also takes up RAM. When a library is first used by an app, macOS loads it into memory. To save on RAM a lot of these OS-level libraries are shared (so you don’t have a version per app). However, they are never **unloaded** from memory. So if you launch an app by accident and quit it immediately it will load all the libraries it needs and leave them in RAM until you reboot. This can waste a lot of valuable RAM (especially given Apple’s stingy configs, but that’s another problem 😅). This isn’t an issue for 64 bit stuff as most Mac software was already 64 bit. But it meant those few 32 bit apps had an additional burden.
Then there is the even more macOS specific issue of the Objective-C runtime. Obj-C is the language most high level macOS (and previously NeXTSTEP) APIs were built in (and to a large degree still are). Unfortunately Obj-C was created in the early 80s (it’s even older than C++) and the runtime had a few issues that made adding new features to existing classes very problematic. With the 64-bit transition Apple could introduce a new runtime that fixed all these issues. The problem is, they needed to keep supporting 32-bit and so the old runtime. This added a LOT of engineering complexity to do pretty basic stuff, leading to features being slower to add and more prone to bugs.
Apple does have a lot of engineers, but I think many in the industry would be surprised at how small their teams actually are. They’re often a fraction of the size of similar teams at the likes of Microsoft or Google. This lets them be a bit more nimble, but has the downside of not having as many resources to manage the growth in complexity that super long term backwards compatibility requires. From the perspective of providing better and more efficient software going forward, then dropping the legacy Obj-C runtime was a pretty big thing that would benefit the entire company, especially as it ONLY affected the Mac, with iPhone, iPad, etc only ever having the modern runtime. In fact that leads to another point which is bringing iPhone and iPad apps to the Mac is also made a lot easier by dropping 32-bit support as the frameworks there never had to deal with the legacy runtime, but would have to on the Mac.
And then you have the ARM transition, which again plays into this. By dropping 32-bit support Apple can focus their resources entirely on x64 -> ARM64 translation. This reduces the complexity of Rosetta and allows them to optimise for one architecture switch. Given they also added CPU instructions to Apple Silicon to speed up translation of x64 apps, it wouldn't surprise me if that played a role in encouraging them to drop 32-bit.
So in a nutshell, keeping 32-bit compatibility is a lot more complex a topic than most people would think and requires a lot more resources than one would expect. To be frank, it’s a miracle that Microsoft have kept it for so long (I know there are people at MS who frequently argue for dropping compatibility for older hardware and software). And in Apple’s case it’s not like it was a sudden or unexpected transition. As you’ve said, Apple has history of dumping old tech and architectures, but it usually takes them a while to do so. Apple’s first 64-bit capable Mac was released in 2003. macOS was fully 64-bit by 2007, the same year they last shipped 32-bit hardware. They dropped support for 32-bit hardware in macOS in 2011 and then spent 2017-2019 warning that 32 bit apps would no longer be supported before doing so in 2019. So that’s anywhere from 8 to 16 years of notice given to devs. Any software released during that time frame probably should have been built as 64 bit already (sadly I suspect the reason that some of it, especially games, wasn’t built that way is the devs were new to the Apple platform in that time and didn’t know to expect this).
But yeah… ultimately there are pros and cons to each approach. Windows is legendary for backwards compatibility and lets you run pretty ancient software pretty well. This is a huge boon to people who enjoy older games or want to preserve older software without resorting to older machines. The downside is that it requires an increasing amount of engineering resources to pull off and limits your flexibility and nimbleness in moving forward. Having more limited backwards compatibility is why Apple has managed to pull off 5 architecture transitions (68k -> PPC, PPC -> PPC64, PPC -> x86, x86 -> x64, x64 -> ARM64) with incredible success in the time MS has struggled to pull off 2 (x86 -> x64, x64 -> ARM64). There’s no right or wrong answer here, which is why it’s good to have multiple computing platforms with differing priorities (even if that does mean you need to fork out for each of them separately to get both sets of benefits)
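A rough illustration of the universal (fat) binary idea described above: a minimal Python sketch that reads the classic 32-bit fat header and lists the architecture slices packed into one file (assumes the old FAT_MAGIC layout; thin binaries and the 64-bit fat format are not handled).

import struct, sys

# Minimal sketch: list the architecture slices in a Mach-O "fat"/universal binary.
# Only the classic 32-bit fat header (magic 0xCAFEBABE) is handled here.
CPU_TYPES = {
    7: "i386", 0x01000007: "x86_64",
    12: "arm", 0x0100000C: "arm64",
    18: "ppc", 0x01000012: "ppc64",
}

def list_fat_archs(path):
    with open(path, "rb") as f:
        magic, nfat_arch = struct.unpack(">II", f.read(8))
        if magic != 0xCAFEBABE:
            print("not a (32-bit) fat binary")
            return
        for _ in range(nfat_arch):
            cputype, _subtype, offset, size, _align = struct.unpack(">iiIII", f.read(20))
            print(f"{CPU_TYPES.get(cputype, hex(cputype))}: {size} bytes at offset {offset}")

if __name__ == "__main__":
    list_fat_archs(sys.argv[1])  # pass the path of a universal app binary on macOS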
6:51 5080 is going to be such a lame "upgrade" over the 4080 (super). Almost no increase in core count and going from TSMC 5nm to 4nm will barely improve the clockspeed. Fully expecting 1200 USD or higher pricing too 💸
Upgrading from a 3080 to a 5080, would the uplift be worth the cost? Probably can't be answered yet given the lack of information.
Sadly, I'm upgrading from a 2080 Super and would have loved a bigger node jump, but it will still be a sizable upgrade because of GDDR7 over GDDR6X, rather than the almost nonexistent 5nm-to-4nm improvement.
@@deadpool790 If you have a 2080 Super, even a 4070 Super would be a massive upgrade. Stop being delusional.
Just wait for 6080
A 256-bit bus always feels like an insult.
Probably most people bought the low-end Intels for the AV1 hardware encoding feature, which is now becoming more common among AMD and Nvidia GPUs.
That mug to the camera after saying Valve is picking up steam. Perfection.
3:32 Thanks Steve.
For those who don't know, last year the 4090 was at a whopping low of $1,450 on Black Friday at multiple stores like Micro Center and Best Buy. That was definitely the time to buy it, and the only time to buy it and still feel good about the wallet drain haha. Price to performance at $1,450 feels a whole lot better now, knowing it goes for $2k.
I got lucky and built my once in a decade PC then. I hope it lasts!
Feels odd going from being able to rule out the CPU for the most part to having to really consider it now. I remember when I only had to RMA one CPU maybe once a quarter? Even then it was probably the user's fault. We were building about a thousand PCs a year. I think 90%+ of our RMAs were motherboards; the rest were mostly hard drives or power supplies.
Tell me more. How many Gigabyte motherboards?
@@louisfreema5905 Soooooo many gigabyte boards... >.< Soooo many leaking caps.... I have too much trauma to buy the brand anymore.
The 5090 will be a 2 slot card. It will just be twice as tall and have an external 30CM pedestal fan cooling it
Man I wish I could grow my hair out like Steve's lol
Lines like at 27:07 are exactly why I keep coming back. You don't just provide top tier news coverage, you also provide laughs. Thanks Steve.
back to you, Steve!
*Pretty sure Ngreedia left room for a 5080 Ti with 24GB!* Because just 16GB on the 5080 doesn't make sense!
Yes, but it will be called 5080 AI.
Enough room for 5080 Ti, Super and Ti Super.
it does, just like the 4080, so it runs out of VRAM in a year or so when people run the latest over-hyped RT and need to blurLSS it
Doubt. RTX 4080Ti also didn't happen.
@@simoSLJ89 AITI Super -> ATI Super -> ATI. The time has come, the circle will become complete.
Two-slot card as in the PCB/power section alone is 2 slots; aftermarket manufacturers adding a cooler adds another 2-3 slots.
Intel really needed Battlemage, and Battlemage being cut down into a minor series on the way to Celestial is going to hurt more. We might not even get Celestial, because they'll use poor Battlemage performance as a metric against a full Celestial series.
Living next to a quarry I can confirm that you need to get your limestone harvest in before the first frost or you'll be cooked. Limestone has the longest growing season of any of the stones.
The Green solder mat looks dope in the shot! Glad I bought one. It's been great, amazing quality and always glad to support your efforts GN!!!
Nvidia is definitely picking their customers now that they've successfully monopolized their markets. I think the 5090 being couched as "pro-sumer" is likely going to make the price $3k+ (and look at all these comments of people saying they would pay $5k+ no questions asked). As the middle class is being squashed and the consumer base is being split, many of these companies are finding that they can set whatever price they want once they've adequately captured their customers. I hope Lina Khan's FTC is taking a VERY close look at Nvidia.
I hope the 4090 and 5090 stop being shown in gaming benchmarks as if they were gaming cards...
@@reav3rtm Nah, I'd buy it and post the benchmarks.
Does that GPU market share report not include the embedded GPUs in things like the Steam Deck, ROG Ally, etc.? I imagine that would change the results significantly.
JPR calculates market share based on units shipped in that quarter. Intel's market share is at 0% because for the last two quarters Intel shipped extremely low quantities of Arc to the market. They would rather not sell those cards than have to give crazy discounts on them (and take a loss because of it). Intel's market share won't improve until they release Battlemage to the market.
I appreciate the timestamps & chapters!
600 Watts?! That’s absolutely insane!
2000 Watt PSUs 😊
Yeah, that will make your living room hot; it's totally unusable.
I mean, the 4090 was 600W as well
Amazing video, i thoroughly enjoyed it, thank you Stephen, thank you Gamer's Nexus. :)
I miss having a hit like the 1080 Ti in the "RTX" era. Probably one of the best releases ever. If only NVIDIA could reproduce it with the 50 series...
That's a mistake they're never going to repeat again, absolutely legendary card
They always could have reproduced it, they don't want to and they don't have to (no competition)
Thanks for mentioning the microcode update! As a 13700K owner whose CPU *hasn't* had a meltdown yet, I've been waiting for updates on that story. I haven't updated my BIOS yet, though, since I remembered hearing rumors that the microcode update could negatively impact performance. I'm also pretty distrustful of betas, especially betas that are pushed out as a result of panicked development.
Same here, waiting for someone to properly test it so I can update my BIOS.
I can confirm we are working on the phantom chipset.
I LIKE YOUR FACTORY TOUR VIDEOS. MORE FACTORY TOURS. XD
The Intel fab tour also helps a lot with your Tier 6 builds in Satisfactory.
28:54 That X870E Godlike power connector side does look pretty interesting though. Having all connectors come out one side.
People have such short memories they don't even remember this same guy made the same 600W claim for the 4090, and it turned out to be 450W... Why should anyone believe him now?
Why would 10752 CUDA cores on a 5080 (400W) need almost as much power as the 16384 CUDA cores on a 4090 (450W)? Does anyone really believe efficiency will go backwards? If it only gets 10% better efficiency, it should be able to match the 320W 4080.
yeah kopite is a poor source. he's 50:50 lmao
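For what it's worth, the watts-per-core comparison above is easy to put numbers on; a rough Python sketch using the figures quoted in these comments plus the 4080's 9728-core count, ignoring clocks, memory, and node differences:

# Rough watts-per-CUDA-core comparison; the 5080 numbers are the rumor discussed above.
# This ignores clock speeds, the memory subsystem, and node changes, so treat it
# strictly as a ballpark sanity check, not an efficiency measurement.
cards = {
    "RTX 4090": (16384, 450),
    "RTX 4080": (9728, 320),
    "RTX 5080 (rumored)": (10752, 400),
}
for name, (cores, watts) in cards.items():
    print(f"{name}: {watts / cores * 1000:.1f} mW per CUDA core")
# ~27.5, ~32.9, and ~37.2 mW/core respectively -- which is why the 400W figure looks odd.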
The ARM64 support is likely for their upcoming VR headset which is rumoured to be using an ARM processor and some version of SteamOS
Remember, US consumers play a big role in why these prices are so high; those who keep paying the absurd asking prices enable it.
Ironically, the 4090's sky-high price is still lower than what Nvidia wants for the not-Quadro GPUs for the same specs (and even for _lower_ specs), so I'd recommend bearing that in mind for that GPU's demand.
No one cares, the US is the consumer central of the world lol. Am I in the wrong for getting a 3080? (Not scalped.) I'm looking forward to the 5080, depending on price to performance of course.
600 WATTS... did you know that's almost 1 HP (horsepower)? So more power than an e-bike, and that at current gas prices 😅 Seriously, who buys that Nvidia crap anyway?
People like you think food prices will go down and restaurant prices will drop dramatically too if their sales decrease a lot... inflation doesn't care, fool.
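On the horsepower comparison a couple of comments up, the conversion is simple (1 mechanical horsepower is about 745.7 W), so a quick check:

# Quick unit check of the "600 W is almost 1 HP" claim above.
WATTS_PER_HP = 745.7  # mechanical horsepower
print(f"600 W ≈ {600 / WATTS_PER_HP:.2f} HP")  # ≈ 0.80 HP, so "almost 1 HP" is a bit generous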
I love when you guys do the factory tours or the in-depth documentary videos! And I love that you guys take a neutral position on everything! There is absolutely no bias in any of your videos, your news, or your products. You can be huge fans of Intel and Nvidia but still not be afraid to call them on their shit whenever they're pulling shit. And even LTT. Hopefully he didn't take that personally; you called out what needed to be called out before he ended up imploding himself. I think you guys helped save him from himself by calling out all of the inconsistencies and issues that LTT had and that kept building up and building up, and I think they are actually a much better channel now after the fact, focusing more on quality and not the quantity of a new video every day no matter the format. I just hope to see LTT and GN together again.
For non-native speakers: set the playback speed to 0.75. You're welcome.
explaining the 5090 as a prosumer product is actually a cool way to think
I just hope they don't pull a 20 series on us again 🙏
It's ALL going to be a 20 series style pulling the wool over our eyes from here on out. As in "Woo-Hoo! Lookit this AMAZING 3% uplift over the previous generation!"
Yes they will. Just look how bad the specs are for the 5080; other than the 5090, it'll be incremental changes in raster compared to the 40 series.
Don't do that to yourself😢
@@joelcarson4602 I'm praying that the 5080 is 10% better than the 4090 for cheaper. If so, I'm most definitely getting it. Also, the 4090 in my area is over 2300 USD for the PNY brand.
They already did with the 3060ti and 4060ti
Worth adding that "ARM64-EC" (Emulation Compatible) refers to a Windows-on-ARM ABI (Application Binary Interface) that allows emulated x86-64 applications to use native ARM64 dependencies.
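As a small, hedged illustration of the machine-type side of that: the Python sketch below just reads the plain Machine field from a PE header. ARM64EC is a hybrid ABI, so a simple check like this won't reveal the hybrid metadata; it only shows where a Windows binary declares its baseline architecture, and the example path is hypothetical.

import struct

# Minimal sketch: read the Machine field from a Windows PE (EXE/DLL) header.
MACHINES = {0x014C: "x86", 0x8664: "x86-64", 0xAA64: "ARM64"}

def pe_machine(path):
    with open(path, "rb") as f:
        f.seek(0x3C)                                  # e_lfanew: offset of the PE signature
        (e_lfanew,) = struct.unpack("<I", f.read(4))
        f.seek(e_lfanew)
        assert f.read(4) == b"PE\0\0", "not a PE file"
        (machine,) = struct.unpack("<H", f.read(2))   # COFF header starts with the Machine field
    return MACHINES.get(machine, hex(machine))

# print(pe_machine(r"C:\Windows\System32\notepad.exe"))  # hypothetical example path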
All AMD needs to do to get their market share back is release cards with much more VRAM, like they did 10 years ago. Everyone who runs AI locally just wants more VRAM capacity, to the point where independent card manufacturers are making a mint now.
Absolutely. Especially with the latest LLaMA versions that no longer have mid-size versions. You need at least 40 GB of RAM to run them with decent precision.
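As a rough rule of thumb (a back-of-the-envelope sketch, not a benchmark): VRAM needed for the weights alone is roughly parameter count times bytes per parameter, with KV cache and framework overhead on top, which is where figures like 40 GB come from for a 70B-class model at 4-bit.

# Back-of-the-envelope VRAM estimate for LLM weights: params * bytes per param.
# Real usage is higher (KV cache, activations, framework overhead).
def weight_gb(params_billion, bits_per_param):
    return params_billion * (bits_per_param / 8)  # billions of params * bytes each = GB

for bits in (16, 8, 4):
    print(f"70B-class model @ {bits}-bit: ~{weight_gb(70, bits):.0f} GB for weights alone")
# ~140 GB at fp16, ~70 GB at 8-bit, ~35 GB at 4-bit -- before any overhead.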
All AMD needs to do to get their gaming market segment back is cater to like 2% of users?
They already do have a fair bit of VRAM vs. Nvidia, not that it makes much difference. I get what you're saying, but currently I have doubts about that changing a whole lot.
If they wish to compete with Nvidia, they've got to sort out the performance. Not just competing with Nvidia, but surpassing them. Even then, there are too many stories about the drivers; I'm not even sure better performance alone would do it, people are still scared.
Price would do a lot. They got greedy with the 7000 series and based their pricing on Nvidia's silly prices, rather than going for market share with lower prices. AMD did exactly as Nvidia did, nearly doubling the prices of previous generations.
Unfortunately they are too slow on the AI side of things. They are integrating their workstation architecture into the gaming GPUs with UDNA, which will come after RDNA 4. It's a bit too late; I think the AI craze has died down. I haven't seen much AI stuff making headlines. AI video is still stuck in that slow-mo look.
And the games aren't delivering either; they take too long to make or just aren't appealing. I don't think AMD can ever recover in the gaming GPU space.
@@oneanother1 Even if they're not as fast as Nvidia, if they're faster than running off CPU it can be worth it, depending on the price.
I love the way Steve said "low" during the pricing segment, they are so damn expensive haha
4090 price hike????
Real
It sells enough to justify it, as much as we don't like that.
@@JollyGiant19 I sort of feel it's a mix of winding down manufacturing, which raises the prices, and aiming to slowly sell their stock on the assumption that there's going to be a lack of stock for the 5090 and 5080 and therefore more demand for the 4090. It is, as of right now, the top-of-the-line consumer GPU, so they still expect to get a premium for it.
They raised the price of the 3090s, then raised them again when the 4090 came, so people just wouldn't keep buying the cheaper cards. The 3090 Ti was $1200 at one point.
yes, just people trying to make their last bit of money off of it while they can
The Intel fab video was great! Thanks! 🙂
I appreciate the honest reporting... way too many people in this space just go along with the 600W rumor. I don't believe it, because Nvidia wants a lot of buyers, and that number would push a lot of buyers away from upgrading, since most people use an 850W or lower PSU.
I don't go along with it because we saw exactly the same story with the 4090 rumours
Awesome job as always, and yes, I watched and loved the fab video!
600W, $5000 AUD, a few more CUDA cores, bit more memory, bit more speed, maybe another new revolutionary feature like frame generation, ooh boy, how exciting
@gamersnexus, at 09:15 I think you guys used the incorrect background image, as it details info about Nintendo amongst other things.
Valve working on Proton on ARM does open the door for a RISC-V-chip handheld in the future. And Windows on ARM could use DXVK too, which might get more games working than what Microsoft is offering.
RISC-V provides a unique opportunity because of its variable vector size. Companies are already shipping chips with native 1024-bit vector registers, which is interesting on its own.
4:31 A "consumer card" that no regular person will ever be able to afford.
Yes people will buy it.
Are the connector and cable from a 4090 going to be compatible with the 5090, given the 600W maximum? Because if it has spikes over 600W, it wouldn't be safe to use that 600W cable, right?
Just wanted to add that Proton is built upon WINE (and DXVK, VKD3D...). So contrasting them as one being good and the other being bad doesn't make much sense. Those projects don't get the recognition they should, being hidden behind the common "Proton" name. Of course Valve, through direct contributions or subcontracting, has improved them a ton.
5:15 - Probably a mixup in communications; 1 slot for the card, 2 slots for the fans/heatsinks.
Did not watch the video yet but here is my guess regarding the leak: The RTX 5090 can only be bought when bundled with a monthly NVIDIA subscription fee (for using DLSS & AI features).
I bought an Arc last month, hopefully at least improving the stats a little. Working great for me so far!
15:18 A few months ago I told some gamers that I thought NVIDIA is a monopoly in discrete graphics cards, and graphics technologies (CUDA, upscaling, etc.) and they basically owned the entire gaming/AI market. Yet no one believed me or cared.
Nobody cared because the market is open for anyone who wants to bring a good product. Saying that just discounts one competitor, and the other still isn't trusted.
But it's not a monopoly.
@Artimidorus partners not being allowed to make graphics cards for other brands contradicts your first sentence
@@royboysoyboy WTF are you talking about? ASUS, Gigabyte, and MSI have both the latest AMD and Nvidia cards in production; there is clearly no such rule in their contracts with Nvidia.
@@LAndrewsChannel NVIDIA does not want partners to make Intel cards, a third competitor that makes graphics chips. As of today, only ASRock, Sparkle, and Acer make Intel graphics cards, and none of the manufacturers you mentioned make graphics cards for a third competitor, due to slimy, monopolistic business tactics from NVIDIA.
NVIDIA is acting like Apple, and they needed an antitrust lawsuit as of 6 years ago. No wonder EVGA left the industry; I don't blame them.
Everyone in the Fabs should want one of these new shirts :D
The wattage is fn ridiculous on the new cards
Thanks Steve for mentioning Valve Proton!
Can't wait till you show the differences between Linux and Windows: framerate and frametimes.
I'm totally willing to buy a 5090 after selling my 4090, but the idea that it could use two 12 pin power cables and take up that much space in the case is making me a little concerned about the feasibility. I went and got a fancy power supply that has the native 12 pin cable just for the 4090, but now I feel like it's already outdated.
Same, my PSU has a 16-pin socket, and I won't be happy if I'm forced to get a new PSU already just for one more socket. It would be really scummy doing that in a single generation.
The 5090 should only use one 16-pin and one 8-pin for safety.
Water cool it and you’ll be fine
Just like the prototype 4090. ruclips.net/video/0frNP0qzxQc/видео.html
The skeletal tech t-shirt gives off some serious Fear Factory vibes!
Those "market share" percentages are new hardware sold quarter by quarter. Edit: market share is not what's reflected in those charts, but a market trend. And trends change.
For example, someone who bought the Intel product outside the scope of this window is not included in the chart, so it doesn't mean a lot tbh, because people buy a GPU once every couple of years. It just means sales are worse right now.
Actually in the current market nobody can buy a GPU ever
@@mckinleyostvig7135 lol
But that's just it: those numbers mean zero, actually, because 88% of 100 units sold doesn't mean monopoly. It just means sales are terrible. And those charts are based on sales numbers, not the actual market share. It's not far off, but it's not 0% for Intel either, nor is it 12% for AMD.
I got a 4080 half-off. Will make it last at least ten years lmao
Market share is a term that's well-defined and understood to be: "a company's total sales within a given period divided by the industry's total sales within that same period", not whatever you seem to believe it is.
@@rednammoc Yeah, okay. Then let me ask you this: what about all the people who have bought PCs thus far? Are they not a market?
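Using the definition quoted above (a vendor's shipments in the period divided by total industry shipments in the same period), the calculation is trivial; a minimal sketch with made-up numbers:

# Market share per the definition above: shipments in a period / industry total
# for that same period. The unit counts below are made up purely for illustration.
shipments = {"Nvidia": 880, "AMD": 120, "Intel": 1}  # hypothetical units, one quarter
total = sum(shipments.values())
for vendor, units in shipments.items():
    print(f"{vendor}: {units / total:.1%}")
# The installed base built up over previous years never enters this ratio -- it only
# reflects what shipped in the window, which is the distinction being argued over above.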
Thanks for the news recaps!
You mean the 4090 which costs $200 to make is going up in price? Did someone sneeze on the Nvidia bs machine?
That's kinda like calling the Pacific Ocean a puddle.
The R&D costs are ludicrous for semiconductors, but why tf would they raise the prices at the end of its life??????? They already earned like 20x the R&D and manufacturing cost off the AI datacentre deals.
@masterkamen371 Probably because they are sacrificing profit by making the cards vs. using the dies for something else.
I love that Thermaltake keep putting out more colors.
Thanks so much again for your cool News - always love to watch them to see what's going on in the IT-World (which I'm a part of in a way, too).
Just a short hint for the price comparisons: in Europe all prices usually *include* VAT, whereas I've learned that in the US they usually *exclude* sales tax❕☝🏻🧐💁🏻♂ Just f.y.i. / a short reminder (I'm sure you probably know that already, but maybe some folks watching the channel for the first time don't)❣
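For anyone doing that comparison by hand, here is a minimal sketch with hypothetical numbers (a made-up US MSRP and exchange rate, and 19% VAT as in Germany; US sales tax varies by state and is added at checkout):

# Hypothetical US-vs-EU price comparison. US MSRPs exclude sales tax,
# while EU shelf prices include VAT; the figures below are illustrative only.
us_msrp_usd = 999.0    # pre-tax US price (made up)
usd_to_eur = 0.92      # made-up exchange rate
vat_rate = 0.19        # e.g. Germany's standard VAT rate

eu_shelf_eur = us_msrp_usd * usd_to_eur * (1 + vat_rate)
print(f"Comparable EU shelf price: ~{eu_shelf_eur:.0f} EUR incl. VAT")
print(f"Same price excl. VAT: ~{eu_shelf_eur / (1 + vat_rate):.0f} EUR")  # apples-to-apples vs. US MSRP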
Slowly moving toward the days where the top end GPU's for gaming are only owned by rockstars, saudi princes and influencers. I can't wait for when in 5 years I get to see a youtube star showing off his Bugatti GTX 10090.
can you preorder that Ferrari GTX 12090 ti super?
Now I wanna see how that looks, 20:30, Steve mentioning the idea of ADR in videos. Just a random inserted voice recorded in the Hemianechoic chamber that's slightly different from the rest of the narration, but you can't put your finger on it exactly.
A 5080 with 16GB would be horrible value.
Depends on the price. Sell it for 600 USD and you would talk differently, but that's dreaming. It will beat the 4090, which costs, let's say, 1600 USD, so if they sell it cheaper... let's say 1200 USD, people will buy it. Unfortunately.
You people whine about vram too much, shows how uneducated you are.
@@JordanJ01zip up nvidia's pants when you're done
Tell me how it is uneducated to ask what you're paying for @@JordanJ01
@@JordanJ01 You should try 4K gaming. Lots of titles already crack 10GB. Two more years and it will be 15GB. Then add on top that many games have a slight memory leak because the devs just can't be bothered; your 16GB instantly fills up, performance drops, textures become washed out, and you have to restart the app.
Really liked that AI joke. Was drinking water too 😂
The next press release from Intel: The chips were fine before GN visited the factory! (Probably)
Thanks for the info! Great vid. I need to get some new merch!
16GB of VRAM is pure planned obsolescence. Some games even now push 16GB; I cannot imagine someone paying this kind of money to get two-generations-old VRAM capacity.
RX 6800 sitting there like 😊
*unoptimized games with useless ultra settings*
@@sanji663 *unoptimized games ported from a console with "automatic console porter" software from 2015*