Not sure why my video chapters seem to disappear, but here they are as a comment:
Chapters:
0:00 Nvidia RTX 50 series mobile lineup leaked
2:57 Nvidia GPU marketshare hits 88%, AMD 12%, Intel 0%
5:16 Nvidia, Microsoft, OpenAI antitrust probes
6:04 When are we getting new desktop GPUs?
7:37 Ryzen AI 9 HX 370 engineering sample benchmarked
9:43 Qualcomm Snapdragon X Elite benchmarked at Computex
13:53 Intel CEO is happy to make Arm chips
14:25 Arm, Qualcomm legal battle has potential to kill this wave of Windows on Arm
15:55 Microsoft reworks Recall after security pushback
18:09 PowerColor uses NPU to reduce AMD GPU power consumption by 22%
19:20 AMD and Intel Copilot Plus PCs won't have AI features at launch...
21:41 When are we getting Ryzen 9000X3D CPUs? (leak/rumor)
23:32 AMD working on "differentiators" for X3D CPUs... is this a bad thing?
26:21 Ryzen 5 9600X engineering sample L1/L2 cache benchmark shows massive gains
28:05 Ryzen 5 9600X engineering sample hits 5.7GHz all core OC
28:40 Phil Spencer almost confirms Xbox Handheld, says it is important for it to run games locally
30:08 New Xbox Series Consoles
Thank youu! 🙂
Hi Daniel. You said you'd be looking to do more exclusive content for channel members in the summer break. I can only speak for myself of course but, as a member, I don't feel like I want any of your content to be behind a paywall of sorts. I don't know whether it'd be worth polling your other members to see what they think. I became a member because I like and appreciate the content you put out. I'd like it if it is all available to everyone, regardless of ability to support. Maybe not everyone feels the same way, just my opinion.
@@po1odo1o anything I do as members only content would be stuff I wouldn't post on the main channel anyway. I won't paywall anything that would be normal channel content.
Human eyes cannot see past 8gb of vram
- NVIDIA
Simple, buy MacBooks or AMD 😂
who tf is so dumb to do that @@mikelay5360
🤣
@@neyel8r good one
@@mikelay5360 did you seriously recommend a MacBook for gaming? Also, less than 10% of laptops have AMD GPUs, and the ones that do end up with crap like the 6500M anyway lol. So stfu
8GB is absolutely brutal.
"mmmm but aktually its a 1080p card!!!" 🤓
Nvidia fanboys suck so much dick these days its unreal
"But the 4060/4070 8GB are just 1080p cards!"🤓
I seriously hate nvidia fanboys they're almost as bad as AMD fanboys.
@@BonusCrook I know right... I mean it would be crazy to think that my weak little 4060 laptop can run anything over 30fps @ 720p... typing this as I am playing Hellblade 2 @ 2k with 90fps optimized settings.
@@thornstrikesback honestly they should put 6GB on them, it just makes sense! Why would anyone want textures or shadows on a 1080p card!
@@BonusCrook ikr delete textures and we just use body frames to game
8 GB on the 5060 and 5070 is insanity in 2025. It's already unrecommendable in 2024 unless the laptop is like $1000. At this point, we're gonna have to hope there are more than 4 Radeon laptops next gen and that AMD has more than 2 brain cells to realize this is their selling point.
AMD's power draw sucks too badly to compete in laptops. Even RDNA3 is total trash at Laptop power draw. That's why AMD wisely focuses on their APUs for both consoles and laptops, they would fail HARD if they tried to make mobile dGPUs.
Well maybe AMD will give us more mini PCs with decent CPUs and GPUs.... I'd like to see the 780M/880M become more of a standard for PCs at stores that sell electronics....
For a laptop GPU, I don't find it that egregious. If it's the same for desktop, then there will be a problem.
Mobile:
1070 8GB
2070 8GB
3070 8GB
4070 8GB
5070 8GB
Absolute insanity
Mouse pointer yesssss
Nvidia: the more you buy, the more you save
The more you buy the more your energy bill increases
Gamers be like: Nvidia are assholes, selling us cutdown GPUs for insane prices!! Also gamers: buy Nvidia GPUs anyway.
The more you buy, the more Huang saves. He's at about a $100 billion net worth now, up from $15 billion in 2022.
The more *YOU BUY* The more *WE SAVE* on VRAM 🤣
@@lolmao500 there's nothing wrong with it when the competition puts out a full GPU and still can't compete with Nvidia
Thank you again Nvidia for not giving me a single reason to upgrade my 2021 Legion 5 Pro with a 3070.
Bro my 4080 Aorus 17H is literally twice as fast in AAA games
@@aaz1992 and twice as expensive. What's your point?
Dude it's 2024. You SHOULDN'T have to upgrade a GPU you bought in 2021, wtf!? Since when do people buy new GPUs that often?
@@aaz1992 4080 12GB 😆 That card was such trash value they had to unlaunch it for desktop, and you're trying to brag about it online to someone for not upgrading!
4080 Mobile ~ 4070 desktop
Same on a 3080 12GB; looking like the 6060 or 7060 before I can even consider upgrading... By then AMD's ray tracing will probably be quite a bit better, and I might as well switch to AMD to save money and get better Linux performance.... The only GPU I even kinda "need" right now is a modern SSLP 35W card....
9:59 Thanks to Daniel for not running to catch the children who escaped from the basement to the roof and finishing the video
Xbox Handheld with Game Pass and the 360 library would be insane.
Once again with only 8GB VRAM for anything other than the high end...
hopefully the speed of GDDR7 will make up for it
@@chomomma7403 that's not how VRAM works.
As seen with the scummy 3050 4GB, VRAM bandwidth cannot save you if you lack capacity.
@@chomomma7403 memory speed isn't going to solve the actual capacity being low. It might help, but it won't completely solve the VRAM issue
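To make the bandwidth-versus-capacity point concrete, here is a minimal sketch in Python, assuming a hypothetical 128-bit GDDR7 card at 28 Gbps per pin with 2 GB modules (illustrative numbers, not a leaked spec):

```python
# Bandwidth scales with bus width x data rate, but capacity is fixed by
# module density x channel count -- faster memory does not add gigabytes.
bus_width_bits = 128           # hypothetical xx60-class bus (assumption)
data_rate_gbps = 28            # per-pin GDDR7 speed (assumption)
module_size_gb = 2             # one 16Gbit module per 32-bit channel

bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps    # -> 448 GB/s
capacity_gb = (bus_width_bits // 32) * module_size_gb  # -> still 8 GB

print(f"{bandwidth_gbs:.0f} GB/s of bandwidth, {capacity_gb} GB of capacity")
```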
tHe mOrE yOu bUy ThE mOrE U SaVe
Most laptop screens are lower resolution than the high-end monitors you need 10GB+ of VRAM for.
The real news here is that Daniel finally got another jacket. 😁
Dear Daniel, please never stop to be how you review tech topics of the pc. You represent at the same time a good friend with good advices and point of view with knowledge. Thanks for your effort and information. Keep on it! Greetings from Austria 🇦🇹
I find it interesting and weird that despite all the discussion in YouTube comments, and most people not recommending Nvidia GPUs, they somehow skyrocketed in market share.
It just makes it seem like Nvidia has a stranglehold on the market and is essentially too big to fail.
Hopefully this is a bubble that will pop, or at least deflate, in the coming years. Things are getting quite awful.
If you have the money, some of the models are still the better option.
Social media is an echo chamber of bitter losers and Chinese influence bots. Nvidia dominates because their products ARE ACTUALLY MATERIALLY SUPERIOR.
Daniel did make a video talking about this topic. Also, check the comments and you will know why.
90% of prebuilts come with Nvidia and usually Intel. It's the same with laptops.
WHAT THE FUCK THEY STILL WON'T INCREASE THE VRAM FK THAT
Why would they! NVIDIA GPUs are selling like hotcakes, so people, aka we customers, love low VRAM! Makes us more thin and sexy!
😅
I have the same concern over the "differentiators" quote, Daniel. It seems like they mean that they'll be differentiating the X3D in the 9000 series from the X3D in the 7000 series by improving it, which is good, but cynical me wonders if they'll also make it even more beneficial in the higher core count (i.e. more expensive) CPUs so that they can sell more of those to gamers. The issue with that is that it may seem like they're crippling the lower core count CPUs to push people towards the higher core count ones. Hopefully, they don't do that.
Thanks for streaming all this interesting news together. Complete with solid insights!
Cool, another generation of 8gb laptop GPUs for the midrange.
Nvidia is essentially saying: buy our laptop GPU and make peace with using DLSS for everything as standard... and maybe you will just about be OK running games at 1080p for 4 years.
Mmm after how good my 5800X3D has been, definitely gonna be giving this 9000 series a look
I like the mouse pointer format
Love watching the videos where real life is actually going on in the background (kids being loud, like they tend to do lol). Keep it up man!
2017 was an amazing time to be a gamer - GTX1080Ti came out and AMD came with the banger RX580. Crazy how fast time flies.
I have the 2023 Aorus 17H with the 4080 12GB. I really want to upgrade again in 2025, but if the performance isn't at least 50% faster at the same 150W, I won't bother. Looks like it'll be the same 12GB of VRAM for the 5080 or whatever they call it.
With AMD, maybe, just maybe, that may happen, but a 50% generational uplift at the same power usage is probably impossible...
You can see in the first screenshot that the 5080 laptop will have 16GB VRAM, and my speculation (🤓☝️) is around a 40% uplift over the 4080 laptop at the same watts.
Lol, 50% performance uplift from one gen to the other. That hasn't been the case in a long time mate.
A 50% gen-on-gen uplift will most likely not happen, especially considering how the transistor node improvement will be much more incremental than in prior generations. I imagine you have a desired refresh rate and resolution target that you want to meet, but turning it down a little and waiting for the 60-series might be your best bet. RDNA 5 could also be promising, and that's rumored to come out next year.
Impossible to get 50% better efficiency without a big node jump. And RTX 5000 uses barely more than a refined version of the same process as Ada Lovelace.
I can already tell you it isn't happening.
I just got my 7500F from AliExpress today, and got an X670 motherboard for a solid deal as well. Can't wait for the 9800X3D to come out, so I can sell the 7500F.
Thank god, I’ve been waiting for better 7zip performance! 11:53
Imagine if AMD priced more aggressively. Not feasible, I know, but a world where the 7900XT was $600 could have been great.
Happy with my $480 16GB 7800XT
19:31 Before you even mentioned this, the way Lisa Su at Computex told the Microsoft representative that AMD had to dedicate a lot of die space to this ... I got the same impression. She's not entirely convinced this is the best way to spend die space.
Honestly it is not, and those "AI" chips need to be on PCIe 5 lanes, using M.2/PCIe or their own slot.
Glad Micro$py is walking back Recall. Thanks to vloggers like you for spreading the word about this thing far and wide!
I’m just ready to watch you guys get beat by the scalper bots this Holiday on 50 series, if it releases
That's why you wait for that to happen and pick up a 4090 for half price during the small window when gamers are competing with the bots in a mad rush for the 50 series.... Although in saying that, I'm sure I read somewhere that Nvidia is going to try to prevent that from happening as much as possible (how, I wouldn't know). Ultimately, scalper sales slow down the process of actually getting GPUs into the hands of actual gamers, and that somehow hurts Nvidia when the GPUs are just sitting around in storage and not being used.
Me waiting for Battlemage like 🗿:
@@cuteAvancer lol 😂 I like your reply
This reminds me of a Reddit comment I came across where some snob was trying to shame people for purchasing the 4080 and 4090, saying that the 50 series is gonna have more VRAM..... my my... how the turntables have turned.
Happy Summer Break! I hear your concern about the 9950X3D being more gaming-powerful than the 9800X3D (for example). To me, it just makes more sense that you pay more and you get more performance. I'm not rich, btw. I like low price/high performance, too. But I also want a Porsche 911 Turbo convertible that I may never have ;-) That all said, I'm hoping the reason the higher numbered 3D parts will perform better in games isn't just due to more cache being thrown at them. Sure, I'll take that as a factor, just preferring they have ironed out any scheduling issues.
If Microsoft makes something opt-in, they will surely remind you of that option whenever possible, and those reminders will not be easy to deactivate. But that's still better, I guess.
I think that the partners talking to Wccftech are actually talking about the same timeframe, but just naming it differently (it's possible that the partners weren't referencing nVIDIA's fiscal-year quarter naming, but rather their own company's fiscal-year naming - different companies begin and end their fiscal years at different times). I'm expecting to see a launch at CES in January, which is what I've been expecting for at least a year now. For nVIDIA, a January launch falls in their fiscal Q4 (their fiscal year ends around the end of January), but for another company that follows a standard calendar year, that same timeframe would be Q1 2025...
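For anyone trying to reconcile the two labels, a minimal sketch of the mapping, assuming Nvidia's fiscal year ends at the end of January (real quarter boundaries fall on specific Sundays, so this is approximate):

```python
def nvidia_fiscal_quarter(month: int, year: int) -> str:
    """Approximate Nvidia fiscal quarter for a calendar month/year.

    Nvidia's fiscal year N runs roughly Feb (N-1) through Jan (N),
    so Feb-Apr = Q1, May-Jul = Q2, Aug-Oct = Q3, Nov-Jan = Q4.
    """
    fiscal_year = year if month == 1 else year + 1
    quarter = (month - 2) % 12 // 3 + 1
    return f"FY{fiscal_year} Q{quarter}"

# A CES launch in January 2025 is Nvidia's fiscal Q4 but calendar Q1 2025:
print(nvidia_fiscal_quarter(1, 2025))  # FY2025 Q4
```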
I stayed for the "more!!!!!!!!!!1," and I wasn't disappointed.
I'm thinking of getting an RTX 4070 powered HP Omen 16 this month. Should I get it, or wait for the 50 series to hit the market? I can only do one, meaning I can't buy it now and exchange it for a new one later, since the exchange value of products in my country is shit.
Why would you want the 1984 Orwell copilot shit?
25:30 Translation: What if a higher end X3D CPU actually did what it should and gave you more performance than the chips below it?
This sounds like someone whining about not being able to get better performance out of a lower tiered chip for less. Chips giving more performance cost more...the sky is falling!
They aren't going to make the 9800 perform worse than a 7800 while costing more, so your "worries" make no sense. Even if you think they will be artificially limiting the 9800 to make the 9850/9900 look better by comparison, you will still be getting more performance as you go up in tiers and cost.
Anyone remember the 1080 Ti? Nvidia does. The 7800 isn't happening again.
Exactly. It's like saying "please don't do that, it will make the 9800x3d look worse!"
The current Zen 4 X3D CPUs got a massive price cut today, with the 7900X3D now costing less than the 7800X3D. And the scheduling issue... is only an issue on Windows. On Linux, the scheduling is perfect.
Can someone make a mouse cursor out of Daniel Owen? Static Daniel pointing up as the regular cursor, him yapping when something loads, and so on. Would be nice.
Typically a 5090M is around the specs of a 5080 card.
You can see this 4090M is a 16 gig product, much more similar to the 4080 than the desktop 4090.
It works in the lower stack too, the 4070M being comparable to the 4060 (Ti?) instead of the desktop 4070, with its 12 gigs.
So... 16-16-12, that's probably the 5080, 5070 Ti and 5070 in desktop space.
How does Microsoft force the "Recall" feature on us? Is it forced in a Windows 11 update or something?
There is only one HX in the Ryzen naming. Videocardz made a typo.
The Recall database was not accessible to other programs even before. Those programs needed admin permissions, or you had to give UAC approval when they asked. But yeah, better to encrypt that thing.
As bad as this trend is, the sad reality is that GeForce GPUs are still more reliable (aside from the cable failures on 4090s), more "trusted" and more energy efficient than Radeon GPUs, regardless of VRAM capacity and price. Most people don't care, as they still run games at 1080p medium (or thereabouts) and hence buy the "budget" cards, as long as they just "work". At this point, the only one that can take down Nvidia is Nvidia themselves.
Maybe if AMD stopped releasing GPUs that flatline on release. It wouldn't be so bad if their GPUs were A LOT cheaper, but they are just NOT. The 7900XTX should have been where the 7800 XT is in price.
Also, if the only way they can make GPUs faster is by putting INSANE amounts of VRAM on them, then it's a HUGE issue! Very few games use 16GB of VRAM, even at 4K.
AMD buys their chips from the same place as Nvidia: TSMC. They literally can't charge any less without losing money. GPUs are a luxury good, not a charity.
@@Wobbothe3rd they definitely can charge less; there is a healthy profit margin on their cards. You think AMD is some kind of loss leader?
@Wobbothe3rd if a 4090-sized die costs roughly 220 USD on TSMC N3, then it is probably cheaper on the N4 process. So the 4090 die costs around 200 USD. AMD uses chiplets for their 7900XTX and XT, so their costs per die are even lower. So the die is not really the deciding factor in price; the R&D costs make up the bigger part.
Now, does the Nvidia Ada generation justify its price with R&D costs? No. Is the AMD price justified by R&D? Maybe, but I doubt it.
The real force behind GPU prices is supply and demand. If people keep buying expensive GPUs, then manufacturers will keep selling at those prices.
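As a sanity check on those figures, here is a rough sketch using the standard gross-dies-per-wafer approximation; the die area is the commonly cited AD102 size, while the wafer price is a hypothetical placeholder, not a known number:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    # Standard approximation: usable wafer area minus an edge-loss correction term.
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

ad102_area = 609      # AD102 (4090) die, ~609 mm^2
wafer_cost = 15000    # hypothetical N4 wafer price in USD (assumption)

dies = gross_dies_per_wafer(ad102_area)  # ~89 gross dies, before yield losses
print(f"{dies} dies/wafer -> ~${wafer_cost / dies:.0f} per die before yield")
```

Under those assumptions the raw die cost lands in the same rough $170-220 ballpark as the comment above, a small slice of a $1600 GPU.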
I'm looking to build my first, mainly gaming PC and am trying to decide if I want to go with RTX and Intel, or with an all-AMD CPU and GPU build.
I'll also be getting a new laptop for working on the go.
I'm not loyal to either, but I have noticed that Nvidia and Intel are pushed the most heavily.
You can use an AMD CPU, and an Nvidia GPU. Graphics cards aren't contingent on anything like CPU sockets are. You could run a 7800X3D and a 4090, for example.
Decide your workload. Decide your need for ray tracing. Decide your need for CUDA. Decide your AI workload. That will determine which graphics card. Workload will also tell you which CPU provides the best value. Remember, you can mix an AMD CPU with an Nvidia, Intel or AMD GPU.
Yes, they are trying to make it so gamers who want the best have to pony up for the 16-core X3D version.
The Recall function seems really useless. If they want to add local AI functions, I wish they would make a local version of something similar to ChatGPT, but allow it to interact with your applications. For example, imagine having a large spreadsheet open and, instead of going through a ton of steps to make a decent graph, being able to type into the AI instruction box: "Make a dual Y axis graph where the X axis represents time, the first Y axis represents temperature, and the second Y axis represents line ripple".
If it can run locally then it could be more useful in securely working directly with user applications to automate the more tedious things.
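For reference, the graph described above is only a few lines of matplotlib; a minimal sketch with made-up stand-in data for the spreadsheet columns:

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical data standing in for the spreadsheet columns
t = np.linspace(0, 10, 200)                # time (s)
temperature = 25 + 3 * np.sin(0.5 * t)     # degrees C
ripple = 50 + 5 * np.random.rand(200)      # line ripple (mV)

fig, ax1 = plt.subplots()
ax1.plot(t, temperature, color="tab:red")
ax1.set_xlabel("Time (s)")
ax1.set_ylabel("Temperature (°C)", color="tab:red")

ax2 = ax1.twinx()                          # second Y axis sharing the same X
ax2.plot(t, ripple, color="tab:blue")
ax2.set_ylabel("Line ripple (mV)", color="tab:blue")

plt.show()
```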
Copilot
Daniel, you are definitely right to be concerned about differentiators in the X3D lineup from AMD; however, I'm hoping that the differentiation will be a 16 core Zen5c CCD with vcache and an 8 core Zen5 CCD without vcache, where it will be more of a future proof thing with more cores. They could really blow Intel out of the water with that!
It makes sense that this improves the (alleged) 9900X3D and 9950X3D chips, but it doesn't mean the 9800X3D is any worse than without it. I mean, it was a huge problem for the 7950X3D that it was so much worse: worse in production work than the non-3D and worse in gaming than the 7800X3D. It was a compromise. I am more optimistic about this.
So tldr: the 9800X3D would still be the same (or better), so I don't see the issue. If somebody wants the 16-core X3D, so be it, but they won't kneecap the lower tier.
@@akosv96 It's sounding like the 9900x3D and 9950x3D are going to have cache on both CCDs this time around, since it sounds like AMD has solved the problems surrounding x3D overclocking. So it sounds like it will be less of a compromise than before, but I'm betting AMD will charge an arm and a leg for it though, unless Arrowlake shows up really competitive at the last minute. I'd love a 9950x3D with that much cache that can clock or at least boost almost as high as the standard parts! It's really got me thinking about upgrading from the 5950x; I was originally planning to just wait until Zen7...
AMD and Microsoft going at it behind the scenes 😂
If you count all the times Microsoft has shafted AMD after they wasted time and effort on potential products, it's amazing AMD hasn't told them to F off yet.
Daniel is Ant-Man
I still have a second rig rocking a GeForce 1080, playing at 1080p. It plays games just fine.
Right now... it's fine right now. Crucially, you are not planning on purchasing a $1500/£1500 product that you cannot upgrade the VRAM on, hoping it might be useful in 3 years.
@@bogstandardash3751 It's a secondary rig, I don't have to hope. It's just extra. It's amazing how long custom PCs can last.
Anyone hearing of an AI-focused add-in card other than your GPU? There were a couple of companies working on cards, but I lost track of them.
How does it make sense that Intel lost marketshare? That's gotta be due to nvidia selling millions and millions more units
I think this is shipments and not install base, which makes more sense. Arc users haven't just disappeared.
Many Arcs have been returned due to driver issues. Arc never sold well in the first place; almost all of Intel's marketshare is integrated GPUs anyway.
Intel is a generation old with no new generation, so it makes perfect sense. Intel was competing with AMD's 6000 series.
@@Wobbothe3rd still, those numbers make 0 sense
@@HourRomanticist so the GPUs just disappear? Most people bought them for secondary systems, media centers, or entry-level machines. Are you telling me those budget gamers who can't even afford Nvidia just upgraded to Nvidia and threw away their Arc GPUs?
Cloud based handhelds are just not it. It’s a cool feature but can’t be the main way to play
Funny, I thought AMD and Microsoft had super tight connections 😎
As far as the Microsoft Copilot+ features go, Microsoft is mainly doing this to screw AMD, just like they always try to do. I don't think that Intel's Lunar Lake NPU even exceeds the minimum TOPS to qualify for Copilot+, although if you combine the TOPS from the whole APU, it has more than enough. But I have a feeling that Microsoft intended for the Copilot+ features to be low-power NPU specific, although I don't think they specifically stated that anywhere... AMD's Strix Point definitely has the NPU performance alone to qualify for Copilot+. Microsoft is that girlfriend that no one ever wants to end up with; they'll eventually screw Qualcomm too, but right now, they really need Qualcomm to compete with Apple. Although, I bet AMD could make a better ARM-based chip than Qualcomm...
With 4GB of VRAM, you'd better buy something with Strix or even a 3060 6GB. I'd prefer to play everything on Low (some things on Medium) but with textures on High/Ultra than the other way around.
Low end is getting TOO POWERFUL for that 8 gb
You'll be at 100+ fps with blurry textures, or the alternative: a stuttery mess
This is not the solution for Recall. A hacker can access someone's computer and activate Recall, then return some time in the future and get access to your data. Recall should not be built into Windows at all. It could be an optional download for those who want this feature on their computers.
Nvidia is so ahead of everything right now, they don't even need to compete with anyone.
No competition also means high prices for next-gen, and not much impactful gains in performance, and that's bad for us customers.
But one thing is against Nvidia, and that is there are not many good new AAA games that demand powerful hardware to play. So they are letting fans add Nvidia tech to old, critically acclaimed games. Unless every new multi-platform game has path-tracing options in its PC port, Nvidia will still be at some disadvantage.
The more you buy the more you spend
It looks to me like people are simply ignorant. Why would you keep buying an Nvidia GPU with very low VRAM when you can buy a very nice AMD GPU with a great amount of VRAM? It simply doesn't make any sense.
With all this AI news, I just have to jump in with some perspective as a former dev. This will help anyone who works with a lot of data in their day-to-day.
Snapdragon has had NPUs for years and years, so their developer stack is way more mature than Intel's or AMD's. On the PC side, only NVidia has more solid dev tools for AI, but they don't make x86 CPUs. So Microsoft has to go with Qualcomm to get Copilot+ up and running before anyone else.
Just having the hardware isn't enough. The drivers and software stack need to be robust enough to be actually useful. AMD doesn't even have AI-enhanced FSR even though Radeon 7000 has the hardware. (A rough sketch of what that stack dependence looks like in code follows this comment.)
What is AI good for? A LOT of stuff on your mobile phone is AI-enhanced, especially on those with Snapdragon SoCs: voice recognition, live camera enhancement (face focus, beautification, white balance, noise reduction, colour grading), OCR of text and QR codes, face unlock, video upscaling, battery monitoring & charging.
On the PC, many of these features will become available, plus potentially indexing of files and file content. I think Recall is great! Can't wait to use it to improve my productivity, since I work with a lot of data. OCR happening in the background for my files and content generation will reduce time spent searching. I may want a third monitor dedicated to Recall.
Auto SR is another great feature. CPUs are generally way cheaper than GPUs these days. Upscaling of streaming video is another great one. I use the Video Enhance feature in the Edge browser to upscale & improve SDR content on my HDR display.
Remember, we are still months away from Intel & AMD CPUs with 40+ TOPS being generally available. That's why the AM5 X870 boards were pushed back, to rush out the CPUs first.
I'm a bit jealous of my wife's 5800X3D coz I'm using the 5900X and it's not maximising my 7900XTX performance. Upgrading to a 9900X3D or 9950X3D will be great for my gaming & productivity so I'd probably wait till early next year once X870 mobos launch to upgrade those components.
Why upgrade my CPU? I expect Copilot+ to come to x86 in less than a year and my 5900X can barely handle Star Citizen too.
And I'm worried about marketing speak too, especially now that there are Zen 5c parts thrown into the mix. So far the 9000 series is exactly as we expected. If AMD is able to solve the heat problem by moving the 3D V-Cache onto the substrate rather than on top of a CCD, we can get high L3 cache + high clocks. That is wishful thinking of course. Without a new socket, I doubt AMD can move the dies around too much due to die-to-socket trace length limitations.
It may be possible if the Infinity Fabric is upgraded to the latest version of TSMC CoWoS, but the cost would be crazy. It also seems like TSMC has maxed out their CoWoS capacity for 2024. Just my speculations.
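On the stack-maturity point in the comment above: a minimal sketch of how an app typically selects an inference backend, using ONNX Runtime's real execution-provider names ("model.onnx" is a placeholder file):

```python
import onnxruntime as ort

preferred = [
    "QNNExecutionProvider",  # Qualcomm NPU stack
    "DmlExecutionProvider",  # DirectML (GPU) on Windows
    "CPUExecutionProvider",  # always-available fallback
]

available = ort.get_available_providers()
providers = [p for p in preferred if p in available]

# If the vendor's NPU provider is missing or immature, inference silently
# falls back to CPU -- the gap between "has the hardware" and "is useful".
session = ort.InferenceSession("model.onnx", providers=providers)
print("Running on:", session.get_providers()[0])
```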
Copilot won't support AMD or Intel at Copilot's launch because AMD and Intel haven't released a new chip with an NPU. When AMD and Intel release new chips with NPUs, Copilot will support those chips.
I upgraded from a 3070 with 8GB VRAM to a 4070 with 12GB VRAM and I will never go back. The extra VRAM is absolutely worth it.
The one REAL improvement this "Recall" feature needs is the option not to install it in the first place when installing Windows. I will create a custom install file anyway, but normal users should not have to deal with spyware and malware from Microsoft. Especially when this crap is only there to gather data for training their AIs and selling to 3rd party companies - so the data gets out there anyway and Microsoft takes the profit from your data.
I have seen Arnold’s Total Recall so I’ll pass on that 😜
Morreeee mouse pointer agaiinnn❤❤
AMD would just outright win the battle if they gave tons of VRAM to consumers, because Nvidia refuses to
Ryzen 9 AI HX 6969 XT PHQFARTMPG3960-THX
Coming soon!
Consoles and game developers are making GPUs less important than they were before.
need chapters Daniel!
"On/Off" button for AI its "On/Off" FOR YOU not for MS )
Y'all need to start buying AMD & Intel GPUs
Lol, weaker GPUs for thee, but not for me!? Why don't you LEAD BY EXAMPLE lol
Tell me where I can buy a 7900M laptop.
I woulda if I coulda, even with the M18x and its garbage 300-nit screen, but unfortunately I missed the Black Friday special on the one run of that laptop that made it here Down Under 😢
AMD and Intel also don't deserve any charity... They are multibillion dollar corps who both screw consumers just as hard when they can. Watch all the rage there will be over Zen 5 pricing for the rest of this year...
I got a Sapphire 7900xtx 🫡
It doesn't matter anymore, mate. Nvidia is too big now; you can't vote with your wallet anymore.
Anything below 12GB next generation is a crime. Your GPU is held back with only 8GB of VRAM.
No it isn't. Most people play at 1080p. Just because you see something repeated over and over on social media doesn't make it true.
@@Wobbothe3rd even at 1080p a lot of games use a shit ton of VRAM
4K native usually maxes out between 8GB and 12GB.
People buy a 4060 for 4k? 😂
The only AMD fanboy talking point is to scream about vram.
@@emma6648 the vast majority do not, and you can literally prove it by observing the VRAM usage on the many benchmarking channels that exist all over YouTube. VRAM is an AMD fanboy cope. Don't take my word for it; look up your favorite game's max VRAM usage at 1080p.
No timestamps in the news video?
Recall is and always will be enough for me to never try Windows on ARM. I also don't believe it will ever be turned off, even when "deactivated", and it's Microsoft, so.... no thank you!!
What if they just stopped making new graphics cards and kept rolling out the same 50 series etc from now on 😂. I'm halfway kidding, but these cards are so power hungry, large, and hot now that it's gotten absurd.
I always love when you become the mouse cursor. I wish there was a mod to have you as a cursor 😂
Yeah, so it sounds like unless you spend a bunch of money, gaming laptops are gonna suck.
aghh 8gb is brutal :(
Clevo make DOPE laptops!
Used to. When they dropped MXM and desktop sockets and awesome specs like overbuilt VRMs and 4x 4K display outs like on my P870DM3, and Prema went to work on Eluktronics' Tongfangs instead... they started making the same uninspiring stuff as all the others. They died in the eyes of the real laptop enthusiast community.
@@greebj I had one with SLI 680Ms, so yeah, I guess that's like 2014-ish.
@@8KilgoreTrout4 After my P170EM, where I liquid-metalled the GPU and overclocked the 680M beyond 1GHz, I water-cooled a P370EM and put modded 1070s in it. The MXM MSI 1070s were insane; they ate 200W on stock volts all day long if you could cool the VRMs.
The setup benched faster than that ridiculous $9000 Acer 21-inch monstrosity.
I could shove 540W through that motherboard until it throttled the CPU; how they managed that I have no idea. The dual adapter thing didn't even come along until its successor model, the P870!
On the P870DM3 I've seen over 600W wall draw.
They built them special back in the day, that's for sure.
When will AMD provide lower-end 8000 series parts? That's what they're specializing in. Nvidia has gone full monopoly; they don't care about the consumer.
And once again,
"The more you buy, the more you save".
Hahaha. It seems the 3050 trash option is not enough; they are keeping the 2050 also, which I didn't even know existed. They will keep these trash options at already inflated prices and price the next-gen options at new levels of outrageous prices.
Nvidia's dominance shows. Some believe it's why EVGA pulled out of the industry and why they didn't make GPUs for AMD instead. There've been rumors that Nvidia threatened manufacturers when Intel was looking for partners for Arc and Battlemage.
EVGA were scumbags. They pulled a ton of anti consumer nonsense themselves over the years. I'm glad they're gone. Most AIBs are just parasites anyway. The vast majority would prefer to buy founders editions when possible.
In EVGA's case, once they stopped being an Nvidia partner they were free to work with whoever they wanted. EVGA did not pursue making GPUs for AMD because it would be worse for them with AMD than with Nvidia. They were already complaining about the kind of margin they got with Nvidia; it would be significantly worse with AMD. And don't think AMD's partners would quietly let another company become an AMD partner, because to them EVGA making AMD GPUs would mean increased competition.
There's no antitrust case when AMD fails to compete. 12 percent.
Ugh, I refuse to buy another laptop until I can get more than 8GB of VRAM for under $1500. That’s absurd. Fuck you NVIDIA, it’s not enough VRAM anymore.
Stop whining. 8GB is fine for most games on a laptop screen. If you have the money for a 4K laptop, you have the money for an 80-class GPU laptop.
@@Wobbothe3rd 8GB does struggle even at 1080p with DLSS Quality, but only when using path tracing (only 2 games lol) and frame gen.
So yeah, for the most part 8GB is fine at 1080p, but what about future PT games? (FYI, the 5060 has enough raw performance for 60+ fps at 1080p DLSS Quality with path tracing, but the VRAM....)
AMD HHHYZEN 9000 FTW!
I'm kinda sick of reading nothing but complaints from all these precious princesses. "8GB of VRAM in 2025, that's disgusting".... look... if you don't like it, then don't buy it. Go buy a laptop for $6K ($9K in Australian $) if you want 16GB. There are people who just want mid-tier laptops to play games like Counter-Strike, League of Legends, WoW, PUBG etc, and 8GB is all they need; they shouldn't have to pay insane prices for high-end machines. And then there are those happy to drop thousands and thousands on a gaming laptop who want to run the latest AAA title @ 4K 120fps. I mean, this has been going on since the 486/Pentium days (for PCs anyway, not so much laptops... but still), so if you are still hoping for some drastic change in the way hardware releases every 2 years when a new card comes out, then you truly are in denial, because it's been like this for 30 years since the arrival of the 3dfx Voodoo.
We all complain about VRAM, but most games even at 1440p don't use 8GB yet. I assume with GDDR7 it'll be faster and use slightly less VRAM, except for RT, and the low end probably won't need it for 1080p RT. We want more VRAM for "longevity", but it's not necessary right now in gaming.
Uhhh. Sure, if textures are optimised size-wise and designed to be thrown out by the game, but this hinders game development pace so freaking much you wouldn't believe. Nvidia is screwing over game devs so much with the VRAM cap it's criminal.
And still, you can only compress textures so much until you cannot even call them "ultra" anymore, since the whole point of ultra/high is that they are uncompressed and detailed (duh). If Nvidia makes some compression algorithm and sells it as magic to make 4K textures fit in 8GB of VRAM, that's snake oil salesman tactics. They compress it and upscale it back, which is lossy, not the same.
Many games use more than 8GB of VRAM at 1080p
@@AdamGamingARK I don’t play a ton of games mostly sims and some AAA titles but I usually play them maxed at 1440p. I ALWAYS have my RTSS on and it’s pretty rare to see above 8gb especially if settings are on high instead of ultra
5080 16gb vram
5080ti 24gb vram
5090 24gb vram
5090 ti super 32gb vram
🤍🤍🤍🤍🤍🤍🤍
I have heard MS has thrown AMD under the bus lately, a few times. Maybe that's why Lisa wanted confirmation from MS that they really do use AMD's stuff this time…
AMD has "lost the bid" for Surface Pro twice, and AFAIK MS also backed out of how many XBox chips they would buy for server use (yes, it was designed to do both), and now MS wants ARM to take over because they feel they can better control a hundred different ARM CPU vendors than deal with AMD and Intel.
With Windows moving in the direction it is, I think it's time for AMD, IBM and Valve to get together and collaborate on a "de-facto Linux distribution" so we can finally get rid of all the fragmentation.
Nvidia has always been cheap on the amount of VRAM in their graphics cards.
We need to take Nvidia to the court for wasting sand and increasing e-waste.
Nvidia angers me
I really wish that when you turn yourself into a mouse pointer, that it also changes your voice into a chipmunk lol
Making your voice higher the small you go and deeper the larger you are.
Doesn't help the video but would of been fun
Would be annoying for anyone actually watching the video
*smaller
would *have, not "of"...
If the 70-class has a 192-bit bus AGAIN, Nvidia can get bent. It's gonna be another skip generation. I can't wait to skip it! 👌
Very cool of you to let your kids play on the roof