In case he changes the title, the original is:
RRX 5090 Does The IMPOSSIBLE
I would LOVE to get the RRX 5090 for my future build!
Lol 😂 @@NEWLuigi64
Is that some kind of Amdvidia brand?
Day 3 still no title change lol
best gpu fr
RTX 5090 Ti will come with a gasoline-powered generator
This card might need to be hooked up to a hydro dam.
What Nvidia needs to do is develop its own case and power supply for these things; that's how outlandish they're becoming.
@@scruffy7443 it will also draw power from your nearest power plant
Came to write this. This is silly; here's hoping AMD has a proper solution, even if it's not as fast.
Nuclear powered
The 5090 will be the first card where the model number is also the price.
The 3090 was that too during the shortage 😂
Nice lol
Lore accurate price if we are in a simulation i guess
lol this made me laugh here is a like
🤣🤣
5090 will be the most affordable budget card ever
😂😂😂😂😂
Just $1999
@@space6370 that'd be a steal tbh
@@space6370 I hate to be that guy, but you mistyped the 1; I'm sure you meant to type 2.
Source?
New Ryzen on the horizon. Horyzen.
Get out.
i approve.
Good one. I’m using this
GET OU-
It will set your house on fire three times as fast as
RRX 5090?
Is that a Nvidia AMD hybrid?
One way to figure out power draw
Funny how you influencers are already hyping the 5090 as if it's not going to cost over US$2K. Then again, influencers get them for free for exactly this reason.😂
The biggest issue with the 5090 is sure to be the price.
It's only an issue if you don't have two kidneys and a liver to trade.
@@JohnDoeC78 agreed. Most of the "80" and "90" series cards are way too much and out of reach for most people.
@@JohnDoeC78 I hope I can too. I got an MSI Suprim X 4080 when it came out, and I can't wait to get a 5080/5090.
$2K isn't that much for at least 2 years of use (if you upgrade every gen) when you play games a lot. Save up and buy the best.
@@KOOLBOI2006 ???
Nvidia just needs to ditch the 12VHPWR connector until they have properly vetted it. Just go back to dual or triple 8-pin power connectors.
or quadruple 💀
Naaa... the single one is cheaper!
@@gerardsitjait’s not cheaper if in the end you have to warranty most if not all cards.
Nothing wrong with that cable though? And it's an industry standard now. Get out
@@grus.clausen is anyone but nvidia using it? Lol
"RRX" The card is so powerful that he has forgotten how to spell it 😂
🤯
🤯
🤯
🤯
🤯
When the 1080 came out, I upgraded my whole setup: mobo, RAM, i7, and a 1080. It cost one month of my wage. Today I couldn't buy the 4090 alone...
You can still buy more performance for cheaper, so why exactly are you complaining?
RTX 5090 gets a saleh🇹🇷🥇🎂👑🥳😀🎊🎉👍
Yeah that’s pc gaming for you
@@mAny_oThERS he meant to say he could buy the best setup at the time with his monthly wage, and now just the latest GPU alone is too expensive
Would you have been able to get a Titan Xp back then? For an apples-to-apples comparison you should put the 4080, not the 4090, into the equation.
"The PCB breaking from its own weight"
Yeah, what do you expect when you try to build another Tower of Babel sideways, horizontally?
I'm not a bridge engineer, but I guess we're going to start analyzing structural design in PC talks from now on.
ahh yes, the RRX 5090
Video Title: 5090 does the impossible!
Me: Be reasonably priced? Not lock features behind an upgrade? Fit in normal cases? Have a power cable that doesn't destroy your PC?
GPUs are basically at the point of diminishing returns for almost all gamers.
@@JohnDoeC78
5-15% would be pretty poor performance uplifts
(I think the mid range RTX 40 series had some GPU close to that though)
around 30% is the typical per-generation gain for 80-series GPUs,
though there are generations where it's much higher than that.
Especially with how many ports are garbage and run like crap even on a high-end system. They just run "less crappy" the more powerful your PC is, but they're still crap. This will keep happening until developers either ditch Unreal Engine, or Epic gets their shit together and fixes this garbage. They've had years, and there doesn't seem to be any end in sight. Stutter, stutter, stutter, stutter.
We aren't living in the 2010s anymore.
@@JohnDoeC78 The 1080 is a bad example; as a product it was an anomaly. Nvidia later realized their mistake and started tiering their cards more carefully.
I expect my 4090 will be overkill for any game that interests me for the next decade. Does what I need it to for AI at the moment though
But does a 1080 have AV1 encoding capability?
Intel is attempting to become a silicon foundry to rival TSMC, and to do that it needs to take orders from AMD and Nvidia. ARM laptop/desktop chips do pose a potential threat to Intel, but at the same time, it was Intel's lack of exposure to mobile phones that gave TSMC its huge lead in foundry and gave AMD and Nvidia cheap access to high-end silicon to outcompete Intel. Intel is building its place in the AI scene; being US based, with a history of operating US foundries, they are perfectly placed to take advantage of any disruption or restrictions on AI chips manufactured in Taiwan or South Korea.
Not to mention the tariffs on the horizon as well.
If everyone watching this kicks in a few bucks, we might be able to crowdfund half of a 5090.
good idea
$279 for a 7800X3D? I wish!!!
Soon
This was by a new Amazon vendor from China. I ordered one and then they cancelled the order, and ultimately Amazon told me they are no longer a vendor. So in the end they didn't ship any CPUs at that price.
Microcenter bundle w/ 32 GB DDR5 6000 RAM + a good MOBO for $470 (less with Insider discount).
wait for 9800x3d
@@wololo10 I am… pair that with 5070 or 5080!!
"RRX 5090" The GPU that nobody expected lol
I mean, it's good that Nvidia found a more efficient cooler for the 5090 (and maybe the rest of the FE Blackwell GPUs); that means less weight and space. But tbh the most important things are the price, power consumption and performance... oh, and yeah, that the connectors don't melt 😅. We will see.
So now they can say the 5090 is worth that price because it fits better in cases.
I'm sure Intel entered a strategic partnership with Nvidia that is not to their disadvantage.
It's more that now that Intel has its own fabs, they need to keep them busy, and that's just not possible with their own chips alone. Worst-case scenario, they can always make it super expensive for non-Intel chips to be made there, so they keep a similar price-to-performance.
@@mryellow6918 Good point. Intel might be making them, but they are also setting the fab price.
How many people took advantage of the $279 price tag on that AMD 7800X3D?
The seller looked like a scammer; it wasn't being sold or shipped by Amazon.
Just one of those third-party sellers with wrong prices. Scam for sure; I've seen GPUs listed at insane prices by those sellers, but I ain't buying 😂
This was by a new Amazon vendor from China. I ordered one and then they cancelled the order, and ultimately Amazon told me they are no longer a vendor. So in the end they didn't ship any CPUs at that price.
Imagine Nvidia steals Intel's CPU spot, and Intel makes a new GPU to steal the GPU spot.
I’m going to get the 5090 when it comes out. I am a dedicated gamer…
Jk. I'm getting it because I'm in school to become an AI engineer, and by the start of 2025 I'll have finished my course. A good GPU will carry me a long way… I'm still gonna game on it tho lmao
I actually like that the 4090 is massive. I like the look of the giant coolers
Yeah, until you try getting the AIO hoses behind or under it to reach the CPU. I have the triple-fan 4080 Super OC and have to mount my radiator in the front of the case. The card is 340 mm long. That removes routing options like you wouldn't believe. And it's in the top slot and needs a bracket to hold up the ass end so it won't sag and crack the board. They could have considered all that.
Don't get too excited: with the China ban, the RTX 5080 might not be faster than the RTX 4090 if Nvidia is planning to sell it in China.
I'm going to be paying at least $5000 AUD for this card!!!!
Intel becoming the US TSMC would be a very smart move.
Big facts!
The cable and connectors need to be made of asphalt concrete!
They need to allow the use of asbestos in this instance, then seal it and label it a biohazard.
I don't think ARM will overtake x86 because of the gaming scene, where ARM can't use external graphics
ARM can use graphics cards
@@lucky-segfault I did not know that thanks for telling me
lol intel is not that dumb.. they're making profit either way i'm sure
I was going to go with the 4090, but I was scared off by the power connectors, so I went with a 7900 XTX.
The GPU weight and sag could easily be solved with rear connectors. Extend the PCIe slot and add extra connections into the motherboard for power. Or, next to the PCIe slot, have a gap, then the power connector and extra connections on the motherboard for power. This would spread the load and get rid of ugly cables. It's a win-win. For now, we could just have GPUs ship with an adjustable anti-sag bracket. The extended PCIe slot is my favourite solution though.
I just use an anti sag bracket that looks neat and was cheap
It drains consumers' wallets even more than we thought.
Intel 3 costs more and is less efficient than TSMC's 3nm. Nvidia's being dumb about this choice.
Where did you get the info that it's more expensive?
They're not going to allocate some of their GPU wafer allotment at TSMC to what would be a low-volume CPU part to start. The Intel process will be good enough for that design.
The decision comes from above - Washington.
TSMC is sold out... even AMD buys from Samsung.
Not necessarily; it depends on where the Intel fabrication plant is located. If Nvidia doesn't have to pay for all the overseas shipping (provided the Intel-made chips are fabbed in the USA), then Nvidia can save a ton of money and still charge more for setting your house on fire.
The next CPUs will help Copilot spy on you; the next GPUs will be overpriced rehashes of the current gen. There, I just told you the tech future :)
They need to go back to the drawing board; the power consumption and size are too much. It will be way too expensive too.
Imagine if 5090 cooling used refrigerant gas like the AC in cars.
Enjoy the last Intel GPUs, which sucked the way their integrated GPUs usually do; I think this one is the penultimate, and sometime in 2025 comes the last Intel GPU, IIRC. Intel will probably drop out of the CPU market too: they're not out-designing AMD's PC CPUs, they're pushing crazy OCs, they're taking a beating in servers too, and since they're building fabs, they'll probably stick to chip production. At least that's my realistic long-term take. NGreedia is making PC AI CPUs; it's only a start, but it might get good at general-purpose processing and compete with AMD. RISC-like CPUs might take off for AMD too; who knows...
NGreedia GPU 5090+: they need their own case. Make dual cases and use a GPU riser to link to the motherboard. The GPU coolers mostly suck; they should be more like CPU coolers, efficient at exhausting the hot air. Today's designs push the air back into the hot heatsink, so other cooler layouts should be tested. Anyway, the NGreedia 5090 proves that GPUs need their own board and a very special PCIe connection to the CPU motherboard: maybe on the side, really near the CPU, with 64 lanes, or at the very least 32 lanes. It would be totally possible.
The channel "Daniel Owen" just released a video proving my point. Anybody who says 8GB isn't enough for modern games should go check it out. DEFINITELY enough for 1080p, which is what 8GB cards are for. If you're trying to run 1440p or 4K with an 8GB card, that's YOUR FAULT
Either he misspelled it on purpose for clicks, or he genuinely made a mistake and just doesn't check the comments or the title after posting.
To all consumer PC companies: STOP USING FUCKING WATER COOLING. It's a pain in the ass and practically useless when I live in a temperature-controlled house that's colder than the fuckin' Arctic.
Lol good one 😂 0:19
4:00 - You're confusing server-specific parts with consumer GPUs. Even if the architecture it's based on is the same, there are different parts of the chip used in the data center that consumers would see only a fraction of, or not at all, meaning until we see the exact layout of a consumer Blackwell GB202, we can't assume higher power draw. As for TDP: yes, they *can* use that much power, but just because it has that TDP rating doesn't mean it's automatically going to use it. There's no way they'd expect consumers to budget anywhere near 1 kW for the GPU alone, as NA breakers couldn't handle such a system. We'll probably see the same 450-ish watts of TBP for the 5090, with AIB partners going higher.
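Rough numbers on the breaker point, for anyone curious. This is only a sketch assuming a standard 15 A / 120 V North American circuit and the usual 80% continuous-load rule; the 1 kW GPU is the hypothetical from the comment above, and the 300 W for the rest of the system is just a guess.

# Rough check of what a 1 kW GPU would mean on a typical NA household circuit.
volts = 120               # standard NA outlet voltage
breaker_amps = 15         # common household breaker
continuous_factor = 0.8   # continuous loads are usually sized to 80% of the breaker

circuit_budget_w = volts * breaker_amps * continuous_factor   # 1440 W
gpu_w = 1000              # hypothetical worst-case GPU draw from the comment above
rest_of_system_w = 300    # assumed CPU + board + drives + fans, illustration only

total_w = gpu_w + rest_of_system_w
print(f"Circuit budget: {circuit_budget_w:.0f} W, system draw: {total_w} W")
print(f"Headroom: {circuit_budget_w - total_w:.0f} W, before PSU losses and a monitor")

With those assumptions there's only about 140 W of slack on the circuit, which is roughly the point the comment is making.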
Intel has tried to kill x86 a few times. First was the iAPX 432, then the i860, and of course Itanium. Now there is talk of X86S, a 64-bit-only x86 design.
I saw how the Acorn RISC Machine architecture was built; the power draw and thermal behaviour of the original ARM were honestly crazy.
Can companies finally stop building everything for AI? I wonder where our GPUs would be for gaming if the focus were on gaming performance and not 60% on AI.
Why the hell are you talking about 1500 W cards without actually knowing what the fuck you are talking about? Those cards pull a huge amount of power for their onboard 200Gb Ethernet.
I don't believe in NVIDIA's hype, because they usually don't deliver what's promised.
RTX 4090 is a beast, but should have been better for the price.
A 2-slot 5090 is only a reflection of the fact that there will be no AMD competition at the 5080/5090 level. There is zero reason to ramp up wattage if you're only competing with yourself... though that won't stop them raping your wallet by ramping up prices ;) Of course, this is the Founders card. Expect different from partners.
2025/26 RDNA5 will be different, but only IF(!!) AMD manages to get a proper chiplet GPU solution. AMD cannot compete with Nvidia without a full chiplet design, i.e. GPU + GPU-to-GPU interconnect. The GPU-to-GPU interconnect is the hard bit, compared to an 8/16-core CPU chiplet.
Uhh, just a quick question, but how many drinks did you have before making this video?
The only way I could see it being 2-slot is if it's liquid cooled somehow, coming with its own AIO solution. Not entirely unbelievable, as xx90s are not meant purely for gaming on a monitor; they're meant for people who also use their home PC for some type of production work, as well as things like VR simulation. People with this sort of setup are often going to have cases well suited for it, usually full towers with plenty of space to mount multiple radiators. Not really intended for your average mid-tower 1440p120/4K60 gaming rig: that's the domain of xx80 and xx70 Ti type cards, specifically if you like ray tracing and DLSS. If not, then go AMD for a decent amount cheaper.
A food vendor that has sold hotdogs for years suddenly making burgers for a competitor is not, as a concept, news. Look at companies such as Mars or Coke. They make a lot of competing products, basically ensuring their future by being the ones providing what consumers perceive as choice, offering many of those choices themselves, at times under different names, rather than leaving it to someone else.
This could become the case with Intel working with NVIDIA. Intel making chips for others that compete with its own chips is, to me, a smart move, because they will still be the ones making those new chips: giving consumers the idea of having more choices, while having those choices come from them.
The other smart thing about this move is that it helps Intel stay competitive with AMD, but with NVIDIA's help.
Many say it's best to have a CPU and GPU from the same company, and for now that's AMD. Intel tried to compete on their own by making their own GPU to work with their CPU. Though it was a good GPU, it was still greatly overshadowed by what AMD and NVIDIA already make, and so it did not do well.
Seeing that NVIDIA has announced its intention to make a CPU that would work in tandem with its GPUs, Intel partnering directly with NVIDIA to make this new CPU keeps them in the game of CPU manufacturing. If NVIDIA's plan to do what AMD has been doing works out, it might otherwise take Intel out of the picture.
So even if Intel's own CPUs become unpopular, with both AMD and NVIDIA offering a CPU and GPU and many wanting to switch to them someday, and even if Intel's CPUs alone may still be better than what AMD or NVIDIA come up with, Intel is not left out, because by working with NVIDIA they are the ones making NVIDIA's CPUs for them.
I only see this as a win-win for Intel in the long run.
Could ARM take over x86? It's possible, especially in the mobile computing space... but I don't feel it's likely, at least not among us PC Master Race crowd... I could change my mind if ARM CPUs become socketable and still allow for memory, CPU and PCIe expandability, drives, accessories, etc. x86 is the most universal platform around, and ARM (and RISC-V) at this time really isn't competitive with that. Yes, one can bring PCIe and other stuff into the fold, but until that happens (and I feel it's a big if whether it happens at all) it's too limiting.
It doesn't matter what the value of the GPU is; AMD fans are spreading lies everywhere. They claim NVIDIA is expensive, yet they don't even purchase the most value-oriented GPU available, the RX 7800 XT, which they call the "NVIDIA killer." In reality, the only 7000 series GPU that appears in the Steam Hardware Survey is AMD's most expensive model RX 7900 XTX. Where are the other 7000 series GPUs? Lol.
Damnnnn, i honestly thought I was able to FINALLY catch up and upgrade with the newest series. I was literally finally able to go from my 1650 to the 4060 that i got totally brand new, still in box sealed with the cellophane at an 83.3% discount from an auction site. Like, i literally just won it last night and have to go pick it up today, lmao.
I FINALLY get caught up with the world and i don't even have it picked up yet and already y'all are leaving me behind because this 50 series is about to drop 😂😂.
Damnnn, i was so excited thinkin i was caught up with the times 😂😂.
The current no-compromise high-performance consumer PC requires a 1000 W+ PSU, a large case and very expensive cooling, and it looks like a high-spec RTX 5090 build will need around 1400 W minimum. The ARM race, currently led by Apple and Snapdragon X Elite, is looking more and more attractive. I'm hoping something turns up so that I don't need to keep giving Nvidia my money.
I own a 4090 and to this day there is no game even close to fully utilizing its capacity. I guess the first game to say "recommended GPU: RTX 4090" will come out next year, but more likely in 2 or 3 years. Nvidia will continue improving their GPUs, but at this point I don't see any reason for a gamer to upgrade to a 5090. If it weren't for my job as a 3D artist I wouldn't own a 4090 either, and I wouldn't recommend buying one unless it gets significantly cheaper, if you will ultimately just use it for gaming.
In the 1950s components were huge and didn't have any cooling; in 2024 components are microscopic, but the coolers are huge. Soon the entire PC with GPU will be the size of a thumbnail but will require a three-storey outhouse of radiator cooling, and Nvidia will require an HVAC license. The same issues blight quantum computers too. So my question is: are we really going forward technologically, or is it time the industry realised it is flogging a dead donkey?
All current Windows apps take advantage of the x86 architecture... optimization toward ARM CPUs will take some time, at least a year? Look at all the current ARM versions of Windows: all of them show performance issues compared to x86 CPUs, including Celerons. Once ARM-based CPUs gain popularity, then Microsoft will consider seriously optimizing its products. MS is always slow... and their eyes are on AI stuff more than anything, so I doubt it.
This channel spits out generic trash.
On RTX 4000 cards the PCB itself is tiny, only about half the length of the cooler; the rest is just cooler. They run cool enough that the coolers don't need to be that long and could be made closer to the actual physical length of the PCB.
I have a 4070 laptop with an i9, which is really nice. This thing runs anything at ultra in 2K, probably because of the i9; yes, it helps.
But I was somewhat disappointed with the 3070. Overall it's good, but AMD is way more worth it for the price.
Now I'm on a 7800 XT Nitro+ and I'm sticking with it. And yes, GRE GRE GRE, but I got it before that one launched and I'm not gonna sell it to get 5 more fps.
Dude, Intel isn't killing itself. It's making the ARM chips and giving Nvidia all the risk to begin with. If it doesn't work, they continue on with their own chips on x86; but when it does work, they either jack up the price and/or make their own for Windows. Apple's bet on ARM is paying off basically because it's way more efficient, and the change is coming. Windows knows it needs an ARM OS to stay relevant.
This Intel analysis is bad. Intel is already struggling in the CPU space, and ARM-based CPUs and any other projects being created are going to exist whether Intel provides the silicon for them or not. They are pushing the foundry side of their business in order to evolve, survive and diversify. Their wafers are impressive, even more so than TSMC's; they just happen to cost more. The more they sell, the more they can cut production costs.
RTX 5090 FOUNDERS EDITION - $2000 US DOLLARS . THE 1% ELITE GAMERS 1ST WORLD PROBLEMS.
ALL THIS JUST TO GET 4K 60 FPS WITH RAY TRACING, RAY PATH AND REFLECTIVE RAYS... ON MAXIMUM SETTINGS IN CYBERPUNK 2077 ... IN 2025... 5 YEARS AFTER RELEASE LOL
Man, I'm sorry, but that identical voice inflection at the end of every single sentence makes you impossible to watch. Best of luck!
I don't know why you gamers thought the next-gen card was gonna be that pricey...
That'd be the wrong move from Nvidia... the economy isn't going great, the 40 series has proven to be overpriced, the Blackwell in your gaming rig won't be datacenter-level Blackwell... and y'all are assuming you're gonna pay corporate-level prices for your gaming rig...
Intel helping Nvidia is Intel changing from a CPU-only company to a chip platform service. It is less risky to build others' designs, and it helps with the cost of new process nodes.
They made such a boondoggle of the Gen 5 connector (a move that was aimed at crypto miners pulling 100% power 100% of the time). In the end they put a little too much planned obsolescence into that connector, and it failed even with normal gaming. Now take that attitude and apply it to AI. What sort of planned obsolescence can be cooked into AI so that they can keep making money hand over fist? It's got me stumped for now, but I'll come up with it soon enough. With that corporate attitude towards mankind, I'm not sure I want them leading the way with AI. But here we are, and it looks like we'll be the ones who get to clean up that mess as well. And the movie Atlas is a good indicator of the future we are being hurtled into.
Doesn't make much sense to me that Intel is building Nvidia's CPUs if they aren't even building good enough CPUs themselves, and they also don't build ARM CPUs, which Nvidia is already making for itself.
I haven't heard from anywhere else that Intel is producing them either.
Intel isn't just a design house; it's a semiconductor manufacturer just like TSMC. Even if they can't keep up with new chip designs, they can keep up with process node development (don't forget that Intel was basically the first to ship FinFET).
I'm looking for that new 40 CU APU coming out later this year.
Next year for Strix Halo
So the 5090 will basically be the only genuinely new thing they release, and the rest will just be the same junk from yesterday, rebranded with higher clock speeds and power usage. Hope the 12VHPWR won't burn a 5070 too...
In less than 5 years the power consumption of PCs has doubled, and yet games look and run like shit. Now it's going to triple or quadruple this generation. What's even the point? There are so many people who can't afford air con, and these setups will get close to the consumption of an air con unit. If this trend of diminishing returns continues, it will basically be over.
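For scale on the air-con comparison, here is a very rough sketch. The AC wattages below are typical electrical draws for common window units, not measurements, and the 1400 W figure is the high-spec build estimate floated elsewhere in these comments.

# Very rough comparison of a high-end gaming rig vs. common window AC units.
# The AC figures are typical electrical draws for those sizes, used as assumptions.
pc_full_load_w = 1400   # high-spec build estimate floated in these comments
window_ac_w = {
    "5,000 BTU unit": 500,
    "8,000 BTU unit": 700,
    "12,000 BTU unit": 1100,
}

for name, watts in window_ac_w.items():
    print(f"{name}: ~{watts} W -> the PC at full load is {pc_full_load_w / watts:.1f}x that")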
Here’s hoping the 40 series gets cheaper by the time I start looking to buy in September
Romans 10:9-10 "That if you confess with your mouth, "Jesus is Lord," and believe in your heart that God raised him from the dead, you will be saved. For it is with your heart that you believe and are justified, and it is with your mouth that you confess and are saved."
😆 It should be obvious that Intel will make Nvidia's CPUs. Haven't you paid attention to your own news? Intel will also make AMD chips. In case you still don't know, Intel is out, and the only way they can survive is to sell their technology and make chips for Nvidia and AMD like TSMC has been doing.
Nobody trusts Nvidia after they showed consumers the middle finger and blamed user error for their poor 40 series design. I lost two GPUs and one PC because of a 4060 burning.
I'm 100% fine as long as I can play AAA games at 3440x1440 ultra settings with RTX constantly above 60 FPS. So I think I just need a 5060 or 5070, probably.
The prices for those CPUs aren't good… I have found them elsewhere for cheaper, brand new.
Also, really? Dude, it looks like demand in the gaming PC market is gonna plummet.
There won’t be a 5090… why would there be when the same die can be put into a $70k AI card?? And yes that’s the new price for that range… my guess this is only a paper launch.
What will you use that GPU for anyway... New games look worse and play like crap, you don't need any current flagship GPU at all, you can go back 2 gens and still play at max settings
ARM……….🤔………..maybe……….but it's been around for a while. What has changed that makes it a viable Windows/gaming CPU? I'll believe it when I see it.
Oh well... if we wait another 2 generations of GPUs, there'll be no need for GPU supports. Those cooling systems will be supported by the mighty earth herself at the bottom of the case at this point... or we'll need a separate case to keep the GPUs in.
As has been noted elsewhere, with AMD not doing a high-end release this generation, Nvidia has no particular incentive to push the envelope with their own cards. So why go there? (With the FE at least.)
JUST WATER COOL IT AND GIVE US 4 EIGHT-PIN PLUGS AND WE'RE FINE. OR JUST A MOTHERBOARD-STYLE CONNECTOR WITH 4 EIGHT-PIN CONNECTORS... WHY HASN'T ANYBODY THOUGHT OF THIS...!!???
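For what it's worth, the math on the four 8-pin idea roughly lines up with the standard connector ratings. This is just the published figures, nothing 5090-specific.

# Four 8-pin plugs vs. one 12VHPWR connector, by rated capacity.
eight_pin_w = 150   # rating of one 8-pin PCIe power connector
hpwr_w = 600        # 12VHPWR / 12V-2x6 maximum rating
slot_w = 75         # what the PCIe x16 slot itself can supply

print(f"4x 8-pin + slot: {4 * eight_pin_w + slot_w} W")
print(f"12VHPWR + slot:  {hpwr_w + slot_w} W")

Both routes top out at the same 675 W on paper; the difference is how many cables you're willing to plug in.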
Arm will almost certainly take over in the thin and light market. I fully expect to see an ARM board for the framework laptop in the future. x86 has quite a while to go on the high end.
Nvidia™: Buy this RTX 5090 at the ultra-low cost of selling your own house. On the dotted line: we do not accept any liability for the homelessness this may cause, or for any damages.
TSMC business model: Foundry = Good
Intel business model: Foundry = Bad?
This is the start of Intel as a $700B company in just 5 short years.
Nvidia GPUs need their own case. The price is more than your whole desktop setup, including your monitor and accessories. Then they'll release a better Ti or Super version for the cost of another $1000.
700 watts 😳 I'll still stick with my 3090. Maybe I'll get a 4090 when the price goes down. I'm sure the 5090 won't even fit in my case.
I tried to buy that 7800X3D twice at the $278 price, but it was cancelled by Amazon and my money was held for 10 days! I think it was a scammer.
So what was "the impossible"? And what's an "RRX"? Also, Intel making Nvidia chips isn't "news". This has been known for months now.
Anyone shilling for Microsoft will be shooting themselves in the foot, because people are flocking to Linux thanks to Microsoft's dystopian choices.
For the cheap price of one kidney and half a toddler's lungs, you can get the brand-new Nvidia 5090 GPU with ultimate performance!
Do you think the 5090 will utilize PCIe 4.0, or will it proceed to utilizing 5.0 speeds?
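Rough spec numbers for context: these are the theoretical x16 link bandwidths from the PCIe spec, and whether any game actually saturates either link is a separate question.

# Theoretical x16 link bandwidth for PCIe 4.0 vs 5.0 (both use 128b/130b encoding).
def x16_bandwidth_gb_s(gt_per_s: float) -> float:
    lanes = 16
    encoding = 128 / 130
    return gt_per_s * lanes * encoding / 8   # GT/s per lane -> GB/s for the link

print(f"PCIe 4.0 x16: {x16_bandwidth_gb_s(16):.1f} GB/s")   # ~31.5 GB/s
print(f"PCIe 5.0 x16: {x16_bandwidth_gb_s(32):.1f} GB/s")   # ~63.0 GB/s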
I want Intel to finally start focusing on power consumption relative to performance; then I'll be happy, ESPECIALLY in the GPU market.
For 40 years the top GPU was $500 brand new, now for some strange reason it's over $1500?????