In case he changes the title, the original is:
RRX 5090 Does The IMPOSSIBLE
I would LOVE to get the RRX 5090 for my future build!
Lol 😂 @@NEWLuigi64
Is that some kind of Amdvidia brand?
Day 3 still no title change lol
best gpu fr
The 5090 will be the first card where the model number is also the price.
3090 in the shortage was that too 😂
Nice lol
Lore accurate price if we are in a simulation i guess
lol this made me laugh here is a like
🤣🤣
RTX 5090 TI will come with gasoline electricity generator
This card might need to be hooked up at a hydro dam.
What NVidia needs to do is develop its own case and power supply for these things, that's how outlandish they are becoming.
@@scruffy7443 it will also draw power from your nearest powerplant
came to write this. This is silly, here's hoping AMD has a proper solution, even if it's not as fast.
Nuclear powered
5090 will be the most affordable budget card ever
😂😂😂😂😂
Just $1999
@@space6370 that'd be a steal tbh
@@space6370 I hate to be that guy, but you mistyped the 1; I'm sure you meant to type 2.
Source?
New Ryzen on the horizon. Horyzen.
Get out.
i approve.
Good one. I’m using this
GET OU-
It will set your house on fire three times as fast as
The biggest issue with the 5090 is sure to be the price.
It's only an issue if you don't have two kidneys and a liver to trade.
@@JohnDoeC78 agreed. Most of the "80" and "90" series are way too much and out of reach for most people.
@@JohnDoeC78 I hope I can too. I got an MSI Suprim X 4080 when it came out, and can't wait to get a 5080/5090.
2k isn't that much for at least 2 years (if you upgrade every gen) if you play games a lot. Save up and buy the best.
@@KOOLBOI2006 ???
"RRX" The card is so powerful that he has forgotten how to spell it 😂
🤯
🤯
🤯
🤯
🤯
RRX 5090?
Is that a Nvidia AMD hybrid?
One way to figure out power draw
Funny how you influencers are already hyping the 5090 as it's not going to cost over US$2K. Then again, influencers get them for free exactly for this reason.😂
Nvidia just needs to ditch the 12VHPWR connector until they have vetted the product out. Just go back to the double or triple power connector.
or quadruple 💀
Naaa... the single one is cheaper!
@@gerardsitja it's not cheaper if in the end you have to warranty most if not all cards.
Nothing wrong with that cable though? And it's an industry standard now. Get out
@@grus.clausen is anyone but nvidia using it? Lol
When the 1080 came out, I upgraded my whole setup. Mobo, ram, i7, and a 1080. It cost 1 month of my monthly wage. Today I could not buy the 4090 alone...
You can still buy more performance for cheaper, so why exactly are you complaining?
RTX 5090 gets a saleh🇹🇷🥇🎂👑🥳😀🎊🎉👍
Yeah that’s pc gaming for you
@@mAny_oThERS he meant to say he could buy the best setup at the time for his monthly wage, and now just the latest GPU alone is too expensive
Would you have been able to get a Titan Xp back then? For an apples-to-apples comparison you should put the 4080 into the equation, not the 4090.
"The PCB breaking from its own weight"
Yeah, what do you think when you try to build another tower of babel sideways horizontally.
I'm not a bridge engineer, but we gonna start analyzing structural design on PC talks from now on.
ahh yes, the RRX 5090
I mean, it is good that Nvidia found a more efficient cooler for the 5090 (and maybe the rest of the FE Blackwell GPUs); that means less weight and space, but... Tbh the most important things are the price, power consumption, and performance. Oh... and yeah, that the connectors don't melt 😅. We will see.
Video Title: 5090 does the impossible!
Me: Be reasonably priced? Not lock features behind an upgrade? Fit in normal cases? Have a power cable that doesn't destroy your PC?
"RRX 5090" The GPU that nobody expected lol
The GPU weight and sag can easily be solved with rear connectors. Extend the PCIe slot and have extra connections into the motherboard for power. Or, next to the PCIe slot, have a gap, then the power connector and extra power connections on the motherboard. This would spread the load and get rid of ugly cables. It's a win-win. For now, we could just have GPUs ship with an adjustable anti-sag bracket. The extended PCIe slot, though, is my fav solution.
I just use an anti sag bracket that looks neat and was cheap
$279 for a 7800x3D I wish!!!
This was by a new Amazon vendor from China. I ordered one and then they cancelled the order, and ultimately Amazon told me they are no longer a vendor. So ultimately they didn't ship any CPUs at that price.
Microcenter bundle w/ 32 GB DDR5 6000 RAM + a good MOBO for $470 (less with Insider discount).
wait for 9800x3d
@@wololo10 I am… pair that with 5070 or 5080!!
@@wololo10 wait for 11800x3d
GPUs are basically at the point of diminishing returns for almost all gamers.
@@JohnDoeC78
5-15% would be pretty poor performance uplifts
(I think the mid range RTX 40 series had some GPU close to that though)
around 30% is the typical performance gain per generation for 80 series GPU,
though there are generations where it's much higher than that.
Especially with how many ports are garbage, and even on a high end system run like crap. They just run "less crappy" the more powerful your PC is, but still crap. This will keep happening until developers either ditch the Unreal engine, or Epic gets their shit together and fixes this garbage. They've had years, and there doesn't seem to be any end in sight. Stutter, stutter, stutter, stutter.
We aren't living in the 2010s anymore.
@@JohnDoeC78 1080 is a bad example, as a product it was an anomaly. Later Nvidia realized their mistake and started to tier their cards more carefully.
I expect my 4090 will be overkill for any game that interests me for the next decade. Does what I need it to for AI at the moment though
but does a 1080 have av1 encoding capability?
I actually like that the 4090 is massive. I like the look of the giant coolers
Yeah until you try getting the AIO hoses behind or under it to get to the CPU. I have the triple fan 4080 Super OC and have to mount my radiator in the front of the case. The card is 340mm long. That removes routing options like you wouldn't believe. And it's in the top slot and needs a bracket to hold up the ass end so it won't sag and crack the board. They could have considered all that.
if everyone watching this kicks in a few bucks we may be able to crowd fund half of a 5090
good idea
it drains consumers' wallets even more than we thought
So now they can say the 5090 is worth that price because it fits better in cases
I saw how the Acorn RISC Machine architecture was built; the power draw and heat characteristics of the original ARM were honestly crazy
I'm sure Intel entered a strategic partnership with Nvidia that is not to their disadvantage.
It's more that now Intel has its own fabs, they need to keep them up, and that's just not possible with their own chips alone. Worst-case scenario, they can always make it super expensive for chips that aren't theirs to be made there, so they can keep a similar price-to-performance.
@@mryellow6918 Good point. Intel might be making them, but they are also setting the fab price.
Lol good one 😂 0:19
I’m going to get the 5090 when it comes out. I am a dedicated gamer…
Jk. I am getting it because I am in school to become an AI engineer, and by the start of 2025 I'll have finished my course. A good GPU will carry me a long way… I'm still gonna game on it tho lmao
Intel is attempting to become a silicon foundry to rival TSMC; in order to do that it needs to take orders from AMD and Nvidia. ARM laptop/desktop chips do pose a potential threat to Intel, but at the same time, it was Intel not having exposure to mobile phones that gave TSMC its huge lead in foundry, and gave AMD and Nvidia cheap access to high-end, low-cost silicon to out-compete Intel. Intel is building its place in the AI scene; being US-based, with a history of operating US foundries, they are perfectly placed to take advantage of any disruption/restrictions on AI chips manufactured in Taiwan or South Korea.
Not to mention the tariffs coming on the horizon as well.
Intel becoming US TSMC would be a very smart move.
Big facts!
They need to go back to the drawing board; the power consumption and size are too much. It will be way too expensive too.
How many people took advantage of the $279 price tag on that AMD 7800X3D?
The seller looked like a scammer; it wasn't being sold or shipped by Amazon.
This was by a new Amazon vendor from China. I ordered one and then they cancelled the order, and ultimately Amazon told me they are no longer a vendor. So ultimately they didn't ship any CPUs at that price.
This channel is too underrated for only 300k subs
The only way I could see it being 2-slot is if it's liquid cooled somehow, coming with its own AIO solution. Not entirely unbelievable, as xx90s are not meant purely for gaming on a monitor; they're meant for people who also use their home PC for production work of some type, as well as things like VR simulation. People with this sort of setup are often going to have cases well suited for it, usually full towers with plenty of space to mount multiple radiators. Not really intended for your average mid-tower 1440p120/4K60 gaming rig: that's the domain of xx80 and xx70 Ti type cards, specifically if you like ray tracing and DLSS. If not, then go AMD for a decent amount cheaper.
Arm will almost certainly take over in the thin and light market. I fully expect to see an ARM board for the framework laptop in the future. x86 has quite a while to go on the high end.
The B200 has 2.6 times as many transistors as the H100. A less-than-50% TDP increase isn't even that crazy considering the number of transistors, and the fact that they aren't made on the newest, most efficient process node.
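For what it's worth, the ratios in that comment hold up arithmetically. A quick sketch using publicly reported approximate figures (the transistor counts and TDPs below are assumptions from public reporting, not official spec-sheet numbers):

```python
# Per-transistor power comparison, H100 vs B200.
# Assumed approximate figures: H100 ~80B transistors at ~700 W,
# B200 ~208B transistors at ~1000 W.
h100_transistors_b = 80    # billions
h100_tdp_w = 700
b200_transistors_b = 208   # ~2.6x the H100
b200_tdp_w = 1000          # <50% TDP increase

transistor_ratio = b200_transistors_b / h100_transistors_b  # 2.6
tdp_ratio = b200_tdp_w / h100_tdp_w                         # ~1.43

# Watts per billion transistors: despite the higher total TDP,
# the B200 spends noticeably *less* power per transistor.
h100_w_per_bt = h100_tdp_w / h100_transistors_b   # 8.75
b200_w_per_bt = b200_tdp_w / b200_transistors_b   # ~4.81

print(f"{transistor_ratio:.1f}x transistors, {tdp_ratio - 1:.0%} more TDP")
print(f"power per transistor: {b200_w_per_bt / h100_w_per_bt:.0%} of H100")
```

So under these assumed numbers, power per transistor actually drops by roughly 45%, which is the commenter's point.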
The cable, and connectors need to be made of asphalt concrete!
They need to allow the use of asbestos in this instance. Then seal it and label it a biohazard.
Here’s hoping the 40 series gets cheaper by the time I start looking to buy in September
Intel isn't just a design house; it's a semiconductor manufacturer just like TSMC. Even if they can't keep up with new chip designs, they can keep up with process node development (don't forget that Intel was basically the first to ship FinFET).
If they go with a chiplet design they could spread out parts of the die, providing cooler temps and fewer hot spots.
That's not really how that works.
The 5090 is going to be loud af with only two fans
Whether ARM or x86 doesn't matter. What matters is performance, compatibility, stability, pricing, and a flawless transition to ARM processors if they make it to the desktop.
i like the 4090 being giant because it stays ice cold all the time.
I don't think arm will overtake x86 because of the gaming scene where arm can't use external graphics
arm can use graphics cards
@@lucky-segfault I did not know that thanks for telling me
Imagine if the 5090's cooling used refrigerant gas like the AC in cars
Intel helping Nvidia is Intel changing from a CPU-only company to a chip platform service. It is less risky to build others' designs, and it helps with the cost of new process nodes.
Damn. That's a high bar for engineers nowadays to redesign something. Truly IMPOSSIBLE!
The current no-compromise high-performance consumer PC requires a 1000W+ PSU, a large case, and very expensive cooling, and it looks like an RTX 5090 high-spec build will be around 1400W minimum. The ARM race, currently led by Apple/Snapdragon Elite, is looking more and more attractive. I'm hoping something turns up so that I don't need to keep giving Nvidia my money.
RTX fifty ninedeez nuts💀
I was going to go with the 4090, but I was scared off by the power connectors, so I went with a 7900 XTX
Apple's M3 Max already has one of the most powerful CPUs on the market with only 16 cores and even rivals desktop CPUs. So with noticeable improvements each year, ARM will for sure overtake x86 (in efficiency it always has led; in performance as well it will).
Love how I posted weeks or so ago that the 5080 will be just a couple frames, give or take, from the 4090. It was confirmed yesterday that the 5080 is the exact same as the 4090.
i want intel to finally start focusing on power consumption to performance, then I'll be happy, ESPECIALLY in the gpu market
Do you think the 5090 will utilize PCIe 4.0, or will it proceed to utilizing 5.0 speeds?
Intel 3 costs more and is less efficient than TSMC's 3nm. Nvidia's being dumb about this choice.
Where did you get the info that it's more expensive?
They are not going to allocate some of their GPU wafers at TSMC to what would be a low-volume CPU part at the start. The Intel process will be good enough for that design.
The decision comes from above - Washington.
TSMC is sold out... even AMD buys from Samsung
Not necessarily; it depends on where the Intel fabrication plant is located. If Nvidia doesn't have to pay for all the overseas shipping (provided the chip is made in the USA), then Nvidia can save a ton of money and still charge more for setting your house on fire.
They’re getting ready for the new cpu and gpus.
They want us to build a server room in our houses now ??
I think there's too much comfort, resistance, and financial and logistical reason against change for x86 to move to ARM, but that's not to say ARM won't be another option to choose from, just like deciding between PC and Mac.
Safe to say the RRX beats the RTX
Can't wait to have a toaster running hours on end for 4k Minecraft on my 1080p monitor
Can't wait for the RRX 5090
ARM isn't the only player in town. There have been many advancements in the RISC-V design of late, and it's only a matter of time before someone drops one of these into a laptop or PC.
I wouldn't care since I'm not interested in a phone processor on my PC.
5090s will come with a complimentary fire extinguisher.
gigabyte 4090 waterforce or msi suprim i think is the sweet spot for form factor and cooling as far as 4090's concerned. i hope they go that route with the 5090
New home heaters are on the way
first android tablet cpus and now actual pc cpus. nvidia is evolving.
For 40 years the top GPU was $500 brand new, now for some strange reason it's over $1500?????
Not only do I think that ARM could completely replace X86, i think it's actually inevitable.
Melting connectors on the 4090, not PCB breaking.
MLID was speculating that Intel might go fab-only, or at least more fab-focused, instead of making their own consumer products
Imagine spending thousands on a setup just to get slapped in cod by someone running a $300 setup 😳 😂 but sure totally worth it lol
That $280 for the 7800x3d was a farce. I ordered, and supposedly, it shipped. 2 weeks later, still nothing and got a refund
Haha, scalpers gonna milk another round.
ah yes! the RRX cards are finally here!
I guess you will need a mini fusion reactor to run the 6090.
Intel changing their CPU naming convention was waaayyyy overdue
Don't get too excited: with the China ban, the RTX 5080 might not be faster than the RTX 4090 if Nvidia is planning to sell it in China.
Intel has tried to kill x86 a few times. First was the iAPX 432, then the i860, and of course Itanium. Now there is talk of X86S, a 64-bit-only x86 architecture.
Someone needs to build a whole mini pc inside a founders 5090 cooler
Today I'm doing my regular clickbait. Welcome to Baiter Melt
Already seeing huge demand for ARM in the enterprise space, so yes I can see X86 losing market share.
"RRX" 5090
With energy prices as they are, a 5090 will cost a fortune to use.
Ah yes, my favorite card, the RRX 5090.
Nothing does the impossible, because the impossible can't be done.
The 5090's real TDP is lower than everybody thinks, because TDP is tied to computational load. 145 watts is a rough estimate, but effective.
TSMC business model: Foundry = Good
Intel business model: Foundry = Bad?
This is the start of Intel as a 700B company in just 5 short years.
x86 is definitely hanging on for dear life; ARM is on its way
Uhh, just a quick question, but how many drinks did you have before making this video?
I’m telling you all, rtx 5090 would need a complete house upgrade.
your video brightened my day, thanks for the positivity!
I remember how the leakers were like: 30-40% IPC gains for the new 9000-series Ryzen CPUs. Where are those copium leakers now? 😂
Okay, everyone's meming on the title, but imagine how wild it would be if Nvidia actually renamed their RTX branding to RRX and Gamer Meld accidentally predicted it with a typo. It'd be the most unbelievable arc in tech RUclips, lmao.
Damnnnn, i honestly thought I was able to FINALLY catch up and upgrade with the newest series. I was literally finally able to go from my 1650 to the 4060 that i got totally brand new, still in box sealed with the cellophane at an 83.3% discount from an auction site. Like, i literally just won it last night and have to go pick it up today, lmao.
I FINALLY get caught up with the world and i don't even have it picked up yet and already y'all are leaving me behind because this 50 series is about to drop 😂😂.
Damnnn, i was so excited thinkin i was caught up with the times 😂😂.
A food vendor selling hotdogs for years suddenly making burgers for a competitor is, as a concept, not news. Look at companies such as Mars or Coke. They make a lot of competing products, basically ensuring their future by being the ones to make what consumers feel are choices: providing many of those choices themselves, sometimes under different names, rather than letting someone else.
This could become the case with Intel working with NVIDIA. Intel making chips that compete with their own chips, for others, is to me a smart move, for they will still be the ones making those new chips: giving consumers the idea of having more choices, but having those choices come from them.
The other smart thing about this move is that it helps Intel stay competitive with AMD, but with NVIDIA's help.
Many say it's best to have a CPU and GPU from the same company, and for now that's AMD. Intel tried to compete on their own by making their own GPU to work with their CPU. Though it was a good GPU, it was still greatly overshadowed by what both AMD and NVIDIA already make, and so it had not done well.
Seeing that NVIDIA had announced their intention to make a CPU that would work in tandem with their GPU, Intel partnering directly with NVIDIA to make this new CPU keeps Intel in the game of CPU manufacturing. Should NVIDIA's plan to do what AMD has been doing work out, it might otherwise have taken Intel out of the picture.
So even if Intel CPUs become unpopular, with both AMD and NVIDIA offering a CPU and GPU and many wanting to switch to them someday, and even if Intel's CPUs alone are still better than what AMD or NVIDIA come up with, Intel is not left out, by being the one making NVIDIA's CPUs for them.
I only see this as a win-win for Intel, in the long run.
I really hoped they had found a way to
Make it smaller but more powerful.
*sigh
*check wallet
*cry
Ah the next modern home heater
I think that the new strategy of intel may save the company.
the snail is stalking me i see it everywhere