The Logitech 'game console' is going to be marketed as a 'gaming device' even though it's just recycled and/or surplus phone chips being used to create 'emulation machines'. It's why the market is so saturated with dozens of emulation devices that almost always under-deliver when compared to contemporary hardware.
The Toshiba Handibook of steam decks
This. 100% this is exactly what I thought about it, as well. Sad thing is people will buy it.
@@xKB616 it's mainly for Asian market where mobile gaming and gacha is huge
@@SRC267 but why get a separate device when you can just as easily play on a smartphone?
It's not even designed to run emulation of old games natively. Logitech's original bet was that cloud gaming might become popular and that cloud gaming subscription services would run on the device. Looks like that bet did not pan out.
Intel is already like 6 months too late with these Arc cards. Nvidia and AMD are already preparing to launch their new cards in the last quarter (most likely).
Kind of a shame, since Intel really could have had a huge market gap to exploit back during the GPU shortages.
Intel delayed the launch of Arc to give Nvidia and AMD a chance to compete. Such a gentleman, Intel.
well nvidia cards are just going to be power-guzzling thieves, so like who really needs them when we have the 3000 series. idk about AMD's future cards, haven't really heard much about them.
@@homeyworkey There will still be people who buy the power-hungry Nvidia cards no matter what. We've all seen it before. IIRC AMD said their RDNA 3 is 50% more "efficient" at the same power. But we'll have to see how that turns out in the real world.
@@OneDollaBill The thing about that “more efficient” figure is that it pretty much means better graphics capability at the same power level, which is something you’d expect in a next-gen card. The max wattage on these cards will be significantly higher than the current gen. If you do a full upgrade with the Ryzen 7000 series, most people will need a new PSU. I think we’re trending in the wrong direction with this insane power increase.
Nvidia and AMD STILL DO NOT HAVE GOOD $200 CARDS. The A380, at least on paper, is a very good $140 card since it has 6GB of VRAM instead of a measly and outdated 4GB.
Remember: Bold claims require bold evidence.
i doubt anyone believes them
@@Luca-mu6hq
I mean people are still waiting for Full Self Driving lol, people will believe anything nowadays.
They don't even have any working stock to afford evidence
More importantly: **bold claims require bold typeface**
I have bold evidence!
*Bold Evidence.* There. Impressed yet?
To be fair to Intel, they said the _hardware_ was competitive, which may be true!
The software that runs it, on the other hand...
They can't even have working hardware considering they cancelled production of their high end cards for the first gen
I wish I was as confident as Intel 😭
Facts
Intel's got the confidence of a teenage boy with his first whisker.
Fake it till you make it bro
Intel's got that false confidence that you feel until you try to confess how you feel to someone and get crushed.
@@JohnDoe-wq5eu I mean intel did prove themselves with 12th gen. Now upto the ARC
This might be a "power at the crank" vs. "power at the wheels" situation.
Even if Intel has the RT hardware (engine/crank) to compete with Nvidia, the software (transmission, axles etc.) won't be ready.
This has also been AMD's issue, tbh. Remember the Vega series cards? In pure compute they bested Nvidia's 1000 series cards, but in the things most people actually use their gaming GPUs for, they struggled.
Great analogy. I concur.
3:18
I wonder if this means you can play HDR/4K content from streaming services on it, since most services don’t allow HDR/4K on a desktop, but only on phones or built in apps.
If it has a built in app, it means it could actually playback 4K/HDR content from streaming apps.
hence I pirate anything that's not on Netflix. Netflix is the only streaming service that has DD 5.1+ and 4K on desktop.
This is actually because of a browser limitation.
That's not a hardware (or browser) limitation, that's purely a DRM thing. Companies consider mobile devices (and dedicated apps) more secure so they have DRM there that allows streaming higher quality content.
I don't think it will be 4K, as the chip is made to run at most 1440p 120 fps. But 1440p 120 Hz with a big battery would run GeForce Now pretty well. It would crush a Steam Deck on graphics quality, since the GeForce Now 3080 tier is a beast of a service. As long as you have solid internet.
@@amunak_ exactly, so if this is some mobile chip powering it, maybe it could have the DRM it needs.
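For context, the gating usually works like this: the service checks the device's DRM robustness (e.g. Widevine L1 vs L3) and caps the stream quality accordingly. Here's a minimal toy sketch in Python; the exact resolution tiers are illustrative, not any real service's policy.

```python
# Toy model of how streaming services gate quality behind DRM robustness.
# Widevine defines L1 (hardware-backed: phones, TVs, dedicated apps), L2, and L3
# (software-only, what most desktop browsers get). The caps below are illustrative.

MAX_QUALITY_BY_DRM_LEVEL = {
    "L1": "4K HDR",
    "L2": "1080p",
    "L3": "720p/SD",
}

def allowed_quality(drm_level: str) -> str:
    """Highest stream quality a hypothetical service would serve to this device."""
    return MAX_QUALITY_BY_DRM_LEVEL.get(drm_level, "SD")

print(allowed_quality("L1"))  # "4K HDR" - why apps on 'secure' devices get the good stream
print(allowed_quality("L3"))  # "720p/SD" - why a desktop browser usually doesn't
```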
gotta love the editor's honesty on the vfx
The funny thing is a 1.3M dollar house in Australia is just a 2 bedroom home 😂
Eh depends where you buy at, and she apparently bought a 5-bedroom home.
@@Sevicify It does depend where you buy it. In my neighbourhood a $1.3M “home” is a small 1 or 2 bedroom apartment. In Double Bay it’s a tiny studio apartment. In the outer western suburbs you might get a decently sized 2 bedroom home with a yard!
damn Australia is expensive
@@KRoOoOoZ This is just Sydney, but Melbourne is just as bad. If you go to rural areas it's much cheaper, but then you're hours away from literally everything. No local stores, no supermarkets, nothing except bushland for an hour.

The problem is that 95% of our entire population lives on the coast, and on the eastern side of Australia we're mostly in 3 large cities (Sydney, Melbourne, and Brisbane), so it's driving up the cost of land in the relatively small areas that host most of our population. If you want to live in or near the city, you don't have many options: you either pay the over-inflated land value, or you live further away, which just means you have to drive an hour or two to work every day. Sydney is one of the most expensive cities in the world when it comes to house and land value; it's up there with Manhattan, where celebrities have driven up values. The median house price in the entirety of Sydney is 1.25 million, which means some areas have $5 million houses (Double Bay has a median of $4 million), while the next area 15 minutes away might be 900k.

It's a beautiful country, but it's expensive. Luckily our social welfare system is good. If you're poor here, the government will give you a house at a massive subsidy, and you pay for it from your welfare payments, so it's effectively free. Healthcare is free for everyone though. Medications are highly subsidised and incredibly cheap. So that's one less thing you need to worry about here at least.
PCIe 5.0 is backwards compatible, so it can be used with peripherals conforming to any of the previous PCIe revisions, including 4.0
so same as it has always been...?
James exudes 'whateveridontfkincare' energy and it's especially strong in this one. Good host, thanks for the (tech) news.
Intel had me fully believing in them, but is anyone buying their BS anymore?
It's even souring me on their CPUs. I like it when a company under talks and over delivers.
And if the 13th gen is not promising then AMD will take on the cpu market with zen 4
I'd rather have some concrete examples like in the Ryzen presentation. A comparison between the new product and its competitors (current Intel i9 and Ryzen 9) in different applications (games and productivity) with percentages showing how it compares. A "yeah, ours is better" isn't convincing anyone of anything.
I don’t understand 12th-gen CPUs at all. They need an aftermarket plate for proper cooling? I’ll never understand their board schemes (AMD's are starting to confuse me as well). Why do their integrated graphics still suck ass?
@@leonro It's just dumb as hell. Intel can't even get their GPUs to work properly in still widely played games or with not-so-uncommon configurations like not having Resizable BAR. That's all Nvidia has to respond with (if they want to get down to Intel's level, which they probably shouldn't).
Intel needs to shut up, set extremely aggressive prices, sell themselves as "cheap but good enough for an average gamer", and flood the market. Then work on their shit so the next gen works flawlessly with the most-played games and doesn't have boneheaded drivers full of issues. Only then should they actually work on fancy features and whatnot.
@@amunak_ IMO it's not a problem that Intel GPUs don't work well without Resizable BAR. The GPUs should only be bought by people with CPUs that support it, and that's still quite a lot of people. Any completely new PC should work with it.
However, they need to have a lower price than comparable alternatives because, even if they were equal in performance, competitors have the advantage of working better without the feature.
On the topic of what features they should be working on, there must be multiple teams working at Intel on everything related to GPUs. They will be working in parallel both on legacy compatibility and on new games and features.
RTX 40 is looking like it will be a double step up - the 4060 competing with the 3070 Ti/3080.
So I'm wondering where Intel's 3060 competitors fit in
Even if AMD didn't support their new AM5 socket beyond 2025, that would still be longer than pretty much any Intel socket has ever lasted, since Intel switches to a new socket much more frequently.
Two sockets for each CPU gen? Intel's got you covered, bro! You're in for an adventure when you can only pray and hope that you don't need a new socket (plus motherboard etc.) when you want to upgrade your PC.
Back when zen2 came out AMD said to start expecting longer times between CPU generations. With nearly two years between 5000 and 7000.... 2025 means two generations. 2025 will see the release of the APUs for the generation after 7000.
"support" doesn't necessarily mean a cpu release
Support usually implies.. Well, support, not manufacturing. Just like how in smartphone world certain manufacturers guarantee 5 years of support. In this case it probably implies motherboard availability and bios updates.
5:15 judging by how hard railey laughed, I'm gonna say he didn't write that joke.
I bet he was jealous too, because it was a good one.
railey
Intel acts about their GPUs the same way as that one guy in every gaming circle that claims to have the best computer with top of the line hardware but never ever posts pictures of it or shares any clip of them playing at max graphics.
Yeah, what's claimed without evidence can be dismissed without evidence. Intel, numbers or GTFO!
@@Donnerwamp
You can't even trust them with numbers nowadays, they will run whatever benchmark will benefit them
Oh, they do sometimes share pictures of hardware boxes, but extremely close-up, so they could be from anywhere really😂 We have this one guy with a Series X he didn't set up for like 8 months, till he ended up getting one for real..
@@latinouncle8752
Lol, remember the SSE2 P4 benchmarks when nothing used it? Remember when AVX-512 was critical as the only thing they beat AMD with... until Intel, yeah, dropped it this year...
Man I love the idea of an AdBlock T-Shirt.
if I ever got my hands on 10 million that wasn't mine, I'd just try to make money off it with interest until someone came knockin' looking for the money back.
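For the curious, a rough sketch of what that plan actually earns; the 4% savings rate is my assumption, and the 7 months is how long the money reportedly sat in the account.

```python
# Rough numbers on the "hold it and pocket the interest" plan.
# The 4% p.a. rate is an assumption, not from the story.
principal = 10_000_000
annual_rate = 0.04
months = 7

interest = principal * ((1 + annual_rate / 12) ** months - 1)  # monthly compounding
print(f"~${interest:,.0f} kept after handing the principal back")  # roughly $236,000
```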
Lol, I just saw the Intel claims PC Gamer article on Google News about 10 seconds ago, and then your bell rings. XD Perfect timing!
I'll be honest, when the Arc GPUs finally come out, it'll be a tough choice between the A700 and the RTX 5060.
Great tech news! I like it when James comments "trust her, she's a doctor" referring to what Lisa was saying!🙃
The LG Curve monitor has a really cool feature not mentioned here: it curves itself! You can adjust it in 5% increments without touching it
When Intel announced their GPU division, they said they could make a card that outperforms the GTX 1080, and they've delivered... years later.
Problem is, I don't think they're anticipating where Nvidia will be in 2-4 years with ray tracing (when Battlemage launches); they're only looking at RTX 3000 series performance today.
They aren't stupid; the first launch of a new product is always hard, as we've seen from AMD (Ryzen) and Nvidia (RTX) too.
The 2nd-gen GPU will be way more optimised, and the mistakes made in the 1st gen will already be fixed by then.
I mean, isn't it a bit unfair to expect Intel, which doesn't have the 20+ years of experience Nvidia and AMD have, to already be on par with them?
I think Intel will get the software fixed in the long run, and we already know the hardware isn't bad.
So this could be the RX 480 of Intel GPUs, which also had bad software at the start.
The fact is, people are still buying cards weaker than a 1080 in 2022. It's a large market; not everyone is an oil prince who can spend ridiculous amounts of money on gaming hardware every few years.
2:10 The background music in today's video was really nice :)
these days every bit of Intel GPU news just makes me laugh
@2:45 Ahhh, that's an exemplary wholesome Riley-James comedy duo joke
"I don't know which way is news" is my new favourite phrase to have ever been uttered on this channel
Logitech handheld: *Leaks*
*Images get taken down*
Thanks for confirming the leak logitech
2:00 I love the "subtle" reference to Anthony's role as iconic super nerd king of the nerdy LMG nerd company. ;)
Since the 3060 or 6600 range of GPUs is where I'm looking to buy when I upgrade, I'm excited that Intel is claiming to have something competitive.
they unfortunately won’t be very competitive, because Intel doesn't even know when they'll be ready, so the next-gen GPUs from AMD and Nvidia will probably be out and destroy them
Intel must be dreaming. They can't even produce a stable graphics driver, and they've abandoned DirectX 9.
Not so sure they're dreaming, but the claim probably only holds in specific cases where the drivers sort of work.
I think Intel are finally going to release something, at some point, with great fanfare.
And then realise that people don't care anymore, and that they waited too damn long.
there's a weird high pitched creaking in the audio around 0:52
Absolutely loved the LotR reference!
So lemme get this straight. Intel claims they are better than Nvidia's mid-tier video card, versus Intel's BEST video card that only exists theoretically. Yet Intel's GPU drivers make Radeon drivers laugh at them. Meanwhile, Matrox still wonders if anyone remembers that they exist.
the people behind Arc have now lost their minds🤣 they can't even fix their driver issues, yet they're making absurd claims
Arc really could be faster in raytracing. Their RT core architecture works differently from Nvidia's, with much less performance penalty for branching due to smaller warp size.
I'll test it myself as soon as they ship my free A750 from XeHPG Scavenger Hunt.
@@ProjectPhysX Well, with the driver issues I don't think it will be, unless they fix them. And even if it is different from Nvidia's, you just said that based on what the people behind Arc have stated, which isn't 100% believable, since they've already put out misleading benchmarks of Arc performance before. I'd say even if Intel's ray tracing hardware is less taxing on performance, I still won't agree that it outperforms Nvidia's second-generation RTX. It's probably better than Nvidia's first-gen RTX (when it released) and also better than AMD's first-generation RT (RX 6000 series). No matter what, it's simply too absurd a claim from Intel.
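For anyone wondering why a smaller warp/wave size would reduce the branching penalty, here's a back-of-the-envelope toy simulation in Python; the numbers are my own illustration, not Intel's or Nvidia's figures.

```python
# In a SIMT GPU, a whole warp/wave executes both sides of a branch if even one lane
# diverges, so the expected penalty grows with warp size and shrinks with smaller warps.
import random

def divergence_cost(warp_size: int, p_branch: float, trials: int = 100_000) -> float:
    """Average number of branch paths a warp executes (1 = uniform, 2 = both paths)."""
    total = 0
    for _ in range(trials):
        taken = [random.random() < p_branch for _ in range(warp_size)]
        total += 1 if all(taken) or not any(taken) else 2
    return total / trials

for width in (8, 16, 32):   # smaller SIMD widths vs Nvidia's 32-wide warps
    print(width, round(divergence_cost(width, p_branch=0.1), 3))
# Expected trend: ~1.57 at width 8, ~1.81 at 16, ~1.97 at 32 - narrower groups
# are statistically more likely to stay uniform, hence less divergence overhead.
```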
What was that "move to India" reference at the end?
I think the chipset table at 2:17 is instructions to board makers: the X670E has to include 2 x16 5.0 slots, it's OK to only have 4.0 slots on the X670 and B650, and for the B650E it's recommended that manufacturers include at least one 5.0 slot.
What's more interesting to me is that all of them will have quad-channel memory instead of the dual channel on every other mainstream platform atm
Loving the 'stouche, James.
That Samsung monitor that can apparently just receive streams by itself is a great idea. Not only does it avoid running a ~50W and maybe-loud PC, but streaming services like Disney+ are 720p at best on PC. Too bad they apparently added the feature to a curved non-4K monitor tho. Bring it to a flat 4K monitor and I'm sold.
You’ve been working out. 💪 nice
Well...The corsair monitor was built with the help of LG. I believe it's their panel, design and technology.
Thanks Mark!
More Dan please, thank you.
Yes, AMD should have a much better time with supply this gen. They're the main customer for TSMC N5, and TSMC is going to take care of them and make sure they get priority with contracts, especially since AMD stuck with theirs when other companies were backing off TSMC contracts, such as by reducing wafer capacity. And they have such a large portfolio of products now that they can shift wafer production as they see fit to meet demand.
AMD struggled to meet demand for EPYC with Zen 3. They haven't really struggled with graphics supply like Nvidia, because they shifted wafer production away from GPUs last year. It puts them in a good spot that both Intel and Nvidia are trying to emulate now.
It was about a decade or more ago when I first saw a prototype monitor in the shape of a Mastercard; I guess it took a lot of work to make such a big screen that flexes.
One time a bank cashier accidentally deposited $70k in my account. When I went to the bank and informed them of the mistake, they didn't even say thank you for notifying them.
I would've bought so many Software, Servers, Networks, PCs, Supercomputers, Military stuff, Land n House, Transportation stuff, etc
As soon as Intel comes out with an RTX 3080 / Radeon 6800 XT competitor, the world of GPUs will never be the same.
As soon as Intel comes out with any video card with drivers that are stable for more than 24 hours.
The problem is they will be competing with those video cards 2 years from now when RDNA 4 and rtx 50 series will be about to launch. 😂
Yep the year 2032 is going to be awesome.
when that time comes, AMD and Nvidia will already be on RDNA 5 and Ada Lovelace
Windows 7 RX 6400/6500 XT ray tracing in Doom Eternal via Adrenalin 22.6.1, let's go!!! Wait, you're doing a Gaijin sponsor again? I hope the company's in a better state with their games now. I just need an excuse to jump back into Crossout again, except this time without hurting myself with DX10.1 graphics (hehe).
Hasn't Intel's marketing dept. learned anything? There has already been massive backlash due to evidence of false marketing/advertising claims. Someone make those clowns get permission from the higher-ups in the company before they throw around more bold statements, because right now they're actively causing damage!
"Trust her, she's the Doctor" made me bark out a laugh that woke up the rest of my house.
I’m still using my 3700x but I don’t see a need to upgrade anytime soon.
2600
same here. really good cpu
end of 2023/ early 2024 would be fine for your next upgrade.
@@Datenschutz_Datenschutz or you know, til it stops doing what he needs like most reasonable people do.
@@bradhaines3142 It depends on whether you're a gamer and what kind of graphics card you use. The next two generations of GPUs will be getting so fast that they'll be held back much sooner by a possible CPU bottleneck. Probably even at 1440p.
Wait, wasn't discrete ARC cancelled? i'm confused.
It's 98° F at 8:40 in the night in SoCal. It's mofokin hot
How long should I expect support for Arc GPUs? Yeah, they might sell them, but isn't it DOA? Didn't they kill the whole division? I'd expect driver support to be short-lived.
So when is James hosting a video on LTT?
Where do I get that ADBLOCK tshirt?!
Why is no one mentioning the fact that it says quad-channel DDR5 memory? My old quad-channel DDR3 system held on for a long time, partially due to having those extra memory channels.
Hardware Tolkien references, this is why I come back.
OMG, adjustable-curvature OLED TVs be coming! Such goodness for someone like me who uses one for both desktop gaming and movie watching. Really hope this isn't just a flash in the pan and they'll stick around for a bit (and come in larger sizes too); can't afford to ditch my CX yet.
Something for future videos about graphics cards: What works best for video editing? I use CUDA acceleration on my exports and it's a lifesaver. How about AMD or Intel's products? Can they do something like that?
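For what it's worth, all three vendors do ship hardware video encoders (NVENC, AMD's AMF/VCN, Intel Quick Sync); whether your editing app also uses them for effects the way it uses CUDA is a separate question. A rough sketch of driving each one through FFmpeg from Python, assuming an FFmpeg build compiled with the respective encoder support:

```python
# Export the same clip through each vendor's hardware H.264 encoder via FFmpeg.
# Encoder names require an FFmpeg build with the matching support and drivers installed.
import subprocess

HW_ENCODERS = {
    "nvidia": "h264_nvenc",  # NVENC
    "amd":    "h264_amf",    # AMD VCE/VCN via AMF
    "intel":  "h264_qsv",    # Intel Quick Sync, also present on Arc
}

def export(src: str, dst: str, vendor: str) -> None:
    """Re-encode src to dst using the chosen vendor's hardware encoder."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", HW_ENCODERS[vendor], "-b:v", "20M", dst],
        check=True,
    )

# export("timeline.mov", "out.mp4", "intel")  # hypothetical file names
```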
When do we get all-Intel gaming laptops (processors and GPUs)?
All of these claims for the new and upcoming hardware... let's see which ones pan out and come closest to the predicted performance.
I updated to Android 13 on my Pixel 4 and I thought I'd killed it, because my home screen was frozen and tapping didn't do anything, but I could still swipe down for my settings and stuff. Found out that the fix, oddly enough, was to turn dark mode off, then back on. Works like a charm now, but yeah, a bit scary for a sec. Haven't had any other bugs with 13 yet, but to be fair, my one bug almost made me reset my phone soooo... One big bug or many small bugs, which is better?
"Oh I Intel, and this one is sponsored by you! " - Noah Kats
seriously, if you get millions by mistake... take the money and move to a country (easy with the money you now have) where your old country has zero jurisdiction to take you to court to get the money back.
And if you insist on staying in your country, invest the money. That way, if you're forced to return the money someday, you only have to return what you were given by mistake and can keep the proceeds of your investments. Pretty much do what corporations, rich people, or Wall Street short-selling firms do.
3:58 holy fuck how old is that gameplay
I think it just looks different to make it easier to see
It's September 1st where I am, so the Techtember news has come early. Stay tuned.
Some people got their hands on the A770
Intel are a generation behind in performance with their GPUs (and almost with their CPUs). They've turned into early-2010s AMD.
2 generations behind by the looks of it. Nvidia is said to launch their next gen in 1-2 months and Intel Arc is still nowhere in sight.
That last story would be very simple in Sweden:
if the entity who paid too much wants the money back but the receiving entity doesn't want to give it back,
there is no second transaction. *Tough luck for the over-payer.*
I would not be against a card dedicated to ray/path tracing to match with your gfx card that would be around $100-150
it seems LMG forgot the recent rumor of the embargo of high end accelerator cards...
bad move spending the accidental refund money; I'd have just put it in a savings account to earn some interest until they asked me to pay it back. lol
My phone has similar specs to that Logitech handheld, and i'm not the only one, so it's useless for many people
The song during the AM5 story was, for me, either too loud, or maybe it was the singing. It distracted too much from the news.
Intel totally has a girlfriend too... she just goes to another school... you wouldn't know her...
If they don't launch the Arc adverts/marketing with "Blue" by Eiffel 65, they've missed a trick.
Releasing late at night with this one!
I would have been long gone with that money. I can't imagine how that would still be in her account after 7 months... Just...WOW.
People huffing about the RTX 3080 and whatnot are forgetting that the best performance/€ is still the RTX 2060, so you just need to be competitive with that and you should be able to snatch decent market share.
alright. ive been drinking. i tried to look up james shirt in this video. i need a link to buy it now.. lol.. wheres it from please. thanks.
Hahaha man you guys are so funny, always keep it that way. Absolutely love your videos. Nothing's better than good tech journalism that is combined with a good dose of comedy.
"...which has only made the question of PCI-e 5.0 support more confusing for everyone except the Anthonys of the world..."
Tell Linus I want a "Deep Dive" channel with Anthony as the host!
Did I hear that something is being released September 2007?
man so competitive with those unreleased cards!
Why would Intel be making bold claims when they can't even get their stock in order, due to hardware problems they could completely prevent? 6 MONTHS TOO LATE!
or you can get a Backbone controller and stream your games to your phone from PlayStation or PC?
me dying to get a proper OLED desktop monitor and now they just make it bigger.
I mean, there's the Alienware 34" ultrawide? Or the LG C2 42", which are at least reasonable.
@@daymianhogue1634 24" enough for me, i am playing competition game and i just want the motion clarity that oled offer plus its great support for hdr content.
@@izzadabdullah5565 yeah, that's a LONG ways off. OLED is easier to make in bigger sizes because of the way it's manufactured. I wouldn't expect anything below 32" for a few years unless you're planning to shell out like $3,000.
2:38 Wish Samsung could do that with their Semiconductor yield rates😮💨
lol I don't get how someone can accidentally get 10 mil by mistake, think it's their money, and go spending millions of dollars right away. It's crazy the mindset people have to do that; it's the same situation with lottery winners who end up bankrupt within like a year of winning, because they just go spending the millions on stuff they could never possibly sustain. It's stupid.
well in the US at least, if something is given to you it's yours whether the company did it on purpose or not, so that bit makes sense. but burning it all was stupid
@@bradhaines3142 I don't think so, actually. I remember from my business law class that this can still be tried as theft even if it's an honest mistake and they don't want to give it back.
@@TinyMiniStealth I know that's how it works with deliveries and payments, but I'm not sure if there's a cap. If it was a check or something in someone else's name, that's definitely theft, but straight money is yours when they hand it to you.
A good chunk of people value shiny objects far more than freedom sadly.
Can't get RGB in prison, unless we count shower time.
@@bradhaines3142 ah ok interesting
5:26 just like iphone 6 plus would be funnier
I completely forgot about arc.
6:50 100% the crypto company's fault; literally five extra 0's added onto her requested $100 refund. Like, if my managers screw up and add a couple extra 0's to my check, that's their screw-up; I'm not paying them back, nor do I expect them to take it out of my future paychecks. Same thing here: they screwed up, deal with the loss.
Anthony is everybody's favourite host, or at least he should be
I appreciate the ghosts for the lotr joke
I genuinely would have left the country in a heartbeat
I don't know, if they make something that's a lot cheaper than Nvidia but is last gen, I don't think I'd mind at all, since I wanted a 3060 Ti and the MSRP was fine, but scalpers made it impossible to afford with their prices. If I knew a scalper, I'd love to show him my frustration about those prices they thought they could get.
Does anyone else find it weird that AMD's boss is always referred to as DOCTOR Lisa Su, compared to the heads of other outfits who may also have doctorates but it's rarely ever mentioned?
Well literally any other CEO that's mentioned in LTT coverage doesn't have one. Pat has a Master's, Jensen has a Master's, Tim Cook has an MBA, Nadella also has an MBA. Gates is a dropout, Page and Brin both have a Master's, and Pichai has a MBA.
Most of them were too busy with their jobs by the time they got their Master's.
@@kyonblack7901 Fair enough! To be fair, I never looked into it, but I figured at least one other Big Name CEO or the like would have a doctorate. Thank you for pointing this out to me - she definitely deserves it then! ^_^-b
*goes back to hiding in the corner* :)
I think Anthony is everyone's favourite host, especially if you're old school and understand all the "weird technology stuff". And yeah, let's call out AMD for planning to maybe, possibly replace AM5 in 3 years, when Intel has a habit of changing sockets every year, sometimes introducing up to 3 new sockets in a single year (and sometimes making the exact same chip available for two different socket types). Yeah, let's stick it to the man that deserves it...
With Intel's mainstream consumer products, you know what to expect; they've been very consistent about that over the past decade. And it's not like people aren't giving Intel shit for their support cycles. Currently, nobody knows what AM5 support will be like - the fact that until recently AMD's website claimed AM5 would be supported for 5 years and now they're saying 3 really doesn't help. It could literally be just 3 years, which would be useless for 90+% of people, because not many people upgrade CPUs that early. Additionally, AMD doesn't even clarify whether all motherboards will be supported, or just the socket itself - meaning you could potentially have to buy a new motherboard anyway.
Even if AMD chose to only support AM5 for the next 2 years, it wouldn't be that big of a deal; the problem is that their communication isn't clear - which could be entirely intentional, to lead people to believe the support will be the same as it was for AM4.