Even by NVIDIA standards It's insane that they have the balls to sell the 3050 for the price they're selling it at. It's not like at the high end where they either don't have competition or where their competition is just as horribly priced. It makes the 1650 vs rx 570 situation back then almost look reasonable. I genuinely cannot understand how they manage to shift a single unit.
nVidia sells because of mind share and nothing else. They know how to dangle shiny new useless features in front of ignorant users who then take the bait.
The problem is that it has nvidias name on it so it will sell no matter what. People need to start moving to team red to help open their eyes to their ridiculous prices. That's why evga left
Even the best "pro" of Nvidia, RayTracing, which is often mentioned to justify the prices falls completely off. Like what will you do with RT with a 3050? But it’s every buyers own fault if he decides to go for a 3050.
My guess is there weren't many 3050s. Cut down 3060s for $250 isn't something Nvidia wants to sell, and they originally intended the 3050 to have GA107 instead of 106 (hence weird limits like x8 PCIe) and planned to stealth launch the GA107 version. I think I heard some rumors that they did eventually did this but considering they want to prioritize laptops with GA107 (also why no x16) this gen and considering they want to sell more 3060s, the desktop 3050 is too much in a weird position for Nvidia to give a shit and sell at MSRP. And that's besides the fact they need to sell less than MSRP to compete with the 6600 and A750. Oof.
Hey Daniel, if you choose "override group" under Graph Limits for something like 1% low, you can give it the group name "1% Low", saving you from having to explain it over and over. It'll make the graphs more intuitive from the get go. Hope that helps!
@@andreslinn69 You need to have then enabled in the OSD and then have a key assigned to "start benchmark", then press that whenever you want RTSS to start measuring them. If you don't then you'll only be shown the instant FPS
@@b0ne91 thankyou for the reply.. yes I've enabled the options and check the OSD but still it doesn't show up. Guess the answer is to start benchmark first (?) as Klobb said in previous comment..
@@thelegendaryklobb2879 thankyou for the reply! Ah I see, do I need to assign "start benchmark" first to specific key? May I ask how to do that? I'm still new to MSI afterburner..
I sold my Gigabyte 3050 for $280 cash and used that to buy an EVGA 3060 12GB B stock on their pre-Christmas sale for $259.99. It was quite the free upgrade!
Also, thanks for making this, and I’m glad you didn’t include the 3060 or 3060ti. The 3050 is a joke, the 6600XT could be the new 580 if we stop buying it at almost $300, and the A750 is slept on hard.
Man, you REALLY have to be ignorant to buy a 3050 at this point in time. Late last year, my trusty RX 460 2GB gave up and when considering a new GPU, a quick look at the prices made it clear NVidia was a BAD CHOICE. Thanks in large part to my brother I built a rough PS5 equivalent for 600USD using an RX 6700 (+R5 5600). With any luck it should hold us through this generation...
@@MistyKathrine yeah it mostly only made sense when you were lucky with a newwgg shuffle at launch or got into the EVGA queue early enough. I didn't win the shuffle but someone nice on a discord server did and let me buy it from newegg. $329, and it even had a minor discount to make it more like $319 without shipping, or $329 with free shipping. The only time the 3050 made sense to buy.
@@roundduckkira EVGA was the only company trying to do right by the consumer in that situation and now they are out of the GPU game and I'm so sad about it.
@@MistyKathrine amen to that, funnily enough despite my failed attempts to get the XC Black and missing out on the backplated XC model due to hunting for the former ($329 was yikes still), it's funny that the specific GPU I ended up with was that backplated XC model lmao.
I like this comparison format. Knowing relative and absolute performance while seeing how those comparison change through RT is extremely useful. If I had seen videos like this I would had spent much less time choosing GPU
I am using Arc A750, it is really a great deal!😁 I didn't have that control permission Windows during pop up after reboot. What driver version are you using?
I never thought I would hear the phrase "murder-suicide" in a GPU video lol. Love these head-to-head videos. Especially when it shows more... disappointing cards like the 3050.
@@xClanPkSx Yeah... I have never seen an RTX 3050 go below $300 at my closest Microcenter, whereas RX 6600s are just sitting there at $230. Every time I help a friend with getting a new PC or whatever, I have to constantly break the stigma of "bUt aMD bAd" and it's so frustrating...
@@wrusst yes you should take the perf difference into the account obviously. But these deals go in and out of stock. So I would check all 3 models for the best deal. That's the point I was trying to make.
@@SirMo not everyone going to know a difference. Most people educated know a 3050 is pointless but the idea is to catch these people in informative content
You should add power consumption, at this prices point this make a huge difference, most of the tech tester include this on expensive card where it's less relevant vs the price of these GPU but not on cheaper one where 25$ is 10% of the price !
Remember the days when $250 could buy you a 60 series GPU from nvidia? yeah.. those days are probably gone forever, for all the crap that people talk about AMD and its prices at least you can still buy a 60 series class GPU from them for $250, and now even from Intel! which is great.
Yeah, and I imagine a base 6600 should be able to cruise 1080p native with decent settings until at least the next console gen. With fsr it might last near 10 years assuming nothing breaks
@@ashce10 Yeah for sure, my RX 480 which I bought at launch day it's going to be 7yrs old in 3 months and is still going strong for anything from 2020 and below at medium/high 1080p and that's without FSR, I think that the new gen will even outlast 10yrs with FSR3 with frame generation, can't wait for that tech to arrive.
To be fair, 60 series gpu's back in those days were pretty weak and slow. And modern 60 series like the 3060 is on par with a baseline RTX 2070 in performance, which is a big deal. And the coming RTX 4060 is going to have RTX 3070 Ti levels of performance, which is actually pretty crazy. So you realistically can't expect to pay only $250 for such a beast 60 series gpu. If you really want to pay $250, then the GTX 1660 is still being sold. The 1660 is technically a "60" series gpu right?
Intel gpu's have the possibility to truly be amazing cards for the price. They just need a ton of work on their drivers, and to Intel's credit, they have been hard at work constantly updating and improving their drivers. Maybe next gen or the gen after, Intel might take the cake for best graphics card for your dollar hands down. I hope they do tbh
and you know what? remember they where trying to decide if go Low end or High end for the launch? If they had chosen High-end i assure you that NOBODY would be considering them. That was the decision that made them not crash nor fail. Huge respect for intel.
The A750 needs to be another $20-30 cheaper to offset its sore spots vs AMD. At least the games in which the A750 excels show there is hope for substantial driver performance uplifts in the future. The raw compute is there, only need to make it work right in a more repeatable manner.
@@PinHeadSupliciumwtf Even if they "unfuck their drivers" now, most people will never hear about it and still go by launch-day reviews if anything at all. Best thing Intel can do is get the A750 in as many hands as possible to show how much progress it has made and whip up interest in BMG.
I noticed that after the update, some games started to use some of my ram from PC for the GPU, it like it sync together, weird but in a good way. Also some games like Arma 3 which are very CPU dependent, stared to use my GPU and it left my CPU under 15% percent, the game ran flawlessly and way better after the update. I have a i3-12100 on 16gb 3600ghz, not even a crazy build
Thanks for this really informative vid Daniel - what would you pair with the A750 from the MB/RAM perspective, taking in to account power as over here in the UK is very high atm. Also, for the chosen CPU, what cooling would you recommend? Air, water block and what model? Keep it up!
Considering power, A750 should be bottom of your list as it pulls lot more (225W). 3050 has the smallest power envelope there (130W) but price performance is abysmal, 6600 vanilla draws slightly more but improves price performance, and 6600XT (160W) is another step up in power draw but grants couple more performance percentage points for the trouble.
The 3050 really was a turd. A lot of people bought it - more than you would've thought based on its performance - purely because it was an RTX 3000 series. The less knowledgable assumed it was going to be better than say an RTX 2060, or at least on par, and it's quite a ways from the 2060's performance, and yet the 2060 is cheaper. The 3050 was a tax on ignorance AFAIC. It's such a turd. EDIT: You can get an RX 5600XT for $200, or an RX 5700XT for $230 - better performing GPUs, when an RTX 3050 is still $280. Don't buy a 3050 guys! It's a trap!
Yeah, I know someone who bought a laptop with a 3050 in it. Because it said rtx they were excited about RT. Then found out it was useless and basically like buying a card that is already out the door.
I really don't see much reason at this price point to go for anything but the 6650 XT. With that said, it will not matter in the present because Nvidia just has to make something/anything. Eventually, if they keep doing this it will come back to bite them. Meanwhile as long as AMD and Intel can survive, it gives us a better price point than what Nvidia is offering. Also, if it follows the RX 580 model it will give us a wonderful, cheap, used card down the road.
Why would Intel and AMD want to "survive"? AMD make a shit ton of money now, thanks to the console contracts and Ryzen. For over a decade, AMD have consistently made amazing GPU's that offer more features, better price and higher performance than Nvidia, only to see their cards sell less than 20% of the inferior nVidia model counterpart. It's a blessing that they have even stuck around this long, to be honest! It wouldn't surprise me if they called it a day. Same with Intel. Then these Nvidiots will be happy as pigs in shit, paying $5000 for a mid range card and still hitting 40fps with RTX on.
That said, the intel card is rather impressive and it kind of surprises me a bit, especially with it's raytracing performance, although it's useless performance-wise it is way better relative performance to AMD's counterpart. It also shows good performance in some games, showing it's potential. Interesting to see what they can come up with in the future. For now, the AMD card is indeed clearly the better choice. I still love the idea of having three major players in the GPU market again.
@@stefankoopmans2200 Impressive where? It uses more power than the 6600XT/6650XT, 10c higher too like in the vid, consistency issues, needs Re-Bar enabled otherwise it loses almost 50% performance, it's just marginally better at RT comparable to RDNA 2 which is IRRELEVANT to a low tier card, it's already losing to the RX 6000, imagine the 7600XT. if Intel continue with those price points, they would just be like AMD/NVIDIA, look how the card has the same price of the AMD and NVIDIA card while offering less performance, this card should be $200 MAX and the RTX 3050 is way overpriced too.
@@GhOsThPk I'm just saying I see potential here, not that it is very good as of this moment but it is certainly healthy to have a 3rd player in the mix. Drivers have already significantly improved over time. And it's fine though, our opinions may differ but I'm still rather impressed by certain aspects of it and I am not comparing prices here, I'm looking at it from a technology standpoint and what it can bring us in the future.
How do you mean? The 1060 wasn't as fast as the RX580, and the 1060 cost more than the RX580. The 1060 was worse in every way. Sure, it still sold 800% better than the RX580, but that's simply because the vast majority of gamers made the wrong choice. The 6600XT blows the 3050 away. Doesn't just beat it, like the RX580 beat the 1060, but beats it even more than the RX570 beat the 1050Ti, which was a huge amount in itself. 6600XT is an entire tier above the 3050. 1060 was a semi decent card, a time when Nvidia didn't offer far lower performance for more money. It offered roughly the same performance as AMD's card, at roughly the same price. Of course, that is basing numbers on the time of release back in 2016. Today, some 7 years later, the RX580 is much faster than the 1060 thanks to the RX580's true support for DX12 and Vulkan, compared to the 1060's emulated support, and thanks to AMD giving attention to it's legacy cards through drivers, as opposed to Nvidia dropping all but critical support for Maxwell/Pascal many, many years ago. 6600XT is the new 1060? I fucking HOPE not! XD
@@TheVanillatech It wasnt worse in everyway when the 1060 came out it was faster but anyone that had some knowledge knew the 1060 is faster today and maybe the next year but the 580 with more VRAM and better DX12/Vulcan support, in the future it will do better. Had better driver support AMD 5+ years ago was nowhere as good as they are now with RDNA and up The problem with that future was the time you needed to upgrade both the 580 and 1060. Sure you could stay longer on the 580 but to keep the same experience you did when you bought either of those cards you needed to 5700 style card
If we're talking about 1080p 60fps performance then you could argue for either the 6600 XT or 6600. However, for not too much money more you could get a 6700 XT which is even a bigger tier above and offers even more value especially at 1440p so you could also argue the 6700 XT is the true champ of this generation. But honestly, none of those cards offer their performance at prices which similar tier cards used to offer their performance at. Remember the 1070 at 380€? Well this gen you're paying 400€ MSRP just for the 3060 ti so I find it argue that any of those cards could actually match what graphics cards used to offer back then. I'm not saying they're bad value because they're still good value, but it's no longer what it used to be and GPU prices will probably continue to increase unless Intel actually manages to do something about it. And so as long as that's the case, I think it'll be hard to compare current gen cards with past gen cards in terms of value.
@@imo098765 Okay listen son - there are TWO THINGS that MATTER when buying a GPU. PRICE (okay) and PERFORMANCE (gotcha). The 1060 6GB was MORE EXPENSIVE than the RX580 8GB! (AMD WINS) The 1060 6GB was SLOWER than the RX580 8GB! (AMD WINS) Anandtech shows, from it's 2017 graphs, that across 25 games tested the AMD card was 5.7% faster. Anandtech shows that from 2022 graphs, across 25 games tested, the AMD card is a massive 19% faster. Now put THAT in your PIPE. And smoke it!
@@TheVanillatech 580 came out a year later, when 1060 came out my only option was 480 which was clearly behind and had the TDP of a 1070, the 580 "refresh" was the same card but with even higher TDP, i didn't make the wrong choice neither did anyone with 1060. In Europe i pay over 20c KwH and im some of the cheapest, people didin't want the 4/580.
I agree 6600XT consistently slays and is the right choice, mid-range card for productivity and gaming A750 will age better than both competitors. but that's base of what I crystal balls expect, as more games use require RT as well. A750 held back by drivers in bad games like Cyberpunk and UE5 but Intel has been dropping frequent fixes so eventually maybe they'll fix those. So much fixes to do when you have pass games and future ones to improve in DX11, one of their worse titles Halo Infinite got fixed a few days ago
A750 already struggles to get 60fps in newest titles without RT. and even if the performace was there with 8GB It'll be VRAM bottlenecked by the point RT really matters. The A770 16GB is the only GPU that I expect to age decently.
@@JaimeIsJaimeThe sun-lit Metro Exodus use of RTGI is my example of where the performance can be when optimized for console, not cyberpunk etc. interesting, I saw the a750 reach 60fps or better in new games often better than the 3060. I'm not advocating for this to be the decision for the purchase, but at this price range you can still get some ray tracing on with optimized settings. the same RT-cores will be added to applications like blender for Nvidia parity that the 6600XT or AMD in general cannot yet match. GPUs can do more is all I'm saying
Would you be able to get your hands on a Maxwell/Pascal card to compare with these? At $250, these cards seem in the sweet spot for people returning to PC gaming after the drought of the past few years, and I feel that era of card is relevant to show where you're improving nowadays. Maybe a 970/980 or a 1070/1080 as well and then a comparison with the 3060s and the 6700s.
I already did the research since my RX 580 and GTX 1060 6GB can't do much anymore. The best card for the price and super power efficiency is RX 6600 for ~$190-200 (with free game) Then RX 6700 at ~$260-280 Consider the XT version if you can find them with good price too
How can AMD compete that even with a massive win like 6600 XT vs 3050, Stuff in the past like RX470 vs 1050 they vastly outsell AMD. This just means AMD is forced not to compete at the low end thanks to stupid Nvidia buyers. Only Have Nvidia to blame. I bet even 4050 will be hard to match 6600 XT.
People love Nvidia like everyone like Apple. Samsung and many other phone companies also make great phones but people just buy Apple. Same goes with Nvidia as they were the first player to bring gpu for computer gaming. People have positive experience with them.
lol, it's not because they've made inferior products historically, and have a reputation of being less reliable, it's Nvidia fault that they can't sell their product.
1050Ti was an insult to budget gamers. Cost the same as an RX570, but had 50% less performance. Didn't stop 800% more people buying the insult, over the AMD card. You can't fix stupid, bro.
Those are significantly overpriced right now due to how new they are and lack of options/sales. Check out Jarrod's Tech for laptop GPU reviews. The increases were pretty underwhelming for the 4000 series.
ARC A750 and A770 are very promising architecture. Very new, drivers have to mature but it looks like if they would make a "high end" product maybe they could beat nVidia in raw Ray Tracing and AMD in Raw Rasterization. Who know, I'm rooting for them very much.
@@kimberlylewis5820 We will see what future brings. Because it is a very new architecture, their next archirtecture could be very strong an learn from its previous mistakes and I hope they will come better drivers. We really need a third party into the GPU competition...
@@PeterPauls we do need a third competitor, but Intel really dropped the ball. I have serious doubts Arc has a future if it can't deliver with battlemage.
I know these cards start to struggle pretty hard at 1440p for most of these games but I've seen benchmarks where Intel GPUs start to pull ahead at that resolution making the gap between AMD and Intel smaller. Kinda puts A750 in a hard place since A770 is more suited for that but you lose on the price to performance. I'm thinking that if you tweak a few settings and maybe run games that are a few years older, the Intel and AMD cards can become decent budget options at 1440p.
@@StevieCEmpireofUnitedGaming If you'd said 6700xt then sure, but just Hogwarts demands 9GB of VRAM in 1440, so how do you plan on making that 6600 handle it? Lower the settings I guess, but then it wouldn't be flawlessly anymore, would it? Plus the vram wouldn't be the only part to suffer in this case anyway
@@samarkand1585fair point i wasnt clear. i meant to say on older games, for next gen graphics you are right. I have a 6650xt and 6600xt in crossfire (civ vii prep) for current gen games and have a 3090oc for the more demanding games (3440-1440p)
@@samarkand1585 I have context. I can play world of tanks/warships at 95-100fps (i cap 100) on ultra HD settings with the 6650xt, with the 6600xt I was getting around 85-90fps average 3440-1440p to me that is flawless
the 6600XT can be bought as Little as $145 during last aliexpress sales that includes WW shipping. its sold as the 6600M. given its huge die size the 770 should have competed with the 6800/3070TI
You don't base GPU's on die size in terms of competition. You base it on PRICE and PERFORMANCE. Nobody cares about die size bro. Games don't run faster based on how physically big the GPU is, and you don't pay per square millimeter. PRICE and PERFORMANCE. Always was. Always will be.
I know this video is a month old but ... The RX 6600 XT doesn't seem to be on sale from the usual suspects over here, only the RX 6600 and RX 6650XT, unless you're prepared to pay a huge premium (OCUK has the RX 6600 for £230, the 6650XT for £270, the A750 for £230, and the RTX 3050 for £240); Amazon has the RX 6600XT for £315 or more). A quick gander indicates that the RX 6600 is about 20% slower than the RX 6600XT. How does that change your calculus? With regards to the A750 blackouts, you might speak to Graham of the Adamant IT channel. He's had blackouts with all sorts of cards - Nvidia, AMD, and Intel - so he may be able to help. He's generally found that the issue has been with the capture device, not the GPU. One worthwhile test you seem to have missed was performance while streaming: the RTX supports NVENC and while the A750 supports AV1, neither Twitch nor RUclips do.
Intel's A750 performance is.... weird to say the least. I have a feeling in the future, as more driver bugs get worked out, these will turn out as really good budget options.
No shit, it already beats the shit out the 3050 and costs 20% less. I'd say that ALREADY makes it a good budget option, wouldn't you???? Unless I'm crazy or something. I don't *think* im crazy..... Maybe I am crazy! Cos all I see is the 6600XT smashing the 3050 to pieces (and costing less) and the 750 beating the 3050 easy (and costing a lot less), yet the 3050 selling 700% more skus! Crazy...
@@TheVanillatech not crazy at all, though for a while now Nvidia has not been the company for budget gaming. I do find how Intel and AMD trade blows at this price point very interesting. Today I would recommend the 6600XT for budget gaming, but who knows, maybe in a few more months Intel will have drivers sorted out even better and the 750 might even take the Radeon card's spot.
@@SaxaphoneMan42 You say that Nvidia has not been the "company of budget gaming". In what world does that mean that a shitty ass 1050Ti, which can muster maybe 30fps in 1080p medium/high settings on release, outsells an RX570 with 60fps ALL THE WAY, for the same price, by 700%? You make zero sense. It doesn't matter if YOUD reccomend a 6600XT for budget gaming, the numbers speak for themselves. They always have. Thats why people benchmark cards. Fact is, Nvidia has been bitchslapped many, many times over the years by AMD (not even CLOSE!) and still went on to sell more of it's trash offering than AMD did of it's vastly superior counterpart. You can't fix stupid. Even if YOU reccomend it! lol
It's not that weird to me. Intel ARC is nothing more than a testbed; an introduction of a 3rd competitor to the gpu market. The problem is A770 is the best Intel has until maybe fall 2024. But by end of this year, Nvidia and AMD will have their below $400 budget Ada Lovelace and RDNA 3 cards. As a result, last gen RTX 3070 and RX 6750xt will be selling well below $300. Both cards 'wipe the floor' of Intel's best offering with less bugs and inconsistency. Good news, Intel will be much more competitive with their future generation of gpus. But this gen is nothing more than Nvidia's Turing intro to raytracing. A good idea and welcomed addition but an overpriced under achiever.
@@garyb7193 6700/6750 is competitor to the 3060, not the 3070 or even the 3060Ti (at least, in cost). Obviously the 6700/6750 destroys the 3060, whoops the 3060Ti and is more of an even match to the 3070. And RTX is STILL an over prices under achiever. Some 5+ years later, after Jensen promised, on stage of Turing launch "120+ RTX games within 18 months!" to raptuous applause, and in reality 18 months later there were less than 20 RTX games and one was Quake II from 1997, another Minecraft. RTX today is as shit as it was back then, with only stupidly expensive $1200+ GPU allowing for smooth framerates (sometimes with fake frame generation, and always with upscaling via DLSS). A ridiculous notion, and price, for shiny floors and reflective puddles. Intel have come in and on generation ONE, are outperforming NVIDIAS 3050 .... for 30% LESS MONEY!
Sweet video, Daniel. I'm getting a new graphics card so. Finally, freaking Finally, I've been saving for about a year. And no major life hiccups have occured. Serious question for you and others here. My budget allows me to get one of two cards. A Speedster swift Rx6700 10gb non-xt or a Zotac GeForce 3060ti? I'm leaning more towards the 6700 as I like 10gb as opposed to 8gb but I think the 3060 has a 256bit lane. I'll only be able to buy one and it has to last me as long as my GT 1030. 😅 I don't really play online games, mostly a single player gamer. I would really appreciate thoughts and insights because man, I'm going back and forth on this. Thanks again and keep up the awesome videos man, much appreciated. ❤
I'd say it really depends on what you are planning to play and if you care about nvidia exclusive features and raytracing, performance should be about the same without raytracing. Other than that they are mostly the same, you could also get whatever is cheaper.
@@andersonfrancotabares3614 it's like Dan said in his video, the premium for Nvida features is pretty high. I might miss having the option for fsr and dlss and raytracing would be nice to have but I guess not a priority for me. For years I've been lowering resolutions, using low graphical settings and using scaling resolution just to hit 30fps. I just don't want to invest in the wrong thing and regret it two years later, you know?
@@GamersTherapy Then I'd suggest you go with the Radeon GPU, I have a 6700 XT which now performs closer to a 3070 than a 3060 Ti I'd assume the non XT variant is closer to the 3060. DLSS and FSR are most useful at resolutions higher than 1440p, having used FSR at 1080p it smears everything and makes it blurry compared to native, also neither of them will have really good RT performance at 1080p anyway, AMD is also working on FSR 3 which might work pretty similarly to DLSS and features that use tensor cores in nvidia cards.
@@andersonfrancotabares3614 thanks for the advice bud, I appreciate it. I can't find a 6700 xt currently so will hold onto my money for a bit longer and see what happens in the market during my hunt for one here.
funny how people talked crap at launch about the arc cards. no company new to discreet gpus launches with no initial problems. now with the updates they perform much better. the a770 is on par now with the 3070. ties in some games and beats in others. videos out there showing it for the disbelievers.
@@Jenity it’s preforming very different in each game but It appears to do very well on 1080P medium- high settings 90-120 fps and around 60-90 on 1440P medium to high settings in most games I have tested.
It's a 1st gen product. I never buy first gen products. Hopefully, Intel can build on this. We need a third player in the GPU gaming market. Notice I said "Gaming" . Seems , like these guys are eyeballing dollar signs in the AI market. We may never see a sub $300 GPU again. When , they sell businesses GPU's for 4X the cost.
Hey Daniel, i need your help. My son has a gaming-pc 12700K/AMD 6900XT. Now we plan to build a extra streaming pc. My plan is to go with CPU I5 13500 and Intel ARC 770. Is that a good idea? Or are there better optipons spec. with another GPU? Thanks and regards from Germany
I'd guess that he wouldn't suggest DLSS upscaled from a lower than 1080p resolution vs a card that can just run 1080p at 60fps+ in most cases natively given he thinks upscaling from below 1080p produces a worse image.
Judging by (essentially just a hunch) Linus video on the ARC gpu where it behaved weirdly with his long active cables, seems Intel has something non-standard going on with their video out connectors or firmware
In my country arc a750 available in 255$ and rx6600 non xt version price is 304$. I think Intel giving good price to perform after latest drivers update
What? Why not include second hand gpu's in this comparison?? In The Netherlands i can buy a 1080TI for 250 euro. It is as good as a 3060TI which costs 440 euro! And you are showing a 3050 against that, with better American pricing as well. So many people bought 1080ti's, that there are enough second hand offers for anyone looking for them. Anyone looking for gpu's in this price range should absolutely consider 2nd hand offers.
That is what I am observe as well on your 4090 vs the XTX comparison video, although 4090 gives out so much FPS numbers but if you look at the videos closely the 4090 stutters are visible vs XTX native and FSR's video it was smooth. So I'd take that sample. I wonder how much will it be improve more with their FSR 3 now introducing their Fluid Motion Frame Technology and with Hyper RX, both can enhance your gaming performance and experience. If you might ask where are those stutters happen an example would be in Cyberpunk on the part where there's a paper on the wall and the black and orange stripe thing on the concrete wall and the early part of the video of Callisto.
@@lunator100hd If you have a really fast CPU, it sometimes stutters because the CPU cant feed the gpu fast enough, which is the reverse of what normally happens. Normally you can fix it in most games by capping frames.
Good video. Never confuse value with cost. On Newegg U.S right now, (that could change at anytime), a rtx 3070 is only about $100 more than a rtx 3050. The rtx 3070 is around 3x's faster in most games than the 3050. Spend a little more money and get a LOT more performance.
3060 is going for around $100 more but 3070? Even then, the 6600xt can go toe to toe with the 3060 at the price point of a 3050. Cheapest 3070 available is for $700 and that's massively overpriced. Someday I'll get my hands on an Nvidia gpu.
@@Wobbothe3rd if I'm lying, that means pcpartpicker is lying. At no point did I or the person I was replying to, ever mention used gpus...if you want to go that route, I'm sure you can find even cheaper entry level gpus on the used market. SMH.
Agreed- better to spend as much as you can so you don't regret the purchase later. There were times in the past you could squeak by with some lower tier stuff but that is just not the case right now. That said 6600xt is still the most powerful in the video and maybe $100 is too much for that person- picturing young teenagers buying their first cards or building pcs etc.
3050 is most expensive and slowest... this just shows how much people dont care if there is anything else and just buy nvidia. That level of performance should really be in $150 GPU.
RDNA2 GPUs right now are where it's at if you want the most bang per buck. I've been test driving an rx6600 on my Small Form Factor 1080p machine, using a 5800x3d, and I was actually floored with how well this combo handles CPU heavy titles I play (DCS World flight sim primarily). Makes me second guess my purchase of the much more powerful rx6800 because you also get so much power efficiency when going with a smaller GPU. My computer runs so cool and quiet and produces a fantastic gaming experience.
If you're in the bottom tier of graphics cards- AMD or Intel. You would buy an Intel card for the same reasons you would buy an AMD card in opposition to Nvidia....to support them and help them grow/improve and this will also cause AMD + Nvidia to improve further. If you don't care about any of that or the future landscape of graphics cards- AMD as always is the best bang for buck in that range.
6650XT smashes 3050 - same price 6700XT smashes 3060 - same price 6800XT smashes 3070ti - same price 6950XT smashes 3080 - same price Just the bottom tier, huh?
@@TheVanillatech Yes just the bottom tier- when you get into the higher tiers there are more factors to consider. To say the 6950 smashes the 3080 is a bit laughable. You can do about the same things with both cards and one has support for things that the other does not. Not saying there isn't value in the higher tier- if you only game and don't play a wide variety of games AMD seems pretty good. Seems like people playing FPS/shooter type games do well with AMD. One size fits all solutions are just not good. I will never say AMD doesn't have good value overall- they always have, that's the appeal though that value is growing less as they inch closer and closer to nvidia pricing.
@@glakoblako The 6950XT is better than the 3080, not just in a COUPLE of games, but across a 25 game spectrum. You can say that, in a couple of games, it DEMOLISHES the 3080 and even beats the 3090Ti. That would be true thing to say, cos it's true. And AMD's appeal isn't just "value" is it? When the 1050Ti is DOGSHIT in gaming, yet costs the same as the far superior RX570. That makes the appeal of the 570 not "value" but "can actually game". Value doesn't even come into it, does it? Given it's not even a fair contest. Same with the 3050 vs 6600XT. Same with MANY other situations. Dummy.
@@TheVanillatech Lol how can you be offended by what I said- you're a fangirl. Maybe one day you will get to meet AMD in person! AMD cards can't stream. AMD cards are slower to edit. AMD cards have more driver issues if you play more than 25 games. Then you make some argument that I never made- I said that the AMDs cheap card was the best value. Then you claim that a 6950 "demolishes" the 3080 when it's a 7% improvement but ignore all of the other reasons listed above as to why someone might choose one over the other. You even claim a 3090 isn't as good >< you are just emotionally invested in AMD. It's ok bro- they are just companies, don't have a cow.
Surprised by Intel GPU performance, just slightly behind amd in raster performance but better in ray tracing . Next gen Intel GPU will be crazy good , they done well on first try .
Your videos are very informative, but at this price point the tests are a little overkill. I don't think anybody is buying these cards for epic setting and especially not ray tracing because nobody in 2023 wants to game below 90fps. Maybe test some setting configurations that shoot to achieve high refresh rates as well as high video quality! Would be so helpful for people trying to get 144 fps in competitive games.
In UK 3050 is £60 more expensive than a750 and 6600xt pretty much impossible to find the next best thing is 6650xt which is more expensive than 3050 and way expensive than a750. Price vise a750 is the best what you can get for £250.
Easy win for AMD, however prices can vary noticeably in different countries, when you have 3050 for 290$ and 6600xt for 405$, choice in not that simple.
It doesn't matter what you say, pal. The 1060 lost in every way to the RX580 - the Nvidia was more expensive and 5% slower even back in 2017, today it's more like 20% slower. Still, 800% more people bought the slower, more expensive Nvidia card. You can't fix stupid. Around the same percentage bought the absolutely dogshit 1050Ti for the same price as the FAR superior RX570. And right here, as we see the 6600XT destroying the 3050 for the same price, it won't matter - MOST people will buy the Nvidia. Why do you think we are all having to pay so much for GPU's? If Nvidia can make SHIT cards, and charge people MORE money for their shit cards, and yet STILL the vast majority of people buy them over the superior, cheatper, better AMD cards - why on Earth wouldn't Nvidia raise their prices? "Hey Jensen, we figured out gamers are so stupid, we can make dogshit products and still sell them in droves compared to AMD! What should we do now?" ..... "RAISE PRICES! XD".
Once again, we're left with the question of "Why the fuck is anyone buying the RTX 3050, ever, for any reason". Reputation is in the garbage, reliability is fine but you're getting ripped off so badly with the card that you would do much, much better to just WAIT and buy a 3060. The 3050 is a trap. Bad product made in bad times while they were shipping off all the good product to farms.
Man, Intel has got a lot of catching up to do. Sometimes it performs miraculously well, but other times it's a complete stuttery mess. That's what I call unreliable
in performance 3050 is somewhere between 1070 and 1080 (worst deal ever, used 1070 can be had under 100$) 6600xt is around 1080ti (decent deal but I've seen 1080ti's go for as low as 150$) A770 is between 1080 and 1080ti ( good performance at higher res and better ray tracing than amd cards but much more cost efficient than nvidia ) 1080ti is between 3060 and 3060ti ( got myself one for 150$ and it runs everything nicely used 1080ti and some tweaking, you basically get a rtx 2080 like performance in 90% of the games out there, pretty sweet deal IMO, unless you really need ray tracing or newer architecture
1080ti was a great GPU at the release, but it's getting old now. The driver support won't last forever. Also worth noting. rx6600xt will use max 160 watts. 1080ti will use 250 watts. So the newer GPU is significantly more efficient.
Thank you very much for the in depth analysis for these cards. There’s is a lot of information beyond base fps and 1%that people need to see before purchasing. The Narrative that Intel is already at the level of AMD and Nvidia is clearly a false statement as it looks like they still need multiple interactions to really compete to other two.
@@Wobbothe3rd my problem with Ray Tracing is that even with Nvidia, it’s still in the early stages. Also since Consoles are bound to AMD and their jerryrigged 5700xt with ray tracing tech predating RDNA2, it’s very unlikely we see any real usage til another generation or two.
@@franciscoc905 your post sounds as if you are somewhat ignorant of the console's specs, and we already have plenty of RT around right now, for anyone to act like we need to wait another console gen or two. Anyway, its not the consoles or AMD that causes devs to hold back doing more or better RT - just look at the market share on the Steam survey - that much RT hardware in the upper whatever is a recent trend. Across this gen, RT will get mainstream, HOWEVER, if you means an RT only game (something like that newer version of Metro Ex) then yes, we are many years from that being normal because you would need ALL apu's to have RT for that to ever happen.
Sell your GPU at Jawa! bit.ly/jawagpu4
anything like that for EU?
FYI they offered me $111 in Jan for my lightly used 6650XT. You are also responsible for shipping costs. I do not recommend.
@@dotxyn 107 for my 6600 with original box
@@adritrace88 Ouch. I sold mine on eBay, and after fees, walked away with around $210
@@dotxynany idea how much I can get for my 1650 super?
I also forgot to say it in my voiceover, but some of the shadows and stuff seemed to render differently in Fortnite with RT on the Arc A750
Are A750 drivers stable?
Even by NVIDIA standards It's insane that they have the balls to sell the 3050 for the price they're selling it at. It's not like at the high end where they either don't have competition or where their competition is just as horribly priced.
It makes the 1650 vs rx 570 situation back then almost look reasonable. I genuinely cannot understand how they manage to shift a single unit.
nVidia sells because of mind share and nothing else. They know how to dangle shiny new useless features in front of ignorant users who then take the bait.
Customers are idiots that’s why.
The problem is that it has nvidias name on it so it will sell no matter what. People need to start moving to team red to help open their eyes to their ridiculous prices. That's why evga left
Even the best "pro" of Nvidia, RayTracing, which is often mentioned to justify the prices falls completely off. Like what will you do with RT with a 3050? But it’s every buyers own fault if he decides to go for a 3050.
My guess is there weren't many 3050s. Cut down 3060s for $250 isn't something Nvidia wants to sell, and they originally intended the 3050 to have GA107 instead of 106 (hence weird limits like x8 PCIe) and planned to stealth launch the GA107 version. I think I heard some rumors that they did eventually did this but considering they want to prioritize laptops with GA107 (also why no x16) this gen and considering they want to sell more 3060s, the desktop 3050 is too much in a weird position for Nvidia to give a shit and sell at MSRP.
And that's besides the fact they need to sell less than MSRP to compete with the 6600 and A750. Oof.
Hey Daniel,
if you choose "override group" under Graph Limits for something like 1% low, you can give it the group name "1% Low", saving you from having to explain it over and over. It'll make the graphs more intuitive from the get go. Hope that helps!
Thankyou for the tips! Btw sorry to interrupt, do you also have tips if afterburner doesn't show the 1% and 0.1% low on OSD? Thanks in advance..
@@andreslinn69 You need to enable them first AND then enable to be shown in the OSD. They are all deactivated by default.
@@andreslinn69 You need to have then enabled in the OSD and then have a key assigned to "start benchmark", then press that whenever you want RTSS to start measuring them. If you don't then you'll only be shown the instant FPS
@@b0ne91 thankyou for the reply.. yes I've enabled the options and check the OSD but still it doesn't show up. Guess the answer is to start benchmark first (?) as Klobb said in previous comment..
@@thelegendaryklobb2879 thankyou for the reply! Ah I see, do I need to assign "start benchmark" first to specific key? May I ask how to do that? I'm still new to MSI afterburner..
6600xt/6650xt for savvy shoppers, a750 for hardcores who can deal with it's issues and 3050 for idiots or normies
iGPUs for Chads.
hmmm
depends on regional pricing too tho
like here, a750 is like 10% cheaper than rx 6600(non xt)
about 20% cheaper than 6650xt and ~23% 6600 xt
A750 is for people that still say "YOLO" in 2023
a750 for hardcores who can deal with its issues of being slower than the 6650xt in 90% of games.
@@GewelReal 4k high fps gaming snob incoming.
Damn the 3050 is very consistent. Always staying at 100%. Must be good then! /s
I sold my Gigabyte 3050 for $280 cash and used that to buy an EVGA 3060 12GB B stock on their pre-Christmas sale for $259.99. It was quite the free upgrade!
Reminds me of when I sold my 1080 for $400 and got a 3070 FE for $500 right after
@@jerrodshack7610 nice
The 6600xt is slaying here. Pretty impressed with intels offering. Hopefully they get better. Would be nice to have a third option in future
I already I knew how much better a 6600xt is for the 1080p market but i did not know the A750 was this good. Good job intel!
Except the drivers and stuttering here and there. AMD is best here
Also, thanks for making this, and I’m glad you didn’t include the 3060 or 3060ti. The 3050 is a joke, the 6600XT could be the new 580 if we stop buying it at almost $300, and the A750 is slept on hard.
Man, you REALLY have to be ignorant to buy a 3050 at this point in time. Late last year, my trusty RX 460 2GB gave up and when considering a new GPU, a quick look at the prices made it clear NVidia was a BAD CHOICE. Thanks in large part to my brother I built a rough PS5 equivalent for 600USD using an RX 6700 (+R5 5600). With any luck it should hold us through this generation...
The only reason to have ever bought a 3050 was in 2021 because it was one of the few GPUs available for less than $500. No point buying one now.
@@MistyKathrine yeah it mostly only made sense when you were lucky with a newwgg shuffle at launch or got into the EVGA queue early enough.
I didn't win the shuffle but someone nice on a discord server did and let me buy it from newegg. $329, and it even had a minor discount to make it more like $319 without shipping, or $329 with free shipping. The only time the 3050 made sense to buy.
@@roundduckkira EVGA was the only company trying to do right by the consumer in that situation and now they are out of the GPU game and I'm so sad about it.
@@MistyKathrine amen to that, funnily enough despite my failed attempts to get the XC Black and missing out on the backplated XC model due to hunting for the former ($329 was yikes still), it's funny that the specific GPU I ended up with was that backplated XC model lmao.
I like this comparison format. Knowing relative and absolute performance while seeing how those comparison change through RT is extremely useful. If I had seen videos like this I would had spent much less time choosing GPU
I am using Arc A750, it is really a great deal!😁
I didn't have that control permission Windows during pop up after reboot. What driver version are you using?
I never thought I would hear the phrase "murder-suicide" in a GPU video lol.
Love these head-to-head videos. Especially when it shows more... disappointing cards like the 3050.
The sad true is that people still buy a 3050 over 6600/xt and A750 (a bit more reasonable in this case tho)
@@xClanPkSx Yeah... I have never seen an RTX 3050 go below $300 at my closest Microcenter, whereas RX 6600s are just sitting there at $230.
Every time I help a friend with getting a new PC or whatever, I have to constantly break the stigma of "bUt aMD bAd" and it's so frustrating...
@@moldyshishkabob a GPU might murder one's fps due to which he might suicide.
There you go sir. In a single sentence.
@@youcantakemygoogleaccount2359 Not quite the phrase... but I'll take it!
The 6650xt seems the buy
Whichever 66xx has the lowest price is what you should buy. All 3 can have fantastic deals. Like I scored an rx6600 for just $190. Can't beat it.
@@SirMo a 6650xt is significantly faster than a 6600 (around 20%) so it would have to be proportional less and not what's cheapest
No shit sherlock! XD
@@wrusst yes you should take the perf difference into the account obviously. But these deals go in and out of stock. So I would check all 3 models for the best deal. That's the point I was trying to make.
@@SirMo not everyone going to know a difference. Most people educated know a 3050 is pointless but the idea is to catch these people in informative content
You should add power consumption, at this prices point this make a huge difference, most of the tech tester include this on expensive card where it's less relevant vs the price of these GPU but not on cheaper one where 25$ is 10% of the price !
Remember the days when $250 could buy you a 60 series GPU from nvidia? yeah.. those days are probably gone forever, for all the crap that people talk about AMD and its prices at least you can still buy a 60 series class GPU from them for $250, and now even from Intel! which is great.
Yeah, and I imagine a base 6600 should be able to cruise 1080p native with decent settings until at least the next console gen.
With fsr it might last near 10 years assuming nothing breaks
@@ashce10 Yeah for sure, my RX 480 which I bought at launch day it's going to be 7yrs old in 3 months and is still going strong for anything from 2020 and below at medium/high 1080p and that's without FSR, I think that the new gen will even outlast 10yrs with FSR3 with frame generation, can't wait for that tech to arrive.
To be fair, 60 series gpu's back in those days were pretty weak and slow. And modern 60 series like the 3060 is on par with a baseline RTX 2070 in performance, which is a big deal. And the coming RTX 4060 is going to have RTX 3070 Ti levels of performance, which is actually pretty crazy.
So you realistically can't expect to pay only $250 for such a beast 60 series gpu. If you really want to pay $250, then the GTX 1660 is still being sold. The 1660 is technically a "60" series gpu right?
@@angrysocialjusticewarrior 🤦♂Every new 60s series normally has the power of the last gen 70s series, it's not rocket science.
@@AJ-po6up then why do you expect the price to be cheaper then if technology improves ?????
Intel gpu's have the possibility to truly be amazing cards for the price. They just need a ton of work on their drivers, and to Intel's credit, they have been hard at work constantly updating and improving their drivers. Maybe next gen or the gen after, Intel might take the cake for best graphics card for your dollar hands down. I hope they do tbh
and you know what?
remember they where trying to decide if go Low end or High end for the launch?
If they had chosen High-end i assure you that NOBODY would be considering them. That was the decision that made them not crash nor fail.
Huge respect for intel.
The A750 needs to be another $20-30 cheaper to offset its sore spots vs AMD. At least the games in which the A750 excels show there is hope for substantial driver performance uplifts in the future. The raw compute is there, only need to make it work right in a more repeatable manner.
I mean it's 100 Aussie dollars less than the 6600xt for me rn.
Not if they manage to unfuck their drivers.
@@PinHeadSupliciumwtf Even if they "unfuck their drivers" now, most people will never hear about it and still go by launch-day reviews if anything at all. Best thing Intel can do is get the A750 in as many hands as possible to show how much progress it has made and whip up interest in BMG.
You can get an arc a750 for $200 now
best gpu atm is a 6700xt for 1080p and 1440p
will last the longest and pricewise is in the ballpark of these cards
I bought one :)
Sure. Always buy -700XT / --60 Super/Ti series for high end 1080P. 5700XT and 2060Super are still good enough until today.
a750 at the price of 6600 basically here. good option. if dont wanna buy 2nd hand gpus
I noticed that after the update, some games started to use some of my ram from PC for the GPU, it like it sync together, weird but in a good way. Also some games like Arma 3 which are very CPU dependent, stared to use my GPU and it left my CPU under 15% percent, the game ran flawlessly and way better after the update. I have a i3-12100 on 16gb 3600ghz, not even a crazy build
Respect to Intel, very curious of next gen.
A750 has good RT, especially considering how they have basically entered the GPU market like...... Last year.
Only Av1 at price
Thanks for this really informative vid Daniel - what would you pair with the A750 from the MB/RAM perspective, taking in to account power as over here in the UK is very high atm. Also, for the chosen CPU, what cooling would you recommend? Air, water block and what model?
Keep it up!
Considering power, A750 should be bottom of your list as it pulls lot more (225W). 3050 has the smallest power envelope there (130W) but price performance is abysmal, 6600 vanilla draws slightly more but improves price performance, and 6600XT (160W) is another step up in power draw but grants couple more performance percentage points for the trouble.
A750 is very power hungry.
The 3050 really was a turd. A lot of people bought it - more than you would've thought based on its performance - purely because it was an RTX 3000 series. The less knowledgable assumed it was going to be better than say an RTX 2060, or at least on par, and it's quite a ways from the 2060's performance, and yet the 2060 is cheaper. The 3050 was a tax on ignorance AFAIC. It's such a turd.
EDIT: You can get an RX 5600XT for $200, or an RX 5700XT for $230 - better performing GPUs, when an RTX 3050 is still $280. Don't buy a 3050 guys! It's a trap!
The 3050 RT perf is amazing for such a cheap card, esp in power limited laptops.
Got a refurbished RX 5700 XT from Asus with warranty for $175 for my backup desktop. Can't believe the RTX 3050 is still $280 in 2023.
@@Wobbothe3rd weak argument , no one use 3050 for ray tracing , it cant even get 30 fps in 1080p and look like shite lol
Yeah, I know someone who bought a laptop with a 3050 in it. Because it said rtx they were excited about RT. Then found out it was useless and basically like buying a card that is already out the door.
I really don't see much reason at this price point to go for anything but the 6650 XT. With that said, it will not matter in the present because Nvidia just has to make something/anything. Eventually, if they keep doing this it will come back to bite them. Meanwhile as long as AMD and Intel can survive, it gives us a better price point than what Nvidia is offering. Also, if it follows the RX 580 model it will give us a wonderful, cheap, used card down the road.
Why would Intel and AMD want to "survive"? AMD make a shit ton of money now, thanks to the console contracts and Ryzen. For over a decade, AMD have consistently made amazing GPU's that offer more features, better price and higher performance than Nvidia, only to see their cards sell less than 20% of the inferior nVidia model counterpart.
It's a blessing that they have even stuck around this long, to be honest! It wouldn't surprise me if they called it a day. Same with Intel. Then these Nvidiots will be happy as pigs in shit, paying $5000 for a mid range card and still hitting 40fps with RTX on.
The best is clearly the 6650XT, not even close, Intel needs to lower even more the price and the RTX 3050 is just a laughing stock at this point.
That said, the intel card is rather impressive and it kind of surprises me a bit, especially with it's raytracing performance, although it's useless performance-wise it is way better relative performance to AMD's counterpart. It also shows good performance in some games, showing it's potential. Interesting to see what they can come up with in the future. For now, the AMD card is indeed clearly the better choice. I still love the idea of having three major players in the GPU market again.
@@stefankoopmans2200 Impressive where? It uses more power than the 6600XT/6650XT, 10c higher too like in the vid, consistency issues, needs Re-Bar enabled otherwise it loses almost 50% performance, it's just marginally better at RT comparable to RDNA 2 which is IRRELEVANT to a low tier card, it's already losing to the RX 6000, imagine the 7600XT.
if Intel continue with those price points, they would just be like AMD/NVIDIA, look how the card has the same price of the AMD and NVIDIA card while offering less performance, this card should be $200 MAX and the RTX 3050 is way overpriced too.
@@stefankoopmans2200 Also, at the low end relative perf starts to become a dubious phrase.
@@GhOsThPk I'm just saying I see potential here, not that it is very good as of this moment but it is certainly healthy to have a 3rd player in the mix. Drivers have already significantly improved over time. And it's fine though, our opinions may differ but I'm still rather impressed by certain aspects of it and I am not comparing prices here, I'm looking at it from a technology standpoint and what it can bring us in the future.
Looks like the 6600 xt is going to be the new 1060.
How do you mean?
The 1060 wasn't as fast as the RX580, and the 1060 cost more than the RX580. The 1060 was worse in every way. Sure, it still sold 800% better than the RX580, but that's simply because the vast majority of gamers made the wrong choice.
The 6600XT blows the 3050 away. Doesn't just beat it, like the RX580 beat the 1060, but beats it even more than the RX570 beat the 1050Ti, which was a huge amount in itself.
6600XT is an entire tier above the 3050.
1060 was a semi decent card, a time when Nvidia didn't offer far lower performance for more money. It offered roughly the same performance as AMD's card, at roughly the same price. Of course, that is basing numbers on the time of release back in 2016. Today, some 7 years later, the RX580 is much faster than the 1060 thanks to the RX580's true support for DX12 and Vulkan, compared to the 1060's emulated support, and thanks to AMD giving attention to it's legacy cards through drivers, as opposed to Nvidia dropping all but critical support for Maxwell/Pascal many, many years ago.
6600XT is the new 1060? I fucking HOPE not! XD
@@TheVanillatech It wasnt worse in everyway when the 1060 came out it was faster but anyone that had some knowledge knew the 1060 is faster today and maybe the next year but the 580 with more VRAM and better DX12/Vulcan support, in the future it will do better.
Had better driver support AMD 5+ years ago was nowhere as good as they are now with RDNA and up
The problem with that future was the time you needed to upgrade both the 580 and 1060. Sure you could stay longer on the 580 but to keep the same experience you did when you bought either of those cards you needed to 5700 style card
If we're talking about 1080p 60fps performance then you could argue for either the 6600 XT or 6600.
However, for not too much money more you could get a 6700 XT which is even a bigger tier above and offers even more value especially at 1440p so you could also argue the 6700 XT is the true champ of this generation.
But honestly, none of those cards offer their performance at prices which similar tier cards used to offer their performance at.
Remember the 1070 at 380€? Well this gen you're paying 400€ MSRP just for the 3060 ti so I find it argue that any of those cards could actually match what graphics cards used to offer back then. I'm not saying they're bad value because they're still good value, but it's no longer what it used to be and GPU prices will probably continue to increase unless Intel actually manages to do something about it.
And so as long as that's the case, I think it'll be hard to compare current gen cards with past gen cards in terms of value.
@@imo098765 Okay listen son - there are TWO THINGS that MATTER when buying a GPU. PRICE (okay) and PERFORMANCE (gotcha).
The 1060 6GB was MORE EXPENSIVE than the RX580 8GB! (AMD WINS)
The 1060 6GB was SLOWER than the RX580 8GB! (AMD WINS)
Anandtech shows, from it's 2017 graphs, that across 25 games tested the AMD card was 5.7% faster.
Anandtech shows that from 2022 graphs, across 25 games tested, the AMD card is a massive 19% faster.
Now put THAT in your PIPE.
And smoke it!
@@TheVanillatech 580 came out a year later, when 1060 came out my only option was 480 which was clearly behind and had the TDP of a 1070, the 580 "refresh" was the same card but with even higher TDP, i didn't make the wrong choice neither did anyone with 1060. In Europe i pay over 20c KwH and im some of the cheapest, people didin't want the 4/580.
I agree 6600XT consistently slays and is the right choice, mid-range card for productivity and gaming A750 will age better than both competitors. but that's base of what I crystal balls expect, as more games use require RT as well. A750 held back by drivers in bad games like Cyberpunk and UE5 but Intel has been dropping frequent fixes so eventually maybe they'll fix those. So much fixes to do when you have pass games and future ones to improve in DX11, one of their worse titles Halo Infinite got fixed a few days ago
A750 already struggles to get 60fps in newest titles without RT. and even if the performace was there with 8GB It'll be VRAM bottlenecked by the point RT really matters. The A770 16GB is the only GPU that I expect to age decently.
@@JaimeIsJaimeThe sun-lit Metro Exodus use of RTGI is my example of where the performance can be when optimized for console, not cyberpunk etc. interesting, I saw the a750 reach 60fps or better in new games often better than the 3060. I'm not advocating for this to be the decision for the purchase, but at this price range you can still get some ray tracing on with optimized settings. the same RT-cores will be added to applications like blender for Nvidia parity that the 6600XT or AMD in general cannot yet match. GPUs can do more is all I'm saying
What version of the Arc driver was used. These results don't track with what other reviewers have found. Also tables turn in the Arc's favor on 1440p
was resizable bar enabled for the intel gpu?
It still hurts my brain to see people choose the RTX3050 over the AMD cards! The RX6600 is cheaper and much stronger in every aspect!
Would you be able to get your hands on a Maxwell/Pascal card to compare with these? At $250, these cards seem in the sweet spot for people returning to PC gaming after the drought of the past few years, and I feel that era of card is relevant to show where you're improving nowadays. Maybe a 970/980 or a 1070/1080 as well and then a comparison with the 3060s and the 6700s.
I already did the research since my RX 580 and GTX 1060 6GB can't do much anymore.
The best card for the price and super power efficiency is RX 6600 for ~$190-200 (with free game)
Then RX 6700 at ~$260-280
Consider the XT version if you can find them with good price too
For gaming purposes only at 1080p, what is the best GPU between these 3: RX6700xt, Arc A750, RTX3060Ti
And which one is best P/P?
intels improved so much since launch with all the driver updates. can't wait to see how the battlemage will be when launched.
Hey Daniel i think you should add black background for your OSD because it can be difficult to see especially frametime graphs
How can AMD compete that even with a massive win like 6600 XT vs 3050, Stuff in the past like RX470 vs 1050 they vastly outsell AMD. This just means AMD is forced not to compete at the low end thanks to stupid Nvidia buyers. Only Have Nvidia to blame. I bet even 4050 will be hard to match 6600 XT.
4050 will be like a 3060
And 4050 mobile will be weak af like a 2060 super
People love Nvidia like everyone like Apple. Samsung and many other phone companies also make great phones but people just buy Apple. Same goes with Nvidia as they were the first player to bring gpu for computer gaming. People have positive experience with them.
lol, it's not because they've made inferior products historically, and have a reputation of being less reliable, it's Nvidia fault that they can't sell their product.
A XFX Rx 6700xt like $200 cheaper then a Rtx 3060ti in my country then the Nvidia brand a shitty zotac brand
@@stephendippenaar9986 I am fine people buying 4080/4090 cause they have reasons but 3050 is trash.
The 3050 is an insult for budget pc gamers at that price point, it should’ve been priced for under $200.
1050Ti was an insult to budget gamers. Cost the same as an RX570, but had 50% less performance. Didn't stop 800% more people buying the insult, over the AMD card.
You can't fix stupid, bro.
Cope, both of you.
Which is better 6600xt or 6650 xt amd graphics plz tell
i have both and they run in crossfire :) (no one mentioning this at all) the 6650xt is 5% faster
There's no point in buying a 3050. It's too overpriced for what it is. You'd be better buying a used 2060 or 2060 super if you only want a Nvidia card
Prices in Costa Rica, some RX 6600 under 280$ and there is even 1 under 235$, while the RTX 3050 can be found at almost 350$
Would you recommend the 4050, 4060 laptops if I want to stream but is a student?
Those are significantly overpriced right now due to how new they are and lack of options/sales. Check out Jarrod's Tech for laptop GPU reviews. The increases were pretty underwhelming for the 4000 series.
Of course, why would you pick a pc while still as a student ??
Is it reasonable to see the same kind of video with RTX 3060, RX 6650 XT and ARC A770?
Here in Denmark you'd have to drop the RX 6600 XT to af RX 6600 in order to match the price of 3050 and A750.
The 6650 XT is cheaper than the 6600 XT but still priced higher than 3050 / A750. AMD pricings are weird in Europe.
@@getawaydance maybe because European customers are smarter, so the demand for 6650XT is higher than that in the US.
happy with my 6600xt .. for one year now!
what intel driver? latest is 4146
Was the resizable bar / Smart Cache active for the GPUs? AMD and Intel GPUs can benefit from it.
I think so. Otherwise A750’s perf will be much worse.
Cool video! Maybe next a380 - 1650 - 6400 ?
Been done already.
@@joggabonkers6380 but the search does not find such a video on this channel...
ARC A750 and A770 are very promising architecture. Very new, drivers have to mature but it looks like if they would make a "high end" product maybe they could beat nVidia in raw Ray Tracing and AMD in Raw Rasterization. Who know, I'm rooting for them very much.
They were supposed to be high to mid teir. 3070'ish range but they never hit the mark.
@@kimberlylewis5820 We will see what future brings. Because it is a very new architecture, their next archirtecture could be very strong an learn from its previous mistakes and I hope they will come better drivers. We really need a third party into the GPU competition...
@@PeterPauls we do need a third competitor, but Intel really dropped the ball. I have serious doubts Arc has a future if it can't deliver with battlemage.
I know these cards start to struggle pretty hard at 1440p for most of these games but I've seen benchmarks where Intel GPUs start to pull ahead at that resolution making the gap between AMD and Intel smaller. Kinda puts A750 in a hard place since A770 is more suited for that but you lose on the price to performance. I'm thinking that if you tweak a few settings and maybe run games that are a few years older, the Intel and AMD cards can become decent budget options at 1440p.
Good work. Danny O
!
Please do one for 1440p gaming! Thanks for the awesome content!
Gonna want a higher tier than that for 1440p
@@samarkand1585 nope 6600xt runs 1440p flawlessly
@@StevieCEmpireofUnitedGaming If you'd said 6700xt then sure, but just Hogwarts demands 9GB of VRAM in 1440, so how do you plan on making that 6600 handle it? Lower the settings I guess, but then it wouldn't be flawlessly anymore, would it? Plus the vram wouldn't be the only part to suffer in this case anyway
@@samarkand1585fair point i wasnt clear. i meant to say on older games, for next gen graphics you are right. I have a 6650xt and 6600xt in crossfire (civ vii prep) for current gen games and have a 3090oc for the more demanding games (3440-1440p)
@@samarkand1585 I have context. I can play world of tanks/warships at 95-100fps (i cap 100) on ultra HD settings with the 6650xt, with the 6600xt I was getting around 85-90fps average 3440-1440p to me that is flawless
the 6600XT can be bought as Little as $145 during last aliexpress sales that includes WW shipping. its sold as the 6600M. given its huge die size the 770 should have competed with the 6800/3070TI
You don't base GPU's on die size in terms of competition. You base it on PRICE and PERFORMANCE. Nobody cares about die size bro. Games don't run faster based on how physically big the GPU is, and you don't pay per square millimeter.
PRICE and PERFORMANCE. Always was. Always will be.
No its NOT, the 6600m is the 6600 MOBILE its about 90% of a regular RX6600, not an XT.
I know this video is a month old but ...
The RX 6600 XT doesn't seem to be on sale from the usual suspects over here, only the RX 6600 and RX 6650XT, unless you're prepared to pay a huge premium (OCUK has the RX 6600 for £230, the 6650XT for £270, the A750 for £230, and the RTX 3050 for £240); Amazon has the RX 6600XT for £315 or more). A quick gander indicates that the RX 6600 is about 20% slower than the RX 6600XT. How does that change your calculus?
With regards to the A750 blackouts, you might speak to Graham of the Adamant IT channel. He's had blackouts with all sorts of cards - Nvidia, AMD, and Intel - so he may be able to help. He's generally found that the issue has been with the capture device, not the GPU.
One worthwhile test you seem to have missed was performance while streaming: the RTX supports NVENC and while the A750 supports AV1, neither Twitch nor RUclips do.
Man your videos are really good
Intel's A750 performance is.... weird to say the least. I have a feeling in the future, as more driver bugs get worked out, these will turn out as really good budget options.
No shit, it already beats the shit out the 3050 and costs 20% less. I'd say that ALREADY makes it a good budget option, wouldn't you????
Unless I'm crazy or something. I don't *think* im crazy.....
Maybe I am crazy! Cos all I see is the 6600XT smashing the 3050 to pieces (and costing less) and the 750 beating the 3050 easy (and costing a lot less), yet the 3050 selling 700% more skus!
Crazy...
@@TheVanillatech not crazy at all, though for a while now Nvidia has not been the company for budget gaming. I do find how Intel and AMD trade blows at this price point very interesting. Today I would recommend the 6600XT for budget gaming, but who knows, maybe in a few more months Intel will have drivers sorted out even better and the 750 might even take the Radeon card's spot.
@@SaxaphoneMan42 You say that Nvidia has not been the "company of budget gaming". In what world does that mean that a shitty ass 1050Ti, which can muster maybe 30fps in 1080p medium/high settings on release, outsells an RX570 with 60fps ALL THE WAY, for the same price, by 700%?
You make zero sense.
It doesn't matter if YOUD reccomend a 6600XT for budget gaming, the numbers speak for themselves. They always have. Thats why people benchmark cards. Fact is, Nvidia has been bitchslapped many, many times over the years by AMD (not even CLOSE!) and still went on to sell more of it's trash offering than AMD did of it's vastly superior counterpart.
You can't fix stupid. Even if YOU reccomend it! lol
It's not that weird to me. Intel ARC is nothing more than a testbed; an introduction of a 3rd competitor to the gpu market. The problem is A770 is the best Intel has until maybe fall 2024. But by end of this year, Nvidia and AMD will have their below $400 budget Ada Lovelace and RDNA 3 cards. As a result, last gen RTX 3070 and RX 6750xt will be selling well below $300. Both cards 'wipe the floor' of Intel's best offering with less bugs and inconsistency. Good news, Intel will be much more competitive with their future generation of gpus. But this gen is nothing more than Nvidia's Turing intro to raytracing. A good idea and welcomed addition but an overpriced under achiever.
@@garyb7193 6700/6750 is competitor to the 3060, not the 3070 or even the 3060Ti (at least, in cost). Obviously the 6700/6750 destroys the 3060, whoops the 3060Ti and is more of an even match to the 3070.
And RTX is STILL an over prices under achiever. Some 5+ years later, after Jensen promised, on stage of Turing launch "120+ RTX games within 18 months!" to raptuous applause, and in reality 18 months later there were less than 20 RTX games and one was Quake II from 1997, another Minecraft.
RTX today is as shit as it was back then, with only stupidly expensive $1200+ GPU allowing for smooth framerates (sometimes with fake frame generation, and always with upscaling via DLSS).
A ridiculous notion, and price, for shiny floors and reflective puddles.
Intel have come in and on generation ONE, are outperforming NVIDIAS 3050 .... for 30% LESS MONEY!
Great content! Keep it going man!
Somehow here, Rx 6650 xt is cheaper than Rx 6600 xt.
Sweet video, Daniel. I'm getting a new graphics card so. Finally, freaking Finally, I've been saving for about a year. And no major life hiccups have occured.
Serious question for you and others here.
My budget allows me to get one of two cards.
A Speedster swift Rx6700 10gb non-xt or a Zotac GeForce 3060ti?
I'm leaning more towards the 6700 as I like 10gb as opposed to 8gb but I think the 3060 has a 256bit lane.
I'll only be able to buy one and it has to last me as long as my GT 1030. 😅
I don't really play online games, mostly a single player gamer.
I would really appreciate thoughts and insights because man, I'm going back and forth on this.
Thanks again and keep up the awesome videos man, much appreciated. ❤
I'd say it really depends on what you are planning to play and if you care about nvidia exclusive features and raytracing, performance should be about the same without raytracing. Other than that they are mostly the same, you could also get whatever is cheaper.
@@andersonfrancotabares3614 it's like Dan said in his video, the premium for Nvida features is pretty high. I might miss having the option for fsr and dlss and raytracing would be nice to have but I guess not a priority for me. For years I've been lowering resolutions, using low graphical settings and using scaling resolution just to hit 30fps. I just don't want to invest in the wrong thing and regret it two years later, you know?
@@GamersTherapy Then I'd suggest you go with the Radeon GPU, I have a 6700 XT which now performs closer to a 3070 than a 3060 Ti I'd assume the non XT variant is closer to the 3060. DLSS and FSR are most useful at resolutions higher than 1440p, having used FSR at 1080p it smears everything and makes it blurry compared to native, also neither of them will have really good RT performance at 1080p anyway, AMD is also working on FSR 3 which might work pretty similarly to DLSS and features that use tensor cores in nvidia cards.
@@andersonfrancotabares3614 thanks for the advice bud, I appreciate it. I can't find a 6700 xt currently so will hold onto my money for a bit longer and see what happens in the market during my hunt for one here.
don't know about 750 vs 6600 XT version.... the price of the 750 and base 6600 is the same over here... the XT is a good extra chunk of cash.
funny how people talked crap at launch about the arc cards. no company new to discreet gpus launches with no initial problems. now with the updates they perform much better. the a770 is on par now with the 3070. ties in some games and beats in others. videos out there showing it for the disbelievers.
Recently I bought a used 3060ti for $250 so now I care even less about this new generation of ripping off our money.
Congrats! that was a great deal! I'm happy with my 2nd hand Red Devil 6700xt I got for $300. Your deal looks to be even better. Lol
I already own a Arc A770 that I have in my budget build PC 😁
How has it been perform for you?
@@Jenity it’s preforming very different in each game but It appears to do very well on 1080P medium- high settings 90-120 fps and around 60-90 on 1440P medium to high settings in most games I have tested.
It's a 1st gen product. I never buy first gen products. Hopefully, Intel can build on this. We need a third player in the GPU gaming market. Notice I said "Gaming" . Seems , like these guys are eyeballing dollar signs in the AI market. We may never see a sub $300 GPU again. When , they sell businesses GPU's for 4X the cost.
Gaming is all about consoles these days.
for whoever is updating sponsorBlock with this channel, thank you!
It's just ironic that Nvidia 50's cards used to be budget build kings and the 3050 is one of the worst value budget cards we've seen.
Was rebar enabled for the ARC?
Yes
Hey Daniel, i need your help. My son has a gaming-pc 12700K/AMD 6900XT.
Now we plan to build a extra streaming pc.
My plan is to go with CPU I5 13500 and Intel ARC 770.
Is that a good idea?
Or are there better optipons spec. with another GPU?
Thanks and regards from Germany
Don't get an intel gpu for streaming. There's a lot of issues with it at the moment. You can get a used rtx 2060 for cheap and it'll be better.
What do you by a streaming PC? A secondary PC to stream off the gaming PC? You could literally use software encoding x264
If you're building a dedicated streaming machine, you can just use CPU encoding. The quality can be better than GPU streaming. No need for a GPU.
@@fakethiscrap2083 no creo, tu no tienes tarjeta de Arc 750 nunca nada? Yo si tengo Arc 770 + mi i5 13600k son más increíble bestia mejores
@@evilleader1991 we want to try that with an extra pc ....
What's your thought on people buying 3050 by claiming Dlss over 6600 .
I'd guess that he wouldn't suggest DLSS upscaled from a lower than 1080p resolution vs a card that can just run 1080p at 60fps+ in most cases natively given he thinks upscaling from below 1080p produces a worse image.
Judging by (essentially just a hunch) Linus video on the ARC gpu where it behaved weirdly with his long active cables, seems Intel has something non-standard going on with their video out connectors or firmware
Why is the 6600xt compared to the 3050 and 750? In my country, 6600xt costs about the same as 3060 and 770.
because this is about the US market
In my country arc a750 available in 255$ and rx6600 non xt version price is 304$. I think Intel giving good price to perform after latest drivers update
At this tier ray tracing is pointless
Facts!
What? Why not include second hand gpu's in this comparison?? In The Netherlands i can buy a 1080TI for 250 euro. It is as good as a 3060TI which costs 440 euro! And you are showing a 3050 against that, with better American pricing as well. So many people bought 1080ti's, that there are enough second hand offers for anyone looking for them. Anyone looking for gpu's in this price range should absolutely consider 2nd hand offers.
That is what I am observe as well on your 4090 vs the XTX comparison video, although 4090 gives out so much FPS numbers but if you look at the videos closely the 4090 stutters are visible vs XTX native and FSR's video it was smooth. So I'd take that sample. I wonder how much will it be improve more with their FSR 3 now introducing their Fluid Motion Frame Technology and with Hyper RX, both can enhance your gaming performance and experience.
If you might ask where are those stutters happen an example would be in Cyberpunk on the part where there's a paper on the wall and the black and orange stripe thing on the concrete wall and the early part of the video of Callisto.
Stutters could also happen from poor game optimisation, it is not always gpu's problem.
@@lunator100hd If you have a really fast CPU, it sometimes stutters because the CPU cant feed the gpu fast enough, which is the reverse of what normally happens. Normally you can fix it in most games by capping frames.
How is the A750 (in cyberpunk ultra RT) using more VRAM than it has on board? I’d imagine that’s a driver issue
I noticed that too. I'm wondering if it somehow not reporting to msi afterburner correctly.
@@danielowentech Or its also counting shared mem.
Good video. Never confuse value with cost. On Newegg U.S right now, (that could change at anytime), a rtx 3070 is only about $100 more than a rtx 3050. The rtx 3070 is around 3x's faster in most games than the 3050. Spend a little more money and get a LOT more performance.
3060 is going for around $100 more but 3070? Even then, the 6600xt can go toe to toe with the 3060 at the price point of a 3050. Cheapest 3070 available is for $700 and that's massively overpriced. Someday I'll get my hands on an Nvidia gpu.
@WeDa1sJak youre lying. People have bought used 3080s for $500, there are definitely 3070s for less than $700. Cmon.
@@Wobbothe3rd if I'm lying, that means pcpartpicker is lying. At no point did I or the person I was replying to, ever mention used gpus...if you want to go that route, I'm sure you can find even cheaper entry level gpus on the used market. SMH.
Agreed- better to spend as much as you can so you don't regret the purchase later. There were times in the past you could squeak by with some lower tier stuff but that is just not the case right now. That said 6600xt is still the most powerful in the video and maybe $100 is too much for that person- picturing young teenagers buying their first cards or building pcs etc.
You can get a lot of used cards for cheap- just buy from a reputable seller.
3050 is most expensive and slowest... this just shows how much people dont care if there is anything else and just buy nvidia. That level of performance should really be in $150 GPU.
RDNA2 GPUs right now are where it's at if you want the most bang per buck. I've been test driving an rx6600 on my Small Form Factor 1080p machine, using a 5800x3d, and I was actually floored with how well this combo handles CPU heavy titles I play (DCS World flight sim primarily). Makes me second guess my purchase of the much more powerful rx6800 because you also get so much power efficiency when going with a smaller GPU. My computer runs so cool and quiet and produces a fantastic gaming experience.
6800 is the most efficient rdna2 GPU, especially when you consider the performance.
@@evilleader1991 Yes but it's overkill at 1080p for the games I play.
@@SirMo Sure at 1080p, but it will last you longer for newer triple A games.
@@evilleader1991 no the rx 6600 is slightly more efficient than the 6800
Would've liked to see 1440p medium!
Not enough VRAM for that.
@@justfun5479 bs
Agree. In all other benchmarks the A750 closes or even surpassed the RX6600xt in some games and in ray tracing
For games with RT effects Intel A750 only.
If you're in the bottom tier of graphics cards- AMD or Intel. You would buy an Intel card for the same reasons you would buy an AMD card in opposition to Nvidia....to support them and help them grow/improve and this will also cause AMD + Nvidia to improve further. If you don't care about any of that or the future landscape of graphics cards- AMD as always is the best bang for buck in that range.
6650XT smashes 3050 - same price
6700XT smashes 3060 - same price
6800XT smashes 3070ti - same price
6950XT smashes 3080 - same price
Just the bottom tier, huh?
@@TheVanillatech Yes just the bottom tier- when you get into the higher tiers there are more factors to consider. To say the 6950 smashes the 3080 is a bit laughable. You can do about the same things with both cards and one has support for things that the other does not. Not saying there isn't value in the higher tier- if you only game and don't play a wide variety of games AMD seems pretty good. Seems like people playing FPS/shooter type games do well with AMD. One size fits all solutions are just not good. I will never say AMD doesn't have good value overall- they always have, that's the appeal though that value is growing less as they inch closer and closer to nvidia pricing.
@@glakoblako The 6950XT is better than the 3080, not just in a COUPLE of games, but across a 25 game spectrum. You can say that, in a couple of games, it DEMOLISHES the 3080 and even beats the 3090Ti. That would be true thing to say, cos it's true.
And AMD's appeal isn't just "value" is it? When the 1050Ti is DOGSHIT in gaming, yet costs the same as the far superior RX570. That makes the appeal of the 570 not "value" but "can actually game". Value doesn't even come into it, does it? Given it's not even a fair contest. Same with the 3050 vs 6600XT. Same with MANY other situations.
Dummy.
@@TheVanillatech Lol how can you be offended by what I said- you're a fangirl. Maybe one day you will get to meet AMD in person! AMD cards can't stream. AMD cards are slower to edit. AMD cards have more driver issues if you play more than 25 games. Then you make some argument that I never made- I said that the AMDs cheap card was the best value. Then you claim that a 6950 "demolishes" the 3080 when it's a 7% improvement but ignore all of the other reasons listed above as to why someone might choose one over the other. You even claim a 3090 isn't as good >< you are just emotionally invested in AMD. It's ok bro- they are just companies, don't have a cow.
In Belgium, electricity is so expensive that Nvidia has kinda become a no go for me. They sacrificed efficiency for performance.
Surprised by Intel GPU performance, just slightly behind amd in raster performance but better in ray tracing .
Next gen Intel GPU will be crazy good , they done well on first try .
Your videos are very informative, but at this price point the tests are a little overkill. I don't think anybody is buying these cards for epic setting and especially not ray tracing because nobody in 2023 wants to game below 90fps. Maybe test some setting configurations that shoot to achieve high refresh rates as well as high video quality! Would be so helpful for people trying to get 144 fps in competitive games.
In UK 3050 is £60 more expensive than a750 and 6600xt pretty much impossible to find the next best thing is 6650xt which is more expensive than 3050 and way expensive than a750. Price vise a750 is the best what you can get for £250.
Easy win for AMD, however prices can vary noticeably in different countries, when you have 3050 for 290$ and 6600xt for 405$, choice in not that simple.
in sweden you can get the 3050 for the same price of a 6700xt almost. Dont get ripped off dont buy nvidia.
It doesn't matter what you say, pal. The 1060 lost in every way to the RX580 - the Nvidia was more expensive and 5% slower even back in 2017, today it's more like 20% slower. Still, 800% more people bought the slower, more expensive Nvidia card. You can't fix stupid. Around the same percentage bought the absolutely dogshit 1050Ti for the same price as the FAR superior RX570. And right here, as we see the 6600XT destroying the 3050 for the same price, it won't matter - MOST people will buy the Nvidia.
Why do you think we are all having to pay so much for GPU's? If Nvidia can make SHIT cards, and charge people MORE money for their shit cards, and yet STILL the vast majority of people buy them over the superior, cheatper, better AMD cards - why on Earth wouldn't Nvidia raise their prices? "Hey Jensen, we figured out gamers are so stupid, we can make dogshit products and still sell them in droves compared to AMD! What should we do now?" ..... "RAISE PRICES! XD".
@@TheVanillatech People are brain dead, its the same reason why apple is still a company.
Thats insane if Sweden people still buy 3050 at that price
@@mercurio822 There are times when buying Nvidia makes sense. There are never times when buying an Apple product makes sense.
@@MistyKathrine no its never a good idea to buy nGreedia.
Once again, we're left with the question of "Why the fuck is anyone buying the RTX 3050, ever, for any reason". Reputation is in the garbage, reliability is fine but you're getting ripped off so badly with the card that you would do much, much better to just WAIT and buy a 3060. The 3050 is a trap. Bad product made in bad times while they were shipping off all the good product to farms.
It would appear that Arc has the hardware to compete, but on the software/driver side...a single yike is awarded by the committee
Man, Intel has got a lot of catching up to do. Sometimes it performs miraculously well, but other times it's a complete stuttery mess. That's what I call unreliable
Who would buy a 3050 unless maybe a used one for cheap?? I just picked up a "used" 6650xt off Amazon for $200. It's almost brand new:)
jawa's website does not work
Amd first intel second and nvidia third… but everyone will buy the nvidia 3050.
Seems like you had a lot of coffee this morning 😂
in performance
3050 is somewhere between 1070 and 1080 (worst deal ever, used 1070 can be had under 100$)
6600xt is around 1080ti (decent deal but I've seen 1080ti's go for as low as 150$)
A770 is between 1080 and 1080ti ( good performance at higher res and better ray tracing than amd
cards but much more cost efficient than nvidia )
1080ti is between 3060 and 3060ti ( got myself one for 150$ and it runs everything nicely
used 1080ti and some tweaking, you basically get a rtx 2080 like performance in 90% of the games out there, pretty sweet deal IMO, unless you really need ray tracing or newer architecture
1080ti was a great GPU at the release, but it's getting old now. The driver support won't last forever. Also worth noting. rx6600xt will use max 160 watts. 1080ti will use 250 watts. So the newer GPU is significantly more efficient.
So...intel's only advantage is when Ray Tracing is Used?
Thank you very much for the in depth analysis for these cards. There’s is a lot of information beyond base fps and 1%that people need to see before purchasing. The Narrative that Intel is already at the level of AMD and Nvidia is clearly a false statement as it looks like they still need multiple interactions to really compete to other two.
Intel is behind Nvidia, but its RT is catching up to AMDs. XeSS is not as good as DLSS, but better thsn FSR.
@@Wobbothe3rd my problem with Ray Tracing is that even with Nvidia, it’s still in the early stages. Also since Consoles are bound to AMD and their jerryrigged 5700xt with ray tracing tech predating RDNA2, it’s very unlikely we see any real usage til another generation or two.
@@franciscoc905 your post sounds as if you are somewhat ignorant of the console's specs, and we already have plenty of RT around right now, for anyone to act like we need to wait another console gen or two.
Anyway, its not the consoles or AMD that causes devs to hold back doing more or better RT - just look at the market share on the Steam survey - that much RT hardware in the upper whatever is a recent trend. Across this gen, RT will get mainstream, HOWEVER, if you means an RT only game (something like that newer version of Metro Ex) then yes, we are many years from that being normal because you would need ALL apu's to have RT for that to ever happen.
don't forget the AMD new driver optimization that might reflect the result 🙂
These are the 23.2.2 latest drivers