I lost some footage from a couple of games, so I had to fill in with random clips from my archive. Just ‘cos I know some of you will have noticed 😊
Honesty is the best policy 😇
Please do a video on the 6800 XT to complete the lineup. This year there have been a whole lot of scams in the used 6900 XT market involving failed GPUs. The 6800/XT is a safer used buy; they're going for around 300 USD in Europe and sometimes even below that. Given it outperforms the PS5 Pro and has 16GB on board, it's quite an interesting purchase. Maybe a comparison with a used 3080 10GB at 1440p ultra is in order, showing how the situation has changed with more RT in games but also much higher VRAM requirements since 2020, when most people were saying "you can't make use of 16GB in gaming". That 6GB difference, with 10GB hanging close to the edge, is much more apparent than the 4GB difference between the 3080 Ti and 6900 XT, where 12GB is still decently comfortable right now. Please do!
The incoming B580 looks set to solve a lot of our issues in this budget segment.
Most def 🎉
Driver issues acknowledged, but you can't ask for everything at that price point. The raw power of the new Battlemage stuff is definitely class-leading though. I wonder how the 5060 will compare? I imagine similar performance, but at like 330 USD. And God, if you do exist, please don't let it be 8GB.
It did.
@@chilldude30 I've watched about 6 review videos and most of them talk about how quickly Intel were fixing driver issues during testing.
@@chilldude30 I hope my comment ages badly, but current leaks suggest an 8GB buffer, which is just unacceptable.
0:54 the Still D.R.E reference was not lost upon me good sir.
It's great timing, too. Snoop and Dre released their album last night
I dunno about you, but I'm ready for the next episode of Radeon.
Yeston were selling these on their website until last week for $270 and $280 - the only place I had seen these cards at a reasonable price.
I got this one for £200 on AE, but it’s taken three months for me to get round to reviewing it, and in that time stock just kinda evaporated…
They're on the way out at the end of the day and given how newer gpu gens are they probably aren't going to become amazing value now. Maybe battlemage will push the remaining stock/used prices down though. The new b580 battlemage intel vs the 6750 would be an interesting video.
Yeah, the stock for the RX 7900 GRE absolutely evaporated too. Saw an ASRock Steel Legend for $549 on Amazon recently (I'm in NYC) that sold very quickly. Dudes are price gouging at like $700-plus now, which is just ridiculous 😂 @IcebergTech
I got the 6750 XT at a very nice price last Black Friday, and it's doing amazingly well at 1440p with today's titles.
Is it much faster than the GRE? Like 20%?
dude i listened to spotify and Still D.R.E. started, then i opened youtube and this video pops up in my recommended ahahahhaha
I have the 6700xt, got it last year for $225, highly recommend
Good price if new
Also got a 6700xt but for 250$. Used it for 2 years before upgrading. It was a solid card, loved it for 1440p!
got my DELL RX 6800 XT for 425€ new
What a coincidence: 2304 shader units is also tied to another graphics card with deceptive/confusing marketing from AMD, the RX 580. I still remember the full-fat one has 2304, while the most commonly available is the 2048SP, especially here in SEA.
aliexpress has a strange obsession with 2048sps
Was actually vibing to Still D.R.E when this video went up😂
i love the rather misleading name, to make it sound like it's actually better than the 6700 XT.
They were acting like we forgot about GRE.
The soundtrack you're using for the intro and outro is a darn banger 🔥
Interested how it stands up to the new B580,
also love that it didn’t take long to see this vid after i asked on a previous one😊
well, the 6700 XT that this card pretends to outperform can only trade blows with Arc, and falls apart in RT by comparison
These Dre references are lit homie! 🔥
I wasn't disappointed with the intro music :)
GUESS WHO'S RED????
I love the title…good job!👍
Any affordable GPU with more than 8GB of VRAM is desperately needed.
B580 says hallo
great video! thank you :)
AMD is not as bad as they once were with releasing refreshes of refreshes, which were a refresh. But they still are announcing announcements for an announcement.
hahaha, RX 200 and RX 300 were WILD for that. different GCN revisions all across the board!
@@interrobangings R9 280X and HD8950 were just different ways to say 7970 Ghz Edition
@@hyperturbotechnomike to play devil's advocate, wasn't Radeon HD 8000 OEM-only?
@@interrobangings I don't remember. I see them on eBay all the time. Same with all the HD 5450 refreshes - the most refreshed graphics card ever. Similar to the HD 7750 1GB low profile (which I had in my HTPC in 2012, great LP card at the time).
As far as I remember, Nvidia did this for a while too with its OEM cards. There are 10 different versions of the OEM GT 730.
Nvidia also does this trick: GeForce 210/310/405/505, GeForce GTX 680/770, or GeForce GT 520/610/710.
Long story short: yesterday's entry/midrange GPU (2 years old) vs new games.
(The RX 6700 10GB was in my reticle, but there weren't many good value options, so I picked the RX 6650 XT two years ago - it does an OK job in online gaming, even at 1440p.)
That 12GB GPU family is still not priced the way it should be. Prices have stayed pretty much fixed over the last 2 years, I think.
Great name for a great video!
And I still got love for the streets, it's the G-R-E
love the still dre reference lol
Love the title on this
Oh hey! A new video is born!
Instant like for the best budget reviewer out there!
This channel needs millions of views not thousands
I have the normal 6700 10GB. No GRE needed haha. Can you compare it to the B580 when it comes out here? :3
The B580 goes head to head with the 6700 XT, sometimes demolishing it in more recent titles.
TBH, this video only exists because I want to compare the “PS5 GPU” to the B580. However, as I said in my community post, my B580 is gonna be a while… 😢
@alrecks619 I compared it across 3 benchmarks and it didn't beat the 6700 XT at all, and it fell way behind in real-world game tests. I don't trust YouTube propaganda.
@@alrecks619 It also loses quite significantly in some games, scoring performance that's way closer to the 6600 XT. On average, the two cards seem about equal in raster.
I can't believe Intel, AMD, etc. aren't providing you with all the GPUs you want by this point. You've built up a pretty big and loyal audience (including yours truly).
Clearly it's a 1080p card.
As a huge fan of the 2001 album the Dr. Dre references are brilliant to me 😂
Can you disable the audio tracks in different languages? It sounds absolutely horrible (at least in German).
YouTube appears to be automatically applying them to some videos again, just like the caption translations, which I absolutely loathe seeing in Dutch while the audio is still English lol.
please dont ever change your intro music that shit hits
Hello Iceberg Tech, how are you?
I remember you making an i7 6950X video - where is it? I wanna watch it again. Also, you asked for an E5 2696 v3 CPU too. Did you ever get that and OC it? Happy Holidays. 😊
Love the channel, fun video too :D
Just a quick question. You mentioned you are planning to do another PS5 equivalent build video next year. Any thoughts on going with a CPU that has the same L3 cache as the PS5? Such as a Ryzen 5 4500?
10/10 video name
You should try the AMD Radeon Pro V340 or the Instinct MI50 for gaming.
i have a GRE ... 7900GRE :D
lmao the song reference. :D
my gawd iceberg, i fn love the title!!! i don't think most people will get it🤣 nothing but a Gthang!!!
I thought AMD cards did better in Starfield than their Nvidia counterparts, or was that just at the beginning? I agree about FSR in Cyberpunk btw, I use XeSS on my AMD card in that game.
Suggestion: You could try using the best APU of AMD’s FM2+ platform, the A10-7890K, and experiment with the hybrid CrossFire feature that’s unique to this platform, pairing it with an R7 250 (or 250X). It would be really interesting to see how this setup performs in 2025, perhaps with some overclocking as well!
Get Rekt Edition.
i like the title
I fucking love the title of this video. Clicked faster than my brain would process the reference 😂😂😂
Texture cache seems to be what mostly bogs down VRAM in Indiana Jones, so there are two texture-related options: quality, and how many textures are kept in cache. The rest is quite irrelevant from what I've seen.
Personally I can't see myself getting a Radeon until/unless FSR becomes at least comparable to DLSS. I still need to try the Intel XeSS version that's only available on Intel cards - I've heard XeSS on Intel cards is comparable with DLSS.
Putting this video out the day after Intel Arc put out a card that beats this at 1080p and 1440p for a much better price is unfortunate.
Nvidia and AMD
Even Intel couldn't resist having a 12gb card with a 10gb equivalent
The B570 uses a different name than the B580 though, so it doesn't really fit in with the 6750 GRE and 3080. It would be like the 6700 vs the 6700 XT, or, if no 3080 12GB existed, the 3080 Ti.
Golden Rabbit Edition, and not even in the Year of the Metal Rabbit... they could have gone with FRE, Fluid Rabbit Edition.
I predict that more and more games will not let you turn off ray tracing in the future (like Indiana Jones), as it's cheaper for developers to implement RT than to place custom lights. I think AMD cards will lose a bit of value because of that for those who want to be able to play the latest games (just like the low-VRAM cards from Nvidia, which will probably have issues too).
I wonder how the Arc B570 compares to this? Both are 10GB cards. The B580 is already comparable to the RX 6700 XT, so (via the transitive property) the B570 = 6750 GRE.
Nice 👍
so it's much slower than a 6750xt?
I got this in my new AliExpress build. Cost me £200 delivered during the BF sale. It was as cheap as £180, but that seller cancelled my order.
Edit: just seen your specs, and I have this exact build bar a different RAM brand, cooler and PSU.
Ever since Cyberpunk came out, reviewers and benchmarkers have been making excuses for it. Same for Starfield, and now seemingly for Alan Wake 2 as well. RX 67xx, RX 76xx, and RTX 46xx series midrange cards with 8 GB or more should be able to play any game at near-60 fps or better at 1440p high, on an Intel 10600 or Ryzen 5600. 1440p has replaced 1080p as the standard lower setting for a game. Hardware designers did their job. It's the game designers who are slacking off.
You need to turn on AMD SS from the settings; for the GRE we call it rabbit sauce instead of secret sauce. It will make FSR look better than DLSS, make AFMF generate better frames than DLSS FG, and boost your fps by +20. Even RT becomes on par with Nvidia with SS turned on. Consoles have used SS for generations now and it's usually turned on later in their lifespan, but all AMD GPUs have SS.
Do you mean Super Sampling?
"Even RT becomes on par with Nvidia with SS turned on" what is this copium lol
I'm representing for them budget ballers all across the world
(Still) Hitting them fps in them benchmarks, girl
Still taking my time to perfect the beat
And I still got love for the games, it's the G-R-E
Nvidia prices are so high that they put Snoop Dogg to shame.
Give us an Arc B580 review
I am dreading UE5 games these days
The goofball AliExpress card. Then there'll also be the 7650 GRE coming out this CES, only to get completely overshadowed by RDNA 4 ayy lmao.
I remember checking these cards out when they were announced and realising that they were just rebranded 6700/6700 XTs.
To call them 6750s is misleading at best and dishonest at worst.
I mean the 6650XT exists and is not misleading. The misleading part is not the '50' but the GRE part of the name. If they simply called them 6750 and 6750 XT then there would be very little misunderstanding.
Wild title
needed 4060 8gb as comparison
I see what you did with that title
I just played Stalker 2 with my 6750 GRE and was getting 100 fps with FSR..? 30 fps is a straight lie, c'mon now man.
The non-XT 6700 performs about the same as the 5700 XT/6600 XT and costs much more.
Can you please disable automatic translation? The poorly translated title and description are annoying and I don't think anybody is watching a technical video with machine translated voice-over.
At $200 it would be an OK entry-level card... any more expensive than that, forget it...
If AMD's new GPUs still suck in 2025, Intel Arc might take their place in the entry-level segment...
My PowerColor Fighter RX 6700 performs at least 10% better undervolted!... strange. You should compare it to your regular 6700.
Never bought the _Greedy Refurbished Edition_ of any AMD card; my RX 6750 XT (XFX Merc Black) has all I need for now.
7900 Greedy Refurbished Edition eh?
@monotheisticmortal5122 Yes, the 7900 XT is better while costing a small amount more, and the 6950 XT that came before has the same performance as the 7900 GRE. The GRE is less power hungry, but it's pretty forgettable.
@@Bsc8 Yeah then there's also the 7650 GRE coming out this CES for God knows why.
@@monotheisticmortal5122 damn XD
It's so incredibly sad that GPUs have been sold for going on 25 years with varying specs under similar model names (the names don't contain just numbers), and people are still (somehow????) confused by this concept.
"Duhhh dis one iz kalled duh 10 GB and diz won iz dah 12 GB dey R duhhhhh same."
*CLEARLY* they are not. They are not the same thing. Even the most cursory look at the model would tell you they're different.
There is nothing "deceptive" about calling one thing 12GB and one thing 10GB.
Duhh... this Toyota Camry SE is the same as this Toyota Camry LE. DEY R BOFE DUH CAMRY!
No, they may contain *WILDLY* different engines and have completely different driving experiences, maintenance costs, etc.
still G.R.E 💀
China isn't allowed to get our normal GPUs. The 6700 in that example isn't sold in China; it's not allowed by US law. So the 6750 GRE is a "China only" product because it passes US law for being sold in China. The more you know. It's like how normal Ryzen processors cannot be sold in China - they can import them, but they can't be sold outright. And AMD sold a version of Ryzen to a Chinese chip maker, the first generation 1800X, but did NOT give them anything on the security side, only the basic chip design, meaning that Chinese manufacturer has to develop their own security features for the CPU.
Where's the A580?
A very bad result for this video card and its PS5 equivalent.
Lumen is just a cheap hack like DLSS that reduces the devs' chances for optimization. UE5 should be abandoned completely as a game engine so no more unoptimized garbage comes from it.
Now with the B580, this GRE needs to be 12GB too! No point buying this GRE if it's only 10GB.
Damn that's a rather scummy naming scheme, thanks for sharing
Wwoz
hello
day 1 of asking for 9400f
thats my GPU!!!
I am running a Ryzen 5 5600X3D + RX 6750 GRE 10GB. I'm from the EU btw. 😁
RDNA 1 and 2 are aging poorly, it seems. Forced RT and bad optimization haven't been kind to them. Oh well, new games suck anyway, so I don't need to play them 😂
RDNA2 is doing just fine. The games with default RT are barely affected. RDNA1 is also holding up well as long as you don't hit VRAM limits.
RDNA 1 is 5 and a half years old so I'd say it's holding up well. It's easy to forget how well almost all the 2019/20 GPUs are aging as we collectively forget how old they are since they've remained so relevant.
@@raresmacovei8382 Nah, the only one that isn't fine is RDNA1. AMD cut too many corners when designing it, so it lacks DX12 Ultimate support and even basic things like mesh shaders.
@@Eleganttf2 "Basic things", brah, there's 1 game that uses Mesh Shaders and even that one runs fine enough. This was all fluff in 2019. Some games require RT ... 5 years later? Big whoop. You can also play said RT only games on Linux on RDNA1 anyway, where you can emulate RT, lol.
Gr. GRE vs N Vidia
Still don't understand the 1440p craze going on. Most of these new games, which I never even bothered to play, are terribly optimised and offer nothing new. I mean, Stalker, Star Wars, Ratchet, God of War... it's all just the same, nothing new. In my opinion, no matter how beautiful the game looks, if the gameplay is junk then the entire game is junk. Take Cyberjunk 2077 for example: it doesn't offer anything that hasn't already been seen in GTA V, which is 11 years old.
The only one of these titles that looks good is Cyberpunk. The rest look like shit, and developers take all the things the engine should handle and push them onto the hardware. Battlefield 4 looks better than half of these games.
What a bunch of garbage.
.
Was never a 1440p card and should have been directly compared to the 7600 and the 3060.
dead card. b580 is cheaper and better
Junk
The title 💀 those who know -->
This was satire
6700XT is an amazing card.