Got a 6950 XT and a 7900 GRE from XFX, and I was surprised by the design and overall quality given the lower price. Man, both coolers were able to easily keep temps in check even when pushing 300W+, which also means that with some UV and a 240W limit both cards ended up really quiet!
The RX 6000 coolers were amazing; the 7000 ones are mid. Not sure what they did, but even if you tune down that out-of-the-box turbine, the TUF or Nitro is still a good 10 degrees cooler at the same noise level. (XT/XTX)
DLSS isn't hardware based, it's hardware accelerated. It could run on any hardware (if not always very fast), and could run just as fast on any hardware that has matrix solvers (which Nvidia markets as tensor cores) if Nvidia would allow it.
Also worth noting: Nvidia's "hardware accelerated" features tend to use both the tensor cores AND additional host CPU cycles, at least on my 4080 Super running the same benchmark (Cyberpunk 2077) with features like RT or DLSS enabled.
It has been done; DLSS can run on GTX cards, but due to the lack of hardware acceleration it runs much worse than native. As far as I remember I saw it in a video about uniscaler mods or something, where the guy was trying out FSR 3.1 in older unsupported games.
So you mean it's not viable unless hardware accelerated? Just like any other hardware accelerator that does something that can also be done in software? It's hardware based. It doesn't work correctly without the hardware.
13:00 I always find it somewhat funny that a lot of people in the west think Gigabyte, MSI or Asus are the "big boys" in GPU manufacturing and everyone else is small time, while in reality the real GPU giant is Palit Microsystems. They overtook Asus back in 2013 to become the largest GPU AIB. As for the other players in the market that are not that well known by western audiences and reviewers but are still very large: Colorful (largest in China, moving massive volume) and PC Partner Group (Manli, Zotac, InnoVISION), which also does a lot of contract manufacturing.
@@あなた以外の誰でもない The model is also relevant. Palit and Zotac make some very underbuilt base models for their midrange cards, with much worse thermals than you'd get from a base model like an MSI Ventus 2X, Asus Dual or Gigabyte Windforce.
I game on Linux (sometimes) because some games support the OS directly from Steam. You can also use virtual machines if that doesn't work for you, but anti-cheat engines might complain lol.
I faced a similar issue with my Mi 120g laptop. For me Pop!_OS worked better than the rest. It's not perfect yet; I still face audio issues, and occasionally it becomes a chore to get things working. But on my desktop I don't face any issues with Pop!_OS. Everything just works. If you haven't tried Pop!_OS yet, try it and see if it solves the issue for you.
PNY has been an NV OEM since before GeForce was a thing; I think they started with the RIVA TNT. They have always been the lowest-end reference designs.
I worked at Staples back in the stone age, and we carried PNY RAM. This was the SIMM/DIMM SDRAM era, and the return rate was abysmal. I've never gotten the impression their consumer line-up has improved to any notable extent.
I own a PNY 4090 XLR8 OC... It's a lovely card: cool and quiet, full frame, solid performance. Vapour chamber, 8 pipes (two 8mm), triple enclosed-ring fans, compact design. It's not for benchmarking, but I get an excellent undervolt/OC at 2760MHz @ 975mV with a +1200MHz (12%) VRAM OC. It doesn't look like a giant maxipad or candy bar, so I'm quite happy.
It's not only data center cards; they also do all the expensive Quadro workstation cards. I have two Quadro RTX 5000s in my engineering workstation, and their radial fan design is better thought out than some of the consumer brands' axial designs. I hold them to the same standard as EVGA. Their fan bearings are quality parts, which many other board partners cheap out on. My sister-in-law bought an RTX 3060 Ti Eagle OC from Gigabyte: loudest card ever, even more so than a GTX 480, and rubbish cooling, with the flow-through area completely covered by a brittle plastic backplate. She sent it back and replaced it with a PNY 3060 Ti, and the PC is super quiet. Very big flow-through and quality fan bearings.
I have an XFX Merc 319 6800 XT and it's great: runs quiet, cool and fast. I also really like the XFX style. No RGB puke or other weird light garbage, just a lit-up XFX logo.
Have their 7900 XTX Merc 310, very clean white lit-up logo. Both the 4090 FE and their XTX just have a clean, professional-looking white light. Our Gigabyte 7800 XT OC has the RGB logo going on, very niche, and it matches the RGB RAM. My main PC has a blacked-out theme in a white case; having fewer flashy elements is elegant sometimes.
XFX used to be one of the most recommended manufacturers out there back when they also made Nvidia cards. I remember cross-shopping between EVGA and XFX for my 8800GT when I was building a PC to play the original Crysis. And from what I’ve gathered over the years, XFX is very much still considered one of the “big 3” of the red team. I ended up getting ahold of an MSI Ventus 3080 10GB back in December 2020 (for actual MSRP too), and though it has served me very well, it has a plastic backplate. I was used to the build quality of EVGA and Sapphire cards, and holy crap, the shrouds on MSI’s newer graphics cards feel ridiculously cheap!
@@alrecks619 Yeah, they had a big turnaround. I had a Thicc 3 Ultra 5700 XT and you could drop temps by like 6 degrees by popping the back panel off and removing some of the excess shrouding.
There I see way more "repaired" Zotac cards than any other brand, which might tell you something. They get hotter than other brands too, and cheaper. On their website they don't even tell you directly where their service centres are: you need to mail them with your invoice, and only if that's OK will they mail you the location. Very annoying experience.
I wish they could compete with CUDA too for developing AI models, but there is just too much support and momentum behind Nvidia from the network effect. I can't even imagine AMD competing with Nvidia on this front.
Technically the models aren't written in CUDA; the stuff that runs them is PyTorch, and that has a backend for AMD GPUs, though it only runs on Linux (for image generators specifically). Part of the functionality has been ported over to Windows, and that runs language models fine.
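One nice consequence of how the AMD backend works: ROCm builds of PyTorch expose AMD GPUs through the same `torch.cuda` API that CUDA builds use, so most model code runs unchanged. A minimal sketch of detecting which backend is present (the function name is mine; the script degrades gracefully if PyTorch isn't installed at all):

```python
def gpu_backend() -> str:
    """Report which GPU backend (if any) this PyTorch install can use."""
    try:
        import torch  # ROCm wheels ship the same 'torch' package as CUDA wheels
    except ImportError:
        return "pytorch not installed"
    if not torch.cuda.is_available():
        return "cpu only"
    # ROCm builds set torch.version.hip; CUDA builds set torch.version.cuda
    return "rocm" if getattr(torch.version, "hip", None) else "cuda"

print(gpu_backend())
```

Because the ROCm build reuses the `torch.cuda` namespace, code like `model.to("cuda")` works on a supported AMD card too, which is why the same PyTorch programs can run language models on either vendor.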
@@anonapache With the same effort that Linux lovers expect everyone else to put into using Linux. If Linux is easy for everyone to use, running CUDA workloads on AMD shouldn't be any different. It's a matter of wanting it. Those who say they want to use AMD are full of bullshit. They should be honest about what they really want.
With Intel claiming 0x129 fixed the core issue of the Intel 13th & 14th gen stability problems, and yet having released another microcode update, 0x12B, which they again claim improves the stability of those CPUs, what do you think they've done this time, and is it time for another Intel CPU testing update?
FSR can be good; just look at GoW Ragnarok and Ghost of Tsushima. The thing is, when you compare one of the worst implementations of FSR, in Cyberpunk, to DLSS, then DLSS will always look way better. On the other hand, there's a lot of talk about FSR 3.1 ghosting, yet it's almost non-existent in GoW Ragnarok (snowflakes do leave a trail); compared to how uncanny DLSS ghosting in Wukong can be, I'd take FSR every day of the week.
@@sias.2569 TechPowerUp uploaded a comparison of XeSS, FSR and DLSS in the boat section 3 days ago; you can check that 2-min video. I think the only thing FSR does to water in that game is make puddles slightly shimmer on the edges, nothing you'll notice if you don't stare at it. As for reflections in that video, everything looks good, but there's no native to compare against.
DLSS in Wukong is buggy; it looks very grainy compared to Alan Wake 2 or Cyberpunk. If you want to compare FSR to DLSS, use the DLSS in Cyberpunk: stable, not grainy, barely any ghosting or artifacts. In fact, it looks almost as good as native even at 1080p. FSR looks good so far but still has a hard time recreating small fine details, especially those in the distance: Ghost of Tsushima has falling leaves disappear then suddenly reappear, and trees in the distance look blurry compared to native. I imagine FSR 4 might fix this with AI, but right now DLSS is still better, especially at lower resolutions, since people like me with a 3050 laptop can use DLSS at 1080p and it looks almost as good as native 1080p.
@@camdustin9164 *ahem* Nvidia-sponsored title, and when an AMD-sponsored title has problems I hear about it from every channel. My main point was that most comparisons are not against peak FSR performance; it's like setting up a race between Usain Bolt and the 2nd-fastest runner, but the 2nd runner has an iron ball chained to his leg. There's no competition... using Cyberpunk is Nvidia shilling at its best, since FSR 2.1 there somehow looks worse and worse with every patch they release, and FSR 2.2 with frame gen is somehow even worse. 1080p is also a very low base for FSR, and raindrops have that problem too, probably because AI can take more things into account, while an algorithm comparing neighboring pixels is more likely to miss them.
For Radeon, Sapphire and PowerColor are kings, XFX is a good affordable option, and the rest is... the rest. In short it's like EVGA: the best-performing and best-quality products are not from the biggest corporations, as those are doing too much at once.
@@El_Deen They did a lot of custom stuff before Nvidia restricted it, and they were overclockers, so a lot of their stuff was made strictly with overclocking in mind, which made it easier to prolong product life. But yeah, looking at repair-shop content and their gripes about EVGA, there were a lot of problems.
FSR shines on old GPUs. Probably the best-case scenarios would be the 5700 XT, 1070 and 1080 Ti: an affordable new option (5700 XT) and the second-hand market for Nvidia's older-generation GPUs. That said, I found a lot of value in the 5700 XT with its pricing below $200.
@20:32 The PS5 came out around the same time as the 30 series, but it was equivalent to a 2070 Super/6650 XT/1080 Ti, NOT a 3070, which is 35% faster and closer to a PS5 Pro GPU.
Later the ports run like crap, even the native ports. That's a myth: you don't just need a 3070, you need a 3070 plus all the components the PS5 already has. You're full of memes and myths.
PNY is really big in workstations in the US. Now, because of the CUDA monopoly, most of these are Quadro cards, but I can't recall seeing anything other than PNY in the enterprise space. I mean, HP and Dell make cards like the A100 and H100, but outside of those you don't really buy an HP or Dell workstation card unless it comes inside one of their workstations.

When talking about the PS5 Pro, my concern is that it is supposedly still Zen 2. That means it may be just as slow as my 4700S/BC250 with a 7800 XT. Zen 2 APUs, including the PS5's, have only 4MB of L3 available to any one core; Zen 3 on the other hand has 16MB available to any one core due to having double the L3, along with a unified CCX/CCD. I would genuinely be surprised if they didn't at least make a unified-CCX version of Zen 2. Maybe they keep the 8MB of cache instead of bumping up to Zen 3's 16MB, but I think that would be a bad idea: APUs share a memory bus, and the more cache the CPU has, the less it uses the RAM; the less it uses the RAM, the more cycle time the GPU has to work with.

Even with Zen 3 I suspect it will be slower than an RX 6800 in raw performance even if it's RDNA 3 based. Console optimizations can hopefully make the thing perform better than a 7800 XT, and I'd imagine that if they went with 16MB Zen 3+ it might perform similar to a 7800X3D + 7800 XT.
Regarding the Nvidia color thing: I believe you are referring to how Nvidia cards default to a limited dynamic range over HDMI while AMD defaults to full. You can change the setting in the Nvidia Control Panel to use 'full' over HDMI; DP defaults to full. I think they work under the assumption that HDMI means a TV, and TVs operate best with limited range since they would compress the dynamic range anyway.
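For anyone wondering what "limited" actually means here: it's the standard 8-bit video-range quantization, where black sits at 16 and white at 235 instead of 0 and 255. A minimal sketch of the mapping (the function name is mine):

```python
def full_to_limited(y: int) -> int:
    """Map an 8-bit full-range value (0-255) to limited/video range (16-235)."""
    if not 0 <= y <= 255:
        raise ValueError("expected an 8-bit value")
    # Limited range spans 219 steps (235 - 16), so scale then offset.
    return 16 + round(y * 219 / 255)

# Black and white land on the video-range levels:
print(full_to_limited(0), full_to_limited(255))  # 16 235
```

This is why a mismatch is so visible: if the GPU outputs limited range but the display expects full, blacks render as dark grey and the image looks washed out.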
A few topics here are odd. 1. VRAM usage: like RAM, the card will use as much as it can get so it doesn't need to hit the disk. As long as you are not getting out-of-memory errors, it's fine. You want it to fill up, since you pay the electricity cost whether it's full or only 1 bit is used, so it's better to avoid trips to slower memory (or disk). 2. FSR being open source does not stop AMD from optimizing it for their own hardware (or overriding/hooking it at the driver level so another implementation is used), but it does increase the chance that games ship with the correct API calls so they can use it.
Yeah, I came here to say point 2... Supporting your competitors' hardware is not part of open-sourcing the software. I agree that maybe AMD doesn't need to focus as much on supporting old Nvidia cards, but at least for me, them open-sourcing their software is a big selling point. It makes them more of a good-faith player than Nvidia.
When the RX 7600 first launched, I think I saw a laptop review where the AMD card often performed significantly worse than the RTX 4060 because it was using slightly more VRAM, and they were testing games that all teetered on the edge of 8GB. AMD went over a lot more often and suffered hard. Even if it was just using like 300MB more, that was sending it over the edge more often and choking it up.
This is true for memory in general. As long as you are able to run everything you need, you want your game and operating system to use as much memory as possible
I imagine their odd take on point 2 comes from looking at it from a gamer's point of view rather than the developer's. If you're trailing, OSS is a good way to keep your technology incorporated in as many products as possible, and so you keep good support despite your position.
I think what they meant by "open source" doesn't actually have anything to do with it being open source, but that AMD is spending limited dev time making FSR compatible with RTX and GTX cards, as well as older Radeon designs like Polaris, instead of spending that time making it work better on modern Radeon cards. That's nice for the people with a 1660, 1070 or a 580, but it's not doing anything to help sell Radeons.
A note on the PS5 Pro comparison: the 7800 XT is way faster, primarily because the PS5 Pro has no Infinity Cache, and that has historically crippled RDNA cards versus their IC-bearing counterparts (680M/780M vs 6400, 890M vs 6500 XT, etc). So the 7700 XT is a fair match for the PS5 Pro's raster performance, given the lack of Infinity Cache. AI/RT performance is more variable, but feature-wise, rumors state RDNA4 (which Cerny has said the RT actually comes from) will at least match Ampere's RT feature set. So we can probably look for Ampere-generation GPUs that are around 7700 XT level in raster and go from there, which would lead us to an RTX 3070/3070 Ti as a lowball or a downclocked 4070 as a highball.
The VRAM is shared with system memory though, so it's not exactly the same, but the PS5 GPU was roughly a 6700 non-XT and performed close to one, and console-only optimizations would bridge the gap. Infinity Cache helps with bandwidth issues that aren't fully present on consoles due to the unified memory config. That's also why modern games tend to require loads of VRAM: consoles don't have to worry about transfers between CPU and GPU like a PC typically does.
@@PelonixYT No, not quite. The 6700 OEM is the card closest to the PS5, and even then, outside of wacky cases (poor optimization) on PC, it often dramatically outperforms the PS5 at some resolutions. Again, due to Infinity Cache.
Hot take: I think FSR targeted universal card support so that it wouldn't be ignored. Sure, there was probably a hope some games would choose FSR over DLSS, but really I think the main goal was just to have it included at all. If game devs often chose not to include DLSS despite it being in ~90% of new cards, why would they bother with FSR if it only worked on 5-10% of new cards? By working on all cards, including Nvidia cards that don't support DLSS, it evened the playing field on whether a developer would include it or not.
Yeah, at the time (and it may even still be the case), there were more people with Nvidia GPUs that couldn't use DLSS than there were people with AMD GPUs. AMD leveraged those neglected Nvidia owners to get FSR into games (because, if devs wouldn't add it for the sake of AMD owners, they'd at least do it for all of the 1080/1660 owners) and, now, it's an established technology that people expect to see in new games.
It was open for much the same reason Java was open: to stop the competitor closing the market (Nvidia in graphics, Windows in OSes). If it hadn't been for Java being open source, Microsoft would have closed the internet completely. Same for FSR: if AMD hadn't open-sourced it, Nvidia would have closed off the market and everyone would have to license DLSS from Nvidia, who can refuse.
It's not just that some people prefer XFX over ASUS; it's that XFX, PowerColor, and Sapphire make the best AMD GPUs, and the big three generally don't spend much money making sure their Nvidia-oriented designs work well on an AMD board.
Unfortunately AMD didn't specify whether that will be FSR 4 upscaling or just FSR 3.1 with some AI frame generation, and there's also no information on whether it will be supported by modern GPUs such as the 7000 series and above. And I can easily recommend the Kryosheet, as I have this gorgeous little thing in my Gigabyte 7900 XT. It dropped my temps from 90-95C with fans at 2000+ rpm all the way down to 82C at 1400-1500rpm at stock, and 88C max at 1700-1800rpm with almost 400 watts after unlocking the power limit. But there's literally no FPS increase beyond a 2-5% power limit raise, so I run 3% over stock and get like 84C at 1500-1600rpm under full load. So yeah, I truly recommend this sheet if you've had enough of the pump-out effect on your GPU. Just use thermal electrical tape (0.1mm yellow) around the die to protect the transistors and you're good to go.
I'll be surprised if AMD doesn't incorporate some of their existing FSR algorithms into their AI approach, since I imagine that would make training easier by providing essentially humanity's best-performing manually coded upscaler as additional input. While FSR isn't as good as DLSS, it comes close in many ways after all.
I think console cost plus online cost over the lifespan of the console makes it equal to or more expensive than a same-tier PC; it's just that the cost isn't upfront. I also feel the PS5 Pro will be more like a 7700 XT rather than a 7800 XT.
PS5 performance sits between a 3060 and a 4060. Sony claims the PS5 Pro is 45% faster in rendering, so that would roughly put it right behind a 3070 Ti, meaning it might not even match the 7700 XT. As far as I know the GPU is still RDNA2, but customized with better ray tracing and new AI acceleration.
The RX 6700 has nearly the same specs as the PS5, with a 160-bit memory bus and a slightly higher clock speed, and both deliver similar performance according to Digital Foundry's tests. So the idea that a PS5 Pro will surpass the 7800 XT makes zero sense. The 7800 XT is about 70% more powerful than the RX 6700, and Sony itself claimed the PS5 Pro will be around 45% faster than the PS5. Do you really think the 7800 XT is only 45% faster than a PS5? Of course not. The numbers don't lie; believing the PS5 Pro can match that level of performance is pure delusion.
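Working through this comment's arithmetic makes the gap concrete (the uplift figures are the commenter's claims, not measurements):

```python
# Normalize PS5 / RX 6700 performance to 1.0 and apply the claimed uplifts.
ps5 = 1.00
ps5_pro = ps5 * 1.45   # Sony's claimed +45% over the PS5
rx7800xt = ps5 * 1.70  # commenter's claimed +70% over the RX 6700

# How far ahead the 7800 XT would still be, relative to the PS5 Pro:
gap = rx7800xt / ps5_pro - 1
print(f"7800 XT would still be ~{gap:.0%} ahead of the PS5 Pro")
```

Note that percentage uplifts don't subtract: the 7800 XT doesn't end up "25% ahead" (70 minus 45); it's the ratio 1.70/1.45, roughly a 17% lead, that matters.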
AMD should be implementing XeSS instead of going their own route. They'd do that if they were genuinely about open standards, rather than just using openness to compete with whatever Nvidia's latest tech is.
You can already run XeSS on AMD cards, but not hardware-accelerated (it falls back to the DP4A path). If they used that as a reason to cut FSR, they would be permanently stuck with a less performant implementation, and further development of upscaling on their cards would be reliant on Intel's fledgling dGPU division, which is a competitor, and which could unfortunately be shut down at a moment's notice.
20:55 Not really; there is no magical "it's better because it's console-optimized" GPU. The PS5 GPU is basically RX 6700 hardware, and it performs around the same as a PC RX 6700: sometimes a little slower (when the game wants more compute) and sometimes a little faster (when the game wants faster memory). The PS5 Pro will most likely land a little below the 7800 XT (if it's RDNA 3), because it has both slower memory and lower compute performance.
But if PSSR is decent, and it does have RT from RDNA4, then the ps5 pro can output a better image from a lower input resolution, and will have better performance than any AMD equivalent GPU. Just like RTX cards in the same performance tier.
@@ArchieBunker11 According to Digital Foundry it's like an underclocked 4070, which means it has close to 6800 XT/7800 XT levels of raster performance but much better RT performance and a better upscaler.
@@enmanuel1950 As far as raster is concerned, the base 4070 is slightly worse than a 3080 and 7800 XT, and marginally better than a 6800 XT (speaking in averages). An underclocked 4070 would be closer to a 6800 non-XT. Also, Sony themselves say “45% better than a PS5”, and the 6800 is 45% faster than a 6700. Although I'd grant you the image quality and RT.
Asking Tesla to test its self-driving tech on a VW because it's more reliable is like asking AMD to test on Nvidia. What's next, HW Unboxed just becoming a Gamers Nexus clips channel because they did better testing or have better equipment on certain topics? Nice for the consumer, but it wouldn't be a great look for you guys. It's just not a smart decision to push a rival's card; however, if the percentage uplifts are the same, then it doesn't really matter.
I feel you overestimate consoles a bit. PS5 is more between 3060 and 3060 Ti, and Pro according to what Sony said is 45% faster which would land it around 7700XT, not above 7800XT.
RDNA2 at any tier cannot beat the PS5 Pro in any enhanced title, for one main reason: PSSR. With that AI upscaling, RT becomes easier to implement and process. And on that note, RDNA2 doesn't have BF16 silicon, so it's only compatible with FSR 3, while RDNA3 and later generations will run FSR 4 because they have ML silicon.
I don't get why people praise Sapphire so much. Their thermals are some of the worst; PowerColor, MSI and Asus have way better thermals and efficiency.
What should AMD/Intel do to their CPUs to maximize the gains in bandwidth from DDR5-8400+ compared to DDR5-6000? Or, no matter the RAM speed, will it not matter until we have high-speed DDR6?
Bro, RAM speed doesn't matter much when the CPUs themselves don't grow much in terms of cache. CPUs work best when running sequentially; for that they need good cache algorithms to keep the cores fed, but it's really complicated. CPUs do branch prediction and out-of-order execution to mitigate this. RAM speed is so far from being the bottleneck, you wouldn't even believe it. As of today, higher-speed RAM mostly just compensates for poorly written game engines and often doesn't even boost performance by a single percent...
Nvidia has quite a head start on AMD when it comes to AMD catching up, but as long as AMD puts the effort in I don't see why it couldn't be just as good, maybe better if they figure out something Nvidia has not. But that's gonna be hard, since Nvidia is a money-printing factory.
My understanding is that DLSS "AI" is merely AI-tuned rather than "run on AI". I think leveraging this level of "AI" is fine, as it streamlines tuning the various upscaling logic to most closely match the original target resolution. This is something that could work with previous versions of FSR, although it would require tuned profiles on a per-game basis.
“It was interesting to see amd talk about this… normally with their future looking stuff they kind of hide it away” …. FSR3 frame generation would like a word
15:44 I found that, for AMD at least, the smaller brands are not only higher quality but also have better customer support for the same price as the bigger brands. So I actually see no reason to get the bigger brands. I've never dealt with the smaller brands for Nvidia though, so I can't speak to Zotac and PNY.
DLSS has seen no real improvement for a long time. It's good, but some artifacts are still pretty distracting, like power lines breaking up horribly. FSR FG proves to be very competitive against DLSS FG. FSR 4 is gonna be better than DLSS if DLSS stays the same.
The PS5 itself is identical in performance to an RX 6700; there is no doubt it matches both raw and RT performance. So accounting for everything announced for the PS5 Pro, the only GPU that matches the performance gain, the "next level RT processor" and so on is the RX 7700 XT: it has a 42% performance increase over the RX 6700, and also a next-level RT processor. It can't be the 7900 GRE, because that's an 89% uplift, and the 7800 XT is a 72% uplift from standard PS5 performance. The Nvidia equivalent would be an RTX 3070 Ti, although its raw performance is a little slower and its RT performance a little higher.
@@Kukajin That is too much of a performance improvement over the PS5. The amount of VRAM isn't an issue: the GPU is always custom-made, usually with shared memory they allocate as needed. The 7700 XT has 12GB of VRAM, but the PS5 uses a custom GPU with 13.7GB; that's not exactly a reason to call it a mismatch, just add more memory modules to the 7700 XT. Also, Intel XeSS is an AI-based upscaler and is compatible with AMD GPUs... PSSR isn't FSR, and the upscaler doesn't need to be AI-based for the GPU to match. PSSR is its own upscaler, like XeSS, TSR and FSR, all of which can run on the PS5 because they aren't GPU-specific technologies like DLSS (maybe PSSR will be, who knows). Focus on the performance increase: if anything goes beyond +50% of an RX 6700, it's already better than what the PS5 Pro is capable of.
With generic FSR there is at least a chance that someone adds it to a game, since it provides capabilities to older hardware. With hardware-specific FSR, is there any incentive for developers to add it? AMD has lost market share every single generation as the "drivers are crap" myth has persisted, even now when Nvidia has many issues as well; the smell just doesn't stick to them.
@@redshiftit8303 The white dual-fan Zotac 3060 Ti had widespread fan failures. When it happened to me I looked into it and found it was alarmingly common; the black version had no such issues. So I decided to be smart and ordered a replacement set of fans, only this time I ordered the black ones. To my dismay, the screw holes didn't match up between the black and white models. Ended up zip-tying a couple of 12cm case fans to it and vowed never to buy Zotac again.
I think AMD should continue to make FSR an open standard but because they know in advance how the features will be implemented, they should design their hardware to be more efficient for it
Yes... but no... because Nvidia didn't plan tensor cores for DLSS; they were made for enterprise and AI-oriented workloads. Nvidia just saw them doing nothing in gaming and gave them a use, just like NVENC and the new optical flow accelerator, which was also designed for computer vision but got used for frame gen. That's the cool part of hardware-accelerated stuff: it occupies space on the die, but its use mainly occupies the power draw budget. That said, if AMD wants the same recognition, they just need to do like Nvidia and build things for whoever is paying for them.
@@pedro.alcatra The idea is to adapt FSR to maximize efficiency on AMD hardware (in this case, make full use of their tensor core implementation). Unless AMD and Nvidia hardware work the same way, it stands to reason that the most optimized software implementation for one does not work as well on the other. You absolutely still need FSR to be open, as AMD is the underdog, so game developers need an incentive to add support. If they can get some improvement for all users (an open FSR) vs a lot of improvement for a small number of users (a closed FSR) vs a lot of improvement for most users (closed DLSS), they will probably rank those choices as either 1-3-2 or 3-1-2, but they would be insane to pick a closed FSR as the primary path forward.
PNY is the manufacturer of almost all Quadro cards, so of course they are pretty large as a manufacturer. Quadro cards perhaps even have higher margins, because they are very expensive. I have two Quadro RTX 5000s in my engineering workstation.
My PNY 4090 XLR8 OC is an excellent card. The most you'll get out of a way more expensive card is about +5%; ImWateringPSUs only got 2% better performance from his card.
I have a PNY 3080 Ti, and PNY cards were very popular here (Norway) in the 3000 series. But that was during the GPU shortage, and you see a lot of them (the XLR8 series) on the second-hand market now. Not sure how the 4000 series has been doing for them, but we are a tiny market anyway.
6:20 MLID mentioned that one of his contacts at AMD says they've been working on it for a while now, I think he said over a year. TBH, I'm glad they aren't rushing it out. Hopefully it gives DLSS and XeSS a run for their money out of the gate.
The problem with waiting is that DLSS is already included in hundreds of games, while AMD users are stuck with various FSR implementations that generally lack quality. AMD needs to get it out to start building a software library of great FSR implementations, and that will take years.
The PS5 Pro will cost more than $700 USD, and only the universe knows what it'll cost in Australia, so a $1000 PC with similar specs will 100% be better value.
@@Hi_Im_o2 We don't know, but AMD has AI cores that aren't being used right now, so it would make sense, but it would also go against what they've been doing lately, so idk. I have Nvidia so idc.
The 5700X3D is priced close to a 7600, and AM4 users can probably find a 5800X3D for that price on the used market. Even a 5700X is priced better. (Prices in India, Amazon etc.)
This is a complete shot in the dark, but if NVIDIA still has more driver overhead offloaded to the CPU, could that contribute at all to lower VRAM usage on the card?
How about texture details and sharpness? I've heard people say that AMD GPUs produce sharper texture details, especially in dark scenes. Is that true?
@@sajithsaji3606 haven’t noticed this tbh, just the vibrant colors. I will say I prefer the nvidia look, but you don’t notice it unless you have access to both lol
@@Slambear That's true actually. I'm planning to buy my first gaming PC, but choosing a GPU brand is a headache for me. Some people on YouTube confuse me, like when they say AMD image quality is superior to Nvidia's; that's why I asked you.
@@sajithsaji3606 It all depends on your budget, the games you play, and the resolution and frame rates you wanna play at. If you need advice I can recommend some stuff. Daniel Owen on YouTube does a lot of these performance builds if you wanna check some out.
The power supply, thermals and CPU will hold it back. You can't compare GPUs like that lol. It's not going to come near a 7800 XT, and many have built $700 examples which beat the Pro's alleged specs.
Do you think Intel can ever fill the void that AMD left by abandoning their plan to launch high end GPUs? I really wish AMD would come around in few years and decide to launch high end GPUs. Otherwise, Nvidia would take this opportunity to inflate RTX 5090 or 6090 or whatever they decide to name their future high end GPUs.
How dare the questioner not mention Sapphire? It's the EVGA of Radeon GPUs! 😅 I have a Zotac 4070 Super and a Sapphire Nitro 7900 GRE and they are as good as my "big name" MSI 4080 Suprim. People tend to buy ASUS, MSI and Gigabyte because they flood the market with so many models. How many models of the 4070 Super does ASUS have? Let's see: Dual, Dual Evo, Prime, ProArt, TUF, Strix. How many does Zotac have? Twin Edge and Trinity? So yeah, people are gonna see only MSI, Gigabyte and Asus.
As far as Frame Gen goes, I don't think AMD saw a need or purpose to it until Nvidia launched their tech and the public responded to it. Because it truly is kind of silly and not especially good, particularly where you'd think it would matter most - low FPS gaming. Upscaling is the better tech to pursue with regards to improving frame rates, and I think FSR 3 took longer because development resources had to be directed away from it to Frame Gen.
Did anyone else see the video on RUclips where the tech tuber was saying that the main issue with Zen 5 is the power settings are not great? In that video his 9700X wiped the floor with the 7700X, and was trading blows with the 7800X3D. I would love to see another channel try to replicate it. Shouldn't the current RDNA 3 be able to run FSR 4, considering there are unused cores (AI or tensor, I think)? The 7900XTX for instance has 192 cores, and a good TOPs amount. Amuse AI local image generation runs pretty fast on the XTX.
Zotac is good for SFF cards. While MSI/Gigabyte/Asus would occasionally make single fan ITX cards, Zotac has always made small dual fan cards for pretty much every generation. I had a 2070 and have a 4070 of theirs. At 226mm and 2-slot it's one of the most compact 4070s you can get. Also AMD only has 12% of the GPU market, so for developers to support FSR it has to work on all cards.
As for the CPU test with the 7900XTX: I've seen with my own eyes how the 9700X at 1080p performed 10% faster with it than with a 4090. One of the technicians at my local dealer was playing with the newly arrived CPUs, noticed it and was showing it to us all. But since their job was basically to sell those CPUs, I took those results with a grain of salt. The explanation was basically that the better branch prediction of the 9000 series CPUs, combined with the increased IPC, was improving communication with the GPU through SAM, and wasn't working the same way with Nvidia GPUs, which DO have Re-BAR, but it isn't exactly the same as AMD's Smart Access Memory. Whether it's true, idk, but it kinda makes sense for AMD to implement something and boost performance on an all-AMD system. I've been expecting them to do that for 5 years. After all, they've been talking about boosted performance on their own ecosystem ever since 1st gen Ryzen came out.
11:00 XFX has a pretty big advantage in Germany. It's often one of the cheapest with good coolers
yeah i got a 6800 from them, and it's a really good card.
XFX has been consistently very good after they completely fumbled the bag with the RX 5000 series.
Got a 6950XT and a 7900GRE from XFX, and I was surprised by the design and overall quality given the lower price. Man, both coolers were able to easily keep temps in check even when pushing 300W+, which also means that with some UV and a 240W limit both cards ended up really quiet!
The RX 6000 series was amazing; the 7000 is mid. Not sure what they did with the cooler, but even if you tune down that out-of-the-box turbine, the TUF or Nitro is still a good 10 degrees cooler at the same noise level. (XT/XTX)
Same here in the Czech Republic, XFX usually has the best Radeon deals; both the 6750 and 6800 are still available here for a pretty good price.
PNY is the only brand Nvidia trusted to produce their workstation GPUs, and PNY is a US-based company.
Some time ago, Leadtek was making tons and tons of Quadros. No idea if that's still the case.
Also, rumors lately that Kingpin is working with PNY. Take my fucking money PNY, give me a 5080 FTW.
@@dtectatl1 That's not a rumor, he said it in his RUclips video.
PNY is Palit, which is a main Nvidia vendor.
PNY is a Chinese company.
DLSS isn't hardware based, it's hardware accelerated. It could run on any hardware (if not always very fast), and could run just as fast on any hardware that has matrix solvers (which Nvidia markets as tensor cores) if Nvidia would allow it.
Also worth noting is that Nvidia's "hardware accelerated" features tend to use both the tensor cores AND additional host CPU cycles, at least on my 4080 Super running the same benchmark (Cyberpunk 2077) with features like RT or DLSS enabled.
It has been done, DLSS can run on GTX but due to lack of the hardware acceleration it runs much worse than native.
As far as I remember I saw it in a video about uniscaler mods or something. The guy was trying out FSR 3.1 in older unsupported games.
So you mean it's not viable unless hardware accelerated? Just like any other hardware accelerator that does something that can be done in software? It's hardware based. It doesn't work correctly without the hardware.
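For anyone unsure what "could run on any hardware, just slower" means in this thread: the heavy lifting in an ML upscaler is matrix arithmetic, which any general-purpose processor can execute. A toy sketch (plain matrix math, not DLSS itself):

```python
def matmul(a, b):
    """Naive matrix multiply: runs on any CPU, just slowly. Dedicated
    matrix units ("tensor cores") execute the same math, only faster."""
    rows, inner, cols = len(a), len(b), len(b[0])
    assert len(a[0]) == inner, "inner dimensions must match"
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)] for i in range(rows)]

# Software gives the same answer hardware acceleration would;
# the difference is purely throughput.
print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

The "hardware based vs hardware accelerated" argument is really about that throughput gap: the math works anywhere, but without a matrix unit it may be too slow to beat just rendering natively.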
Yeah if FSR4 could run on NPUs this would be a big win for laptops and handhelds
@@TheYuppiejr I'm almost 100% sure this is true, because I noticed that when I enable ray tracing my CPU usage skyrockets.
13:00 I always find it somewhat funny that a lot of people in the west think Gigabyte, MSI or Asus are the "big boys" in GPU manufacturing and everyone else is small time, while in reality the real GPU giant is actually Palit Microsystems. They overtook Asus back in 2013 to become the largest GPU AIB.
As for the other players in the market, which are not that well known by western audiences and reviewers but are still very large:
Colorful (largest in China, which moves massive volume)
PC Partner Group (Manli, Zotac, InnoVISION), which also does a lot of contract manufacturing
Are Palit cards legit? Cuz I remember seeing them a few years ago at a way lower price than other brands and it instantly gave me scammer vibes 😂
wait the frog gpu guys?
@@あなた以外の誰でもない Yes, they have been around a long time, I bought a Palit GTX 780 jetstream way back when
EVGA is the only big boy to me 😢
@@あなた以外の誰でもない Model is also relevant. Palit and Zotac make some very underbuilt cards as their base models for midrange cards, with much worse thermals than you'd get with a base model like an MSI Ventus 2X, ASUS Dual or Gigabyte Windforce.
Ever since switching to Linux I don't require any new hardware, because I'm busy just trying to get basic things working sometimes.
That is half the fun
It's a game on its own right.
I game on Linux (sometimes) because some games support the OS directly from Steam. You can also use virtual machines if that doesn't work for you, but anti-cheat engines might complain lol.
Haha I know right?! It’s so pointless.
I faced a similar issue with my Mi 120g laptop. For me PopOS worked better than the rest. It's not perfect yet, I still face audio issues, and occasionally it becomes a chore to get it working.
But on my desktop I don't face any issues with PopOS. Everything just works. If you haven't tried PopOS yet, try it and see if it solves the issue for you.
PNY is an American company that does a TON of data center GPUs and has started dabbling in consumer GPUs a bit more with the 4000 series.
PNY has been an NV OEM since before GeForce was a thing for NVIDIA; I think they started with the RIVA TNT. They have always been the lowest-end reference designs.
I worked at Staples back in the stone age, and we carried PNY RAM. This was the SIMM/DIMM SDRAM era, and the return rate was abysmal. I've never gotten the impression their consumer line-up has improved to any notable extent.
I own a PNY 4090 XLR8 OC...
It's a lovely card. Cool and quiet. Full frame. Solid performance. Vapour Chamber, 8 pipes (two 8mm). Triple circle enclosed fans. Compact design. It's not for benchmarking. But I get an excellent undervolt/OC at 2760MHz @ 975mV with +1200MHz (12%) VRAM OC. It doesn't look like a giant maxipad or candy bar so I'm quite happy.
Not only data center cards, they also do all the expensive Quadro workstation cards. I have two Quadro RTX 5000's in my engineering workstation and their radial fan design is better thought out than some of the consumer brands axial designs. I hold them to the same standard as EVGA was. Their fan bearings are of quality, which many other board partners cheap out on.
My sister-in-law bought an RTX 3060 Ti Eagle OC from Gigabyte. Loudest card ever, even more so than a GTX 480, and rubbish cooling; the flow-through area was completely covered by a brittle plastic backplate. She sent it back and replaced it with a PNY 3060 Ti, and the PC is super quiet. Very big flow-through area and quality fan bearings.
"Haven't any roasts prepared or anything?"
"I wish, but its not worth my time."
*Boom* 💥 🔥🔥🔥
I have an XFX Merc 319 6800XT and it's great. Runs quiet, cool and fast. I also really like the XFX style: no RGB puke or other weird light garbage, just a lit-up XFX logo.
Have their 7900XTX Merc 310, very clean white lit up logo. Both the 4090 FE and their XTX just have clean professional looking white light. Our 7800XT Gigabyte OC has the RGB logo going on, very niche, matches with RGB ram. My main pc has a black out theme in a white case, elegant having less flashy elements sometimes.
Yeah Xfx cards look great and run great. I have 4 in my house all bought within the last few years. Build quality and no rgb is what I wanted.
XFX used to be one of the most recommended manufacturers out there back when they also made Nvidia cards. I remember cross-shopping between EVGA and XFX for my 8800GT when I was building a PC to play the original Crysis. And from what I’ve gathered over the years, XFX is very much still considered one of the “big 3” of the red team.
I ended up getting ahold of an MSI Ventus 3080 10GB back in December 2020 (for actual MSRP too), and though it has served me very well, it has a plastic backplate. I was used to the build quality of EVGA and Sapphire cards, and holy crap, the shrouds on MSI's newer graphics cards feel ridiculously cheap!
Simple: Never Buy MSI.
XFX is doing better after being called out for those plasticky "THICC" series GPUs with the 5700 XTs.
@@alrecks619 Yeah, they had a big turnaround. I had a THICC III Ultra 5700XT and you could drop temps by like 6 degrees by popping the back panel off and removing some of the excess shrouding.
MSI makes good mobos, but I'd never buy anything else from them
I opened an MSI 3070 GPU and compared it to a Colorful and a Gigabyte Aero 3080, and even saw the 3070 version online. Yeah, MSI's GPU PCBs are so cheap.
11:00 Honestly, I've had far better experiences with the smaller brands than with the big boys to the point where I'll now actually prefer them.
Zotac is very popular in India as it provides 5(2+3) years of warranty
Bought my first GeForce card from them. It gives me 😊 peace of mind.
Over there I see way more "repaired" Zotacs compared to any other brand. Might tell you something. They also get hotter than other brands, and are cheaper.
On their website they don't even tell you directly where their service centres are. You need to mail them with your invoice, and only if it's OK will they mail you the location. Very annoying experience.
I wish they could compete with CUDA too for developing AI models, but there is just too much support and momentum behind Nvidia from the network effect, I can't even imagine AMD competing with nvidia on this front.
Technically the models aren't written in CUDA; the stuff that runs them is PyTorch, and that has a backend for AMD GPUs, though it only runs on Linux (for image generators specifically). Part of the functionality has been ported over to Windows, and that runs language models fine.
You can run CUDA on AMD, but people don't like change. It's easier to just repeat the same old things even when the world has moved on.
@@sammiller6631 ZLUDA isn't a thing anymore for AMD, and the HIP conversion requires manual effort. So how, exactly?
@@anonapache With the same effort that Linux lovers want everyone else to use Linux with. If Linux is easy for everyone to use, running CUDA on AMD shouldn't be any different.
It's a matter of wanting it. Those who say they want to use AMD are full of bullshit. They should be honest about what they really want.
With Intel claiming 0x129 fixed the core issue of the 13th & 14th gen stability problems, and yet releasing another microcode update, 0x12B, which they again claim improves the stability of those CPUs, what do you think they've done this time, and is it time for another Intel CPU testing update?
The first microcode update didn't cover all motherboards.
amd gpu gang
I thought you all changed your name to broke boys gang?
@@tomgreene5388 #ROASTED those broke 🅱️ois amirite champ? LOL! You've won the internet for today.
@@tomgreene5388 Nice troll but it doesn't really land. Everything is expensive and overpriced these days, including Nvidia and AMD GPUs.
@@tomgreene5388 Since I'm broke, can you please buy me an RX 7900XTX, since you are so rich?
@@tomgreene5388 The 7900 XTX is $1000, what are you on?
FSR can be good, just look at GoW Ragnarok and Ghost of Tsushima. The thing is, when comparing one of the worst implementations of FSR, in Cyberpunk, to DLSS, then DLSS will always look way better.
And on the other hand, there's a lot of talk about FSR 3.1 ghosting. It's almost non-existent in GoW Ragnarok (snowflakes do leave a trail), but compared to how uncanny ghosting with DLSS in Wukong can be, I'd take FSR every day of the week.
Does FSR break water reflections like DLSS does? Look at the water whenever you are in a boat and moving.
@@sias.2569 TechPowerUp uploaded a comparison of XeSS, FSR and DLSS in the boat part 3 days ago; you can check that 2-minute video.
I think the only thing FSR does to the water in that game is make puddles slightly shimmer on the edges, nothing you'll notice if you don't stare at it. As for reflections in that video, it all looks good, but there's no native to compare against.
DLSS in Wukong is buggy, it looks very grainy compared to Alan Wake 2 or cyberpunk. If you want to compare FSR to DLSS, use the DLSS in Cyberpunk, stable, not grainy, barely any ghosting or artifacts. In fact, it looks almost as good as native even at 1080p.
FSR looks good so far, but still has a hard time recreating small fine details, especially those in the distance. Ghost of Tsushima has falling leaves disappear then suddenly reappear, and trees in the distance look blurry compared to native. I imagine FSR 4 might fix this with AI, but right now DLSS is still better, especially at lower resolutions, since people like me with a 3050 laptop can use DLSS at 1080p and it looks almost as good as native 1080p.
I actually had strong ghosting with FSR in Wukong and switched to XESS.
@@camdustin9164 *Ahem*, that's an Nvidia-sponsored title, and when an AMD-sponsored title has problems I hear about it from every channel. My main point was that most comparisons are not against peak FSR performance. It's like setting up a race between Usain Bolt and the 2nd-fastest runner, but the 2nd runner has an iron ball chained to his leg: there's no competition. Using Cyberpunk is Nvidia shilling at its best, since FSR 2.1 there looks worse and worse with every patch they release, somehow (I don't know how), and FSR 2.2 with frame gen is even worse, somehow.
1080p is a very low base for FSR. Raindrops also have that problem, probably because AI can take more things into account, while the algorithm is more likely to miss them when comparing neighboring pixels.
For Radeon, Sapphire and PowerColor are kings, XFX is a good affordable option, and the rest is... the rest. In short, it's like EVGA: the best performing and best quality products are not from the biggest corporations, as they are doing too much at once.
@1Grainer1 I don't get why evga is so praised tho. They made quite some f ups
@@El_Deen They did a lot of custom stuff before Nvidia restricted it, and they were overclockers, so a lot of their stuff was made strictly with overclocking in mind, which made it easier to prolong product life. But yeah, looking at repair-shop content and their gripes about EVGA, there were a lot of problems.
FSR shines on old GPUs. Probably the best-case scenarios are the 5700XT, 1070 and 1080 Ti: affordable on the first-hand market (5700XT) and on the second-hand market for Nvidia's older-generation GPUs.
That said, I found a lot of value in the 5700XT with its pricing below $200.
@20:32 The PS5 came out around the same time as the 30 series, but it was equivalent to a 2070 Super/6650XT/1080 Ti, NOT a 3070, which is 35% faster and closer to a PS5 Pro GPU.
Later, the ports run like crap, even the native ports. That's a myth: you don't just need a 3070, you need a 3070 plus all the components the PS5 already has.
You are full of memes and myths.
The PS5 has a 6700 non-XT, so performance slightly below a 3060 Ti / 6700 XT.
@@Jackson-bh1jw Yes, if you look at Death Stranding on PS5, it performs very close to a 3070. The RT performance creates a gap.
PNY is really big in workstations in the US. Because of the CUDA monopoly most of these are Quadro cards, but I can't recall seeing anything other than PNY in the enterprise space.
I mean, HP and Dell make cards like the A100 and H100, but outside of those you don't really buy an HP or Dell workstation card unless it comes inside one of their workstations.
When talking about the PS5 Pro, my concern is that it is supposedly still Zen 2.
That means it may be just as slow as my 4700S/BC250 with a 7800XT.
Zen 2 APUs, including the PS5's, have only 4MB of L3 available to any one core; Zen 3, on the other hand, has 16MB available to any one core due to having double the L3, along with a unified CCX/CCD.
I would genuinely be surprised if they didn't at least make a unified-CCX version of Zen 2. Maybe they keep the 8MB of cache instead of bumping up to the 16MB in Zen 3, but I think that would be a bad idea.
APUs share a memory bus: the more cache the CPU has, the less it uses the RAM, and the less it uses the RAM, the more cycle time the GPU has to work with.
Even with Zen 3 I suspect it will be slower than an RX 6800 in raw performance, even if it's RDNA3 based. Console optimizations can hopefully make the thing perform better than a 7800XT, and I'd imagine that if they went with 16MB Zen3+ it might perform similar to a 7800X3D + 7800XT.
Regarding the Nvidia color thing: I believe you are referring to how Nvidia cards default to a limited dynamic range when using HDMI, while AMD defaults to full. You can change the setting in the Nvidia Control Panel to use 'full' over HDMI; DP defaults to full. I think they work under the assumption that HDMI means TV, and that TVs operate best with limited since the TV would compress the dynamic range anyway.
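For reference, the limited-vs-full distinction is simple arithmetic: limited-range video squeezes 8-bit values into 16-235 instead of 0-255. A quick sketch of the standard video-range mapping (illustrative math, not Nvidia's actual driver code):

```python
def full_to_limited(v):
    """Map a full-range 8-bit value (0-255) onto limited range (16-235)."""
    return round(16 + v * 219 / 255)

# A full-range source displayed through a limited-range pipeline loses
# contrast: black is lifted to 16 and white is capped at 235.
print(full_to_limited(0))    # 16
print(full_to_limited(255))  # 235
```

This is why a mismatched setting looks washed out (full source on a limited display) or crushed (limited source on a full display).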
I don't get why AMD didn't partner with Intel to make an Upscaler to compete with DLSS.
A few topics here are odd.
1. VRAM usage -> Like RAM, it will use as much as it can get so it doesn't need to hit the disk. As long as you are not getting out-of-memory errors, it's fine. You want it to fill up, since you pay the electricity cost whether it's full or only 1 bit is used, so it's better to avoid having to go to slower memory (/disk).
2. FSR being open source does not stop them from optimizing it for their own hardware (or overriding/hooking at driver level so other implementation is used), but does increase the chance that the game comes with the correct API calls so they can use it.
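Point 1 is the same logic as any cache: keep data resident in the fast tier until space actually runs out, because refetching from the slow tier is what costs you. A toy LRU sketch of that policy (illustrative only, with hypothetical names; real GPU residency management is far more involved):

```python
from collections import OrderedDict

class VramLikeCache:
    """Toy LRU cache: keep as much resident as fits, evict the
    least-recently-used entry only when full -- the same policy a GPU
    uses for keeping textures in VRAM rather than refetching them."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key, load_fn):
        if key in self.store:
            self.store.move_to_end(key)      # fast path: already resident
            return self.store[key]
        value = load_fn(key)                 # slow path: fetch from slow tier
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict the LRU entry
        return value

loads = []
def slow_load(key):
    loads.append(key)        # count trips to the slow tier
    return key.upper()

cache = VramLikeCache(capacity=2)
cache.get("a", slow_load)    # miss: slow load
cache.get("a", slow_load)    # hit: no slow load, "VRAM" stays full
cache.get("b", slow_load)    # miss
cache.get("c", slow_load)    # miss, evicts "a"
print(loads)  # ['a', 'b', 'c'] -- only three slow-tier trips for four reads
```

High "usage" here is a feature, not a problem; it only becomes a problem when demand exceeds capacity and evictions start thrashing.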
Yeah, I came here to say point 2...
Supporting your competitors' hardware is not part of open-sourcing the software. I agree that maybe AMD doesn't need to focus as much on supporting old Nvidia cards, but at least for me, them open-sourcing their software is a big selling point. It makes them more of a good-faith player than Nvidia.
When the RX 7600 first launched, I think I saw a laptop review where the AMD card often performed significantly worse than the RTX 4060 because it was using slightly more VRAM, and they were testing games that all teetered on the edge of 8GB. AMD went over a lot more often and suffered hard. Even if it was just using like 300MB more, that was sending it over the edge more often and choking it up.
This is true for memory in general. As long as you are able to run everything you need, you want your game and operating system to use as much memory as possible
I imagine their odd take on point 2 is because they looked at it from a gamer's point of view rather than the developer's point of view. If you're trailing, OSS is a good way to keep your technology incorporated in as many products as possible, and so you keep good support despite your position.
I think what they meant by open source isn't actually anything to do with it being open source, but that they are spending limited dev time making FSR compatible with RTX and GTX cards, as well as older Radeon designs like Polaris, instead of spending that time making it work better on modern Radeon cards. That's nice for the people with a 1660, 1070 or 580, but it's not doing anything to help sell Radeons.
Will note on the PS5 Pro comparison. The 7800XT is way faster, primarily due to no Infinity Cache on the PS5 Pro
And that has historically crippled an RDNA card versus its IC-bearing counterparts (680M/780M vs 6400, 890M vs 6500XT, etc.).
So the 7700XT is a fair raster comparison for the PS5 Pro, due to it not having Infinity Cache.
As for AI/RT performance, that is more variable, but feature-wise, rumors state RDNA4 (which Cerny has said the RT is actually coming from) will at least match Ampere's RT feature set. So we can probably look for GPUs in the Ampere generation that are around 7700XT level in raster and go from there.
Which would lead us to an RTX 3070/3070 Ti as a lowball, or a downclocked 4070 as a highball.
The VRAM is shared with system memory though, which is not exactly the same, but the PS5 GPU was roughly a 6700 non-XT and performed close to one, and console-only optimizations would bridge the gap. Infinity Cache helps with bandwidth issues that aren't fully present on consoles due to the unified memory config. That's also why modern games tend to require loads of VRAM: consoles don't have to worry about the transfers between CPU and GPU the way a PC typically does.
@@PelonixYT No, not quite. The 6700 OEM is the card closest to the PS5, and even then, outside of wacky cases (poor optimization) on PC, it often significantly outperforms the PS5 at some resolutions. Again, due to Infinity Cache.
@@Alovon Usually it's the same performance, as shown side by side by Digital Foundry.
@@puffyips Yeah, that's when matched to PS5 Settings at higher outputs. At lower input resolutions though Infinity Cache's benefits become apparent.
The PS5 Pro GPU is estimated to be about 50% faster than the PS5's, which puts it at 6% faster than an RX 6800 in raster, but at a 4070 Ti in RT.
35:32 A question about memory: if one uses AMD APUs, then going with 8000MT/s memory on, for example, an 8700G to boost iGPU performance makes sense.
Fun fact: the PS5 motherboard, RAM and APU cost Sony only $115 lol.
In parts maybe, not in research and development.
No, they cost nothing.
The few grams of sand, metal and plastic used are practically free.
Hot take
I think FSR targeted universal card support so that it wouldn't be ignored. Sure, there was probably a hope some games would choose FSR over DLSS, but really I think the main goal was just to have it included at all.
If a game dev often chose not to include DLSS despite it being in ~90% of new cards, why would they bother with FSR if it only worked on 5-10% of new cards?
By working on all cards, including Nvidia cards that don't support DLSS, it evened the playing field on whether a developer would include it or not.
I just wonder how the AI upscaling will work on arc and rtx
Yeah, at the time (and it may even still be the case), there were more people with Nvidia GPUs that couldn't use DLSS than there were people with AMD GPUs. AMD leveraged those neglected Nvidia owners to get FSR into games (because, if devs wouldn't add it for the sake of AMD owners, they'd at least do it for all of the 1080/1660 owners) and, now, it's an established technology that people expect to see in new games.
It was open for much the same reason Java was open: to stop the competitor closing the market (Nvidia graphics or the Windows OS). If it hadn't been for Java being open source, Microsoft would have closed off the internet completely. Same for FSR. If AMD hadn't open-sourced it, Nvidia would have closed off the market and everyone would have to license DLSS from Nvidia, who can refuse.
It's not just that there are people that prefer XFX over ASUS, it's that XFX, PowerColor, and Sapphire make the best AMD GPUs and the big three generally don't spend much money to make sure their Nvidia designs work on an AMD board.
Unfortunately AMD didn't specify anything about whether that will be FSR 4 upscaling or just FSR 3.1 with some AI frame generation, and also gave no information on whether it will be supported by modern GPUs such as the 7000 series and above.
And I can easily recommend the Kryosheet, as I have this gorgeous little thing in my 7900XT from Gigabyte. It dropped my temps from 90-95C with fans at 2000rpm+, all the way down to 82C at 1400-1500rpm at stock, and 88C max at 1700-1800rpm with almost 400 watts after unlocking the power limit. But there are literally only 2-5% FPS increases from raising the power limit, so I run 3% over stock and get like 84C at 1500-1600rpm max under full load.
So yeah, I truly recommend this sheet if you've had enough of the pump-out effect on your GPU. Just use thermal electrical tape (0.1mm yellow) around the die to protect the transistors and you're good to go.
If AMD doesn't incorporate some of their existing FSR algorithms with their AI, I will be surprised, since I imagine that would make training easier by providing essentially humanity's best-performing manually coded upscaler as additional input. While FSR isn't as good as DLSS, it comes close in many ways after all.
12:40 you're 100% right. I'm in the middle east and I have a pny 4070 and most of the GPUs here are pnys and zotacs and XFXs
I bought my PNY 4090 XLR8 OC in Australia so I don't understand these guys here. They sold quite well. It's a great card.
I think console cost plus online cost over the lifespan of the console makes it equal to or more expensive than a same-tier PC; it's just that the cost isn't upfront. I also feel the PS5 Pro will be more like a 7700XT rather than a 7800XT.
PS5 performance sits between a 3060 and 4060.
Sony claims the PS5 Pro is 45% faster in rendering, so that would roughly put it right behind a 3070 ti.
So it might not even match the 7700XT.
As far as I know the GPU is still RDNA2 but customized with better Raytracing and new AI acceleration.
I see an HU video, I click instantly. Glad to see you guys are not standing; give your legs some rest.
Sapphire and Powercolor make the best GPUs
Can confirm I have Sapphire Pulse RX 7600 Overclocked edition
@@davidbooth1634 They are also the most expensive (for AMD).
@@El_Deen I got a Sapphire pulse 7900xt last year for $720. That deal was very good so I pulled the trigger on a new PC.
The RX 6700 has nearly the same specs as the PS5, with a 160-bit memory bus and a slightly higher clock speed, and both deliver similar performance according to Digital Foundry's tests. So the idea that a PS5 Pro will surpass the 7800XT makes zero sense. The 7800XT is about 70% more powerful than the RX 6700, and Sony itself claimed the PS5 Pro will be around 45% faster than the PS5. Do you really think the 7800XT is only 45% faster than a PS5? Of course not. The numbers don't lie: believing the PS5 Pro can match that level of performance is pure delusion.
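The comment's argument is just relative-performance arithmetic; taking its figures at face value (they are the commenter's estimates and Sony's marketing claim, not measurements), it checks out:

```python
# All figures normalized to the PS5 as the baseline.
ps5 = 1.00              # PS5 ~= RX 6700 (per the comment / Digital Foundry)
pro = ps5 * 1.45        # Sony's claimed ~45% uplift for the PS5 Pro
rx_7800xt = ps5 * 1.70  # comment's estimate: 7800 XT ~70% above an RX 6700

print(pro < rx_7800xt)             # True: the claimed uplift falls short
print(round(rx_7800xt / pro, 2))   # 1.17 -> the 7800 XT stays ~17% ahead
```

On these numbers, a 45% uplift over an RX-6700-class GPU lands closer to the 7700 XT tier, which is also where several other comments in this thread put it.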
Yeah, Tim mentioned 256-bit bus but PS5 Pro has to use that for the CPU too. And I think it has a smaller on-die cache so it needs more bandwidth too.
AMD should be implementing XESS instead of going their own route.. they'd do this if they were genuinely about open standards rather than just using openness to compete with whatever Nvidia's latest tech is.
You can already run XeSS on AMD cards, but not accelerated (rather, via DP4a). If they used this as a reason to cut FSR, they would be permanently stuck with a less performant implementation, and further development of upscaling on their cards would be reliant on Intel's fledgling dGPU division, which is a competitor, and which could unfortunately be shut down at a moment's notice.
Nvidia introduced very efficient frame buffer lossless compression back in the GTX 1000 series. In fact, they were specific about that at the time
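Nvidia's actual delta color compression scheme is proprietary, but the general idea behind lossless framebuffer compression can be sketched: store a tile's base value plus small per-pixel deltas, which pack into far fewer bits when neighboring pixels are similar. A hypothetical toy version:

```python
def compress_tile(pixels):
    """Toy delta compression: store the first pixel as a base value plus
    per-pixel deltas. Nearly-uniform tiles produce runs of tiny values,
    which a real encoder can pack much smaller than raw pixels."""
    base = pixels[0]
    return base, [p - base for p in pixels]

def decompress_tile(base, deltas):
    # Lossless round-trip: add each delta back onto the base value.
    return [base + d for d in deltas]

tile = [200, 200, 201, 199, 200]   # a nearly-uniform framebuffer tile
base, deltas = compress_tile(tile)
print(deltas)                              # [0, 0, 1, -1, 0]
print(decompress_tile(base, deltas) == tile)  # True
```

The payoff is effective memory bandwidth: the GPU moves the compressed representation over the bus, which is one way a narrow-bus card can punch above its raw bandwidth numbers.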
Don't throw your wallets up here
PNY will also get Kingpin cards soon.
20:55 Not really, there is no magical "it's better because it's a console-optimized GPU". The PS5 GPU is basically RX 6700 hardware and it runs at around the same performance as a PC RX 6700: sometimes a little bit slower (when the game wants more compute power) and sometimes a little bit faster (when the game wants faster memory). The PS5 Pro will most likely be a little bit below the 7800 XT (if it's RDNA 3) because it has both slower memory and lower compute performance.
But if PSSR is decent, and it does have RT from RDNA4, then the ps5 pro can output a better image from a lower input resolution, and will have better performance than any AMD equivalent GPU. Just like RTX cards in the same performance tier.
@@ArchieBunker11 According to digital foundry it's like an underclocked 4070. Which means it has close to 6800xt/7800xt levels of raster performance but much better RT performance and a better upscaler.
@@enmanuel1950 As far as raster is concerned, the base 4070 is slightly worse than a 3080 and 7800XT, and marginally better than a 6800XT (speaking in averages). An underclocked 4070 would be closer to a 6800 non-XT. Also, Sony themselves say "45% better than a PS5", and the 6800 is 45% faster than a 6700.
Although I'd grant you the image quality and RT.
Asking Tesla to test its self-driving tech on a VW because it's more reliable is like asking AMD to test on Nvidia.
What's next, HW Unboxed just doing a Nexus clips channel because they did better testing or had better equipment on certain topics? Nice for the consumer, but it wouldn't be a great look for you guys.
It's just not a smart decision to push a rival's card; however, if the percentage uplifts are the same then it doesn't really matter.
I feel you overestimate consoles a bit. PS5 is more between 3060 and 3060 Ti, and Pro according to what Sony said is 45% faster which would land it around 7700XT, not above 7800XT.
7700xt with better ray tracing basically
1:22 - data compression
No one can beat nvidia but we keep hoping someone will. Don't kid yourself, just sell a kidney and buy a 5090. God gave you a spare for a reason.
RDNA2 of any generation cannot beat the PS5 Pro in any enhanced title, for one main reason: PSSR. With this AI upscaling, RT becomes easier to implement and process. And on that note, RDNA2 doesn't have BF16 silicon, so it's only compatible with FSR3, while RDNA3 and later generations will be running FSR4 because they have ML silicon.
I don't get why people praise Sapphire so much. Their thermals are among the worst; PowerColor, MSI and Asus have way better thermals and efficiency.
MSI GTX 1080 to a XFX 7900XT both have been wonderful. Still using the 1080 in the living room rig.
What should AMD/Intel do to their CPUs to maximize the gains in bandwidth from DDR5-8400+ compared to DDR5-6000? Or, no matter the RAM speed, will it not matter until we have high-speed DDR6?
What for?
Bro, RAM speed doesn't matter much when the CPUs themselves don't grow much in terms of cache. CPUs work best when running sequentially; for that they need good cache algorithms to keep the cores fed, but it's really complicated. CPUs do branch prediction and out-of-order execution to mitigate this. But RAM speed is so far away from being an issue, you wouldn't even believe it. As of today, higher-speed RAM mostly just compensates for poorly written game engines and often doesn't even boost performance by a single percent.
Nvidia has quite a head start on AMD when it comes to AMD catching up, but as long as AMD puts the effort in, I don't see why it couldn't be just as good, maybe better if they figure out something Nvidia has not. But that's gonna be hard, since Nvidia is a money-printing factory.
My understanding is that DLSS's 'AI' is merely AI-tuned rather than 'run in AI'. I think leveraging this level of 'AI' is fine, as it streamlines tuning the various upscaling logic to most closely match the original target resolution. This is something that could work with previous versions of FSR, although it would require tuned profiles on a per-game basis.
The PS5 has roughly a Radeon 6700 (non-XT) inside, but with 16 GB of shared VRAM.
“It was interesting to see amd talk about this… normally with their future looking stuff they kind of hide it away” …. FSR3 frame generation would like a word
15:44 I found that, for AMD at least, not only are the smaller brands higher quality, they also have better customer support at the same price as the bigger brands. So I see no reason to get the bigger brands, actually.
I've never dealt with the smaller brands for Nvidia though, so I can't say about ZOTAC and PNY.
DLSS has seen no real improvement in a long time. It's good, but some artifacts are still pretty distracting, like power lines breaking up horribly. FSR FG has proven very competitive against DLSS FG. FSR 4 is gonna be better than DLSS if DLSS stays the same.
The PS5 itself is identical in performance to an RX 6700; there's no doubt it matches both raw and RT performance. So accounting for everything announced for the PS5 Pro, the only GPU that matches the performance gain and the "next-level RT processor" and so on is the RX 7700 XT: it's 42% faster than the RX 6700 and also has a next-level RT processor.
It can't be the 7900 GRE, because that would be an 89% uplift, and the 7800 XT is a 72% uplift from standard PS5 performance.
The Nvidia equivalent would be an RTX 3070 Ti, although its raw performance is a little lower and its RT performance a little higher.
PS5 Pro has a machine learning-based upscaler and more than 10 GB of VRAM, making it equivalent to 3080 and 4070.
@@Kukajin That is too much of a performance improvement over the PS5. The amount of VRAM isn't an issue; the GPU is always custom-made, usually with shared memory that they allocate as needed.
The 7700 XT has 12 GB of VRAM, but the PS5 uses a custom GPU with 13.7 GB... not exactly a reason to call it a mismatch; just add more memory modules to the 7700 XT.
Also, Intel's XeSS is an AI-based upscaler and is compatible with AMD GPUs... PSSR isn't FSR, and the "FSR" doesn't need to be AI-based for the GPU to match. PSSR is its own upscaler, like XeSS, TSR, and FSR are.
All of which can run on the PS5, because they aren't GPU-specific technologies like DLSS (maybe PSSR will be, who knows).
Focus on the performance increase: anything beyond +50% over an RX 6700 is already better than what the PS5 Pro is capable of.
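To make the uplift comparisons in this thread concrete, here's a tiny sketch. The percentage figures are the ones quoted in the comments above (not measurements of mine), indexed to the base PS5 / RX 6700 as 100:

```python
# Relative raster performance, indexed to base PS5 / RX 6700 = 100.
# Uplift figures are the ones quoted in the comment thread, not benchmarks.
perf = {
    "RX 6700 (PS5)": 100,
    "RX 7700 XT":    142,   # "+42%"
    "RX 7800 XT":    172,   # "+72%"
    "RX 7900 GRE":   189,   # "+89%"
}

def uplift(card, base="RX 6700 (PS5)"):
    """Percentage uplift of `card` over `base`, rounded to whole percent."""
    return round((perf[card] / perf[base] - 1) * 100)

for card in perf:
    print(f"{card}: +{uplift(card)}%")
```

By this framing, only the 7700 XT lands near a hypothetical ~45-50% PS5 Pro gain; the 7800 XT and 7900 GRE overshoot it.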
With generic FSR there's at least a chance someone adds it to a game, since it provides capabilities to older hardware. With hardware-specific FSR, is there any incentive for developers to add it? AMD has lost market share every single generation as the "drivers are crap" myth has persisted, even now when Nvidia has many issues as well but the smell doesn't stick to them.
As far as I'm aware, prior to the Radeon VII none of the AMD cards had DP4A, so neither Vega nor Polaris. Therefore, even XeSS will not run on these.
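For context, DP4A is a packed dot-product instruction: it multiplies four pairs of 8-bit integers and accumulates the sum into a 32-bit value, which is why XeSS's fallback path depends on it. A minimal software model of what the instruction computes (my own illustration, not XeSS code):

```python
def dp4a(a, b, c):
    """Software model of DP4A: dot product of four signed 8-bit lanes,
    accumulated into a 32-bit integer c. Hardware does this in one op."""
    assert len(a) == len(b) == 4
    assert all(-128 <= x <= 127 for x in a + b)
    return c + sum(x * y for x, y in zip(a, b))

# (1*2 + 2*3 + 3*4 + 4*5) + 10 = 40 + 10 = 50
print(dp4a([1, 2, 3, 4], [2, 3, 4, 5], 10))  # 50
```

Upscaler inference is dominated by exactly these low-precision multiply-accumulates, so a GPU without the instruction has to emulate each one with several ordinary integer ops, which is why it's treated as a hard requirement.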
What type of FPS counter are you guys using?
I'm primarily a Radeon guy, and I rarely (if ever) buy from the big three. It's typically Sapphire > PowerColor or XFX.
I currently have a zotac 3060ti. Only fan failure I've ever experienced in any PC.
I'll never buy zotac again
Unfortunately it's luck of the draw... The only fan failure I've had on any card was an MSI card.
Had a 1070 Ti and now have a 3070 Ti; other than it running hot, I've never had a problem. As Redshift said, it's luck of the draw.
@@redshiftit8303 The white dual-fan Zotac 3060 Ti had widespread fan failures. When it happened to me I looked into it and found it was alarmingly common. The black version had no such issues.
So I decided to be smart and ordered a replacement set of fans, only this time I ordered the black ones. To my dismay, the screw holes didn't line up between the black and white models. Ended up zip-tying a couple of 12 cm case fans to it and vowed never to buy Zotac again.
So this makes Sony’s PSSR super useless.
Almost missed this Q&A thanks to the (weird) title.
I think AMD should continue to make FSR an open standard but because they know in advance how the features will be implemented, they should design their hardware to be more efficient for it
Yes... but no. Nvidia didn't plan tensor cores for DLSS; they were made for enterprise and AI-oriented workloads. They just saw them doing nothing in gaming and gave them a use, just like NVENC and the new optical flow accelerator, which was also designed for computer vision but got used for frame gen. That's the cool part of hardware-accelerated stuff: it occupies die space, but using it mainly eats into the power budget.
That said, if AMD wants the same recognition, they just need to do like Nvidia and build things for whoever is paying for them...
@@pedro.alcatra the idea is to adapt FSR to maximize the efficiency on AMD hardware (in this case make full use of their tensor core implementation). Unless both AMD and Nvidia hardware work in the same way, it would stand to reason that the most optimized software implementation for one does not work as well on the other.
You absolutely still need FSR to be open: AMD is the underdog, so game developers need an incentive to add support. If they can choose between some improvement for all users (open FSR), a lot of improvement for a small number of users (closed FSR), and a lot of improvement for most users (closed DLSS), they'll probably rank those choices 1-3-2 or 3-1-2, but they'd be insane to pick a closed FSR as their primary path forward.
PNY is the manufacturer of almost all Quadro cards, so of course they're pretty large as a manufacturer. Quadro cards probably even carry higher margins, because they're very expensive. I have two Quadro RTX 5000s in my engineering workstation.
Nice.
Nice to see the boys together.
PNY cards are the only Nvidia GPUs sitting at MSRP in my country; the rest are marked up 20-30%, sometimes 50%.
My PNY 4090 XLR8 OC is an excellent card. The most you'll get out of a far more expensive card is about +5%. ImWateringPSUs only got 2% better performance from his card.
I have a PNY 3080 Ti, and PNY cards in the 3000 series were very popular here (Norway). But that was during the GPU shortage, and you see a lot of them on the second-hand market now (XLR8 series). Not sure how the 4000 series has been doing for them, but we're a tiny market anyway.
I bought my PNY 4090 XLR8 OC in Australia, it's an excellent card.
Wait, there are two of them !?
6:20 MLID mentioned that one of his contacts at AMD says they've been working on it for a while now; I think he said over a year. TBH, I'm glad they aren't rushing it out. Hopefully it gives DLSS and XeSS a run for their money out of the gate.
The problem with waiting is that DLSS is already included in hundreds of games, while AMD users are stuck with various FSR implementations, which generally lack in quality. AMD needs to get it out to start building a software library of great FSR implementations; that will take years.
It's Q4 of 2024. The best-value GPU (with 16 GB VRAM, which is the minimum) is a 2020 Radeon model, the RX 6800.
How is that possible, what went wrong?!?
Got the used one for about $270 this year
The PS5 Pro will cost more than $700 USD, and only the universe knows what it'll cost in Australia, so a $1000 PC with similar specs will 100% be better value.
JB and EB have pricing set to $1199 for the pre orders. Doubt this will change but it’s always a possibility
@@fueledgti Jeez, >$900 USD if you want the stand and disc drive.
FSR 4 aka DLSS for everyone
"everyone" FSR4 will use AI cores, so not everyone has them, and the people on Nvidia who has them will use DLSS.
@@SweetFlexZ wait it needs ai cores? then wouldnt that just be useless???? i thought they were doing it like XeSS DP4a
Funny how frame gen is now accepted 😂. Nvidia was slaughtered initially for "fake frames", though that crowd has since gone silent.
@@Hi_Im_o2 i mean, the whole point is it's AI driven?
We don't know, but AMD has AI cores that currently aren't being used, so it would make sense. It would also go against what they've been doing lately though, so idk. I have Nvidia so idc @@Hi_Im_o2
PNY is big because they make all the workstation Nvidia GPUs...
AMD needs ultra budget GPUs and better priced CPUs.
am4 still strong tho, someone jumping into pc gaming right now would be satisfied with a 5700X3D
Their cpus are already very well priced, x3d ones are just in high demand
The 5700X3D is priced close to a 7600, and AM4 users can probably find a 5800X3D for that price on the used market. Even a 5700X is priced better. (Prices in India, Amazon etc.)
We're getting a bit greedy if US$160 or $180 is too much for a 7500f or 7600
@@MisterFoxton PPP would like to know your location.
stick with native, fk dlss and fsr.
Buy the card that matches your resolution, simple.
And use DLSS when games become demanding at your resolution. Stop being so backwards.
@@Sal3600 or knock down a setting or 2.
This is a complete shot in the dark, but if NVIDIA still has more driver overhead offloaded to the CPU, could that contribute at all to lower VRAM usage on the card?
I have both an AMD and Nvidia GPU and the colors do look a lot more vibrant on my Nvidia card.
How about texture details and sharpness? I've heard people say that AMD GPUs produce sharper texture details, especially in dark scenes. Is that true?
@@sajithsaji3606 haven’t noticed this tbh, just the vibrant colors. I will say I prefer the nvidia look, but you don’t notice it unless you have access to both lol
@@Slambear that's true actually I'm planning to buy my first gaming pc but the problem is choosing a gpu brand is headache for me .some People in RUclips confusing me like they say AMD image quality is superior than nvidia that's why I asked you.
@@sajithsaji3606 It all depends on your budget, the games you play, and the resolution and frame rates you want to play at. If you need advice I can recommend some stuff. Daniel Owen on YouTube does a lot of these performance builds if you want to check some out.
Hoping Wukong gets the latest FSR; FSR 3 on AFOPB does wonders.
Power supply, thermals, and CPU will hold it back. You can't compare GPUs like that lol. It's not going to come near a 7800 XT, and many people have built example $700 PCs that beat the Pro's alleged specs.
I got a ZOTAC Gaming GeForce RTX 4090 AMP Extreme AIRO and it's alright.
For Radeon I like Sapphire best. ASRock also started making Radeons, and if they're anything like their AM4/AM5 mobos, they might be good cards too.
I always buy Sapphire if I get a Radeon GPU. I love their Nitro series. My 7900XT Nitro+ is absolutely beautiful!
The only PNY cards I have used have been in pro workstations, specifically Quadro's
Will we see desktop computers that are fully integrated? Full Intel/AMD cpu, gpu, motherboard?
Do you think Intel can ever fill the void that AMD left by abandoning their plan to launch high end GPUs? I really wish AMD would come around in few years and decide to launch high end GPUs. Otherwise, Nvidia would take this opportunity to inflate RTX 5090 or 6090 or whatever they decide to name their future high end GPUs.
Look at all the PC fanboys fighting each other in the comments! Absolutely pathetic.
How dare the questioner not mention Sapphire? It's the EVGA of Radeon GPUs! 😅 I have a Zotac 4070 Super and a Sapphire Nitro 7900 GRE, and they're as good as my 'big-name' MSI 4080 Suprim. People tend to buy ASUS, MSI, and Gigabyte because they flood the market with so many models. How many 4070 Super models does ASUS have? Let's see: Dual, Dual Evo, Prime, ProArt, TUF, Strix. How many does Zotac have? Twin Edge and Trinity? So yeah, people mostly only see MSI, Gigabyte, and ASUS.
will AMD finally beat DLSS?
short answer: no
long answer: hellll no
As far as Frame Gen goes, I don't think AMD saw a need or purpose to it until Nvidia launched their tech and the public responded to it. Because it truly is kind of silly and not especially good, particularly where you'd think it would matter most - low FPS gaming. Upscaling is the better tech to pursue with regards to improving frame rates, and I think FSR 3 took longer because development resources had to be directed away from it to Frame Gen.
Nvidia has had a lead in VRAM compression since the Maxwell/750 Ti days; they even pimped it as a feature back then.
I think Nvidia cards may load in fewer textures. AMD cards probably just load in everything that might be needed.
If Steam introduced a Steam console, I guess they could deliver a device that's either more powerful or cheaper than a PS5.
I can't troll console gamers if they have more power than my PC!!!! I'll be forced to upgrade :(
Did anyone else see the video on YouTube where the techtuber said the main issue with Zen 5 is that the power settings aren't great? In that video his 9700X wiped the floor with the 7700X and traded blows with the 7800X3D. I'd love to see another channel try to replicate it.
Shouldn't current RDNA 3 be able to run FSR 4, considering there are unused cores (AI or tensor, I think)? The 7900 XTX for instance has 192 of them and a good TOPS figure. Amuse AI local image generation runs pretty fast on the XTX.
I have an AMD CPU and GPU and highly doubt they will top DLSS. If I cared about ray tracing I would have gotten Nvidia for sure.
No. Since FSR 4 was designed for mobile gaming, AMD lost.
Zotac is good for SFF cards. While MSI/Gigabyte/Asus would occasionally make single fan ITX cards, Zotac has always made small dual fan cards for pretty much every generation. I had a 2070 and have a 4070 of theirs. At 226mm and 2-slot it's one of the most compact 4070s you can get.
Also AMD only has 12% of the GPU market, so for developers to support FSR it has to work on all cards.
Watching PNY closely since they’ve picked up kingpin
As for the CPU test with the 7900 XTX: I've seen with my own eyes the 9700X at 1080p performing 10% faster than with a 4090. One of the technicians at my local dealer was playing with the newly arrived CPUs, noticed it, and showed it to all of us. But since their job is basically to sell those CPUs, I took the results with a grain of salt. The explanation was that the better branch prediction of the 9000-series CPUs, combined with the increased IPC, improved communication with the GPU through SAM, and it didn't work the same way with Nvidia GPUs, which do have Re-BAR, though it isn't exactly the same as AMD's Smart Access Memory. Whether it's true, idk, but it kind of makes sense for AMD to implement something that boosts performance on an all-AMD system. I've been expecting them to do that for five years; after all, they've been talking about boosted performance in their own ecosystem ever since 1st-gen Ryzen came out.
Currently have a Sapphire 6900 XT and absolutely love it for 4K gaming. I think Sapphire made the first GPU I ever bought and installed myself, an X850 XT lol.
I must admit that XFX makes some of the sleekest looking GPUs on the market. It's a very good option for people who aren't fans of overdone RGB.