- Videos: 55
- Views: 926,495
Blackbird PC Tech
USA
Joined 31 Aug 2013
Break through tech content overload with Blackbird PC Tech, a trusted one-stop shop to help guide you on your PC building journey. After designing the world's fastest helicopter, leading AI for the world's largest defense contractor, and building PCs for over 20 years, you can trust me when I say it's not rocket science.
Website: blackbirdpc.tech/
Store: blackbirdpctech.store/
Benchmarks: blackbirdpc.tech/
Membership: ruclips.net/channel/UCQwpOG-l1nYHtX2alMFdLFQjoin
Is GPU VRAM Really Important?
Is GPU VRAM really important? It’s a question I see a lot, so in this video we are going to find out. In the “It’s Not Rocket Science” series, we’ve been helping you troubleshoot and optimize your system to keep your PC running like a pro. In this video, our focus will be on helping you buy the right GPU by answering two important questions:
1. Is the amount of VRAM important?
2. How much VRAM do you really need?
In addition to showing you benchmarks across 8 popular games, I am also going to demystify upscaling and frame generation technologies, and let you know when you should consider using them, something you will definitely not want to miss.
----------------------------------------------...
Views: 3,674
Videos
PC Tech Game Changers You Will Want to Buy in 2025!
1.1K views · 1 day ago
With CES 2025 wrapped up and your RUclips feed inundated with new product announcements, I thought it would be helpful to break through the noise and bring you my top picks in PC Tech that you will want to buy in 2025. One of the things I noticed about all of the coverage from CES is that most content creators simply focus on telling you what is on display versus trying to understand the tech ....
How To Fix Intel Core Ultra CPU Performance!
7K views · 14 days ago
How do you tune and optimize the new Intel Core Ultra 200S CPUs to extract max performance? In this video we are going to find out. In the “It’s Not Rocket Science” series, we’ve been helping you troubleshoot and optimize your system to keep your PC running like a pro. It’s not rocket science and as you will see throughout this series, it really is Lego. Title: How To Fix Intel Core Ultra CPU P...
Should You Buy The Core Ultra 9 285K or Ryzen 7 9800X3D?
20K views · 21 days ago
Intel finally released fixes for the new Core Ultra processors, so is the 285K now worth buying? In this video, the next in our "Ultimate PC Component Fighting Championship" battle series, we are going to find out. In the UpcFC series we’ve been helping you make the right choice by pitting two components against each other in the PC octagon to see who wins. In this video our focus will be on th...
AMD 5800X3D vs 7800X3D vs 9800X3D - Is It Time To Upgrade?
20K views · 1 month ago
With the recent launch of the 9800X3D, is it finally time to upgrade your 5800X3D? In this video, the next in our "Is It Worth It" upgrade series, we are going to find out. In the "Is It Worth It?" series we’ve been helping you make the right choice by showing you just how much you can expect to increase performance with a drop-in upgrade. In this video our focus will be on the legendary AMD Ry...
Should You Buy The Ryzen 7 9800X3D or Stick With The 7800X3D?
34K views · 1 month ago
The 9800X3D launched with rave reviews, were they right or was it just hype? In this video, the next in our "Ultimate PC Component Fighting Championship" battle series, we are going to find out. In the UpcFC series we’ve been helping you make the right choice by pitting two components against each other in the PC octagon to see who wins. In this video our focus will be on the two top AMD gaming...
Should You Worry About PC Bottlenecks?
4.8K views · 1 month ago
Should you worry about PC bottlenecks? It’s a question I see a lot, so in this video we are going to find out. In the “It’s Not Rocket Science” series, we’ve been helping you troubleshoot and optimize your system to keep your PC running like a pro. It’s not rocket science, and as you will see throughout this series, it really is Lego. Title: Should You Worry About PC Bottlenecks? Chapters: 00:0...
Is The Intel Core Ultra 7 265K Really That Bad?
27K views · 2 months ago
Is the Intel Core Ultra 7 265K really that bad, or did reviewers get it wrong again? In this video, the next in our "Is It Worth It" upgrade series, we are going to find out. In the "Is It Worth It?" series we’ve been helping you make the right choice by showing you just how much you can expect to increase performance with a drop-in upgrade. In this video our focus will be on the new Intel Core...
How To Tweak an AMD AM5 Ryzen CPU: Step-by-Step Guide!
14K views · 2 months ago
How do you optimize and tune an AMD AM5 based Ryzen CPU to extract max performance? In this video we are going to find out. In the “It’s Not Rocket Science” series, we’ve been helping you troubleshoot and optimize your system to keep your PC running like a pro. It’s not rocket science and as you will see throughout this series, it really is Lego. Title: How To Tweak an AMD AM5 Ryzen CPU: Step-b...
What's The Best Memory for AMD AM5 Ryzen CPUs?
98K views · 2 months ago
Is faster memory really worth it for AM5 based Ryzen CPUs? In this video, the next in our "Is It Worth It" upgrade series, we are going to find out. In the "Is It Worth It?" series we’ve been helping you make the right choice by showing you just how much you can expect to increase performance with a drop-in upgrade. In this video our focus will be on measuring the impact of memory speed on AM5 ...
How To Tweak an Intel 14900K for Max Performance!
9K views · 2 months ago
How do you undervolt and optimize an Intel Raptor Lake CPU to extract max performance? In this video we are going to find out. In the “It’s Not Rocket Science” series, we’ve been helping you troubleshoot and optimize your system to keep your PC running like a pro. It’s not rocket science and as you will see throughout this series, it really is Lego. Title: How To Tweak an Intel 14900K for Max P...
9800X3D Performance Prediction & AM5 Ryzen CPU Tweak Guide!
24K views · 3 months ago
Should you wait to buy the new 9800X3D? In this video, the next in our "Is It Worth It" upgrade series, we are going to find out. In the "Is It Worth It?" series we’ve been helping you make the right choice by showing you just how much you can expect to increase performance with a drop-in upgrade. In this video our focus will be on predicting the performance of the highly anticipated AMD Ryzen ...
Should You Buy a Ryzen 9 9950X or Core i9-14900K?
17K views · 3 months ago
Should you buy the new Ryzen 9 9950X from AMD? In this video, the next in our "Ultimate PC Component Fighting Championship" battle series, we are going to find out. In the UpcFC series we’ve been helping you make the right choice by pitting two components against each other in the PC octagon to see who wins. In this video our focus will be on the two flagship CPUs, with the newly released AMD ...
How To Overclock Your GPU!
2.2K views · 3 months ago
How do you overclock your GPU? In this video we are going to find out. In the “It’s Not Rocket Science” series, we’ve been helping you troubleshoot and optimize your system to keep your PC running like a pro. It’s not rocket science and as you will see throughout this series, it really is Lego. Title: How To Overclock Your GPU! Key Component Affiliate Links: Nvidia GeForce RTX 4080 Super - amzn...
How To Undervolt a Ryzen 7 7800X3D or 9800X3D!
15K views · 4 months ago
How do you undervolt a 7800X3D or 9800X3D? In this video we are going to find out. In the “It’s Not Rocket Science” series, we’ve been helping you troubleshoot and optimize your system to keep your PC running like a pro. It’s not rocket science and as you will see throughout this series, it really is Lego. Title: How To Undervolt a Ryzen 7 7800X3D or 9800X3D! Key Component Affiliate Link: AMD R...
Should You Buy a Ryzen 9 9900X or Core i7-14700K?
22K views · 4 months ago
How To Undervolt a Ryzen 7 5800X3D or 5700X3D!
33K views · 4 months ago
Should You Buy a Ryzen 7 9700X or 7800X3D?
76K views · 4 months ago
How to Fix a PC that Restarts & Gets Stuck in a Boot Loop!
6K views · 4 months ago
Is an All AMD Gaming PC Really Worth It in 2024?
17K views · 5 months ago
How to Fix a PC that Stutters in Games!
17K views · 5 months ago
How to Fix a PC that Does Not Power On or Boot!
1.7K views · 5 months ago
Is a 5700X3D Upgrade Really Worth It?
11K views · 5 months ago
Should You Buy an RTX 4070 Super or RX 7900 GRE?
63K views · 6 months ago
Is It Worth Upgrading to an RTX 4070 Super?
6K views · 6 months ago
The Ultimate Gaming PC - Part II - The Benchmarks
1.4K views · 7 months ago
The Ultimate Gaming PC - Part I - The Build
628 views · 7 months ago
14900K vs 7800X3D - Does Intel Have a Problem?
7K views · 8 months ago
ASUS Scam Exposed - Will It Happen To You?
9K views · 8 months ago
Anytime you send something in for repair, most often they send you a different one that has already gone through whatever their process is and been deemed usable... the condition should be evaluated so that you're sent one in the same condition... it would be a simple thing to fix.
Agreed ... I didn't have an issue with getting a different serial number, it was more about the state of the card when I received it back. I ended up bending the bracket and running it. It ran super hot, so I pulled it apart and re-pasted it ... the quality of their repair was truly terrible.
@blackbirdpctech yeah that's a lot of money to spend for something like that to happen...
Interesting vid. The reason why testers often skip CPU testing at 1080p low game settings is really simple: it's not a realistic scenario. People who buy the 9700X for games are not going to play games at 1080p low settings. It makes sense to test a CPU at 1080p low settings for the purpose of testing, but it does not make sense for the people who are going to use this CPU for actual gaming.
I hear this argument a lot, so I decided to address it directly in this video: ruclips.net/video/Zi7LtyQVPdM/видео.html
You will never be a real frame. You have no contact with anything but the tensor cores and AI TOPS, you have no native resolution. You are a crude mockery of a frame, extrapolated with artifacts and blur behind the engine's back, and inserted in between real frames, introducing input lag and noticeable distortions in the graphics. All the validation you get in benchmarks and frames-per-second counters is two-faced and half-hearted. Behind your back gamers mock you. Your mega corporation and shills are ashamed and disgusted by you as you are a hard sell. Players are utterly repulsed by you. Years of gaming have allowed gamers to sniff you frauds out with great efficiency. Even DLSS 4.0 frames who “pass” look uncanny and unnatural to gamers. Your appearance and structure is a dead giveaway. And even if you manage to get a drunk guy to play a game with you on, he’ll turn tail and bolt the second he gets a whiff of your diseased, infected pixels.
An open letter to “fake” frames? Well done 😂😂😂
Very informative video. Although I was left wanting a comparison between the 4060 Ti 8 & 16 GB cards using the DLSS technology and subsequently with Frame Gen, just to be able to see how they fare with upscaling. Otherwise brilliant stuff all around.
Games like Cyberpunk 2077 and F1 2024 have DLSS turned on by default when you select Ultra game settings; however, I didn't compare them with frame gen turned on in any game.
Appreciate your rocket fast response. Apologies for not being more specific, but since my focus was more on MSFS 2024 and your graph showing a massive difference between the 2 cards in TAA, I was wondering what the performance difference would be with upscaling and FG. In your opinion is 16GB enough for MSFS 2024 or would more VRAM like 24GB or even 32GB benefit performance. Of all your games compared, MSFS 2024 displayed the max disparity between 8GB & 16GB. Thanks, Sid
Ah, ok ... I just showed results for the 8GB card. Based on my data I would argue that 16GB is more than enough for MSFS 2024 ... if you are targeting the 5080 then DLSS 4 is quite a bit more efficient with VRAM when compared with DLSS 3, and FG doesn't really impact the usage (I haven't tested MFG yet).
There is a problem with DLSS-made frames you didn't touch on, and I often feel it is being left out... no, it isn't the latency. It is the fact that the game needs to support DLSS, FSR etc. If they don't, then you only have the "real frames". Not many major games will have that issue, but there are AMAZING games out there not supporting it because they are made by small studios. And that's why I think defending using those frames is wrong as marketing, because it is a borderline lie, and it is arrogant towards the indie studios where a lot of the real new ideas for gaming are starting.
That's true, the game does need to support DLSS. That said, I didn't defend Nvidia using frame gen for marketing purposes, in fact in the video I stated the following: "I don’t think Nvidia did themselves a service by claiming that a 5070 could match the performance of a 4090, but the image quality improvements that they’ve made with DLSS 4 are impressive"
@@blackbirdpctech I did notice you said that. I just wanted to say, it is a part many don't mention when they talk about frames, and why I think it is so wrong not to mention it. It is also the reason I think most streamers push against it being used as a measure of power. Nvidia simply compares 2 different things and I can see a lot of people don't understand the difference. :)
I won a 7900 XTX from some clown a while back. I water cooled it. I was on the pure rasterization bandwagon before this video, but you make some great points. I don't game much anymore, but I want to start playing around with the frame gen tech. Just to see if I like it. I can tell you when VRAM really matters...video editing. The 7900 XTX is a BEAST when editing/rendering huge videos. I take it for granted because it just works as it should, but I can render 10 GB videos in just a few minutes. Great video!
Ha ha ha ... so you won the card from Jay? 😉
@@blackbirdpctech 🤣🤣🤣🤣
I have a CPU: AMD Ryzen 7 5700X, RAM: 96.0GB dual-channel DDR4 @ 1596MHz, and I need a new graphics card for some gaming and mostly video editing. Which one would you recommend under $700? AMD, Nvidia? Thanks
At this point in time I would wait to see what the new 5070 and 9070 XT offer ... if you can't wait then you should check whether your video editing software has better support for Nvidia or AMD cards ... most of the larger ones tend to support Nvidia features more at this point.
Perfect discussion around generated frames. No one will be able to tell the difference 👍
It's not a popular opinion, but if you did a blind side-by-side test my guess is that the results would come back proving it right.
@blackbirdpctech RUclipsr Vex did a blind test with his friends and clearly couldn't tell, and made up excuses when presented with what he had been playing... often backpedaling when he had chosen frame generation.
Ha ha ha ... that just proves that most RUclipsrs lie to their audience to generate clicks ... when I saw DLSS 4 at CES on Black Myth Wukong I genuinely thought the DLSS image was better
My electric bill and the fps counter said that the fake frames are real 😂 The AI cores draw a lot of power to bring the fake frames to reality. Fake Frames Matter!!!
That would be an interesting metric to measure ... power draw with frame gen on versus off.
@blackbirdpctech will definitely try this one as soon as I receive my card.
I will look through my benchmark data ... now I am super curious to see
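For anyone who wants to try this at home, here is a minimal sketch of one way to log board power during a benchmark run, assuming an Nvidia GPU with the nvidia-smi CLI on the PATH (run it once with frame gen on and once with it off, then compare the averages):

```python
# Minimal sketch: sample GPU board power once per second via nvidia-smi
# so a frame-gen-ON run can be compared against a frame-gen-OFF run.
import subprocess
import time

def read_power_watts() -> float:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])  # first GPU only

samples = []
for _ in range(60):  # roughly one minute of samples
    samples.append(read_power_watts())
    time.sleep(1)

print(f"avg: {sum(samples) / len(samples):.1f} W, peak: {max(samples):.1f} W")
```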
Is GPU VRAM really important? 🤔 Nope, I still use a 3060 Ti 8GB and a 4070 12GB at 1440p with no issues; this goes for 3D animation and rendering as well.
That has been my experience too.
Here is the thing, I agree with all the testing here, but it's still not enough WOW for me to upgrade an entire X3D system from a 5800X3D to a 7800X3D or 9800X3D. 100+ fps generally is not going to be felt in a game. Sure, benchmarking-wise you get 30 percent in some games, but 241 fps to 300 fps is not really going to be felt, only seen on a benchmark. I see no compelling reason to update CPU/RAM/MB etc. to a 9800X3D, it's just not a WOW, especially with a 4070FE. Now if I had a bunch of games on the edge, say running at 40fps, and I could boost them to 60+, then maybe I would think about it, but that's not the case with a 4070FE/1080p/240Hz. I have two 65" OLEDs but never play games on them, I prefer smaller up close high refresh. Don't forget a total system rebuild takes a huge amount of time to test and reinstall everything to get it working like the old system. Some might love it, but the older I get the less LOVE I have for tinkering that much for 30 percent. For most people I would just swap out the non-X3D for the 5700X3D, a plug-n-play upgrade, no need to reinstall the OS etc. Good enough dude for most folks.
I think that's a very reasonable approach and justification to stick with a 5800X3D ... time definitely becomes more "valuable" the older you get
In Escape from Tarkov I easily fill up 16GB on medium-ish settings. But EFT is a special kind of game regarding lack of optimization.
I will have to check it out
You earned my sub! Super objective! Please keep it real like this and I'll 100% stay around
My objective is to tackle misinformation with data and logic ... there will always be some form of bias, but I do my best to avoid it as much as possible ... and welcome to the Blackbird PC Tech community!
I made a poll on this topic just yesterday on Facebook, asking: will VRAM affect Nvidia in the future more/first, or will RT performance affect AMD GPUs more/first? It was fairly even, but to my surprise, more people chose the second option, showing many people already believe Nvidia will succeed in this VRAM compression technology.
That's interesting ... I think that's recognition that Nvidia leads AMD in this area.
Thank you for your video, I'm considering upgrading from my i7 8700k, but the price in Belgium for the 7800X3D and 9800X3D are crazy high. So, thanks to your video, I have another option to consider for my upgrade that will save me some money while allowing me to change everything I need to change.
You are very welcome ... happy to hear that it helps.
Agreed on the 'fake frames' argument. However, there is one issue, which is that the DLSS frames don't improve latency, they only improve smoothness. This makes them more useful for single player games and less useful for competitive shooters.
Agreed, that is very true
At 4K with RT and ultra settings, a 4060 Ti can run 2077 at 41fps? This doesn’t sound right.
I always love the "that doesn't sound right" argument because you can't argue against it ... it's something that exists only in your mind. The good news is you can replicate my data by using the settings and components I used, which I show clearly in the video. Unlike other channels that create unique benchmarking runs to avoid any form of accountability, I encourage people to check my data. When you can form a logical argument with data then please come back and let me know ... then we can have a meaningful discussion.
I have a small system (my big rig with the 4090 is only for work, I work mostly as a 3D & VFX artist these days and use the 4090 solely for GPU rendering) with a crappy RTX 3050 8GB, and I run Cyberpunk at 1080p at 150 fps on average on a 180 Hz monitor. I do this with an FSR 3.1 frame generation mod that pushes the framerate to 150 fps; DLSS upscaling is activated (FSR 3.1 can utilize DLSS upscaling, which FSR 3.0, built into the game, can not) and set to performance, mostly because "balanced" shows some strange artifacts at the edges of the frame (no other setting does that). I also have set a custom upscale range to use if I need it, and the artifacts appear if I set the lowest fps target to the lowest setting relative to the (capped) highest settings. All graphics settings are set to high or "psycho" (if available). I get a frametime of 7ms, which is quite phenomenal.

I can use raytracing (simple reflections) too, but I don't see much of a difference in comparison to screen space reflections most of the time. I only see the difference when I actually stop and look for the reflections, or when I look out onto the harbor; the water does not look that good with screen space reflections, and the simple speculars don't do it much better either. The frametime with simple reflections (no DLSS ray reconstruction) jumps to 11 ms on average, which is still acceptable.

For an even smoother experience I can turn on Lossless Scaling's frame generation, but that shears off quite a bit of fps and it's also pixel-based, so distortions happen. The frametime goes to 15 ms when I do this. It is an option to use it on top of everything (or just turn off FSR 3.1 and use Lossless Scaling's FG instead, which gives me better results than putting it on top of FSR 3.1), but it is of course rather suboptimal to use FG (pixel-based) on top of FG (engine-based, which is much better on its own).

Without raytracing but with AI trickery activated, the RTX 3050 8GB never goes over 6.5 GB usage for Cyberpunk. With raytracing it hovers around 7.6 GB, which pushes towards the limit. But it is absolutely possible to play Cyberpunk at the highest settings (excluding raytracing) with acceptable frametimes and general fps if you make heavy use of AI upscaling and frame generation.
Great comment, thanks for sharing.
The problem with fake frames is that they are fake (as in not connected to the game's calculations). Any game in which frame counting is important will become scuffed with the fake frames added in. You will miss shots in fps games, mistime jumps/inputs in some platformer games, and miss combos in fighting games. Yes, this mostly affects competitive gaming, but it is still important to note that you are essentially creating some input lag when you use DLSS. They are not cost-free frames. I think most people don't have a problem with AI generated frames as a feature, they are upset at the deceptive marketing of them. There is a drawback to using them for some games. I believe that most would agree with me that until the game engine can respond to an input done at the generation of a "fake" frame, the proper way to market GPUs is to show BOTH raw and DLSS fps.
I think that is a very reasonable take on frame generation technology. While I disagree about the use of the term "fake", I completely agree that the fundamental issue is deceptive marketing. What I really don't get is that Nvidia has great technology ... they could have simply told the truth and kept the new frame gen tech secret until launch ... that would have been received very differently in my opinion.
@@blackbirdpctech They don't tell the truth because actual truth would be something akin to "for a slight, and in most cases negligible, lag in input response time, you can greatly increase the frames displayed on the screen." The best you would get is something like a drug ad side-effects list. As for semantics, I think fake is a very accurate description though. As far as the game is concerned, those frames do not exist. They aren't real. They are fake.
At 4K I think it matters. At 1440p, not as much. My 4070 only uses about 10-11GB in Cyberpunk on max settings. Hogwarts rarely uses more than 10 at max. However, I noticed a few games using more than 16GB of RAM. Caught Hogwarts using 24GB.
How did you measure 24GB on a 4070?
@blackbirdpctech 24GB of RAM (DDR4), not VRAM
@ ah ok, I misread that ... 24GB of RAM is a lot ...
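Since overlay tools make it easy to mix up dedicated VRAM and system RAM, here is a minimal sketch that reads the two separately, assuming an Nvidia GPU with nvidia-smi on the PATH and the third-party psutil package installed:

```python
# Minimal sketch: report dedicated VRAM and system RAM usage separately.
import subprocess
import psutil

vram_mib = int(subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
    text=True,
).strip().splitlines()[0])  # first GPU only

ram_gib = psutil.virtual_memory().used / 2**30

print(f"VRAM used: {vram_mib} MiB | system RAM used: {ram_gib:.1f} GiB")
```

A 4070 only has 12GB of VRAM, so any 24GB figure has to be system RAM.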
The reason why Hardware Unboxed tests GPUs at ultra settings is because it's the most realistic scenario. All the PCs I've built, except for a few that were not for gaming, were required for playing at ultra settings, ranging from 1080p to 4K. Playing competitively was just a bonus from lowering graphical settings. If you are going to spend on the latest and greatest then you have to expect the best. Don't get me wrong, I do, however, find this content about content creators very interesting.
I don't have an issue testing GPUs with ultra settings ... that is exactly what you should do to fully load GPUs, but for CPUs it doesn't make sense, since the objective is to fully load the CPU. I explain CPU testing in much greater detail in the following video: ruclips.net/video/Zi7LtyQVPdM/видео.html
Bit of a silly question, as DDR5 will need a new mobo and CPU too lol, so DDR4 won't come into it.
I used the exact same CPU to conduct the testing, and I used Z690 chipsets for both motherboards.
Subbed.
Welcome to the Blackbird PC Tech community!
Well this was useful ;) Love it!
Glad you liked it!
Can these optimization settings be achieved with B650 motherboards, regardless of the brand, such as Gigabyte or MSI?
Yes they should.
Recommending that users disable memory integrity features can potentially increase security risk by creating vulnerabilities that could be exploited by malicious actors. However, I appreciate the thoroughness and methodology employed in your testing process.
That's true ... I think the risks are minimal for most gamers; however, if you use your system for other more sensitive applications then the trade-off in gaming performance might make sense. I will include a statement like that in future videos.
One important thing with respect to undervolting that I didn't mention in the video: you also need to disable the hypervisor (Hyper-V) in Windows before XTU will allow you to use the "Core Voltage Offset" option.
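The usual way to do that is with the built-in bcdedit tool; a minimal sketch, assuming Windows, administrator rights, and a reboot before the change takes effect:

```python
# Minimal sketch: toggle the Windows hypervisor off so XTU exposes the
# "Core Voltage Offset" option; run as administrator and reboot afterwards.
import subprocess

subprocess.run(["bcdedit", "/set", "hypervisorlaunchtype", "off"], check=True)
# To re-enable Hyper-V when you are done undervolting:
# subprocess.run(["bcdedit", "/set", "hypervisorlaunchtype", "auto"], check=True)
```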
Saying that if you can't tell the difference visually between "fake" and real frames means they hold the same/similar value is just plain wrong. In any competitive game where latency matters, frame gen is useless, and that happens to be an awful lot of games. Also, frame gen can't be used effectively below 60 fps, which is the main area where you would want to turn it on. And in games where you would use it, non-competitive games, I still wouldn't use 4x and would stick to 2x, as I would rather have fewer artifacts at 120 than more at 240. The main issue is that most people's target in games where they would use frame gen is 60-144, where 3x and 4x are not useful.
Let's correct a few things that I hear repeated constantly, but are factually incorrect:
1. Stating that you can't use frame gen below 60fps is a strange argument to make because you can't use frame gen without upscaling, and upscaling will almost always push you above 60 fps, unless of course your GPU is so weak that it can't maintain 60fps using ultra performance mode, which is an effective resolution of 640 x 360.
2. The majority of games are single player titles ... massively online competitive games are popular, but there is a significant cost to maintaining the servers, so most games don't require them. That means that there is a large community of gamers that will directly benefit from a technology like frame gen to improve their smoothness of gameplay.
3. For competitive games, Nvidia offers Reflex to help reduce the latency, which is compatible with a wide range of competitive titles. And they reduced the latency impact with MFG (4x) over FG (2x) by the nature of the approach that they used (one is interpolation and one is extrapolation).
As I stated in my video, there is no such thing as "real" frames for a game, because games are by definition fake. The only difference is that one frame is rendered by a game engine while the other is rendered by an algorithm that was trained by the game engine. If you can't tell the difference then it shouldn't matter. In fact, DLSS 4 with MFG looks better than native in the games that I've seen. If you are a professional gamer that only plays competitive titles then don't use frame gen ... that doesn't mean the technology is not beneficial in many other scenarios.
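To put numbers on the ultra performance point above, here is a quick sketch that derives effective render resolutions from the commonly published per-axis DLSS scale factors; treat the exact ratios as assumptions, since individual games can override them:

```python
# Effective render resolution per DLSS mode (per-axis scale factors as
# commonly published; individual games may override these).
MODES = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra performance": 1 / 3,
}

def render_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    scale = MODES[mode]
    return round(width * scale), round(height * scale)

for mode in MODES:
    w, h = render_resolution(1920, 1080, mode)
    print(f"{mode:>17}: {w} x {h}")  # ultra performance -> 640 x 360
```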
How Much VRAM is needed for VR and Flat to VR games?
I think that would depend somewhat on the resolution that you are playing at … I would have to test it to know for sure
That's good to hear, 16GB would be more than enough with the 5000 series, with DLSS 4 being more efficient with resources.
Yes, I really wanted to see the data because there are a lot of uninformed opinions floating around on this topic.
People aren't getting it yet that tech improvements are leading to lower VRAM usage. And I don't use frame gen, but I do use DLSS Quality.
One more question: can I apply your recommended settings? Or does it come down to the silicon lottery whether a 30mV undervolt will work on my system? What would happen if the voltage is too low? And is it a good idea to test the RAM EXPO settings (6000MHz and CL30 in the case of the RAM I ordered) and see whether the system is stable, and then fiddle with the CPU settings? And I am a little bit wary about using Ryzen Master, I like to do BIOS stuff in BIOS and not via Windows. Any disadvantage of applying your recommended settings just in the BIOS? And what about low-power usage, like just idling the system, will there be any problems when the CPU doesn't get enough juice?
You can try, however there is no guarantee that they will be stable. The undervolt is indeed silicon lottery dependent. If you apply an undervolt that is too low, your system might not boot, or if it does, it might cause clock stretching, which I explain in this video: ruclips.net/video/OW-tpXttv8U/видео.html You should always test your memory stability first before applying other tweaks. Applying your tweaks directly in BIOS is fine. I haven't come across any issues during idle.
1. Most of those techtubers made the same comments about how AMD marketed these chips.
2. Those "lazy" 😅 techtubers who run a ridiculous amount of benchmarks and engage with AMD to try and figure out what is going on... could not recommend the CPU "AS IS" at its pricing compared to the similar 7 series version.
3. Because I watched another lower-sub techtuber, I was confident in buying a 9700X for an overpriced 350 euro. Using Buildzoid's (old) memory timings and the 105W fix with some slight fine tuning, you can get near enough to a 9800X3D, which at the time of purchase was half the price.
4. Faster RAM is not the fix that is being claimed here, it leads to a small fps increase. At the price of higher-rate memory it's insane. Just get the well-priced 32GB 6000MHz CL30 RAM and use the Buildzoid timings, which you can find with a quick Google.
Please don't spread misinformation. Running a lot of benchmarks the wrong way isn't something to be applauded, it's remarkably stupid. An email to AMD also shouldn't be considered as "effort", they are supposed to be "experts", so it should have been obvious that the power limit was significantly impacting performance. Faster RAM does indeed help and a kit of DDR5-6400 is only ~$10 more than a kit of DDR5-6000 ... definitely not "insane" pricing as you state.
It is a good question, and I ran into an interesting problem with my AMD 7800 XT. When running Stable Diffusion models, having 16GB of VRAM is not enough to get the special Pony and SDXL models running. However, if I use my older Nvidia 3060 Ti, it runs them all fine - with just 8GB of VRAM. Apparently Nvidia has something where they can share system memory with video card memory when it runs out, which is huge for that situation. Thumbs up.
That's interesting ... would be good to redo these tests with AMD cards to explore that more.
Nice to see you blowing up, glad to be able to say that I was pretty much there from the beginning! Great video, I pray I can snipe a 5080 when it releases. 🙏
Yes you were and I really appreciate it! It's going to be tough to get these cards at first ... just need to be ready to click and buy.
@blackbirdpctech yeah I'm praying that I could swoop by a local store first thing in the morning because I doubt I'll be able to get it online
So, would it be better to just get 8000, future-proof wise? Seems hard though, not too many manufacturers are even pushing it right now.
The problem is the cost ... it would be better to get a 64GB kit of DDR5-6000/6400 to future proof rather than a DDR5-8000 kit if you are going to spend that amount.
Just picked up 64GB of 6400 by Corsair; it worked out because they don't even make an 8000MHz stick lol, and I'm going for an all-white build. I saw good things about the N7 B650E as well, so I can't wait for everything to come together 😅
What is the tdp of this configuration?
It's open ... the limit is temp.
@blackbirdpctech This means that with these settings, performance is highly dependent on the cooler we are using. Did I understand it correctly?
@ these chips run cool, so the thermal limit is simply to prevent the chip from pulling too much power/voltage/current … that said, your cooler will always impact your performance.
@@blackbirdpctech thanks a lot
What I know for sure is this: despite the backlash and hate towards Nvidia, and legitimate anger against their marketing decision of promoting the 5070 as an equal of the 4090, the Nvidia 50 fake frame series will sell like donuts near a police station. Do not wait too long, there will be a shortage and the 5090 will most likely be out of stock for months. I already know about 3 of my friends who are buying the Asus Astral 5090 on the first day it hits the shelves. For those who buy Nvidia nothing matters, they will buy the cards and enjoy the games they like, so no matter how much the techtube couch experts backlash, these cards will be out of stock. Happy gaming and thanks to Matt here, who is trying to offer the best content without bias and with tons of empirical evidence. All the frames generated by these cards are the result of hardware processing, like it or not, this is the reality.
Well said!
So 4090 vs 5080? I think I'm leaning towards 4090. I'm making a new PC for myself this coming February. There's a shop in my country that sells MSI Suprim X 4090 for 2k usd.
I don't think you can go wrong with either option.
I would still call those generated frames fake. As explained, a frame is generated between frame 0 and frame 1 (frame 0 being the previously rendered frame). But my issue with frame gen is that, regardless of whether the game is online or offline, game engines still only recognize and include real frames in their calculations. Allow me to give you an extreme kind of example. Let's say that you are playing an online fps shooter. While roaming, you saw your opponent running fast from point A to point D. Frame zero showed the opponent's body/hitbox at point A, while frame 1 (the next frame) is already at point D, as if a character used a skill to be extra fast. But that game has frame gen, and it generated two frames between points A and D, creating images of your opponent at points B and C. Again, this is an extreme example. But if by chance you shot and hit your opponent while your opponent was at point B or C, the game engine won't recognize it as a hit. That's why no one is using frame gen in fps games. Imagine using the 4x mode, where there are 3 fake frames in between real frames. Until the game engine includes those fake frames in its computations, I would only consider frame gen fake and cosmetic.
That is a fantastic point ... we are talking about extremely small timeframes, but if you jam 3 new frames that are not directly tied in to the game engine between game engine rendered frames, then there is no way for the game engine to acknowledge a hit. This is something I need to dig into more, but it makes sense. The only way around that is for this capability to be integrated into the game engines, which I think is the next obvious step.
@@blackbirdpctech I see it totally differently. I see it like borrowing money from the bank: you have an amount of money and the bank gives you 3x that amount from the future, yet you get all the money at once. Both the GPU and the Bank know exactly what you want to see and they rigorously hide the things you cannot see. In the worst case scenario the present is a little blurry 😂
It all comes down to the time increments we are talking about and whether that time increment is relevant to the game. If these new frames are generated in 2 microsecond increments and the fastest a human can react is 10 microseconds, then it really doesn't matter. That's what I want to look into more.
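To ground those increments, a quick back-of-the-envelope sketch, assuming a 60 fps engine-rendered base and 4x multi frame generation (in practice the scale is milliseconds rather than microseconds):

```python
# Back-of-the-envelope frame pacing for 4x multi frame generation,
# assuming the game engine renders a true frame 60 times per second.
base_fps = 60
mfg_factor = 4  # 1 engine-rendered frame + 3 generated frames per interval

engine_interval_ms = 1000 / base_fps                   # ~16.7 ms between engine frames
display_interval_ms = engine_interval_ms / mfg_factor  # ~4.2 ms between displayed frames

print(f"engine frame every {engine_interval_ms:.1f} ms")
print(f"displayed frame every {display_interval_ms:.1f} ms")
# Inputs only resolve on engine frames, so a hit landing "between" them
# still waits for the next engine tick, up to ~16.7 ms later.
```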
@ I think this is why frame gen requires a minimum native fps for it to work decently. At least the old frame gen I read about needs at least 60fps at specific resolutions and graphics settings. However, in the keynote, Jensen said something about this generation being the first time that they would be able to intermix AI workloads and graphics workloads. So I guess, I hope, that if not this 5000 gen, at least in the future, those fake frames will be recognized by game engines. By then, frame gen wouldn't be fake, but just another rendering technique, like or in tandem with rasterization.
Is a B650 ok with the 9800X3D?
Do you mean is it compatible? It should be; however, you really need to consult your motherboard support area and check to make sure, since it will require a BIOS update.
16GB minimum. Just try Stalker 2 at 1440p with those GPUs and you will see why I say 16GB. The Rostok area in that game will take you over 12GB.
COD: BO6 took me over 12GB but my recommendation is based on what I believe to be sufficient for most games, not what is enough for all games. I think 12GB should be the minimum in 2025 with 16GB as an ideal amount.
Hi, again I have a question. I read about chipset driver version 7.01.08.129; is it worth it to update to this version with the 9700X? Or did I understand right that it's only good for the X3D versions? I'm asking because I don't want to lose all my settings in the BIOS.
Updating your motherboard chipset drivers will not reset your BIOS ... I would always recommend updating your chipset drivers to the latest version.
One of the best ever gaming tech youtubers out there! Keep it up Matt! Your content is amazing and so worthy!
I really appreciate the feedback!
Matt, we need an update for that video now or in a few months! I'm very curious how much the updates will squeeze out of the 9700X =))
I plan to compare the 9700X with the 265K and 9800X3D in a coming video ... should be interesting to see.
@blackbirdpctech Oh yes, that's very nice! I can't wait to watch it😁🤝
Let's be real, a 2500€ RTX 5090 with 32 GB VRAM is enough for anything up to GTA 5 at 16K resolution. You don't always need to buy the shiniest new toy on the market.
According to internet tech masters the 50 series is fake technology, though I need it because I want to play Microsoft Fake Simulator, which displays such a beautiful fake AI generated world, and which works perfectly with RTX's fake frames. I also want to try a lot of other fake games with fake frames and fake ray work technology. /this was a rant joke 🤣🤣
@TedTed-q5i Games are fake, frames are fake, all graphical solutions are fake 😎
@@dvornikovalexei Precisely! 🤣🤣🤣
You guys get it 😉
The most VRAM I have seen my GPU use in games is around 13GB; I think 16GB is the current sweet spot.
Agreed
Big unknown whether most 9700X chips will undervolt and run higher speed RAM kits with all the different motherboard revisions available. Maybe AMD has improved the odds since AM4.
Based on my experience testing 9000 series versus 7000 series, it appears that they have improved memory compatibility, FCLK and undervolt potential, but of course we are talking about ~20 CPUs total, so it's not a statistically significant sample size.
You use ultra when testing CPUs because there are settings that impact the CPU? Higher population density will affect the CPU performance, so obviously it's worth turning it up. Are you saying the RTX 4090 was limiting the CPUs at 1080p ultra? Because if not, then it doesn't matter that they tested it in ultra, and it's probably better to do so.
It's very simple, when testing CPUs you should try to load the CPU as much as possible. At 1080p with Ultra settings turned on the load shifts away from the CPU to the GPU. So while you are correct that ultra settings can increase the load on the CPU, it also significantly increases the load on the GPU, which isn't what you want when testing a CPU.
@blackbirdpctech but would the load even matter if the game is GPU limited, which it will almost always be at 1080p, even at ultra? So this way you're getting a more realistic picture of what the CPUs are capable of, without the GPU limiting them.
Why would you expect a game to be GPU limited at 1080p, even with ultra settings turned on? GPU limited means the GPU load is close to 100% ... that typically doesn't happen at 1080p
@@blackbirdpctech my bad, I misspoke, I meant CPU limited at 1080p with a 4090, even at ultra
@@PixelDew1337 that’s the point, at 1080p when you use ultra settings you shift the load to the gpu and the load on the cpu drops … that is why I criticized HUB in the video and why the differences they showed were low.
Summary of internet talk about graphics cards in 2025: fake frames, fake frames, fake frames, fake frames, fake frames, fake frames, fake frames, fake frames, fake frames, fake frames, fake frames, fake frames. This is how much understanding users and 'reputable reviewers' have of computing, ray work equations, frame generator algorithms, and tech in general.
You missed ... be afraid of AI ... be scared of AI ...
@@blackbirdpctech hahahahaha! Of nvidia AI mostly, any other deep learning stuff is ok. 😂