The 5800X3D is the 1080 Ti of CPUs. That sucker will make people completely skip AM5 at this rate!
Completely agree
I probably would have, if my AM4 mobo didn't die. When pricing my PC parts, the only difference between an AM4 and AM5 build was $75 more for the better DDR5 RAM.
Yessir. No need to move from the 5800x3D at all judging by these benches.
Totally
Give the platform some time to mature. Keep in mind the 5800X3D came out at the very end of AM4's life cycle and the 7k series X3D CPUs released at the start. We saw huge improvements over AM4's lifetime and the same will happen on AM5 given time.
5800x3D is the real king here.
Eh. It's like the F-22. It *was* the King. And now it's not.
The best gaming CPU on AM4 and still really viable in 2023
Am so glad I bought it on launch day and upgraded from the 3700x. Best hardware decision yet.
Nah, it was slower than my 12700k, 12900k and 13900k and the price was just 100€ less than my 12900k when I bought them both last April.
@@afriendofafriend5766 yeah and surpassed by the overpriced and only slightly better F-35
This seals the deal, 5800x3d is one of the best processor releases of the last 5-10 years. It was ahead of its time when it came out and is still extremely impressive.
Pbo2 at -20 and some ram: 3800 cl4 and boom
For gaming yes and that's it. It's worse than the normal 5800X in almost anything else xD
@@chovekb A CPU literally made for gaming worse in anything but gaming? Shocking, indeed!
@@chovekb Intel fanboy💀💀
@@diegorendon8537 Fail more, fool, I'm on AMD right now lool
The lighting in your videos is incredible! Really gives it that "premium" look.
3:07 the 5800X3D being so far above the 7950X3D makes it seem like it was failing to utilize the CCD with the extra cache
almost certainly
exactly....
That's exactly it. Other channels have compared the same game with the non 3D V cache ccx/ccd disabled and the results are as expected.
People will have to disable one CCD in the BIOS if they want the best gaming results on the 7950X3D for now. No surprise that software that is reliant on Xbox Game Bar does not work correctly all the time.
@@MrTuxy that's pretty sad if true
Linus is spending god knows how much on production, equipment, studio space and staff... and here you are, with industry-leading production values. You should be very, very proud of what you do.
My 7950X right now: 👁👄👁
I just bought one yesterday. 🥲
@@singularity108 you got 14 days to change it ;)
It's okay, you're in the AM5 platform, there's probably gonna be a Ryzen 11950X in the next two years.
Highly appreciate that you included the 3600 in the tests. This way normal peasants like me can see how much performance is in these new processors.
As someone looking to upgrade their PC sometime soon, coming from a 1600 will be like a lightyear jump.
@@henryhealy my 2600X crashes above 3.4 GHz. The 7950X3D I have coming should feel like a true improvement.
The higher thermals are due to the fact that the primary CCD's heat has to be dissipated through an entire extra layer of silicon. In normal 7000-series CPUs, the compute dies are in direct contact with the heat spreader; in the X3D SKUs, that heat has to be dissipated through the 3D V-Cache die, which sits on top of the compute die.
The 7950X3D and 7900X3D launch was rough, but with 2024 hindsight, Windows scheduling got much better and the 7950X3D runs much faster than the 14th-gen i7 and i9 in many games.
This guy is never disappointing us.
The biggest gains are for games made in Unity. Games such as Rust, Tarkov, and Sons of The Forest see the biggest difference.
Aka poorly optimized turds
Sheeesh I am so happy I grabbed that 5800X3D on sale. Thanks Optimum! :D
Liked for the latency measurements added to the chart. Thank you.
At this point, game engines need to start using more threads when they're available
Oh wow who would’ve thought about that
They do
@@s0rr0wHacker Intel slowed the progress of multi-threaded games by several years. It will take some time before it is fully utilized.
They won’t, PC games are just shitty console ports now so expect games to utilize 6c/12t at most until the PS6 comes out and hopefully has more than 8 cores.
not everything can be parallelized in games
Your B-roll is insane man, you can tell how much effort you put in.
The subtle dolly at 7:30 is 👌
I replaced my 5800X3D system with a 7950X3D system and I'm seeing a 40% performance increase in MSFS (4K with my 3090 Ti.) It took me a long time to get there. In fact, I was sure I'd wasted my money at first, when I let Windows handle everything. Like the results you got, the 7950X3D barely outperformed the 5800X3D.
Process Lasso is the answer.
I turned off Xbox Game Bar, Windows Game Mode, HAGS, enabled the Ultimate Power Plan, did a per-core undervolt for max boost, and used Lasso to schedule everything other than the sim to the non-Vcache chiplet. The CPU runs 60-65°C during flights with a Dark Rock Pro 4 air cooler, and 75-80°C running all-core Cinebench.
The 7950X3D is a gaming beast, but you have to unleash it.
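What that Process Lasso setup boils down to is per-process CPU affinity. A minimal sketch of the same idea in Python with psutil, assuming logical CPUs 0-15 map to the V-cache CCD and 16-31 to the frequency CCD on a 7950X3D (that mapping and the process names are assumptions for illustration; verify your own topology first):

```python
# A minimal sketch of what Process Lasso's affinity rules boil down to:
# per-process CPU affinity, set here with psutil. The CCD-to-core mapping
# below (logical CPUs 0-15 = V-cache CCD, 16-31 = frequency CCD on a
# 7950X3D) is an assumption -- verify it on your own system first.
import psutil

VCACHE_CPUS = list(range(0, 16))   # assumed V-cache CCD
OTHER_CPUS  = list(range(16, 32))  # assumed frequency CCD

def pin_by_name(process_name, cpus):
    """Restrict every running process with this name to the given logical CPUs."""
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == process_name.lower():
            try:
                proc.cpu_affinity(cpus)
                print(f"pinned PID {proc.pid} ({process_name}) to {cpus}")
            except psutil.AccessDenied:
                print(f"no permission to change PID {proc.pid}")

# Hypothetical example: keep the sim on the cache CCD, push OBS off it.
pin_by_name("FlightSimulator.exe", VCACHE_CPUS)
pin_by_name("obs64.exe", OTHER_CPUS)
```

Process Lasso adds persistence and nicer rules on top, but this is the underlying mechanism.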
0:54 damn the presentation looks sublime. Very nice
Was looking at other reviews, and you seem to be missing around 10% of the performance for the 7950X3D. Sort of what people were seeing with the old BIOS/chipset driver. Were yours up to date? If so, did you also have that BIOS setting you have to switch for the OS scheduler to park the cores set to "Auto"? The reason I'm saying this is because before the BIOS/chipset driver updates the 7950X3D and 7950X were almost on par, with the 7950X3D only coming out ahead by 2-5%, and I'm seeing that in every benchmark you did.
❤️Thanks everyone for helping me reach 97,700+ Subscribers ❤️🙏🏻
Your video production quality is through the roof - from the bright colored desks and the close-up shots of the product, it's always phenomenal.
Also can't forget those well-laid out graphs. Thanks for all you do man!
I hear clicking
hopefully the 7800X3D will address these issues with the split cache and we will see some nice gains
I doubt it. Even if they could, why would they make it better than their top-range CPU?
@@dioxyde0 Because the 7950X3D only has the cache on one CCD. The 7800X3D only has one CCD, so you avoid that weird cache split.
Scatterbenchers OC video came out, the CCD with the V-cache can hit 5.2Ghz using PBO2 and Curve Optimizer with -30 offset.
That's not bad, but my 7900X does 5.6 GHz across all 12 cores, all the time. No cores parked, no cores lower clocked. All 12 ready for gaming.
Almost the same frequency as the 7800x3d, curious to see how it will stack in benchmarks
@@jondonnelly3 how many games are gonna utilise all 12 cores 🤔
@@nyvkroft6530 Very very few. Most 6 to 8. Others 10. Very few at 12. I believe Battlefield can use 12.
The 120 W TDP across the board means the 7800X3D will be the best in games. Unless the advertised TDP isn't accurate, which it won't be exactly, but I still think it will be accurate enough.
I'm kind of wondering if opening that limit up on the motherboard (if it's even allowed) might not do something about that, as long as the cooling system can keep the temperatures down.
The red lighting in the back for the red team is the small but good detail we like❤
VERY happy with my 5800X3D, especially as I don't have to upgrade to DDR5.
For stuff like zstd compression and LuaJIT, 3D V-Cache is fucking insane,
not to mention it's a beast for simulation work that can't be put on the gpu as well.
And if you're stuck with cpu training for ML for whatever reason, you get decent speedups there as well.
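If you want to sanity-check the zstd part on your own machine, here is a rough sketch of how you might time it with the Python zstandard bindings; the input file name and compression level are placeholders, and a single run is only a ballpark, not a benchmark:

```python
# Rough-and-ready throughput check for zstd compression, the sort of
# cache-sensitive workload being described. Uses the third-party
# 'zstandard' package (pip install zstandard); the input file name and
# compression level are placeholders.
import time
import zstandard as zstd

def compress_throughput(path, level=3):
    data = open(path, "rb").read()
    cctx = zstd.ZstdCompressor(level=level)
    start = time.perf_counter()
    compressed = cctx.compress(data)
    elapsed = time.perf_counter() - start
    mb_per_s = len(data) / elapsed / 1e6
    print(f"level {level}: {mb_per_s:.1f} MB/s, ratio {len(compressed)/len(data):.3f}")
    return mb_per_s

compress_throughput("corpus.bin")  # hypothetical test file
```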
Given the embargo, I was expecting blah performance and there it is. At least the pricing is in line with the non 3D vcache models.
Thanks for the review Op!
I'm expecting it's a BIOS and/or Windows issue not using the 3D V-Cache properly, which will be ironed out in future updates.
I'm also wondering if performance is hampered because of physical space with the dies and v-cache, curious to see the 7800X3D compared to this.
I’m keeping my 7900x till next gen. It’s PBO and CO single core tuned, DDR5 OCed, I’m happy! All I need to play Star Citizen. No gains worth having at 4K gaming - 62 FPS average playing SC is very good and very playable too. The 7900X was actually better than my 5800X3D, I think SC likes the faster extra cores, it uses them all.
One of the reasons I got 5900x last year instead of 5800x3d... Certain games lately are using more cores/threads
The cinematography is off the charts
Shouldn't the 7950x3d be compared to the 13900KS?
expect the 5800x3d to go out of stock soon, AMD is probably going to reduce the production of those until EOL
Luckily I am picking mine up this week, moving from my 2700x and I'll be sitting happy with my esport title and sim racing gains!
5800X3D breaks sales records, this is AMD's best-selling processor right now)
You're running into the same issues as Hardware Unboxed; I recommend everyone check out their video.
If you disable the 7950X3D CCD that has no 3D V-Cache on it, the performance of the chip increases substantially in some cases.
It seems there are some issues with the drivers not correctly disabling the right cores.
How does this work exactly?
I'd love to see some videos about these CPUs with the use of Process Lasso. I know there have been issues with games using the wrong chiplet for their needs, or jumping between chiplets for no reason, and Process Lasso should help a lot, I believe
Hello, your results are surprising... Did you update the BIOS/chipset drivers?
I mean that the higher core count is really meant for heavier workloads or virtual machine use, as I use it for that on QEMU/KVM
💯💯
Your videos are always so clean
I'm out here getting 80 FPS in Valorant at 720p, while I see this PC getting over 1000 at 1440p. Can't wait to upgrade
So glad to see 1440p results, makes so much more sense than the guys testing at 1080p...
Tell me you don't understand CPU tests without explicitly stating it.
Thought that Valorant would immensely benefit... but I guess the 5800X3D is beastly enough
You can think of it this way: even in an unoptimized state, the 7000-series Ryzens are already on top
Damn. I waited so long for the 7950X3D, now I gotta wait for the 7800X3D.
You think the best compromise is the 13600K for gaming, productivity, and its price?
Yes
Finally someone who benchmarks MW thanks Optimum !
It looks like the 5800X3D is one of those legends, like the GTX 1080 Ti still is. That's why I left my 3950X on day one of the release of the new 3D cache chip. No regrets paying full price for what is still the best to this day. Happy gaming.
Great, I'm happy that I should not feel bad for recently having bought the 5800x3d then!
Thanks very much for the review!!! You're the best!🔥👍
If you read up on how Zen 4 operates you would understand why it brings up the temp to its max. No matter what kind of cooling you have, it's by design.
Why is no one talking about how good the 7700 non x did? With low power draw and low temps it’s pure win.
How did apex benchmarks look?
Almost upgraded to am5 last week in anticipation for these new x3d cpus. Lucky I held off for reviews, the 5800x3d lives for another year xd
I don't own any PC yet, so I would start fresh. Would you still recommend going all in on AM4 and the 5800X3D or shall I go with AM5 and one of Ryzen 7000?
I'm literally in the same boat as you. I decided to go with the Ryzen 9 7900X3D and just start fresh with AM5 😅 and I'm stuck between either the 4070 Ti or the 4080
Either way I would go with the Noctua edition. They are soo nice and quiet!
Did you update chipset drivers? Other reviewers have bigger difference between the cpus
Most of them are testing at 1080P low settings, 1440P medium/high settings is what is shown here.
People install the Chipset drivers? Lmao.
For me it was worth the wait. I bought an X670E motherboard + 7600X just to wait for the 7900X3D or 7950X3D. Also it looks like I've lost the silicon lottery, as my 7600X becomes unstable when I do Curve Optimizer even at -5 =)
Looks like so far Windows scheduler often fails to put the game on the X3D CCD
Not really, check 1:10
One CCX disabled, everything basically runs on X3D cache die
Looks like AMD made an overpriced, overhyped trash 3D CPU, but the fanboys blamed Windows for this crap AMD pulled. AMD fanboys are as bad as Linux fanboys, ignorant at best.
@@pisachasrinuan7960 what happens when you have Discord, a browser with tabs, and streaming running? Will it make the right calls?
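One low-effort way to check whether the scheduler is making the right call while all of that is running is to watch per-CCD load in the background. A rough psutil sketch, assuming the first half of the logical CPUs is the V-cache CCD (that mapping is an assumption, verify it on your own chip):

```python
# Quick way to eyeball which CCD is carrying the game while Discord, a
# browser and a stream are all running: sample per-core utilisation and
# average each half of the logical CPUs. Assumes CCD0 (V-cache) is the
# first half and CCD1 the second half -- that split is an assumption.
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    half = len(per_core) // 2
    ccd0 = sum(per_core[:half]) / half
    ccd1 = sum(per_core[half:]) / half
    print(f"CCD0 (V-cache): {ccd0:5.1f}%   CCD1: {ccd1:5.1f}%")
```

If most of the load sits on the second half while a game is in focus, whatever the Game Bar/driver chain is supposed to be doing probably isn't happening.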
If you got money to spare, and you care about energy efficiency, I think this one is fine. But I also want to see tests of 7700X3D, 7800X3D before any decision.
7600x3d?
I don't really sub to people, but you always deliver the best information that I'm looking for. Nobody else has such in depth dives. Love ya man
Having only one CCD with 3D V-Cache is intentional. I imagine the 8000-series chips will still have two CCDs without cache on both dies, but the 8950X3D will double the V-Cache by putting it on both CCDs.
Thank you for adding MWII to the chart!
I love to see you as a ginger when doing AMD reviews.
The most impressive CPU in the chart(s) is the Ryzen 7 7700. 👌
He is the guy who will definitely include a Valorant benchmark, and latency too. Latency is really important and some people don't talk about it
v*lorant player 🤮
Unbelievable video and lighting quality, what is that tech?
Wait. This is the first time I've ever heard of the XXXX-3D lineups of AMD CPUs. What differentiates these chips from the others? Are they designed for VR gaming?
not sure prob new hbm
Disable the non-V-cache CCD and run the tests again. Some of the results don't look correct. The 7950X3D is unfortunately not yet working as it should.
Is it fixed now with bios updates? I'm upgrading to AM5 or intel 13000 series or higher at some point.
Or should I wait for 14000 series or 8000x3d series?
Gotta say it. Looking jacked m8
I think the goal with the 7950x3d is trying to close the gap between workstation and gaming pc
In most workstation multimedia tasks (Premiere, Photoshop, After Effects, Cinema 4D) the 13900K looks better or the same.
In FPS there is practically no difference between the old and new (cache & no cache), and in some places it is worse than the original. The results are ambiguous; it is not clear what I should be paying for here.
Nice review. i9 is crippled with that memory kit. any chance of seeing both platforms maxed out on the memory?
This.
I'll wait for the 19900X5D with Quantum entanglement engine.
And free bjs left, right and centre
Hopefully there will be some good games by then.
Thankyou. Well explained.👍
Thanks for NOT only testing in Full HD or even 720p :)
Hi, I have a question: you mentioned that the other 8 cores are asleep when gaming? If I am, let's say, streaming or rendering something in the background, would those unused cores pick up that task, or would they stay asleep?
Any difference on the 7950X3D when using 6000 MHz CL30 (which AMD recommends) vs the CL36 used here for testing? If yes, how much %? Appreciate the review; seems like the 5800X3D is still good for gaming for at least 2 years :)
I would guess around 2%, maybe 5% at max. The thing is, picking the right memory and timings is very important. It's also very tricky. If you're already buying this high-end CPU, don't cheap out on the memory. Just get the lowest CAS latency possible.
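For context on what CL30 vs CL36 means in absolute terms, the first-word latency is just standard DDR arithmetic (nothing specific to this review):

```python
# First-word CAS latency in nanoseconds: CL cycles divided by the memory
# clock (the clock is half the DDR data rate).
def cas_latency_ns(data_rate_mtps, cl):
    clock_mhz = data_rate_mtps / 2        # e.g. DDR5-6000 -> 3000 MHz
    return cl / clock_mhz * 1000          # cycles / MHz -> nanoseconds

print(cas_latency_ns(6000, 30))  # 10.0 ns
print(cas_latency_ns(6000, 36))  # 12.0 ns
```

So roughly 10 ns vs 12 ns before any secondary timings come into play; whether that 2 ns ever shows up in frame rates is exactly what the question above is asking.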
7800X3D in Tomb Raider: 30 FPS over the 13900KS; Factorio: 50 FPS; Watch Dogs Legion: 30 FPS; Horizon Zero Dawn: 60 FPS. So the best gaming CPU is definitely the 7800X3D.
Great, but... wen tiny WC 4090 build?
Amazon is already sold out of the 7950X3D.
Where does the 5900X fall in all this?
fyi, there's a weird rendering issue at ~5:45
Looks like I'll be keeping my standard 7950X
Uh yeah, definitely. Who upgrades 6 months after an upgrade.
RIP my old 3950X.
Well I wonder why the chip with the 3D V-CACHE STACKED ON TOP OF THE CHIP WOULD RUN HOTTER THAN THE REGULAR 7950x. 🙄
He says that the temp jumping to 89 C is not adding up... Isn't that literally what it's designed to do?????
Next up: the 7950X3D + Pro DLC that adds the missing cache on the other CCD
Thanks for putting old 2600 cpu in the charts
I'll wait for the 4D version
how good are these processors for 3D rendering, animation, video editing, and game development?
AMD Mullet edition, Cores in the front, Cache in the back.
The 7950X3D is pointless for 4K gaming, right? Was hoping to see benchmarks for that, but it stopped at 1440p.
I wouldn't say pointless, just not near the best.
@@TheHandHistoryVault idiot for sure
Would have been interesting to include Intel 13600 or 13700 rather than just the 13900.
What's confusing is that his link to Amazon is for the 13600. I clicked that first, then when his graphs referenced the 13900, I got confused. I assume its performance would be somewhere in the middle.
I can't believe it, I have no words to describe the 5800X3D))))
I bought it for 200 bucks; it cost 300 bucks. I sold my 2700X for 100 bucks, added 200 bucks and bought it.
The 5800X3D is a true legend now.
Can someone explain to me in simple terms how/if the 7800x3d could/will be better/faster than the 7950x3d please?
Usually mid-high CPUs (like the 12700K) perform better at the same frequency than higher-end ones due to having fewer cores, so they can boost optimally.
This method is commonly used in high-level overclocking, where you only enable the cores that OC high.
oops
@@iikatinggangsengii2471 So from what you are saying, the 7800X3D could be better because its sole purpose is to only use the 8 cores and the 3D cache by itself, without having to worry about the scheduler?
Interesting how your conclusion is wildly different from Hardware Unboxed's. Looks like that might be because you tested almost completely different games from HU, and in all of your tests except Cyberpunk the 7950X3D showed no gains, while HU's tests mostly show AM5 crushing AM4. Or is there something else going on here?
I've learned not to trust HB for these reasons.
@@computermedics8119 Because this guy didn't update the drivers for the 7950 x3d?
would be interesting to see how the 7900x3d does with the extra 0.2 GHz.
So do you have a video regarding the pbo settings for 7950X3D?
I'm looking for information to possibly make my own custom handheld. I know nothing of computers or programming, but it's because Ayaneo is only producing enough for low-end emulation. Retroid is also a little behind. The Odin 2 can only play up to mid-graphics and PC games. The Steam Deck has a really bad 720p screen. So I want something that can emulate basic games all the way up to streaming and high graphics without affecting power (long game play). I was thinking of getting and connecting the motherboards of all the modern Xbox, PlayStation 5 and Wii U to play all games. But that is very, very redundant when I can buy a top-of-the-line processor chip, RAM, an HDMI port, multiple SD slots for memory (split capabilities: Windows, Linux, plug and play), Hall sensors, touchpads, D-pads, analog triggers, extra triggers on molded cases, SNES buttons, extra controller buttons, gyroscope capability, and enough power for long play hours in docked or handheld mode. A molded case with an LED advertisement-style display for customized messages or images with different colors from customization features. A display screen that gives 4K to 1080p. Debating on OLED or LED: OLED can lead to screen burn-in if left on too long, so maybe LED with a capable image for strong darker images and highly customizable brightness, so it won't hurt the eyes in the dark. Saying all these wink-wink ideas because emulation companies don't have the courage to compete with the main-brand gaming companies, so individuals have to invest to make their own ultimate customizable handheld. The problem would be making the motherboard and power function too.
For am5, what’s the best itx board out there?
Always want the newest tech, but this time I will keep my 5800X3D
Sad you didn't show the 1080p test.
If you got the same results as the 7950X, the game probably used the wrong CCD
I think, from what I'm hearing, that the R9 7950X3D is a good CPU for ITX builds.
The efficiency is as high as the non-3D version's; the only problem is dumb Windows again.
From time to time some data goes to the wrong CCD and gets slowed down.