Starstream Gamer
  • Videos: 57
  • Views: 145,096
12GB of VRAM vs Ratchet & Clank: Rift Apart. 2nd run. Frame Gen enabled! Can 4070 Ti handle it?
This is my second video comparison of Ratchet & Clank: Rift Apart running on a 12GB GPU. This game is one of the most VRAM-hungry titles.
With all three RT effects enabled and Very High quality textures it already consumes a lot of VRAM. Frame Generation increases VRAM usage further. So if you are going to use all the bells and whistles the game offers, it turns into a real VRAM hog.
So let's see how it runs on the 4070 Ti at 1440p, maxed out with full RT and FG!
00:00 Introduction
00:26 Comparison at 1080p resolution (DLSS FG)
02:53 Comparison at 1080p resolution (FSR FG)
05:28 Conclusion about 1080p performance
05:38 Comparison at 1440p resolution (DLSS FG)
08:12 Comparison at 1440p resolution ...
Views: 143

Videos

AMD Ryzen 5600: stock vs overclocked/tuned RAM (3600 MHz CL18 vs 3800 MHz CL15). Does memory matter?
410 views · 5 months ago
Can Ryzen 5 5600 CPU performance be improved with the help of faster RAM with tighter timings? Let's find out! I compared the most popular DDR4 RAM with a 3600 MHz CL18 XMP profile against a tuned 3800 MHz CL15 variant. Does the latter really make any difference? Many people claim that 3600 MHz RAM is the sweet spot for Zen 3 CPUs. Is that true? #ryzen5600 #memorytest #ramoverclocking
Will Ratchet & Clank Rift Apart run better w/o RTX IO? Direct Storage On vs Off comparison.
292 views · 11 months ago
Will Ratchet & Clank Rift Apart run better without Direct Storage (RTX IO)? Let's find out! To disable Direct Storage, delete these two files in the game's main folder: "dstorage.dll" and "dstoragecore.dll". Alternatively, you can just move them to some other folder so you can put them back if needed. System specs: GPU: Palit RTX 4070 Ti GameRock CPU: AMD Ryzen 5 5600 - OCed to 4550 MH...
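For anyone who prefers to script the swap, here is a minimal sketch of the same move-to-backup approach in Python (the install path is an assumption; adjust it to your copy of the game):

    # Hypothetical helper: move the DirectStorage DLLs into a backup folder so they
    # can be restored later. The game path below is an assumption - edit it first.
    from pathlib import Path
    import shutil

    game_dir = Path(r"C:\Games\Ratchet & Clank - Rift Apart")  # assumed install location
    backup_dir = game_dir / "dstorage_backup"
    backup_dir.mkdir(exist_ok=True)

    for name in ("dstorage.dll", "dstoragecore.dll"):
        src = game_dir / name
        if src.exists():
            shutil.move(str(src), str(backup_dir / name))  # move, not delete, so it is reversible
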
4070 Ti (Super) vs Resident Evil 4 Remake. Are 12GB of VRAM enough for this game (4K, Max, RT)?
437 views · 11 months ago
Can 4070 Ti run Resident Evil 4 Remake at 4K/Max settings (incl. Max textures and Max RT) without any issues related to VRAM capacity? Let's find out! System specs: GPU: Palit RTX 4070 Ti GameRock CPU: AMD Ryzen 5 5600 - OCed to 4550 MHz (via Curve optimizer) RAM: 16GB (2x8GB) DDR4 Crucial Ballistix BL2K8G32C16U4R - OCed to 3800 MHz [16-19-19-40 CR2 tightened subtimings] Motherboard: MSI B550M ...
4070 Super (emulated) vs 4070 Ti. What to expect from this new GPU? Comparison in 4 new games.
770 views · 1 year ago
How fast will be the most interesting Super GPU? 00:00 - Introduction 00:03 - Testing Methodology 00:59 - A Plague Tale: Requiem 03:18 - Forspoken 04:18 - Resident Evil 4 Remake 05:21 - Ratchet & Clank: Rift Apart 09:41 - Results and Conclusion Hello everyone. I was so eager to see the difference in performance between 4070 Super and 4070 Ti, that I decided to emulate RTX 4070 Super and compare...
12GB of VRAM vs Ratchet & Clank: Rift Apart. Deep-dive analysis of 4070 Ti (Super) performance.
728 views · 1 year ago
Can 4070 Ti run Ratchet & Clank: Rift Apart at Max settings (incl. Very High textures and full RT) without any issues related to VRAM capacity? At what resolution will it run out of VRAM? Let's find out! Ratchet & Clank: Rift Apart is probably the most demanding game in terms of VRAM consumption. It uses very detailed textures and three RT effects (reflections, shadows and ambient occlusion). P...
4070 Ti vs Alan Wake 2. Are 12GB of VRAM enough for this game? At what res will it limit this GPU?
599 views · 1 year ago
Can 4070 Ti run Alan Wake 2 at Max settings (incl. Ultra textures and RT/PT) without any issues related to VRAM capacity? At what resolution will it run out of VRAM? Let's find out! System specs: GPU: Palit RTX 4070 Ti GameRock CPU: AMD Ryzen 5 5600 - OCed to 4550 MHz (via Curve optimizer) RAM: 16GB (2x8GB) DDR4 Crucial Ballistix BL2K8G32C16U4R - OCed to 3800 MHz [16-19-19-40 CR2 tightened subt...
4070 Ti (Super) DOES have enough VRAM at 4K in Far Cry 6 (Ultra, RT, HD textures)!?
1.1K views · 1 year ago
12GB of VRAM? Is it enough for 4K gaming? Let's find out by testing one of the most VRAM hungry games - Far Cry 6 with HD textures and RT reflections and shadows. As you know, HD textures and RT effects are the most demanding settings in terms of VRAM capacity. So this game should consume a ton of VRAM, right? System specs: GPU: Palit RTX 4070 Ti GameRock CPU: AMD Ryzen 5 5600 - OCed to 4550 MH...
Does Denuvo hurt performance of Resident Evil: Village? Denuvoless comparison.
5K views · 1 year ago
This video shows how Denuvo DRM influenced the performance of the Resident Evil Village video game. Surprisingly, this is the first game which shows less than a 10% improvement after Denuvo removal. But since its CPU performance was already excellent before, it is no wonder it has not improved much. This game is one of a few single-player titles which can offer high refresh gaming, if you are an owner o...
Will overclocking VRAM help RTX 4070 Ti to overcome its memory limitations? Stock vs OC comparison.
10K views · 1 year ago
00:00 - Introduction 00:03 - Testing Methodology 00:24 - A Plague Tale: Requiem 02:28 - Control 04:24 - God of War 08:26 - Hellblade: Senua's Sacrifice - Enhanced Edition 09:56 - Marvel's Guardians of the Galaxy 10:57 - Shadow of the Tomb Raider 11:37 - The Last Of Us Part I 13:35 - Uncharted: The Lost Legacy 15:48 - Conclusion Hello everyone. RTX 4070 Ti was heavily criticized for it's relativ...
"RTX 4070" (emulated) vs 4070 Ti. Test and comparison in 6 new games (incl. The Last of Us Part I).
1.2K views · 1 year ago
How fast will be the most awaited GPU? 00:00 - Introduction 00:03 - Testing Methodology 00:54 - The Last Of Us Part I 02:33 - A Plague Tale: Requiem 04:43 - Control 06:45 - Hellblade: Senua's Sacrifice - Enhanced Edition 08:45 - Shadow of the Tomb Raider 09:25 - Days Gone Hello everyone. I was so eager to see the difference in performance between 4070 and 4070 Ti, that I decided to emulate RTX ...
Does Denuvo hurt performance of Dying Light 2: Stay Human? Denuvoless comparison.
2.4K views · 1 year ago
This video shows how Denuvo DRM influenced the performance of the Dying Light 2: Stay Human video game. In this case Denuvo reduces CPU performance by ~10-15%, depending on the scene/location. Overall gameplay seems to be a bit smoother without Denuvo, especially if you have a budget or mid-range CPU. In this case I have not noticed a reduction in RAM usage, but it was quite low even with Denuv...
Does Denuvo hurt performance of Middle-earth: Shadow of War? Denuvoless comparison.
654 views · 1 year ago
This video shows how Denuvo DRM influenced the performance of the Middle-earth: Shadow of War video game. In this case Denuvo reduces CPU performance by ~5-15%, depending on the scene/location. Overall gameplay seems to be a bit smoother without Denuvo, especially if you have a budget or mid-range CPU. Moreover, Denuvo consumes an additional ~1.5-2 GB of RAM. And, finally, loading times are decre...
A Plague Tale Requiem. Ray Tracing Shadows Comparison (Performance & Visual) RT ON/OFF. RTX Enabled.
2.2K views · 1 year ago
Recent patch added Ray Tracing shadows to the game. Here is the comparison of the game with RT shadows enabled/disabled. Mostly RT shadows look fine, but there are some visual glitches. Performance hit is pretty hefty though. Intro - 00:00 Performance comparison - 00:04 Visual comparison - 02:13 System specs: GPU: Gainward RTX 3060 Ghost CPU: AMD Ryzen 5 5600 - OCed to 4550 MHz (via Curve optim...
Does Denuvo hurt performance of Wolfenstein Youngblood? Denuvoless comparison.
346 views · 1 year ago
I had another chance to compare the performance of a game that was relieved of Denuvo DRM. This time it is Wolfenstein: Youngblood. In CPU-bound scenarios you can expect quite a significant CPU performance improvement (up to 25%, depending on the particular scene). Going from ~230 fps to ~280 fps might not seem exciting, but imagine if you have a much slower CPU than my R5 5600. This would be ~...
AMD Ryzen 2600 vs Ryzen 5600. CPU comparison in 10 games! Amazing improvement in two generations!
12K views · 1 year ago
AMD Ryzen 2600 vs Ryzen 5600. CPU comparison in 10 games! Amazing improvement in two generations!
GTX 1660 vs RTX 3060 comparison/test in 10 games with Ryzen 5 2600 | Is it worth upgrading?
527 views · 2 years ago
GTX 1660 vs RTX 3060 comparison/test in 10 games with Ryzen 5 2600 | Is it worth upgrading?
Hellblade: Senua’s Sacrifice | DX11 vs DX12 | API comparison | GTX 1660 + Ryzen 2600
3.7K views · 3 years ago
Hellblade: Senua’s Sacrifice | DX11 vs DX12 | API comparison | GTX 1660 Ryzen 2600
Does Denuvo hurt performance of Shadow of the Tomb Raider (DX12)? Denuvoless comparison.
304 views · 3 years ago
Does Denuvo hurt performance of Shadow of the Tomb Raider (DX12)? Denuvoless comparison.
Does Denuvo hurt performance of Rise of the Tomb Raider (DX11)? Denuvoless comparison.
576 views · 3 years ago
Does Denuvo hurt performance of Rise of the Tomb Raider (DX11)? Denuvoless comparison.
Does Denuvo hurt performance of Rise of the Tomb Raider (DX12)? Denuvoless comparison.
878 views · 3 years ago
Does Denuvo hurt performance of Rise of the Tomb Raider (DX12)? Denuvoless comparison.
Does Denuvo hurt performance of Star Wars Jedi: Fallen Order? Denuvoless comparison.
2.9K views · 3 years ago
Does Denuvo hurt performance of Star Wars Jedi: Fallen Order? Denuvoless comparison.
What is the best CPU for LGA 2011-3 socket? 2620v3, 2630v3, 2630lv3, 2640v3 or 2678v3?
14K views · 3 years ago
What is the best CPU for LGA 2011-3 socket? 2620v3, 2630v3, 2630lv3, 2640v3 or 2678v3?
How to undervolt a GPU. Make your graphics card cooler and quieter!
74 views · 3 years ago
How to undervolt a GPU. Make your graphics card cooler and quieter!
How to overclock a GPU (graphics card). Nvidia or AMD. GTX 1660, 1060, 1050, RTX 3060, RX 5700, 580
58 views · 3 years ago
How to overclock a GPU (graphics card). Nvidia or AMD. GTX 1660, 1060, 1050, RTX 3060, RX 5700, 580
Little Nightmares 2 (II) | GTX 1660 + Ryzen 2600 | 1440p, Ultra
47 views · 3 years ago
Little Nightmares 2 (II) | GTX 1660 Ryzen 2600 | 1440p, Ultra
The Medium | DX11 vs DX12 | API comparison | GTX 1660 + Ryzen 2600
1.2K views · 3 years ago
The Medium | DX11 vs DX12 | API comparison | GTX 1660 Ryzen 2600
Tips and thoughts about memory (RAM) overclocking. Important things about Intel/AMD CPU overclock.
39 views · 3 years ago
Tips and thoughts about memory (RAM) overclocking. Important things about Intel/AMD CPU overclock.
The Medium Gameplay | GTX 1660 + Ryzen 2600 | Optimal/best settings for 1080p/30fps experience
51 views · 3 years ago
The Medium Gameplay | GTX 1660 Ryzen 2600 | Optimal/best settings for 1080p/30fps experience
The Medium Gameplay | GTX 1660 + Ryzen 2600 | 1080p, High, Medium, Low
752 views · 3 years ago
The Medium Gameplay | GTX 1660 Ryzen 2600 | 1080p, High, Medium, Low

Comments

  • @prettyawesomeperson2188 · 11 days ago

    No point in doing these tests if you don't push the game to the limit, as well as your hardware. 540P? Really?!

    • @starstreamgamer3704 · 7 days ago

      I am testing CPU/RAM performance. Game resolution has nothing to do with it, since higher resolution loads the GPU. So, to eliminate the GPU bottleneck, all CPU tests should be performed at a low resolution. Most reviewers test CPUs with a 4090 at 1080p; a 3060 is roughly 4 times less powerful, so even at 540p it performs much worse than a 4090 at 1080p. I'd say a 3060 at 540p is even slower than a 4090 at 1440p, so it is a pretty realistic scenario. If I had had a more powerful GPU back then, I would have increased the resolution to at least 1080p.

  • @averagejeff6037 · 1 month ago

    What do you think about an i5-660 paired with a GTX 1050 Ti Gaming X 4GB? And I would like to upgrade to an i7 880 or X3440; which one should I get? My motherboard is a Gigabyte USB3L rev 2.0. Thanks.

  • @burakahmettr8193 · 1 month ago

    Hi, I also have an R5 5600 and Crucial Ballistix Tracer; can you help me tune it? I'm trying, on the XMP profile, to set 3600 at 1.3 volts, but I have never achieved a stable OC.

    • @starstreamgamer3704 · 1 month ago

      DDR4 XMP profiles are usually rated at 1.35V. If you want to achieve a higher-than-XMP frequency with the same timings, try 1.4V. I had to set the voltage to 1.43V to achieve 3800MHz CL15. Previously I ran it at 3800MHz CL16, and 1.35V was enough.

    • @burakahmettr8193 · 1 month ago

      @@starstreamgamer3704 Are you doing the OC with the XMP profile, or do you create a new manual profile with hand-written timings?

    • @starstreamgamer3704 · 1 month ago

      ​@@burakahmettr8193 I started with XMP. But it does not matter much, as long as you manually tune voltage.

  • @guillepankeke2844 · 2 months ago

    It still punishes gamers that buy the game: some performance issues, but also the invasion of privacy and the use of resources. Why would I spend the lifespan of my hardware on DRM? I ALREADY BOUGHT THE DARN GAME! If they keep treating buyers like this, maybe not buying games is morally right.

  • @dramaticmudderer5208 · 2 months ago

    Performance being worse is due to them increasing quality

  • @JayJo777 · 2 months ago

    Thanks that was explained very well

  • @Ryyxo · 3 months ago

    how is the game now in v2.13

    • @starstreamgamer3704 · 3 months ago

      After the v2.00 update, CPU performance decreased noticeably (in the main game). The only thing that v2.13 brought is FSR FG support (aka FSR 3).

    • @Ryyxo · 3 months ago

      @starstreamgamer3704 exactly bro, I played the game fine back in 1.6, now driving fast ruins everything

  • @ApostKef · 3 months ago

    Should I have LLC on or off? Which one is safer regarding degradation?

  • @henrikw377 · 3 months ago

    Super test, thanks!

  • @leafar4964 · 3 months ago

    i have a question, what software do you use to lock the cpu frequency

    • @starstreamgamer3704 · 3 months ago

      I do not use any specific software, just set Curve Optimizer to +150MHz and -18 for all cores (in BIOS). My sample works fine at these settings and holds 4600 MHz most of the time.

  • @sownlengoc · 4 months ago

    look like 88w is the real tdp of 5600 not 65w

    • @starstreamgamer3704 · 4 months ago

      65W is its rated TDP, while its PPT is 88W. But it can consume up to 110-120W under some loads, when not constrained by any limits, depending on its core clock and core voltage. The optimal limit for this CPU is 90W, though. Anything higher brings negligible performance improvements but significantly increases the CPU temperature.
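
      For reference, on AM4 the socket power limit (PPT) is defined as 1.35x the rated TDP, which is exactly where the 88W figure comes from; a quick check in Python:

        # AM4 rule of thumb: PPT (package power tracking limit) = 1.35 x rated TDP.
        tdp_w = 65
        ppt_w = tdp_w * 1.35   # 87.75 -> reported as 88 W by monitoring tools
        print(round(ppt_w))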

    • @sownlengoc · 4 months ago

      @@starstreamgamer3704 then I wonder how r5 2600/3600 perform if they are forced to use more watts

    • @starstreamgamer3704 · 4 months ago

      @@sownlengoc My sample was functioning at 3.8 GHz/1.22V. The next step for it was 3.9GHz/1.28V, which would have increased its power draw to 65-80W in games. At 4GHz/1.38V it was consuming ~90-100W, but its gaming performance was just 3-4% higher than at 3.8GHz/1.22V. So a 70W limit for the 2600 is an optimal choice. The 5600 just needs a bit more power to reach a good balance between power consumption and delivered framerate. The 3600 is similar to the 2600 in this respect - a 70W limit is an optimal choice for that CPU as well.
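
      A rough way to sanity-check those figures is the usual dynamic-power approximation, where power scales roughly linearly with clock and with the square of core voltage (leakage is ignored, so real draw climbs somewhat faster); the baseline wattage below is only illustrative:

        # Rough dynamic-power scaling: P ~ f * V^2 (leakage ignored, so a lower bound).
        def scaled_power(p_base_w, f_base, v_base, f_new, v_new):
            return p_base_w * (f_new / f_base) * (v_new / v_base) ** 2

        # Assumed ~55 W gaming draw at 3.8 GHz / 1.22 V (illustrative baseline, not measured).
        print(round(scaled_power(55, 3.8, 1.22, 3.9, 1.28)))  # ~62 W
        print(round(scaled_power(55, 3.8, 1.22, 4.0, 1.38)))  # ~74 W, before extra leakage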

  • @Maxim67459 · 4 months ago

    i have xeon e5-2630 v3 and the world seems to hate me

  • @King_javi17 · 4 months ago

    Liar, that's a lie; @GantelYT says the 1070 is better.

  • @tomaszeklolek8134 · 5 months ago

    192bit is enough for 1440p. You must test it in 4k

    • @starstreamgamer3704 · 5 months ago

      I wanted to make such a comparison, but this card is not fast enough to run modern games at native 4K/Max. And if you use DLSS set to "Quality", games are rendered at 1440p and then upscaled to 4K via the tensor cores, so the results will be the same as shown in this video. It might be more useful for older games, though. I tested one game at 4K (Horizon Zero Dawn) and found that the VRAM OC brings 8% more frames at native 4K. But since there is no DLAA in this game, and modded DLAA (via DLSS Tweaks) performs significantly worse than TAA (and does not achieve a locked 60 fps), I do not see any sense in not using DLSS Quality. So testing HZD at native 4K is kind of irrelevant.
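
      For context, DLSS presets render at a fixed fraction of the output resolution per axis (Quality is 2/3), which is why 4K with DLSS Quality is effectively the same 1440p render measured in this video; a quick illustration:

        # DLSS internal render resolution = output resolution x per-axis scale factor.
        scale = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}
        out_w, out_h = 3840, 2160  # 4K output
        for mode, s in scale.items():
            print(f"{mode}: {round(out_w * s)} x {round(out_h * s)}")
        # Quality -> 2560 x 1440, i.e. the same internal resolution as native 1440p.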

    • @tomaszeklolek8134 · 5 months ago

      @@starstreamgamer3704 Light games you can turn on at 4K max, but modern games at 4K not at max, just medium at native 4K.

  • @smwfreak1647 · 5 months ago

    I overclocked my 3600MHz CL16 sticks and tightened timings with the help of people from the overclocking server on Discord. Got it up to 3800MHz CL16 with GDM off at 1T; less latency and it feels amazing. I have basically the same CPU, but the 5600X, and it's overclocked.

    • @starstreamgamer3704 · 5 months ago

      Good result. May I ask what chips is your RAM based on? And what are the settings you chose for your CPU (frequency, voltage etc.)?

    • @razorgcy · 5 months ago

      which exact server is it :) I'd love to join

  • @Ogk10 · 5 months ago

    nice

  • @marcc5768 · 5 months ago

    You shouldn't look down on a CPU just because it does not support DDR3 RAM. My concern is power draw. I prefer v3 or v4 Xeons as long as the CPU TDP is under 115W. That lower-power 55W Xeon still has a use too, for example in a NAS build, since it would mean lower power draw for a system that could be running 24/7.

  • @KROLOLO · 5 months ago

    wtf 4070 ti isn't 504 GB/s

    • @starstreamgamer3704 · 5 months ago

      It is! Look at its tech specifications on any website (TechPowerUp, for example). Memory config: Memory Size - 12 GB, Memory Type - GDDR6X, Memory Bus - 192 bit, Bandwidth - 504.2 GB/s.
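
      The 504 GB/s figure follows directly from that memory configuration (a 192-bit bus with 21 Gbps GDDR6X); the arithmetic:

        # Bandwidth (GB/s) = bus width in bytes * effective data rate (Gbps).
        bus_bits = 192
        data_rate_gbps = 21                    # GDDR6X effective speed on the 4070 Ti
        print(bus_bits / 8 * data_rate_gbps)   # 504.0 GB/s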

  • @swede7581 · 5 months ago

    I have an i5 12400F, so what would usually be best when I game: DX11, DX12 or Vulkan? (I know Vulkan has seemed best for the games I have, but for some others it's DX11 or DX12, depending on the game.) Is there a rule for the majority of games as to which is mostly the best, DX11 or DX12, for my hardware? RTX 3060 12GB, i5 12400F, 16GB. It's far from the best, I know, but it can at least still handle most games... Thanks for this video.

  • @gadirigadiri2763 · 6 months ago

    Overclocking for 3 or 4 fps isn't worth it.

    • @starstreamgamer3704 · 6 months ago

      Why not? There is no significant increase in power consumption, temperatures or fan speed, so overclocking the memory (VRAM) is literally free performance for very low effort. Overclocking the GPU is a different story, though, and depends a lot on the type and size of the GPU cooler, room temperature, PC case quality, etc. I'd rather undervolt a GPU and overclock its memory. It is a perfect combination in terms of the performance/temperatures/noise ratio.

  • @azrans4644 · 6 months ago

    You didn't add frame generation; try it in the next test. It's impossible to enable at 1440p on a 4070 because there isn't enough VRAM.

    • @starstreamgamer3704 · 4 months ago

      Well, I performed another comparison, this time with Frame Gen enabled as well (both FSR 3 and DLSS 3). You can watch my latest video. In short: you can run this game on a 4070 Ti (or any other 12GB NV card) at 1440p maxed out with FG, if there are no active background apps that use additional VRAM. Cards with a 12GB buffer are right on the edge, to be honest, but they are still capable of running any game at 1440p without issues.

  • @filipe_smith · 6 months ago

    I have an Asus P7P55D-E with an i7 870 running at 3.6GHz, with 4x4GB 1600MHz CL9 and two HD 5770s. It runs well; I want to upgrade the GPU and cooler.

  • @_duduferrari · 6 months ago

    00:06 Back in the day, this motherboard was such a huge deal! Amazing how some LGA 775 and most LGA 1156 motherboards were well constructed like this one; nowadays there is barely more than one PCI-e x16 and an x1 slot, usually for Wi-Fi, and boards last 2 to 4 years instead of a decade and a half like that one!

  • @bogdanursache501 · 7 months ago

    Very, very good video, sir. Congrats! Subscribed

  • @gtx1650max-q · 8 months ago

    weirdly my laptop performs better on dx12 FPS wise but stuttering is still present in both the APIs

  • @minidesty4747 · 8 months ago

    me in 2024 watching your video very very nice thank you

  • @cyberjunk7258 · 9 months ago

    8:36 what are those stutters

  • @jaystheone1 · 9 months ago

    This is a real jungle. What is the best gaming Xeon CPU with dual-socket support?

  • @a_a_mhb5725 · 9 months ago

    Hi. I have a DH57DD motherboard; does it have the ability to regulate the IMC voltage?

    • @starstreamgamer3704 · 9 months ago

      Hi! Unfortunately, I do not have any experience with this specific board. You can check this yourself; the setting should have the same name.

  • @WinsWithDZ · 9 months ago

    I have an Xeon e5-2620 v3 and I need a cpu cooler that’s cheap help

    • @starstreamgamer3704 · 9 months ago

      This CPU has a TDP of 85W, so you do not need a massive cooler. Any tower-type model with 3 heat pipes should be sufficient. Just check the compatibility with LGA 2011-3.

    • @WinsWithDZ · 9 months ago

      @@starstreamgamer3704 thanks

    • @starstreamgamer3704 · 9 months ago

      No problem.

  • @agedcode · 9 months ago

    @Starstream Gamer, thank you for responding to every question posted under your very informative video. Let me add mine, too. I've got a Dell Precision T5810 with an E5-1620 v3. What faster CPU would you recommend, considering that I need single-core performance as a priority? I'm not a fan of high TDP, especially since it's working as a home server running my own three server software (they use 5 to 8 threads each).

    • @starstreamgamer3704 · 9 months ago

      You are welcome. I am glad my video was useful for you and others. If you need higher single-core performance, there is not much of a choice. The 1620 v3 has only 10MB of L3 cache, so you may want a CPU with a larger L3 cache, which might give a small boost. Other than that, v3 Xeons can only offer more cores. But you can look at v4 models; their single-core performance is a bit better due to slightly higher IPC. For example the 2637 v4, which boosts to 3.7GHz and also has 15MB of L3. Or the 2643 v4, which has 6 cores (with the same 3.7GHz boost), but also 20MB of L3. P. S. Make sure the Dell Precision T5810 supports v4 CPUs (you might need to upgrade the motherboard BIOS).

  • @etienne1062 · 9 months ago

    I think I read somewhere that DS is only active on the highest texture setting. So using High textures decreases VRAM usage both from using lower-quality textures and from disabling DS. It's really too bad that we are seeing no game take advantage of Sampler Feedback Streaming coupled with DS and GPU decompression. On paper it seems like the best-case scenario: high-fidelity assets, fast streaming, and low VRAM usage. It would make recent 8GB VRAM GPUs way more viable... Well, in theory, because for now we haven't seen a single game use it. It's a DX12 Ultimate feature, so that might explain why.

  • @teaspoontea · 10 months ago

    Currently I'm using an HP Z440 and cannot unlock turbo boost. I have an E5-2667 v4 and an E5-2699 v3, 32GB DDR4 2133 ECC + RTX 3050. Which one would you suggest to run racing games plus a motion platform simulator? Thanks.

    • @starstreamgamer3704 · 10 months ago

      99% of games cannot utilize more than 8 CPU cores. The 2667 v4 clocks at 3.5 GHz, while an unlocked 2699 v3 runs at 3.6 GHz. Taking into account the small IPC improvements in v4 CPUs, the 2667 should be a bit faster. But it has only 25 MB of L3 cache, while the 2699 v3 can boast 45 MB. Then again, such a big number of cores/threads might hurt the 2699 v3's gaming performance a little bit. So, in the end, it is a tough choice. To make things simple, I'd recommend the 2667 v4. This is an easy "drop and use" solution. With the 2699 v3 you would have to unlock turbo boost and disable about half of its cores (10 active cores should be enough, I suppose). This would also let the CPU hold its core clock at 3.6 GHz most of the time.

    • @teaspoontea · 10 months ago

      @@starstreamgamer3704 wow its a great explanation. Now I put 2667 on my Z440. Thank you very much!👍

    • @starstreamgamer3704 · 10 months ago

      I am glad I could help. Happy gaming and better winning!

  • @Superdazzu2 · 10 months ago

    My RTX 4070 Super Inno3D Twin OC pushes +1800 MHz on the VRAM, reaching 12300MHz. Sadly it's power limited, so I have to undervolt it to 0.950 V at a 2715 MHz core clock.

    • @starstreamgamer3704 · 10 months ago

      Nevertheless, good result. Though I think even 975 mV at ~2800 MHz would be enough to fit in 220W. At least my 4070 Ti can do it in most cases. 4070 Super is indeed power limited, unlike 4070 Ti. More expensive versions have extended power limits, but they cost almost as much as the Ti. Nvidia is a very calculating company.

  • @DanielRamirez-sk2ve · 10 months ago

    What’s your opinion on the 2680v4 ? Or is it only worth using v3 for the core boost?

    • @starstreamgamer3704 · 10 months ago

      Xeon e5 2680 v4 functions at 2900 MHz (all core boost), has 35 MB of L3 cache and a bit higher IPC, while Xeon e5 2680 v3 functions at ~3300 MHz (with unlocked turbo boost) and has 30 MB of L3 cache. So, I believe, overall gaming performance of these CPUs will be very similar. Since Xeon e5 2680 v4 has 2 more cores and higher RAM speed, it is a better CPU for productivity. So I would choose v4, if it costs just a bit more. You will not need to mess with unlocking turbo boost in this case, which is a plus.

    • @DanielRamirez-sk2ve · 10 months ago

      @@starstreamgamer3704 nice thanks I just built a pc with the v4 and a rx580 2048 and am really enjoying it. Just wanted to see if I missed a better option thanks again for getting back to me

    • @starstreamgamer3704 · 10 months ago

      You are welcome!

  • @MrSamadolfo · 10 months ago

    🙂 thanks for the educational video

  • @MarDrzewinski · 10 months ago

    Could you make a comparison of PCI-e 3.0 RAID 0 with two 3500 MB/s NVMe drives (in theory it removes support for Direct Storage, but will it give more speed?) vs a single PCI-e 3.0 NVMe drive using Direct Storage (in theory slower, but with less latency for the GPU)?

    • @starstreamgamer3704 · 10 months ago

      According to some independent sources, R&C: RA can utilize only about 1/3 of the full NVMe 3.0 bandwidth (about 1200 MB/s at peak). So even the slowest NVMe 3.0 drive is enough to realize the full potential of this game. There are a lot of comparisons between NVMe 3.0 and 4.0 on the web, but no one could find even a slight difference. The same goes for the PS5 version of this game - the slowest expandable NVMe 4.0 drive (3200 MB/s) matches the internal PS5 drive (5500 MB/s) perfectly.
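
      To put that "about 1/3" figure in perspective (assuming roughly 3.5 GB/s of real-world sequential throughput for a PCIe 3.0 x4 drive):

        # Rough headroom estimate: the game's observed peak streaming rate vs a Gen3 x4 SSD.
        pcie3_x4_gb_s = 3.5   # assumed real-world sequential read of a PCIe 3.0 x4 NVMe drive
        game_peak_gb_s = 1.2  # peak cited above for Ratchet & Clank: Rift Apart
        print(f"{game_peak_gb_s / pcie3_x4_gb_s:.0%} of the drive's bandwidth")  # ~34%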

  • @dvxAznxvb · 10 months ago

    In any heavy ray tracing without upscaling, 1440p marks the cap for 12GB of VRAM without spillover; otherwise you can mess with texture settings to get a stable frame time. Even 4K Quality is demanding on medium RT with regard to 12GB of VRAM. The 4070 Ti has the power but not the VRAM. It's an ideal gaming card for blended RT modes.

  • @brebatov4873 · 11 months ago

    My gpu VRAM 11.71 ?

  • @phonework2969 · 11 months ago

    Is the 1680 v4 good for gaming? The current price is 44 USD on AliExpress.

    • @starstreamgamer3704 · 11 months ago

      This CPU is a pretty decent value for gaming. Though, it depends on how much cheaper v3 alternatives are currently selling for.

  • @wizztiti1181 · 11 months ago

    Storage: Western Digital Blue WD Blue SN570 1TB NVMe SSD = PCI-e 3.0 !!! it would have been necessary to test with a WD-Black in pci-e 4.0. The test means nothing in the end.

    • @starstreamgamer3704 · 11 months ago

      You are wrong. Please show me where the game devs claimed a PCI-E 4.0 drive is required for DS to function correctly. Moreover, the official requirement for DS is as follows: an NVMe SSD (PCI-E 3.0 or higher).

    • @luisguerreiro8045 · 11 months ago

      afaik RTX IO hasn't really used more than 1200 MB/s in any test, so how could PCIe 4 possibly be obligatory? Not even the full 3.0 bandwidth is necessary...

    • @wizztiti1181 · 11 months ago

      @@luisguerreiro8045 What is your source? Where did you see that RTX I/O is limited to 1200 MB/s? :-o DirectStorage is made to increase bandwidth. So to say that it is limited to 150 MByte/s...

    • @luisguerreiro8045 · 11 months ago

      @@wizztiti1181 not really what I meant, I'm saying it hasn't gone to that level in any title yet, ratchet and clank for once hardly uses 200mb/s in those portals as you can see in the digital foundry review or simply by toggling a i/o graphic into your rivatuner and checking it yourself, at this point we only have ratchet and clank and forspoken running directstorage, I'm not saying directstorage's won't ever make full usage of the pcie4.0 bandwidth, I'm saying the games that did implement it on pc at the moment don't come anywhere close to even capping a pcie3.0 ssd, that will likely change in the future, but as of now it is what it is.

  • @kraithaywire · 11 months ago

    Incredible video!

  • @kraithaywire · 11 months ago

    Please do more of these videos! Your channel is so good!

  • @mohamedboudjerda7609 · 11 months ago

    E5 1650 V4: 15MB L3. E5 1680 V4: 20MB L3. These are the best.

  • @vladislavkaras491 · 11 months ago

    I wanted to know how much FPS boost would VRAM overclocking give and got the answer! Thanks for the video!

  • @MrStereo1987 · 1 year ago

    4070 ti still better

  • @starstreamgamer3704 · 1 year ago

    UPDATE: Nvidia finally revealed the RTX 4000 Super series cards. According to the official information, the 4070 Super gets just 36MB of L2 cache (not 48MB, like many leakers speculated). So the performance of this card will be slightly worse than I showed in this video (as the 4070 Ti has 48MB of L2). Meanwhile the 4070 Ti Super gets 48MB of L2 (and not 64MB, as was expected) and the 4080 Super gets the same 64MB as the vanilla 4080.

  • @manhandler · 1 year ago

    Skip all these CPUs and get an i7 6850K; it can run on LGA 2011-3 using ECC DDR4, it just can't utilize the ECC function. It will outperform all the other Xeons that run on LGA 2011-3. You're welcome.

    • @starstreamgamer3704 · 1 year ago

      It depends. The i7 6850K is not an outstanding value for the money these days. There is a cheaper option named Xeon E5 2643 v4. It is the same 6-core Broadwell CPU with a slightly lower core clock, but a bigger L3 cache, so they will perform roughly the same. But if you have a decent branded MoBo with overclocking capabilities, then the unlocked multiplier of the i7 6850K may be of use (even if its price is higher).

    • @JCMW-hw9jl · 1 year ago

      E5 1680v4 is about the same: more cores, more cache, but no overclock.

    • @cyberraybolt9205 · 5 months ago

      @@JCMW-hw9jl The E5 2680 has 12 cores.

  • @Livebasura69 · 1 year ago

    My 4070 non-Ti Inno3D X3 pushes +1800 = 12300 MHz (24600 MHz effective). I saw a video where a 4070 Ti Inno3D X3 pushed +2300 = 12800 MHz (25600 MHz effective) with TuneIT (the Inno3D program), because Afterburner can't do over a 2000 MHz overclock.

  • @1989rs500 · 1 year ago

    I think doing the same with a 4080 to emulate a 4070 Ti Super would be a great idea too. Contrary to the majority of people who are not interested in the 4070 Ti Super, I am one who thinks the 4070 Ti Super will be the best card for 1440p till 208. I feel 16 GB is the optimum for an Nvidia GPU costing around 900 USD, which is what the 4070 Ti Super is going to be priced at. It's still expensive when compared to the 7900 XT/XTX, but I am a sucker for RT and DLSS, so I want the best fps with RT+DLSS at 1440p, and I feel the 4070 Ti Super has the best shot at this.

    • @starstreamgamer3704 · 1 year ago

      Yes, it is pretty easy to make a similar comparison between the real 4080 and an emulated 4070 Ti Super, and also an emulated 4080 Super, which will be, at best, 6-8% faster than the base 4080, so you just need to overclock the 4080 (cores by 6% and memory by 3%).

      As for the 4070 Ti Super, its theoretical performance will be 44.10 TFLOPS, which is 10% more than the 4070 Ti (with its 40.09 TFLOPS) and 9.5% less than the 4080 (with its 48.74 TFLOPS). So the 4070 Ti Super is right in the middle between the 4070 Ti and the 4080. But its memory bandwidth is 672 GB/s, which is much closer to the 4080 (just 6% less), or 33% more than the 4070 Ti. Judging by one of my previous videos (ruclips.net/video/p6-axcnHFVk/видео.htmlsi=Tz-I_VBQ2Rvhcbmf), 17% of additional bandwidth provides just 5-6% more frames for the 4070 Ti. This means that 33% more bandwidth will provide 10-12% more frames at best, since the rule of diminishing returns applies here. Judging by the 3070 Ti, which has 36% more memory bandwidth but is just 7-8% faster than the vanilla 3070 (partly due to 4.5% more cores), 10-12% is a bit too optimistic, but since all these cards use different chips, we cannot say for sure.

      Taking into account the fact that the 4080 is ~20% faster than the 4070 Ti at 1080p, ~25% faster at 1440p and ~30% faster at 4K, we can assume that the 4070 Ti Super will be roughly 8-12% faster than the 4070 Ti at 1080p, 13-17% at 1440p and 18-22% at 4K (in the best case). This makes sense, as the 7900 XT (depending on resolution) is usually 5-15% faster than the 4070 Ti (in raster), while, according to Nvidia's plan, the Ti Super should outperform this card at any resolution (in most games), just like the 4070 Super and 4080 Super should outperform the 7800 XT and 7900 XTX correspondingly.

      P. S. For now, the 4070 Ti does not have any issues related to its 12GB buffer at 1440p. For 4K it is a bit too slow to play at native res and max settings, if we are talking about modern demanding games. But 4K + DLSS at High settings is doable, since it decreases the demands on both the chip itself and the VRAM. The 4070 Ti Super will become a solid lower-tier 4K card, just like the 7900 XT, thanks to its much higher memory bandwidth and more powerful chip, and taking its features into account it will definitely be the more appealing one. It will be a decent starting option for owners of 4K screens, especially if the 4080 Super is just 12-15% faster and 20-25% more expensive.
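
      The arithmetic behind those percentages, spelled out with the public spec figures quoted above (the 4080's 716.8 GB/s is its listed bandwidth):

        # Relative spec deltas vs the 4070 Ti, using the TFLOPS and bandwidth figures above.
        cards = {
            "4070 Ti":       {"tflops": 40.09, "bw": 504.2},
            "4070 Ti Super": {"tflops": 44.10, "bw": 672.0},
            "4080":          {"tflops": 48.74, "bw": 716.8},
        }
        base = cards["4070 Ti"]
        for name, c in cards.items():
            print(f"{name}: {c['tflops'] / base['tflops'] - 1:+.1%} compute, "
                  f"{c['bw'] / base['bw'] - 1:+.1%} bandwidth")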

    • @1989rs500 · 1 year ago

      @@starstreamgamer3704 Exactly. I've got a Gigabyte M27W monitor and it's 2K only, but I've found issues with The Last of Us Part I and Lords of the Fallen where VRAM spill happened. After the latest update the issue with The Last of Us is gone, but I see textures popping in and out at long distance, which I feel is a measure the programmers took to minimize GPU memory use. The same happened with Hogwarts Legacy. So I see it as a top-tier 1440p ray tracing GPU or a low-tier 4K GPU. At the pricing they are pushing, that is what a 4070 Ti Super should be.

    • @starstreamgamer3704 · 1 year ago

      @@1989rs500 At $799 the Ti Super would be a good deal (in the current market). But Nvidia will probably price it at $850-$900, which is still a much better deal than the $1200 4080. But time passes and tech becomes cheaper; that is the usual situation. In a year we will get the 5000 series, which will offer better performance for the price. The 5070 will likely be priced at $600-$700, will have 16GB of VRAM and will be close to the 4080 Super in terms of performance. P. S. TLoU needs only 9.3GB at 1440p/Ultra now, after it has been patched. Maybe there is still some kind of a bug related to textures. HL is still a buggy game, same as SWJS, Forspoken, Dead Space Remake etc. Unfortunately, most PC ports of 2023 were broken and many of them are still in pretty bad shape.

    • @1989rs500 · 1 year ago

      @@starstreamgamer3704 The 5000 series, as per the news, will not be a significant upgrade over the 4000 series. The 5090 will be a significant upgrade, but the rest of it will be marginal, say 10-20% over the current 60/70/80 series. I for one would keep the 4070 Ti Super till the 5xxx series goes out of production.