Unreal Engine 5 Games are HERE!!! Every Current Gen Nvidia, AMD and Intel Arc GPU tested and MORE!

  • Published: 2 Jul 2024
  • So many new games will be using Unreal Engine 5. Layers of Fear is the first third party game to be fully released using this engine, and it makes heavy use of the new Lumen lighting system. In this video I test the relative performance of the 4090, 4080, 4070 Ti, 4070, 4060 Ti 8GB, 7900 XTX, 7900 XT, 6950 XT, 6800 XT, 6700 XT, 7600, Arc A750, and Arc A770.
    Test system specs (ResizeBAR/SAM ON):
    CPU: Ryzen 7800X3D amzn.to/3Hkf7Qi
    Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
    Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
    RAM: 32GB DDR5 6000 CL30: amzn.to/41XRtkM
    SSD: Samsung 980 Pro amzn.to/3BfkKds
    Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
    PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
    Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
    Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
    Mouse: Logitech G305 amzn.to/3gDyfPh
    What equipment do I use to make my videos?
    Camera: Sony a6100 amzn.to/3wmDtR9
    Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
    Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
    PC Capture Card: amzn.to/3jwBjxF
    Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
    Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
    Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
    Greenscreen: Emart Collapsable amzn.to/3AGjQXx
    Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
    RGB Strip Backlight on desk: amzn.to/2ZceAwC
    Sponsor my channel monthly by clicking the "Join" button:
    / @danielowentech
    Donate directly to the channel via PayPal:
    www.paypal.com/donate?hosted_...
    Disclaimer: I may earn money on qualifying purchases through affiliate links above.
    Chapters:
    0:00 Intro
    0:20 Why is Unreal Engine 5 a big deal?
    0:42 What do I mean by "first 3rd party UE5 game"?
    1:26 Layers of Fear as a UE5 test may not reflect larger games
    2:30 Ray Tracing on vs off with Lumen explained
    3:21 This game features DLSS, FSR2, XeSS, and TSR upscaling
    3:42 "High" is the highest graphics setting in this game
    4:22 Ray tracing on vs off RTX 4090 4K
    4:45 7900 XTX vs RTX 4080 4K High
    5:09 7900 XTX vs RTX 4080 4K High RT On
    5:33 7900 XTX vs RTX 4080 1440p High
    5:58 7900 XTX vs RTX 4080 1440p High RT On
    6:19 7900 XT vs 4070 Ti 4K High
    6:44 7900 XT vs 4070 Ti 4K High RT On
    7:07 7900 XT vs 4070 Ti 1440p High
    7:30 7900 XT vs 4070 Ti 1440p High RT On
    7:54 6800 XT vs 6950 XT vs 4070 4K High
    8:18 6800 XT vs 6950 XT vs 4070 4K High RT On
    8:41 6800 XT vs 6950 XT vs 4070 1440p High
    9:05 6800 XT vs 6950 XT vs 4070 1440p High RT On
    9:28 6800 XT vs 6950 XT vs 4070 1080p High
    9:53 6800 XT vs 6950 XT vs 4070 1080p High RT On
    10:15 6700 XT vs 4060 Ti 8GB vs Arc A770 1440p High
    10:40 6700 XT vs 4060 Ti 8GB vs Arc A770 1440p High RT On
    11:02 6700 XT vs 4060 Ti 8GB vs Arc A770 1080p High
    11:25 6700 XT vs 4060 Ti 8GB vs Arc A770 1080p High RT On
    11:48 Arc A750 vs RX 7600 1080p High
    12:12 Arc A750 vs RX 7600 1080p High RT On
    12:37 Arc A750 vs RX 7600 1440p High
    13:01 Arc A750 vs RX 7600 1440p Medium
    13:24 Final Thoughts
  • Science

Comments • 584

  • @leonelmessi3010
    @leonelmessi3010 Год назад +199

    The 6700 XT seems to be performing really well for its price range.

    • @syncmonism
      @syncmonism Год назад +31

      This was to be expected based on everything else I've ever seen on this channel, but I guess you were spending too much time playing football (soccer) and winning major tournaments to know about that XD

    • @MAarshall
      @MAarshall Год назад +5

      @@syncmonism lol

    • @newyorktechworld6492
      @newyorktechworld6492 Год назад +3

      6700 non xt is a good value also.

    • @christophermullins7163
      @christophermullins7163 Год назад +5

      I see many on the local market asking $250 to $300. 6800 xts are holding as much value as 3080s because of that extra vram.

    • @TTx04xCOBRA
      @TTx04xCOBRA Год назад +2

      AMD sucks!

  • @horatimetalero
    @horatimetalero Год назад +7

    Amazing analysis, as always Daniel! So much work put in every video and that makes your channel one of the best tech channels out there, especially gpu related. Cheers from Argentina!

  • @CaveyMoth
    @CaveyMoth Год назад +6

    You are a hero, giving us all of these benchmarks. I wish there were some kind of benchmarking robot that could be built, which automatically plugs graphics cards in, manages drivers, and benchmarks the games.

  • @jaredangell5017
    @jaredangell5017 Год назад +6

    Daniel we really appreciate your work!

  • @denisruskin348
    @denisruskin348 Год назад +19

    Played for 2 hours yesterday. Yeah, the quality of assets is mesmerising and Lumen is just such a massive boost in lighting quality.

  • @Ivan-pr7ku
    @Ivan-pr7ku Год назад +142

    I would argue that Nanite is by far the premier feature in UE5. Lighting can be approximated effectively in many ways, but proper handling of dense streaming geometry with seamless LOD is a true achievement. Epic have worked on this for many years, and Nanite is essentially a software (compute-based) rasterizer, completely supplanting the hardware implementation of that pipeline stage that has been with us ever since GPUs took over geometry processing. Nanite only became practical once most of the GPU market was fully capable of async compute, and Epic probably had to delay this key UE5 feature a bit until the market was ready. Some day there could be a hardware implementation of Nanite, who knows. Blackwell is rumored to have dedicated de-noising logic for a cleaner RT image, the same way Ada Lovelace moved Optical Flow from a compute-based API library into hardware.
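    Very loosely, the seamless-LOD side of that can be pictured as a hierarchy of triangle clusters that the renderer keeps refining until a cluster's simplification error would cover less than about a pixel on screen. A toy CPU-side sketch of that selection idea (the class, the field names and the 1-pixel threshold are illustrative assumptions, not Epic's actual code):
    ```python
    import math
    from dataclasses import dataclass, field

    @dataclass
    class Cluster:
        """A small batch of triangles plus the geometric error its simplification introduced."""
        error: float                # world-space error of this LOD level
        center_distance: float      # distance from the camera to the cluster's center
        children: list = field(default_factory=list)  # finer, higher-detail child clusters

    def projected_error_px(cluster, fov_y_rad, screen_height_px):
        """Convert the cluster's world-space error into an approximate on-screen size in pixels."""
        dist = max(cluster.center_distance, 0.01)
        pixels_per_unit = screen_height_px / (2.0 * dist * math.tan(fov_y_rad / 2.0))
        return cluster.error * pixels_per_unit

    def select_clusters(cluster, fov_y_rad, screen_height_px, threshold_px=1.0, out=None):
        """Walk the hierarchy, keeping the coarsest clusters whose error stays under ~1 pixel."""
        if out is None:
            out = []
        coarse_enough = projected_error_px(cluster, fov_y_rad, screen_height_px) <= threshold_px
        if coarse_enough or not cluster.children:
            out.append(cluster)     # good enough (or already a leaf): rasterize this cluster
        else:
            for child in cluster.children:
                select_clusters(child, fov_y_rad, screen_height_px, threshold_px, out)
        return out
    ```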

    • @Wobbothe3rd
      @Wobbothe3rd Год назад +14

      You're correct, but Lumen is also an EXTREMELY big achievement in itself even in its software form. Real time GI is equally as mind blowing as infinite detail.

    • @gavinderulo12
      @gavinderulo12 Год назад +2

      @@Wobbothe3rd I think his point is that we were already able to achieve similar GI quality to Lumen before, though it took a lot more dev time and effort, while Nanite truly brings unseen possibilities.

    • @Wobbothe3rd
      @Wobbothe3rd Год назад +5

      @@gavinderulo12 Yeah, well, I would strongly disagree that anything even CLOSE to Lumen was achieved by literally any other technology besides Quake 2 RTX in 2019. And even Nvidia's path-traced lighting was not as performant and clear at first (it's arguably better now). I would definitely argue that real-time GI is a bigger Holy Grail than virtualized geometry.

    • @raresmacovei8382
      @raresmacovei8382 Год назад +3

      No. Lighting is way harder and more important than dense geometry. Devs are barely using tessellation, lol.

    • @NevermoreZach
      @NevermoreZach Год назад +2

      @@Wobbothe3rd Realtime dynamic GI existed as far back as The Last of Us on PS3. Arkham Knight is completely lit using realtime GI via LPVs; LPVs were a common technique for realtime dynamic GI many, many years ago, and a tonne of games have used them. CryEngine used to have various implementations of it too. Enlighten was another realtime dynamic GI solution for UE and Unity users. Lumen traces SDFs, which again isn't a new idea. The challenge is to make it performant, and RT BVHs are super tricky to get right. Even today Lumen isn't "performant"; even after all our tests it's nowhere near it. Even in Fortnite, Lumen alone is rendered at half screen res and upscaled, and it is literally non-existent on the lower quality modes.

  • @grtitann7425
    @grtitann7425 Год назад +1

    Wow, what a crazy change compared to the last video I saw.
    You must be the first reviewer that has properly labeled all GPUs and provided a really fair comparison.
    Well done.

  • @janbenes3165
    @janbenes3165 Год назад +163

    It's almost like AMD's RT is not that bad if the software is not just Nvidia's optimized implementation made to run on AMD's hardware.
    Edit: I'm not saying AMD is some RT powerhouse, just that it is usable if the implementation is right.

    • @gavinderulo12
      @gavinderulo12 Год назад +36

      Actually, I'd argue that it simply isn't properly utilizing all of the features of nvidia GPUs. For example, I doubt they are using shader execution reordering, or neural radiance caching.

    • @2intheampm512
      @2intheampm512 Год назад +7

      @@gavinderulo12 Neural radiance caching doesn't have any software implementation quite yet (at least not one that devs could utilize). But you're right, SER is definitely not being used here.

    • @gavinderulo12
      @gavinderulo12 Год назад +4

      @@2intheampm512 there are rumors that it might be used in the next cyberpunk patch.

    • @2intheampm512
      @2intheampm512 Год назад +16

      The 7000 series is fairly good at RT, but it's still basically an entire generational gap behind Lovelace. Devs really haven't optimized properly for the new features implemented in RTX 4000, except for CDPR in 2077 path-tracing mode. It's not apples to apples since there's basically no optimization for RX 7000 there, but even if there was I can guarantee you there's absolutely no way it'd get even close to Lovelace

    • @__-fi6xg
      @__-fi6xg Год назад +8

      @@gavinderulo12 Makes sense, it's heavily sponsored by Nvidia after all.

  • @Plague_Doc22
    @Plague_Doc22 Год назад +1

    I gotta say, I really like your ability to pick topics for videos. I didn't even realize I wasn't subbed, I just got recommended every single video cuz I watch them so often lol.

  • @listerinekiller
    @listerinekiller Год назад

    insane video :) thanks as always for giving an in depth analysis for all variables

  • @Ladioz
    @Ladioz Год назад +3

    Daniel this probably took an entire day to do. Thanks for all the content man...

  • @Workaholic42
    @Workaholic42 Год назад

    Thanks for all that work, Daniel!

  • @s-nooze
    @s-nooze 11 месяцев назад

    Thanks for doing this. I am really excited for the Satisfactory update in particular.

  • @professorclup1082
    @professorclup1082 Год назад +1

    Thanks for the benchmark Daniel

  • @AlyxSharkBite-2000
    @AlyxSharkBite-2000 Год назад +5

    This was a really good look at UE5 on the current gpus! Ty!

    • @garyb7193
      @garyb7193 Год назад

      True. So much value and performance is still in last-gen cards. This is why current-gen sales are down: they aren't bringing much new or perceived value to incentivize an upgrade.
      For me, the 6800 XT is the value/performance 'sweet spot' for 1440p gaming. I love its pairing with the 5800X3D. I will skip this current gen and ride this pair out.

  • @LlywellynOBrien
    @LlywellynOBrien Год назад

    Thanks man, a really interesting video with useful comparisons.

  • @Quizack
    @Quizack Год назад

    This man never sleeps. Really appreciate your work, Dan! Thanks mate.

  • @MartyMcFlyTop1
    @MartyMcFlyTop1 Год назад +1

    as always - perfect test

  • @nm4520
    @nm4520 Год назад

    Thanks for the video!

  • @HMClagi
    @HMClagi Год назад

    Great vid! Lots of hard work, I bet, and still on a coffee rush? Calm down, dude! HAhaha. My compliments for the video.

  • @AbbasDalal1000
    @AbbasDalal1000 Год назад +2

    some sick work!!!

    • @Jason_Bover9000
      @Jason_Bover9000 Год назад

      The first actual UE5 game, and it's very well optimized even on a 3060 Ti - the demo, that is. I heard it's even better now.

  • @LeBurkaTron
    @LeBurkaTron Год назад

    Great work. thanks.

  • @Kules3
    @Kules3 Год назад

    Thank you for vid!

  • @smallletterdee
    @smallletterdee Год назад +77

    I'm a big UE5 skeptic, but Lumen is starting to feel like a big game changer with each title that implements it. Exciting stuff!

    • @garyb7193
      @garyb7193 Год назад +34

      Like it or not, UE5 will be the dominant development platform for the foreseeable future. I believe there will be growing pains with it because of its learning curve and hardware requirements. The UE5 engine will carry us well into 2030 and later. Much like an oversized pair of shoes, Epic created this for devs to grow into over the years. Each new GPU generation and game iteration will perform better and better. However, the first batch of games will be kinda rocky. It IS exciting stuff!

    • @gavinderulo12
      @gavinderulo12 Год назад +9

      @@garyb7193 I find it funny how Fortnite was the first Unreal 4 game and now it's the first Unreal 5 game.

    • @JeryLawl3318
      @JeryLawl3318 Год назад +4

      @@gavinderulo12 Because it is the most popular thing.

    • @garyb7193
      @garyb7193 Год назад +28

      @@gavinderulo12 Why? Fortnite was created by the same company that created UE4 and UE5.
      Of course, Epic Games would test it first with one of their own.

    • @ThunderingRoar
      @ThunderingRoar Год назад +17

      @@gavinderulo12 It wasn't the first UE4 game; Fortnite came out around 2017, and UE4 was a thing in 2014.

  • @spudnic5849
    @spudnic5849 Год назад +9

    A game called "The Isle" just got a beta for UE5 and FPS has literally doubled across the board. It's insane how much of a difference it makes compared to UE4!

  • @kennadod2080
    @kennadod2080 Год назад

    Good content. Happy with my 6750 XT for the next few years.

  • @Chef_-xv7ms
    @Chef_-xv7ms Год назад

    Very interesting video. Thank you.

  • @marley_sr
    @marley_sr Год назад +2

    I play Satisfactory, and the experimental version was recently updated with Lumen. Man, that shit shines! It's absolutely amazing. Even my 6700XT can handle it well. I just can't turn it off anymore. DX12 and Vulkan.

  • @n78966969696896
    @n78966969696896 Год назад +8

    Curious why there is no 3080 or 3060 Ti comparison? Not hating, but many would definitely like to see how they stack up against last-gen hardware from Nvidia.

  • @RatedR3030
    @RatedR3030 Год назад +3

    Would like to see GPU performance on Nvidia 60-class cards, like the 50/60-series SKUs, since they are more common among gamers according to the Steam charts (and in my case too). Most people are still rocking 1050 Tis, 1660s, 2060s, 3060s, etc., since realistically most people don't have more than $200-400 to dump into a single PC component, when SSDs/CPUs/motherboards/power supplies/RAM are still all priced high enough that a decent system puts people in the $1000+ range just for medium+ details at 1080p.

  • @Emma_Z1
    @Emma_Z1 Год назад

    I got a 6800 XT recently, so I am glad it will perform well in new games.

  • @mnedix
    @mnedix Год назад +1

    I really appreciate the work you're putting in @Daniel, it really helps us (the regular GPU users) get an idea of the real-life situation. I'm still sitting on a stone-age GPU (GTX 970) that I wanted to replace right about when the crypto boom started, but there was no chance due to prices. Now that the market has recovered, for almost a year I've been internally debating which GPU I should choose. Of course there is the performance factor I'm interested in, but that is more at the new-toy level. I actually barely have a couple of hours to play on weekdays and probably about double that on weekends. What I've become more and more interested in are the "soft" stats like dissipated heat, wattage and noise level, both idle and under load.
    If it's not too much to ask, could you please also add the manufacturer and model of the GPUs you are testing? I would be interested in how many fans they have and their size. Thanks.

  • @malec8517
    @malec8517 Год назад

    The hero we needed

  • @relu84
    @relu84 Год назад

    This channel is criminally undersubscribed!

  • @HardyDimension
    @HardyDimension Год назад +4

    This is really good news for all GPUs, especially AMD GPUs.
    They no longer need to lean on dedicated ray tracing hardware yet can still enjoy the effect, and it's even better.

  • @CptBlaueWolke
    @CptBlaueWolke Год назад

    1:47 Funny, because the main purpose of the Nanite system is exactly this: closed interior spaces with few moving objects. Epic might have improved it by now, but that was the main intention for the first release.

  • @techsamurai11
    @techsamurai11 Год назад

    Great conclusion about using RT in moderation to improve the quality without dramatically affecting performance. I think Insomniac has proven the best to implement RT in their titles while maintaining multiple modes on consoles. Lumen will hopefully democratize Ray Tracing and RTGI for any game using it.

  • @iancurrie8844
    @iancurrie8844 Год назад +4

    Thanks for the coverage of the 7900XT and 6700XT! Great video!

  • @ranarambo2529
    @ranarambo2529 Год назад +6

    Love your videos keep it up. Thanks for this comparison, I’m now definitely getting the RTX 4070.

    • @megamanx1291
      @megamanx1291 Год назад +1

      Please don't, that VRAM will mess you up!!

    • @LagiohX3
      @LagiohX3 Год назад

      lol 4060 16gb seems like a better investment if u wanna stick with nvidia

    • @03chrisv
      @03chrisv Год назад +1

      Don't listen to the other two, the 4070 is a great card.

  • @Slane583
    @Slane583 Год назад

    When Epic Games released the tech demo for UE5 demonstrating how it works and all of the layers involved I thought it was the coolest thing. Getting so much detail into environment design as well as the lighting all while doing it in real-time. Such detail was only ever done in CG based films and even then it took several hours or days to pre-render such a scene. So it is understandable as to why some people consider it a hoax of sorts. But in reality it is very real. I for one welcome any future games built upon it. It will be nice to see how far the devs can take it. :)

  • @laszlodajka5946
    @laszlodajka5946 Год назад

    Glad to see that there is a new solution for RT where AMD GPUs can also give smooth performance. Hopefully this will be a trend, instead of the unnecessarily taxing scenarios where you need to give up on native resolution.

  • @jeremylindemann5117
    @jeremylindemann5117 Год назад

    I think it would be helpful if GPU performance monitoring tools were improved to have individual utilisation levels for different work areas within the GPU. For example, there could be a utilisation metric for how many CUDA cores are in use or what percentage of their power is being utilised. The same would be useful for ray tracing (RT cores), so they could output how much of the ray tracing hardware is being utilised. There are probably several other types of workloads going on within a single GPU that could report back individual usage levels.
    This would help give us more reliable numbers instead of much less reliable estimates based on personal individual judgement.
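    As a point of reference, today's public monitoring APIs mostly expose one coarse "GPU busy" percentage rather than per-unit (shader/RT/tensor core) breakdowns. A small sketch of what is queryable right now, assuming the nvidia-ml-py (pynvml) package and an Nvidia GPU:
    ```python
    # pip install nvidia-ml-py  (provides the pynvml module); Nvidia GPUs only
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU in the system

    util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # coarse, whole-chip percentages
    power_mw = pynvml.nvmlDeviceGetPowerUsage(handle)    # current board power, in milliwatts
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

    print(f"GPU busy: {util.gpu}%  memory controller busy: {util.memory}%")
    print(f"Board power: {power_mw / 1000:.1f} W")
    print(f"VRAM used: {mem.used / 2**20:.0f} MiB of {mem.total / 2**20:.0f} MiB")

    pynvml.nvmlShutdown()
    ```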

  • @mr.deadpool2350
    @mr.deadpool2350 Год назад +1

    Looking at this benchmark and seeing the 6700 XT go sub-60 at times at 1440p: I have the 6750 XT - would it last 3-4 years at high/medium settings with tweaks at 1440p in newer games?

  • @sleeplessindefatigable6385
    @sleeplessindefatigable6385 Год назад +1

    The interesting data here is that RDNA2 seems to see a much smaller performance penalty than it does in most RT scenarios, and across all cards tested, RDNA2 seems to actually only lose a little bit more performance for RT compared to Lovelace. Together with the Fortnite results, it seems Lumen taxes hardware weirdly equally.

  • @soraaoixxthebluesky
    @soraaoixxthebluesky Год назад

    Wish you had also included Ampere in this comparison, the same way you included RDNA2.

  • @anarchicnerd666
    @anarchicnerd666 Год назад +20

    WOW! This is awesome data to get out in the open Daniel, thanks! Really hoping my RX 6650 XT isn't going to be completely crushed...
    EDIT : The RX 7600 is roughly comparable to the RX 6650 XT, and it definitely seems like what I expected - fine at 1080p, at 1440p drop settings or use FSR. Phew, glad for that. I do wonder how things will evolve moving forward tho, Layers Of Fear is a very small and contained game that perhaps doesn't stress the engine too hard. Immortals Of Aveum is the first big test for scalability on 8gb GPUs, and we'll have to see how things evolve from there.

    • @Movierecap998
      @Movierecap998 Год назад +1

      I am planning to buy a 6800 XT with a 5700X for streaming, casual gaming and video editing, with some coding and AI learning. Is it good, or can you recommend me something better?

    • @anarchicnerd666
      @anarchicnerd666 Год назад

      @@Movierecap998 Hmmm. 5700X is equivalent to an i7 12700k, so it's a beast and should do the job. The 6800 XT though? For pure gaming and video editing AMD's a great choice, the workflow you described though might be better suited to Nvidia. Make sure you check the programmes you use and see how much they benefit from CUDA acceleration. At that price point though you're effectively sacrificing VRAM for acceleration on the tasks you do. Tricky. I'd say go for the 6800 XT if you're prioritising gaming and video editing, get a 4070 non ti for $100 dollars more if you're prioritising your professional workload.

    • @Bleckyyyy
      @Bleckyyyy Год назад +1

      @@anarchicnerd666 So basically you want him to buy a crappy expensive latest generation 8gb graphic card. Talk about advice xD Gonna be outdated when it comes to gaming this winter.

    • @anarchicnerd666
      @anarchicnerd666 Год назад

      @@Bleckyyyy Hey, if you've got better advice, go for it :) He did mention coding and AI workflows though, which can benefit from CUDA acceleration on Nvidia. Don't get me wrong, I smell a 4070 Super refresh next year maybe XD Ngreedia suck, but they just are plain better for some tasks

    • @anarchicnerd666
      @anarchicnerd666 Год назад +1

      @@Bleckyyyy Also the 4070's a 12gb card :P not an 8gb one

  • @brendanhoffmann8402
    @brendanhoffmann8402 Год назад

    I just got Resident Evil Village... loving the graphics. Watch Dogs Legion is pretty good too. Finally playing next-gen games lately on my 6700 XT.

  • @pascaldifolco4611
    @pascaldifolco4611 Год назад +2

    What's cool with UE5/Nanite is that the RT isn't CUDA dependent so AMD cards still rock ^^

  • @Shammikaze
    @Shammikaze Год назад

    Why do the graphics and lighting look so different between the two cards? For example, at 5:50 you can clearly see more "glow" from the light entering through the ceiling on the 7900 than on the 4080. Is that a graphical setting, or a hardware limitation? It seems like a substantial difference...

  • @WinterSnowism
    @WinterSnowism Год назад +7

    Amazing RAM & VRAM usage, even at 4K with RT on.

    • @gavinderulo12
      @gavinderulo12 Год назад +4

      There is nothing on screen lol

    • @PhilosophicalSock
      @PhilosophicalSock Год назад

      @@gavinderulo12 He probably meant exactly that. Isn't it amazing to spend 6GB of VRAM to hold a couple of textures and several objects in an empty corridor?

    • @gavinderulo12
      @gavinderulo12 Год назад

      @@PhilosophicalSock It's likely just allocating that and not actually using all of it.

    • @Wobbothe3rd
      @Wobbothe3rd Год назад +1

      Exactly, UE5 is very memory efficient. 8gb cards will be just fine for this entire generation.

    • @PhilosophicalSock
      @PhilosophicalSock Год назад

      @@gavinderulo12 Yeah, most likely. Immortals of Aveum does not seem to require a lot of VRAM despite all the graphics complexity. I guess we are safe.
      Also, it will be interesting to see how Nvidia's NTC performs in reality.

  • @TarnishedRyry
    @TarnishedRyry Год назад

    How's the stutter? It was there in the demo. Still waiting on the version of Unreal Engine that supposedly eliminates the shader stutter.

  • @MsHehehe007
    @MsHehehe007 Год назад +1

    May I ask why the 3080 Ti seldom appears in reviews nowadays?

  • @stefanrenn-jones9452
    @stefanrenn-jones9452 Год назад

    I play Fortnite with Lumen and Nanite set to Epic at 4K, with some other shadow settings turned down, on my 7900 XTX; it holds above 70 FPS. Great performance. But other, more geometrically dense UE5 games may not be so forgiving.

  • @onomatopoeia162003
    @onomatopoeia162003 Год назад +1

    Satisfactory update 8. It's in the Experimental version of the game :)

  • @kruze187
    @kruze187 Год назад

    You forgot to mention DLSS 3, which is only available for the 40 series and is being implemented into UE 5.2.

  • @serbelike
    @serbelike 5 месяцев назад

    The 4070 Ti's performance is incredible even though it has 8GB less than the 7900 XT. In love with this GPU 💚

  • @anthonytroila4907
    @anthonytroila4907 Год назад +5

    The lighting in this game is next level - the lantern actually looks like it's reflecting off the walls, like the real deal. I honestly thought it was real lighting. It's amazing how the performance stays stable with all these gimmicks. This is the big step I was expecting for a next-gen experience.

    • @sirbughunter
      @sirbughunter Год назад

      Agreed. The lighting is incredibly realistic in this game. I played the demo with my 7900 XTX and I got over 100 FPS 🙏

  • @imAgentR
    @imAgentR Год назад

    Does this mean more shader compilation stutters for the next 5-6 years? My GPU has to be upgraded just for that.

  • @AR-ey1ur
    @AR-ey1ur Год назад

    Did you notice any difference in image quality with the new lighting systems, though? Is Lumen, RT, etc. even worth it?

  • @SenorSwagBuns
    @SenorSwagBuns Год назад +6

    This generation feels so wrong when I want to play at 4k 120hz.

  • @XieRH1988
    @XieRH1988 Год назад +3

    Looking forward to seeing what Respawn will do in UE5 for their Jedi Survivor sequel. I'm sure they'll find ways with Lumen to make the FPS go into the single digits if your lightsaber so much as illuminates more than 10 surfaces on screen.

  • @t3amb4sh
    @t3amb4sh Год назад +1

    What impresses me the most is that in this game, at 1440p, with an RTX 4080, I get a constant 99% GPU usage no matter what and a VERY smooth frametime graph (using an i7-13700K). No ups and downs, no nothing.
    I hope UE5, with all of its features, takes advantage of what the hardware can do.

  • @georgebessinski2122
    @georgebessinski2122 Год назад +8

    The game doesn't have many really reflective surfaces, so hardware RT doesn't use that many resources compared to software Lumen. If it had something like lots of mirrors, then it would be completely different. Although software Lumen reflections can't render mirror reflections properly at all, so it makes sense. As for shadows, I don't see much difference between software and hardware when playing around with UE5. At least not enough to warrant losing performance.

    • @2intheampm512
      @2intheampm512 Год назад

      From what I understand, shadows are by far the least computationally expensive aspect of lighting to ray trace, right? I'm assuming that based on the fact that pretty much all PS5/XSX games with RT use RT shadows as the baseline.

    • @gavinderulo12
      @gavinderulo12 Год назад +1

      @@2intheampm512 And you can already get really nice soft shadows basically for free by tracing against signed distance fields, so I'm not sure how much of an improvement the hardware version can possibly yield.

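      For context on the reply above, the usual trick is that while marching a shadow ray through a signed distance field, the closest near-miss relative to how far the ray has travelled gives a cheap penumbra estimate. A rough CPU-side sketch; the scene_sdf callable and the softness constant k are placeholders, not engine code:
      ```python
      def soft_shadow(scene_sdf, origin, light_dir, t_min=0.02, t_max=50.0, k=8.0):
          """March toward the light; the closest near-miss, scaled by distance, gives a penumbra factor.
          Returns 0.0 (fully shadowed) .. 1.0 (fully lit)."""
          shadow = 1.0
          t = t_min
          while t < t_max:
              p = [origin[i] + light_dir[i] * t for i in range(3)]  # point along the shadow ray
              d = scene_sdf(p)                                      # distance to the nearest surface
              if d < 1e-4:
                  return 0.0                                        # the ray hit geometry: fully shadowed
              shadow = min(shadow, k * d / t)                       # near-miss -> partial shadow (penumbra)
              t += d                                                # sphere-trace step
          return shadow
      ```
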
    • @2intheampm512
      @2intheampm512 Год назад

      @@gavinderulo12 Thanks appreciate the explanation 👍

  • @Cygnus-Phi
    @Cygnus-Phi Год назад +5

    Thanks, this cements my decision for me: 4070 will be just fine.

    • @ronhaworth5808
      @ronhaworth5808 Год назад

      If only NVidia would drop the price to $499 it might be fine for me too.

    • @Cygnus-Phi
      @Cygnus-Phi Год назад

      @@ronhaworth5808 Oh yeah, I'm still waiting for the 7700 and 7800 launch, hoping that will drag the prices down a bit, but it seems AMD is just as terrible.
      Having said all that, I paid €480 for my 1070 Ti in 2017, which would be around €630 in today's money. I can buy a 4070 for €680 right now, in stock. It's not that bad.

    • @ronhaworth5808
      @ronhaworth5808 Год назад

      @@Cygnus-Phi I think there are some things about Navi 32 we aren't being told. The common scuttlebutt is that it's barely faster than an RX 6800 XT, so AMD doesn't know where to place it. But if you recall, a few months back there were some leakers who said that the RDNA3 Navi 31 and 33 were flawed but Navi 32 was fixed. Well, if AMD has fixed the issues with RDNA3 in Navi 32, then maybe it performs a lot better than we are being led to believe. If true, this puts AMD in a tough spot if the 7800 XT comes close to or beats the 7900 XT.

    • @Cygnus-Phi
      @Cygnus-Phi Год назад

      @Ron Haworth Perhaps, but in my specific situation my PC runs 16 hours a day, of which about 10 hours is gaming. I'm not going to use a 300-350W card. If AMD had actually made a step forward instead of just brute-forcing more performance while gas guzzling like a drunkard, then it would be an option. But looking at the 7900s and then the 7600, that doesn't seem to be the case.

    • @noobgamer4709
      @noobgamer4709 Год назад

      @@ronhaworth5808 Navi 32 will probably be 10-15% faster than the 6800 non-XT. The CU count is just too low for it to beat the 6800 XT: 60 on N32 vs 72 on N21. If AMD can somehow get the N32 silicon to perform as it should have (if the rumor about the RDNA3 silicon issue is true), you might see 6800 XT +10%. If not, AMD would need N31 (W7800) to be the 7800 XT.

  • @maxsmith108
    @maxsmith108 Год назад

    It's interesting to look at FPS and 1% lows, but at the end of the day consumers also buy into two (now three, with Intel in the game) different ecosystems: Nvidia GeForce Experience/Control Panel or AMD Adrenalin. AMD has also been more generous in their pricing, their VRAM amounts, and SAM/Resizable BAR support. Nvidia has better ray tracing and, at least traditionally, software and codec support. I think FPS and frametimes are decent measures, but anybody who buys a GPU should understand all facets of what they are buying.

  • @Dionyzos
    @Dionyzos Год назад +1

    I recently tried the new demo for Jusant which is also using UE5 and it absolutely TANKED my 3080 at native 4K. There weren't even options to turn RT on or off and no DLSS or other upscalers either. It looks really good though.

    • @Dionyzos
      @Dionyzos Год назад

      @Blue You could say that about many games with a non realistic artstyle, especially AA and Indie stuff. But then again, Jusant has too much geometric detail which PS3 or phones can't render.

  • @Kryptonic83
    @Kryptonic83 Год назад +2

    I just hope #StutterStruggle is no longer an issue with UE5 games. Sounds like shader cache and CPU utilization should hopefully be better with UE5 than UE4.

    • @gavinderulo12
      @gavinderulo12 Год назад +3

      It's one of the major improvements in the 5.2 Roadmap.

  • @And_Rec
    @And_Rec 3 месяца назад

    Something doesn't compute for me though: how can AMD be on par in ray tracing here, while in released games AMD stalls at 20 FPS and the 4070 doubles that? Are released games that badly optimized, or is something weird here?

  • @IAMNOTRANA
    @IAMNOTRANA Год назад

    I'm not sure if the game is using Nanite assets, but if it is, then this game should scale really well at low resolutions like 1080p and give a high-refresh experience.

  • @seank997
    @seank997 Год назад +1

    I was considering the 7900 XTX, but Best Buy has an open-box 4080 for the same price... what to do 🤔

  • @gavinderulo12
    @gavinderulo12 Год назад +13

    I wonder if hardware Lumen even makes a visual difference in this game, as the main difference is that hardware Lumen can trace against dynamic objects, which the game doesn't seem to have.

    • @MLWJ1993
      @MLWJ1993 Год назад +3

      Not exactly; for GI the software solution traces against signed distance fields & global distance fields (voxels), whereas the hardware solution traces against triangles. This results in occasional light leaking with the software solution (visible in the Hardware Unboxed video comparing Lumen hardware RT vs software RT in Fortnite).

    • @gavinderulo12
      @gavinderulo12 Год назад +7

      @@rustyclark2356 But the hardware acceleration on the GPUs is specifically made to enable faster ray-triangle intersection calculations.
      That's all the hardware version is doing.

    • @gavinderulo12
      @gavinderulo12 Год назад

      @@MLWJ1993 You are right, but the light leakage is mainly an artefact of low-resolution mesh distance fields. I still think the largest downside is the fact that it only operates on static meshes, while the hardware version updates the BVH structure if necessary, even for deformed meshes.

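      To make the "ray-triangle intersection" mentioned above concrete: this is the small kernel that RT hardware (together with the BVH traversal around it) accelerates. A plain-Python illustration using the standard Möller-Trumbore test, purely as a sketch:
      ```python
      def ray_triangle_intersect(orig, dirn, v0, v1, v2, eps=1e-8):
          """Möller-Trumbore: return the hit distance t along the ray, or None on a miss."""
          def sub(a, b): return [a[i] - b[i] for i in range(3)]
          def dot(a, b): return sum(a[i] * b[i] for i in range(3))
          def cross(a, b): return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

          edge1, edge2 = sub(v1, v0), sub(v2, v0)
          pvec = cross(dirn, edge2)
          det = dot(edge1, pvec)
          if abs(det) < eps:                 # ray is parallel to the triangle plane
              return None
          inv_det = 1.0 / det
          tvec = sub(orig, v0)
          u = dot(tvec, pvec) * inv_det      # first barycentric coordinate
          if u < 0.0 or u > 1.0:
              return None
          qvec = cross(tvec, edge1)
          v = dot(dirn, qvec) * inv_det      # second barycentric coordinate
          if v < 0.0 or u + v > 1.0:
              return None
          t = dot(edge2, qvec) * inv_det     # distance along the ray to the hit point
          return t if t > eps else None      # hits behind the ray origin don't count
      ```
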
  • @GKSchattenjaeger
    @GKSchattenjaeger Год назад

    Important question, got any good scares?

  • @dante19890
    @dante19890 Год назад

    In this kind of game I think you would get a similar look with baked lighting.

  • @YoureBreathtaking
    @YoureBreathtaking Год назад +2

    I guess you can expect the same performance and quality for the Silent Hill remake, judging from this Layers of Fear game.

  • @unorthodox5171
    @unorthodox5171 Год назад

    To anticipate Unreal Engine 5 games, which GPU should I get between AMD 6700 and AMD 7600?

  • @parzivallampruge2549
    @parzivallampruge2549 Год назад

    Battlebit go Brrrrr

  • @amaurytt
    @amaurytt Год назад +2

    I know it is a lot of cards to test but sad that you didn't include your RTX 3080 12GB model. I guess it is on par with the 4070ti...kind of started watching your channel because of that card 😅

    • @amaurytt
      @amaurytt Год назад

      @@blue-lu3iz Will take your word for the RTX 4070... for the 6800 XT I don't see how that is possible, since the 3080 12GB fits in between the 3080 10GB and the 3080 Ti. That particular model is not limited to just a VRAM bump. Edit: @1440p ultrawide.

  • @TheTakenKing999
    @TheTakenKing999 Год назад

    The MVP we don't deserve but we need

  • @randomanimegalaxy6859
    @randomanimegalaxy6859 9 месяцев назад

    Did you check all of this with desktop GPUs or laptop GPUs?

  • @Krenisphia
    @Krenisphia Год назад

    I'm really looking forward to new UE5 games, but I think Layers of Fear is not the best title to showcase the new tech. As you mentioned, it's a smaller, tighter and slower paced game so performance isn't too bad.

    • @LagiohX3
      @LagiohX3 Год назад

      We need some goofy fast paced game like Just Cause to showcase the engine.

  • @crimehole
    @crimehole Год назад

    Omega Strikers has been out for a while and is actually using Unreal Engine 5, but that's a cross-platform game that's also on mobile, and the only new tech it uses is TSR (and it's actually terrible in that game, cuz it's so much more computationally expensive than just using a higher res with simpler rendering).

  • @vvrr6dif
    @vvrr6dif Год назад +1

    Thanks I hope VR will look a lot better very soon

  • @adamadamx5464
    @adamadamx5464 Год назад +2

    AMD cards are not bad at RT in UE5, and UE5 doesn't need that much VRAM here, but I wonder what VRAM utilization will look like in open-world UE5 games.

    • @giglioflex
      @giglioflex Год назад

      How much VRAM is required will depend entirely on the game. What UE5 does allow is for games to scale up to a degree that was previously difficult.

  • @LayerZlayer2000
    @LayerZlayer2000 Год назад

    Well i am happy and can't cry with my 4090 for most games

  • @bindxpoxt
    @bindxpoxt Год назад +7

    The difference in temps and power draw between the 7900 XTX and the 4080 is insane.

    • @istgyash
      @istgyash Год назад +1

      @jennymelo2098 Same, I have a 4070 Ti with a decent undervolt; it uses just 150W while gaming at 4K. Truly efficient.

    • @shepardpolska
      @shepardpolska Год назад +1

      Nvidia jumped from a rather bad Samsung 8nm node to a much better TSMC 4N node. The RDNA3 chiplets also come at a rather large power cost, so it's not that surprising Nvidia is this efficient this gen: they were terrible at efficiency on the Samsung 8nm process, and now they have the monolithic efficiency advantage and are on a better node than AMD.

    • @saricubra2867
      @saricubra2867 Год назад

      ​@@shepardpolska Ada Lovelace is objectively speaking a better architecture than RDNA 3 as well.

    • @shepardpolska
      @shepardpolska Год назад

      @@saricubra2867 It's not objectively a better architecture, because it isn't better at everything.
      It might be better for you, but it isn't for me, for example; most of its advantages I would simply waste and not use.

    • @saricubra2867
      @saricubra2867 Год назад

      @@shepardpolska It's better at everything. That's why no one buys AMD cards for serious work.

  • @LamGorYun
    @LamGorYun Год назад

    The game runs pretty well with RTX on, even on a 5-year-old Turing card.

  • @VimyScout
    @VimyScout Год назад

    I recently watched the Starfield trailer and wondered if it was UE5. Watching this on a newly installed 6950 XT.

    • @danielowentech
      @danielowentech  Год назад

      Starfield will not use UE5. Bethesda updates their own engine for each new game.

  • @Extreme96PL
    @Extreme96PL Год назад +6

    I once read that UE5 is more optimized in resource usage than UE4, I'm not sure if that's true or if it will make a difference in games as developers can always fuck up.

    • @gavinderulo12
      @gavinderulo12 Год назад

      What do you mean? Unreal 5 has completely new technologies so you can't compare the two.

    • @AntiGrieferGames
      @AntiGrieferGames Год назад

      Depends on the game. This game seems well optimized (or not?)

  • @ramjidharmaraj
    @ramjidharmaraj Год назад +1

    Have been a fan of your channel for a long time. Keep up your spirit.
    What do you think is the optimal FPS for a 1080p game so that the game will look like a movie?

    • @HarbingerLP
      @HarbingerLP Год назад

      If you want your game to look like a movie, I'd say 24 FPS, since movies are 24 FPS; if you want a game to play smoothly, I'd say 60+ FPS, depending on your screen's refresh rate.

  • @chexmixkitty
    @chexmixkitty Год назад

    Why is your 6950 XT clock so low? Mine is at least 2600MHz or higher.

  • @Jason_Bover9000
    @Jason_Bover9000 Год назад +19

    This game is pretty well optimized

    • @Inpulsiveproductions
      @Inpulsiveproductions Год назад +8

      It's a corridor game...

    • @gavinderulo12
      @gavinderulo12 Год назад +25

      Bro, the 4090 is only running at 100fps. Which doesn't seem too bad if you ignore that there isn't anything on screen other than an empty corridor.

    • @capetamenino
      @capetamenino Год назад +5

      @@Inpulsiveproductions Do the same corridor in Forspoken's engine and you get the same game, 12x more demanding.

    • @melcorchancla9431
      @melcorchancla9431 Год назад +1

      @@capetamenino no lmao, not how things work.

    • @melcorchancla9431
      @melcorchancla9431 Год назад +1

      It's a corridor, and it runs like shit for a corridor, I'd even say. This literally runs worse than Cyberpunk and you call that well optimized.

  • @outlet6989
    @outlet6989 Год назад +1

    AAA Gamer, "I'm spending more time playing with the settings than playing the game."

  • @hansyulian3671
    @hansyulian3671 Год назад +1

    I can't tell the difference between RT on and off.

  • @vtheman1850
    @vtheman1850 Год назад

    My issue with everyone and their mother using UE5 is that I've already started noticing weird issues with just how things are rendered in every game.
    I know it's odd, but hair is a great example of it: the weird, almost plastic glow on it that I've taken to calling UE hair :D is just one of the many issues.

  • @Shikaar
    @Shikaar Год назад

    Predecessor can be bought on Steam and is UE5

  • @SameBasicRiff
    @SameBasicRiff Год назад +1

    Really wish you had even a single 30-series card. You can't say "can your GPU run [x]" if the gen a majority of GPU owners are on is not even listed.

  • @erickelly4107
    @erickelly4107 Год назад +2

    I think 1440p will, by and large, be the "optimal" resolution even for the RTX 4090 when more UE5 games start coming out. Personally I prefer to game on my 1440p / 240Hz / G-Sync monitor with the RTX 4090, as I don't find the jump from 1440p to 4K all that impressive (I also have a 4K / 120Hz OLED / VRR display), and I much prefer a minimum of 90 FPS (native 1440p / max settings / ray tracing) for games like Jedi Survivor / Cyberpunk 2077, and up to 240 FPS for first-person shooters like Destiny 2 / Halo Infinite, for example. I still find that 1440p on a 27" display looks amazing, especially when you're able to sustain at least 90 FPS at max settings with ray tracing. I've never felt justified running 4K if it means upscaling and/or settling for less than 90 FPS, but that's just me; it's nice to have options regardless, as much of this is simply preference.
    Anyways, thanks for yet another excellent analysis, Daniel! You are my "go to" reviewer when it comes to GPU reviews.

    • @samgragas8467
      @samgragas8467 Год назад

      Any new game worth playing will have upscaling; if you have a 1440p monitor or a 4K monitor, you should always play using upscaling quality/balanced/performance instead of lowering the resolution.
      In your case you should change monitors, but it won't be a big deal. 4K DLSS Quality shits on native 1440p imo.

    • @erickelly4107
      @erickelly4107 Год назад

      @@samgragas8467 Yeah, I don't think 4K upscaling of any sort "shits on native 1440p"; that isn't even remotely the case in my experience, but I'm sure plenty will look for ways to somehow justify that 4K display they bought, whether it makes sense or not. Also, native will always look better than some upscaled resolution, and there's no need to upscale native 1440p with an RTX 4090 - just a pristine native image like God intended. Diminishing returns beyond 1440p is generally what I've found for gaming; it's never worth going from 90 FPS to a lower FPS just to game at "4K", as native 1440p looks phenomenal at 90+ FPS.

    • @samgragas8467
      @samgragas8467 Год назад

      @@erickelly4107 You have two monitors; 1440p or any resolution besides 4K looks terrible on 4K displays, so most people agree DLSS is superior. In some games DLSS is better than native 4K because DLSS is a good anti-aliasing.
      Native 1440p on a 1440p display vs 4K DLSS Quality is just your case, and I doubt it doesn't look better in whatever game you tested.
      FPS are almost identical, just a bit lower with DLSS, so idk what you mean.

    • @gavinderulo12
      @gavinderulo12 Год назад

      @@erickelly4107 What? Upscaling from 1440p to 4K using DLSS Quality mode looks significantly better than native 1440p, and it has much better anti-aliasing on top. If you are not using DLSS Quality mode you are just leaving image quality on the table. There is literally no reason not to use it.

    • @erickelly4107
      @erickelly4107 Год назад

      @@gavinderulo12 Seems you may be missing the point here...
      The point is that gaming on a 1440p monitor allows for FAR superior performance vs. gaming on a 4K display/monitor. Yes, you can use DLSS on a 1440p monitor for games that have it (looks "better" than native? I don't think so, but it's not bad, and if you want higher performance it's often worth doing), but the point is that it's generally not even necessary, as it IS with 4K - even with an RTX 4090.
      In more demanding games you WILL have to enable upscaling in order to get acceptable performance; this isn't the case with 1440p, generally speaking. In either case (with or without upscaling) performance is FAR superior at 1440p. I'd take 1440p @ ~90 FPS vs. 4K @ ~60 FPS any day, with or without upscaling.

  • @syncmonism
    @syncmonism Год назад

    It's impressive to see how much better optimized UE5 seems to be for AMD/Radeon hardware than previous iterations of Unreal Engine used to be. It's always good when both AMD and Nvidia GPUs work well with a game, and Intel GPUs seem to work relatively well in this game too.
    It's also impressive to see ray tracing actually working so well on AMD GPUs, especially the so-called "software based" Lumen. I find the term "software based ray tracing" a bit odd, because it's running on a GPU, and a GPU is a hardware accelerator. Back in the day, running a 3D game without hardware acceleration literally meant it wasn't using the GPU at all and was rendering on the CPU, without using a 3D API like DirectX, Vulkan, or the now long-defunct Glide.

    • @MrAkamagi
      @MrAkamagi Год назад +1

      That's because UE5 does RT on the CPU, not the GPU. You could toss in an Intel GPU and get pretty much the same results. The RT is being calculated by the CPU and then the results are sent to the GPU. The minor differences in performance are simply a result of driver optimization. If you look up the documentation for Unreal Engine, it says clearly that almost all hardware RT support has been degraded and will likely be removed entirely in the future in favor of Lumen's software rendering. All they are supporting now are skybox and basic single-source lighting in hardware. Everything else is being done in software.

    • @gavinderulo12
      @gavinderulo12 Год назад +1

      @@MrAkamagi Where does it state any of that in the documentation? I doubt Lumen runs on the CPU; the difference with the hardware version is that it traces against triangles instead of signed distance meshes, which is way more accurate and only possible because modern GPUs have special units designed to perform ray-triangle intersection calculations.
      The CPU only builds the BVH structure.
      And where does it say that they won't improve hardware Lumen? In their 5.2 roadmap they literally list multiple improvements to the hardware version.
      Edit: the documentation says software Lumen requires a GPU that supports DX11 Shader Model 5, so it definitely runs on the GPU.

    • @Blue_Man
      @Blue_Man Год назад +1

      @@MrAkamagi That is incorrect (I'm a software engineer using UE5 professionally).
      Software Lumen still runs fully on the GPU, but the ray tracing is implemented in shaders (thus "software"); it traces against signed distance fields, which are an approximation of the scene, still fully on the GPU.
      Hardware-accelerated RT, on the other hand, is also on the GPU, but the ray-triangle intersection is handled by the rendering API (DirectX, for example (DXR)) and processed by the RT cores.
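      As a tiny illustration of what "tracing against signed distance fields, which is an approximation of the scene" means: instead of the real triangle meshes, the software path queries an analytic distance function, roughly like the sketch below (the primitives and their placement are made-up examples, not anything from Lumen itself):
      ```python
      import math

      def sphere_sdf(p, center, radius):
          """Signed distance from point p to a sphere: negative inside, positive outside."""
          return math.dist(p, center) - radius

      def box_sdf(p, center, half_extents):
          """Signed distance from point p to an axis-aligned box."""
          q = [abs(p[i] - center[i]) - half_extents[i] for i in range(3)]
          outside = math.sqrt(sum(max(c, 0.0) ** 2 for c in q))
          inside = min(max(q[0], q[1], q[2]), 0.0)
          return outside + inside

      def scene_sdf(p):
          """The whole 'scene' is just the minimum distance over all primitives."""
          return min(
              sphere_sdf(p, (0.0, 1.0, 5.0), 1.0),             # a sphere in front of the camera
              box_sdf(p, (0.0, -1.0, 5.0), (4.0, 0.1, 4.0)),   # a thin floor slab
          )
      ```
      A ray is then sphere-traced against scene_sdf by repeatedly stepping forward by the returned distance until it gets close enough to a surface, all inside a regular shader.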

    • @LagiohX3
      @LagiohX3 Год назад

      Well, they have to work well on AMD, since AMD hardware is in all current-gen consoles.

  • @BoGy1980
    @BoGy1980 Год назад

    @5:57 I have the impression that on the Nvidia card the view angle looks smaller than on the AMD card. Did you use a built-in benchmark or a script to perform the movement, or did you just walk through that level yourself for every card? If it's scripted/built-in, I'd really start comparing this, cuz it LOOKS like Nvidia is again taking shortcuts to gain the upper hand (aka cheating)...

  • @guily6669
    @guily6669 Год назад +1

    The problem is this isn't your typical UE5 game; the recommended hardware claims a GTX 1070...
    Everything UE5-based I tested on my RX 580 ran so damn bad, and the problem is that most of the stuff I tested even looked pretty bad, worse than the majority of much older games that run better.
    I also tested the compiled map demo of "Unrecord", that crazy-realistic game, and it's just a small indoor map that doesn't look that good and nothing like the final game, and even that ran so bad 😕