The naming schemes of some products are pure slop. Ryzen AI PRO MAX + X PLUS XT 395 Ti Super is ridiculous.
"Pure slop" is my new favourite way to describe AMD's naming 😂
@@justmatt2655 dem gimmick names man, i swear
I do enjoy the random "AI" just thrown in for no reason.
@@rotm4447 same lol
Lol
@9:50 For the Ryzen BENCHMARK....
Thanks. These yapologists are getting outta hand
Thanks
Thanks bruh 😂
Holy shit that's the end of the video bruv
At this rate, only "User Benchmark" fans will buy Intel.
INTEL IS STILL BETTER
COPE
@@FO0TMinecraftPVP WOMP WOMP LIL BRO ENJOY YOUR HEATER
@@FO0TMinecraftPVP Keep telling yourself that.....
@@FO0TMinecraftPVP are you the person who owns UserBenchmark?
Underappreciated comment 👌
So many people still claim devs are not at fault for bad optimization. Well then explain why we need a 4090 to play the game... with 2× upscaling... in PERFORMANCE MODE! That's laziness.
Not only that, the game really doesn't look that good, I mean graphically. There are games from 2, 3, 4 years ago that look way, way better and require less hardware. Look at the RE4 remake, damn, that's next-gen graphics.
They have contracts with Nvidia to introduce their latest rendering features. Tanking everyone's performance as always.
Not a defense, but modern software is damn complex. I wouldn't blame the developers directly. I would, however, point the finger at the management expecting more, for less money, and faster.
@@JohnKerrashVirgo I don't disagree with you. However, the devs are responsible for how the game works. Were the games from 2, 3, 4 years ago not complex for their time? But ever since these crutch techs like DLSS, FSR, whatever the shit names, devs started to depend too much on them, and that will produce half-baked games for us. And the more we shut up about it, the more the game industry will get fucked. For example, this Indiana game doesn't even look that impressive!? Some games from 2, 3, 4 years ago are way better graphically than it, but ray tracing!! Shit technology.
@@weil46 Battlefield V, a 2018 title, looks much better than Indiana Jones.
Those ray tracing changes really go to show that even standalone rasterisation still looks fantastic.
The example also cheated. The rasterized render did not use screen-space ambient occlusion to spread out the light a little bit. And pre-baked textures can rival ray tracing on non-dynamic objects like the walls of the room.
@@kazedcat I really thought the non-ray-traced images looked the best. Hands down, in every sample picture.
@@Noavaileblenames It can be subjective. With less light bouncing around, you might feel the non-fully-traced image is less distracting. Realistic does not always mean better. A good example of this is the blurring effect they add to make games more realistic, which makes the gameplay worse. This is the main weakness of ray tracing: it drops the FPS so much that it makes the gameplay worse, while bringing a small improvement in realism that might not even improve the image quality for many people.
Energy efficiency is going to be a huge issue. Not only is energy getting more expensive, but the AI gold rush means that enterprise will want to fit as many AI accelerators into their racks as possible while keeping power consumption down (not just lower bills, but less HVAC and lower per-cycle running costs to stay competitive). Nvidia has had the field largely to itself, but that may change as AMD rolls out more efficient processors.
Not true, energy is getting super cheap, especially with solar, and performance per kW is improving non-stop.
@@whitecometify Yes, solar and offshore wind are now (by far) the cheapest forms of generation. However, in many countries - particularly here in the UK - the domestic price is set by the wholesale price of gas, which is far more expensive. Renewables are great, I'm a big fan - but unfortunately their output is not constant, so they have to be supported by hydro, nuclear and/or gas generation to ensure base loads are met. The other factor is that asset managers like BlackRock have moved into the energy sector, hiking prices to provide high returns for shareholders. So while some generation is getting cheaper, demand and private equity mean that the price is going UP, not down.
Cyberpunk with path tracing on vs. off is a big difference, but Indiana Jones? Nope.
Exactly. Path tracing adds 'real' 3D depth, while ray tracing just enhances already-good lighting. And of course adds reflections.
Now we wait for Minisforum to make a mobo with that APU.
Exactly bro
In the first example with the lights shining through the windows, the fully ray-traced version certainly looks more realistic, but the "not fully ray traced" version looks aesthetically better to my eyes, even if it is technically less realistic. That is a big problem with ray tracing in general: it's much more of a development-time optimization technique than a graphical enhancement. Mainstream buyers don't have hardware capable of that sort of ray tracing yet, so it is still too early, and it will remain too early for fully path-traced-only games until Nvidia, AMD, or Intel (or better yet, all three) decides to give us actually path-tracing-capable hardware in the mainstream GPU segment rather than making it a luxury for the top 0.01% of buyers.
When developers are given enough time and resources, they can make a video game look incredible using rasterization. Even lighting, shadows, reflections, and refractions can be done really well with rasterization when developers put the time in. For other assets such as textures and geometry, rasterization looks just as good as RT/PT. It's a shame that developers are starting to leave rasterization behind at least when it comes to single-player games.
Full ray tracing to play at 60 fps is kind of meh. I can't even stand playing an MMORPG at less than 120 fps.
It's not entirely fair to analyze ray tracing only by looking at still images. The largest effect it really has is the added realism of moving shadows and reflections while the observer is in motion.
And what does that do for the gameplay? Does it make the game more playable? Or make the players more pro!? From now on we will see a lot of games that don't even look next-gen asking for more hardware. Don't be surprised when devs make a game that looks 10 years old and ask for an RTX 5060 as the minimum.
A more realistic shadow gets defeated when half the time you are seeing fake frames, because your $2K GPU is not fast enough to render full tracing at 60 FPS.
@@weil46 oh, you're talking about Stalker 2, I see
@@weil46 Quote: _"And what that do to the game play? Does it make the game playable? Or make the players more pro!?"_
Ray tracing doesn't do any of these things. But that is not what ray tracing is for. You might as well complain that green tea is useless because it doesn't make you more of a "pro-gamer".
@The_Stef bro, what I mean is: if ray tracing becomes something devs push as hard as in the Indiana Jones game, when it's not even graphically impressive, then in the future we will see games that demand more hardware than they should need. And you know how GPU prices are.
Let's hope we get a ton of mini PC options with this new AMD APU.
Let's goooooo AMD!!!
INTEL IS BETTER
COPE
@@FO0TMinecraftPVP WOMP WOMP LIL BRO ENJOY YOUR HEATER
@@FO0TMinecraftPVP HAHAHAH, on all the comments. The real cope is you.
@@FO0TMinecraftPVP Intel lost 6 years ago
I am a 13600K user and my next upgrade will be a Ryzen CPU (first time with AMD). I will not get any Intel CPUs for now.
Those images could easily be achieved with more effort put into the lighting, without ray tracing.
The only time you get a soft-edged shadow outside is with cloud cover and layered (dappled) shadows. This is due to the sun being such an intense light source at a single point. Soft shadows appear indoors under man-made lights, due to how the light source is made. What I am seeing in these pics demonstrates that the tech and the design of games are only just starting to develop the detail that the human eye picks up. I wonder how long it will be before a moving image generated in software is indistinguishable from real life for most people? Scary thought.
Man, sure AMD made great advances in energy optimization, but any graph without a scale is just bullshit.
100% agree with you on the "Full Ray Tracing" thing. It looks... different, likely more realistic. But better? At this point it's a matter of preference. I can see many people preferring the "simple RT" as it produces more contrasty, and in some cases easier to read, results. Kind of like some people prefer oversaturated pictures over a more realistic, true-colour, "flatter" one...
We're really at the point of diminishing returns, and I'm not sure they will sell us more GPUs based on such "improvements".
1) AMD reaching their power efficiency goal: AMD has been known to play fast & loose with their published numbers of late, so I'll wait until I see some real world testing before congratulating them over it.
2) The path tracing thing: I play my games without the benefit of having two side-by-side renderings to compare, so I genuinely don't think it's a big deal at all (and on a related note, I _actually_ prefer some of the 'Full Ray Tracing OFF' images over their 'ON' counterparts).
3) ARM's GPU: I'm really excited about this and I hope it pans out.
4) Ryzen Max: I'm somewhat excited about this but (again), until it's out for independent testing and review, I won't be popping my cork over it _just yet._
"Monster max"
This is like a new flavour of Monster Munch or somethin'
They're UK crisps/chips
Real RT can make a huge difference. The subtle details can be everything when trying to reach realistic visuals. Especially when you bring in animation.
Ray tracing and path tracing are currently not worth the cost to me. Maybe someday the tech will be more widespread and affordable, but it really isn't right now, and the in-game graphical upgrade versus the cost makes it not worth it to me.
5:04 This is one of the things that makes the game feel closer to the movie series. Consider that the movies were shot by Steven Spielberg, who went to great lengths to shape the scenes he shot. The subtle difference between non-RTX and RTX is the difference between it looking like a game and looking like you are inside those movies.
The question with these APUs is what the pricing will be.
We have a >$300 CPU paired with an iGPU with the power of a $400 video card...
$700 just for the chip might be too much, but it will be extremely expensive either way, and I don't know if we will see them anywhere besides consoles.
It is for running large LLMs without investing in a cluster of GPUs. Buying it for gaming is just a moronic idea.
@@i34g5jj5ssx Nah, it's for putting in expensive laptops; they even named it after Apple's chip.
It hasn't been proven to have the power of a $400 video card yet. And the CPU takes most of the TDP at 12-16 cores.
@@rotm4447 this is another series with very similar naming
@@mikeb3172 It doesn't matter if it's slow, as long as it can load most of today's models, or several smaller ones at once.
In my opinion it's not worth it to turn RT on.
This is the way to go. Having games that require hardware from next year to run on medium settings. Just like in the olden days.
The most modern games on the most modern GPU would run at 15fps at low resolutions. That's why the next generation GPU had almost double or even four times the power.
These days, it's like 15% and people are happy.
So please, integrate even more insane things. Give it a 5fps Cosmic-Scale Setting.
Good presentation. Thanks for the news!
A variant with four fast (8000+) memory channels would be interesting in a desktop along with a great cooler - might be an excuse to use liquid cooling.
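For rough context on what four fast channels would buy, here's a back-of-the-envelope sketch (my own numbers, assuming conventional 64-bit channels and ignoring real-world efficiency losses):

```python
# Theoretical peak bandwidth of a quad-channel 8000 MT/s memory setup.
# Assumptions (mine, not from the video): 64-bit channels, zero overhead.
channels = 4
bytes_per_transfer = 64 // 8     # a 64-bit channel moves 8 bytes per transfer
transfers_per_sec = 8000e6       # 8000 MT/s

peak_gb_s = channels * bytes_per_transfer * transfers_per_sec / 1e9
print(f"~{peak_gb_s:.0f} GB/s theoretical peak")  # ~256 GB/s
```

That's in the ballpark of a midrange discrete card's VRAM, which is why the memory setup matters so much for a big iGPU.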
To me it looks better without ray tracing, going by those images.
ARM entering the PC GPU market is great 👍: one more competitor, so prices get lower and performance gets highly competitive. ARM GPU architecture is used in millions of smartphones, and many developers are more than familiar with it. But scaling from handhelds to PC is a huge step, and they barely have any Windows drivers (ATM W11 is optimized for Adreno GPUs).
Maybe they should compare at the expected resolutions with ray tracing on. And make the comparison a moving one, not static.
DLSS, FSR, and XeSS have made game developers lazy in thinking they don't have to optimize the game for performance. "Oh just lower the resolution and use DLSS to get 60 FPS"
Bitch I got a 240 Hz monitor, 60 FPS is rookie numbers you gotta pump those numbers up.
Ever since I came to PC gaming from Xbox and started regularly getting 100+ fps, I stopped minding bringing down the fidelity a bit to get that smooth gameplay. A game can look amazing, but if it doesn't feel smooth while I'm playing, it looks bad, at least to me. Both the fully ray-traced and partially ray-traced screenshots looked pretty nice though.
Boycotting ARM
Why?
@@awesomethegreatamazing2651 bc of helping murderers.
@@awesomethegreatamazing2651 Made in Israel.
@@robertmyers6488 Read the Bible... I'm basically certain Israel will get boycotted by God very soon.
So is Intel and many US defence chips. But enjoy supporting Islamists and terrorists that want to destroy the West and its freedoms.
Palestine had decades of anti-western propaganda and indoctrination in its education system and on public TV channels. There are hundreds of street interviews where the public there openly stated they want Western countries nuked.
Nevermind that Palestine is a Western created territory filled with Arabs from other middle Eastern countries left over from when the British Empire still controlled the area before returning it to the Judaic people who had claim to that land before Islam even existed.
The truth is that this has to be seen from an artistic point of view as well. Lighting up the dark areas more also takes something good away from the feeling of the scenes. There is a certain melancholy, or fear, or uncertainty where the shadows are very prominent; the light contrasts are very important and make the scene feel more, whether with fear, anxiety, or tranquility, etc. The shadows are important. Now, if they are competitive games where the artistic aspect is not so important and the point is simply being able to see your enemy and annihilate it, then it will be more useful, but that is another objective.
Honestly, in those scenes of nature, the classroom, etc., I think they are much better WITHOUT RAY TRACING than WITH RAY TRACING. It's like when you increase the exposure of a photo in Photoshop and you see more of the dark parts; it doesn't mean that's the correct way to edit it. The point is the feeling, what you want to convey with that photo or scene, and ray tracing doesn't discriminate scene by scene: it applies to the whole game, and what it does is ruin the game, or the scene, for the purpose of the one who designed it.
Is ARM going to be making discrete GPUs for desktops, though? My guess is that they'll focus on the mobile segment, with integrated solutions.
Indiana Jones.....looked better without ray tracing.
This is the laptop I've been waiting for to upgrade to. A 4060 built into the APU!
Thank you for showing us that Full Ray Tracing is still a gimmick
I never use ray tracing, to be honest. It's just not worth the fps hit. To me it's kind of a useless technology.
Notice how they never compare RT vs. HDR on the same scene?
😂
Ahem, ahem, Nvidia sponsored title.
@herobrinecyberdemon8104 The double ahem is a nice touch.
The interesting bit about the APU: this is essentially what the "i7-8809G Vega M" did (the NUC CPU, which had AMD Vega graphics roughly on par with a 1060).
For laptops and other handhelds (Steam Deck, hint hint) this could be amazing. Also small-form-factor PCs.
That was similar in intended purpose, but the design is completely different. The 8809G was not MCM, just two dies on the same substrate, and the Vega "iGPU" had its own dedicated 4 GB of HBM2 VRAM, while Strix Halo has the iGPU in the I/O die (imagine desktop Strix with the two CCDs and one IOD that already has a small iGPU, except the IOD is now swapped for another one with a huge iGPU instead). The 8809G's CPU even still has its own Intel iGPU functional, which means the CPU portion is effectively completely independent by itself.
Did you test it?
To be honest, the difference between Full Ray Tracing on and off could be 'time of day' and the way the primary light source is affecting the environment. I personally prefer the OFF version; the ON one is overly light-saturated.
We might need quantum GPUs to get good ray tracing performance in modern games.
G-meld. AMD hype master confirmed
That path tracing actually wouldn't need such insane hardware if it were properly implemented with support for Intel's oneAPI instead of only Nvidia's legacy CUDA (which is insanely inefficient).
Actually, the Intel Lunar Lake APU is capable of getting the same results as that super expensive high-end hardware; for the ray tracing it only needs to use its built-in NPU.
Honestly, whether people like it or not, the path tracing they do can easily be done by a simple AI filter. After all, the path tracing in the game is not even truly accurate: it just estimates things roughly and only uses a few bounces, while real-world lighting uses practically infinite ones. So it is quite doable to get similar looks with a simple AI filter that essentially just computes the lighting.
This can actually make it faster than normal modern rasterization, since many computations for the lighting and similar effects can be skipped.
On top of that, this could give much better lighting and accuracy than even the "ray tracing" they use in those games, which should easily make up for AI-related issues like potential noise. (Actual ray tracing and path tracing also have such noise issues; you just don't see them because they are filtered out. Which honestly means they can be filtered out here too, though the question is why you would: slightly, yes, but in real life your eyes literally work with noise.)
In case you are used to AI being heavy: that is true for full generation, but for things like this it is super lightweight. Even a rather old laptop CPU can do it almost instantly using only the CPU, with no efficiency optimizations and almost no RAM. Things like this are already used, in much heavier form, for tasks like removing backgrounds.
Or, even heavier, in modern smartphones: if you have a modern Samsung or Apple phone, or any other famous expensive brand, the images are around 90% generated by AI to "enhance" them. It has been like that for years, and by now it is even more than 90% on modern phones. You don't notice because even on a phone it works in real time, with low power and RAM usage, and looks real enough that most people think the images are real; they only find out otherwise after taking a photo of a photo and noticing the result on their phone has way more detail than the thing they photographed (like taking a picture of a blurry picture of the moon and getting a crisp moon where you can zoom in and see all the craters).
Right now I am only suggesting using this for lighting and light-based shading, since that can easily be done with a simple filter and a black-and-white grayscale mask/map, and it won't risk losing or changing important details. Doing this step with AI can very easily give better performance than normal rasterized rendering, and also more actual realism than game ray tracing.
For reference, a top-end GPU like the one stated takes several minutes to render one frame of normal ray tracing at the realism level the movie industry considers acceptable.
Of course there would come a point when ray tracing is more efficient than rasterization.
The main question, however, is whether that point comes before multi-level AI becomes more accurate and efficient than both of them, with better results.
To clarify: the example I described still uses normal rasterized rendering and just does the lighting with AI. I chose that because it wouldn't compromise anything, and it would already make it possible to get better results than current game ray tracing, as well as better performance than rasterization, using only technology that essentially already exists. It can run on a small part of a GPU or an even smaller part of an NPU; even a modern integrated NPU has many times the power required for this.
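As a toy illustration of the cheap composition step described above (a minimal sketch of my own; the "AI" part, i.e. whatever model would predict the grayscale light map each frame, is stubbed out with random data):

```python
import numpy as np

# Sketch: composite a predicted grayscale light map onto a rasterized frame.
# Hypothetical stand-in only: light_map would come from some per-frame model;
# here it is random noise just so the script runs.
h, w = 720, 1280
frame = np.random.rand(h, w, 3).astype(np.float32)      # rasterized RGB frame in [0, 1]
light_map = np.random.rand(h, w, 1).astype(np.float32)  # grayscale irradiance mask in [0, 1]

ambient = 0.15  # floor so fully shadowed areas keep some detail
lit = np.clip(frame * (ambient + (1.0 - ambient) * light_map), 0.0, 1.0)
print(lit.shape, lit.dtype)  # (720, 1280, 3) float32
```

Whether a model can actually predict a light map more accurate than in-game path tracing is the open question; the multiply itself is indeed trivially cheap.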
Ray tracing looks maybe 20 percent better, but I can't suffer a massive 50 percent decrease in framerate for it. I'd rather play at 80 fps than at 40 fps with ray tracing.
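Putting frame times on that trade-off makes the cost concrete (simple arithmetic, using the commenter's own numbers):

```python
# Frame-time cost of halving the framerate from 80 fps to 40 fps.
def frame_time_ms(fps):
    return 1000.0 / fps

base, with_rt = frame_time_ms(80), frame_time_ms(40)
print(f"{base:.1f} ms -> {with_rt:.1f} ms (+{with_rt - base:.1f} ms per frame)")
# 12.5 ms -> 25.0 ms: the effect costs an entire extra frame's worth of
# render time, and frame time is what you actually feel while playing.
```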
Am I the only one who finds the non-ray-traced images better than the ray-traced ones?
Yup, naturally rolls off the tongue: Ryzen AI Max Plus Pro 395... They could have added Ultra, X, XT, so it would be much clearer. But yeah, it seems these APUs would be a beast in a mini PC form factor.
The new Ryzen Halo should be allowed to boost up to 200 W; 150 might be hampering it a little. It will also be bandwidth-starved since it's using system memory, but it's amazing what it is: a full gaming system on a chip 😊
Ray tracing won't cover up a bad game. Let's hope the Indiana Jones game is worthy of those system requirements; it would be a shame if it weren't.
Playing Indy, you really don't need more than 60fps for the most part... The game looks good regardless so it is definitely a nice-to-have option to push your GPU to the max for that extra bit of visual fidelity. But it's definitely not necessary for a good experience. Even at medium settings I can get over 120fps at 1440p on a 3080 10GB.
Graphics chips are needed for weapons, so no wonder ARM wants to have some on hand too.
I'll wait to get excited about an ARM GPU. I already have an ARM(-based) GPU in my phone which is more powerful than any game available for it on the OS it runs. Something tells me a "PC" ARM GPU would have the same problem.
By the time the 6000 series rolls around, you're going to be getting angry at games that *don't* have ray-/path-tracing because the performance will be there too. To get there from here, we need titles like this that prove what's possible.
The pics without raytracing all look so much better and more realistic 😂
4:28 To me, whether the path tracing looks better seems even more subjective than usual with graphics. I, for example, see things I prefer in both images.
AMD's new APU would be a beast in a desktop with a 75 W TDP.
If you need to enable frame gen to get your game running at 60 fps, you can consider your game not working. Like a 4090 with FG and DLSS Ultra Performance: it means you basically need a 4090 to run the game natively at 1080p 30 fps. It is just not functional on any PC at all.
Hey guys, maybe instead of just listening to opinions, try playing the game. I'm on a 3080 and got the game to run at a solid 60 fps at 60-80% utilization on High, at 5120x1440, with DLAA and NO DLSS. It runs great; my VRAM was at 9200 MB of 10 GB but it never stuttered.
The only games I want to play ray-traced are old games where the graphics otherwise won't make any modern GPU cry without DLSS. Like Deus Ex and Morrowind. I also want to play Morrowind with OpenMW, which means probably no raytracing unless the devs decide to include it there. Which I doubt they will do until after 1.0
Anyone know the audio track?
In the images with full ray tracing on, it looks like the overhead lines get jagged, as if it turns off anti-aliasing. I would rather have ray tracing off if it's going to do that.
Ray tracing might be better for game production, if you can just define the materials and not worry about lighting fixups. If you can use AI on multiple pictures of a material to reverse-engineer the material properties required for ray tracing, we should quickly build up a library of materials for devs to use.
The AI Max is a chip I have been waiting for; it really should be compared to other mobile chips rather than desktop chips. I am personally excited about the bump in AI performance.
Ray tracing can make a difference in gaming because it can help you see items you may otherwise not be able to see well. It just depends: the type of game you play will determine how much it matters.
No one cares about AI except Wall Street speculators.
5:28 There I personally prefer ray tracing off, because it looks more natural when you look at the trees... When ray tracing is on, it looks too "greenish" 🍏. Maybe it's more useful when playing games with very dark areas, I guess... I hope!
we are now ryzen maxxing before gta 6
Welcome to GamerSwell! 😂 /s
I'll wait for the actual hardware release.
I'm only pumped when it comes to desktop and I can build a sick SFF PC with a decent GPU in it.
8:30 AMD is most likely not included because they are just way ahead of the others. An ex-Nvidia employee told me that AMD is the only one actually producing gaming GPUs.
Yes, the full RT looks better, but only when standing still. If you're moving around at all, the blur caused by DLSS, frame gen, and Super Resolution makes things just not worth it. Stick with the basics, and as a whole it's much better.
A 16-core APU with an RTX 2060-level iGPU and perhaps a 65 W max TDP.
That's impressive.
But let's see whether AMD can achieve their goal of releasing an iGPU on par with at least a mobile RTX 4050.
The ray-traced ones just look brighter. I like the darker ones better.
Is there a law that says all ASML ultra-nm machines have to be installed within ballistic missile radii?
Well, there is form factor and TDP; but there is one other thing: memory contention with the CPU cores. And like it or not, DDR5 is slower for graphics than GDDR6 (even at a 128-bit bus width like on the RX 7600, and that's not even considering contention with the CPU). All that said, it still is one monster of an iGPU - I'm looking forward to the reviews.
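To put rough numbers on that gap, a back-of-the-envelope comparison (the GDDR6 figures are what I recall for an RX 7600-class card, so treat them as approximate; real sustained bandwidth is lower than these theoretical peaks):

```python
# Theoretical peak bandwidth in GB/s for a given bus width and data rate.
def peak_gb_s(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1e3  # (bytes per transfer) * (GT/s)

ddr5  = peak_gb_s(128, 5600)   # dual-channel DDR5-5600  -> ~89.6 GB/s
gddr6 = peak_gb_s(128, 18000)  # 128-bit GDDR6 @ 18 Gbps -> ~288 GB/s
print(f"DDR5: {ddr5:.1f} GB/s vs GDDR6: {gddr6:.1f} GB/s")
```

And that's before the CPU cores start contending for the same channels, which is the contention point above.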
I still keep imagining the possibilities of what such a GPU could do if AMD put HBM onto one of these APUs... (HBM utilized only for the iGPU, with regular DDR5 as currently socketed used for the CPU and core portion).
wait a second..... are you from South Jersey??? lol, the way you say 'wooder' instead of 'water' is how a LOT of people in SJ say it LMAO!
(I'm JK!!1 I know you're in Florida somewhere because you were affected by the hurricanes this past summer)
The 385 is probably the fastest for the TDP they're all limited to. 8c/16t, 32CU.
Path tracing = Highly overrated
Depends on how they implement it. Same with RT. I couldn't care less about reflections, but shadows and global illumination definitely make a difference. The thing is, we are still heavily limited in the number of rays, so it will never be true path tracing, as in pixel-level path tracing. It's an estimation at best ATM. 😑
In MOST cases, I don't even like the "Ray Tracing" version.... I think the ORIGINAL "Looks Better". Like in most if not ALL of the examples in this video.
I like "contrast".... I don't like things "washed out".
"AI MAX+ PRO" 😳
Yes, it's targeted at professional, fully maxed-out AI... but the "+" means even more than maxed out.
They shouldn't call it ray tracing when it's path tracing.
I think the Indiana Jones game is at the forefront of a wave of high-requirement games, and Nvidia is grossly underestimating the amount of VRAM people need and want out of the new RTX 5000 series. They are out of touch with what is actually happening right now.
I have used an ARM-powered laptop and was very underwhelmed. After a week I was fed up with performance akin to a 7-year-old i5-powered laptop. Never tried games, just general office work and PDF drawings.
Hopefully they will be available for desktop soon.
I think this APU is for PlayStation.
As a professional photographer and filmmaker who has been pixel-peeping at imagery for 40 years, I am not impressed by ray tracing. It does not look real. Real life is not HDR where we can see into shadows. On clear sunny days contrasts are sharp, not soft. Bloom does not exist... it's a film and lens artefact (or you have cataracts). In every example you posted I preferred the raster image. Cleaner, sharper, more colourful, more contrasty, and more real. The classroom, for example, was missing one set of shadows that would be trivial to add and would have made the raster image photoreal, while the ray-traced one looked like a game.
Okay, boycott ARM. Got it.
Antisemitic much?
@@martinrogers8465 I've no idea what you're talking about.
@@King-IbnDragon but you’re the one starting this thread… nevermind…
I completely disagree about the full ray tracing / path tracing. To me it makes a huge difference. I think it is an awesome piece of tech and it's the way to go for photorealism... the light that gets reflected and partially absorbed by objects and walls makes it look so much more real.
But yes, I too turn it off, because the performance tax is still too high. I think it will get better in the future with new hardware and tech solutions like DLSS or frame gen... who knows what Nvidia / AMD / Intel have in their laboratories right now. That's the thing: I am more hyped for the new software features RTX 5000 will bring to the table than for the hardware itself.
I am sure Nvidia has a surprise for us... and then AMD will have something new to keep up (and they will). So we can all benefit from it.
Why I picked the Radeon 7900 XTX over Nvidia's alternative with lower rasterisation perf.
And next up for today
I gotta agree with ya on the ray tracing... there isn't that much of a difference affecting actual gameplay that it would make up for the requirements and ALL THAT EXTRA MONEY for a card capable of doing it... Maybe when ray tracing becomes much, MUCH easier to perform (not requiring such a powerful GPU, AND the cards don't cost as much as a used car - WHICH I think AMD is about to start doing with the 8000 generation... the patents they filed should change A LOT!!!). And just like you said, the only way I can really see it being important is to brighten up areas that matter for solving a puzzle or something along those lines... BUT I CAN JUST DO THAT WITH MY MONITOR LOL
imo ray tracing really isn't worth the compromise in performance and I wouldn't mind buying a graphics card without ray tracing cores that is either cheaper or has more power for rasterization. Why do we have to pay extra for features that most of us don't even use?
Pretty much every generation or two, AMD comes out with a "monster APU" that takes on bottom-tier GPUs from the last generation. Not sure why that is such a surprise, but here we are again.
Because the regular APUs typically come with 8-12 CUs. This has a whopping 40. It has more cores than their mid range GPUs. I'd say that's a pretty big deal.
@ I guess the crux of what I was saying is that no matter how many CUs the APU has, they will always target the low-end market instead of making one toward the higher end of the market.
Yah, RT is cool, but I have to admit, my brain probably can't tell unless I have the images side by side.
wow 4th gpu player
Maybe I will wait 3-5 years before buying any new GPU; maybe that will be the right time.
Exploding GPUs coming to a device near you....
Oh, right on spot.
water❌
Woder✅
Wuatuh?
Vater
So BIG X on any ARM product from now on.
That “wordor” though😂😂😂
Wait, so is “full ray tracing” actually path tracing?