You should do a Minecraft conversion for a war mod, including planes, tanks, boats, machinery, or basically whatever you want to include. I bet you it will be epic. Also, if you make it a modpack, I'm pretty sure it will get popular that way too.
Nah, I would go with Iris and not OptiFine, because Iris has a better shader compiler, costs less performance, and has nicer shaders today. With other performance and general Fabric tools you can triple your default FPS, and it also has nicer shader options.
One critique that I have of SEUS PTGI is that there are little blurs and tears when your character is moving, plus slight visual bugs, and it just looks weird. I can also say that your hand in first-person POV looks dark for me. I love the work these authors put in, and don't get me wrong, I am not saying these are bad shaders. They are the best out there. I just wanted to share my opinion.
For those who are curious: SEUS PTGI is INSANELY fast for a fully path-traced shader, and the visuals are absolutely stunning. It runs at a (marginally) playable 20 FPS on my GTX 1660 Ti at 1080p, and when using the free 32px version of Patrix it runs at about 15 FPS. So if you have a 2080 Ti (which, if I recall correctly, was a MASSIVE improvement over the earlier cards), it should run at a pleasantly playable frame rate.
I recently bought a 4080, a Ryzen 9, literally everything; it's a top-of-the-line computer, and so far in every game I've played, like Rust or Tarkov, I'm at the game's max frame cap. It's amazing.
I tried this with my PC. It's actually extremely beautiful, especially when I'm in a foggy biome. It feels like I'm really there. Because of the smoke coming out of my PC, it's really immersive.
LMAO
bro I'm dying after reading this 😂
Okay you got me lol
ikr it even got the smell and shit
Lmfaooo omfg
Minecraft Java is the worst optimized game ever and for some reason not many people are complaining about it. This is something that Mojang should work on instead of adding new features that are only going to make the game even laggier.
That's true
When they did that in 1.15, people called it a trash update
Main reason why I play 1.12.2 and use mods to add significantly more features than Mojang ever hoped to add, in much higher quality too.
RIIR (Rewrite it in Rust)
True
Complementary Shaders absolutely slap. The visuals are stunning and the impact isn't anything too crazy when paired with Iris. I've tried many others, but I keep coming back to Complementary and BSL sometimes.
Complementary goes so hard, I used to have one similar with black cinematic bars at the top, thats all i miss
What's the website you use to get this REALISTIC Minecraft?
@@GONTE_YT CurseForge or Modrinth. Iris and OptiFine can run shaders, but can't you just search up how to install them?
Try Rethinking Voxels. It's a Complementary-based shader with ray tracing. If you can't run that, then Complementary Reimagined is also really good.
@@Rooftopaccessorizer I will check it out, thanks!
What you said about Minecraft utilizing more of the GPU was technically correct, but you must understand that Minecraft still isn't actually using the 4090 to its full ability: RT cores are not being used. All of the graphics are being bottlenecked through the "generic" parts of the GPU, because neither OptiFine nor Iris supports RT cores in their current state. Iris is working on it, however, and Continuum Graphics is making Focal Engine, a mod which will let Minecraft render through the Vulkan API, which will be able to take advantage of RT cores as well.
CUDA remains the strongest part of any GPU, however.
Standalone RT cores might hardly break the halfway mark in terms of performance compared to the CUDA cores.
I didn't even consider that the game wasn't utilizing the RT cores. I'm glad to hear that Iris and Continuum are working on it, and hopefully we can see a video eventually where it's a reality
@@igameidoresearchtoo6511 I'm not sure you're quite grasping just how much of an impact RT cores have on ray tracing. Cuda cores work fine for a rasterized rendering pipeline, but are horribly unoptimized for ray tracing. When it comes to shaders like SEUS PTGI, KappaPT, and NostalgiaVX, the main barrier to performance (and overall appearance) is the fact that they have to squeeze path-traced rendering into the Cuda cores along with all the other rendering tasks. To fit, corners have to be cut and fps has to be sacrificed. RT cores make a big difference when it comes to the rendering method that they're *literally optimized for.*
@@draco6349 True, but in many games they work together with CUDA to get an even higher frame rate. For Minecraft, though, I really don't think standalone RT cores would even break the halfway mark versus CUDA cores.
Minecraft hardly uses CUDA to its full potential, if even to its designed potential, so I really doubt it would use the much more complex RT cores correctly.
@@igameidoresearchtoo6511 well sure, Minecraft itself wouldn't because Mojang doesn't care enough to optimize it, but that's why there are modders working to integrate support properly.
It boggles my mind how far modders have pushed Minecraft in the decade or so it's been out. I mean, sure, it's not exactly going to run well, but the contrast between the baseline visuals and what modders have accomplished is absolutely stunning.
there's so much to be done when every single component of the game can be isolated into individual "blocks"; customization becomes endless
VScode, game edition
6:56 I love that lighting in the storage area
What's the shader?
@@hopeconfig4072 Minecraft RTX on Bedrock edition
@@hopeconfig4072 bedrock rtx
What is it?@@hopeconfig4072
shader?
If you turn off animated textures, portal effects (or whatever it's called), and some more texture effects, and set all particles to minimal or off, the FPS on Patrix 128x/256x doubles or even quadruples… it's an interesting fix that people should be aware of!
Thank you for sharing !
1:22 Also, it's very noticeable that using shaders and high-end resource packs binds even more performance to the GPU. You can actually see your CPU load dropping when you turn on shaders, as rendering load is taken off of the CPU. I say this running PTGI on an i5-11600KF and RTX 3070.
I run PTGI HRR 2.1 on a 3050 and somehow have 60-85 fps if i use Iris + Sodium
@@ZedDevStuff Thats awesome
@@ZedDevStuff 1080p...
also 2.1 is much less intensive than 3.0.
@@etmezh9073 1080p is by far the most popular resolution (unless that's changed), plus even SEUS doesn't recommend people use HRR 3; he says he'll probably rewrite HRR entirely, because the performance gains compared to the normal version are starting to not be visible anymore
Continuum and SEUS are the two heaviest shaders in the world. If you really want a high-quality, customizable shader that's also really good for your FPS, you need AstraLex; you just need to be patient and customize it...
Seeing how high the fps could go with low render distance was cool. But it also would have been interesting to see how high the render distance can go with a 4090
Render distance is RAM based. Basically you multiply how many gigabytes of RAM you have by 3 to get your render distance.
@@someone11112so if I got 16GB of ram I can easily run 48 render distance?
@@GameOver-nm2us how much of that have you allocated to minecraft?
@@someone11112This is a somewhat misleading response. While render dist does heavily depend on the amount of ram you have allocated, your CPU is heavily taxed at higher render distances. This is often why you'll see a reduction in your GPU usage as you increase your render distance; your CPU bottlenecks your GPU more.
@@someone11112 I should reinstal minecraft now that I upgraded from 16GB to 64GB to try some things out I guess xdd
Man i always enjoy watching incredible shader videos for Minecraft. The great visual ones that aren't a texture pack and still have vanilla blocks make the game feel so much immersive.
I want to note that I wanted to replicate the highest-framerate video in Minecraft, and there is still a lot you can do to optimize the FPS even more. When I first got my RTX 3070 Ti, I used Lunar Client and turned off everything; I used 1.8.9 and turned everything to low, and hit a whopping 6200+ FPS using an RTX 3070 Ti and a Ryzen 9 5900X. I wanted to make a video out of it, but a week later my Minecraft account got hacked.. I know I have a photo of the FPS somewhere lol
@@anonimus11236 yeah i did the exact same thing turning everything off from my pc
@@anonimus11236 maybe it's my drivers but Linux performs about the same for me
@@anonimus11236 wow, does dynamic wallpaper really change performance in a measurable way?
@@lemonaut1 Probably not that much, so i would say no more than 10 fps less, but it does depend on the hardware. But anyways if you are trying to get max possible performance, every frame matters
@@anonimus11236 or a custom image of windows. Like Tiny11
The Rethinking Voxels shaderpack (or Complementary Reimagined) combined with the Faithless texture pack makes the game look completely incredible, highly recommend
The Barely Default resource pack too, cant forget that!
i think rethinking voxels is deffo going to be the go to shader when it comes out of beta, great visuals with good performance
Imagine showing this to someone in the past playing the alpha version of the game
Rethinking Voxels (a Complementary Reimagined RT edit) could also be a big thing after a few more updates. The light system on that thing is completely unreal, on par with or above Bedrock indoors and any other RT shader I've tried, and there's a huge plus, which is that it is still Complementary
I'll need to check it out!
@@AsianHalfSquat I meant Rethinking Voxels, that’s the proper name btw, sorry lol
Rethinking Voxels is probably the best shader I've ever used; the auroras in snowy biomes are crazy
@@TheLivingHuman that's from complementary reimagined though, so if you use it for only that, then you'd be better off using complementary reimagined for improved performance and general visual fidelity (minus coloured light)
@@gri5733 yeah I get halved performance on voxels compared to complementary.
I would love to see this again but with the fabulously optimized modpack :D
I saw it on Modrinth, but you could do better. They put a lot of unnecessary bloat mods in there; you are better off just picking the optimization mods out of it and deleting the rest.
@@idedary Yeah, I don't understand the bias people have with this modpack lol, maybe they advertise it. There's plenty of better modpacks out there for that
@@Cronalixz I agree. I can get 50% more performance by just selecting the mods myself, as I did for my server. When I look at the mods list, there is unnecessary stuff hogging the resources. It's a clickbait modpack.
@@idedary you should upload your modpack like that and advertise it as "fabulously optimized without the bloat" lmfao
@@Cronalixz he should actually do that because im stupid and i cant do what he says because im not smart enough
Finally started ordering the parts for my PC today with an RTX 4070 and am super excited!
Windowed mode is brutal. The ultimate challenge
Back when I had intel hd graphics 530, I either used windowed mode or fullscreened in an ugly resolution.
What about it? I play Minecraft in windowed mode, because my monitor is 59hz and any game on full screen will cause screen tearing.
@@lost4030 just cap the FPS.
@@etmezh9073 I use vsync and it still happens. Setting it myself doesn't work either, because my monitor is actually 59.6Hz, and obviously I can't put that in the settings
@@lost4030 set it to 60hz then instead of 59.6
unless there is no "60.000hz" option
The reason Bedrock shows less GPU usage and was locked at 72 FPS may be vsync: if it is unable to reach 144 FPS (assuming that is your monitor's refresh rate), it will lock to half of 144, which is 72. Because the FPS is capped, it ends up using less GPU
i dont understand what any of this means i just know that bedrock runs faster than java especially when playing in biomes like mangrove swamp / jungle. wanted to cry when i went on java and saw that the things i love exploring in bedrock is being laggy af lol
@@nafsii04 if the FPS with vsync on doesn't reach the monitor refresh rate, it drops the FPS to half of the refresh rate.
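To illustrate the fallback the replies above describe: with classic double-buffered vsync, when the GPU can't sustain a full refresh, the cap drops to an integer division of the refresh rate. A minimal sketch (the function name and divisor list are illustrative, not Bedrock's actual code):

```python
def vsync_cap(achievable_fps, refresh_hz):
    # Classic double-buffered vsync: if a full refresh can't be sustained,
    # fall back to the next integer division of the refresh rate.
    for divisor in (1, 2, 3, 4):
        if achievable_fps >= refresh_hz / divisor:
            return refresh_hz / divisor
    return refresh_hz / 4  # floor for this sketch

print(vsync_cap(200, 144))  # 144.0 - fast enough for every refresh
print(vsync_cap(100, 144))  # 72.0  - can't hold 144, drops to half
```

This matches the 72 FPS lock described above: the game could not hold 144, so vsync snapped it to 144/2.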
@@nafsii04 but since they did the Render Dragon update to Bedrock, it has started to get laggy..
@@nafsii04 bedrock is written in c++, its just native code. It is gonna be a lot faster. But java runs on a virtual machine. But thats why minecrat java has a lot of support for modding and compatibility
Yes this is correct, he has to edit some configs manually to disable it! With my 3060 I was stuck at 48 for some reason but after editing the config I got like 60.
Interestingly, one thing that's clearly noticeable in the ray-traced, high-resolution versions of any game is the power of the sun! It can now showcase sunlight to a more realistic degree and create enough difference between the regions in bloom and not in bloom, effectively also increasing the f-stops!
3090 ti was a serious leap in graphics card technology.
But the 4090 alone is the Neil Armstrong of graphics card leaps.
Minutemen?
@JavaScrapper YES.
More specifically, the fallout 4 modded Minutemen Republic flag.
I'm just saying, OK, don't say I'm stupid; just imagine taking this shader and trying to run it on a phone with 2 GB of RAM, and it's on fire, yay
awesome analogy dude.........lol
not really.. the biggest graphics card leap in history was the monster that was the GTX 1080 Ti. That card was so powerful not even the next-gen RTX 2080 could beat it; the 1080 Ti was waaay ahead of its time
F3 reduces your FPS because the text rendering is not optimized. Every frame it has to rebuild the strings letter by letter as far as I'm aware. I don't think it has any kind of caching.
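For what it's worth, the kind of per-line caching the comment above says F3 lacks would look roughly like this. The class and the glyph stand-in are hypothetical, just to show the idea of not rebuilding unchanged text every frame:

```python
class TextOverlay:
    """Sketch: cache rendered debug lines so unchanged text isn't rebuilt each frame."""

    def __init__(self):
        self._cache = {}   # text -> prebuilt glyph data
        self.rebuilds = 0  # counts how often we actually rebuild

    def _build_glyphs(self, text):
        self.rebuilds += 1
        return [ord(c) for c in text]  # stand-in for real glyph layout

    def draw_line(self, text):
        if text not in self._cache:
            self._cache[text] = self._build_glyphs(text)
        return self._cache[text]

overlay = TextOverlay()
for _ in range(100):           # 100 "frames" showing the same line
    overlay.draw_line("FPS: 60")
print(overlay.rebuilds)         # 1 - one rebuild instead of 100
```

With a cache like this, only lines whose text actually changed (FPS counters, coordinates) pay the rebuild cost each frame.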
This is litterally the graphics I expect for a minecraft movie
4:45 is how my computer runs completely vanilla Minecraft lol
Love the extra editing it looks cool
Bliss shaders look amazing in the nether with the atmospheric fog
You need to see the load of each core of the CPU with some software like HWinfo to know if you are throttling it. Minecraft probably runs on just a couple cores, so if you have 16 cores it could be throttling 2 cores to the max but the task manager will only show you a 12-13% CPU usage.
It's Java: the main game loop runs on a single thread. Windows just moves that thread between cores, which makes it look like the load is being shared, but it actually isn't
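The arithmetic behind the 12-13% figure mentioned above is just the mean across all cores, which is why a Task-Manager-style total hides a saturated thread. A quick illustration (the function is a toy, not how Task Manager is implemented):

```python
def aggregate_cpu_percent(per_core_loads):
    # Overall CPU % as the mean of per-core loads: a fully pinned core
    # on a high-core-count CPU barely moves the total.
    return sum(per_core_loads) / len(per_core_loads)

# 2 cores pinned at 100%, 14 idle -> the overall figure reads only 12.5%
loads = [100, 100] + [0] * 14
print(aggregate_cpu_percent(loads))  # 12.5
```

This is why per-core views (HWiNFO, or `psutil.cpu_percent(percpu=True)` in Python) are needed to spot a single-thread bottleneck.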
I think you should have used the physics mod to really push the GPU and I think we all wanted to see how it looks. Love your videos, please keep posting these amazing videos
this "destroying gpu" series is just getting better every year wow
The intro is half the video
yes
Looking forward towards the 5090 benchmark!!!
4090 + 20fps
👍
you should've used the physics mod to see what minecraft could really be like
0:22 who seen those blocks launching in the background?!
*CHICKENS*
They were goats
Song from 0:12 is The Ancient Dragon from Dark Souls 1 :)
2:51 I choked on my food after hearing that
This man will never stop upgrading his PC with the latest and best graphics cards, will he? 😂
😄
Yeah, and some shader developers will have to up the ante with the GPUs lol
One thing to note, if your gpu isn't at about 90% utilization or above, it IS the cpu that is the bottleneck. That's why your 4090 was hardly ever being used fully in this video. If you upgrade your cpu you should see a big increase in fps.
Hey man, i recently got a I9, do you think that's going to yield better results than the video?
@@TBone.Gaming there are many variables, but it would depend on the generation of it and your GPU. If you have a 4090 and the newest i9 (I'm not very familiar with these), then it would be better than this video, but still the bottleneck, as it may take another year for CPUs to catch up to the 4090
@@TBone.Gaming prolly not, I heard the 13900k is only about a ~5% increase from the 13700k strictly for gaming. Every cpu on the market is bottlenecking the 4090 at this point so the only thing you can do is play the waiting game.
6:42 This is what i remember minecraft to look like as a kid
When raytracing shaders start working with LOD mods, I’d love to see you visit this idea again.
I struggle to see how ray tracing could be compatible with LOD. Maybe LOD could simplify textures and models in the distance for VRAM benefits? But that doesn't solve the light rendering, which is what I would call the most intensive part. Maybe it can limit the number of light bounces after a certain range, but even then, that would make lighting from the sun and moon janky.
Maybe it's possible, but based on my current knowledge of ray tracing, that would be very hard for a developer to implement correctly.
@@legendarylava8775 I'm not even thinking of it being such a coherent compatibility, just for there to not be significant visual glitches when both are used at the same time, which currently occurs with LOD mods and most even non-raytracing shaders.
Seeing only a 17% or so CPU usage is completely normal and can still be a CPU "limitation", as Minecraft can't use lots of cores like other games. So even if it is maxing out the 2 cores it uses, your 16-core CPU won't show a high percentage load, but can still cap the framerate. (The single-core speed is the limiting factor :)
This
I love watching videos of stuff I will never be able to afford.
Imagine this on GT710 💀
"With that out of the way, we're going to take things up a 'Notch'"... I see what you did there hahah (3:56)
No? He didn't emphasize it?
Obviously it wasn't intended, but it's still a pun :| @@malindrome9055
fun fact: There is an advantage to having FPS past your monitor's refresh rate, as it shows you newer frames at each refresh rather than showing you more of them
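A back-of-the-envelope way to see that "newer frames" point: with uncapped rendering, the frame being scanned out finished at most one render interval ago, so raising FPS past the refresh rate still shrinks that staleness (a rough latency model, not a measurement):

```python
def newest_frame_age_ms(fps):
    # Upper bound on how old the most recently completed frame is
    # at scan-out, assuming uncapped rendering (tearing allowed).
    return 1000.0 / fps

print(round(newest_frame_age_ms(144), 1))  # 6.9 ms at 144 fps
print(round(newest_frame_age_ms(576), 1))  # 1.7 ms at 576 fps
```

So on a 144Hz panel, rendering at 576 FPS still cuts the worst-case frame age roughly fourfold, even though the monitor can't show more frames.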
Where's Your This Year's April Fool Video
Ordered my 4090 a few days ago and it's on its way, got a 4K monitor ready! This video popped up, and seeing it makes me all the more excited to open that box!
damn you're rich i wish i could ever get one.. im stuck with a geforce gtx 1070
@@Bl0xxy managed to get super lucky and got one. For sure worth setting money to the side, even if little by little. A monster worth selling your soul for haha. But also fairly close to overkill, not really necessary. But sometimes I do renders for my friend's games he makes, so I kinda need it.
My favourite shader has to be Rethinking Voxels. It's so cool to actually see Minecraft have that much realistic lighting
What you said about Bedrock RTX shaders I completely agree with. Indoors looks absolutely fantastic, especially paired with a good texture pack, but outdoors usually just looks decent to sometimes plain bad for a shader
It wouldn't be so bad if it wasn't so glitchy and you could disable the atmospheric blurring crap. It's technically the most advanced in terms of true RT, but it looks surprisingly mid in all but the darkest of scenarios.
I love playing competitive pvp with my 4090 and 13900k with no shaders on 1440P!
You should go for 1080p 500Hz. Yes, 1080p 500Hz is actually a thing now; Asus made the first 500Hz monitor last year
@@bretttanton328 i dont notice anything above 240hz, and i rly only need a 144hz one
@@0h_hey944 You spent all that money on the world’s fastest gaming PC, so you might as well get the world’s fastest gaming monitor. You will definitely notice the difference at 500hz.
@@bretttanton328 ik, i have played on a 1080p 360hz before, didnt notice so i returned it and got a 144hz 1440p. honestly its fine.
@@KyjiPurr and other games, but mostly minecraft
My favourite shader by far is Astralex. Its sky, sun, and moon customization is unmatched.
Waiting on the next optifine
You should have used a performance overlay like Afterburner to see the gpu usage. When I tested some super high quality shaders with my 7900xtx, lowering the render distance to 12 made the game utilize the gpu and cpu more, raising my fps from 60 to over 300.
I just wish I had a computer that can run Minecraft normally without any mods or shaders
Laptop is the best way for cheap gaming. Less than £500 and you can play minecraft. If you get a cheap mouse it's no different to a PC.
6:10 it doesn't even feel like Minecraft anymore
hey asianhalfsquat, no april fools day video this year? 😩
The smoke coming out of my phone after watching this video is too realistic 🔥
april fools where
if someone could make a mod compatible with the 40 series's frame generation that would be a game changer
Great to see you are a DRG fan
1:00 What shader is that? Or is it bedrock RTX?
Nvm its probably rethinking voxels
No Optifine 4 this year 😢
Yea but look, Minecraft 1.20 is out now :L
i really love rethinking voxels, its one of the prettiest packs ive ever used, being fully path traced, and it gets better performance than PTGI while also having colored light.
This man is the reason for half of Nvidia's profit
😭
@my account got suspended :/ it's a joke, he isn't actually
optifine 4?
Nvdia: The rtx 4090 can run anything
Minecraft: Let me even the odds
WHERE IS MY OPTIFINE 4
Minecraft,
THE GRIM REAPER FOR ALL COMPUTERS
Nah bruh thats hell 5:50
Ya, like when i first bought minecraft, I was so dumb, but I watched a video of a player going to the nether, and I tried the tutorial on how to make it, and when i got to the nether and saw it I was like "bro i just found hell in minecraft"😅😅😅
Oh my pccccccccc
You should use Euphoria Patches or the Rethinking Voxels mod (addons to Complementary Shaders), they add even better lighting
Where's the big boi Optifine 4 april fools video?
;(
Waiting for Optifine 4 update.........
Who wants to bet he'll get the 5090 when that comes out
WHERES OPTIFINE 4
Where optifine 4?
I am a fan of the Luna Texture Pack :)
Have you tried the Rethinking Voxels shader? In my opinion it's one of the best shaders rn.
Yeah, those are really epic.
Yes, except on the max preset. With my 3090 at a resolution of 1440p, I get only 6 fps
@@_Zelkova bro that's good XD I get 4 frames or my PC crashes when I use them
where's optifine 4?
/??/?
You should do a Minecraft conversion for a war mod, including planes, tanks, boats, machinery, or basically whatever you want to include. I bet you it will be epic. Also, if you make it a modpack, I'm pretty sure it will get popular that way too.
No april fools joke? 🥺👉👈
wait what was the game at 0:32 ?
Deep rock galactic
I really hope maybe in the next decade that computers accessible to most can really use these graphics
Bro where is optifine 4
Nah, i would go with Iris and not OptiFine, because Iris has a better shader compiler and costs less performance, and Iris has nicer shaders today. With other performance mods and general Fabric tools u can triple your default fps, and it also has nicer shaders and options
no april fools vid :(
Nobody is asking about this man’s godly specs…
One critique that I have with SEUS PTGI is that there are little blurs and tears when your character is moving, plus slight visual bugs, and it just looks weird. I can also say that your hand in first person POV is like dark for me. I love the work these authors put in and don't get me wrong, I am not saying these are bad shaders. They are the best out there. I just wanted to share my opinion.
When your computer runs hot enough to smelt alloys:
Soon you're gonna have to do this with an RTX 5090.
Are you kidding me that’s so cool you deserve a sub man.
this is how i remember minecraft looking in my childhood
i havent watched you for a year or 2, ive never heard you so active and happy, is it just me or what?
I'm going to cry, he didn't upload an april fools video 😭😭
5:26 those clouds are CRAZY
For those who are curious:
SEUS PTGI is INSANELY fast for a fully path traced shader and the visuals are absolutely stunning. It runs at a (marginally) playable 20 FPS on my GTX 1660 ti at 1080p, and when using the free 32px version of Patrix it runs at about 15 fps. So if you have a 2080 ti (which if I recall correctly was a MASSIVE improvement over the earlier cards) then it should run at a pleasantly playable frame rate.
cant wait for you to test the 5090
Congratulations on 1mil btw
I would love to see you try rethinking voxels, it might not be the best, but it’s just beautiful in its own way.
rethinking voxels
Minecraft in ultra 4K, what a time to be alive!!!
I recently bought a 4080, Ryzen 9, literally everything, it's a top of the line computer, and so far in every game I've played, like Rust or Tarkov, I'm at the game's max frame cap and it's amazing
ok but running Portal RTX, with i assume no upscaling or frame generation of any kind, at 100 fps is WILD, this thing's a BEAST (imagine the 5090 😭)