Can you do a CPU comparison for Minecraft at this point? Like 5800X3D vs 5900X vs 13600K with the RTX 4080, from 1080p to 4K. You could do as many CPUs as you want, but it would be nice to see which CPU works best for this game. Also spawn a huge group of mobs in a fence to see how they handle a cow farm, for example.
@@yancgc5098 I did not ask that, I asked for a comparison. In fact, you could be wrong. What if Minecraft takes advantage of the 3D V-Cache in multiplayer scenarios? It's just not that simple of an answer. I've also seen the 13600K lose to the 5800X3D many times. It all depends on the test scenario, because they are each better in certain areas.
@@Purified1k Well you can throw the 5900X out of there since Minecraft Java is very single threaded. Between the 13600K and the 5800X3D it depends whether Minecraft loves that amount of 3D V-Cache or prefers the adequate amount of cache alongside the superior single core performance of the 13600K. My money is on the Intel CPU
@@yancgc5098 But you forgot the part where I just want the comparison in general. So yes, even though the 5900X might not be as good, I want to see how it stacks up.
I use the Complementary shaders with Sodium on my GTX 1070; at 1080p Fancy the game stays above 60 fps at all times. There's a cool setting in the Sodium menu that lets you dedicate CPU threads just to chunk rendering, so even on my 9-year-old 8-core Xeon E5-2667 v2 I can use 32 chunks and get good fps. It defaults to 1 thread.
Minecraft is the only game that can run on potato GPUs and can also destroy an RTX 4090 with shaders, physics mods, and texture packs, all because of this big community.
@@zWORMzGaming Good stuff, I'm planning on upgrading from a Ryzen 3600 to a 5800X3D. My old GTX 1070 will most likely be replaced with a 4070, since my case is small (Meshify C).
24:18 What, 80 years ago? 😂 Or am I hearing it wrong? Also, decrease the simulation distance to 5; that cuts CPU load a lot without any noticeable change.
It would be cool to see the performance of Continuum RT, because that's probably the most intensive shader there is. It would be interesting, though it's kind of a pain to install.
I don’t even care for the 4070 I just watch this man’s videos because they are honestly so strangely enjoyable
well even if you don't care, for a mid-range card to survive Minecraft shaders and RTX is pretty impressive, especially with that low power draw
same
@@NostalgicMem0ries if 4070 is mid then mine is pure fuck
@@W1LlzA well most are entry or low tier tbh, especially if they are older than 2-3 years. My 1660 Ti was definitely mid tier in the past; now it's entry level at best
@@NostalgicMem0ries I have a gtx 1650 😦
Best to test the voxel shaders in a cave or at night time. They use a dynamic lighting system, which is why it's so intensive. Torches, lava, and other coloured light sources can cause the surrounding blocks to cast hard shadows that update in real time. These shaders are also based on the Complementary Reimagined shaders, which is why they had the vanilla-looking water by default.
I know I just love them
Yeah, for real, I strongly agree, and I strongly suggest it for people who like caving a lot, since (in my opinion) having a shader enabled while exploring a cave looks really good, and sometimes it feels like a real-life experience to me and to a lot of people who are really into it! I think these specific shaders (voxel shaders) are very cool, even with the water if you come across some in a cave! Tbh, I strongly suggest people try these shaders at least ONE time!
Is this a joke? :)
Nice ^^
As someone in the market for a 1440p GPU, these 4070 focused videos are a treat!
Minecraft with shaders definitely on the bucket list.
I definitely see why you're unimpressed by Rethinking Voxels, but above ground in the daytime is its weakest point. To understand how amazing its visuals can be, you have to go cave exploring to see lava and other light sources. It also looks significantly better at night.
The Rethinking Voxels shader doesn't use path tracing or anything; it uses voxel-based lighting, which I don't know too much about, but it should be interesting to see it interact with light sources.
The blocklights (and shadows) are path traced, so yeah
Voxel based lighting comes under the umbrella of ray tracing
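For anyone curious what "voxel-based" means in practice, here's a toy sketch of the general idea (my own illustration, not this shader's actual code): instead of intersecting rays with triangles, you march a ray through a coarse grid of blocks and check whether anything solid sits between a point and a light source.

```python
# Toy 2D version of voxel occlusion: step from a point toward a
# light and report whether any solid voxel blocks the path.
def visible(grid, start, light, steps=64):
    """March in small increments from start to light; True if unoccluded."""
    (x0, y0), (x1, y1) = start, light
    for i in range(1, steps):
        t = i / steps
        x = int(x0 + (x1 - x0) * t)
        y = int(y0 + (y1 - y0) * t)
        if (x, y) != start and (x, y) != light and grid[y][x]:
            return False  # hit a solid voxel: point is in shadow
    return True

# 0 = air, 1 = solid block
grid = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
print(visible(grid, (0, 1), (4, 1)))  # blocker at (2, 1) -> False
print(visible(grid, (0, 0), (4, 0)))  # clear row -> True
```

Real shaders do this per pixel on the GPU with a proper grid traversal and distance falloff; this just shows why block lights can cast those sharp shadows.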
This shader list is so much better than what you normally run! I love how you have ray-tracing shaders to squeeze every last bit of performance out of the card. It would be really interesting to see you run SEUS HRR 3 on older tech too.
In shader options, you can set the render quality to 2x. This is similar to DSR: frames are rendered at a higher resolution and downscaled. But it will heavily impact your fps. I like to use it on vanilla graphics.
EDIT - only possible in optifine, not iris.
hey, I have a 3070 and I'm trying to find a good shader, any recommendations? What's the best-looking one that's still playable?
@@TuxikCE thanks
@@aliahmedov4674 Complementary is pretty good for playable frame rate and looks nice too
@@tsumuphi9206 yes and rethinking voxels also
Super duper vanilla
This is an odd comment, but would you do me a favour? In Bedrock Edition, turn on RTX and teleport yourself to 20 million and then 30 million blocks on the X or Z axis. You will see something extremely weird, and please share it with us. I know what you would normally see, but I never saw it with RTX, which might give even more interesting results...
Interesting, I'll try!
sparked my interest
Just tried it, you end up with these weird-looking strips of blocks. I tried it on a superflat world and ended up with really long strips of the grass texture, with really long strips of the bedrock texture a few blocks underneath and slightly off to the side.
@@Papa_Parry I know, but does ray tracing show something more weird?
@@TuxikCE i did it with ray tracing so i'm assuming it's just the same result as non ray traced
Personally, I like the SEUS PTGI shaders the most. They don't look that great with the vanilla Minecraft textures; they're meant to be used with PBR texture/resource packs. That way they look way better, though the game can get a little more intensive.
I would love to see you review the 4070 TI GPU in MINECRAFT.... just to see how well it does!
btw, the SEUS Renewed shader looks way better when you go into the "Post-Processing" settings and change the Tonemap Operator setting to ACES instead of SEUS. You'll get your vibrant shader then, with perfect lighting
yes would be perfect
The reason Sildur's Vibrant with Volumetric Lighting doesn't have the cool "god rays" is because Volumetric Lighting has god rays of its own. For the next video I recommend installing the regular Sildur's Vibrant, not the Volumetric Lighting versions, or disabling Volumetric Lighting.
The reason that you, Kryzzp, aren't seeing any godrays in Sildurs Vibrant shaders is because Volumetric lighting turns off godrays internally even though you turned them on. You can see godrays by turning off the Volumetric lighting option from "Sky and Lighting" settings. Thank me later and love your content!
Hello I am a 74 year old man from Uruguay, and can I just say that I love your content! The deep philosophical barriers really reflect on what post war modernism has done to the average psyche. Your in depth arguments truly evoke discussion, for example in this video you commented on the obscenity of the war on drugs in Switzerland. What an amazing statement! Please continue what you are doing... the world needs people like you
what the hell? xD
I'm sorry sir, this is a Minecraft video
09:09 didn't think I would cry in a benchmarking video
I wish that SEUS PTGI would work on AMD as well.
Also, Rethinking Voxels needs loading time to compile the shaders, plus the latest Iris. I would recommend the Fabulously Optimized modpack for testing.
With Rethinking Voxels, DO NOT set its profile to Ultra, because it makes no difference in visuals but for some reason destroys performance. Where it's gorgeous is block lights: set the time to night, go to a village or a cave, and place torches and other light sources. It looks absolutely beautiful; each one creates nice sharp colored shadows, even better than something like SEUS HRR. It is intensive, but if you set the settings right it can run quite alright. I mean, on my GTX 1050 Ti at 1080p I managed a solid 30+ fps with beautiful visuals; on an RTX 4070 at 1440p it may sit at 120 fps.
very nice video
also could you perhaps make a minecraft showcase with realistic mods and shaders for the rtx 3060?
you deserve much more subscribers and views my friend .....keep up the good work
I would love to see RT overdrive implemented in Minecraft.
I love the look of Seus PTGI so much
The reason Rethinking Voxels is so intensive at Ultra is that it has real-time sun shadows; pretty much, if a cloud blocks the sun, everything goes dark. It's extremely intensive even on a 4090. Also, the lighting is voxel-based; there's nothing like it. It's such a good shader if you can run it.
My brother in law just showed me your video. He said you look just like my son. I didn't believe him. So here I am and sure enough, you look just like my son!!!
(Insert traffic light emoji)
25:52 bro the horizon zero dawn lens flare and Gta vice city lens flare too 😂😂
Hey man, could you do updated videos for the RX 5700 XT? I know you already have videos about this card on your channel, but I feel like they are a little outdated, and I would love to see how the performance of this card has changed over the years :P.
You can lower the simulation distance. If you set it to something like 6, then beyond 6 chunks of distance mobs will not move, redstone systems will stop, etc. It will help the CPU.
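Rough sketch of what simulation distance does (my own illustration, not Mojang's code, and assuming a simple square radius): with simulation distance N, only chunks within N chunks of a player keep ticking; farther chunks are still rendered but their mobs and redstone freeze.

```python
# Hypothetical helper: treat chunk distance as a square ring
# around the player's chunk (Chebyshev distance).
def is_ticking(player_chunk, chunk, sim_distance=6):
    dx = abs(player_chunk[0] - chunk[0])
    dz = abs(player_chunk[1] - chunk[1])
    return max(dx, dz) <= sim_distance

print(is_ticking((0, 0), (4, 5)))  # within 6 chunks -> True
print(is_ticking((0, 0), (7, 2)))  # 7 chunks away -> False
```

Lowering the number shrinks that ticking ring, which is why the CPU load drops while the view distance can stay the same.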
26:33 of course, objective, not subjective at all XD
nice slip up, I got a good laugh
Please also try benchmarking with high-res texture packs. They do hit your fps; not by much, but a bit.
The reason you've never seen a mansion before is because they are rare around spawn; usually one is 5k-10k blocks from spawn, and they keep appearing roughly every 5k-10k after that. There is loot, but it's hidden and you have to find it. Not all mansions are the same, either.
idk if you know this, but you can use /locate structure to get the coordinates of the nearest structure you want,
for example /locate structure Village (it got updated in 1.20.6, not sure about that one),
and /locate biome plains to get the coordinates of a plains biome
It would be pretty cool if you used the Nvidium mod; I think the game would look amazing with those shaders with it
Yoooooo, another video I was waiting for! Love you bro, keep it up; one day you will reach 1 million
I just love the surprise on non-Minecraft players when they find woodland mansions 😂
These shaders are so good, YouTube on my phone crashed
Minecraft is the game that will always be laggy on every PC; it just depends on how many mods and shaders you have
Test Soft voxels shader. In cave too!
And perhaps make fog less intensive.
I can play MC on full settings on a full HD screen (GTX 970), but shaders don't reduce performance at full HD
This is the best video for me right now, because I'll build a PC in a few months, I've planned on buying the RTX 4070 Ghost, and I mainly want to play MC on it.
What application is he using to show the CPU and GPU usage in game?
MSI Afterburner with RivaTuner Statistics Server
Your videos(especially the intros) make my day :D
Extreme VL means extreme volumetric lighting, and when VL is turned on it disables god rays
SEUS HRR 2.1 and HRR 3 are amazing, but you need to go into the settings and play around, because the defaults feel a little too vibrant. Also, I recommend HRR 2.1 because it's more stable and gets more fps than HRR 3.
The 4070 isn't even that bad of a card but its price is crazy.
yeah, it should be 500, max 550, since you can buy an RX 6800 XT, which has 16 GB of VRAM, for 550 euros.
The 4070 delivers about the same performance as the RX 6800 XT, with the downside that the 4070 is more likely to run into VRAM issues (spillover, stutters, etc.)
@@limeisgaming
I remember getting an open box 1070 FE for 300 dollars back in 2018.
I used this video and the 6900XT MC shader video (I upgraded to the 6950XT from my 1070) to see the difference between the two cards, as they were around the same price, $600. According to these two videos, the 6900XT beats the 4070 (project LUMA results have a 100 FPS gap OMG). Not sure how far drivers have progressed since the 6900xt video came out so it might be different, but still, Damn.
I chose the 6950 XT originally because, although the 40 series had all the new bells and whistles, I mainly wanted a GPU that could play Minecraft modpacks with shaders at large render distances. DLSS 3 and Frame Generation were not going to help me get over 240 Hz in Minecraft Java, so the Nvidia fanboy-feature argument I see in YT comments was out the window.
Funny thing is, even though by buying RDNA2 instead of 40 series, I gave up the potential to enjoy frame gen in newer games (if I decided to play them), apparently AMD is going to bring it to my GPU anyways with FSR 3, and **Maybe** even give me the option to enable frame gen on the older DX11/DX12 games, It is funny how things end up working out.
Sorry for the essay, I just wanted to share my thoughts on a video about the main use case for me having such a powerful GPU.
The rethinking voxels allows colored lighting which is cool
I'm using a 4070 with an R5 5600X and my frames are a lot lower, idk why and I can't find the issue
You've listed the problem
Did you, by any chance, voice Homer Simpson? And now you can't unhear it.
Nobody:
Bob every time he gets killed: 5:36
I am looking to get into 1440p gaming, what is the best GPU you would recommend for 1440p?
4060/70 is fine bro
Hey kryzzp which storage solution should i go for?
a 256 GB M.2 NVMe SSD + 2 TB 7200 RPM HDD, or a 1 TB M.2 NVMe SSD?
I would go for the 1 TB SSD for now. When you run out, buy a cheap SATA SSD. Even though I have an SSD for Windows, I have 6 TB of HDD storage for my games and I really regret it. Load times aren't fun.
Don't forget to place ice pack over the gpu
Sildur's shaders disable godrays when volumetric lighting is enabled, I think; you can't have godrays along with VL. It's because of how volumetric lighting works.
Talking about the woodland mansion you found: I once had a moment on PS4 where I created a random creative world, flew across the world, and found a village and a woodland mansion right inside one another. That was funny 😂
I think after 2 more videos you're gonna blow up that monster GPU 😂 After testing 16K res at ultra settings and getting 10 fps, what do you think you'll get next? 😂❤
"this is how i have to buy a new GPU"
I wonder what the 4060 Ti will be like, and at what price point. I'm guessing it will be like a 3070 Ti in power, with DLSS 3 of course, at a price of probably 450.
host a radio show or podcast or something you have an amazing voice bro
If nobody got me, I know BSL shaders got me. Can I get an amen?
Pls make a vid on gta 5 natural vision evolved rtx 3050
For those of us who have a GTX 1070: would we be able to run these at 1080p?
As a 1650 D6 owner: everything starts up and works fine (~40-50 fps with Iris)
Thank you very much for your love and respect guys ❤️
😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂🎉😂🎉🎉🎉😂😂😂😂😂
What do you think about the Gollum game's requirements?
Me watching this video at 720p
ok nice test video
Me playing with Intel HD 610 and Pentium 4417U running Minecraft 60FPS with terrible stutters:
"I wish I had that PC"
If I get a 4070 and a 12600K, will I get roughly the same FPS? I mean, shaders and 12 chunks are more on the GPU side. (I'd be using those settings)
You should try the photon shaders!
Which is better for Minecraft with heavy mods and shaders: the RTX 4070 or the AMD RX 7800 XT?
38:35 You're using a 4K monitor; afaik super-resolution essentially works like AA (it has to average 4 pixels down to 1, so it's like 4x SSAA the hard way), so you would see less aliasing
39:30 well it is running at your monitor's native res
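To illustrate the averaging point above, a minimal sketch (plain Python, no real renderer involved): downscaling a 2x-rendered frame averages each 2x2 block of pixels into one output pixel, which is exactly what softens jagged edges.

```python
# Ordered supersampling in miniature: average non-overlapping
# 2x2 blocks of a grayscale image down to one pixel each.
def downscale_2x(img):
    h, w = len(img), len(img[0])
    return [
        [
            (img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard edge at 2x resolution...
hi = [
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [255, 255, 255, 0],
    [255, 255, 255, 0],
]
# ...becomes softened intermediate values at native resolution.
print(downscale_2x(hi))  # -> [[63.75, 255.0], [255.0, 127.5]]
```

The intermediate values (63.75, 127.5) are the "smoothed" edge pixels, which is why rendering at 2x and downscaling looks like anti-aliasing done the expensive way.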
Bro, my 4070 gets only 40 fps on the sea ruins map on RuHypixel, and I've optimized everything everywhere
I just bought one and I hope it's good, this is for my first pc
Complementary is the best shader for a fantasy theme
The fact that a GTX 1650 or 1660 can run one or two of these shaders at 35 to 60 fps, yet an RTX 4070 still struggles despite its high price
The fact he just destroyed one of the rarest structures in the game
Nice video bro. Btw, you don't need to wait for day in Minecraft; just type /time set 0 or /time set day and there you have it
SEUS renewed shader is amazing with the patrix resourcepack
I went with the Gigabyte 4070 Ti Eagle OC. This thing is an absolute monster; while not cheap at $800, it's still $400 cheaper than a 4080 and $800 cheaper than the 4090. I play at 1440p and I couldn't have chosen a better GPU, love it
I agree... it's the best-value card for 1440p... I bought it recently, but with an i5 11400F I have a CPU bottleneck, so I'll have to upgrade my CPU soon
@@MarkoMarkovina I paired mine with a Ryzen 7 7700X for $300 and it came with Jedi Survivor. The thing runs amazingly, much better than I thought it would. I came from a laptop with a 2070 and an i7 7000-series CPU; it crushes that tenfold. $300 is inexpensive for a CPU these days and it's been a monster. I recommend looking into that one.
@@TheAscendedHuman actually, because of frame generation the 4070 Ti is way faster than the 3090 Ti! That's the whole point. Yeah, people who can't afford it will always cry about it
@@TheAscendedHuman Agreed bro, and with your other comment. All these reviewers telling their subs not to buy shit, it's not their money. We know how much it is, but it's an investment, because I'm not gonna need anything new for around 10 years. They all complain that their already-144+ frame rate just isn't "good enough", because they want to squeeze out an extra 10 frames every generation that makes no difference. Everything I play right now is maxed out and above 100; even Squad only drops as low as 80, which is still fast. So done with YouTubers crying lol
You are saying you won't change from a GTX 780 Ti from 10 years ago to an RTX 4080 Ti because you can play today's games at 100+ frames at 1080p. But where is the GTX 780 Ti now? It can't even hit 60 fps in Cyberpunk at 1080p native, low-medium settings, without FSR. So calling this an 'investment' is not true, and these GPU companies always want you to buy their new releases. If you have the money, buy the GPU that satisfies your needs, and do your research by watching the reviewers or reading around the internet so your hard-earned money doesn't go to waste.
Hey bro, I was wondering, what are the actual clocks of your CPU? Do you see any performance benefit going from the stock 5.1 to some other overclock, say 5.5? I've tested it with so many games and it feels like the 13600K performs the same at any clock. I paired it with a 3080 10G.
iirc Kryzp's 13600K is running a locked 5.6GHz OC on the P-cores
I run 5.6GHz on the P cores and 4.2GHz on the E cores.
You don't see a difference with the 3080 because the i5 at stock is already fast enough not to bottleneck it. In that case, instead of overclocking the CPU I suggest undervolting it, so it consumes less power while giving you the same performance :)
You'd start seeing a difference when overclocking with something like an RTX 4070Ti or 7900XT and above in performance, or if you use low settings / low resolutions in games!
@@zWORMzGaming I suspected that, thank you so much. Love your work, keep it up ^^
you can try putting on a 512x or 1024x or even a 2048x texture pack to really test the GPU
Sildur's shaders are what I've always used and preferred, they look great. After using shaders for so long I find vanilla Minecraft ugly
I have a quick question that I hope can be resolved here: how do you consistently keep your gpu at 90% - 100% usage? even after I do mass tweaking to my game it always underperforms while staying at ~50% usage (this is with an rtx 3080 ti)
By having no CPU bottleneck
@@zWORMzGaming is my ryzen 9 5950x enough for my rtx 3080 ti? with 32g's of ram
Are you not hitting the fps you desire?
@@DravenCanter No, not at all. When I run benchmarks the scores seem fine, but whenever I'm in-game I immediately notice I'm lacking performance. In his video with the 3080 Ti running MC he's constantly hitting 200+ fps at 100% usage, while I sit at around 100 frames (1440p), going as low as 40-50 with usage at around 40%-60%
to add onto this, I also run into VRAM issues, usually running out of it when running applications that shouldn't be exhausting all of it, even with all my other apps closed
i like the projectluma at 4k it looks nice
the voxels one does say beta, could be they haven't properly optimised them yet
Yooo ty much love to my favorite benchmarker
Bro said "You poor" in every language
I think a really fun thing you could do in these Minecraft videos is to spawn a bunch of mobs or explode a lot of TNT to see the performance. Btw, if you wanna find a specific biome or structure in creative, you can type /locate biome jungle for example! Hope this helps
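For anyone trying these, a few command variants, written mcfunction-style (assuming Java Edition 1.19+ syntax, where /locate was split into biome/structure subcommands; in chat, prefix each command with a slash):

```
# find the nearest jungle biome
locate biome minecraft:jungle
# find the nearest stronghold
locate structure minecraft:stronghold
# spawn a cow at your current position
summon minecraft:cow ~ ~ ~
```

On older versions (1.18 and earlier), biome lookup used a separate /locatebiome command instead.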
Can you do a CPU comparison at this point for Minecraft. Like 5800x3d vs 5900x vs 13600k on Minecraft with the rtx 4080 at 1080p to 4k. You could do as many cpus as you want, but it would be nice to see which cpu is also working the best for this game. Also spawn like a huge section of mobs in a fence to see how they handle a cow farm for example.
The 13600K is the fastest one
@@yancgc5098 I did not ask that. I asked for a comparison. In fact you could be wrong. What if minecraft takes advantage of the 3d V-cache in multiplayer scenarios. Its just not that simple of an answer. I've also seen the 13600k lose to the 5800x3d many times. Its all test scenario bc they are both better in certain areas.
@@Purified1k Well you can throw the 5900X out of there since Minecraft Java is very single threaded. Between the 13600K and the 5800X3D it depends whether Minecraft loves that amount of 3D V-Cache or prefers the adequate amount of cache alongside the superior single core performance of the 13600K. My money is on the Intel CPU
@@yancgc5098 But you forgot the part where I just want the comparison in general. So yes, even though the 5900x might not be as good. I want to see how it stacks up.
I use the Complementary shaders with Sodium on my GTX 1070; at 1080p fancy the game stays above 60fps at all times. There's this cool setting in the Sodium menu that lets you dedicate CPU threads just for chunk rendering, so even on my 9 year old 8 core Xeon E5-2667 v2 I can use 32 chunks and get good fps. It's set to 1 thread by default.
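For anyone hunting for that setting: it's in Sodium's in-game video settings, and it can also be edited in Sodium's JSON config file. A sketch of what the relevant fragment might look like (the file location and option names here are from memory and may differ between Sodium versions, so treat them as assumptions and check your own config folder):

```
// config/sodium-options.json (fragment)
{
  "performance": {
    // number of threads dedicated to chunk building; 0 usually means auto-detect
    "chunk_builder_threads": 4,
    "always_defer_chunk_updates": true
  }
}
```

Raising the thread count mainly helps chunk load/generation speed at high render distances; it won't raise your fps cap in already-loaded areas.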
Hey, can you test the 4070 in Minecraft at 64 chunks with SEUS and see how much fps it gets? Thanks!
Minecraft is the only thing that can run on potato GPUs and can also destroy the RTX 4090 with shaders, physics mods and texture packs, all thanks to this big community
Bro try adding an RTX texture pack while using seus ptgi hrr shaders and see what will happen with RTX 4070
Hey dude, thanks! If I built a similar PC I'd choose the second-to-last shaders, they look awesome.
i like how the 6650 xt fucking demolishes the 4070 lmao
hello, can anybody help? I have an RTX 4070 and a 60Hz monitor, and I get 40 fps while moving with lite shaders on
cuz your cpu
Can you pls use PBR texturepacks in your future minecraft videos?
What if u changed the CPU to a 5800X3D, would the fps change much for the worse?
It'll be pretty much the same :)
@@zWORMzGaming Good stuff, I'm planning on upgrading from a Ryzen 3600 to a 5800X3D. My old GTX 1070 will most likely be replaced with a 4070, since my case is small (Meshify C)
24:18 what, 80 years ago? 😂 Or am I hearing wrong? Also, decrease the simulation distance to 5; that lowers CPU load by a lot without any noticeable change.
hello, i found a new shader for Minecraft, it's called Raspberry, and i love your vids!
No retest with OptiFine shaders? Also no test at 1080p (the most used resolution)?
Is it just me who watches his videos because of Minecraft, not the GPU?
it would be cool to see the performance of Continuum RT because that's probably the most intensive shader there is. It would be interesting, though kind of a pain to install
I mean not really because it actually takes advantage of rt cores unlike any other shaders
This guy truly lights up the day
Exactly