To be fair, it did take days to render single frames of animated movies with path-traced lighting just 20 years ago. So the fact that this card can even do it 18 times per second just shows how far the power of computers has come.
Especially in "real time". Those were pre-renders. This is actually doing it live, which makes it even more impressive.
It still can't. Nvidia is doing a lot of tricks to make this possible according to their paper, but I guess it's now path-tracing's time to go through the decades of tricks that rasterization had in order to improve game performance.
@@AnDOnlineify News flash:
Literally everything in computer graphics is all about faking stuff and creating illusions.
We can't accurately simulate most things, so we create shortcuts and approximations to get close.
@@MartinFinnerup I know. I was just replying to the earlier comment that path tracing is now undergoing optimizations that rasterization has had for decades.
@@MartinFinnerup Duh, we all know that. But did you get the point tho?? Stay on topic 🤓
my computer: jokes on you, i AM already pathetic!
Us bro
Me with my Nvidia Quadro K5000:
Same
Me with integrated graphics:
Me with a 2 core cpu and integrated graphics
"But can it run Cyberpunk 2077 with path tracing?" Is the next GPU benchmark
Hopefully we get a PTX 5090 for path tracing
Crysis: Earth 2023 Remastered + DLC
Yep maybe by 6090 we won't have to turn on fake frames to run cyber punk path tracing lol
"Can it be run by 2077? "
Crazy to think that in a short time we will be looking back on this the same way we do with Crysis. I just hope GPUs are not £1500+
Cyberpunk 2077 has just become more of a tech demo than anything else honestly
Fr💀
The new crysis basically 😂
That is just not true. It's one of the best games I've played in a good long while.
@@idomanythings4159Facts
@@idomanythings4159 you really did not understand what he was getting at
"your computer is pathetic"
Me and my 2007 PC: "We know" *ugly sobbing*
The Core 2 Quads
My mom gave me her dual core AMD and it still had Windows 7 lol I’m not sure what to do except try to upgrade it and hope it doesn’t catch fire
@@michaelcarrillojr453 yeah bud go for the upgrade
once i get the chance and get to a good stopping point with my own pc issues, ill work on it and post it
I have a 4070ti and i get a solid 144fps on cyberpunk77 and i play on max settings
I'd say that simulating light, 18 times a second is actually pretty insane
Ya, don't know how they even managed to program that. Must be why Cyberpunk was in dev for 20 years
Ye but below 24 u see images switching and not a video
@@yungcalvin798 imagine spending years on perfectly simulating real lighting only for it to be too good for any pc or console to run
@@FarerasBTD6 before 24? More like before 60 lmfao
@@your_average_cultured_dude na actually u see pictures going by below 24, and after 24 it's a video, but not very smooth
"Your computer is pathetic"
Hey, he's trying his best
She’s* don’t ask how I know 💀
@@LoonyToony 💀
@@Crescent21 DON'T ASK
@@LoonyToony why 😏
@@sangheilipozole i said... DON'T ASK
they must be using so many tricks to pull that off at that speed
Just imagine the game that will bring the 9090 to its knees
Probably nothing. At that point, better cards would be for more intense things like video editing, rendering, and intense mathematical equations
no game company will invest more time in more realistic visuals
Probably FIFA or Fortnite. With magnificent 19% gpu utilization peaks. I swear optimization is not devs' priority anymore. The game only needs to run up to the part where you upload your credit card info
@@JGaute lmao the truest words ever spoken
@@JGaute prime examples of good optimization are Doom Eternal and Atomic Heart
My 1997 windows computer that can barely start itself without using all my ram: 👁️👄👁️
Im the same lol i got one of them 1990s toshiba satellites and i love it for retro games
"A gigabyte of ram should do the trick"
You’re not using a pc from 1997 lol
My 2020 dell laptop that takes 15 minutes to be in a somewhat usable state 👁👄👁
I started learning computers in school using DOS 😅 then we got Windows 95 and we could finally play Oregon Trail, but with a 30 min computer class i could only ever get as far as starting my family, and then most of them die from dysentery on the way to the first river and the rest drown after my wagon crashed 😢. Then Windows 2000, and then XP shortly after, was all that we had for the rest of the time i was in school 😅
Dawg my computer can't even run MS Paint without being noisy💀
I'm sure your computer will be fine for cyberpunk at 4k with path tracing 💀💀💀
Let’s see what happens when you run it on a chromebook
@@gamingwill896-1 will to live and +1 fire hazard
How about rtx 9090 ti sli
My old laptop fans would be at 100 percent at the desktop after multiple reboots as well like factory resets 💀
"Performance is low"
"Someone in the future just has 5 RTX gpu running with the best coolers"
I work with Unreal, and rendering a 1920x1080 frame with path tracing takes about 90 seconds on an RTX 3060. So I'd say 18fps with path tracing is insanely fast...
Gonna be wild when we can render 3840x2160 at 240fps with path tracing. It'll probably be the year 2045, but we'll get there eventually.
@@JeffreyBoles at that point we could probably get path tracing VR
@@JeffreyBoles that'll never happen properly, we can't get much smaller with transistors and stuff, so not unless it's a kilowatt or something and costs $5000
From seconds per frame to frames per second, we have come a long way and still have a long way to go
You have a job that many of us dream of having. Congratulations and all the best in your career my friend 😊
if a 4090 is considered pathetic, i don't even WANT to know what mine is supposed to be called 💀💀💀💀
Extra large with sauce pizza pathetic
@@That_1_Ginger_cat how bout a bean
Intel pentium lookin ahh
Garbage
@@MMC619 nah it be worse than that
Crysis back in 2007: Finally. A worthy foe
Bro i used to hear that meme back in the good days
Can it run crysis?
Yeah, but can an android phone run carx street smoothly
an android managed to run crysis 3 at 18 fps
@@bananatie-j5y Oneplus 6t managed to run Crysis 3 at that performance. Poco f1 performs slower.
As someone who's written a path tracer before, I'd say you have absolutely no idea what you're blaming the card for. Even using the hardware-accelerated bounding volume structures, ray-triangle intersections are expensive, especially with hundreds of secondary rays shooting off in random directions. That causes non-uniform sampling of data from memory, which isn't exactly the fastest operation even with texture caches. Until recently, path tracing had always been an offline operation used to render frames for movies etc., which took days to render. New-generation cards are optimised for faster floating-point arithmetic in their built-in acceleration structures. That's why you get to calculate a scene like that 18 times per second.
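For the curious, here's a minimal toy sketch (plain Python, my own made-up names, spheres standing in for a real triangle mesh + BVH) of the Monte Carlo bounce loop that comment is describing. The randomly-directed secondary ray at every bounce is exactly what makes memory access so incoherent on real scenes:

```python
# Minimal toy path tracer core, for illustration only.
# Spheres stand in for the triangle mesh + BVH a real renderer would use.
import math
import random

SPHERES = [
    # (center, radius, albedo, emitted)
    ((0.0, -100.5, -1.0), 100.0, 0.7, 0.0),  # "floor"
    ((0.0, 0.0, -1.0), 0.5, 0.8, 0.0),       # matte ball
    ((0.0, 2.0, -1.0), 0.5, 0.0, 5.0),       # area light
]

def hit(origin, direction):
    """Nearest ray-sphere intersection -- the hot loop a BVH accelerates."""
    best = None
    for center, radius, albedo, emitted in SPHERES:
        oc = [origin[i] - center[i] for i in range(3)]
        b = sum(oc[i] * direction[i] for i in range(3))
        c = sum(x * x for x in oc) - radius * radius
        disc = b * b - c
        if disc > 0.0:
            t = -b - math.sqrt(disc)
            if t > 1e-4 and (best is None or t < best[0]):
                p = [origin[i] + t * direction[i] for i in range(3)]
                n = [(p[i] - center[i]) / radius for i in range(3)]
                best = (t, p, n, albedo, emitted)
    return best

def radiance(origin, direction, depth=0):
    """Monte Carlo estimate of the light arriving along one ray."""
    if depth > 4:                       # bounce limit
        return 0.0
    h = hit(origin, direction)
    if h is None:
        return 0.0                      # ray escaped the scene
    _, p, n, albedo, emitted = h
    # Secondary ray in a *random* hemisphere direction: this is why memory
    # access is so incoherent -- neighbouring pixels' rays diverge instantly.
    while True:
        d = [random.uniform(-1.0, 1.0) for _ in range(3)]
        if 0.0 < sum(x * x for x in d) <= 1.0:
            break
    if sum(d[i] * n[i] for i in range(3)) < 0.0:
        d = [-x for x in d]             # flip into the hemisphere around n
    norm = math.sqrt(sum(x * x for x in d))
    d = [x / norm for x in d]
    return emitted + albedo * radiance(p, d, depth + 1)

# One pixel needs hundreds of samples to converge -- multiply by ~8 million
# pixels at 4K and it's clear why offline renders took days per frame.
pixel = sum(radiance((0, 0, 0), (0, 0, -1)) for _ in range(256)) / 256
print(pixel)
```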
They basically went "ah fuck it, let's become the modern Crysis"
Lol
Crysis 4 is lurking in the corner waiting to attack
@@soundwavesuperior6761 eh, they optimized it a lot. It's slow because it has full-on path tracing now, which is a feature that's arguably ahead of this generation of hardware's time.
@@soundwavesuperior6761 Crysis tried to be ahead of its time, but it chose the wrong timeline. It focused more on single-core CPU performance, which didn't turn out to be where the technology really went.
Cyberpunk RT Overdrive will be playable at 30 fps within a decade or so.
my Chromebook: I am not pathetic I am a being stuck in time and space that refuses to progress, I am a bastion, a fortress, a stronghold! against the new
@Pleyer7575 my chromebook is beyond potatoes
Same with mine
@@Ice_elite baked potato
This is a beautiful comment
Good potato
im just here, with integrated graphics, not worrying bout sh*t.
I am here with a discrete GPU... That's worse than integrated graphics😢. Meet the mx350, even my iris xe can outperform it.
@@Arsoonist you can't deny the MX160 i have IS worse than urs, man i love spending money on GPUs that my integrated GPU does better when using longevity gaming cuz it doesn't take that much RAM to process (mx160 takes 4gb of ram from ur pc) and i can play fairly smoothly in many games
@@lywoh I mean yeah, it would give me very good fps, unless it was cooled with a single fan the size of one of my balls(my balls ain't massive, the fan is small). It does better in the first 10 sec only to throttle and perform shit, so I can't even properly use it.
@@Arsoonist bro same, except I have the mx330 which is *barely* faster than the integrated iris plus graphic. I have another laptop with an mx550 which is sort of a decent chip
And I’m here with Intel UHD Graphics 620.
“Your computer is pathetic”
My dell optiplex and gt 710: *why are we still here just to suffer?*
"Your computer is pathetic"
Computer : and I took it personally
It's funny how you people got offended when he said that.
I also took that personally
Old days: "Can it run Crysis 3?"
Modern day: "Can it run Cyberpunk 2077?"
I'm so glad I'm not the only one thinking this. I remember back when I was getting into PC it was all about "can it run Crysis 3?" That was like the ultimate benchmarking game.
Can it run Sims 3?
idk, i ran crysis 3 on my 4090 last night maxed with AAx8 and it wasn't as smooth as it should be lol
Can it run ark ascended?
@@antop4597 Bruh try the first Crysis, your 4090 will sob 💀💀
"your car is pathetic because it can't even get to the moon"
Legendary lmaoo
It's true.. whatever the computer is, you can't deny that it's very useful if used properly.
@@tenand11 🗣️🗣️🗣️
Baby patrick
not even the same concept
Devs get lazier and lazier. Decades ago people had to think and use illusions to make the game look great and run great. Today the McDonald's app fries my S-tier phone and crashes. Inefficient as hell. If someone really used the power of a modern GPU efficiently, it would be jaw-dropping.
Wise words.
I would also argue the quality of programmers has dropped too. Outsourcing work was probably not a good idea to make the app cheaper.
I mean, define look great? Yeah, old games look ok, but it’s very clear how far visual fidelity has come, especially if you can actually run games at high graphics
as a blender user, this is really impressive
I'm a blender user too, and I definitely agree with you!
@@mrlightwriter do you 2 think the 3080 could do the same cause this is what I have and I am coming from AMD graphic card. Still getting use to blender and UE5.3
As a fellow blender user, i dont know what this all means or what it has to do with blenders. Well, off to make a fruit shake
As someone who has completed the basic tutorial for blender, I can say I agree.
as someone that installed blender to make a donut and a half and never opened it after, i agree
The fact it runs even at 18Fps is INSANE. when graphics cards get good enough we will be in for a treat
Indeed; people don't know how long it takes to render a single image in Unreal with path tracing.
Our wallets will disagree with that notion
@@mrlightwriter it's different from ray tracing, right?
I mean GPUs are already so good. Of course people's standards change, and if you compare any top-of-the-line graphics card to something it can't achieve, of course it will look pathetic. It's like comparing a millionaire to a billionaire. Doesn't mean the millionaire isn't good, just means we've got high standards for judgement.
@@Xenor999 RTX is the buzzword Nvidia came up with so as not to call it "path tracing", but it's a scaled-down version of path tracing intended to make realtime "raytracing" possible. Path tracing is the whole shebang that render engines like Cycles, V-Ray, etc. use to make animations and photorealistic renders.
“Your Computer is Pathetic”
Don’t you dare insult my Gtx 1060 6gb like that
me with my gtx 1050 ti 🧍♂️
It’s pathetic people feel like they need it to have a good experience on a game. Developers too focused on graphics and not enough on delivering a solid optimized experience that is actually pleasurable to play.
Leave my computer alone, he’s trying his best
Are u sure that your laptop is not transgender
@@terra3673No I'm sure he's normal 😂
@@scrap5407 Thanks for clearing it up for the less intelligent ones that transgender is indeed normal, we need more of that!
@@iamyourgreatgreatgreatgrea6291 lol
Nvidia made its own monster called RT and also made its own hero named DLSS😮😮
Actually kinda true.. They know if they were just competing on plain rasterization, they would lose every single time vs AMD. They had to manufacture their own problem to fix.
@@asd-dr9cf AMD did the same with their FSR, but FSR can work even with NVIDIA GPUs, meanwhile AMD GPUs don't work with DLSS
@@asd-dr9cf this is beyond false. They understand that we can no longer just keep using rasterization. Also, hardware RT was not started by Nvidia, they just made it possible to do. The gaming industry had already been on the ray-tracing train years before the 20 series was a thing. If Nvidia still wanted to do rasterization they'd still blow the competition out, but what's the point of keeping on with rasterization if it isn't actually going to make different rendering techniques useful?? Everyone says they had to manufacture their own problem, yet if that were true the only GPU manufacturer doing this would have died off; instead, for years after the 20 series everyone has been following in Nvidia's wake and is still losing to Nvidia.
@MrTasos64 no, you don't understand a damn thing about the rendering techniques being used. Path tracing a game and doing it efficiently in real time is taxing. These technologies are taxing because they are being done in real time, unlike the cinematic field, where they render on clusters and not in real time.
@@blvckl0tcs750you are wrong... almost across the board.
A guy with a render farm: You don't know who you are messing with
Me with a nasa supercomputer: alien technology is nothing compared to this
The return to form on some of these is really nice, I’m really excited to get my hands on zombies and see how they did
Obviously it’s a test and the fact that the 4090 can even get 18 FPS is amazing. We are far from native path tracing. But they are trying to show how DLSS can help improve performance
quality DLSS looks better and performs better than native anyways so i don't get why there's an issue
@@klippo1235 If you have a 144Hz monitor and your 4090 is only putting out 115fps, then it's a waste of money and is indeed pathetic. With a 144Hz monitor, you expect to see an expensive flagship card do 144fps.
@@luminatrixfanfiction like people surely did with Crysis on new hardware at that time? lol. The RED Engine is extremely poorly optimized and games are getting more advanced than our hardware, thus DLSS.
As a 3D artist I agree, even 1-2fps for 100% PT is awesome, especially in such a complex game world. Normally the RTX 4090 can still take a minute or more (0.016fps if 1 min) to render a clean single 1080p frame using normal path tracing.
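For anyone checking that math: fps is just the reciprocal of seconds per frame, so one frame per minute is roughly 0.0167 fps. A quick sanity check (the helper name here is made up):

```python
def fps(seconds_per_frame: float) -> float:
    """Frames per second is the reciprocal of seconds per frame."""
    return 1.0 / seconds_per_frame

print(fps(60.0))              # 1 frame per minute -> ~0.0167 fps (the "0.016" above)
print(fps(1 / 18) / fps(60.0))  # real-time 18 fps is ~1080x faster than that
```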
But what sense does a test make if no one is going to use it because of its absolutely underwhelming performance?
I can't even imagine how scarily good computers will be in 10 years.
Bro my computer is pathetic, it barely runs the login screen without crashing
Even tho I have 5600G right now, I was there too. Ofc not with 5600G but with one of the old PCs that I had back in the day
Get more ram
Use an OS that uses less ram
Lol, mine barely runs the windows startup logo
**Traces every electromagnetic radiation's paths**
Me with a laptop equipped with an i5-8250U and UHD 620 integrated graphics:
Aw man! Now im fucked
Ayy my laptop has the same specs!
@@ryanjames2247 is it dell?
Me with iris xe, oh no.... Anyways
@@LAYATORS nah, its a lenovo
Bro I had even worse. A4 6300 dual core CPU, GT 710 2GB and 6GB RAM. What a crap that was! At least, I had fun time playing San Andreas, CS 1.6, etc
My computer that can't even run minecraft :🗿
Try play Alpha
To be fair Minecraft isn't the best optimized game. I have an RTX 3050 16gb and it can run most modern games at over 100fps and the crazy thing is I bought it for only $700 off Amazon.
@@fireballmafia4833 Ain't no way bro 700 bucks only , Send me that shit I'll pay 800 dollars
@@fireballmafia4833Sodium:🗿
@@fireballmafia4833You spent 700 on that
Games look so real these days that games aren't fun anymore, feels like going to work
literally anyone who works or plays with cgi tools would know how much more demanding path tracing is over the hybrid approach nvidia was using
Amen.
"Your computer is pathetic"
"Crys in poor"
It's funny how you people got offended when he said that.
Hey, you can see things this way:
Your computer is pathetic and cheap.
A 4090 computer is pathetic and expensive.
"Your computer is pathetic!"
"i know 😢"
RTX should've only been offered on Quadros, even until like 5 years from now.
I'm still rocking a 1070ti and 4690k. My CPU's architecture is almost 10 years old.... I need an upgrade.
Still running a 4790k myself, but with a 6700XT 😅 I will upgrade the CPU next at some point but it works well at the moment
I also have a 4690K. But today or tomorrow will be its last day, as I'm upgrading to a 7800X3D
@@ivvan497 I think I may go for a 7800x as I like to play with overclocking every now and then, I did consider the x3d though, solid CPU for gaming
@@dynamitedaviesThere is no 7800X
@@casnub5484 sorry I didn't make it clear enough, I meant 5800x3d Vs 7800x3d.
Ray tracing is good for animation and professional rendering, but in games it's just kinda pointless
I can only imagine how things would look when we can use path tracing in VR with the megaRTX 69420 one of these days.
Yes all this running on my intel i56 500 core CPU, all powered by a nuclear reactor, with 1000 petabytes of gayddr9 ram. I'm excited for the future !
@@GoodByeBritain gay?🤣
@@mudit1i think hes making fun of the rgb lmao
Imagine spending 10k on a rig just to be able to walk outside and touch grass😂
Me with a gt 1030 chillin:
lmao same
Poor guy😢, I’ll buy you a water bottle 😊
Me with Intel UHD 630 : This is fine 🙂👍
put this guy in jail
(I don't know computers well) about 2 years ago I got a 3060 and never realized the fans weren't moving, and now I don't know how to make them spin PLEASE HELP
Me with my potato made from 12 year old spare parts:
i had one like this. a week ago i bought a 1660 super and an i5 10400f
“Your computer’s pathetic” Jokes on you I don’t have a computer!!
Well you do, it’s your phone
That's not even related @danzstuff
@@Nexosist Yeah it is, a phone is basically a mini computer, it has all the stuff you would find in a computer
@@danzstuff can you play csgo?
On it
@@Nexosist that's not related, but the iphone can run a bunch of high quality games, plus if csgo DOES get phone support, yes it probably would be able to play csgo
Honestly i really wish they would focus on textures and detail but i get why they're focusing on something they just gotta do once and have it apply to the entire game engine
So what's a good computer by comparison? Enlighten me. This is one of the most enraging shorts.
Like:"your Ferrari is pathetic, it can't even come close to pulling 1200mph"
Sooooooo what can?
He's obviously saying every gamer needs to buy a quantum computer.
To put it simply, the only step over consumer-grade 4090s is the professional line of workstation GPUs that are used to actually create and render the latest innovations in graphics tech, which would be the Ada-gen RTX 4000 all the way up to the RTX 6000. So essentially there is no "better" computer to compare it to, because you'd be taking the leap from consumer-grade flagship gaming hardware to professional-grade creator hardware.
In terms of current consumer components there isn't one. Maybe the 5090 when it eventually becomes a thing it'll be able to manage 30fps but we'll have to see
@@TyrannisVirtue by the way, quantum computers are a scam. Google the debunking of this topic
It's a scummy tactic to rage farm comments
When I attempt diy but forget I'm poor😅
Fr
"IF I DON'T HAVE AT LEAST 1000FPS THE GAME IS LITERALLY UNPLAYABLE"
Edit: yall it was a joke stop getting angry🥶
yes 👍 👍 👍
Fortnite players 🙂
CSGO players:
18 fps is unplayable
18 fps is very low.
That saw their death at the launch of Alan Wake 2 & now their death is here
Your Computer Is Pathetic: My Intel Atom First Generation “I think I understand 💀”
Bro probably can't even watch a YouTube video at 3 fps 144p 💀💀💀💀💀💀💀
Bro, I’m over here working with my goddamn school computer, I know it ain’t shit
If only they actually made Cyberpunk a better game instead of using it as a graphic test bench.
i heard it ain’t that bad these days tbh
Oh look, yet another person who's never actually touched the game
@@CitrusRev I completed the game a few times and it's still bad. Cope.
It's a pretty good game imo
It's okay, but the amount of expectation really left a sour taste for me. I also think the game should be better instead of being a "graphics bench". Beautiful game, but everything else is ehhh
"Your computer is pathetic"
Yes.
Let me just spend a couple million to integrate the James Webb Space Telescope inside of my PC, then we will see who's pathetic!
I watched it 3 times and couldn't make sense of what he means.
Update : 4th time and I'm having seizure
Relatable
It's really weird everyone thinks a GPU is the only thing that can impact your FPS. Your CPU and RAM and Motherboard matter A LOT to this process. Especially in heavily multi threaded games like Cyberpunk.
"Motherboard" lmao. Please delete that before someone roasts you.
@@Salvo78106 oh, you think motherboards can't be a bottleneck? Put a 3900X in a B550 board and come back to us.
Eh, I don't think the pathtracing and graphics stuff is running on the cpu. yeah mobo might matter for pcie speed. The higher res you go and the more graphics options you enable, the more it relies on the graphics card. But yeah the other stuff matters too. I don't really care about fps past 120 or 60 depending on the game, as long as it doesn't stutter.
Higher-res images and RT are all done on the GPU. All that happens on the motherboard and CPU is arranging those images in a way you can see, and even then, modern changes to that pipeline try to limit the back-and-forth between the CPU and GPU. So you would be wrong. Now, if you want more fps and less res, then the story is different and the bottleneck can become your CPU, though I believe there is a bigger push for latency reduction rather than higher fps. So it just depends on what you play.
I might not be an expert, but i don't think ray tracing and path tracing have anything to do with the CPU.
the bare minimum: can it run doom
the peak: can it run cyberpunk with path tracing
At least it's not an ordinary potato.
It's a sweet potato PC! Let's go!!!
Man my laptop's fans are already revving up just by watching this video
So in 3-4 generations the top card will do OK with that real ray tracing.
When will it end 😭😭😭
@@Generationalwealth94?
@@grass_rock That's what people said when the 2080Ti came out. Now it's been 2 generations since that and people say "just 3-4 generations more" lol
@@Generationalwealth94 those people don't know anything about computer graphics
It will take at least 20 or even more years for computers to become powerful enough to run pure path tracing in real-time at 60fps
@@grass_rock Well okay, so RT really has been a gimmick so far. Got it. Seemed kinda obvious too that sacrificing half of your FPS to make graphics look shinier has been a stupid idea ever since RTX Graphics first came out tbh.
Me on a laptop with a basic Intel iris xe graphics and an nvidia T600 graphics card 💀
Ah shit, my integrated gen 7 intel Core i7 on a laptop with a broken fan and cracked screen can’t run Cyberpunk on the highest settings? This is a fucking tragedy.
My school computer can barely run Doom, you know it’s gonna boot up this game and go “fuck it, I’m dead.”
Your school computer isn't a Chromebook?
@@fortnite.burger no, we have these rubbish fixed PCs all over the school. We also have Dell laptops but they're not much better either lol
@@Lux_1138 oof
still waiting for the 5080 TI
Still waiting for the 6080 TI
Probably going to be waiting another 2 years...maybe even 3 lol
Sooooo, instead of RTX cards the next lineup will be PTX cards? XD
actually it's impressive that it's still giving 18 FPS in Path Tracing seriously
It’s at 60fps now. I just tried it and it’s giving me solid 60. I think it’s just optimization
@@DavyDave1313 it's pretty new to the consumer market, it's looking more promising
“Your pc is pathetic”
Me and my gaming laptop with a gtx 2050
same here, mine is rtx
@@zenox1164 yeah that’s mb I meant rtx not gtx 😭
@Unemployedweeb69420 same 💀 i do get 90+ fps in Minecraft 1.16.5, although that might be because I'm on windows 7
This turned out to be the biggest memed-on game ever
Digital Storm dual 4090 GPU setup: “You haven’t seen what he’s even capable of”.
The rtx 5090 laughing in the background
Oh boy are you gonna be disappointed
thanks for reminding us that upgrading with each new generation is BS. Also, nice loop man!
Cyberpunk 2077 was supposed to be played with a GPU from 2077, didn't you guys know?
Ah yes, the rtx 4090 ti. Brilliant article.
Cyberpunk really has improved ever since release, I just wish they had actually taken more time to get it to this level back then and weren't forced to release it too early
So many things scream scrapped or incomplete.
Modding community is doing so much more than the devs in terms of releasing interesting content.
Modding the game for me made it more enjoyable. Modding is a must to get the best experience
Generally it feels pretty cool and nice to play, I definitely had fun, but the end just seems to come very sudden and quick; it felt like something was cut off, and some more interesting side missions wouldn't be bad. If I compare it to The Witcher 3, every witcher contract felt unique and was fun, but most fixer missions felt a bit repetitive in Cyberpunk
60-70 fps in the benchmark with path tracing on at 4k. 7950x3D and 4090. I think I'll manage. ;)
Hahaha, you think benchmarks actually relate to real in-game performance, you poor soul. I'll see you in a few hours after the first slide, if the slideshow renders
@Living Glowstick I'm referring to the Cyberpunk 2077 in-game benchmark. One would assume that it's designed around pushing hardware to the limits of what happens in-game, or else what would be the point right?
@@tmisv2493 no, in-game benchmarks are very unreliable, as they are still simulated in some way; they are not as random or fast-moving as gameplay, which quickly reduces performance, especially with any sort of RTX. Explosions in Cyberpunk are also one of the most draining things, along with physics interactions. Basically any combat in-game, or even fast driving or running, will make it chug like a 90s animation on the web.
Okay, tried in-game: 60-110fps at 4K, balanced DLSS with path tracing on. Where do I find the in-game tactical nukes that would provide the explosions to take me below 30fps?
@@tmisv2493 you do realise the highest Nvidia themselves got was 16, right? You need to turn on Overdrive, not just ray tracing
"your computers pathetic"
Haha yes😢
Not mine tho😎
My wallet: 💀
The potato at the beginning 😂, i didn't notice it the first time lol.
Pretty sure this isn't the GPU's fault but rather just cyberpunk being extremely unoptimized.
Literally doesn't mean it's pathetic????????
It's just that rendering things with such detail and precision takes A LOT of power. It's not pathetic, it's genuinely super fucking impressive that we can even sort of run it in realtime
This dude pretty much tells us not to buy any graphics card 😂
He's an influencer. Whatever he says is a complete lie. Just go with a 3070 and a 12th-gen processor, you'll be fine with 1440p gaming.
@@FutaCatto2 Lmao these are the exact specs I have, and I don't regret getting these at all. I play at 1080p instead of 1440p tho
“My grandpa's PC has no pathetic parts, Kaiba!”
"Game developers not doing their job"
This guy: Your PC is pathetic!
Path Tracing Cyberpunk 2077 is now the new Crysis i guess.
2029: we're installing $6000 cooling systems for GPUs
It's not perfect, it's perfect *simulated* lighting. True-to-life lighting is wave tracing or spectral rendering, which simulates the wavelengths of visible light in nanometers.
I made a demo and it looks way better.
Real-life light is a wave, but ray tracing is a ray.
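To illustrate what "simulating the wavelengths of visible light in nanometers" can mean in practice, here's a minimal spectral-sampling sketch (my own toy, not the commenter's demo; the Gaussian curves are crude stand-ins for the real CIE 1931 color-matching functions). Instead of tracing an RGB triple, each sample carries a single wavelength and only gets converted to RGB at the end:

```python
# Toy spectral-rendering sketch: sample a wavelength per path instead of RGB.
# The Gaussians below are rough stand-ins for real CIE color-matching curves.
import math
import random

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def rgb_response(nm):
    """Very rough camera/eye response peaking at red, green, blue."""
    return (gaussian(nm, 600, 40), gaussian(nm, 550, 40), gaussian(nm, 450, 40))

def spectral_pixel(spectral_radiance, samples=10_000):
    """Monte Carlo over wavelength, uniform on the visible band [380, 780] nm."""
    rgb = [0.0, 0.0, 0.0]
    for _ in range(samples):
        nm = random.uniform(380.0, 780.0)
        # In a full renderer this would be a whole path trace at wavelength nm,
        # which is how dispersion effects (prisms, rainbows) fall out for free.
        L = spectral_radiance(nm)
        r = rgb_response(nm)
        for i in range(3):
            rgb[i] += L * r[i] / samples
    return rgb

# A light whose spectrum peaks near 520 nm comes out green-dominant:
print(spectral_pixel(lambda nm: gaussian(nm, 520.0, 30.0)))
```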
we’re in that age of tech where software is going to start making things run better and idk how I feel about this.. amd is already so good but imagine when their business relies on it, will we get the same budget friendly approach?
You telling me a $7000 pc is trash?
Your channel is very helpful, thank you.
Looks like we need one of those Blackwell GPUs with a driver that supports games just to run Cyberpunk at the highest settings and native high resolutions 😂
jokes on you, my computer can't run Windows Explorer at higher than 20 fps
Meanwhile me with an i3 4150 with HD 4400, nah it still runs minecraft at 60 fps in 1080p😅
Minecraft's Rethinking Voxels shader does path tracing but gives 120 fps at optimized settings; at maximum possible settings, 3 fps
Nvidia's vision was to show us what the future of real-time visuals will look like, and to give us consumers a glimpse of that future with their 20-series hardware. Consumers totally misinterpreted it, even though the message was loud and clear. Their research and hardware-software implementation accelerated, or even made possible, about 70% of graphics research in the last 10 years. Kudos for that, man. Even though ripping money from individuals cannot be forgiven, the fact that he helped push humanity forward that much totally outweighs the negatives for me. We should appreciate people with such strong vision.
That doesn't mean the RTX 4090 is pathetic, it means CDPR is crap at optimizing and making an actually useful upgrade.