@0:43 - Slight error with the chart colors. Idle power should be red: an increase in idle draw is bad, because the GPU isn't doing any work at that point and should draw as little as possible. Same with avg. gaming power - it should be green, since lower power draw alongside higher performance is for the better, not for the worse.
My first step when purchasing a 4090 was to buy a power supply that doesn't need an adapter. PSUs above 850W already come with the dedicated port and cable.
That is good, you remove one variable that can go wrong, and I would do the same if I bought a 4090. But it's the plug design/spec itself that has problems, and those problems are still there even with a direct PSU-to-GPU cable.
You gotta wonder why the price tag is different…it seems they realized people weren’t happy and dropped it down for this new model instead of dropping the original’s price.
These cards are still two steps more expensive than they should be - meaning each card is priced at what the card two tiers above it reasonably should cost (and even then it's expensive). Given the size of these cards I can understand it a little; Nvidia seems to brute-force its way to these specs. Curious to see what the next-gen Intel cards have to offer. Nvidia's 5000 series will probably be available by the end of this year too - we'll see if they manage higher performance in a slightly more elegant way. Even if the price were good, I'm not sure I'd want such a huge card. And that 12GB 🤨 - there is zero future-proofing here. 16GB should be the minimum, and that minimum should be available on a 4060-class card. A 4070-class card shouldn't even ship with only the 16GB minimum.
It's because "playable fps" used to mean 60-80 fps for PvE titles, where most people played older AAA games on medium settings. These days "playable" is more like 100 fps, or 144-165 for multiplayer (because of high-refresh monitors), and "MAXED" means 144-165 fps yields in AAA titles at 1440p-4K. These expectations have changed HUGELY in the last 10 years, and so has the level of hardware access needed to hit those frame rates. IMO the pricing is about right when you compare what both brands offer at each price point and how these GPUs stack up against last gen's frame yields - even more so against the 2000-series GPUs and the 5700 XT. Thermals/wattage/heat count too, which people often miss when comparing against the 1000-2000 series and the 5700 XT.
@@anhiirr Not sure about those claims. Higher res, better graphics, and more FPS have always been the ask; that's not new. If anything, for a long time many things weren't progressing: 1080p was the default for over 10 years, Crysis didn't look much worse than games released 10 years later, and like you said, 60fps was long considered acceptable. So yes, for about 10 years there was little progression - but that wasn't always the norm.
I'm so glad we're finally focusing on killing motion persistence blur. This is a roided-out version of ULMB that is also compatible with VRR. This plus a 360Hz ultrawide with BFI (so BFI would run at 180Hz) would be obscenely good-looking.
@@Hybred Exactly. At that point you have perfect pixel response times, defeating motion persistence blur, AND you have VRR ensuring the refresh rate matches the framerate. The only limitation left is our eyes.
That's preplanned. Linus is always scripted, since they have a team - even those small statements. That's why his expressions are so choreographed. He's basically like Bill Nye but for computers. An actor.
The 4080S feels like Nvidia admitting the 4080 was overpriced without having to announce a price cut. The 4080S moves the needle so little for the $200 drop that they might as well have not launched it and just cut the 4080 MSRP instead. I'm really disappointed we didn't see something similar to the A5000 Ada as the 4080S. They could launch that at what the 4080 was, and then cut the 4080 down to $999 instead. That would give 100 SMs and 20GB over a 320-bit bus, enough to also keep their precious VRAM segmentation between the 4070 SKUs and the 4080 SKUs.
It's going to be very interesting to see what the RTX 4080 Super ACTUALLY retails for at launch here in the UK. Thanks to taxes and such, even though the £ is still stronger than the $, the entry price point usually ends up around the same number as the US price. I will not be trading up from my RTX 4080 for sure! Find it to be a great beast of a GPU - even more GPU power than I need in so many games!
My general rule is to only buy if you get a ~50% performance boost for the same price you already paid. For instance, I just upgraded my 4790K/1080Ti system to a 7800X3D/7800XT. (I got the Ti on eBay for $500 the day 20 series was revealed.) The whole rig cost about the same as my first PC, but is next-generation ready. Because of the technological growth, I got twice the CPU cores, RAM, and SSD storage as my first PC without paying more, so that's a nice bonus too. No "bleeding edge" tax either.
@@TheWiiplay Sort of yes and no... I had to have two drives before, at almost twice the price, to match modern capacity/price ratios. I believe my first SSD was $150 for 500GB, and I got a 1TB under $100 for the new rig. My last build also included a large 2TB WD Gold for data backup, so I saved almost $200 in drives alone. Both builds ran about $1200 when done. I think technically that makes the new one cheaper after inflation.
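A minimal sketch of that ~50% rule in Python (hypothetical numbers and a made-up `worth_upgrading` helper; "performance" is whatever benchmark score you care about):

```python
# Hypothetical check for the rule above: upgrade only when the new part gives
# ~1.5x the performance at roughly the price you originally paid.

def worth_upgrading(old_perf: float, new_perf: float,
                    old_price: float, new_price: float,
                    threshold: float = 1.5) -> bool:
    # Require the performance jump AND a price near (or below) the old one.
    return new_perf / old_perf >= threshold and new_price <= old_price * 1.1

# e.g. a card bought at $500 vs. a hypothetical $500 card scoring 1.6x higher
print(worth_upgrading(old_perf=100, new_perf=160, old_price=500, new_price=500))  # True
```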
Oh yeah, definitely - it's very close to the original 4070 Ti but for $599. I do still love the 4070 Ti specs, but the 4080 Super is mid. Still insane that it's a 250W card compared to like 500W for the 4090.
@@Beginnergamer1823 I found the 70 "Ti" model more interesting, due to the upgraded components; it will be interesting to see how much faster it actually is. The 70 is very nice in that it comes in at (hopefully) $599, and with only $200 more you actually get 16GB of VRAM. Will watch all the benchmarks with great interest. The 4090 draws more like 300-350W, by the way.
When he says "strobe the backlight", isn't that basically the same thing as Black Frame Insertion? If so, you can do that on LG OLED TVs, and I'm pretty sure it works with VRR, at least on the older models.
Ooh, price drops, nice. The 4080 Super is basically the same price as the 7900 XTX now after checking my local retailer. I'm very pleased with this and also torn on which to get xD. Waiting for those performance tests to see how it competes with the 7900 XTX - especially curious how it's gonna fare in productivity workloads. Nice to see Nvidia finally being more competitive with their pricing.
Remember that MSRP on most GPU launches only applies to low-tier models, and only for a brief period, so they can say they had cards at that price point. So expect prices to be higher than what was shown in the video.
@@dima97 It's looking like late '24 now; '25 is older news. But regardless, if you're not planning on buying the 90 or 80, you're waiting another 6-9 months anyway.
The black is really nice. Good thing these aren't like consoles where you can easily trade in your current model at the local game store for the new black one.
So Nvidia thinks we'd be grateful that they went from $699 (3080) to $1,199 (4080), and now a miserable $200 down to $999? $799 would have been the tag the 4080 needed. Not great, but way more fair.
0:40 I'd make the red text a bit brighter, it's a bit hard to read on the dark background. Also, since less power draw is technically better anyway, I'd make it green as well.
I'd really love to see Nvidia (or AMD) develop a rolling-scan method (such as RBFI) for their G-Sync modules and GPUs. That would mean 1ms MPRT (fantastic motion) at just 60Hz and 60 FPS on OLED, and almost 0ms at 120Hz/120 FPS (up to 4K). For the same performance at 8K you would only need 200-240Hz at most, and even at 8K you would still get 1ms or under at 120Hz with a rolling bar/scan (you'd only need 240Hz for near-0ms MPRT). Once you know what 1ms feels like, there is a big difference between 1ms and 0ms MPRT.
IMO 1ms isn't really necessary at those settings/resolutions. MPRT is more relevant for 1080p twitch FPS shooters, where the strobing brightness loss can be counteracted by the GPU suite's color settings. For AAA, 4K, or 21:9 gaming, color/image uniformity should matter more than pure response times. Plus, 4K LG OLED TVs offer better response times than most "gaming"-oriented 4K monitors anyway, while offering pretty stunning visuals. I definitely wouldn't prioritize MPRT in a racing-sim setup either, on multi-monitor vs a 48" OLED.
Yeah, a rolling scan/refresh is probably the best way to go, especially if you can easily tune it. If it's set up so it always has the same number of lines lit on screen, you can avoid most flicker, because your eyes are always getting a similar amount of light rather than it rapidly flashing on and off. Really wish this was on regular consumer displays and not just the extremely expensive professional stuff...
@@anhiirr I couldn't disagree more - even 0ms makes gameplay so much more fun and enjoyable, feeling as if you are actually connected to what's going on on the screen rather than just participating; that's the best way I can explain the difference it makes. And don't forget it's also about IQ: 0ms offers razor-sharp motion resolution, while even 1ms has some very obvious motion persistence blur. For platformers, shmups, rhythm and timing-based games, twin-stick shooters, classic and modern arcade games, and retro games, the closer to 0ms MPRT the better. For me, 1ms is the bare minimum: my plasma monitor with focus field-drive has a 0.4ms MPRT, and my CRT monitors are perceptibly 0ms across the board (input/video/motion resolution/motion input response). Once you have enjoyed that level of IQ and gameplay response, you never want to go backwards - so here's hoping for rolling-bar RGB OLED in 2025.
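For a rough feel for these numbers, here is a quick Python sketch of the duty-cycle math implied above (an approximation that treats MPRT as the time a line stays lit per frame; not any vendor's spec):

```python
# How long can each line stay lit per frame if we want ~1 ms MPRT?
# duty = target persistence / frame time (rough model, illustrative only).

def duty_for_target_mprt(refresh_hz: float, target_mprt_ms: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return target_mprt_ms / frame_time_ms

for hz in (60, 120, 240):
    duty = duty_for_target_mprt(hz, 1.0)
    print(f"{hz} Hz: lit ~{duty:.0%} of each frame for ~1 ms MPRT")
# 60 Hz -> ~6%, 120 Hz -> ~12%, 240 Hz -> ~24%
```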
If I hear anything about the 12+4-pin melting on RTX 4080 Supers, I'm skipping the RTX 50 series for an AMD RDNA 5 GPU instead - so it had better be the new 12-pin design.
I'm thinking I'll wait till the 50 series is out. No need to make the jump yet with my 1080p screen, and once I do, I'm hoping the 40 series will be super cheap by then.
Bro, the low memory on the lower-spec cards makes me want to murder Nvidia. I have workloads that require more memory than that, along with ray-tracing acceleration; I just don't need the most insanely powerful card that costs a thousand fucking dollars. What is the deal that I can't buy a card for $500 that has any real amount of memory in it?
DLSS Quality looks better than native anyway. Dunno why some people are hating on DLSS. Digital Foundry even showed there's no reason to use native if DLSS is available and you set it to Quality. Not talking about frame gen, btw.
Only one little detail (I still haven't finished the video), but there's a little mistake in the table at 0:43: the increase in idle power consumption is green and the decrease in power consumption is red. Little mistake, but I just wanted to let you guys know so you can improve. EDIT: Same mistake in the table at 1:56 for all three power figures. EDIT 2: The table at 3:08 is the same as the one at 0:43, so it has the same little mistakes.
yesss this is the LTT CES type of video I have been waiting for. linus on camera, doing something interesting, candid casual shooting. and the best topic of the 2024 show
@@НААТ Well, it's displaying 4K at 100 fps without DLSS. You really think a product with that performance is made for the average gamer? Sounds so ignorant.
You know what's bad about Pulsar? Over the past few years, people have had health issues with PWM-driven OLED dimming causing eye strain and headaches. Pulsar is that, but on a big, bright screen.
RRP nowadays means open-box price or no warranty. We all know the retailers will sell these things for whatever they want, and definitely for more than the non-Super models.
Honestly, Nvidia is probably just catering to rich people at this point, because $999 for a graphics card is not reasonable at all. Most people are living paycheck to paycheck and this price is not it. $599 would be a better price, but welp.
Also, I'd like to comment that while playing MW3 multiplayer I get about 160 fps on Extreme custom settings, everything maxed out, on my RTX 4080 with my Samsung OLED G9 95SC monitor - a super ultrawide at 5120x1440, which is roughly 89% of 4K in pixel count. So what I'm generally seeing in this video is that the 4080 Super is pushing 4090 frame rates at 4K resolution.
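A quick pixel-count check on that figure (Python, raw pixel counts only):

```python
# 32:9 super ultrawide vs. 4K UHD, by raw pixel count.
ultrawide = 5120 * 1440   # 7,372,800 px
uhd_4k = 3840 * 2160      # 8,294,400 px
print(f"{ultrawide / uhd_4k:.0%} of 4K")  # ~89%
```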
I can still play Level 10 Faceit with just a 180Hz monitor, a 5800X3D, and an RTX 3080 Ti. I haven't had a chance to try monitors with refresh rates higher than 180Hz in the past 3 years, so I'm still very satisfied with my current ViewSonic 180Hz monitor.
No longer super overpriced, just extremely overpriced.
Reverse it
You dictate the price
@@Sal3600 super overpriced (super cards)
Yet Linus still must shill
@@nttinvis you dictate that
Nvidia got us mindfcked by launching GPU cards at a very greedy price point. Now they've lowered the prices on their Super & Ti models and we consumers think we got a great deal. lol 😂💀 Their sales team must be politicians. 😂💀
I mean this is still a reasonable price-to-performance ratio
Back in the day, for example, the 1080 was $600, but it would NEVER have run 4K properly - it already didn't do so hot at 1440p in some games, which led to the 1080 Ti (amazing card).
Fast forward to today: GPUs have pretty much overtaken games, so what used to be an 80-series role is now a 70-series one... if you want to run 1440p balls-to-the-wall happily, the 4070 Super is $599... exactly like the 1080, but unlike the 1080 it offers much, much more performance when you compare today's games with the games of back then.
The 4080 at $1,200 was just idiotic.
@@Galf506 still bad pricing.
capitalism psychology
It's called anchoring, a very standard practice in sales. That being said, the 1080 Ti had an MSRP of $749, so not THAT big of a difference, and the new cards do a lot more than the 10 series did.
Just... no. You can't justify those stupid prices by saying they give you more performance than previous generations - that's the only point of an upgrade. Going from 1080p to 1440p to 4K is just keeping up with current tech; it's not leaping into the future. $1,200 for an 80-series was insane and $1,000 is still crazy. You don't pay a 50% premium for a current high-end CPU just because it has higher clocks and more cores than the one you bought 5 years back.
I will eat my hat if you can buy an RTX 4070 Super for $599. That would be a miracle.
Or that the old 4070 will drop $50/€50. I can't use a 12VHPWR connector in my ITX PC - no room for the adapter. It's only 5 liters big.
There's even a Founders Edition. Seems quite possible, if not for everyone.
I don't think getting them at MSRP will be an issue. The demand isn't what it was a couple of years ago, there aren't any component shortages, and mining on these GPUs isn't a thing anymore.
I'll help eat the brim if I can get one under a grand
there r 4070 cards for $600 ....at least in my country
You know Nvidia has lost its mind when a $999 GPU is considered "reasonable".
Facts
inflation..
@@Egrodo1 Inflation hit Nvidia years ago; that's how far ahead they were.
@@Egrodo1 Inflation alone doesn't explain shit.
A 1080 Ti was $700; today that would be about $850.
If Nvidia isn't a greedy company, where tf did the extra $200 come from?
@@flamingscar5263 Nooo... the 1080 Ti wasn't today's 4080 Super. The 1080 Ti was Nvidia's fastest GPU - their flagship. So it's $700 compared to $1,600. They are killing PC gaming, and AMD is the same crap, even though everyone is throating AMD to the base.
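For what it's worth, a quick back-of-envelope check on this thread's numbers (the ~1.21 CPI multiplier for 2017 to early 2024 is an approximation, not an official figure):

```python
# Does inflation alone account for current flagship pricing?
MSRP_1080TI = 700        # 1080 Ti launch MSRP, USD (2017)
CPI_MULTIPLIER = 1.21    # assumed cumulative US inflation, 2017 -> 2024

adjusted = MSRP_1080TI * CPI_MULTIPLIER
print(f"1080 Ti in 2024 dollars: ~${adjusted:.0f}")         # ~$847
print(f"Gap to a $1600 flagship: ~${1600 - adjusted:.0f}")  # ~$753
```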
Correct me if I'm wrong, but at 0:35 shouldn't more idle power draw be marked as a red number and less draw while gaming be marked as a green number? Technically they follow the idea that green=more, but I think it makes more sense for green=good
It's very cold out here, so that's a benefit.
More idle power consumption = warmer room, so it's a benefit.
I'm lazy and bad at math and high, and guessing that a 15% rise in idle draw is gonna feel like the original price in places like the UK.
@@RainOfAshes bad cope
Disagree - 12 years in accounting, and every single chart, spreadsheet, etc. has green = positive number and red = negative number. What the numbers mean is irrelevant. I think this green = good and red = bad thing is more of a marketing convention, and I'd argue these charts aren't marketing but informational. If they went with green = good, the charts would stop being a source of factual information and start being an "opinion" expressed by LMG, which has its own issues. I think it's best to leave the charts as pure information and keep good/bad out of it, so as not to risk adding a potential source of bias.
It’s crazy how during Covid NVDA started price gouging and people freaked out…. But then everyone forgot and they continued to price gouge and now it’s the new normal 😊
No serious alternatives. AMD is lagging with innovation. Intel recently stepped into the arena with Arc.
I remember another massive event happening between 2019 and 2022... every person and their motha buying up every card they could get their hands on. And let's not forget that the 2080 Ti was already 1200 dollars back then, and the Titan - or now the 3090/4090-class cards - were 2000 dollars, and now they are 1500. It's a different time now; it's not just raster performance. The amount of IP and programming and AI that comes with their upper-tier to top-tier cards is pretty extraordinary. DLSS can turn a 3060 with 16 gigs of RAM... which my 3080 Ti only has 12 gigs of, mind you, for some F'ed up reason, even though it was within 4 percent of the 3090 in pretty much every single gaming benchmark.

Those 3090s and 4090s are for professionals, not your average gamer. Yes, prices are ridiculous, but buy a last-gen used card like a 3070 for 200 bucks, play Overwatch at 250 FPS (if you even have a monitor that can handle that) and quit complaining. If you were really in an industry that required a 3080 Ti, 3090, 3090 Ti, a Titan, or a top-tier card, your company would probably have a machine with one in it, or an even better one - or you would be making enough money from that kind of work to build your own systems.

You don't hear people bitching about 800-dollar CPUs from AMD... and those don't even come with a cooler or a motherboard, just the bare chip - and God forbid you bend a pin or seat it wrong and F it up. So your 700-dollar processor needs a 200-dollar cooling setup, but no one is complaining about that. And Intel just keeps crapping out new generations that use so much power, and have actually lost performance on the newest 14th gen - it's laughable. My advice: buy an Xbox or a PS5, your life will be simpler... or a Mac.
People always forget.
People really act like the 7990 wasn't $999, or the Radeon VII wasn't $700 MSRP, priced to "compete" with the 2080 and eventually the 2080 Super. At the time, prior to Big Navi, the best option AMD had was a 5700 XT that ran hot at 275W when overclocked... and couldn't even touch a 2080 non-Super. As if Nvidia would have priced their GPUs modestly given the lack of viable competition from AMD... yeah, people forget SO quickly. Probably because they were hunting used RX 570-class GPUs at the time... once they earn enough for a beefy GPU, they suddenly cry about needing a better platform and a higher-quality PSU... go figure.
@@dcpowered What innovation are you paying for with Nvidia that AMD doesn't make up for with price-to-performance? AMD may still be overpriced, but when you match performance and prices between Nvidia and AMD, it's not even remotely close. Not to mention AMD isn't far behind Nvidia anymore, if you were actually keeping up with things.
The 4080 offers like 30-40 fps over a 4070... for the price of two 4070s. While these Super cards are technically more 'affordable', they only serve to fill a massive price gap that shouldn't exist in the first place.
At that point fps is just a fraction of what you are paying for but the overpriced sentiment is still valid😅
This card is still going to cost more than my entire PC that has a 7800x3D and a 6900XT. Euro prices still suck.
Don't get the 4080 Super. The FPS it's getting with DLSS off in MW3 is the same as my 6900XT gets with everything ramped up to max.
@@conorturton To be fair, Modern Warfare 3 is better optimized for Radeon than it is for GeForce, so unless you only play MW3 and nothing but MW3, it's not a fair way to compare hardware. Normally, the 4080 has rasterization performance similar to the 7900 XT, and the 4080 Super will likely compete with the 7900 XTX. At $999, the 4080 Super will definitely be a competitive product, at least until AMD drops pricing on the 7900 XTX.
Yeah, and if only I wasn't in the DEMOGRAPHIC that's intent on owning an $800-1,000 GPU to play a free game like Warzone or Apex Legends on a $500+ monitor... oh wait, I'm not one of those suckers that spends upwards of $1,200+ on just a GPU-and-monitor combo to play a slew of free titles...
I was like, "Why test a Cod game out of all games?" Cod is super easy to run. They should've tried Cyberpunk or something as demanding as that game.
@@Velocifyer Lol, it was at one time. Even the newest CoD games get chewed through by modern cards.
I love how they added an actual physical switch for the prototype monitor, very cute.
I like that the IDLE power being higher gives it a green percentage, as if it's a good thing.
Should've been red, I agree, but thankfully total power consumption was lower compared to the original 4080.
Disagree - 12 years in accounting, and every single chart, spreadsheet, etc. has green = positive number and red = negative number. What the numbers mean is irrelevant. I think this green = good and red = bad thing is more of a marketing convention, and I'd argue these charts aren't marketing but informational. If they went with green = good, the charts would stop being a source of factual information and start being an "opinion" expressed by LMG, which has its own issues. I think it's best to leave the charts as pure information and keep good/bad out of it, so as not to risk adding a potential source of bias.
@@cookoo4lyf Well, in this case I think using more power to do the same thing (i.e. basically nothing when idling) is pretty objectively bad, and the average viewer is going to be following the more widespread convention of green = good, red = bad.
And they've gone and fixed that anyway, it's red now
@@cornonjacob Not sure what you are looking at, but it's definitely still green at 0:42. I also think the average user is smart enough to see that higher power draw is bad without color coding. If it's a real problem, I would say no color is the solution, not going against the standard for every datasheet on the planet 🤷♂️. Or they could just add a small legend at the bottom clarifying that green numbers are positive and red is negative, to help people that are confused. At the end of the day I'm merely pointing out, as a rebuke to the original comment, that the chart here is clearly using these colors the way datasheets normally do, and that LMG is not saying higher idle is a good thing.
Before, the 4090 cost a liver and a kidney; now the 4080 Super only costs a liver!!!
Sounds like brokie talk
Sounds like a bargain. How many fps can your liver pool?
@@AnDr3w066 Sounds like bad spender talk.
@@Bigdude0444 Or people that simply allocate value differently and have more income. It's OK if it's not for you; there are other tiers and brands for your income and preference. Seems like a non-issue.
@@AnDr3w066 Homie, I could build a top-spec PC any time I want. Why would I, when I can get something reasonable and just enjoy games? I built my PC like 5 years ago with the intent to upgrade parts later, once prices dropped and they were a generation old. And that's exactly what I did. Spent much less money than most people did because I have patience. 5800X and 3080, all at less than MSRP.
Of note: backlight strobing helps with all display technologies, even OLED. Your screen is not moving, but if your eye is tracking an object as it progresses across the screen, that induces motion blur, because your eye is moving while the actual pixels are not.
OLEDs generally use a monochrome emitter per sub-pixel behind a color converter/filter; this is what allows those inky blacks - effectively millions of dimming zones instead of the hundreds you'd find on miniLED.
One way to get around this is to have a 10-25kHz monitor (a rough estimate of how fast a monitor would need to refresh to accurately simulate motion at high brightness levels; the frequency needed depends on how much of your field of view the monitor takes up, how bright it is, and how well your eyes - yes, your eyes - perceive motion).
A much more cost-effective option is to keep a relatively normal refresh rate, say 240Hz, but instead of an always-on backlight, shut the backlight off partway through the pixel refresh cycle, making the on-screen pixels visible only as long as they would be at 1500+Hz. The pixels may still only refresh once every ~4ms, but they are only lit for, say, 0.66ms - giving the motion persistence of a 1500Hz monitor without needing an expensive pixel array capable of 1500Hz.
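A quick back-of-envelope check of that persistence math (Python; the 16% duty cycle is an assumed figure chosen to reproduce the ~0.66ms in the comment above):

```python
# Persistence math for backlight strobing: how long each frame stays visible,
# and the sample-and-hold refresh rate that would match it.

def persistence_ms(refresh_hz: float, duty_cycle: float) -> float:
    """Lit time per frame: frame time scaled by the backlight duty cycle."""
    return (1000.0 / refresh_hz) * duty_cycle

def equivalent_hold_hz(persistence: float) -> float:
    """Refresh rate a non-strobed display would need for the same persistence."""
    return 1000.0 / persistence

p = persistence_ms(240, 0.16)       # 240 Hz panel, backlight on ~16% of each frame
print(f"{p:.2f} ms lit per frame")  # ~0.67 ms
print(f"~{equivalent_hold_hz(p):.0f} Hz equivalent sample-and-hold")  # ~1500 Hz
```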
There has been a lot of effort and money poured into eliminating flicker, and for good reason.
In my opinion ULMB should stay where it is now, as an option for people that really want it.
Having this enabled in all monitors would be tremendously irresponsible, and not only price-wise.
@@Longgshot I'm sorry, but sample-and-hold displays' primary reason for existing was not flicker; it was mostly size and weight. LCDs, despite being uglier than CRTs when they first came out, took the market by storm because they were way more convenient, and most people aren't tech enthusiasts.
A BFI-type mode should be on every high-end display - obviously with an option to disable it, because flicker can be annoying at lower refresh rates. But flicker is basically invisible at higher refresh rates, so there's no reason not to run it unless you have brightness concerns; at something like 60fps it definitely becomes more subjective.
@@Longgshot I didn't mean to imply that all displays sold should come with a tech like this - that would be expensive - but that this methodology for reducing image persistence works on pretty much every display tech, even OLED, which most people don't think of as having a backlight.
@@Longgshot Also of note: this is not about reducing flicker, it's about reducing motion blur; the side effect is that it can reduce flicker as well.
But wouldn't that drastically reduce the brightness of an OLED if you let the pixels stay lit for only such a short time? You would need to drive the pixels at much higher brightness in that reduced on-state to compensate. Not sure how this would affect longevity.
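That brightness trade-off is easy to quantify: holding the same average brightness while lit only a fraction of the frame means scaling instantaneous output by 1/duty (illustrative numbers, not any panel's spec):

```python
# Peak brightness needed to preserve average brightness under strobing.
def required_peak_nits(target_avg_nits: float, duty_cycle: float) -> float:
    return target_avg_nits / duty_cycle

print(required_peak_nits(200, 0.16))  # ~1250 nits peak for a 200-nit average
```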
It's nice seeing LCD tech continue to improve. As a person who tends to keep monitors for the long haul, I remain hesitant to buy OLED due to worries about burn-in after 8+ years.
It may not last as long, but being 10x the monitor makes it worth it to many like myself.
To be fair, I feel like 8 years is a good lifespan; I wouldn't be heartbroken over it. After 8 years, considering the new tech that would have come out by then, I would certainly want to upgrade.
It's not cheap, but upgrading monitors is one of the best upgrades. It's what you feel and see every second of PC use! Looking forward to when I can afford a nice 32-inch 4K gaming OLED! 😁
@@mrs8768 Tell that to my 15-year-old 16:10 LG Flatron ;) Still runs like a beast, zero burn-in; it only has a slight backlight artifact in the corner when showing dark grey, like something wrinkled inside. I had to fix the CCFL driver once, because some pins of the driver started to crack/desolder themselves (dry joints), one lamp started to flicker, and then the whole monitor started randomly losing its backlight. A quick reflow with fresh solder on the hottest-looking points and it's completely fine! Didn't even have to change the caps after all these years :) Give me a monitor now with such a lifespan while being 100+Hz, 16:10, and at least above ~1000p. Not really possible. CCFL is great because it doesn't really burn out that much, while LEDs either love to pop or have terrible coverage in cheaper monitors.
OLEDs look so gorgeous. I have a Switch OLED and it looks unbelievable. I check in on OLED every now and then to see how the tech and prices are coming along, because I want an OLED PC monitor so badly. One of these days...
0:14 I was very much expecting 'and this super segue to our sponsor' lol
Gpu market is dead
Super dead, one might say.
They’re still going to sell out. People will buy whatever they put out :/
@@donotworried the 4090s start at 2500€ here and people still gobble it up. It's astounding what people suddenly accept
@@MrBrax That's insane. I thought it was insanely, stupidly expensive at the $1,500 MSRP when I got mine. But holy crap, €2,500 is beyond insane. That's first-car levels of money.
@@MrBrax AMD is to blame here. Had they come up with a viable competitor to the 4090 (gaming and, more importantly, productivity), Nvidia would've lowered the pricing. AMD is impotent competition. We need someone else to step up.
Given that the 4070 Ti Super also has 16GB of VRAM, it would be interesting to see if the extra $200 for the 4080 Super is necessary for most games to hit reasonable framerates or if the 4070 Ti Super will be good enough. Also, does "good enough" require some extra features like DLSS 3.0?
Out of these cards, I actually like the boost from the 4070 to the 4070 Super the most, but given that it's still stuck on 12GB of VRAM, I'd be hesitant to consider it for a machine running a 4K display.
The Ti Super is the best-value card this gen; for its price it's really close to the 4080 while being much cheaper, with 16GB and good specs.
I think it depends on how often you change your graphics card.
@@MLTAKOS I think that crown goes to the RX 7800 XT.
Well, seeing as I'm gonna be stuck on a 4K 60Hz TV for the foreseeable future, I think the 4070 Ti-S will be good enough for me for at least a few years, if not 5-6. But 800 bucks is still gonna be a hard swallow, so I'm eyeing Intel - maybe Battlemage will have some insane RT performance. From what I've seen, the Arc cards have RT performance more comparable to Nvidia's (than AMD's) at those price ranges.
@@sherr6847 Considering prices in my country, AMD is the best value though; I would get the 7900 XTX, and it costs the same as the plain 4070 Ti.
I can clearly imagine the face of the Nvidia representative when he turned off DLSS😂😂😂
I'd bought Nvidia products since the GeForce 8 days, upgrading every couple of years. They finally priced me out of their market and I replaced my 1070 with an RX 7800 XT. It works great for gaming, and the excellent Linux support is an added bonus.
About the same story here. I'd been buying Nvidia since the GeForce 6 series and recently upgraded to a 7800 XT from a 2070 Super. I've always liked Nvidia, but I couldn't justify the price-to-performance.
I'm just here to say, fuck nvidia.
8800gt, 580gtx, 1070gtx, 1080ti (used) ...
... Rx 6800xt
Fuck nvidia, these prices will never be justified.
Been driving consoles and integrated graphics for years and then upgraded to a 6800xt. I can't believe lag doesn't exist anymore.
Same! Always Nvidia, and I retired my GOAT 1080 Ti for a 7900 XT last summer... not gonna pay a greedy company that is clearly only thinking about money, business customers, AI, etc., and not gamers...
This means used 3070 Tis / 3080s will now become reasonably priced, great!
These cards are being discontinued by Nvidia, so only the price on current stock will be lower. After they're sold out, you're just out of luck.
@@dennisdelion That's why I said second-hand :p
Thank you for this video! Building a solid PC this week with a 4080 Super.
Need to see that "fraction of our power" meme with a CRT pointing at the OLED/LCD panels.
Being from the town where be quiet! has their headquarters, it makes me super proud to see them be so successful.
Tribe mentality
4080s have not uncommonly been selling around $1,000 and weren't attractive at that price either. The 4070 Super and 4070 Ti Super needed a price drop.
And 12VHPWR sticking around is awful. I know a guy who built a top-end gaming PC and couldn't deal with the anxiety of having a fire hazard in his brand-new three-grand PC. He returned the 4090 for a 7900 XTX - and with no option for regular power connectors, who could blame him?
I've had my 4090 since launch day with no problem 🤷
He was very smart
Sorry to be the bearer of bad news, but get ready, because it's been officially acknowledged that the connector was badly designed. They've already redesigned it - the new one is going to be called 12V-2x6. @@fever309
Yeah, 'cause redesigning a PCB and cooler to accommodate 8-pin connectors, rather than just using what is already readily available, makes a lot of sense from a manufacturing standpoint. Really shocked Nvidia didn't want to spend more money to charge slightly less money. 12VHPWR isn't the issue; it's the inconsistent quality of the cables used, and people just hearing a click and not actually making sure the cable is fully seated - or, in a lot of cases, putting stupid-tight bends in the cables like they got used to being able to do for the last 25 years.
@@fever309 same, no issues
Just ordered the 4080 Super along with the Samsung G8 OLED monitor, and a spare 4TB NVMe 4.0 drive. Upgrading from a 3070, on a 13600K with 48GB of DDR5-7000 RAM. Think this will be an okay build.
Think I'll just wait until the 5XXX series. Been VRAM cucked by my RTX 3070 at 8GB.
You should check once in a while whether GPUs have dropped to a good value. I picked up my 6950 XT for $550.
felt, 3080 10GB
Yeah, same. I won't have the money to invest in a new desktop before the 50 series is right around the corner, so might as well wait. Especially since I also write AI models, I need the VRAM, preferably 16GB or more. Now I'd like to see AMD make a push with AI hardware, since Nvidia's drivers on Linux are god-awful, but beggars can't be choosers.
A 3070 with 16GB would have been so cool... unfortunately Nvidia is so greedy it can never get enough money for its cards.
@@metallurgico hell, even a 10 gig or 12 gig variant would’ve been cool
I don't understand how there's no internet info about Markiplier having 2 4090s. He mentioned having 2 of them several times and it makes no sense how no one caught the discrepancy of having TWO 4090s! SLI was last supported in the RTX 3090. RTX 40 series doesn't have SLI, what are you talking about Mr. Fischbach?
The funniest part is telling you that dropping down to “Only” one thousand dollars is a good deal. Lmao
I want that G-Sync Pulsar tech on a monitor at 3440x1440 or 5120x2160. I have a 21:9 aspect-ratio monitor and I just can't go back, but I really want BFI on a monitor.
NOTE: Alienware OLEDs do include BFI, but it cuts your Hz in half, so from 240 to 120. That's fine if the game is heavy-duty and can only hit 120-140 - just keep it locked at 120 and it should look really solid.
I will be very surprised if retailers don’t raise the prices. They know the products will be popular.
Oh, AIBs and retailers both most assuredly will. Mark my words, you'll see RTX 4080s at $1,100 and 4070 Ti Supers at $850-900. There will be models you can get at MSRP, but they'll be the ones that sell out the most often and are out of stock all the time.
@@DenverStarkey but what about the reg 4080/4070 ti? I know they are gonna stop making them so maybe if the 4080S is $1000, the reg 4080 will drop to $850-$900 to at least make it somewhat viable for that price point up against the XTX and the 4080S
@@slite3276 It's all up to the retailer. If they want to empty shelves, they'll liquidate the discontinued models with a sale to clear old stock and free themselves up to order more of the new models. If they see profit in keeping the discontinued cards at full MSRP, say because of a supply shortage on the Super options, then they won't liquidate the old stock, because they still want to offer "in stock" options regardless, unless consumers flat-out stop buying the older cards and force the retailers to cut prices. Overall it's not bad: if you have local retailers that offer price matching, you can always go down that avenue with Micro Center / Best Buy / Amazon, pull up a sale price, and try to come up on a deal, or look at liquidated Gigabyte cards on Best Buy's site, since they cut their open-box Gigabyte GPUs in line with the daily/weekly sale prices of their brand-new options. If you really want a deal: I saw a base three-fan 4070 Ti from Gigabyte drop to $634 open box, if I was willing to drive 70 miles round trip for it. Then again, the Super and 4070 Ti Super launch in two weeks, so I'm more than willing to wait.
I bought a 3080 Ti at the height of the lockdown for 2.1k (my company severance package was 85k), so I won't be upgrading till late 5000 series or maybe even 6000 series. Before the lockdowns and the wild price jumps from gen to gen, I upgraded to the latest "80" series card from the GTX 680. The good old days of getting a new 80 Ti model every gen are over, and I'll enjoy my run with my 3080 Ti; thankfully the saving grace is that it's the 12GB variant, not the 10GB one.
At least, if you're willing, you can still sell most 70-80 series cards prior to each new generation's launch and recoup something toward the next gen's 80 series. The MAIN issue is having a backup GPU and a main ARPG/MOBA/free-to-play shooter to hunker down with between upgrades; it's truly the most cost-effective approach for people who lack a secure income to justify this hobby. My backup is a $144 5500 XT 8GB, but I managed to flip some 2080 Tis and 5700 XTs when Ampere launched and the 3080 was constantly out of stock, and wound up making some money. I implored my friends to sell their 1070s for $400 while they could and buy a 3060 as a placeholder GPU, but not many were willing.
Question, I just bought the 4070 without knowing the 4070 super was coming out. Should I return the 4070 and exchange it for the 4070 super?
1:36 RGB lighting on your face. Broke your rule 😂❤
I definitely feel like I got mugged for buying a 4080 for £1150 lol
I was about to buy a 4080 at launch, but I decided to save up for a couple of weeks and get the 4090 instead. Seeing the Super series, I really wish this price/performance had been available at launch.
I feel like I got mugged for getting the 4090 at £1650 (I did anyway)... but this 4080 Super at this price-to-performance really is a kick in the teeth now
@@tapsofosiris3110 a couple of weeks to save up the difference? Nice.
Worst part is the 4090 still barely runs Cyberpunk at max settings with frame gen and DLSS
@@TwinkleTutsies It runs it at 180+ fps at 1440p DLSS Quality with RT off. Not everything needs RT, man.
@0:43 - Slight error with the chart colors. Idle power should be red: the increase in idle power is bad, because at idle it should be as low as possible, since the GPU isn't doing any work at that moment. Same with avg. gaming power: it should be green, since a decrease in power requirement alongside an increase in performance is for the better, not for the worse.
My first step when purchasing a 4090 was to buy a power supply that doesn't use an adapter. Supplies above 850W already come with the specific port and specific cable.
That is good, you remove one variable that can go wrong, and I would do the same if I bought a 4090, but it's the plug design/spec itself that has problems, and those problems are still there even with a direct PSU-to-GPU cable.
You gotta wonder why the price tag is different… it seems they realized people weren't happy and dropped it down for this new model instead of dropping the original's price.
These cards are still two steps more expensive than they should be. Meaning, each card two steps down costs what the one two steps above it reasonably should cost (and even then it's expensive).
While with the size of these cards I can understand it a little, Nvidia seems to brute-force its way to these specs.
Curious to see what the next-gen Intel cards have to offer. Also, Nvidia's 5000 series will probably be available by the end of this year.
We'll see if they manage to get higher performance in a slightly more elegant way.
Even if the price were good, I'm not sure I'd want such a huge card.
And those 12GB. 🤨 There is zero future-proofing headroom here.
They should all be 16GB minimum, and that minimum should be what a 4060-class card gets. A 4070-class card shouldn't even be available with only the 16GB minimum.
It's because "playable fps" used to mean 60-80 fps for PvE titles, where most people would play older AAA games on medium settings and so on. These days "playable" is more like 100 fps, and 144-165 (because of monitor refresh rates) for multiplayer, with "MAXED" meaning 144-165 fps in AAA titles at 1440p-4K. These expectations have changed a lot in the last 10 years, and so has the level of hardware access needed to hit those frame rates. IMO it's about accurate when you really compare what both brands offer at each price point, how these GPUs compare to last gen's frame rates, and even more so the 2000-series GPUs and the 5700 XT compared to this generation. Thermals, wattage, and heat are a big part of it too, which people often miss, ever since the 1000-2000 series and the 5700 XT.
@@anhiirr Not sure about those claims. Higher res, better graphics, and more FPS have always been in demand; that's not new.
If anything, for a long time many things were not progressing.
1080p has been the default for over 10 years.
Crysis did not look much worse than games released 10 years later.
And like you said, 60fps was long considered acceptable.
So yes, for about 10 years there was little progression. But that stagnation was the exception, not the default.
I'm so glad we're finally focusing on killing motion persistence blur. This is a roided-out version of ULMB that is also compatible with VRR. This plus a 360Hz ultrawide with BFI (so BFI would be at 180Hz) would be obscenely good-looking.
If ULMB 2 + G-sync Pulsar ever gets reworked for OLED compatibility that will be an endgame display for me.
@@Hybred Exactly. At that point you have perfect pixel response times, defeating motion persistence blur, AND you have VRR ensuring the refresh rate matches the framerate.
At that point the only remaining limitation left is our eyes.
Wait, so will G-Sync Pulsar essentially be VRR and ULMB at the same time?
Yup, it's VRR + ULMB
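Rough numbers on why persistence is the thing to kill, for anyone curious (standard motion-blur arithmetic, my own back-of-the-envelope, not figures from the video): the smear you perceive while eye-tracking is roughly

    \text{blur}_{\text{px}} \approx v_{\text{px/s}} \times t_{\text{lit}}

where t_lit is how long each frame stays illuminated. On a 240Hz sample-and-hold panel each frame is visible for about 4.2ms, so an object moving at 960 px/s smears across roughly 4 pixels; strobe the frame down to ~1ms of visibility (what ULMB/Pulsar-style strobing aims for) and the same motion smears only about 1 pixel.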
4:19 The fact that Linus couldn't recognize whether it's native or not really shows how good upscaling technologies have become.
That's preplanned. Linus is always scripted since they have a team, even for those small statements. That's why his expressions are so choreographed. He's basically like Bill Nye but for computers: an actor.
@@CanditoTrainingHQ Hm, if so he's an enthusiastic actor; at least he convinced me
It's Friday! Everyone take a shot every time the word 'Super' is mentioned
The 4080S feels like Nvidia admitting the 4080 was overpriced without having to announce a price cut. The 4080S moves the needle so little for the $200 drop that they might as well not have launched it and just cut the 4080 MSRP instead. I'm really disappointed we didn't see something similar to the A5000 Ada as the 4080S. They could have launched that at what the 4080 was, and then cut the 4080 down to $999 instead. That would give 100 SMs and 20GB over a 320-bit bus, enough to also keep their precious VRAM segmentation between the 4070 SKUs and the 4080 SKUs.
4070 Ti Super. 😂 What's coming next?
5070 Ti Super Titan?
It's going to be very interesting to see how much the RTX 4080 Super ACTUALLY retails for at launch here in the UK. Due to taxes and such, even though the £ is still stronger than the $, we usually see the entry price point at around the same figure as the US dollar price. I will not be trading up from my RTX 4080, for sure! I find it to be a great beast of a GPU, with even more GPU power than I need in so many games!
*Can't wait for the 4090 Super if this is considered cutting edge*
My general rule is to only buy if you get a ~50% performance boost for the same price you already paid. For instance, I just upgraded my 4790K/1080 Ti system to a 7800X3D/7800 XT. (I got the Ti on eBay for $500 the day the 20 series was revealed.) The whole rig cost about the same as my first PC, but is next-generation ready. Because of the technological growth, I got twice the CPU cores, RAM, and SSD storage of my first PC without paying more, so that's a nice bonus too. No "bleeding edge" tax either.
Is that factoring in inflation?
@@TheWiiplay Sort of yes and no... I previously had to buy two drives, at almost twice the price, to match modern capacity/price ratios. I believe my first SSD was $150 for 500GB, and I got a 1TB for under $100 for the new rig. My last build also included a large 2TB WD Gold for data backup, so I saved almost $200 in drives alone. Both builds ran about $1,200 when done. I think technically that makes the new one cheaper after inflation.
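That ~50% rule of thumb is easy to sanity-check in code. A minimal Python sketch; the scores and prices below are made-up placeholders, not real benchmark data:

    # Sketch of the "~50% more performance for the price you paid" upgrade rule.
    # Scores and prices are hypothetical placeholders.

    def perf_per_dollar(score: float, price: float) -> float:
        return score / price

    def worth_upgrading(old_score: float, old_price: float,
                        new_score: float, new_price: float,
                        threshold: float = 1.5) -> bool:
        # Upgrade only when the new part delivers ~50% more performance
        # per dollar than what the old part cost.
        return perf_per_dollar(new_score, new_price) >= threshold * perf_per_dollar(old_score, old_price)

    # Hypothetical: old card scored 100 for $500; new card scores 150 for $500.
    print(worth_upgrading(100, 500, 150, 500))  # True, exactly at the 1.5x cutoff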
The 4070 super is the most interesting one tbh
Oh yeah, definitely, because it's very close to the original 4070 Ti but for $599. I do still love the 4070 Ti specs, and the 4080 Super is mid, but it's still insane that it's a ~250W card compared to like 500W for the 4090.
@@Beginnergamer1823 I found the 70 "Ti" model to be more interesting, due to the upgrade in its components; it will be interesting to see how much faster it actually is. The 70 is very nice in that it comes at (hopefully) $599, and for only $200 more you actually get 16GB of VRAM. Will watch all the benchmarks with great interest. The 4090 draws more like 300-350W, by the way.
Trust me, I'm going for the Ti Super as well
For 1080p; for everything else I think it's going to age horribly
@@Alborah Since Nvidia says it is "faster than a 3090", it should be good for 1440p
When he says "strobe the backlight" isn't that basically the same thing as Black Frame Insertion? If so, you can do that on LG OLED tvs, and I'm pretty sure it works with VRR, at least on the older models.
Ooh, price drops, nice. The 4080 Super is basically the same price as the 7900 XTX now after checking my local retailer. I'm very pleased with this and also torn on which to get xD. Waiting for those performance tests to see how it competes with the 7900 XTX; especially curious how it's going to fare in productivity workloads. Nice to see Nvidia being more competitive with their pricing, finally.
Well, if you don't run DLSS on it, it's no faster than my 3-year-old 6900 XT at 4K with everything maxed in MW3.
Remember that the MSRP on most GPU launches only applies to low-tier models, and only for a brief period, so they can say they had the cards at that price point.
So expect prices to be higher than what was shown in the video.
@@Longgshot Indeed, it's about €1,149 and upward for the 4080 Super where I live.
I bought the 7900XTX about 8 months ago and I have zero regrets. It’s a killer 4K raster card and I got it for $850 👌🏻
@@TheEpxMaster damn nice
"natural blur of your eyes" IS BRO SCIENCE... This is anecdotal at best.
At this point I'mma hold out for the 50 series. If only the 40 series had launched like this.
The 50 series isn't set to launch until 2025, just FYI
@@dima97 It's looking like late '24 now; '25 is older news. But regardless, if you're not planning on buying the 90 or the 80, you're waiting another 6-9 months either way.
@@Frozoken oh cool, I didn't know.
The 70 Ti Super's 4GB memory bump will keep those cards alive for some years
The black is really nice. Good thing these aren't like consoles where you can easily trade in your current model at the local game store for the new black one.
So Nvidia thinks we'd be grateful that they went from $699 (3080) to $1,199 (4080), and now a measly $200 down to $999? $799 would have been the tag a 4080 needs. Not great, but way more fair.
Huge agree
Oh, how lucky we are with this price drop. Only $1,000 for a 70-series die with crippled VRAM. Thank you NGREEDIA, you're too kind.
0:40 I'd make the red text a bit brighter; it's a bit hard to read on the dark background. Also, since less power draw is technically better anyway, I'd make it green as well.
they really gave everyone that bought a 4080 and a 4070 a big middle finger
Don’t forget everyone that bought a 4070 ti.
Yep
Ngl, the black on black looks really good on the FE cards. I might have to get one for my Sim rig
You know LTT is shilling when they have Alienware monitors for a sponsor.
Imagine thinking $999 for a GPU, that isn't even the flagship model, is reasonable.
I'd really love to see Nvidia (or AMD) develop a rolling-scan method (such as RBFI) for their G-Sync modules and GPUs. That would mean 1ms MPRT (fantastic motion) at just 60Hz and 60 FPS on OLED, and almost 0ms at 120Hz and 120 FPS (up to 4K). For the same performance at 8K you would only need 200-240Hz at the most, and even at 8K you would still get 1ms or under at 120Hz with a rolling bar/scan (you only need 240Hz for near-0ms MPRT; once you know what 1ms feels like, there is a big difference between 1ms and 0ms MPRT).
IMO 1ms isn't really that necessary at those resolutions. MPRT is more viable for 1080p twitch FPS shooters, where the strobing brightness loss can be counteracted by the GPU suite's color settings. IMO AAA gaming, 4K, or 21:9 should prioritize color/image uniformity over pure response times. Plus, 4K LG OLED TVs offer better response times than most "gaming"-oriented 4K monitors anyway, while offering pretty stunning visuals. I'd definitely not be considering MPRT in a racing sim experience either, on a multi-monitor setup versus a 48" OLED.
Yeah, a rolling scan/refresh is probably the best way to go, especially if you can easily tune it. If it's set up so it always has the same number of lines lit on screen, you can avoid most flicker, because your eyes are always getting a similar amount of light rather than it rapidly flashing on and off.
Really wish this was on regular consumer displays and not just the extremely expensive professional stuff...
@@anhiirr I couldn't disagree more. Even 0ms makes gameplay so much more fun and enjoyable, feeling as if you're actually connected to what's happening on screen rather than just participating; that's the best way I can explain the difference it makes. And don't forget it's also about IQ: 0ms offers razor-sharp motion resolution, while even 1ms has some very obvious motion persistence blur. For platformers, shmups, rhythm and timing-based games, twin-stick shooters, classic and modern arcade games, and retro games, the closer the MPRT is to 0ms, the better. For me, 1ms is the bare minimum. My plasma monitor with focus-field drive has a 0.4ms MPRT, and my CRT monitors are perceptively 0ms across the board (input/video/motion resolution/motion input response). Once you have enjoyed that level of IQ and gameplay response, I would never want to go backwards, so here's hoping for rolling-bar RGB-OLED in 2025.
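To put numbers on the 1ms-at-60Hz claim (back-of-the-envelope, assuming the rolling scan lights each pixel once per refresh):

    \text{MPRT} \approx \text{duty cycle} \times \frac{1}{f_{\text{refresh}}}

At 60Hz a frame lasts ~16.7ms, so 1ms MPRT means each pixel is lit for only ~6% of the frame; at 120Hz (~8.3ms frames) the same 1ms is a ~12% duty cycle. That's also why higher refresh rates make low persistence easier on brightness: the panel spends less of each frame dark.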
If I hear anything about the 12+4-pin melting on RTX 4080 Supers, I'm skipping the RTX 50 series for an AMD GPU on RDNA5 instead, so it had better be the new 12-pin design.
Don't. I have had a 5700 XT, a 6700 XT, and now a 7900 XT; on AMD cards the 1% lows are always fucked up, which is more important than average fps
@@nazmulfahad3044 My 1% lows are fine
we not gonna talk about how thick that gpu was
Not me buying this for 1440p 240Hz gaming
After such an intro, I was expecting the "make it Super, with SuperCheap Auto"
I already have a 4090 FE, but that 4080 S looks good in matte black
I'm thinking I'll wait till the 50 series is out. No need to make the jump yet with my 1080p screen, and once I do, I'm hoping the 40 series will be super cheap by then.
Of course Nvidia wants DLSS on... "We know game optimization is crap these days, so here, have AI frame generation as a band-aid."
I opted for an RX 7900 XTX.
Aside from the price, and not doing productivity work myself, I was simply worried about my PC burning down.
Bro, the low memory in the lower-spec cards makes me want to murder Nvidia. I have workloads that require more memory than that, along with ray-tracing acceleration; I just don't need the most insanely powerful card that costs a thousand fucking dollars. What's the deal that I can't buy a card for $500 that has any memory in it?
DLSS Quality looks better than native anyway. Dunno why some people are hating on DLSS. Digital Foundry even showed there's no reason to use native if DLSS is available and you put it on Quality. Not talking about frame gen, btw.
Actually surprised that they even let Linus check out the 4080 lol.
This right here 😂
Was waiting for him to say "And this Super segue, to our sponsor!" lmfao
Only one little detail (I still haven't finished the video), but there's a little mistake in the table at 0:43: the increase in idle power consumption is green and the decrease in power consumption is red. Little mistake, but I just wanted to let you guys know so you can improve.
EDIT: Same mistake in the table at 1:56 for all three power consumption rows.
EDIT 2: The table at 3:08 is the same as the one at 0:43, therefore it has the same little mistakes.
It's winter right now; more power draw = more heating = good. lol
I'm still waiting for the 4050 and the 4060 Super and 4060 Ti Super, COME ON NVIDIA
TI SUPER
Ffs
Yesss, this is the LTT CES type of video I have been waiting for: Linus on camera, doing something interesting, candid casual shooting. And the best topic of the 2024 show.
Still overpriced; Greedvidia can sit on it and rotate.
Jimmy is The Beast in Revelation?! It's been right in front of us this WHOLE TIME!!!!! lol
NVIDIA shill LTT.
Stay mad brokie
@@AnDr3w066 We get it, ur sugar daddy spoils you.
@@НААТ we get it, you are poor and jealous. If you can't afford it then don't buy it. It's a premium product 😂.
@@AnDr3w066 😂😂😂 premium product he says.. stop clowning
@@НААТ well, it's displaying 4K at 100 fps without DLSS. You really think a product with that performance is made for the average gamer? You sound so ignorant
So Linus carried a screwdriver through the airport security to market it.
You know what's bad about Pulsar?
Over the past years, people have had health problems with PWM-driven OLED dimming causing eye strain and headaches. Pulsar is that, but on a big, bright screen.
RRP nowadays means open-box price or no warranty. We all know that the retailers will sell those things for whatever they want, and definitely for more than the non-Super models.
The real truth: if it's not a 4090 Ti or Super, then I don't care about this card or its price. I'm waiting for the next 5xxx-series cards.
Honestly, Nvidia is probably just catering to rich people at this point, because $999 for a graphics card is not reasonable at all. Most people are living paycheck to paycheck and this price is not it; $599 would be a better price, but welp.
G-sync Pulsar with OLED should be awesome!
4:08 "Mixed & Mostly Negative". 💀
Should have been the original RTX 4080 to begin with
Can't wait for the 5085 Ti-and-a-half Super with Frosting on Top edition
I've had a 3090 for 3 years. I game on an X34P. I will wait till the 5090 hits. This is a good buy if found at MSRP.
All matte black looks cool!
A thousand dollars to play games at 4K with FAKE FRAME mode on in order to ensure good frames; that's how I like to play my games.
Cool video. I've been watching for a number of years; good job, from Denmark.
Anyone else on the edge of their seats waiting for the "from our sponsor" and how Linus was going to segue...
People don't understand: yes, the RTX 4080 Super is not Super-more-powerful, but it's still a slightly faster GPU, and cheaper!
Edit: Enjoying mine
0:38
Idle power increase should not be an upside; it should have been highlighted red
The color isn't about whether it's a good or bad thing, just whether it's an increase or decrease
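For what it's worth, the two conventions being argued over in these comments are easy to state in code. A hypothetical Python sketch, not LMG's actual chart logic:

    # Two ways to color a delta in a comparison chart (hypothetical sketch).

    def color_by_sign(delta: float) -> str:
        # Ledger-style: green for any increase, red for any decrease,
        # regardless of whether the increase is desirable.
        return "green" if delta > 0 else "red"

    def color_by_goodness(delta: float, lower_is_better: bool) -> str:
        # Reviewer-style: green only when the change is an improvement.
        improved = (delta < 0) if lower_is_better else (delta > 0)
        return "green" if improved else "red"

    # Idle power went up by 3W: sign-based coloring says green,
    # goodness-based coloring says red.
    print(color_by_sign(3.0))                            # green
    print(color_by_goodness(3.0, lower_is_better=True))  # red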
I was waiting for the super segue to the sponsor.
This video honestly feels like an April Fools prank
Also, I'd like to comment that while playing MW3 multiplayer, I get about 160 fps on Extreme custom settings with everything maxed out on my RTX 4080, on my Samsung OLED G9 (G95SC) monitor, which is super-ultrawide at 5120x1440, roughly 89% of 4K in pixel count. So what I'm generally seeing in this video is the 4080 Super pushing 4090 frame rates @ 4K resolution.
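For reference, the pixel-count math behind that ~89% figure (just arithmetic on the standard resolutions):

    \frac{5120 \times 1440}{3840 \times 2160} = \frac{7{,}372{,}800}{8{,}294{,}400} \approx 0.89

So driving that super-ultrawide is only slightly lighter work than true 4K, which is why the frame rates line up with 4K results.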
I can still play Level 10 Faceit with just a 180Hz monitor, a 5800X3D, and an RTX 3080 Ti. I haven't had a chance to try monitors with refresh rates higher than 180Hz in the past 3 years, so I'm still very satisfied with my current ViewSonic 180Hz monitor.