hey guys, it's Riley, and it's barely still Monday where I am. But guess what? Time is relative. Any time you think it's one time, WRONG, it's another time in another country, where they might be saying things like "thanks to Saily for sponsoring our CES 2025 coverage! Get a 15% discount on Saily eSIM data plans: download the Saily app and use code TECHLINKED at checkout, or go to saily.com/techlinked". Lots of different cultures out there, and things that they say.
Depending on where you are in life a 50 series card can run you between your share of the rent for one month up to one month’s mortgage payment. That’s crazy.
As long as you don't upgrade every gen, the price isn't as painful if you get tons of use from it. Then again, even if you upgrade every gen... you have to save, what, 70 bucks a month (yes, I know for some people that's a lot, but this card isn't for them) to afford it. Sadly gone are the days of the top tier card being 1k, but at least the 5080 wasn't like $1500. The 6080, though... who knows.
@@Jzwizi Bought a 1060 for 300 euros just after launch, then sold it for 350 euros in 2020 and bought a second-hand RTX 2060 Super with it, which I can probably use for a few more years.
@@pleheh Very nice. I went from no dGPU (iGPU) to a gaming laptop in college with a 1080 (big mistake), then an Alienware tech broke it so they gave me a 2080 laptop (ran horribly and over 100 degrees nonstop), and then eventually just built a proper gaming PC with a 4080 Super, which was fun since it was my first non-work-related self-made PC. I plan on using it till it dies, or till the rapidly approaching point where 16GB of VRAM isn't enough.
In the Cyberpunk benchmark comparing the 4090 to the 5090, the 4090 is using frame generation at 2X, while the 5090 is using it at 4X. Cut the frames on the 5090 down to a fair comparison and you get 108 FPS vs 121 FPS. The real difference is only around 13%, but that doesn't account for the overhead required for 4x frame gen, so most likely around 20% to 25%... maybe?

The 5070 at 4X frame gen is competing with a 4090 at 2X frame gen, so logically the card is roughly half the speed of a 4090 natively, which places it at the same performance as a 4070 Super or 3090. Just use Lossless Scaling for 4x frame gen and you won't need the 50 series for literally anything, unless you specifically value its neural rendering for some reason.

Also, a question I don't see asked often enough: if the training for frame generation and everything is already done on an Nvidia supercomputer over at HQ, why can't previous-gen cards use it? All you have to do is squint at their marketing and you'll see more than clearly through the smoke. In pure rasterization the 5070 will lose to the 9070 XT, so you can also expect AMD's card to cost too much at this point as well.
There may be a perf penalty to generating those extra 2 frames, so optimistically slightly better than a 13% improvement. Gonna guess around 25% real improvement.
That was my first reaction as well after I learnt DLSS 4 frame gen seems to be just a multiplier increase for the frame gen component. I'll reserve judgement on Blackwell's improvement in pure horsepower; I'll give Nvidia the benefit of the doubt, since Blackwell may still boast a larger compute improvement. But I would have thought a bigger fuss would have been made about the improvements shown on stage at CES 2025 being attributable to the DLSS 4 frame gen multiplier instead of compute horsepower. To be fair, it should also be noted that the generated frames from DLSS 4 may be of higher quality than competitors'.
Going by the specs, the 5090 will definitely be much faster than 13%. It's not that simple that you can just cut the frames in half and calculate the uplift; there must be a performance penalty for generating the frames. They released a video of the 5090 running the Cyberpunk DLC at around 27 fps at 4K maxed out with path tracing, without DLSS/FG; the 4090 gets like 18 fps in that particular area, so it's like a 50% improvement. Best to wait for third-party reviews with updated drivers.
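The normalization arithmetic in this thread can be sketched in a few lines. The displayed fps values are assumptions implied by the 108 vs 121 rendered-fps figures quoted above (roughly 216 fps for the 4090 at 2x and 484 fps for the 5090 at 4x), not measured results, and the cost of generating the extra frames is ignored:

```python
def rendered_fps(displayed_fps: float, framegen_multiplier: int) -> float:
    """Frames actually rendered per second, assuming each displayed group
    is 1 rendered frame plus (multiplier - 1) generated ones.
    Ignores the overhead of generating the extra frames."""
    return displayed_fps / framegen_multiplier

# Hypothetical slide numbers implied by the thread above:
fps_4090 = rendered_fps(216, 2)  # 4090 with 2x frame gen -> 108 rendered fps
fps_5090 = rendered_fps(484, 4)  # 5090 with 4x frame gen -> 121 rendered fps

uplift = fps_5090 / fps_4090 - 1
print(f"native uplift ~ {uplift:.0%}")  # ~12% before frame-gen overhead
```

As the replies note, the real uplift is likely higher, since generating frames isn't free.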
writing and delivery were so funny in this episode, i was lol-ing and almost spilled watching this while drinking my coffee this morning (esp. the sarcastic therapy jokes, well done guys!)
wake me up when these are real world prices in europe and not some fantasy numbers on the big screen that are true in some selective specific edge cases with 4 types of dlss and 3 types of frame gen enabled in this 2004 source engine game
Generally prices in Europe are more or less the same as the published US ones, but with VAT. The US prices are without sales tax, which is different in each state and even within each state; there are a few states with no sales tax. In Europe the price must include all taxes, so they take the US price with no tax and add a sum to reach the next X49/X99 price in euros. Usually the added cost is lower than the actual VAT in most European countries. The infamous PS5 Pro was 699 dollars (before tax) in the US and 799 euros in most of Europe, which is the US price in euros plus ~19% for VAT, while in many European countries VAT is 20% or higher.

I prefer to pay more for a graphics card than go bankrupt because I got sick, have actual paid vacation and sick leave, not lose everything because I lost my job, be able to study without huge loans, pay far less for the "less important" things like food or medicine, have good affordable public transit, etc.
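The pricing pattern described above can be sanity-checked with the PS5 Pro numbers. The exchange rate here is an assumed round figure, not an official one:

```python
# US pre-tax MSRP -> EU VAT-inclusive price, per the comment's reasoning
usd_price = 699     # PS5 Pro US MSRP (before sales tax)
usd_to_eur = 0.96   # assumed USD->EUR rate around launch (hypothetical)
markup = 0.19       # ~19% padding the comment infers, close to typical VAT

eu_price = usd_price * usd_to_eur * (1 + markup)
print(round(eu_price))  # ~799, the actual EU launch price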
Very misleading. The 5070 will have about half the raw performance of the 4090. The comparison is between a 5070 with DLSS 4 and a 4090 with DLSS 3.5. DLSS 4 will be available on the 40 series too, and if you benchmark these two cards with DLSS 4 enabled on both, the 4090 will be way better.
@@CJTheTokay It says it in NVidia's benchmark slides in the disclaimer at the bottom. In fact, it's using DLSS 4 (4x mode), meaning it was generating even more frames than the old DLSS 3 Frame Gen used in the 4090 they were comparing it to.
@@CJTheTokay Each graph or comparison showed that the title was running DLSS, or DLSS 4 vs 3.5. Besides, do you really think Nvidia would sell raw performance on par with the previous gen and suddenly three times cheaper? Christmas miracles are over...
The reason Nvidia is pushing AI and DLSS so hard is because they can’t make the transistors any smaller, it costs too much to find an alternative solution to the Silicon Wafer technology. I think a lot of us realize this now which is why the technology is dramatically slowing down every year, but the corporation still needs to pump out products for its stock price.
Nvidia listed their Far Cry 6 comparison benchmark that doesn't use DLSS 4 and it looks like the 5070 is only 20%-25% better than the 4070 in reality. Very standard generational increase. Same for the 5090 vs 4090.
@@alumlovescake Possibly overhead from 4x gen; it's faster than normal rendering, but not free. Far Cry 6 is what we would call a fair comparison, according to Nvidia's own slide. Wait for GN and Hardware Unboxed to release proper benchmarks and we will know the real numbers.
A reminder: this is the most cut-down generation of graphics cards in Nvidia's history. The 5080 is a mere 44% of the full 102-class die (the norm for 80/80 Ti class cards is 80-95%). You are effectively purchasing 60 Ti/70 class silicon for 90 class money. Do not buy these cards.
@ Don't get me wrong, these cards will be faster than last gen, but they should have been LOADS faster. It's bad enough there isn't a node switch (ie they are using cheaper manufacturing); they are once again locking features to cards to give you FOMO. The laptop naming nomenclature still hasn't been fixed (the 5090 mobile is just a 5080, and so on). These cards are just as VRAM starved as the last two gens have been, despite 3GB GDDR7 modules being available. That alone would have made the 5080 a 24GB card and the 5090 a 48GB card, but nooooooo, that would cannibalise sales of the overpriced AI cards.
@@Creepernom it is. Your wilful apathy is exactly why this company has no incentive to improve. It doesn’t matter if they make good products or bad products because they are effectively a monopoly when you need gpus to do anything that isn’t gaming. Demand better for the love of god.
@jasonhemphill8525 But they are improving. The performance is better. They are introducing a bunch of new tech like DLSS4, Reflex 2, etc. The pricing is better than before. Clearly this decision isn't being a greedy bastard if from a consumer's point of view, everything from price to performance has improved significantly.
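The 44% figure quoted a few comments up can be checked against the commonly reported CUDA core counts. Treat the counts below as assumptions rather than verified specs:

```python
# Die "cut-down" as a fraction of the flagship silicon,
# using commonly reported CUDA core counts (assumed, not verified):
full_gb202_cores = 24576   # full 102-class die
rtx_5080_cores = 10752     # GB203-based RTX 5080
rtx_5090_cores = 21760     # RTX 5090 (itself a cut-down GB202)

share_5080 = rtx_5080_cores / full_gb202_cores
share_5090 = rtx_5090_cores / full_gb202_cores
print(f"5080: {share_5080:.0%} of the full die")  # ~44%
print(f"5090: {share_5090:.0%} of the full die")  # ~89%
```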
The 2080ti which was a flagship gpu has the same performance as the 3070 so yes, it could be possible but i'm confused if 5070=4090 raw perf or 5070 with dlss = 4090 raw.
To everyone talking about how the 5070 needs DLSS and frame gen to compete with the 4090: I don't think saying how something performs with DLSS or frame gen is a bad thing, but I do think they should:
1. Tell how BOTH cards perform with DLSS and frame gen (ie how both the 4090 and 5070 perform with it),
2. Make it VERY clear when DLSS and frame gen are on, and
3. Say how they perform without it on as well.
I'd say if they did those 3 things then everything would be ok. Since some people do use DLSS and frame gen, those users would know how a card will perform, people who don't use them would know too, and if Nvidia makes it clear enough whether it's on or off then no one is confused about how something will perform. Sry for long text
I mean tbh these prices are a surprise, but they're gonna be irrelevant at launch at least because these are gonna be scalped to hell, ESPECIALLY the 5090. Nvidia using AI/DLSS as excuses to make ridiculous performance claims is certainly not a surprise though.
He's so out of touch and cringey on stage. Every single time. Does nobody tell him, "Hey, you sound like a douche up there on stage.. Maybe just talk about your product and stop trying to seem cool."
Yet their real worry should be the middle-aged parents that bought that Switch 2 "for the kids" who rocket jumped in Quake and surfed in Tribes that now have mouse-like input on a console.
That’s performance, if it’s part of their GPUs features then it’s something that needs to be included. I don’t get why yall wanna ignore something that’s part of the product..
@@RandomPersonOnTheInternet1234 Because it's software; they're not selling you hardware that matters anymore. The reason they cut stuff is to protect their big AI cards. Open your eyes or don't, and go buy DLSS 4, because that's all you're paying for. Sure they include it, but they make graphs with it and make it their main selling point; raw performance has been left behind for years now.
What a surprise, a person promoting his product, which has a lot of focus on AI, mentions AI a lot. What next, a car manufacturer talking about BHP, performance, MPG? Or a YouTube tech channel talking about... hmm, tech? It's terrible. OMG, what is the world coming to!
5090 is $4039 in Australia. There's no point in even thinking about that price unless there is a sudden game development renaissance and they stop making low effort slop
@titan_fx , AMD needs to offer equal RT performance, much better raster performance, AND offer it at a lower price than the equivalent NVIDIA chip if they want anyone to buy their cards. Unfortunately, I don't see them doing that.
@@GoodGamer3000 But why disappointed? A week ago we heard the 5080 would be like $2800, but now for only $549 (if Nvidia isn't lying, of course) we get the same performance as a 4090, and according to the specs the 5070 Ti would 100% be better than the 4090 at almost half the price. Everybody mentions the taxes that get included, but it still makes no difference, because Nvidia also released the 4090 at $1599 (no taxes included) and at release the price didn't get abused.
@@freedomutilities9240, I will not take NVIDIA's numbers at face value, especially with them throwing DLSS into the mix. $1000 is still too high for a 5080 and $2000 is way too high for a 5090. $550 is not too terrible for a 5070, but we don't actually know how it performs yet. Overall, I don't think this will even be as good as the 3000-series, which was mediocre, even before all the crypto crap.
What about us old timers that don't enjoy blurry games with input delay? Is it still comparable to a 4090 if I want to run the game without AI gobbledygook?
@@turbochargedfilms DLSS 4 utilizes frame gen (there's no other way they can double the framerate), which introduces input lag; "blurriness" most likely refers to the AI upscaling messing up and causing visual bugs.
8:06 Same. Between AMD and Intel I have no idea wtf any of the names mean anymore. Nvidia seems to be the only company with a naming scheme that makes any sense and they are charging a premium for it.
when the 5000 series releases so the price of the 4090 goes down so the price of the 3090 ti goes down so the price of the 2080 ti goes down so the price of the 1080 ti goes down so that i can get a 1050 ti : - ))
@@surft They also said they're trying to ensure the launch has a large stock, which I think checks out with them discontinuing the 40 series cards early. A large initial supply should be effective at combating scalpers even past launch.
@@BruhMomentTTV no tariffs apply if they have been manufactured, assembled, and shipped to warehouses stateside before they kick in. guarantee you nvidia, amd, and a great many other companies stockpiled as much as they could before trump takes office
$2000usd 5090 base price killed all my interest in 5000 series. Nothing can max out 4090 series at settings I can see. Hopefully ryzen ai max plus super pro ai 395 is as good as the specs sound, and makes some killer steam deck / gaming netbook SKUs.
The only reason to get a 5090 imo is to power extremely high end VR headsets without having to bother with foveated rendering or downscaling. My headset has an effective resolution of 7680x2160 that needs to be pushed at 110fps for a great experience. My 3090 can do a pretty fine job but on more visually complex games, or horribly optimized ones like vrchat, the performance can suffer, of course suffer meaning dropping below 60fps.
If your game is resource-bound on either CPU or GPU, turning on Reflex will lower FPS but provide near-constant latency. It works by attempting to synchronize CPU and GPU, taking into account your input lag (if the hardware supports it). Those effects might be desirable, or not, so there's an option to disable Reflex in case of bad results (too much desync, preference for higher fps, etc). Note that the above is true for current-gen Reflex. The 50 series is supposed to inject predictive input to reduce frame output gaps and desync, which translates to improved average latency and reduced desync.
@bepamungkas Ah, thanks for the info. I'm a bit skeptical about this predictive input though. It seems like it could feel like input lag if I'm doing something like a twitch shot and it predicted I would keep moving the mouse the same way at that moment in time.
@@resresres1 Input data is fed from the CPU to the GPU, and is the authoritative input. The predictive input only affects which part they need to render (i.e. if you turn left, they can confidently discard the right part of the frame and "stitch" in the left part). Your input is then injected last to actually render the scene. Unless you're playing a "tricky" game like ARMA with maxed-out render distance, this process should not affect fps severely, while improving perceived latency. As long as the latency is stable, our brain will do the rest of the work to "normalize" the interaction.
Most North America cell carriers don’t charge for domestic roaming and unlimited plans will also include Canada and Mexico for free as well. However, non unlimited plans can cost between $6 to $12 a day, depending on your carrier, which is absolutely insane.
Update 2: Nvidia has provided the prices for Spain, which are a bit higher than the German ones.
GeForce RTX 5090: 2,369 euros
GeForce RTX 5080: 1,190 euros
GeForce RTX 5070 Ti: 899 euros
GeForce RTX 5070: 659 euros
Update: The German prices, which should be the reference for Europe, have already been published. They are as follows:
GeForce RTX 5090: 2,329 euros
GeForce RTX 5080: 1,169 euros
GeForce RTX 5070 Ti: 879 euros
GeForce RTX 5070: 649 euros
This is the perfect video. Good, fast information which is perfect for my work break, funny and gave me some good laughs, and just the info I need TECH NEWS
Im old enough to have paid ~$1500 for a Pentium 60mhz chip - I don't wanna hear anyone complain about losing 1/3 in 2 years - I lost around 80% in one ;)
@@lelandbatey Hate to break it to you, but most of Nvidia's money doesn't come from GPU sales anymore. Hell, my dad is obsessed with Nvidia products for the AI and self-driving tech; he doesn't play video games and couldn't care less about GPU performance. Most Nvidia fans nowadays aren't even gamers, they're Tesla owners and stock market traders and AI programmers.
1. The 5070 is not as powerful as the 4090 in raw performance; it is just a marketing scam. The 5070 is using 4x frame gen and DLSS whereas the 4090 is not.
2. The 5090 is also not double the performance of the 4090. It is around 20%-30% faster. Again, a marketing scam, don't get fooled.
they lied about the 5070. because of the versions of DLSS they were using, they tried making the 5070 sound better than it is. The actual power of the 5070 is more like the 4070ti
The Australian prices are awful. Glad the US are great. But with reference cards not being sold here officially and with reference cards being more expensive it's pretty sad for Aussies
@@Shinkajo 1. GST is great; the tax is included in the price of the product, not calculated after or during checkout. 2. Our import tax is a bitch, and because of our low population (24 million people), stuff generally doesn't change. Take the new Surface Pro laptops that came out last year and dropped $200 USD off MSRP from the previous models because they used Snapdragon chips: even though they dropped in the States, the MSRP stayed the same here.

The RTX 5090, if you convert, is $3200 AUD, so apparently $800 is our import tax, because the 5090 is $4000 AUD, and from what I know nothing has really changed much for import tax. My 4080 was $1500 AUD for a Zotac card (to be fair, it was about $200 off at the time), but the 5080 is supposed to be $2200 AUD, the reference card isn't really sold in Australia, and the board partner cards will cost more. Long story short, the PC market is weird here; with Nvidia cards sometimes having cheaper MSRP than AMD cards, it can get very confusing pretty quickly after watching reviews.
I don't think anyone was surprised at this point. It's Nvidia, flagships are gonna cost at least an arm and a leg, and most people won't care because they won't consider them anyway. Let the "I have to have the best of the best all the time" rich people sink their money while Nvidia smiles. Same to me.
Have fun paying a subscription to use the internet you already pay for. Oh and also not having your game library anymore, uninstalling and installing one or two games because everything is 100 to 300 gbs
just try to get a good deal on last gen hardware and it'll be way better than a console. Just act fast before people realize how small of an upgrade the 5070 is
if it makes you feel better, it was false advertising and the new gen GPU's are not nearly as good as Nvidia claimed (they included fake frames / frame generation as "performance")
it'll launch at $550, then once they sell out of those every 5070 will be substantially more expensive lol they always do that, it's just so it looks good at launch
I still find troubling the audience didn't mind Jensen making those PC influencers fight in his gladiator arena for his amusement. I mean he did it when the 4000 series came out, but at least the CES crowd didn't clap at the spectacle.
9:32 honestly Bing is way better than Google right now, lately I've started to use it when Google wasn't able to give me what I was looking for and more often than not Bing shows up the results on the first page with no issues. (to give you some more context I'm a 3D modeler and I've been working on some historical architecture lately so mainly information like measurements of building and usable reference images) Google just shows some random crap that is not even related to what I was looking for.
I've decided there are two types of people. One is general people who just want buttery smooth gameplay without complaining; they care about how smooth their games look and what they're paying for, so a 5070 with "4090 performance" at $550 is actually a steal for them. The other really cares about the very little things, like how they don't like FG and won't pay for "fake frames", and they say things like "AI slop" or hate on AI because of tiny in-frame imperfections that really bother them. Me? I'm a bit broke, so going for a B580 I guess.
If the Switch 2 Mouse thing is real (and actually works well, and it's pushed by Nintendo for 3rd parties to support such input), that could be a big feature for the console. It would make certain types of games (like 4X) more pleasant to play on the console, and if mouse support becomes a standard feature that too could be a benefit (just connect any BT mouse, or wired with the dock) for FPS and various games that benefit from mouse control. No word on further VR/AR support for the console though, so I'm guessing they're passing on that for their next console. Maybe the next console will push VR/AR functionality, who knows.
Our currency took a dive against the USD a few days ago. With $1 AUD lucky to buy 60 cents USD, add tariffs and GST and it won't be far off that total amount.
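That back-of-envelope conversion looks like this; the exchange rate and the absence of extra tariffs are the comment's own assumptions, not official figures:

```python
usd_msrp = 2000          # RTX 5090 US MSRP, before sales tax
aud_per_usd = 1 / 0.60   # "$1 AUD lucky to buy 60 cents USD"
gst = 0.10               # Australian GST

aud_price = usd_msrp * aud_per_usd * (1 + gst)
print(round(aud_price))  # ~3667 AUD before tariffs or retailer margin
```

That lands within a few hundred dollars of the $4039 AUD price quoted elsewhere in the comments.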
10:24 that would actually be pretty bad ass!!! The switch already supports M&Kb input.... even though it's limited in function. Warframe is going to play so much smoother like that on the switch!!!
riley they cant keep you hostage forever
So Nvidia is openly admitting they've been overcharging for graphics cards... and they clap 👏 😂
Say thank you to Intel and Pat and nana 🙂
My names not guys
Thanks for the coverage. Goodnight!
CES in a nutshell
10% Product specs
90% AI Yapping
most companies are a dumpster fire at this point and I am HERE for it
If your fridge isn’t AI, might as well be neighbours with the dinosaurs and Atari then
Shareholders need to be happy bro sit down
@@Mike604 if by AI you mean "almost ice"
10% luck
20% skill
15% concentrated power of will
5% pleasure
50% pain
And 100% reason to fully expect these GPUs to get scalped
0:54 on par with RTX 4090 raw performance with DLSS 4 and frame gen
so basically yes no
kind of of like downloading ram
Not just frame gen, multi frame gen. Frame gen only generates one frame per frame rendered; multi frame gen generates 3 frames per frame rendered. In reality the 5070 is a 4070 Ti according to the first two games on the slide. It's an awful generational improvement.
@@TheGamingNorwegian Yup, but sadly most people will fall for this marketing scam and still overpay to scalpers.
dlss and fsr is like having motion blur on. pass
@@Xamiakassy Yeah, but in FSR's case it's free for all, including older GPUs, even from competitors.
DLSS is just a walled garden rn
@@itsmilan4069 so, free trash is better than paid-for trash?
both of them are still trash, unless you're visually impaired.
Advertising a graphics card by saying "Rendering in 720p will make your fps number go up" is insane. Like, yeah, that's how I played GTA 5 on a GTS 450 10 years ago.
That's not how dlss works
@@Numb_ that's exactly how dlss works
Yeah, they should show 4K raster and FG/DLSS etc; I want both.
Because number go up is marketing, not actual value of a card
@@devnol It's mostly how DLSS works; it's actually slightly worse.
His jacket is not alligator skin, it was generated with AI together with DLSS 4 with the latest Ray tracing technology.
😂
It's clearly using the new ai neural shaders!
where's my 4090 for 400usd? nv claims the 5070 for 549usd is just like the 4090 in performance! which means the 4090 should cost 400usd now as old power-hungry 2022 GPU)) right!?))
It's clearly cuckadile leather. It focuses his boomer energy to 9000.☮️
i thought it was rayLaced together
1:33 Twice as fast with DLSS DOES NOT COUNT!!!!
Nvidia says it does 😂. Oh Nvidia you suck!
I spend a lot of time correcting people with bad takes (e.g. RT/DLSS are useless), but Nvidia makes it hard when they try to fabricate reality. Even if the perceived fidelity of frame gen matched rendered frames (it doesn't), it would STILL be misleading marketing. As is, this is a clear telegraph that their rasterization performance uplift will be lackluster.
for me it does.
@@lordzed83 4x the fake frames vs 2x the fake frames
Can I ask why it matters if the frames are AI generated? DLSS has been doing this for a few years now, and I love DLSS performance; it looks good too, so what's the problem? Not only do we get more frames, the new frames are enhanced with AI super resolution to look crisp too! DLSS is the only reason I haven't had to upgrade from my 3070 yet.
The numbers being boosted by frame generation is just another reason not to spend any money until independent reviewers tell us how it really performs when it's just being, you know, a regular GPU and not some buzzword salad that does nothing for older titles or superior OS's.
If you look closely they compared pure RT performance too, which could approximate raster performance, and there we get something like 20% better performance. With that we get a power consumption increase too, and I don't see the point of buying a new card if perf per watt is the same.
Try the ECHO scheduler set to 0 to compensate for the frame gen. It's extremely responsive with 5800X3D on Linux for sure.
This exactly. What's the ACTUAL performance, with no DLSS upscaling or ray tracing? A recent ray tracing deep-dive test showed that despite all the titles that have it in the name, fewer than 10 actually improve visuals enough to be worth turning it on. It also showed you need a 16GB VRAM card to enjoy it, making the R in RTX invalid for MOST cards.
@@Talemix85 whats a superior os
Superior lol
7:34 "world's first commercial notebooks built with a Modular USB-C Port"
did no one tell them about Framework?
I guess they don't consider the Framework a commercial product, given the non-existent sales.
Framework is kind of a joke. People enjoy building pcs, not laptops.
Framework? The laptop brand that only sells at limited places and they wonder why it's underperforming?
Companies don't actually care if what they say is true
@@gormauslander truth has a price to them.
lies do not.
$550 for a 5070 isn't going to happen...
scalpers: $900 take it or leave it
Real price will be minimum USD $650 + AIB tax and sales taxes
hope we can at least get the 5060 Ti under 500...
799,- in Europe minimum.
We're still having chip shortages?
10:25 Funnily, the PS5 has the highest kid user base in gaming this generation, while from Nintendo's yearly fiscal report we know the majority of Switch users are adults.
Alright big bro
Makes sense, I've gotten my son a lot of cheap PS4 games during sales and then PS5 games during Xmas and his birthday. Meanwhile I buy the expensive $60 switch games 😂
@bobhanson1037 That's how it goes these days. My Christmas gift was a Xenoblade X preorder; my nephew asked for some stuff for Fortnite or something XD
Damn those 30 to 40 year old kids!
that's because kids are dumb and have the "nintendo is for little babies, big kids and adults play, like, Call of Duty and stuff" mindset, which is just objectively false. Nintendo games are WAY better and far more fun than Call of Duty ever will be, and ever has been.
I hate hate HATE frame generation. I can't be the only one who can see the difference between DLSS and native generation
Casuals don't notice the input lag and artifacts. They're playing on a controller with glare hitting their tv.
I definitely noticed many weird glowing effects or water effects with DLSS turned on, where a laser glow might bleed on to the background or water reflections/lighting would just be nonsense. I wondered why it was happening so consistently in recent games, and DLSS was the culprit. My games now look so much better with DLSS off.
@@aromatic8565 Casuals *wheeeeeze* 😂💀
Heck I notice the difference between regular DLSS and native in Cyberpunk. Ghosting and motion trails are painfully obvious while driving around.
Stop buying green, problem solved. AMD has FSR whatever its called and it ain't good either, but at least it has good native performance at (what used to be) half the cost. I tried tinkering around with it last night in TC wildlands and it just wasn't happening.
I want to see how it compares in raw performance when it launches, Nvidia always claims something is much stronger and better than previous gen then it turns out just A.I and DLSS stuff.
At least in previous launches the new xx70 cards have been in the ballpark of the old xx90 cards. That won't be the case here. The 5070 gets 15-20% more performance than the 4070 according to the first two games on Nvidia's slide, making it a 4070 Ti.
No node shrink. Just more silicon (read more power draw) and faster ram.
The 30 series was decent, and 10 series was great. Usually it's cyclical. Hopefully it'll be a decent gen upgrade per $. And make used cards cheaper 😁
@ 3080 was good but it set a bunch of bad precedents.
Disgusting
The RTX 5070 will not be as fast as a 4090. The fake frames shown on your FPS counter will be about the same, but frames != performance anymore.
bros comment is a boolean expression
100 times "YES"! Fake frames don't improve performance. They might improve visual smoothness.
Can literally download lossless scaling and show hundreds of fps on any GPU, don't mean it actually is running that fps 😂
@@Winnetou17 Visual smoothness... just call it temporal fidelity.
@@cblack3470 oh darn, forgot about that tool , thanks for reminder.
5070 on par with a 4090? BS. Take this with a large grain of salt.
It will be using frame gen which sucks in most games
Yeah, it should have been mentioned in the video that this is achieved using multi-frame generation and so is apples to oranges marketing BS
@Workaholic42 it has been always like this with Nvidia
I will wait anyway for DF Reviews to know the truth
yup the whole jar of salt
@@Workaholic42 DLSS 4 Multi Frame Generation generates up to three additional frames per traditionally rendered frame,
Did you really just accept that 5070 = 4090 claim at face value without mentioning all the fine print?
yea I can imagine the 5080 being slightly better than the 4090 tho
I don't think that required explicit clarification
Like half the VRAM which is the most important spec for anyone running local LLM's
3:11 Ok, wasn't ready for that. Got me for a laugh!
🤣🤣🤣🤣
"Happy for you, or sorry that happened", killed me.💀
"Intel Ultra 120U I don't even know what that is" haha
Same lmfao
Depending on where you are in life a 50 series card can run you between your share of the rent for one month up to one month’s mortgage payment. That’s crazy.
7 months' worth of work for me 😂
As long as you don't upgrade every gen, the price isn't as painful if you get tons of use from it. Then again, even if you upgrade every gen... you have to save, what, 70 bucks a month (yes, I know for some people that's a lot, but this card isn't for them) to afford it. Sadly gone are the days of the top-tier card being 1k, but at least the 5080 wasn't like $1500. The 6080, though... who knows.
@@Jzwizi bought a 1060 for 300 euros just after launch. Then sold it for 350 euros in 2020 and bought a second hand rtx 2060 super for it in 2020 which i can probably use for a few more years.
Buying last gen after the release of a new generation of GPU's launches has always been the best bang for the buck anyway.
@@pleheh very nice, i went from no dgpu (igpu) to a gaming laptop in college with a 1080 (big mistake) then an alienware tech broke it so they gave me a 2080 laptop (ran horrible and over 100 degrees nonstop) and then eventually just made a proper gaming pc with a 4080 super which was fun since it was my first non work related self made pc. I plan on using it till it dies or till the very rapid approach of 16gb of vram not being enough.
In the cyberpunk benchmark comparing the 4090 to the 5090, the 4090 is using frame generation at 2X, while the 5090 is using it at 4X. Cut the frames in half on the 5090 to have a fair comparison, and you get 108FPS vs 121FPS. The real difference is only 13%, but that doesn't account for the overhead required for 4x frame gen, so most likely around 20% to 25% ..maybe?
The 5070 at 4X frame gen is competing with a 4090 at 2X frame gen, so logically the card is roughly half the speed of a 4090 natively, which places it at the same performance as the 4070 Super or 3090. Just use Lossless Scaling for 4x frame gen and you won't need the 50 series for literally anything, unless you specifically value its neural rendering for some reason.
Also, a question I don't see being asked often enough. If the training for the frame generation and everything is already done on an nvidia super computer over at HQ, why can't previous gen cards use it? All you have to do is squint at their marketing, and you'll see more than clearly through the smoke. In pure rasterization, the 5070 will lose to the 9070 XT, so you can also expect AMD's card to cost too much at this point as well.
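The normalization in that first paragraph works out like this (a quick sketch; the 108 and 242 FPS figures are approximate slide readings taken from the comment, not official numbers):

```python
# Rough check of the "halve the 4x frame gen number" comparison above.
# Inputs are approximate readings from Nvidia's Cyberpunk slide.
fps_4090_2x_fg = 108      # 4090 with 2x frame gen
fps_5090_4x_fg = 242      # 5090 with 4x frame gen

# Halving the 5090's output puts both cards on "2x frame gen" terms
fps_5090_as_2x = fps_5090_4x_fg / 2   # 121

uplift = fps_5090_as_2x / fps_4090_2x_fg - 1
print(f"normalized uplift: {uplift:.0%}")   # ~12%, before 4x-gen overhead
```

The real gap should land a bit above that, since the 4x mode spends some of the 5090's frame time generating the extra frames.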
may be a perf penalty to generating those extra 2 frames, so optimistically slightly better than 13% improvement.
Gonna guess around 25% real improvement
That was my first reaction as well after I learnt that DLSS 4 frame gen seems to be just a multiplier increase on the frame gen component. I'll reserve judgement on Blackwell's improvement in pure horsepower - I'll give Nvidia the benefit of the doubt since Blackwell may still boast a larger compute improvement; but I would have thought a bigger fuss would have been made about the improvements shown on stage at CES 2025 being attributable to the DLSS 4 frame gen multiplier instead of compute horsepower.
It should also be noted that the quality of the generated frames from DLSS4 may be of a higher quality than other competitors, just to be fair.
Going by the specs the 5090 will definitely be much faster than 13%. It's not that simple that you can just cut the frames in half and calculate the uplift; there must be a performance penalty for generating the frames. They released a video of the 5090 running the Cyberpunk DLC at around 27 fps at 4K maxed out with path tracing and no DLSS/FG; the 4090 gets like 18 fps in that particular area, so it's more like a 50% improvement. Best to wait for third-party reviews with updated drivers.
@@dhaumya23gango75 you never trust a handpicked performance test or only one performance test on the internet
We'll see, but my old 5700 XT runs my indie games for now, so I'm not really interested in this generation either (I don't play AAA games that much).
writing and delivery were so funny in this episode, i was lol-ing and almost spilled watching this while drinking my coffee this morning
(esp. the sarcastic therapy jokes, well done guys!)
Hmmmh, that was a nice, delicious TechNews. I love it when 90% of the news cover actual product announcements and releases.
wake me up when these are real world prices in europe and not some fantasy numbers on the big screen that are true in some selective specific edge cases with 4 types of dlss and 3 types of frame gen enabled in this 2004 source engine game
19 % VAT, maybe leave the EU so you can import for cheaper
@@X_OtmanThe VAT here is 25% 😭
Rtx 5090 is 2400€ haha so costum could be way more xd crazy
Generally, prices in Europe are more or less the same as the published US ones but with VAT. The US prices are without sales tax, which differs in each state and even within each state; there are a few states with no sales tax. In Europe the price must include all taxes, so they take the US price with no tax and add a sum to reach the next X49/X99 price in euros. Usually the added cost to cover VAT is lower than the actual VAT in most European countries. The infamous PS5 Pro was 699 dollars (before tax) in the US and 799 euros in most of Europe, which is the US price in euros plus ~14% for VAT, while in many European countries VAT is 20% or higher.
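Worked out, the PS5 Pro example looks like this (treating the dollar and euro list prices as directly comparable, as the comment does):

```python
# PS5 Pro list prices quoted above: $699 pre-tax US, 799 euros VAT-inclusive EU.
us_price = 699          # USD, before sales tax
eu_price = 799          # EUR, all taxes included

implied_vat_markup = eu_price / us_price - 1
print(f"implied markup: {implied_vat_markup:.1%}")   # ~14.3%
```

So the markup baked into the euro price comes out below the 19-25% VAT rates mentioned in the thread, which is the comment's point.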
I prefer to pay more for a graphics card than go bankrupt because I got sick, to have actual paid vacation and sick leave, to not lose everything because I lost my job, to be able to study without huge loans, to pay far less for the "less important" things like food or medicine, to have good affordable public transit, etc.
@@AL5520 You could have made your comment about pricing without exaggerated criticism about the US.
Very misleading. The 5070 will have about half the raw performance of the 4090. The comparison is between a 5070 with DLSS 4 and a 4090 with DLSS 3.5. DLSS 4 will be available on the 40 series too, and if you benchmark these two cards with DLSS 4 enabled on both, the 4090 will be way better.
How do you know the 5070 was using dlss?
@@CJTheTokay It says it in NVidia's benchmark slides in the disclaimer at the bottom. In fact, it's using DLSS 4 (4x mode), meaning it was generating even more frames than the old DLSS 3 Frame Gen used in the 4090 they were comparing it to.
@@HanayouDev ah ok. That’s kinda scummy they didn’t mention that. Now I’m curious of the real performance of the 5070.
@@CJTheTokay Each graph or comparison showed that the title was running DLSS, or DLSS 4 vs 3.5. Besides, do you really think Nvidia would sell raw performance on par with the previous gen, suddenly 3 times cheaper? The time for Christmas miracles has passed...
@@CJTheTokay why worry when you can just wait for GN and HUB to review the bloody thing
3:44, hey guys you might want to correct this. The 7945HX3D was the first CPU with 3D V-Cache, not the 9955HX3D
Wasn't that the 5800X3D?
@@LukijsFroGo He meant (but didn't write) the first mobile (laptop) CPU.
That got cancelled or never publicly available
@@RohanSanjith Asus put the 7945HX3D in a ROG notebook (as did every other major vendor). It is very much a production unit.
I had no idea I needed to be reminded of the "dude you're gettin a dell" marketing campaign, but I def am glad I was. Thanks LTT!
The reason Nvidia is pushing AI and DLSS so hard is because they can’t make the transistors any smaller, it costs too much to find an alternative solution to the Silicon Wafer technology. I think a lot of us realize this now which is why the technology is dramatically slowing down every year, but the corporation still needs to pump out products for its stock price.
Nvidia listed their Far Cry 6 comparison benchmark that doesn't use DLSS 4 and it looks like the 5070 is only 20%-25% better than the 4070 in reality. Very standard generational increase. Same for the 5090 vs 4090.
good catch. i hadnt considered why it was so much lower than the others.
They say the 5090 using 4x gen is 2x as good as the 4090 doing 2x, which implies the 5090 is a very VERY minimal upgrade
@@alumlovescake Possibly the overhead of 4x gen. It's faster than normal rendering, not free.
Far Cry 6 is, what we would call, a fair comparison, according to NVIDIA.
Wait for GN and Hardware Unboxed to release proper Benchmarks and we will know the real numbers.
Still good if increase is 20-25%
"ONLY 20-25% better"? So what's your complaint, that's pretty good in a generation leap regardless of what brand you support
A reminder, this is the most cut down generation of graphics cards in Nvidia's history. The 5080 is a mere 44% of the total 102 die (the norm for 80/80ti class cards is 80-95%).
You are effectively purchasing 60ti/70 class silicon for 90 class money.
Do not buy these cards.
that's abysmal. I guess we're all gonna have to wait for the fated LTT video once they're done with the testing. and GN as well
I genuinely could not give less of a shit about die coverage or whatever. What matters is performance, and clearly it's not 5060 level.
@ don’t get me wrong, these cards will be faster than last gen but they should have been LOADS faster.
It’s bad enough there isn’t a node switch (ie they are using cheaper manufacturing), they are once again locking features to cards to give you FOMO.
The laptop naming nomenclature still hasn’t been fixed (the 5090 mobile is just a 5080, and so on).
These cards are just as VRAM starved as the last two gens have been, despite 3GB GDDR7 modules being available. That alone would have made the 5080 a 24GB card and the 5090 a 48GB one, but nooooooo. That would cannibalise the sales of the overpriced AI cards.
@@Creepernom it is. Your wilful apathy is exactly why this company has no incentive to improve. It doesn’t matter if they make good products or bad products because they are effectively a monopoly when you need gpus to do anything that isn’t gaming.
Demand better for the love of god.
@jasonhemphill8525 But they are improving. The performance is better. They are introducing a bunch of new tech like DLSS4, Reflex 2, etc. The pricing is better than before. Clearly this decision isn't being a greedy bastard if from a consumer's point of view, everything from price to performance has improved significantly.
That 5070 is actually a 5060 in disguise and it looks like people are falling for it. A 105 die has never been used for a 70 series card until now
Because a die number is what matters, not performance…
That's why we wait for reviews to inform us on performance.
What is your source on the die size/name? It wasn't mentioned in this video right?
The 2080 Ti, which was a flagship GPU, has the same performance as the 3070, so yes, it could be possible, but I'm confused whether it's 5070 = 4090 in raw perf, or 5070 with DLSS = 4090 raw.
@@rdspam Clearly it matters. Have you not seen how stagnant performance has been for 60-70 class cards?
If the 50 series actually is that price and performance I'll actually upgrade from my 1080 ti.
Same, 1080 here - Non ti. 5070 TI looks really tempting
To everyone talking about how the 5070 needs dlss and frame gen to compete with 4090:
I don't think saying how something performs with DLSS or frame gen is a bad thing, but I do think they should 1. tell how BOTH cards perform with DLSS and frame gen (i.e. how both the 4090 and 5070 perform with it), 2. make it VERY clear that it's with DLSS and frame gen on, and 3. say how they perform without it on as well.
I'd say if they did those 3 things then everything would be OK, especially since some people do use DLSS and frame gen. Those users would know how a card will perform for them, people who don't use them would know too, and if Nvidia makes it very clear when it's on or off, then no one is confused about how something will perform.
Sry for long text
the biggest surprise of the whole show is that it was a gator skin jacket instead of the traditional leather
with these prices i wouldn't be surprised if by next generation in 2027 Jensen will be wearing an albino alligator jacket instead.
I mean tbh these prices are a surprise, but they're gonna be irrelevant at launch at least because these are gonna be scalped to hell, ESPECIALLY the 5090.
Nvidia using AI/DLSS as excuses to make ridiculous performance claims is certainly not a surprise though.
He was so close to breaking into Mr. Burns' "See My Vest!" song.
@@Junebug89 yeah cause AMD dont do the same with FSR do they......
He's so out of touch and cringey on stage. Every single time. Does nobody tell him, "Hey, you sound like a douche up there on stage.. Maybe just talk about your product and stop trying to seem cool."
I'm betting the 5070 is on par with 4090 only in games where DLSS 4 is available. In raw power, I have doubts.
right on the money
5070 is probably on par with 4090 if the 5070 is using dlss4 and 4090 isn't using anything lmao
5070 vs 4090 Unplugged
Duh. This isn't exactly the Hoax of the Century. They just know most PC gamers will stomach any amount of disrespect for new Nvidia hardware. 🤷🏿♂️
That last joke had me rolling! "Oh no, the children are here"
Yet their real worry should be the middle-aged parents that bought that Switch 2 "for the kids" who rocket jumped in Quake and surfed in Tribes that now have mouse-like input on a console.
you absolutely kill these vids, Riley. you have the crown, now and forever, amen.
Those are twice as fast while using dlss and multi frame gen. I wanna see real power. Show me raster!
Only fully rendered frames should count 100% agree. Bulk power, not machine hallucinations.
0:53 Don't know why they go with that lie instead of adding that it's only with the new AI features, which are 50 series only, and not actual real performance...
I think he lightly mentioned it was with the new DLSS or something like that soon after that statement.
Cause most people think bigger number better always
That’s performance, if it’s part of their GPUs features then it’s something that needs to be included. I don’t get why yall wanna ignore something that’s part of the product..
@@RandomPersonOnTheInternet1234 because it's software, they're not selling you the hardware that matters anymore. The reason they cut stuff is because they are protecting their AI big boy cards. Open your eyes or don't and go and buy dlss4 because that's what you're paying for only. They include it but they make graphs with it and also make it their main selling point, raw performance has been left behind for years now.
@@RandomPersonOnTheInternet1234 It's not comparable performance. Therefore that graph is misleading at best and a lie at worst.
Man, I can't believe I hadn't already thought of "Dude, you're getting Adele. " in all these years. Thank you for that.
Jensen mentioned AI like 200 times.
Jensen is probably the person you'd most expect to mention AI, because all his plans, vision and products were made for AI growth.
have to please the investors somehow
What a surprise, a person promoting his product, which has a lot of focus on AI, mentions AI a lot. What next, a car manufacturer talking about BHP, performance, MPG? Or a YouTube tech channel talking about... hmmm, tech? It's terrible. OMG, what is the world coming to!
Theoppositeopinion because it's the same AI as the 4090, barely any difference whatsoever
5090 is $4039 in Australia. There's no point in even thinking about that price unless there is a sudden game development renaissance and they stop making low effort slop
I'm amazed how fast you guys had this video edited and ready. Bravo!
Wowie, I had low expectations of NVIDIA's new lineup and was still disappointed
Hope AMD doesn't botch their GPU launch (again).
@titan_fx , AMD needs to offer equal RT performance, much better raster performance, AND offer it at a lower price than the equivalent NVIDIA chip if they want anyone to buy their cards. Unfortunately, I don't see them doing that.
@@GoodGamer3000 But why disappointed? A week ago we heard the 5080 would be like $2800, but now, for only $549 (if Nvidia isn't lying, of course), we supposedly get the same performance as the 4090, and according to the specs the 5070 Ti would 100% be better than the 4090 at almost half the price. Everybody mentions taxes, but that makes no difference, because Nvidia also released the 4090 at $1599 (no taxes included) and at release the price didn't get abused.
The 5070 Ti doesn't look that bad; it's an improvement and it's cheaper than the original 4070 Ti.
@@freedomutilities9240, I will not take NVIDIA's numbers at face value, especially with them throwing DLSS into the mix. $1000 is still too high for a 5080 and $2000 is way too high for a 5090. $550 is not too terrible for a 5070, but we don't actually know how it performs yet. Overall, I don't think this will even be as good as the 3000-series, which was mediocre, even before all the crypto crap.
What about us old timers that don't enjoy blurry games with input delay? Is it still comparable to a 4090 if I want to run the game without AI gobbledygook?
nope
There are many reasons to criticize and be wary of DLSS, but "blurriness" and input delay are not either of them.
Would require devs to actually start optimizing games again….
@@TillmannHuebner DEI hires don't do dat. How about a seminar about anxiety or working 2 hours a day.
@@turbochargedfilms DLSS 4 utilizes frame gen (there's no other way they can double their framerate), which introduces input lag; "blurriness" most likely refers to the AI upscaling messing up and causing visual bugs.
Where is Bartlett Lake on lga1700? Are these Arrow Lake architecture 200S series chips going to be on lga1700? I don't think so.
8:06 Same. Between AMD and Intel I have no idea wtf any of the names mean anymore. Nvidia seems to be the only company with a naming scheme that makes any sense and they are charging a premium for it.
9:00 Hold on wait you did not roast them enough for this. This is absolutely insane
Cannot wait to see the reviews!
You have no choice. You must wait.
when the 5000 series releases so the price of the 4090 goes down so the price of the 3090 ti goes down so the price of the 2080 ti goes down so the price of the 1080 ti goes down so that i can get a 1050 ti : - ))
What GPU are you rocking that you are upgrading to a 1050 TI? Thats some next level poverty gaming.
@@JerichoMultiMedia integrated intel graphics 🤣
@@user-op8fg3ny3j I'm currently gaming with integrated graphics on a 2019 Macbook using a screen resolution I didn't even know existed.....
poor man gaming? who would be excited about getting a glorified paperweight by the name of 1050ti
@@JerichoMultiMedia rx 560 here
This was the most fun episode in a long time. Thanks for the laugh 😂
the sound issue in the video made me think I had a grounding problem for my speakers :(
Remember when an 80 series used to cost $600?...
The catch isn't the naming but the performance. Nvidia actually could put a $600 price on the 5080, but the performance would also be reduced.
You mean 80 class.
Pepperidge Farms remember.
Anyone remember when a gallon of gas was 25 cents?
Remember when $600 used to be worth $600?
The $550 5070 will be well over $1k, assuming that we'll have to deal with the usual supply shortages if the scalpers have their way.
As long as mining is not viable, I don't think it will be as high or will go down quickly once supply normalizes
@@surft They also said they're trying to ensure the launch has a large stock, which I think checks out with them discontinuing the 40 series cards early.
A large initial launch should be effective at combating scalpers even past launch.
They probably raise MSRP with the planned tariffs in the US
no sane person will buy the 5070 for 1k, when it's a 10-20% max generational improvement over the 4070
@@BruhMomentTTV no tariffs apply if they have been manufactured, assembled, and shipped to warehouses stateside before they kick in. guarantee you nvidia, amd, and a great many other companies stockpiled as much as they could before trump takes office
$2000usd 5090 base price killed all my interest in 5000 series. Nothing can max out 4090 series at settings I can see.
Hopefully ryzen ai max plus super pro ai 395 is as good as the specs sound, and makes some killer steam deck / gaming netbook SKUs.
not even indiana jones at 4k native res?
Let's face it, you never really had an interest, did you? Just trying to get a liked comment by having a downer on Nvidia.
The only reason to get a 5090 imo is to power extremely high end VR headsets without having to bother with foveated rendering or downscaling. My headset has an effective resolution of 7680x2160 that needs to be pushed at 110fps for a great experience. My 3090 can do a pretty fine job but on more visually complex games, or horribly optimized ones like vrchat, the performance can suffer, of course suffer meaning dropping below 60fps.
@@theoppositeopinion9290 You must be a bot or a fanboy. Getting so hurt over an opinion 😂😂😂
@@Gahlfe123 Nobody's more hurt than the guy with the crying laughing faces.
question. If reflex reduces latency, why would it ever be an option to turn it off or on? why shouldn't it just always be on?
If your game is resource-bound on either the CPU or GPU, turning on Reflex will lower FPS but provide near-constant latency. It works by attempting to synchronize the CPU and GPU, taking your input lag into account (if the hardware supports it).
Those effects might be desirable, or not. So there's an option to disable Reflex in case of bad results (too much desync, preference for higher fps, etc.).
Note that the above is true for current-gen Reflex. The 50 series is supposed to inject predictive input to reduce the frame output gap and desync, which translates to improved average latency and reduced desync.
@bepamungkas Ah, thanks for the info. I'm a bit skeptical about this predictive input though. Seems like it could feel like input lag if I'm doing something like a twitch shot and it predicted I'd keep moving the mouse the same way at that moment in time.
@@resresres1 input data are fed from CPU to GPU, and is the authoritative one. So the predictive input only affect which part they need to render. (i.e if you turn left, they can confidently discard right part of frame then "stitch" the left part). Your input is then injected last to actually render the scene.
Unless you're playing "tricky" game like ARMA with maxed out render distance, those process should not affect fps severely, while improving perceived latency. As long as the latency is stable, our brain will do the rest of the work to "normalize" the interaction.
This show makes me laugh out loud at some point, every time. Thanks for the laffs, happy new year folks.
6:42 Jeez $37 for 25 gb of data! Roaming prices truly are out of hand in North America
Most North America cell carriers don’t charge for domestic roaming and unlimited plans will also include Canada and Mexico for free as well. However, non unlimited plans can cost between $6 to $12 a day, depending on your carrier, which is absolutely insane.
Update 2: Nvidia has provided the prices for Spain, which are a bit higher than the German ones.
GeForce RTX 5090: €2,369
GeForce RTX 5080: €1,190
GeForce RTX 5070 Ti: €899
GeForce RTX 5070: €659
Update: The German prices, which should serve as the reference for Europe, have already been published. They are as follows:
GeForce RTX 5090: €2,329
GeForce RTX 5080: €1,169
GeForce RTX 5070 Ti: €879
GeForce RTX 5070: €649
that is a whole ass 15% increase goddamn
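For reference, the German list prices versus the US MSRPs work out roughly like this (comparing euros to dollars at face value; the $999 and $749 US MSRPs for the 5080 and 5070 Ti are Nvidia's announced prices, added here for completeness):

```python
# German prices from the comment above vs Nvidia's announced US MSRPs.
prices = {            # card: (US MSRP in $, German price in EUR)
    "RTX 5090":    (1999, 2329),
    "RTX 5080":    (999, 1169),
    "RTX 5070 Ti": (749, 879),
    "RTX 5070":    (549, 649),
}
for card, (usd, eur) in prices.items():
    print(f"{card}: +{eur / usd - 1:.0%}")   # each lands around +17-18%
```

US MSRPs exclude sales tax while the euro prices include 19% German VAT, so most of that gap is tax rather than a regional markup.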
Wow - at an MSRP of $549, that means retailers and scalpers will likely only mark it up to $800!
What a great time to be alive.
This is the perfect video. Good, fast information which is perfect for my work break, funny and gave me some good laughs, and just the info I need TECH NEWS
gotta say the natural way Riley does the tech news and the charm is my favorite part of these
Paying so much money for a GPU only to have that price slashed by a third in 2 years is crazy!
you wish lmao
GPUs don't lose that much value
Im old enough to have paid ~$1500 for a Pentium 60mhz chip - I don't wanna hear anyone complain about losing 1/3 in 2 years - I lost around 80% in one ;)
I couldn't believe they spent 50 minutes talking about random ai stuff before spending like 2 minutes on the GPUs... haha
That AI stuff is vastly more important to the industry, than the graphical performance.
Wasn’t the gpu stuff before the ai talk
The AI stuff is vastly more important to suits and investors who want to know that they're doing "something".
He announced the GPU lineup first before his speech of AI and stuff. Have you watched the presentation?
@@lelandbatey Hate to break it to you, but most of Nvidia's money doesn't come from GPU sales anymore. Hell, my dad is obsessed with Nvidia products for the AI and self-driving tech; he doesn't play video games and couldn't care less about GPU performance. Most Nvidia fans nowadays aren't even gamers; they're Tesla owners, stock market traders and AI programmers.
If the 5090 is 2X more powerful than a 4090 and is $1999,
but the 5070 is as powerful as the 4090 and is $549,
why not just get two 5070s?
1. The 5070 is not as powerful as the 4090 in raw performance; it's just a marketing scam. The 5070 is using 4X frame gen and DLSS whereas the 4090 is not.
2. The 5090 is also not double the performance of the 4090. It's more like 20%-30%. Again, marketing, don't get fooled.
It doesn't scale like that, nor do they support sli I believe
Wait hold up, that's not how it works 💀
They lied about the 5070. Because of the versions of DLSS they were using, they made the 5070 sound better than it is. The actual power of the 5070 is more like a 4070 Ti.
Bro got two GPU slots in his tower 💀
Riley, you bring me joy. Thank you.
“Oh no, the children are here” is wild 😂
The Australian prices are awful. Glad the US are great. But with reference cards not being sold here officially and with reference cards being more expensive it's pretty sad for Aussies
Well i dont feel sorry for you mate, because over the ditch in kiwi land, we're envious of the better prices you guys get 😂😢
@x3ko777 at least we both had to wait 2 years for the steam deck to officially come out here. I've seen your game prices and yikes 😬
Can you explain why things are so expensive there? Especially digital things.
@@Shinkajo probably about the Trump tariffs
@@Shinkajo 1. GST is great; the tax is essentially included in the price of the product, not calculated after or during checkout.
2. Our import tax is a bitch, and because of our low population (24 million people isn't too low) prices generally don't change. The new Surface Pro laptops that came out last year dropped by $200 USD MSRP from the previous models because they used Snapdragon chips; even though they dropped in the States, the MSRP remained the same here. The RTX 5090 converts to $3200 AUD, so apparently $800 is our import tax, because the 5090 is $4000 AUD.
From what I know nothing has really changed much for import tax, so eh. My 4080 was $1500 AUD for a Zotac card (to be fair it was like $200 off at the time), but the 5080 is supposed to be $2200 AUD, reference cards aren't really sold in Australia, and the board partner cards will be more.
Long story short, the PC market is weird here. With Nvidia cards sometimes having cheaper MSRP than AMD cards, it can get very confusing pretty quickly after watching reviews.
1:56 bro any sane person will not pay attention to ai spiel
Very fun performance - thx, man!
0:16 I love my smart toilet, touchless clean
The whole industry doesn't know how to name things
Playstation and the PCIe group understand the value of simple naming.
Confusion and obfuscation is the intent, unfortunately
if you don't have time to watch the whole video, watch from 8:20. Holy cow, Riley is killing it HAHHAHA
CES is my favorite time of the year. It's like being a child again, you know, going to a rich friend's house to see all the cool gifts
I don't see enough people talking about the $2000 price tag
It's a flagship gpu? Did you expect it to cost under 2k.. im surprised that it wasn't 3k
Because enough people cannot afford the flagship anyway. Its for the wealthy & the youtubers 😂
People are just kinda forgetting that GPU prices these days are just 2-3x what they used to be.
I don't think anyone was surprised at this point. It's Nvidia, and flagships are gonna cost at least an arm and a leg, and most people won't care because they won't consider them anyway. Let the "I have to have the best of the best all the time" rich people sink their money while Nvidia smiles. Fine by me.
The richest people are gonna buy it anyway. It could also help lower the cost of the 5070, since they'll make a good amount of money from the 5090
God damn these GPU prices are really making me think about going back to console life.
I've got a Ryzen 2700 and a 2070 super and it's getting a little dated. Not bad but getting there
$550 is cheap?
Have fun paying a subscription to use the internet you already pay for. Oh, and also not having your game library anymore, and uninstalling and reinstalling one or two games because everything is 100 to 300 GB
@@DGKChoice 5700X3D upgrade, then upgrade to a 9070 XT (ASSUMING AMD DOESN'T FUMBLE THEIR LAUNCH AGAIN)
Just try to get a good deal on last-gen hardware and it'll be way better than a console. Just act fast before people realize how small of an upgrade the 5070 is
Scalpers are gonna be grabbing those 5070 cards and selling them for over $1k
If it makes you feel better, it was false advertising, and the new-gen GPUs are not nearly as good as Nvidia claimed (they counted fake frames / frame generation as "performance")
Merry Christmas and Happy New Year!!!
Righteous Destruction - Cyphruss
Righteous Destruction - Band Camp
5070 same as 4090 (** *)
Most likely a lot of DLSS 4, but for the price it's still good
I’m gonna need like 3 cross symbols, a wingding, and a superscript “2” for this one
It'll launch at $550, then once they sell out of those, every 5070 will be substantially more expensive lol. They always do that; it's just so it looks good at launch
True
its the riley guy
he's pretty cool, but I miss when he still hosted RileysRiledLinks
yay
Ma dude said “..sh ssh sshh! I know who you are….you’re chicken sh!#” 🤣💀
Funniest line yet.
I still find it troubling that the audience didn't mind Jensen making those PC influencers fight in his gladiator arena for his amusement. I mean, he did it when the 4000 series came out too, but at least that CES crowd didn't clap at the spectacle.
The 1080 Ti has so much performance that they're still releasing cards slower than it. I just hope it never dies
Crazy thing is you can still buy a 1080...for $300, which is crazy expensive for that old of a card (even though it's still good).
$2000 without a disc drive...
Scam!
2:19 “But Mark and I would prefer to go home at some point? So I’ll talk about that tomorrow.”
Wow. You didn’t even ask me! I would have stayed! 😭
Riley, you make the tech news so enjoyable. I was shitting myself laughing at the AMD stuff... great, just great delivery
best thing about the 50 series is that I'll finally be able to afford a 40 series
9:32 Honestly, Bing is way better than Google right now. Lately I've started to use it when Google wasn't able to give me what I was looking for, and more often than not Bing shows the results on the first page with no issues. (To give you some more context: I'm a 3D modeler and I've been working on some historical architecture lately, so mainly information like measurements of buildings and usable reference images.) Google just shows some random crap that isn't even related to what I was looking for.
Facts. Bing as a search engine has gotten surprisingly good, and I also use it when Google isn't showing me what I'm looking for
I haven't tried Bing a lot, but I agree, Google search has become absolute crap.
5070 = 4090? Oof. Looking forward to arguing on Facebook Marketplace about why I want your 4090 for less than $1k CAD
Honestly, what excites me about new CPU or GPU announcements is the current and last gen of CPUs and GPUs getting cheaper
Wanna buy a 4080 Super for $500-550, tax free? I'm selling
I guess it's news for games (5070), but if it has 16GB of VRAM (or at least 12GB), this will be great news for DaVinci Resolve users.
I’ve never been more confused with GPU and CPU branding changes on top of the product line name revisions
Lol. 5070 matching 4090 is going to prove to be _absolute_ bullsh!t. Complete and total nonsense.
4070
I've decided there are two types of people. One is general folks who just want to play games with buttery-smooth gameplay without complaining; they only care about how smooth their games look and what they're paying for, and for them a 5070 with 4090-level performance at $550 is actually a steal. The other is people who really care about the very little things, who don't like FG and won't pay for "fake frames", who say things like "AI slop" or hate on AI because of 7 ms of latency or tiny in-frame imperfections that really bother them.
Me? I'm a bit broke, so I'm going for the B580, I guess
If the Switch 2 Mouse thing is real (and actually works well, and it's pushed by Nintendo for 3rd parties to support such input), that could be a big feature for the console. It would make certain types of games (like 4X) more pleasant to play on the console, and if mouse support becomes a standard feature that too could be a benefit (just connect any BT mouse, or wired with the dock) for FPS and various games that benefit from mouse control.
No word on further VR/AR support for the console though, so I'm guessing they're passing on that for this next console. Maybe the one after will push VR/AR functionality, who knows.
The Bing/Google emulation is hilarious 😂😂😂😂😅
This was such a great episode. I want more 😩
So the RTX 5090 is $1999 USD ($3,187 AUD), but the Nvidia website is telling us it'll be $4,039 AUD. Where'd the extra $850 come from?
US Export tax to your region, nothing to do with NVidia
probably your government
That's the Australian government kick in the nuts, standard for computer parts
Our currency took a dive against the USD a few days ago. With $1 AUD lucky to buy 60 cents USD, add tariffs and GST and it won't be far off that total amount.
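To sanity-check the math in this thread: a rough back-of-envelope sketch, assuming an exchange rate of about 0.62 USD per AUD and Australia's 10% GST (both figures are assumptions, not official numbers). It shows that currency conversion plus GST explains most, but not all, of the $4,039 AUD sticker price.

```python
# Back-of-envelope: where does the RTX 5090's AUD price come from?
# Assumed inputs (not official figures): exchange rate and GST below.
USD_MSRP = 1999        # Nvidia's announced US price
USD_PER_AUD = 0.62     # assumed exchange rate
GST = 0.10             # Australian Goods and Services Tax

pre_tax_aud = USD_MSRP / USD_PER_AUD        # straight currency conversion
with_gst = pre_tax_aud * (1 + GST)          # add 10% GST on top
listed_aud = 4039                           # price quoted in the thread above

print(f"Converted:       ${pre_tax_aud:,.2f} AUD")   # ≈ $3,224
print(f"With GST:        ${with_gst:,.2f} AUD")      # ≈ $3,547
print(f"Listed:          ${listed_aud:,.2f} AUD")
print(f"Unexplained gap: ${listed_aud - with_gst:,.2f} AUD")  # ≈ $492
```

Under these assumed numbers, roughly $490 AUD is left over after conversion and GST, which would come down to distributor and retail margins (the "Australia tax").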
10:24 that would actually be pretty bad ass!!!
The Switch already supports M&Kb input... even though it's limited in function.
Warframe is going to play so much smoother like that on the switch!!!
It is, isn't it?
And about the switch already supporting M&Kb input... so do the Xbox and PlayStation.
They are lying about something. Stockholders would never allow underpriced, overdelivered products 😂.