How Good...or Bad...is Ampere?
- Published: 7 Feb 2025
- History doesn't lie.
♥ Check out adoredtv.com for more tech!
♥ Subscribe To AdoredTV - bit.ly/1J7020P
► Support AdoredTV through Patreon / adoredtv ◄
Buy Games on the Humble Store! -
►www.humblebund... ◄
Bitcoin Address - 1HuL9vN6Sgk4LqAS1AS6GexJoKNgoXFLEX
Ethereum Address - 0xB3535135b69EeE166fEc5021De725502911D9fd2
♥ Buy PC Parts from Amazon below.
♥ NEW USA Store! - www.amazon.com...
♥ Canada - amzn.to/2ppgYsX
♥ UK - amzn.to/2fUdvU7
♥ Germany - amzn.to/2p1lX6r
♥ France - amzn.to/2oUAK2Z
♥ Italy - amzn.to/2p37Uui
♥ Spain - amzn.to/2p3oIBm
♥ Australia - amzn.to/2uRTYb7
♥ India - amzn.to/2RgoWmj
♥ Want to help with Video Titles and Subtitles?
www.youtube.com...
"That's a real 2x, not a Jensen 2x...." OMG my sides!
this should be a meme, the Jensen multiplier, put it in the box with elon time and apple corners and "it just works"
@@geofrancis2001 how does elon time compare to valve time?
Lol
@@KibitoAkuya Elon time goes beyond 2.
But there are a lot of Nvidia fans that simp for Jensen, so they absolutely believe it.
And the worst part about this is that Fermi was an accident, Turing and Ampere are completely deliberate.
Chuckiele they know AMD is focusing its R&D on Ryzen, so they may have a good nap for a few generations then 🙈
Not necessarily. I think Nvidia planned 7nm for Ampere so it would have been at least 20% faster and more power efficient, but they had to redesign it completely.
@@perschistence2651 They cannot design a GPU in 1 year. No one can. They never redesigned Ampere at all, they designed both a 7nm TSMC and a 10nm Samsung Ampere GPU in case they couldn't manufacture their GPUs at TSMC. It's what really happened.
Fermi wasn't an accident. It was typical Nvidia at the time. Did you own a Tesla, for example?
Nvidia is only good when it has no competitors. That's why it always tries to eliminate them. Nvidia's glory is a myth.
@@rayaneh5230 I think they did. Ampere is not the true Ampere; they just took Turing, increased the number of INT32 units and made some small tweaks. We'll see the true Ampere at 7nm under another name. Ampere is a very cheap architecture.
“One of the saddest lessons of history is this: If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we’ve been taken. Once you give a charlatan power over you, you almost never get it back.” - Carl Sagan
Basically cognitive dissonance
Humbleness is the only way to overcome something that has dented your pride.
@Amandeep Singh no truer words can be spoken. It is what it is, we fight to break free of the Bamboozle, or relax and make ignorance as Blissful as possible.
It is easier to deceive people than to convince them they have been deceived.
-M Twain
And what is even more sad is that you can say that about AMD and Nvidia and pretty much any company imaginable that sells products. They have all bamboozled us and will continue to bamboozle us, we are all sheep in this game of life.
When I saw the 3080 day-one reviews I was very surprised how everyone just ignored the fact that it draws ~350 watts of power. Then came the 3090 with ~400W and still no one even mentioned it.
But I knew someone would question it, and I knew it was going to be Jim @ Adored Technology and Vision.
Thanks for your videos. Sadly common sense is not a disease spread by a virus worldwide...
To be honest, many of us don't really care about performance per watt.
I am one of those people.
I care more about how many FPS I get, and at what cost. I couldn't care less about power consumption as long as the cooling is sufficient, which it is on the 3080/90. Rumors say the flagship AMD card is gonna be 300W and I'm 100% OK with that if it means near-3080 performance at a cheaper price (price per FPS matters to me, nothing else).
Your comment must've been buried beneath the comments screaming in praise though, just like mine...
I don't think they ignored it so much as presumed that everyone and anyone looking at those reviews knew the TDP was 350 and 400, because it was in the press video.
@@someone-wi4xl I see a lot of people saying things along these lines, but I think a lot of them aren't understanding what it means to have hundreds of watts of heat dump into what is probably a bedroom. We're getting into literal space heater territory as many/most of them have a 500 watt setting.
@@dycedargselderbrother5353 Well, some of us live above the arctic circle, too :p
1.4k for the top GPU? A few years ago 1.4k was the cost of a top PC xD Hoping AMD can put up a fair challenge.
Yeah, I built my whole top-of-the-line PC for Crysis for like 1800 and that included the monitor as well. Now I'm building a new gaming PC and it's gonna cost me around $3500 (including the monitor), and that's if I only get an RTX 3080 10GB. I'm not going to spend double that for the 3090, it's just not worth it.
Some people will pay anything for bragging rights.
Andy Parker And some youtuber will slap it on your face even harder just for the money 💰
@@e2rqey No one is forcing you to buy a 3090 gpu card!!
@@basilthomas790 The problem is that we have people that are crazy enough to pay that amount, which drives the whole GPU market prices up. While a few years ago I could get a 970 that would perform the same as the last gen top of the line card for $330, now I'd have to pay $500 for that and get less VRAM to boot with the 3070.
This comment may seem weird to some. I have a good job and run a decent-sized gaming RUclips channel. I have plenty of money for any GPU I want to buy, and eagerly snatched up a 1080ti back at launch. It performed so well vs my old 970. I was very happy.
After a couple years I was bored with no change. So despite my better judgment screaming at me not to, I bought a 2080ti. Waste of time and money. At the same time I bought a 5700XT for a PC build on my channel. It wasn't a perfect card but it was cool to see how good my games ran for such a significant fraction of the cost.
I use my 2080ti for recording videos for my RUclips channel because I already paid for it and it is superior performance. When I game for fun by myself I game on my 5700XT build. I guess my head gets more enjoyment out of solid performance that was far more affordable than the best performance at a price that makes me still feel stupid.
All this said...I have been very suspicious of Nvidia since I screwed up and bought a 2080ti. Not keen to make the same mistake twice.
I won't buy a 3080 or 3090. I may buy an RDNA2 card if it can beat my 2080ti at 1440p gaming.
350 Watts is not desirable either. Even with central AC my room gets hot with a 2080ti. No way I am upping it to 350W 3080.
Oh hey it's the other person that still plays Attila Total War. Hi.
I'm in the same boat. I upgraded from a 970 to a 1080ti and it was fantastic. I took one look at Turing and went *skip*. Absolutely no point in buying any of the RTX 2000 series if you already owned a 1080ti. The 3080 looked compelling at first but the 350 watt TDP had me worried that I wouldn't even be able to turn my PC on during the summer for fear of being cooked alive. Then real performance numbers and all of the driver issues came in and now I can safely say that Ampere is another *skip*. Long live the 1080ti.
@@The-Terminator-T800 gamersnexus measured 321w playing total war at 4k. 314w at 1440p. Is 350w that far off?
Edit: sorry on my phone. That was for the 2080ti. The 3080 FE was 366w at 4k. 350 seems like a good estimate lol
@@The-Terminator-T800 GamersNexus would like to have a word with you on that. Might want to check their benchmarks regarding the 3080FE and the 3090, particularly at 4k gaming.
@@The-Terminator-T800
You may have to (re)inform yourself considering the Gamers Nexus findings. There are places where power hurts in the long run.
Statistics, remembering recent events and holding organisations responsible for bullshit just don't seem to be things our current society is very good at, Jim. Being tricked by propaganda, moving from target to target without actually holding anyone accountable and wanting unnecessary luxuries on the other hand, we've got those down to a tee.
Word.
Steps of hype:
1) hear the rumours.
2) see the cooked Nvidia figures.
3) watch the glowing reviews.
4) listen to Jim pull you back down to earth.
Thanks for this, mate.
Jim, this has been the best video about Ampere that I've ever seen. I myself IMMEDIATELY tried to get people to come back to Earth by posting (on TechSpot) how nVidia had fooled them, with my post showing them nVidia's marketing and price progression. They pulled a switcheroo by changing the top Ti card from $649 to $800 to $1200 (now to $1500) while quietly moving the top GTX/RTX x80 card into the $800 slot that was occupied by the top Ti card for a time. It seems that nobody saw it except you and me.
My wallet sure sees it. The sub 400 card market is sad. 2060 land here.
Somehow my gut feel made me stop at the 1070 I currently use, and I'm glad to see it was the right feel.
Pascal is the best gaming architecture the world has ever seen and I'm pretty sure the industry will never see something like it again. RDNA2 does look pretty good, but I don't think it'll be quite as good as Pascal was.
A like from a fellow 1070 owner👍🏻🏴
Better upgrade my 1060 to a used 1080ti.
If the 1070 still meets your expectations / needs (1080p high refresh rate or 1440p if you're gaming with it, I'm guessing), stick with it. If the 3060 comes in at around $400, it will probably be worth it from the price / performance ratio, but at the end of the day there's really no reason to buy a new card (yet) if it's doing the job you want it to.
Games are always becoming more demanding, though... we see hardly any games with the 1070 as the "minimum" requirement now, but I suspect that will change soon as the bar is being raised with the new consoles having more capable hardware.
Competition is important.
Finally, someone said it. Comparing it to the 2080Ti in terms of value is absurd, it was the first x80 Ti card that launched at 2x the MSRP of its predecessors. The 2080Ti, had it launched at the same price as the 1080Ti and 980Ti, would have been much better value, and the 3080, a much less impressive margin over it at that price.
If you take a 2080 Ti and unlock it, then run it at the same wattage as the 3080, it's only 5-7% behind in performance. The 3080 is a terrible, terrible card. It has been pushed to the very edge of the voltage curve just to get everything they could out of it. Nvidia never does this; makes you wonder what they are afraid of.
@@dralord1307 what do you mean afraid of? You said it yourself that the 3080 has basically no performance increase over the 2080 ti. Thats a problem regardless of competition. Nvidia still has to compete with itself.
@@Heatranoveryou I dont think Nvidia would have cut their profit margin so much, or pushed these cards to the ragged edge the way they have if they werent afraid of something. They moved up the launch date of the 30series even they knew they were understocked. They didnt give AIB's enough time to test or validate their cards. They pushed back the 3070 launch so it launches the day after RDNA2 announcement. What does all of that make you think they are worried about
@@dralord1307 The 3080 was designed around the high wattage, with cooling and power delivery. The 2080Ti was not. Sure, shunt mod and overclock the 2080 Ti to the hilt, put a waterblock on it. Heck, why not add a chiller too and match a 3080. Good luck.
@@dralord1307 Check what happens when you lower the wattage of 3080 to the 2080 Ti levels.
Now this.... This is an objective look at ampere cards.
A lot of reviewers are going gaga over ampere while somewhat forgetting that ampere seems "so good" because previous gen was so bad.
As always, great video Jim.
Look at the Gamers Nexus video memeing about the money that all reviewers get.
I'm not sure it's that objective.
@@loranmorash8532 The way the video is presented is not objective. He compares a 2080 Ti to a 3090; they are not the same class. He should compare the 2080 vs the 3080. Also it looks like he averaged the resolutions to get performance; these days your resolution makes a difference, and the 3080 is for 4K. And then he ignores things like DLSS & RT when looking at value.
@@loranmorash8532 Traditional metrics are useful for comparing all the generations. But Nvidia is going in heavy on boosting the performance of their cards using AI. Considering a chunk of the performance loss is due to efforts to boost performance in new ways, AI deserves at least a mention. Without it, this is an incomplete picture of what's going on.
@@nrosko wrong. You awfully sound like a butthurt Nvidia fanboy who doesn't really understand the video. The 3090 is the successor to the 2080 Ti, hence the higher Vram amount and even higher pricing, but comparable. The 3080 is the 2080 successor and nothing else, hence the 2 GB extra vram and same pricing.
Why are people so blind to nvidias marketing lies? Because they are fanboys.
Jensen: "Hire hitmen."
When a single hitman is not enough?
@@Worgen4ik Hire two, then plan a sequel.
@@tomhsia4354 But it's Jensen. A doubling doesn't equal two. :-P
The more hitmen you buy, the more they die.
5nm hitmen might just do it, though.
Then again, they might get ripped to threads.
Nvidia: The more we lie the more you buy!
❌ 90% performance/Watt increase compared to Turing
❌ RTX 3080 2x faster than RTX 2080
❌ Cards will be available on September 17th
That last one literally made me laugh out loud 😂
@@dracer35 what do you mean? It is true. Cards available from scalpers for *50 the RRP! Just look to the nearest dodgy auction website operated primarily by scammers and con artists!
@@DoomsdayR3sistance I really hope that AMD can give some competition and make Nvidia lower the price. hope they lose a lot of money for their greed
@@VovixMorg I doubt AMD will make much impact on Nvidia here, and with what Nvidia has planned for Hopper, AMD needs to surpass its A game to compete. I do think AMD will make some gains in the mid-tier consumer-grade GPU market, but it's a small slice of a small piece of the pie.
@@DoomsdayR3sistance you need to find the new XTX leaks; that card can be much better than you think.
Oh Jim, every time you release a video I make myself a cup of coffee and just chill after work listening to you.
But Jensen knows how to sell a fridge to an Eskimo lol
Intel: "Write that down!!!"
At Intel, we know how we can sell a fridge to an Eskimo. Cause global warming, and one day they will buy our fridge!
Nah m8, you're way off. The new cards have great performance uplift over last gen in that one game with those settings that choke the performance of the card we made Digital Foundry compare it to due to VRAM limitations.
Someone post this on r/nvidia and see how many seconds till it is removed.
That's the evilest thing I can imagine
r/nvidia ban any% speedrun
@@SonGoku-97 HAHAHAHAHAHHAHAHHAHHAHA
This very much sums up my thoughts on the whole ampere launch, I'm glad I wasn't the only one noticing this.
@Neil Leisenheimer Yeah, and while the 3070 looks like "great value" against 2080Ti, all the points mentioned in the video still apply, and the VRAM bottleneck will be even more severe. The only saving grace might be games moving to use DirectStorage and thus require less cached assets in VRAM.
The 3070/3080 really needed to be offered with 12/16gb. If you want to see what will happen, go look at the fury x.
@@rayjk1431 but with GDDR6X that would really blow out the power budget.
Did you guys not watch the Unreal 5 demo? A motherboard with PCI-E 4/5 and a fast SSD.
On that perf/price graph you can CLEARLY see the cards that people sat on for years.
Why upgrade 970 before next-gen games hit? Or why upgrade 1070?
3090: A Titan-class GPU with none of the professional software support and at 25% more cost. Buy it today. (Wait, you can't?)
Pascal and maxwell vs turing and ampere.
Imagine what we would have if amd had enough money to compete over the past 5 years
It's good that the two companies are trading blows with each other.
This gen and the next from AMD will be talked about for a long time.
And I don't want Nvidia or Intel to fall, just to get brought back down to earth.
AMD doesn't exist to get Nvidia to release better products.
Snozz McBerry do you know another gpu company?
In some segments AMD have competed well, only drivers held them back. If 2 cards are similar price and performance but one is stable and the other is not, well the choice is easy. AMD is trying to get the drivers solid this time around with RDNA2 and should be competitive again in all segments. AMD still powered millions of consoles which make the pc market look like a joke.
Well, TSMC controls the market and therefore the performance. 8nm is, what, basically a half node from 16 to 7, so you could probably expect a pretty good jump in performance with an architecture that was specifically designed for rasterized gaming. The extra die space taken up by all the ML stuff is gonna take a hit independent of the node, of course.
Feels good, when you learn a lot of interesting historical facts and the video is not even halfway through.
Thank you for teaching a little bit of GPU history to the people not able to experience it back then, Jim.
I still have a running GTX 580. Somehow Thermi still lives!
Excellent video Jim, please do this for Radeon when RDNA 2 comes out
Of course he will.... I mean the man has practically got liquid silicon running through his veins as blood.
He'll just conveniently "forget" to lambast RTG over its pitiful attempts at graphics processing.
But I suppose that's really it. It's just "pitiful"; not malicious, or greedy (not by comparison at least, and not until Navi 1).
@@kintustis mommy mommy
@@Veloce87 what?
@@kintustis You must be new here. Welcome though
nVidia: enacts second worst price/performance improvement in history
Everyone: OMG nVidia does care!
RTX 3080 is actually the best price/perf card bruh. This FUD needs to stop.
@@Sal3600 Have you not seen the video?
Yeah right, that's why Jensen tries to lure Pascal owners to upgrade.
Jensen: "My Pascalian friends, it is now safe to upgrade," when we all know that Ampere is worse...
very nice video, the bar graphs with the percentages came right as I was thinking to myself that hearing all these numbers became a little confusing. perfectly timed.
Brilliant! As you know I was doing my own research but really struggling to get my head around this generation. It never occurred to me to focus on the 3090 but after watching it makes perfect sense now. Thank you for the top quality analysis, this video feels like closure to me as it answers so many questions I've had and draws a conclusion I can understand in my own way. Thanks again for all the hard work. Was a surprise It turned out as bad as it did but that's why I subbed, you never run from the facts.
It was clear to me the 3090 was the dangler for the people who want to buy the best gaming GPU neverminding the cost. And in that respect has replaced the 2080ti. I have a 2080ti and I don't in any way regret it, but I won't be getting a 3090.
But Nvidia have not at any point suggested buying the 3090 for 4k gaming or below.
@@Safetytrousers As long as you're happy with your purchase, that's the main thing. I hate to see people waste their hard-earned money on things they end up disappointed with, so it's nice to hear from someone who knew what they were buying and is happy with that. I don't agree with the 2080ti, but who am I to judge, especially when you clearly know what you're doing for your own circumstances.
People forget their history. Thank you for bringing FACTS back into the discussion, Jim.
Forget? Most simply don't know it. Wannabe nerds with pitiful knowledge who pretend to be smart, way too many of those.
@@eazen True, a lot of people are new to the space and are just ignorant of history and easily buy into marketing. I remember watching the announcement videos and all the "RIP AMD" comments after every marketing slide that has since been proven to be just that...marketing. Not truth. Lol.
Lots of people say in forums that they hope AMD delivers a good new product, and then still buy nVidia. Somehow it doesn't matter what kind of products AMD has to compete with. The only way to make prices go down is to not buy the over-the-top expensive cards. Vote with your wallets! But people jump all over new nVidia GPUs and new iPhones in a way that makes it seem like they cannot continue to live if they don't have the newest, fastest and "best" thing, no matter the cost.
Great video by the way!
Those people just want to buy Nvidia at a reasonable cost. The only way they can do that is if AMD competes. I personally don't care about Nvidia anymore. AMD can compete just fine in the price bracket I'm willing to spend in and they have awesome open source drivers. As long as they keep doing that they'll have a customer for life.
"Vote with your wallets" is the most stupid nonsensical argument ever. Don't you realize that if the competition follows suit with the bad choices the more successful company made then you're left with no choice?
Do you know why in 2020 you are stuck with hideous non-standard sized slabs of glass with either literal holes or notches devoid of the essential universal audio port and with glued-in batteries and 1000$+ price tags?
Got a 5600xt for my brother's rig. Did not see a reason to go for 2060 since RTX performance would be ass at that level.
That is true so far, but we'll see after the newest AMD GPUs are announced. Don't forget that the same situation you're describing happened with Intel CPUs vs AMD CPUs, but it's very different now. AMD has continued to take marketshare from Intel since the introduction of Ryzen. RDNA and CDNA will ultimately have the same effect on Nvidia and how consumers see both including AMD being a true alternative for the best GPU technology at the best price.
I use AMD now. I can afford to spend $1000 on a new GPU. AMD has driver issues. I won't pay more than $300 for a GPU that doesn't even work properly. That is why I am done with AMD. Not because I like 2xJensen. I will probably buy a used 20 series card even tho I can afford a 3080, I don't want to reward scummy behavior.
To be fair, one of the reasons turing and ampere are comparatively bad with previous generations is the presence of "rt cores and tensor cores" . If that "die space" was used for "normal rasterization hardware" the difference with previous generations wouldn't have been that bad, on the other hand one could say that turing and ampere are the biggest generational leap in ray-tracing performance. In order to innovate someone has to be first, and nvidia decided sacrifice rasterization performance for ray tracing acceleration. I'm not saying that nvidia doesn't deserve criticism for their marketing and high price points :)
I'm not so certain that they actually sacrificed anything when it comes to rasterisation in relation to CUDA core count. There's a little something called diminishing returns where things don't really scale all that well at the top end. Looking at 1080p performance I'm suspicious enough to say that we might've already hit a cap there & that smarter approaches are necessary instead of just increasing quantities of certain parts.
I'm also fairly sure that a company like Nvidia & AMD are unlikely to both be shooting themselves in the foot...
If only the mainstream picked up on these facts.... Thanks for posting this Jim ✊
The facts that people with actual brains already know but most just don't want to know.
Now tell me what the best price/perf card is.
I'm still trying to figure out how Ampere managed to suck 100 more watts and only give about 8-15 more 4K fps over the 2080 Ti.
@@Sal3600 Probably a Radeon RX 580
@What yes. The 1080ti is just so good that it's almost undermining Nvidia's new cards in terms of value.
I have to say it is really refreshing to see such a well researched fact based video when most other youtube videos about this topic are more or less clickbait or tailored to people who don't have a clue... Keep up the good work Jim!
I still can't believe how little I managed to pick up a 2080Ti for after the Ampere announcement. Receipts in the box and the seller spent over £1200 in Q1, just to let it go for just over a third of the purchase price 7 months on. That was the lightbulb moment for me. Targeting people who hold the things they buy with such little regard is a brutal stroke of genius from a business standpoint. Two ye olde sayings spring to mind. "The person who writes for fools is always sure of a large audience" and "A fool and his money are easily parted". The Nvidia business strategy - brilliantly - bakes these two sayings into a nice palatable slice of cake which the tech industry just gobbles up. The big question should really be when will AMD follow suit, rather than if Nvidia will change... there is a reason the last few Nvidia press conferences have been in a kitchen after all ....
I wonder what Adored's chart would look like if you actually aggregated the used market. My favorite part of the Ampere announcement was seeing the Turing prices drop like bricks.
The sceptic in me says AMD would be crazy to not do the same.
I did the same thing but with Pascal. Waited until the 20 series released and bought a 1080ti with an EK waterblock included for $400. I'll probably do the same in the future but for now, the 1080ti is more than good enough for all the games I play. I won't be paying that Nvidia tax for every shiny new thing.
AMD already does that with Ryzen CPUs
Great Video! I love the in-depth history lessons on GPU's.
Keep em coming...
Another great video, clear, precise and unbiased. Thank you Jim.
That was an absolutely fantastic breakdown, thank you so much for all your in-depth analytics! It's very nice to be able to show something like this to my skeptical friends, or at least those w/ an attention span...
I am just letting you know that it has been months since any of your videos came up on my RUclips home screen... I do click on them myself, having to find you in my long list of subs!
Interesting, this video was position one for me on YT home!
ruclips.net/user/feedsubscriptions
This is why I give you my money every month! You make the videos I need to make the proper arguments and recommendations. Keep doing an excellent job!
Jim consistently dropping bombs over here. Your videos are always worth the wait, thanks again.
Every year there is another video that makes me feel great about getting and still running the 1080Ti
Still rocking my 1070 here. Definitely got my money's worth at 1440p60. Will continue to for at least another generation even if I have to reduce settings.
This is precisely why we must keep calm and await the queen's speech on the 28th.
You are such a simp ....
The 3070 was delayed until the 29th anyway
She gives more of a cool aunt vibe to me
cringe
@@hihna2011 Yes, either that or it was wordplay and I don't actually think she's a queen...
This is why I am waiting for the Big Navi reviews first before upgrading this time. The real cost-performance benefit of an Ampere card only shows up at post-1440p gaming, due to the different INT32 + FP32 datapath design of its architecture, which can only be effectively utilized at those resolutions. That makes it technically a worthless upgrade if you just play at a 1080p-like resolution where the gain is pretty modest, or even at 1440p, unless you really want ray tracing and you only play games that support it. Games that support DLSS will especially be better at 4K and higher, but those are still few and far between even after all these years. All of that is countered by its enormously increased power draw, so you have to factor in the additional cost of a new power supply anyway, since it has been shown that it can get into the 380-watt range.
Ahh as always; even though you have to wait a bit for your postings, you know it’s due to a good reason and again you prove that the wait is worth it. Incredibly concise and clear video with, as always, great back view mirror examples. Big 👍 😀
Great detective work and a very useful comparative compilation of data. Another great video. Thanks.
makes me sad that Jim gets so much flack when he is one of the few people shining the light of truth on hardware manufacturers. Love your work Jim
You are confusing "flack" with "slack."
If they were giving him "slack" it would be a good thing, although using the word in that manner would still be dubious.
I think you mean "flack"
The world could use a lot more slack right now
He gets it due to his Navi predictions,
which I don't understand; it's not like he had a crystal ball, no one really knew how Navi was gonna perform.
indeed
and he puts his neck out there...
Great content. I would suggest inflation-adjusting dollar amounts being compared across time, though. Comparing prices in the same year is fine, but across a decade there is a HUGE difference. Companies must adjust prices for inflation as the majority of their paying customers also experience wage inflation at or greater than CPI. Only COSTCO can sell the $1.50 hotdog forever.
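For what it's worth, here is a minimal sketch of what that adjustment looks like in Python. The CPI-U index values and the $649 launch price are rough illustrative assumptions, not official figures:

# Sketch: inflation-adjusting a historical GPU MSRP.
# The CPI values below are approximate annual averages, for illustration only;
# substitute official BLS CPI-U figures for a real comparison.
cpi = {2013: 233.0, 2020: 258.8}

def adjust(price, from_year, to_year):
    """Convert a price from one year's dollars to another's via the CPI ratio."""
    return price * cpi[to_year] / cpi[from_year]

# e.g. a hypothetical $649 flagship from 2013, expressed in 2020 dollars:
print(round(adjust(649, 2013, 2020)))  # roughly $721

Under those assumed index values, a decade-old launch price shifts by roughly 10%, which is exactly the kind of correction that should be applied before comparing MSRPs across generations.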
AMAZING Video, I'm still currently using my 1080 TI, all of Nvidias launch thus far have been a disappointment, thank you for such detailed information! 👍
Prove value.
That's the one thing I'm looking for in my next upgrade. I appreciate your in-depth, impartial, and critical analysis as the PC world goes mad over the (literally) hot new thing. Thank you!
Don't forget that Jensen said ampere has "1.9x Performance Per Watt"
Pythagoras was rolling in his grave during that presentation 😂😂
Great work jim, as always. Time to post this on some nvidia fangirl groups.
But... but... but it does have 90% better perf/watt... at a 120W load, but don't look at that part.
That'll be an insane boon for mobile GPUs, so it's not all that bad. A 90% perf-per-watt improvement in a 120W laptop GPU will lead to some good laptop designs with Ryzen 5000.
@@stevenbradford6138 so you want 3080 buyers to undervolt and underclock to 120watt range? XD aint nobody gonna do that. It might do good in mobile platform. Then again amd gpus will have 30%ish higher transistor density. means smaller dies less power. its not gonna be easy for them this time.
@@Pixel_FX I would actually prefer if people didn't buy these cards as they are junk.
I was making a joke based on the fact that Nvidias launch presentation showed a graph that said the new cards are 90% more efficient when consuming 120W.
Nvidia was using statistics to make a massively misleading statement.
@@stevenbradford6138 Yeah, they are junk. That presentation was so misleading. The sad thing is people are still worshiping Jensen. I can't wait to see them become an even bigger joke next month. At least it's slightly better than Turing when you compare CUDA core count vs performance scaling; Turing had horrible scaling. The best scaling I've seen yet is on Pascal and RDNA1, an almost linear performance gain with CUDA/SM count. So I hope this 80 CU card will really scale well. We'll see soon enough.
Btw, the only good thing I saw about Ampere cards is their compute/rendering performance, almost twice as fast as the 2080 Ti.
I wish you had added the 3080 as well,
the same way you added both the 780 and 780 Ti, so we could see the value of the 3080 compared to the last 2 generations.
He added the 780 because that was the top card in the line for quite a while. Top cards are never the best value in their line. So comparing a top of the line card with a non-top-of-the-line card is apples/oranges. If the 3090 wasn't released you could be sure that the 3080 would be even more ridiculously priced.
I usually only comment when I have something to add. But this video was so great, I just wanted to say that :P
Hey Jim, I don't do Twitter so I'll respond here. I was ok with the $50 price hikes, but the lack of a 5700X does hike price of entry into 8c/16t more like $120 vs the 3700X.
The problem here is: 4K gaming, 4K modding and VR need VRAM. 10 GB is an issue for modders. That is a fact too. The 2080 Ti lacks HDMI 2.1 and cannot drive a 4K OLED HDR TV well. It's also not fast enough. The 3090 is not fast enough either, but it's still an improvement and has enough VRAM.
I will wait for RDNA2 for sure, but if not... 3080 20GB it will be.
You don't need much VRAM to use VR. I don't know why people keep claiming this. Half Life Alyx is the most demanding VR game and on the Valve Index it only uses 7GB. That means you'll only pass 10GB if you use an HP Reverb with its 2160x2160 displays per eye. But good luck finding any GPU that can run that.
@@GeorgePerakis Yoda him i hear speak when you❤️️ said that
You will want a new GPU by the time you need more than 10GB. Sure you can run silly hi res texture packs to prove a point. I have tried them myself, but they are not worth the download time and space taken up.
@@GeorgePerakis it's a VR game, they aren't made for the consoles. The only one they can be on is the PS5. Most VR games have been exclusively on PC, like Half Life Alyx. That makes it basically next gen, just lacking ray tracing, which is too slow for the 90fps needed for VR. What we got is what we will have for the next 2-4 years. Most VR games barely use 4GB of VRAM. The consoles have nothing to do with it, and Nvidia releasing the main 3080 with only 10GB of VRAM and the 3070 with only 8GB means VR games have no headroom to push up. It's based on PC hardware, not the consoles. Why would the consoles affect anything with VR?
@@alistermunro7090
That is up for me to decide, not someone else. I like my 4K mods and texture packs. I want future VR titles on future VR headsets to max textures.
This "10GB is fine" thing is getting ridiculous. 8GB isn't fine even at 1440p if you manually set settings to Ultra in games like Wolfenstein 2. I KNOW that since I have a 2080 Super and had a 1440p monitor, and it DID stutter.
Thank you Jim. There are not many people like us who care for an objective view of products.
Most think as consumers do, while we struggle to convey what a product really is, let alone what the company is like.
I've followed you for years, and I hope for many more. The world has been pretty crazy lately, take care Jim.
Best wishes from Malaysia.
Two of the best videos you've put out lately, back-to-back? Jim, you're on fire.
It's great to get perspective like this, on a silver platter. I'd never do so much research about GPU metrics relative to previous generations. #PerspectiveMatters
Though I'm wondering how Jensen comes to conclusions like "biggest performance leap ever..." when it can be fact-checked. Sounds like he has something in common with the orange-head from the US.
Jensen thinks he has a lot of stupid fanboys, and he's kinda right with that. So he abuses his power. But nvidia was always like this. First they destroyed 3dfx with marketing lies, then they lied about Radeon and broke them nearly as well. Then they lied to their trusty customers or fanboys, to keep selling more GPUs.
Keep it up, Jim. I always learn new things from your content
The "largest performance bump in history" claim becomes true if you account for compounding.
Let's say the GTX 580 has 100 performance units.
Then the 780 bumped up to 200 performance units. +100
Then the 980ti bumped up to 286 performance units. +86
Then the 1080ti bumped up to 529 performance units. +243
Then the 2080ti bumped up to 735 performance units. +206
Then the 3090 bumped up to 1074 performance units. +339
So the 30 series delivered the largest increase in performance units, if you calculate it in this way. Which is probably how Nvidia arrived at their claim.
It is also an absurd way to measure things, and not consistent with what the user sees in performance.
Let's say they all bumped the same +100 in perf, except the 3090 gives +101. The +100 would give the 780 double (i.e. 200%) the FPS compared to the previous gen. The +101 of the 3090 would give 601/500 = 120.2% of the FPS compared to previous gen, but no user is going to accept this 20% FPS bump as the biggest ever increase in performance!
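To make that concrete, here's a tiny Python sketch that recomputes both views from the same illustrative performance-unit numbers used above; the card list and values come from this comment's example, not from benchmarks:

# Illustrative performance units from the comment above (not measured benchmarks).
cards = [("GTX 580", 100), ("GTX 780", 200), ("GTX 980 Ti", 286),
         ("GTX 1080 Ti", 529), ("RTX 2080 Ti", 735), ("RTX 3090", 1074)]

for (prev_name, prev), (curr_name, curr) in zip(cards, cards[1:]):
    absolute = curr - prev              # the "compounded" jump the marketing claim counts
    relative = 100 * (curr / prev - 1)  # the gen-on-gen FPS uplift a user actually sees
    print(f"{prev_name} -> {curr_name}: +{absolute} units, +{relative:.0f}%")

Run that and the 3090 step is the biggest in absolute units (+339) but only about +46% in relative terms, behind both the 780 and 1080 Ti jumps, which is exactly the gap between the marketing framing and what users feel.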
Finally a video that lays out what I've known for years...EXCELLENT presentation!!!
Jim's contempt for Nvidia and Intel is legendary.
Try calculating the ray tracing performance per frame or watt versus the previous gen. You'll probably come to the reverse conclusion. The RT/Tensor cores and GDDR6X use plenty of power.
Now that most of the hype and drama has settled down, the meta analysis can come out. Thanks for giving some perspective.
I was looking forward to upgrading from a gtx1080 some time next year, when the 3080 actually become available, but now I'm not so sure anymore.
19:38 bitcoin bitcoin bitcoin.
-47% Then -12%
The Bitcoin backlash for PC gamers will be somewhat resolved by AMD at the end of this year.
It was a long wait, but I am ready to see this day come and to see Nvidia brought to their knees.
I loved your video on the architecture of ampere using the whitepaper.. Great job explaining it in a way that everyone can understand.
‘Ampere is the biggest generational leap in performance ever’
*Jensen 1st of September*
I mean, it is. You can't just ignore DLSS 2.0. Nvidia releases a new technology that gives you massive increases in performance, and you turn it off and say look, it's only 48% better. How is that a fair comparison at all?
@@Recon801 Oh but we can ignore it. Since even after 2 damn years there are only 14 games supporting it. And even from that, most of them use the terrible DLSS 1.0. It may as well be irrelevant.
@@Recon801 Look who sucked up all that marketing speak as gospel.
@@rdmz135 Firstly, there are way more than 14 games that support DLSS. Secondly, you only need DLSS for the few AAA games that are graphically intense enough to need the help of DLSS, and all the AAA games releasing do support it.
If I can enable DLSS 2 and get 30 or 50% more fps at 2k res and visually the game looks the same that is a BIG DEAL.
My point is Adored shouldn't make a video essentially saying Nvidia is lying and then exclude a core piece of Nvidia's claim in the first place. Imagine Nvidia saying you're getting the greatest increase to memory bandwidth because of new compression and faster memory, and then a youtuber disables the new compression and goes "look, Nvidia is lying again".
@@Recon801 There are only 14 I could find. Even if it was 20 its far too little to be relevant. If it was a globally available feature I would agree with you, but it is not, and it doesn't look like it ever will be.
Rather than comparing flagships of each generation(whose supply will vary significantly due to yields), I think there'd be some useful insight gained from comparing performance at a given mid-range price point (adjusted for inflation obv.).
It'd also be more valuable from a consumer perspective too, as flagship sales make up a tiny fraction of total units sold within each generation.
I'd suggest $250-300 (2016 dollars) as a reasonable midrange price (e.g. 1060)
We don't have mid range Ampere cards yet.
Is anyone surprised? Every time Nvidia feels like they should brag about increases of ridiculous proportions, it's usually the other way around. Ampere is awful. And people fail to learn from the past. Fairly fuckin consistently too
2080ti smashing performance for half the cost. I know someone who has one and is delighted with his 3080. I wish I had one as well as it obliterates my 1080ti which also cost me more. Just impossible to find and I refuse to pre-order.
@@jondonnelly3 Yeah, because Turing was literally one of the worst generations for Nvidia ever. Against that backdrop, it's not hard to make it seem like you've made a huge leap when in reality it's just a sham to squeeze the piggy bank even harder.
@@Senzorei So true.
Nvidia can release what they like and people will eat it up. Remember, AMD is known for power draw and heat. Intel fans still believe it about Ryzen. Now with the Radeon group, IF AMD runs at the same perf but at around 50 to 100 watts less, Nvidia fans will still believe 'Hot and Loud'.
You talked about consumer Ampere, but I've yet to see anyone tackle one of the most curious aspects of consumer Ampere vs HPC Ampere that no-one seems to be talking about... Why does HPC 7nm Ampere not have the doubled CUDA core count like 8nm Ampere?
Hey Jim, how do you quantify the benefit or cost of having RTX and its impact on the ROI of the last two gens?
Seems something that Nvidia have tried to use to justify the cost.
People say Nvidia is way faster in machine-learning tasks, i.e. deepfakes, and Nvidia also marketed RTX ray tracing for games and rendering (work), while their Tensor cores are used for parallel supercomputing. So if a user uses any of those unique features of an Nvidia card for their work, then surely their ROI is faster than with a GPU that doesn't excel at those tasks.
The 9800 GTX was the last "high-end" card I bought. I remember comparing that PC to the current consoles and complaining about how overpriced the consoles were.
What a time to be alive. If you want hot power hungry machines you go with intel/nvidia but if you want cooler more power efficient you go amd. Why isn't this the narrative?
A small note about the graphs: I think the biggest title should say what the graph is about: performance, power, etc. Then the second title should provide more context: "difference between generations". The left title "percentage difference" is redundant because each bar is annotated with "%".
Gamers are cringing at buying based on value rather than e-clout.
Everyone else is cringing at buying stuff for the "e-clout", whatever that translates to in real life.
@@DaybreakPT Our whole society is influenced by electronics. Why then hold this particular case to an arbitrarily "higher" standard? What moral imperative exists behooving us to paint this example as negative, especially given the somewhat esoteric nature and questioned legitimacy of AdoredTV's arguments? Bananas.
Jim! Thanks for releasing this after I was asleep. I really needed the rest.
I totally get what you are saying here Jim, but in price/performance Ampere looks WAY better using the 3080 instead of the insanely overpriced RTX 3090. At best the 3090 gets an extra 10% FPS over the 3080, but it costs over twice as much.
(Of course this would have made the Gen on Gen improvement look absolutely tiny compared to what we were used to as recently as Pascal)
As for the perf per watt and such, the 3080 is as bad as the 3090 mostly. Samsung 8N is essentially Samsung 10nm, which is only a half node shrink - ALSO even TSMC 16nm FinFET is frequency advantaged over Samsung 8nm!
Well, yes and no. Because the 3080 isn't available for MSRP. More like for $800+, and yes, while it's still "more attractive" at that price, it's not as good as you think.
ruclips.net/video/s23GvbQfyLA/видео.html
@@Jimster481 The 3080 is definitely not a massive upgrade over the 2080 Ti, and the lack of MSRP availability is from scalping.
My whole point was simply, 3080 is a FAR better perf/$ buy over the 3090.
Being that 3090 is NOT a real Titan, it is even more stupidly priced - though cheaper than a "True" Titan.
@@Dysphoricsmile I agree, but 40-50% over last Gen is a pretty big gain these days. However it's really this way because Turing sucked so much.
@@Jimster481 Agree 100% - comparing 3080 to 2080 is a HUGE upgrade, but not the 80%+ Nvidia showed off (at least in games)
The best conversations I have about tech anymore are all here.
Sure, dumb people watch AdoredTV as well, as his most recent video highlights well - and Jim probably lets them get under his skin a bit too much. I get into heated arguments with morons as well - but I usually just let it be after they start repeating stupid shit.
Nicely laid out Jim.
We see a lot of this in tech: a company taking a lead and then getting fat and happy, raising prices, losing sight of the target of providing the customer with the best product possible for a decent price. Microsoft, Intel, Nvidia, the list is endless.
Once again, Jim, you've shown us the truth, with passion and heart: just how the 3xxx series isn't the GPU gamers wanted. While almost everybody else in the tech community/youtube space is so in love with the 3080 (Linus included) that they ignore the real truth of what Ampere really is.
The only ones that seem to be honest are you, Paul (RedGamingTech), Steve from Gamers Nexus, the Hardware Unboxed guys, and maybe Jay2Cents (of the ones that I watch). Almost everybody else is so blinded, it's like they're in love for the first time with the 3080/3090.
You go the extra mile with all the homework, thank you. Keep up the good work.
This is the video I wanted to make but couldn't be bothered with spending the time. :-) Thanks!
Ohohoho! I've been waiting for this one...
Hooray
Ohohoho.. Green giant
Got a reality slap.
Well put together. That puts these cards into perspective.
May I enquire why the "Performance Change Between Generations" chart at about 16:30 compares the 980ti vs both the 780ti and the 780non-ti? It makes the performance jump between those generations look bigger than it actually was. Not saying that your overall analysis is inaccurate, it just struck me as an odd inconsistency.
I'm pretty sure it was meant to be 780 vs 980 and 780ti vs 980ti, but one of the lines was typo'd. From what I've seen elsewhere those numbers are consistent for the Maxwell performance improvement. Looking at modern performance, even the budget GTX 750 (actually Maxwell, not Kepler like its number implies) sometimes outperforms the GTX 780.
Real gold. I love these graphics card review videos!! It's really refreshing to see where we came from, especially for those who got into top-end gaming a generation or two into the modern era.
Jim, the OverVolted podcast on iTunes needs attention. There's one episode (July 17) tagged as "Season 1" and all the rest "Untitled Season." It needs to be fixed for reasons. TIA
AMAZING work, AdoredTV! These are the best charts and visualisations of the price/performance issue that I've seen so far. Please, could you do a similar chart for AMD, once the RX 6900 XT is released? I think it would prove quite illuminating.
Let’s hope AMD deliver on the 28th !
Did you see their new 3070 launch date :D "seems someone is scared of what RDNA2 will be"
Jim is over AMD. Good or bad, he doesn't care.
@@dralord1307 It looks more like Nvidia wants to over shine AMD and be talked about more on AMD's announcement day. Let's be honest here, Nvidia has the market share so they will outshine them in the news even if Navi is superior.
Just having some decent competition at the high end will benefit all. AMD or Nvidia, I'm not fussed; power draw is in 2nd place after price vs FPS, so here's hoping AMD delivers this time.
@@Skylancer727 Exactly. Nvidia seems to be more "afraid" of what AMD will launch this time around. So cutting AMD as hard and fast as they can is Nvidia's goal. I mean its super smart marketing. Nvidia is the best when it comes to marketing shenanigans.
Big fan here. I watch all your videos. You know what would be helpful? A list of links to the articles you reference. I have some fanboyism amongst buddies who seem to think that the 3080 looks amazing when compared to the 2080ti, and now after watching your video I have to do a massive portion of what you did to help provide evidence, in addition to sending them the link to your video. It would be super convenient to find some of this stuff already documented.
This is a cool idea, I'd like to see the same thing for AMD.
Amazing video as always! Thoroughly enjoyed it. ;)
As bad of an "upgrade" as Ampère may be, I'm still very interested in what the 3070 can do. But that's only because I'm still on an RX 480. If I had upgraded to a 2060 Super like I was thinking about last year, I wouldn't be interested at all. Also I keep telling myself I'm going to be doing more 3D creative work and Ampère is supposed to show much better improvements there than it does in gaming. Either way, I won't be buying until holiday sales and I'll be comparing to RDNA2 at that point.
I'm super interested in GA106 aka the 3060/3050 or whatever they end up calling it. That's where the real volume ships. And yes, Ampere is a beast at compute.
The Tensor cores already showed themselves being very capable in blender performance back on Turing where even a 2060 (KO) was very good. So there's that.
eye opening! great work! thank you for your honesty!
Who remembers what happened after the Radeon 4770 when AMD had the node advantage? Fermi!
Ampere is definitely Fermi 2.0
Big Navi will be on a 22 month mature node advantage!
I gave in and ordered an ASUS 3080 TUF OC. I want the RT performance that Turing was supposed to bring.
It's a huge upgrade for me coming from an EVGA 1080 Ti FTW3. I think I'll also jump on the Ryzen 5900X when it lands.
This breakdown of the historical facts shows again how Nvidia's timeline has played out and where we are heading.
I hope more people get recommended to watch this right now, cause there is a huge drama going on right now in the gamer community.
And this video just solves that!
@AdoredTV well done!
Please do another video / update to see how 40xx series fits into jump in performance on previous gen. Always fun and interesting to see the Nvidia con illustrated like this.
Forgot your intro card, Jim.
Anyway, I am starting to think that Ampere was supposed to be more of a compute-heavy GPU being forced to perform gaming workloads, my two cents.
You're dead on, I think. They released the Ampere compute cards a couple of months ahead of the gaming cards and opened sales well before that too.
Combine this analysis with Moore's Law is Dead's most recent analysis of the impending RDNA2 launch and it seems that AMD will have a huge success on its hands, as long as it has sufficient supply at launch.
Would be interesting to see the 3080 on some of these charts listed specifically the performance per dollar chart.
3090 is just bad value & unnecessary unless you use it for work or NEED the best of the best no matter what.
I think the 3080 would be a monstrously good performance per dollar gpu on this chart.
This is just my assumption but given it's less than half the price of the 3090 while only being about 10% slower (at 4k) I think it's a safe assumption to make.
Edit: Adored more or less addresses this at the 20:47 mark.
Would still be interesting to see the 3080 in the charts though.
the performance per dollar of a 2080ti can make anything look good. that's how stupidly overpriced that card was
His chart is relative to the previous gen; the 3080 would only look good because the 2080 Ti is so bad. You would have to make the concession on the 2080 Ti as well and use the RTX 2080, which would now make Turing look better.
You could say that about all the other generations: why not compare the 80, the 70 and so on to find the sweet spot for performance per dollar, per watt, etc.? The point of these comparisons is to show the perf/dollar/watt of the actual flagship of each generation, the card that uses the full die of that generation, to compare the architectural differences moving forward: how much it'd cost to buy said full die, how much performance you'd get and how much wattage it'd draw, etc.
@@Soutar3DG I see where you're coming from, but Nvidia's "actual flagships" (ignoring Titans) were priced to compete until the 2080ti came along. The 780ti, 980ti and 1080ti were all solid cards that got recommendations from reviewers. Both the 2080ti and 3090 were/are stupid halo products. No respected reviewer takes those cards seriously outside competitive benchmarking. Anyone with even the faintest interest in value for money should completely ignore them.
That's why, IMHO, it's not particularly informative including a 2080ti or 3090 on any sort of chart that explores "value". From a value perspective, they are utterly terrible products and reviews near-unanimously denounced them as such at release. Personally, I'd rather see a comparison of the highest-tier card which consumers should actually consider buying.
@@antiwokehuman Yeah the 2080 ti was a joke price/performance wise.
I upgraded my gtx970 with a rtx 2070 a few months ago. I bought both cards for ~350$ and my target in both cases is 1080p/60fps. I'm happy with my purchase, it's exactly the increase in performance i was looking for, with the same price and very similar power consumption.
Imho the real story here is that "flagship" graphics cards these days only matter in super high resolutions/fps and the benefit you get from 4k and/or above 60fps is very minimal (only my opinion ofc). But the increase in cost to run these higher resolutions is very very high.
As the complete 30 series so far only makes sense at 4k and costs at least 500$ it's just not interesting for most gamers. Maybe a possible 3060 could be. But you might as well look for a good deal on last years cards if you want to upgrade now.
Could you analyze the same for AMD when the RDNA2 arrives, please?
I'm very interested to see the 3090 and 2080 Ti both capped at 250W. I've seen a test done where the 3080 and 2080 Ti were both at 330W and the perf uplift over the 2080 Ti was only about 15%.
I suppose you can technically compare the 2080ti and 3080 as they are both cutdown versions of the 102 spec dies.
@@crithylum9846 10496 vs 10752 possible, in other words, ~2.439% more cores possible or 97.619% of the max amount. You really won't miss out on anything with just 2 CUs missing