The 4070 just got more disappointing...

  • Published: Oct 27, 2024

Comments • 2.4K

  • @chubbysumo2230
    @chubbysumo2230 1 year ago +2108

    It's a 60 class card with 80 class pricing. This was originally supposed to be 4070 TI. Think about that. Nvidia wanted to sell this card as two steps above where it actually performed. And their original MSRP was probably going to be $750.

    • @luxaly9510
      @luxaly9510 1 year ago +91

      I think Nvidia figured they could reduce raw power and use DLSS 3 as an excuse for higher prices, like "yeah it's weaker, but it has DLSS 3!" Then the 50 series gets DLSS 4, so they're weaker than last-gen GPUs in raw power but "twice as good" with DLSS

    • @stevemoon2136
      @stevemoon2136 1 year ago +44

      I literally spat my coffee out when Jay said it was always meant to be a higher tier card in the intro. WTF has he been smoking...

    • @frischerfisch2528
      @frischerfisch2528 1 year ago +140

      This 4070 was supposed to be the 4070 Ti. Did you forget the 4080 12GB? If Nvidia hadn't renamed that one to 4070 Ti, this 4070 was next in line and would have been named the 4070 Ti by Nvidia, at $750. But the backlash on the 4080 12GB was too big and Nvidia got cold feet.

    • @chubbysumo2230
      @chubbysumo2230 1 year ago +101

      @@stevemoon2136 Nvidia wanted to sell it as a 4070ti. If they hadn't renamed the 4080 12gb, this would have been your 4070 TI.

    • @Mr.Genesis
      @Mr.Genesis 1 year ago +96

      @@luxaly9510 Using DLSS 3 when you're paying $600+ is absurd. Acting like frame generation is worth paying for when you should have native power at that price range is ridiculous

  • @Deeps__
    @Deeps__ 1 year ago +427

    It’s almost as if the Ti was the 70 and the 70 was the 60. They didn’t get away with it when it came to the 80 but have just gone and done it with the 60/70. Nice bait and switch!

    • @spacechannelfiver
      @spacechannelfiver 1 year ago +48

      Everything from the 80 down is a tier too high in the naming.

    • @dylanguerrero8053
      @dylanguerrero8053 1 year ago +34

      Nvidia were definitely going to charge us $600 for a 4060. 🤡🤡🤡

    • @adventuretimeae
      @adventuretimeae 1 year ago +8

      Ding Ding Ding! We have a winner

    • @generalawareness101
      @generalawareness101 1 year ago +14

      You guys read what Jensen recently said about pricing? Knowing this, I am jumping ship as EVGA did. I may not be as fast in AI or DLSS, but I am no longer going to ride Jensen's boomstick. YMMV, but expect this to get even worse with the 5k series. As long as that man is alive and CEO, this is only going to get worse.

    • @RogueWraith909
      @RogueWraith909 1 year ago +1

      @@generalawareness101 That's why I'm looking at 7900 XTXs and trying to figure out which to buy (must be white or light silver to fit the case and motherboard I'm getting/have got). They look decent and a slight upgrade from my laptop 3080. I won't buy Nvidia after the scummy things they've done so far.

  • @NathanOakley1980
    @NathanOakley1980 1 year ago +1047

    There has never been a less inspiring range of GPUs.

    • @BlackParade01
      @BlackParade01 1 year ago +5

      @travisclark2642 yea I'm also buying a 4090 next week!

    • @AlexRubio
      @AlexRubio 1 year ago +1

      Interesting never thought you would be interested in GPUs. Haven't watched your debates in a while.

    • @chriswilson9331
      @chriswilson9331 1 year ago +5

      GeForce FX has entered the chat.

    • @Volker_A4
      @Volker_A4 1 year ago +19

      The 4090 is at least fun to window shop. But the 70 and 80 are hot garbage.

    • @DeadPhoenix86DP
      @DeadPhoenix86DP 1 year ago +27

      @@BlackParade01 I can't even afford a 4090. 2000 euro here.

  • @Archer_Legend
    @Archer_Legend 1 year ago +563

    It is not a surprise that the 4070 is power efficient; I have never seen a 60 class card which is inefficient.

    • @dustinharvey7394
      @dustinharvey7394 1 year ago +18

      I dunno, for its performance the 3060 definitely isn't power efficient; a 6700 XT uses the same amount of power and outperforms it by a ton. Besides, the 2060/2060 Super used far less power than the 3060 and is barely outperformed by it

    • @alvarg
      @alvarg 1 year ago +15

      @@dustinharvey7394 The 3000 series was pretty power hungry; the 4000 series is definitely a leap in power efficiency and overall a decent leap in performance. It's just a shame that Nvidia decided to name them and price them the way they did.

    • @Takisgr07
      @Takisgr07 1 year ago +5

      @@dustinharvey7394 source trust me bro.

    • @VitisCZ
      @VitisCZ 1 year ago +24

      @@Takisgr07 it's all information you can find online from multiple sources so what are you on about?

    • @orangepacker7479
      @orangepacker7479 1 year ago +1

      @@dustinharvey7394 that’s a node difference. Compared to the higher tier cards the 3060 was more power efficient, because it wasn’t pushed as hard.
      If you look at it realistically the bigger cards are more power efficient if you just limit them because they have more cores…

  • @akina309
    @akina309 1 year ago +475

    Went shopping for a GPU upgrade yesterday, thinking about the new 4070 for about 600-650 dollars. Left with an RX 6950 XT reference model, which is at $649 right now. Honestly, could not be happier. That was the best "change my mind on the fly at the store" I've ever had

    • @SEGArianer
      @SEGArianer 1 year ago +42

      Ok but 335W vs 200W TDP/TGP, hope AMD will drop TDP for their next GPUs.

    • @FellowshipOfTheAviatorZ
      @FellowshipOfTheAviatorZ 1 year ago +117

      No "OK but" here, you made a great choice!

    • @megadeth8592
      @megadeth8592 1 year ago +83

      @@SEGArianer It's the top end card from last gen... Why would you compare TDP? Obviously it's gonna be better on the newer and lower end model in the lineup... Weird.

    • @andrewsolis2988
      @andrewsolis2988 1 year ago +41

      My wife still rocks her Sapphire Nitro 6950 xt and laughs at these newer gen cards that can't keep up without fake frames. That being said AMD fake frames are coming lol

    • @SEGArianer
      @SEGArianer 1 year ago +16

      @@megadeth8592 Because of form factor and the power supply needed. The 4070 will fit in more cases and should not require a PSU upgrade. That's why you should not only compare fps.

  • @VeiLofCognition
    @VeiLofCognition 1 year ago +363

    The only disappointing thing is seeing youtubers NOT bashing the hell out of this rip-off 4060 for 600 bucks tbh!

    • @bobbygetsbanned6049
      @bobbygetsbanned6049 1 year ago +34

      I don't think I have seen a youtuber not bash it lol. It's a joke, a 4060 with a 4070 badge and price.

    • @WayStedYou
      @WayStedYou 1 year ago +20

      Every single review I've seen of it said it was poor

    • @Dragon2220
      @Dragon2220 1 year ago +16

      Not enough.
      Since the 4080 12GB bullshit it's become almost silence

    • @mkuhnactual
      @mkuhnactual 1 year ago +7

      Most are being fairly critical. The only review I've seen sing its praises is Ars Technica, which is a written article.

    • @wagnonforcolorado
      @wagnonforcolorado 1 year ago +19

      Most reviewers, from the 5 or 6 I have watched, have said the "card is OK, but overpriced". They then usually go on to say how much better Ray Tracing and DLSS are, and point out power draw is low, tech is latest... Not really a bashing. Jay's video is a well constructed takedown, and well supported with data and methodology.

  • @thomasbradford6434
    @thomasbradford6434 1 year ago +312

    6950xt owner here. I switched from nVidia when I saw the 40 series pricing was simply unaffordable this generation. I’ve been nothing but impressed. The software is powerful and intuitive, the performance is next generation and the price was a steal. Give AMD a chance, you won’t regret it.

    • @Gabu_Dono
      @Gabu_Dono 1 year ago +16

      Is the high power draw not a problem? It has one of the highest power draws out there, and I'm concerned it might be a bit loud.

    • @Pirate85getready
      @Pirate85getready 1 year ago +15

      Same, never regretted it so far.
      @Gabriel they're quiet. The coolers are pretty good... I've got the reference model.

    • @sturmtruppen-1916
      @sturmtruppen-1916 1 year ago +8

      Put an AMD GPU into a 3D environment and tell me how it goes... Not everyone is into the gaming stuff.

    • @hhkl3bhhksm466
      @hhkl3bhhksm466 1 year ago +4

      @@Gabu_Dono It's pretty loud and power consuming, I have the red devil variant. If you have a really low watt psu, the 4070 is honestly a pretty good alternative

    • @Pirate85getready
      @Pirate85getready 1 year ago +16

      @@sturmtruppen-1916 what or which 3D Environment?

  • @jeehoonoh
    @jeehoonoh 1 year ago +734

    It's nice to see the community speak with their wallet. The card ain't doing too hot sales wise

    • @LethaliDyr
      @LethaliDyr 1 year ago +78

      Still not enough. These ridiculous prices need to go. The 80 Tis need to be at $699 and the rest will follow.

    • @chrisk3127
      @chrisk3127 1 year ago +73

      Nvidia will just make fewer cards to keep the price up lol

    • @kenos911
      @kenos911 1 year ago +6

      Where I live this card is pretty cheap and amd cards are really pricy so I might actually jump the gun and go for it but yeah, I’ll wait a couple months

    • @Mopantsu
      @Mopantsu 1 year ago +36

      Which is why Nvidia are reducing production instead of the price.

    • @shinyhappyrem8728
      @shinyhappyrem8728 1 year ago +1

      @@kenos911: try to get it used then, there ought to be some who want to move on to a better card

  • @yugoprowers
    @yugoprowers 1 year ago +98

    I may not have bought much EVGA stuff, but this is why we need them in the video card market. EVGA would have found a way to unlock it, forcing Nvidia to unlock voltage for all cards.

    • @backlogbuddies
      @backlogbuddies 1 year ago +32

      Remember EVGA left because Nvidia was starting to punish people for doing that sort of stuff and because they kept screwing over board partners

    • @Apollo-Computers
      @Apollo-Computers 1 year ago +1

      Honestly voltage has been locked since the 1000 series. That voltage slider just allows a few more millivolts.

    • @TheOfficialFloridaMan
      @TheOfficialFloridaMan 1 year ago

      I have an EVGA 980 SC and am looking to upgrade right now; first I've heard of this

  • @racerex340
    @racerex340 1 year ago +140

    I keep saying that this 4070 was originally meant to be the 4060 or 4060 Ti. What's making this so much worse is the MASSIVE increase we got in CUDA cores from the 2000 series to the 3000 series.
    If you look at all of the other 4000 series cards and at generational CUDA core count growth:
    1070 (1920 CUDA) to 2070 (2304 CUDA): 20% CUDA core increase
    2070 (2304 CUDA) to 3070 (5888 CUDA): 156% CUDA core increase
    3070 (5888 CUDA) to 4070 (5888 CUDA): 0% CUDA core increase
    1070 Ti (2432 CUDA) to 2070S (2560 CUDA): 5% CUDA core increase
    2070S (2560 CUDA) to 3070 Ti (6144 CUDA): 140% CUDA core increase
    3070 Ti (6144 CUDA) to 4070 Ti (7680 CUDA): 25% CUDA core increase
    1080 (2560 CUDA) to 2080 (2944 CUDA): 15% CUDA core increase
    2080 (2944 CUDA) to 3080 (8704 CUDA): 196% CUDA core increase (this was such a big upgrade)
    3080 (8704 CUDA) to 4080 (9728 CUDA): 12% CUDA core increase
    1080 Ti (3584 CUDA) to 2080 Ti (4352 CUDA): 21% CUDA core increase
    2080 Ti (4352 CUDA) to 3080 Ti (10240 CUDA): 135% CUDA core increase
    3080 Ti (10240 CUDA) to 4080 Ti? (11800 CUDA?): 15% CUDA core increase?
    Titan X (3072 CUDA) to Titan RTX (4608 CUDA): 50% CUDA core increase
    Titan RTX (4608 CUDA) to 3090 (10496 CUDA): 128% increase
    3090 (10496 CUDA) to 4090 (16384 CUDA): 56% increase
    Future 4090 Ti: 18176 CUDA
    I took a guess that the 4080 Ti will be a 15% increase over the 3080 Ti.
    It's pretty clear that Nvidia had no interest in giving such an enormous generational uplift again. They gave us an RTX 4080 that had just 12% more CUDA cores and 60% more VRAM than the RTX 3080 10GB FE, yet increased the price from $699 to $1199, or 72%. Nvidia expected to charge 72% more for 12% more CUDA cores and roughly 30% more non-RT/non-DLSS performance. When we went from the 2080 to the 3080, we kept the same $699 MSRP but got a 70% increase in gaming performance. The 4090 CUDA count was clearly grown simply to keep the performance at the top unattainable by anything AMD could launch. The 4080 is literally almost half of the 4090; it should have cost $799, which would have factored in inflation.
    (edit: accidentally used the future 4090 Ti CUDA count instead of the 4090, fixed)
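The percentages in that comment are plain generation-over-generation ratios. As a sanity check, a short sketch reproduces the x70-class line of the table; the core counts are the ones quoted in the comment, not independently verified:

```python
# CUDA core counts for the x70-class cards, as quoted in the comment above.
cuda_cores = {"1070": 1920, "2070": 2304, "3070": 5888, "4070": 5888}

def pct_increase(old: int, new: int) -> int:
    """Generation-over-generation core-count increase, rounded to a whole percent."""
    return round((new - old) / old * 100)

gens = ["1070", "2070", "3070", "4070"]
for prev, nxt in zip(gens, gens[1:]):
    print(f"{prev} -> {nxt}: {pct_increase(cuda_cores[prev], cuda_cores[nxt])}% more CUDA cores")
# 1070 -> 2070: 20% more CUDA cores
# 2070 -> 3070: 156% more CUDA cores
# 3070 -> 4070: 0% more CUDA cores
```

The 0% step from 3070 to 4070 is the outlier the whole thread is reacting to.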

    • @SamfisherSam
      @SamfisherSam 1 year ago +5

      That's not really how CUDA cores work... They are not comparable generation to generation.

    • @racerex340
      @racerex340 1 year ago +20

      @@SamfisherSam They are pretty comparable within the RTX generations, meaning, they perform roughly the same function. I understand that going from RTX 2000 to RTX 3000, the huge increase in the CUDA core count was due to the FP32 ALUs being doubled for each SM. That doesn't change the generational horsepower discrepancy, which when excluding Ray Tracing, DLSS and some 4K loads, is worse than the pitiful ~20% increase we saw going from GTX 1070 to RTX 2070.
      Nvidia is tiering this 4070 based on the minimum amount of GPU that they thought they could get away with while still being able to "sell" the story of large generational improvements, heavily reliant on DLSS 3 / Frame Generation, which is why all of their marketing materials discussing performance improvement over last gen cite only metrics that use DLSS 3, not raw rendering performance.
      They took DLSS 3 / Frame Generation and decided to use it to sell us an RTX 4060 as an RTX 4070 for 20% more than they demanded for the RTX 3070, and 82% more than their MSRP for the ACTUAL previous generation RTX 3060 12GB. If Nvidia was making $30 gross profit on every RTX 3060 12GB (within industry ranges), then they are making closer to $300 in gross profit from the RTX 4070, and I don't know about you, but I'd personally feel pretty damned abused if I knew that a company decided to double their gross profit on a given product, but Nvidia has certainly more than quintupled their gross profit per unit on the 4070 compared to the 3060/3070.
      Nvidia's gaming revenue exploded in CY 2021 (FY 2022) due to mining and morons paying over MSRP and price increases, then they were down almost 30% year over year for CY 2022 (FY 2023), yet net income (before a nearly $4B increase in operating expenses) remained nearly flat due to increased prices, yet they made sure they had enough to run an $11B stock buyback and $2.5B in Stock-based compensation expenses (exec bonuses). I'm getting tired of people defending what Nvidia is doing, they make incredible products, yet they've gone anti-consumer. The best we can hope for is another couple of quarters with 25%-30% YoY gaming revenue losses to help them recognize that they still need consumers, although with the AI explosion that is just starting to take off, I'm betting they go from $15B in data center revenue in FY 2023 to $25B or more in FY 2024, the gaming segment won't really matter at that point and profit margins for datacenter will double, possibly triple because they can charge whatever they want at this point.

    • @atharvatalaulikar9066
      @atharvatalaulikar9066 1 year ago +8

      Very good information; thanks to you my knowledge about RTX 40 series cards has increased dramatically.

    • @atharvatalaulikar9066
      @atharvatalaulikar9066 1 year ago +6

      @@racerex340 Wow, I heard that Samsung offered Nvidia half the price for their 8nm node compared to TSMC's node, so in a way they earned way more revenue, and considering the GPU price crisis they made some serious profits. I think their focus was more on increasing the profit per GPU than on increasing performance. It is a very pro-investor move from a business standpoint, but their consumers are showing awareness now in not buying 4070 GPUs

    • @ChristopherHailey
      @ChristopherHailey 1 year ago +1

      It was going to be the 4070 Ti, but the whole two-4080s thing happened. That's all marketing anyway. Like in all other generations, new models perform around the level of the next higher model in the previous generation, just like now. Marketing is not the tech.

  • @TNM001
    @TNM001 1 year ago +68

    The 4070 is a 3060 Ti replacement according to power draw, same TDP class. If it were 200 bucks cheaper it would be OK.

    • @KelvinKMS
      @KelvinKMS 1 year ago +11

      Yes, the 4070 is only worth $399

    • @bloxoss
      @bloxoss 1 year ago +2

      But it ties the 3080 Ti and 3090 in some cases??

    • @sweetsurrender815
      @sweetsurrender815 1 year ago +3

      @@bloxoss each generation the 70 tier cards were at least 20% faster than the 80 tier from the past, this time it barely matches the 80 tier card, and sometimes loses to it. Since when is it 3090 level?

    • @username8644
      @username8644 1 year ago +1

      @@bloxoss No shit. If there's no improvement then why would people upgrade to the new generation? The entire point of a new generation is to make people upgrade from what they currently have because the new gen is better. A new generation being faster does not justify a price increase; the entire point of a new generation is that it performs better.

    • @KelvinKMS
      @KelvinKMS 1 year ago +1

      @@username8644 Yes, and Nvidia's CEO promised to release a new product every 6 months, not every 2 years like now! Over a 2 year gap, Nvidia should provide at least 2x the speed at the same price point, so the whole 40 series is overpriced!

  • @CarlosDiaz-je1bg
    @CarlosDiaz-je1bg 1 year ago +507

    Nvidia is definitely going to come out with a Super series. Probably with extra VRAM if they keep getting backlash.

    • @ScottGrammer
      @ScottGrammer 1 year ago +94

      And I won't buy it. I'm so pi$$ed at NVidia that I sold my one card from them and bought an Intel to replace it. I lost a tad of horsepower, but NVidia needs a good strong dose of competition and loss of sales to help them get the message, and Intel needs some encouragement to keep making cards.

    • @none377
      @none377 1 year ago +8

      Aren't the "Super" cards similar in performance to their "ti" counterparts?

    • @braaaaaaaaaaaaaains
      @braaaaaaaaaaaaaains 1 year ago +30

      @@ScottGrammer Honestly, Nvidia does not care too much about consumer graphics cards anymore.
      They are shifting over to AI, which is probably why they increased the prices so much for their graphics cards. If they can get much larger margins from the same TSMC node by producing AI accelerators, why bother with consumer graphics cards? Thus, they increase the price so much that it is worth it for them.
      As much as I hate the game (capitalism), Nvidia is doing everything right within its rules.
      Don't hate the player, hate the game...

    • @marshallmcluhan33
      @marshallmcluhan33 1 year ago +34

      Nvidia's low-end 4000 cards don't even have enough VRAM to load the textures, let alone the new AI models coming. I feel bad for people who buy these.

    • @ians_big_fat_cock5913
      @ians_big_fat_cock5913 1 year ago

      More VRAM isn't possible unless they come out with 4GB density chips, meaning a 24GB 4070.

  • @chrismastroeni4181
    @chrismastroeni4181 1 year ago +193

    So you’re saying nvidia is charging more, and giving less?? 🤯

    • @KelvinKMS
      @KelvinKMS 1 year ago +4

      Sure

    • @outlet6989
      @outlet6989 1 year ago +4

      You got that right, Skippy!

    • @SayWhaaaaaaaaaaaaaaaaaaaaaaat
      @SayWhaaaaaaaaaaaaaaaaaaaaaaat 1 year ago +1

      No, they're giving more for less. The 3080's price was like $1000 and up; the 4070 is $599 and uses at least 30% less power than the 3080. So how is it giving less? Yeah... less in terms of the power bill.

    • @kennymartinez774
      @kennymartinez774 1 year ago +1

      The more you buy the more you save … say it with me

    • @KelvinKMS
      @KelvinKMS 1 year ago +3

      @@SayWhaaaaaaaaaaaaaaaaaaaaaaat We're talking about raw graphics power, not just saving power. Using TSMC is more power efficient but nothing special, and the 3080 was $699, not $1000. You just got scammed on that price.

  • @Geomlord
    @Geomlord 7 months ago +8

    7:57 this aged like fine wine

  • @farbenseher2238
    @farbenseher2238 7 months ago +3

    You can get 3080 Ti performance at 140-150W (undervolted and overclocked) plus DLSS 3 support. I like the card.

  • @e3446
    @e3446 1 year ago +171

    Love how honest you are. Nice to see not all youtubers are too worried about their relationship with Nvidia to be honest.

    • @e3446
      @e3446 1 year ago +7

      @@2leggedpirate265 In any relationship honesty is key. You're not doing anyone any favors by letting them ram you from behind.

    • @Alex-zi1nb
      @Alex-zi1nb 1 year ago +5

      2 legged pirate you are insane

    • @93836
      @93836 1 year ago +6

      Dog, he’s gaming on a 4090 in his free time. Think about that. He doesn’t really give a crap. He’s feigning outrage to try to relate to the everyday poor person.

    • @johnnypopstar
      @johnnypopstar 1 year ago +2

      @@93836 Just because you're needlessly cynical doesn't mean everyone else is. Some people *do* care about other people.

    • @jimmyhopkins8305
      @jimmyhopkins8305 1 year ago +2

      @@2leggedpirate265 are you.. ok?

  • @RoderickL1121
    @RoderickL1121 1 year ago +506

    I love that the 4070 go more disappointing

    • @BlackJesus8463
      @BlackJesus8463 1 year ago +7

      ikr 😂😂

    • @theduck17
      @theduck17 1 year ago +2

      I hear this in Linus' overly high pitched voice: "LET'S GOOOOOO (more disappointing)"

    • @Pickelhaube808
      @Pickelhaube808 1 year ago +13

      The rumor come out: Does 4070 is more disappointing?

    • @RoderickL1121
      @RoderickL1121 1 year ago +2

      @@Pickelhaube808 The answers may answer your question!

    • @ScrewFearMe
      @ScrewFearMe 1 year ago +1

      @@corndog9482 I would say a 4060

  • @jankie55
    @jankie55 1 year ago +53

    I got lucky acquiring an EVGA 3080 FTW3 in the middle of all the madness a couple years back, and was hoping to get a 4070 on launch thinking it would be a huge improvement. Glad I don't need to spend that money now.

    • @sevsnk3043
      @sevsnk3043 1 year ago +1

      Deep down u know u want a 4090

    • @mtrx1708
      @mtrx1708 1 year ago +15

      Why would you think a 4070 would be a huge improvement over the 3080...

    • @NahBNah
      @NahBNah 1 year ago +2

      @@sevsnk3043 well of course, we all do.

    • @jankie55
      @jankie55 1 year ago +13

      @@mtrx1708 Historically the 70s of the next generation are much better than the previous 80s. I was hoping the RT cores would be much better, and they really aren't

    • @KrayZGames
      @KrayZGames 1 year ago +2

      @@mtrx1708 Historically it was, but the 4070 is not.

  • @SquintyGears
    @SquintyGears 1 year ago +28

    This was much more informative than most review videos from launch day.
    (not a critique of the work done during that time crunch, just applause for the work done for this video)

  • @mehdi76302
    @mehdi76302 10 months ago +8

    I can't believe that Jay actually called the release of the 4070 Super lol 8:00

    • @davidgrenet
      @davidgrenet 8 months ago +1

      "I don't think nvidia is dumb enough to do that this time" 🤣🤣

  • @chrishexx3360
    @chrishexx3360 1 year ago +229

    It's worse when you consider that it was originally planned to sell for $750.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator 1 year ago +6

      It was never planned to be sold at $750; that was a leaked memo saying "DO NOT EVER PRICE THIS OVER 750!" as a guideline for partners. It was never the MSRP.

    • @yugoprowers
      @yugoprowers 1 year ago +10

      @@CyberneticArgumentCreator Yeah, but I get what he means; any time they put a ceiling on a product, most of that product ends up at that price.

    • @LetsFixITJoe
      @LetsFixITJoe 1 year ago +2

      I believe the price is more an inflation issue than anything else... money is getting more worthless, and people nowadays aren't earning much more than back then... a price is never a problem, but the bank account always is one :)))))

    • @itsTyrion
      @itsTyrion 1 year ago

      FOR HOW MUCH!?

    • @jmwilsoND
      @jmwilsoND 1 year ago +4

      @@LetsFixITJoe Inflation between the 3070 release (2021) and the 4070 release totaled 10%. That means adjusted for inflation, the 4070 should have been $549. At $599 that's double inflation, not even counting the biggest problem... it's not even close to the performance gains we've seen from both sides through all of history. From 2070 to 3070 there was a 40% rasterization gain. From 3070 to 4070 it's only 27%, despite the absurd price jump that we all know was supposed to be $750, aka a 50% increase. Compare that to the 980 Ti vs the 1080 Ti: a 7.5% price increase and 100% performance gain. Screw Nvidia, so glad I bought AMD last year.
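That comment's arithmetic can be checked in a few lines. The $499 3070 MSRP and the ~10% cumulative-inflation figure are as cited in the comment (not independently verified here):

```python
# MSRPs and the ~10% cumulative inflation figure as cited in the comment above.
msrp_3070 = 499
msrp_4070 = 599
inflation = 0.10

# What a pure inflation adjustment of the 3070's price would give.
inflation_adjusted = round(msrp_3070 * (1 + inflation))
print(f"Inflation-adjusted 3070 MSRP: ${inflation_adjusted}")  # $549

# The actual generational price increase, for comparison with inflation alone.
actual_increase = (msrp_4070 - msrp_3070) / msrp_3070
print(f"Actual price increase: {actual_increase:.0%}")  # 20%
```

A 20% price bump against 10% inflation is the "double inflation" the comment refers to.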

  • @andydbedford
    @andydbedford 1 year ago +56

    I commented on your last video that I thought this 4070 with its specs and performance looks like it should have been a 4060, everyone called me crazy, looks like I was right.

    • @NocturnEternal
      @NocturnEternal 1 year ago +10

      Especially since the 3060 has the same VRAM size and PCI bus bandwidth!
      Screw NVIDIA, I’m looking at AMD and Intel’s offerings at the end of this year.

    • @michaelmaness5493
      @michaelmaness5493 1 year ago +3

      You're still crazy (in a good way), but you, sir, were correct.

    • @racerex340
      @racerex340 1 year ago +1

      I said the same thing. This is the "series" in Nvidia's portfolio that saw ZERO CUDA core increase generationally. 1070 to 2070 got one, 2070 to 3070 got one (big time), 3070 to 4070 = 0%. Shit, 2080 to 3080 was almost a 200% increase in CUDA cores. Even the 3090 to the 4090 was a 56% increase, while 3080 to 4080 was a 12% increase.
      We were getting less of a bump this time no matter what, as Nvidia had always planned on DLSS 3 / Frame Generation with RT providing most of the generational uplift over the 3000 series, as evidenced by their marketing materials only focusing on RT/DLSS/FG numbers and NEVER native rendering/rasterization improvements. The 4080 is really only a 30% increase over the 3080, which makes sense, as the new node process gave them almost 20% while the 12% increase in cores covered the rest. But with everything else, crypto and now the AI/ML boom, Nvidia is basically telling us that they do not want to give us more than a 20% increase in frames per dollar over the original 3000 series.

    • @andydbedford
      @andydbedford 1 year ago +1

      @@michaelmaness5493 thank you, as my wife says it’s not often that I’m right 😂

    • @andydbedford
      @andydbedford 1 year ago +3

      @@racerex340 I have a 3080 Ti, and so I'm not really looking to upgrade at all this generation.
      However, I think, I may be wrong, but I do think people who know about this stuff are seeing through the BS of NVIDIA giving us DLSS and frame generation instead of real hardware improvements in the mid to high tier range. The only exception this generation is the 4090; the price of that card here in the UK from AIBs is around £2000. Thus all the other cards are there to upsell. NVIDIA gets away with it because there is no competition in that tier of card, but I really think consumers should start buying AMD for a generation or two. I mean, they are good cards, not the best, no, but they can certainly get the job done until this BS gets sorted out.

  • @dobermanownerforlife3902
    @dobermanownerforlife3902 1 year ago +88

    "Nvidia isn't that dumb."
    True. Their customer base just might be.

    • @GrizzAxxemann
      @GrizzAxxemann 1 year ago +13

      Their customer base *IS* that dumb. Anyone who didn't see this happening years ago is blindly chugging Kool-Aid.
      AMD isn't much better in some regards, but their cards have longevity and better performance per dollar.

    • @username8644
      @username8644 1 year ago +3

      @@GrizzAxxemann AMD isn't better in any way. Their cards are always bad at launch until at least a year later, due to the shitty drivers they still can't seem to get under control. And all they do is match Nvidia's pricing and then lower their price a little bit to make up for the instability of their cards. Anybody who's bought a GPU since the 20 series is to blame for this. There hasn't been a single good deal since the 10 series, and people keep buying cards; it's ridiculous. Too bad most of the population is stupid, because the rest of us have to suffer for their poor decisions.

    • @poofy0121
      @poofy0121 1 year ago

      @@username8644 It's a competition issue, not a customer issue. PC Gamers are enthusiasts. They don't care about cost as much as they do performance...if cost was the primary concern they would be console gamers and not pc gamers. The problem with this gen is that the performance does not even justify the cost, which is why sales are shit...make no mistake, if the costs were the same as they are, but performance for each card was 20% better, these cards would be sold out at these prices.
      I guarantee you when Nvidia releases next gen, they will keep the same pricing scheme, but the cards are going to be sold out because the performance gains that are missing right now, will be there in the next gen.

    • @username8644
      @username8644 1 year ago

      @@poofy0121 Customers are as much the issue as the companies right now. They are so desperate to play games at 4k 200fps instead of just settling on 1440p gaming with a few settings in game settings turned down from ultra. You don't need an rtx 4090. My 1070 ti still works great to this day and is totally usable even with my 3440x1440p monitor. An rtx 3080 is more than enough for anything gaming related. Yet these people think they need more powerful GPUs and keep giving money to companies that are scamming us. They are the reason the companies keep scamming us with every generation.

    • @starvader6604
      @starvader6604 1 year ago

      @@username8644 If you're running old triple-A games, sure, I guess

  • @MadayMaday
    @MadayMaday 11 months ago +6

    It's funny to watch this 6 months later, after the announcement of the Super series of 40 series cards... LOL!

    • @brokenhalo2001
      @brokenhalo2001 11 months ago

      I was just thinking the same as I got to that part LOLOL

  • @rid12
    @rid12 1 year ago +14

    I'm sort of glad that I went for a 6000 series AMD card (6800 XT), as on paper my card performs about the same as the 4070 most of the time, give or take, except of course RT performance, which I don't think anyone really uses (at least I don't). Also, having paid a lot less than for a 3080 or 3080 Ti, I'm very happy with my card and its performance

    • @steveleadbeater8662
      @steveleadbeater8662 1 year ago +1

      If you've seen any of the UE5 games, RT is about to become far more important, particularly in the AAA space (search Unrecord for where we are heading). AMD MUST improve their RT capability, as they are lagging in third place. RT/PT will be standardised in game engines in the next two years, given NV's market share. Intel have already outdone AMD, and their next gen cards could really damage NV's market share at the all-important low to mid range.

    • @Takisgr07
      @Takisgr07 1 year ago +1

      "Source: trust me bro," as every AMD buyer tells us. "I don't use RT, so I think no one else is using it" 😂😂

    • @steveleadbeater8662
      @steveleadbeater8662 1 year ago +1

      @@Takisgr07 Up until now it's a valid argument, as RT is hardly ubiquitous. However, every next gen engine is going to have some version of it baked in. You still won't have to use it, but I don't think AMD can afford to be in third place, especially if Intel's Battlemage cards continue to outclass them in that department.

  • @chrisp7729
    @chrisp7729 1 year ago +88

    "I don't think Nvidia is stupid enough to do that..." - Jay underestimating Nvidia.

    • @acolossalsquid
      @acolossalsquid 1 year ago +11

      "Hold my beer" - Nvidia

    • @zhardy323
      @zhardy323 1 year ago +2

      They aren't stupid; they know exactly what they are doing. But if people keep buying from them, why stop?

    • @TalesOfWar
      @TalesOfWar 1 year ago +2

      They're not stupid, just greedy.

    • @royaloreca
      @royaloreca 1 year ago

      Also Jay,... "I don't know what Nvidia is capable of anymore"

    • @RyTrapp0
      @RyTrapp0 1 year ago +2

      @@zhardy323 It's not Nvidia who are the stupid ones. The stupid ones don't want to admit it, but they still keep opening their wallets (cuz they "need" a new gen card, of course)...

  • @quackmoor
    @quackmoor 1 year ago +39

    Everything I know about this card convinces me this is a 4060-class card.

    • @racerex340
      @racerex340 1 year ago

      Yeah, especially given that it has exactly the same number of CUDA cores as the 3070. They have NEVER kept CUDA core counts the same between generations; they always increased, except in this one 3070-to-4070 case.
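The core-count claim in this thread can be sanity-checked directly. A minimal sketch; the counts below are from Nvidia's public spec sheets, with two neighboring tiers included for context:

```python
# CUDA core counts from Nvidia's public spec sheets.
cuda_cores = {
    "RTX 3070": 5888,
    "RTX 4070": 5888,   # identical to the 3070, the generational first noted above
    "RTX 3080": 8704,
    "RTX 4070 Ti": 7680,
}

def same_core_count(a: str, b: str) -> bool:
    """True if two cards ship the same number of CUDA cores."""
    return cuda_cores[a] == cuda_cores[b]

print(same_core_count("RTX 3070", "RTX 4070"))  # True
```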

  • @boy638
    @boy638 1 year ago +60

    "The 4070 just go more disappointing..." was the original title, before they changed it.

    • @SkoomaChugger
      @SkoomaChugger 1 year ago

      I just noticed you were right hahahaha, good catch

    • @EmilePesky-n1v
      @EmilePesky-n1v 1 year ago

      Still not changed after 41 minutes

  • @chris929rr7
    @chris929rr7 1 year ago +6

    Oh, good to know. I just got my Strix 3080 OC and was worried I should have gotten the 4070 instead, but the 3080 was only AUD $600 (USD $380-400ish), so of course I couldn't pass that up. Looks like I made the right choice.

    • @steveleadbeater8662
      @steveleadbeater8662 1 year ago +1

      Given the usual prices in Aus, the 4070 would be, what, $250 - 300 more?

    • @chris929rr7
      @chris929rr7 1 year ago

      @@steveleadbeater8662 That's about right. As of today the "4070" is AUD$999 and the "4070 Ti" is between $1200-1300.

  • @N0N0111
    @N0N0111 1 year ago +27

    The tale of why EVGA left the GPU market: Jensen pretty much became gamers' enemy.

  • @Eddaeken
    @Eddaeken 1 year ago +114

    Never have I been more certain that my next GPU will be from AMD.. even if it has to be a used card.

    • @rewndude
      @rewndude 1 year ago

      If you aren't shy about open box, Newegg was selling open-box 6950 XTs for $569 just a few days ago. I ordered one.

    • @hhkl3bhhksm466
      @hhkl3bhhksm466 1 year ago +16

      Nah, even AMD is overpricing their cards because of NVIDIA. No GPU company is your "friend" when they're clearly working in each other's interests. Just wait for the next generation and stop giving these companies your money.

    • @thetranya3589
      @thetranya3589 1 year ago +15

      @@hhkl3bhhksm466 What makes you think the next generation will be any different? One thing is for sure, the next gen will be more expensive.

    • @Calamity_Jack
      @Calamity_Jack 1 year ago +4

      @@hhkl3bhhksm466 The next generation will likely be even more expensive. While you're right that no GPU company is your friend, I tend to want to go with one that's at least less my enemy. So I'm also looking hard at a RX6xxx card to be my next, probably a RX6950XT.

    • @hhkl3bhhksm466
      @hhkl3bhhksm466 1 year ago +2

      @@Calamity_Jack Wait just a couple more months. AMD is about to release their new 7800 XT, which is the competitor to the 4070. It also uses RDNA 3 and is significantly faster than the previous gen.

  • @fliggopolis
    @fliggopolis 1 year ago +53

    The skip method worked for the 2000 series. If a large chunk of consumers do that as well then Nvidia may be forced to come to their senses for next gen.

    • @chilpeeps
      @chilpeeps 1 year ago +7

      I think what's really hurting Nvidia is the $399 PS5 console.

    • @Obviousman1
      @Obviousman1 1 year ago +2

      I'll believe it when we see it.

    • @dralberthofmann
      @dralberthofmann 1 year ago +11

      ​@@chilpeeps The PS5 has definitely given disappointed NVIDIA customers another option but NVIDIA's fundamental problem in the PC Gaming market is self inflicted.

    • @KaoruSugimura
      @KaoruSugimura 1 year ago +1

      Not going to work. NGreedia would sooner keep the pricing next gen and push gamers to buy their subscription service to use networked GPUs.
      NGreedia couldn't give less of a shit about the gaming community or how "affordable" their adult Legos are.

    • @RandoBurner
      @RandoBurner 1 year ago

      They are now selling mainly to people who use AI and do other productivity-related activities, who do not care how much it costs, because they either make money with it or universities are paying for it, etc. So as long as they don't have a competitor (AMD is not a competitor in that, at all), they can set whatever price they basically want. I hope I'm wrong, but I think I'm not.

  • @TheIslander93
    @TheIslander93 1 year ago +131

    I'm starting to think my spontaneous purchase of a used 3090 was a good deal.

    • @Kinvarus1
      @Kinvarus1 1 year ago +16

      Yeah, I went from my 3070 FE 8GB to a 3090 Ti 24GB on a spontaneous "Oh, that looks like a good price" whim just before they announced the 40 series, and while I briefly thought "Oh, maybe I should have waited," now I'm kinda glad I got that 3090 Ti.

    • @vroomzoom4206
      @vroomzoom4206 1 year ago +16

      Tbh buying anything current gen seems like a rip-off unless you can afford the 4090.

    • @TheIslander93
      @TheIslander93 1 year ago

      @@Kinvarus1 I went from a 1080 Ti to a 3090; that was a huge gap :D The auction said "lightly used for simulation project", so for sure it was mining for the past 2 years, but I did try to murder it with Kombustor and other programs and it passed with no problems. Not the best bin considering max clocks, but undervolted it still performs nicely enough. I even made a 3D-printed shroud for it with three 92mm Noctuas, so it maxes at 62C in close to perfect silence.

    • @spacechannelfiver
      @spacechannelfiver 1 year ago +8

      @@vroomzoom4206 The 7900 XTs are starting to look a bit better now that prices are coming down; at $700-750 they'd be pretty solid.

    • @AbhijeetMishra
      @AbhijeetMishra 1 year ago +1

      I bought the 3090 at a higher price than MSRP (though not that much higher) and was kicking myself for wasting money, but all in all I think it was a good purchase. With DLSS etc. there's no real need to upgrade for a long time now.

  • @scrummonkey24
    @scrummonkey24 1 year ago +4

    I just got the 4070 FE for my mini-ITX build. It is actually a pretty good card for a mini-ITX build: quiet, cool, efficient and small. However, if you are not building mini-ITX, then it's pretty bad value.

  • @nicholasabsher4599
    @nicholasabsher4599 10 months ago +4

    lmao, 7 months ago this guy calls out Nvidia's plan for a Super 😂😂 Love the vids, brother

  • @Mike__B
    @Mike__B 1 year ago +41

    You thought the 4070 was a higher-tier card that they limited? Well, that's certainly a first I've heard; most people strongly believe this is a lower-tier card with "the number" boosted to get more money out of it, just like they tried to do with the 4080 12GB edition.

    • @RacingAnt
      @RacingAnt 1 year ago

      The 3080 12GB is a great card,
      if you can buy it for 3080 10GB money.
      It's closer to Ti performance than the 10GB.

    • @Mike__B
      @Mike__B 1 year ago

      @@RacingAnt Sorry, typo; I meant the 4080 12GB edition. The mind gets a little mixed up thinking of the generations of screw jobs Nvidia has been releasing.

    • @RacingAnt
      @RacingAnt 1 year ago +1

      @@Mike__B that makes a lot more sense now 👍Yeah, the whole 40 series has just been a long line of NVidia treating their customers like 💩

    • @hastyscorpion
      @hastyscorpion 1 year ago +2

      People who believe that aren't doing the math right. If there originally was a 4080 16GB and a 4080 12GB, and Nvidia had gotten their way and no one complained, they had to have had a card lined up to be the 4070 Ti. It should have been this card (the card now named the 4070). After they bumped the 4080 12GB down to be the 4070 Ti, they had to bump down all the cards below it as well.

    • @Mike__B
      @Mike__B 1 year ago

      @@hastyscorpion That's an interesting take, and certainly a feasible one. However, another theory is that they simply would not have come out with a 4070 Ti at all and would have had the 4070 be the first *70 card out there; after all, except for the 3060 Ti, all the other "day 1" cards were non-Ti variants (or Super in the case of the 20 series), and the Ti versions came out later. Then Nvidia could really push the idea that those super pricey flagship products are well worth the money: just look at the difference in performance!
      That said, the 4070 Ti wasn't a limited 4080; those cards were already produced by everyone (Nvidia and AIBs), already boxed and ready to be shipped to retailers, until word came down from on high, and then boxes got changed out and any name plates or stickers on the cards that said "4080" were removed and swapped for 4070 Ti naming.

  • @stepheneddington1667
    @stepheneddington1667 1 year ago +8

    I had to rebuild my five year old system at the beginning of the year. Went from a 1050ti to a 3060. So far everything I've seen in the past few months tells me it was definitely better to get the 3060 rather than wait for a 4070 or any other 4000 series card. I'll be fine for the next few years when hopefully the graphics card market will finally fix itself.

    • @mikepatrona472
      @mikepatrona472 1 year ago +1

      If I upgrade from the 3060 Ti, it'll be to a 6800 or 6900 XT. Prices will only keep rising thanks to the warmongers and countries that just keep printing money.

    • @ARedditor397
      @ARedditor397 1 year ago +1

      @@mikepatrona472 AMD is not your friend

  • @JamesRussoMillas
    @JamesRussoMillas 1 year ago +64

    It's not because the 4070 was destined for more, it's because it was destined for less! It's the actual successor to the 3060ti. So yeah there SHOULD be at least an extra card between the 4070 and 4070ti

    • @ZackSNetwork
      @ZackSNetwork 1 year ago +11

      Nope the 4070 is really a 3060 successor.

    • @guilhermelopes986
      @guilhermelopes986 1 year ago +28

      Nope, they upsold all the cards by 1 tier.
      The 4090 should be the 4080; the 90 series is a sham to make the flagships cost more.
      The 4080 is the true 4070.
      The 4070 Ti is the 4060 Ti.
      The 4070 is the 4060.
      The sorry excuse of a card that the 4060 will be is actually an overpriced entry-level 4050.

    • @Sleepless4Life
      @Sleepless4Life 1 year ago +1

      Plz no! Don't give Ngridia any more ideas!

    • @cjmillsnun
      @cjmillsnun 1 year ago

      More like a 4050.

    • @JamesRussoMillas
      @JamesRussoMillas 1 year ago

      @@guilhermelopes986 The 4070 Ti's performance with 16GB of VRAM should be the real 4070, given that the 70 class always matches last gen's top tier. Hence the actual 4070 is more like a 4060 Ti (the 3060 Ti and 2080 were basically the same).

  • @arloracc
    @arloracc 3 months ago +1

    As someone who's been using this graphics card for over a year, I've got to say that although the value of it could be better, the performance is pretty good for my standards. I do not have older GPUs to compare to, but what I did have was a laptop with a 3050 TI in it. As we all know, laptop GPUs are much weaker than their desktop counterparts. So a 4070 was definitely a massive upgrade from a 3050 TI laptop GPU. I will probably keep using this GPU until it just stops working at which point I will hopefully have enough money saved up to buy something better.

  • @kevinphillips6063
    @kevinphillips6063 1 year ago

    I bought a PNY 4070 and have no issue turning up the voltage or any other settings. My 3DMark Time Spy score is over 17,000. Maybe it's just the Founders Edition that's locked down. I have photos if you want them.

  • @Animize
    @Animize 1 year ago +87

    Now imagine how great the 4060 will be...

    • @ferdgerbeler8494
      @ferdgerbeler8494 1 year ago +17

      64-bit bus, PCIe 5.0 x2, 6GB, and NVENC disabled, for $469... It will be 3050 speed on any system with less than PCIe 5.0... I mean, I'm just plucking that out of thin air, don't wanna give 'em ideas.

    • @Endpoint101
      @Endpoint101 1 year ago

      😂 It's going to be pretty awful

    • @racerex340
      @racerex340 1 year ago +5

      You mean the RTX 4050, right?

    • @ferdgerbeler8494
      @ferdgerbeler8494 1 year ago +1

      @@racerex340 no

    • @Eric-ct2ri
      @Eric-ct2ri 1 year ago

      @@ferdgerbeler8494 That's more likely going to be AMD's 6500 XT replacement...

  • @Raptorz413
    @Raptorz413 1 year ago +9

    I was thinking of getting this GPU. No thanks, I'm gonna stick with my 1080 Ti and get the 6950 XT or 7900 XT; both are looking real sweet now. I've never tried AMD before, but it's looking better and better.

    • @9999titanium
      @9999titanium 1 year ago +2

      I prefer AMD Adrenalin over GeForce Experience. The 6950 XT is a good card; I haven't had any driver issues.

  • @Clint_the_Audio-Photo_Guy
    @Clint_the_Audio-Photo_Guy 1 year ago +5

    I guess I'll keep my 4070 Ti then. I was feeling a little guilty for spending $900 on it (Gigabyte Aero/White build), but since you can build a whole civilization in the valley between the 4070 and 4070 Ti, I guess it'll be worth it, lol.

    • @RaptorFPV
      @RaptorFPV 10 months ago +1

      Best price/performance ratio! Good choice!

    • @Clint_the_Audio-Photo_Guy
      @Clint_the_Audio-Photo_Guy 10 months ago

      @@RaptorFPV It ended up crashing games, so I returned it before my window closed. Back to deciding what to get again. The 7800 XT is looking like a good candidate now.

    • @EliasOwnage95
      @EliasOwnage95 10 months ago +1

      @@Clint_the_Audio-Photo_Guy Crazy driver issues with the 7800 XT that I just got; returning it to get a 4070, tbh. Windows Update loves to replace AMD drivers with its own and breaks your drivers. You then have to constantly roll back the driver and re-install it.

    • @Clint_the_Audio-Photo_Guy
      @Clint_the_Audio-Photo_Guy 10 months ago

      @@EliasOwnage95 There's a video on how to fix that windows update issue. That's a windows problem not a GPU problem.

    • @EliasOwnage95
      @EliasOwnage95 10 months ago

      @@Clint_the_Audio-Photo_Guy I know it's a Windows issue. I tried literally every fix and nothing solved it, don't want to risk getting past being able to return it and endure future driver bs. I saw it as paying extra for something that works with no headaches.

  • @IceCreamePudding
    @IceCreamePudding 8 months ago +3

    @8:00 Foreshadowing.

  • @robertcaldwell82
    @robertcaldwell82 1 year ago +1

    Hey Jay and the Two Cents team!
    Big fan of the channel for years, thank you for the videos; I've been poring over them over the last few days. The reason being that I'm being asked to travel to the US for business, and I'm going to take the opportunity to visit Micro Center for the first time so I can upgrade my PC!
    So, the reason I write here is:
    I built my PC about 5-6 years ago and have tipped away at upgrades over that time, upgrading the cooler, PSU, storage etc...
    Originally I had thought to do the same and was going to buy a 4070 Ti, but my i7 6700K probably isn't going to cut it...
    So I'm looking at a new motherboard, CPU, RAM & GPU that give me a little flexibility to upgrade later.
    The problems I'm seeing online are bottlenecking, compatibility, power delivery, and most importantly, bang for buck.
    I've saved about $1,000 for this, but I'm getting such great deals in the US (compared to home, where the 12600K is $300-350) that I'm tempted to use the credit card to bump up a choice or two (like the 13600K), but I'm nervous that will balloon into unnecessary purchasing. Could you please cast your eyes over my build and nudge me in the right direction?
    Any help at all would be most appreciated!
    i7 6700K > i5 12600K
    NZXT Kraken X62
    Asus Z170 > Z690 DDR5
    16GB DDR4 3200 CL16 > 32GB DDR5 5600 CL40
    Samsung 970 EVO Plus 1TB + 3TB Toshiba P300
    EVGA FTW 1070 > MSI Ventus 4070 Ti
    EVGA Supernova 850W Gold
    NZXT S340 Elite
    I use Photoshop, Illustrator & Premiere for work and game at 3440 x 1440 144Hz
    Thanks in advance!

    • @flameshana9
      @flameshana9 1 year ago

      The 13600k is almost 28% faster (all core load). That's worth it in my opinion. Single thread only gains 5% though.

    • @steveleadbeater8662
      @steveleadbeater8662 1 year ago

      Buy what you can afford, Robert; don't go loading yourself with debt for an extra 10fps. I'll get savaged by the fanboys, but if you are using the GPU for anything other than gaming I'd always go Nvidia, so I'm not going to advise getting a last gen AMD card, even though they are much better value. I'd also consider sticking with a DDR4 board to pick up the 13600 within budget.

  • @BobBobson
    @BobBobson 1 year ago +8

    I wonder how it'd behave with a shunt mod, or other ways to get around the power limit

  • @E_Sunbro
    @E_Sunbro 1 year ago +30

    I think what's illustrated best here is that NVIDIA straight up thought the 4080 12GB scam was gonna fly, mainly due to that large performance gap.
    I think they originally planned for the 4070 Ti to be the 4070, and the 4070 to be the 4060, but then figured they'd be able to milk the consumer by doing a tier split on the 4080 class, thinking nobody would notice. Gotta make up that mining money somehow. ¯\_(ツ)_/¯

  • @networkgeekstuff9090
    @networkgeekstuff9090 1 year ago +19

    I am stunned that nobody actually gives this card any points for performance per watt (or silence)... it dominates in that category. Is influencers' target audience kids who don't care about energy bills, or what? I actually went for the 4070 solely so that I can keep using my 600W passive PSU, undervolt the GPU, and have a very silent rig that consumes like a 2015-era PC and still has great 1440p ultrawide performance.
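The efficiency point above is easy to put in numbers. A minimal sketch assuming illustrative figures (roughly matched 1440p frame rates, a ~200W board power for the 4070 vs ~320W for a 3080; real numbers vary by game and card):

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

# Illustrative (assumed) numbers: similar 1440p performance,
# very different power draw.
rtx_4070 = perf_per_watt(fps=100, watts=200)
rtx_3080 = perf_per_watt(fps=100, watts=320)

print(f"4070: {rtx_4070:.2f} fps/W, 3080: {rtx_3080:.2f} fps/W")
# Under these assumptions the 4070 comes out ~60% more efficient.
print(f"efficiency gain: {rtx_4070 / rtx_3080 - 1:.0%}")
```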

    • @mrp9023
      @mrp9023 1 year ago +2

      These YouTube channels, whilst giving out decent info, also have to have hyperbole etc. to get views. I gave my 3070 to my son and ended up getting a new PC with a 4070. I wouldn't usually upgrade by one generation but wanted my own PC and to not share my son's. Seems like a decent card to me, a solid improvement over the 3070: better performance, better DLSS, frame generation, and very good energy consumption... and it will also benefit from future updates.

    • @imjody
      @imjody 1 year ago +1

      So, here's a crazy thing about this card. I bought a 4070 Gigabyte Eagle OC a couple of weeks ago (replaced my 1650 Super OC), and I am absolutely blown away. Now, keep in mind, everything else about my computer, is nearly 5 years old at this point; bought at the same time as my old 1650 Super OC. So I'm running a Core i5 9400, and 32GB of DDR4 3600 ram. Both slow compared to today's standards, and only low/mid tier. But yet still, I can manage to play Warzone 2 at 4K (I have a Gigabyte Aorus 48" 4K 120hz 'FO48U' monitor that I also bought at the same time as the 4070), and still average out at around 100FPS! On an i5 9400 CPU!! 😂I'm definitely CPU limited, but man, 100+FPS on Warzone at 4K(!!) is just insane. I'd average like 30FPS on the 1650 Super on this very same computer; nothing else has changed. I'm super happy with my purchase; although the price was definitely up there...

    • @daniellew2271
      @daniellew2271 1 year ago +2

      At this point these commenters are just crying babies. I decided to buy the 4070 for the price/performance/power/availability combination. Of course, if Nvidia can drop the price another 50 bucks or more, even better. In my country AMD cards ain't cheap; these ivory-tower YouTubers don't really know what they are talking about. Their eyes are only on the US market 😂

    • @RobBCactive
      @RobBCactive 1 year ago

      No, they're not very interested in energy usage.
      As for silence, some say most gamers use earphones, so they don't care.
      I undervolted an RX 6700 XT and trimmed the max frequency to run cooler in summer, with similar success at 1440p.
      Most of the most interesting games aren't the GFX-intensive AAA type anyway. For the few others, AMD game profiles made it easy to push the GPU harder.

  • @KhainesKorner
    @KhainesKorner 1 year ago

    Gonna be honest, I bought one of the 4070s, but I got the Asus non-OC'd one, not the reference model.
    I'm pretty happy with it, but I do feel it could've been a bit cheaper. Like $50-$100 cheaper.
    And before anyone says "Why didn't you buy X instead???"...
    1) It was going in an eGPU box, thus limited space. Otherwise I'd have just ripped the 3080 Ti out of my desktop.
    2) Power usage. Electricity prices are nuts here in the UK, so I'm trying to limit power usage where I can (also why I don't use my desktop much right now with its 3080 Ti).
    3) What I use my rig for really benefits from CUDA cores to speed up rendering.
    I'd be very interested to see if there's a way to unlock the voltage to get it closer to the wattage it can pull max... I think it's 225W from the 8-pin and the PCIe slot? Might give it a nice boost.
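On the 225W figure: it follows from the PCIe specs, which allow up to 75W from the x16 slot and 150W per 8-pin auxiliary power connector. A quick sanity check, assuming a standard single-8-pin 4070:

```python
# Power budget per the PCIe CEM spec: 75 W from the x16 slot,
# 150 W per 8-pin (6+2) auxiliary power connector.
SLOT_W = 75
EIGHT_PIN_W = 150

def max_board_power(eight_pin_connectors: int) -> int:
    """Spec-limited board power for a card with N 8-pin connectors."""
    return SLOT_W + eight_pin_connectors * EIGHT_PIN_W

print(max_board_power(1))  # 225 -- matches the 225 W figure above
```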

  • @mattcvideo9407
    @mattcvideo9407 1 year ago +5

    It seems like skipping the 4000 generation altogether is the right move financially. I'm building a new PC now and found a lightly used 3080 for $400; it should be plenty good enough until the 5000 GPUs get released!

    • @simon3225
      @simon3225 1 year ago

      Thinking along the same lines, but getting hold of a low-mileage 3080 at that price would be optimistic to say the least. Decent 3080s in the UK at the $400 equivalent are like trying to search for unicorn poo, but much harder. Harder to find, that is, not harder unicorn poo, which is even harder to find (didn't want to get things confused, that's Nvidia's job).

  • @BigStank5483
    @BigStank5483 1 year ago +6

    1080 Ti still running strong, but it's getting old and slowing down quickly!
    I usually build a new top-of-the-line PC every 7 years roughly, and it's time to do so, but the prices on graphics cards and mobos are just atrocious.
    Also, Intel's next CPU will use a different socket, so building around a 13900K is just silly for future proofing.
    Between the prices and the tech changes that are happening, it just sucks to build a new TOTL PC right now.

    • @Asfanboy1
      @Asfanboy1 1 year ago

      Also, what games?

    • @hotdogsarepropaganda
      @hotdogsarepropaganda 1 year ago

      I also have a 1080 Ti; hit it with a decent OC, but it is feeling its age now. Hoping to push it to Intel Battlemage.

  • @umbles7007
    @umbles7007 1 year ago +6

    Could I ask why you would take a 3080 over a 4070? They're still selling for more than the 4070 new anywhere I see, and are about the same on performance, with a higher wattage.

    • @umbles7007
      @umbles7007 1 year ago +5

      @Something Diabolical The 3080 that is close to the price of the 4070 is the 10GB model though, so it has less VRAM than the 4070.

    • @agentb4074
      @agentb4074 1 year ago +3

      I wondered the same thing, but then got WAY more confused when he said he would get a 3070 Ti over a 4070. That makes absolutely no sense to me.
      Why would you get something slower, with much less VRAM, much higher power, and fewer features, for the same price?

    • @jedpratte
      @jedpratte 1 year ago

      The 3080 has more raw performance than the 4070; the 4070 has to use its gimmicks to match it. Especially at higher resolutions, the 3080 beats it in most games. I'm sticking with my FTW3 3080.

    • @umbles7007
      @umbles7007 1 year ago +2

      @@jedpratte Every comparison I've seen had them trading blows, but mostly the same. I could see it at 4K maybe, but I only game at 1440p anyway. Totally, though: if you already have a 3080, there would be zero reason to upgrade. I have a 1080, and had gone back and forth between the 3080 and 4070.

    • @jedpratte
      @jedpratte 1 year ago +3

      @@umbles7007 Yeah, the ones I saw showed the 3080 winning at 4K, with similar performance otherwise. That's kinda sad, as usually the 70 series would trade blows with the previous 80 Ti.

  • @007TheReaper007
    @007TheReaper007 1 year ago +3

    Try the newest version of Afterburner (4.6.5). It has added support for 40 series.

  • @mattjohnston9131
    @mattjohnston9131 1 year ago +1

    0:25 🤣that was me about 2 months ago. I built a rig with 14 Corsair fans and 3 iCUE commanders. Took a while to get the cabling organized. Makes me curious as to what they have coming! 🤔

  • @southchum101
    @southchum101 1 year ago +1

    The 4070 Ti was supposed to be a watered-down 4080 with less memory. This was already confirmed. They changed the labeling last minute when consumers complained, and that threw the whole lineup out of whack:
    4090
    4080
    4070 Ti (4080 12GB)
    4070

  • @SapphicBambi
    @SapphicBambi 1 year ago +6

    There was something I saw about performance on another channel:
    The previous gen's 80 series becomes the new gen's 60 series, e.g. the 1080 and 2080 roughly became the 2060 and 3060.
    This 4070 mirrors the performance of the 3080, lending more evidence that it should have been a 4060 card.

  • @daniels7281
    @daniels7281 9 months ago +4

    "the worrying thing about this is that there's almost room for them to slide a Super card in there"
    This comment aged very well

  • @cormoran2303
    @cormoran2303 1 year ago +10

    The 4070 is so gimped it even got rid of the 't' in 'got'.

    • @bami2
      @bami2 1 year ago +2

      They want to push you to get a 4070ti if you want more t's

    • @zdspider6778
      @zdspider6778 1 year ago

      The "t" is a DLC.

  • @UrueWhisperwind
    @UrueWhisperwind 1 year ago +1

    I worked my ass off and bought a 4080. Told myself to ignore the price tag and just keep saving. I couldn't be happier performance-wise; the card is a beast. But not everyone can pay that much, in fact most can't, which sucks.

  • @mikem4371
    @mikem4371 1 year ago +1

    I really have no complaints about my new 4070. That being said, I upgraded from a 1080, and just let me say: wow, it's the difference between night and day for me. I don't really understand why people complain so much; it's good for what I want to use it for.

  • @inneedofmedication
    @inneedofmedication 1 year ago +3

    Yeah, the 4070 is bad vs the 4070 Ti: 20% more performance for 25% more price. Yeah, this comment is on a video that is 5 months old, but when you're talking about price for performance, the 4070 is better per dollar. It's not really a fair comparison when reviews don't take prices into account. If price isn't a factor, the 4070 Ti is better; if price is a factor, you get more bang for your buck with the 4070. With the GPU prices of the last 5 years or so, price is a huge selling point.
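The value argument in this comment reduces to a ratio: +20% performance for +25% price is fewer frames per dollar. A minimal check using the commenter's deltas (the 100 fps / $600 baseline is arbitrary):

```python
def fps_per_dollar(fps: float, price: float) -> float:
    """Frames per second delivered per dollar spent."""
    return fps / price

# Baseline 4070 (arbitrary 100 fps index at $600); the 4070 Ti gets the
# commenter's +20% performance and +25% price deltas applied.
rtx_4070 = fps_per_dollar(100, 600)
rtx_4070_ti = fps_per_dollar(100 * 1.20, 600 * 1.25)

# 1.20 / 1.25 = 0.96, i.e. the Ti delivers ~4% less performance per dollar.
print(round(rtx_4070_ti / rtx_4070, 2))  # 0.96
```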

  • @DeadPhoenix86DP
    @DeadPhoenix86DP 1 year ago +4

    I have an RTX 4070 as well, but I can adjust the voltage slider just fine. I have mine running at 3050MHz, so that's a 150MHz overclock on the core clock and 1000MHz on the memory clock. No stability issues.

    • @joegreezy
      @joegreezy 1 year ago +1

      That's crazy, bruh, you paid $600+ for a 192-bit bus 😂

    • @BlackJesus8463
      @BlackJesus8463 1 year ago

      It's a Founder's Edition bruh. Get help.

    • @DeadPhoenix86DP
      @DeadPhoenix86DP 1 year ago

      @@BlackJesus8463 They're all using the same boards. Plus you get help.

    • @VoldoronGaming
      @VoldoronGaming 1 year ago

      You should have gotten a 6950xt

    • @DeadPhoenix86DP
      @DeadPhoenix86DP 1 year ago +2

      @@joegreezy It's my money; I'll use it how I please. I'm not paying $1000+ for a 4070 Ti. $650 was my max budget, and I was looking for a GPU with 3080 performance.

  • @StevenRogers-mero909
    @StevenRogers-mero909 1 year ago +14

    Sounds like I'll be sticking with my 3080 for a while longer.

    • @ashleyobrien4937
      @ashleyobrien4937 1 year ago +2

      you absolutely should, for a LOT longer...

    • @delboy6384
      @delboy6384 1 year ago +1

      For sure; we have a 3080 & 3090 in our house and they are going nowhere.

    • @4Leka
      @4Leka 1 year ago +1

      Me too with my second-hand 2080 Ti. Was planning on getting the 3080 before its price jumped and now it just makes no sense to upgrade to it. The 40 series might as well not exist until the 4080 sells cheap second-hand.

    • @scriptkiddies3727
      @scriptkiddies3727 1 year ago

      @@ashleyobrien4937 100%

  • @eppsislike
    @eppsislike 1 year ago

    I'm always on the move and really enjoy GeForce Now, but I decided not to go for the 4080-tier subscription as I did not see the point of it, considering that I can game at 4K 60FPS with DTS on the 3080-tier stream subscription. Glad I was right.

  • @LordAshura
    @LordAshura 1 year ago +2

    The RTX 40 series (with the exception of the 4090/4080) is basically the RTX 20 series: minor performance boosts in exchange for a new early-beta feature, while charging a major premium over the previous gen.

  • @zarodkiewicz
    @zarodkiewicz 1 year ago +5

    The performance difference is massive, but the price difference is massive too. Checked Amazon earlier today: the cheapest 4070 is £560 and the 4070 Ti starts at £850 here in the UK. I went for the 4070 as it's enough, but I would have gone for the 4070 Ti if it was cheaper.

  • @Xiddahmoto
    @Xiddahmoto 1 year ago +7

    When the 4060 releases, I would love to see a power difference comparison against the 30 series cards. At this point I wouldn't be surprised if the 4090 ends up being this gen's 3090 while the 4060 is only comparable to, like, a 2080 Ti.

    • @nope6471
      @nope6471 1 year ago +3

      Yeah, a 2080 Ti with 8GB of VRAM. But hey, DLSS BRO! DLSS IS AN AMAZING MIRACLE!

  • @3rdWorldGamer
    @3rdWorldGamer 1 year ago +5

    So glad I got my 3080 12GB model last August for $800 after taxes and shipping outside the US. This is getting ridiculous. It took me 12 years to upgrade, and I feel I made the best choice at the best possible time.
    Edit: I was coming from a borrowed EVGA 970 (after my HD 7950 burnt out) and an FX-8320.

  • @Thatguy-tb9pw
    @Thatguy-tb9pw 6 months ago +1

    I overclocked mine; it works fantastic.

  • @Sil2ntScott
    @Sil2ntScott 1 year ago

    😂😂😂😂 Jay's "slimy mother" ending was priceless and hilariously funny

  • @AlexRubio
    @AlexRubio 1 year ago +5

    I screwed up getting the 3070, but then again, that was during the mining boom.

    • @RobertFromEarth
      @RobertFromEarth 1 year ago +3

      Yep, me too. Didn't listen to the warnings about its 8GB VRAM buffer. I'm OK with the fact that every year new games will demand more GPU power, so the framerate drops a bit, but lowering textures? ...Nope.

    • @AlexRubio
      @AlexRubio 1 year ago +1

      @@RobertFromEarth Lol, I'm with ya

  • @thatzaliasguy
    @thatzaliasguy 1 year ago +3

    Nvidia is smoking meth. This was *supposed* to be the 4070 Ti (as the current 4070 Ti was originally the "4080 12GB"). A 60-class GPU priced as an 80-class, weaker than last gen in terms of raw power; "But guys, it's better with DLSS 3, which we locked behind the firmware of our new gen!"

  • @anthonygoldie6961
    @anthonygoldie6961 1 year ago +7

    Thanks for taking the trouble to do things like this, Jay; it really helps a lot of us and is really appreciated. Even though I already purchased a new graphics card recently, I still found this very useful to help warn my friend not to waste his money when buying a new graphics card for his son on a limited budget. It's not right to slam people who can just afford something decent with such a huge performance gap.

  • @LonerJoe
    @LonerJoe 1 year ago

    I'm a 71 year old, Jay. I've watched these videos for 100 years... Well... I forgot how many years. Thanks for the great content.

  • @hesher4life
    @hesher4life 1 year ago

    I just bought a EVGA FTW3 Ultra 3080 for $700 CAD ($515 USD) used. It has maybe one year mileage on it. Very content with my purchase.

  • @denverbasshead
    @denverbasshead 1 year ago +11

    Jay has it backwards, the 4070 is a 4060 class card they are upselling you lol

    • @raresmacovei8382
      @raresmacovei8382 1 year ago +2

      It's a 4050 class of card. The 4070 Ti was 4060, 4080 was 4060 Ti.

    • @RayanMADAO
      @RayanMADAO 1 year ago

      ​@@raresmacovei8382 by your logic a 4050 should match a 3080? What

    • @raresmacovei8382
      @raresmacovei8382 1 year ago +2

      @@RayanMADAO No, going by core count vs 4090.

    • @DeadPhoenix86DP
      @DeadPhoenix86DP 1 year ago

      @@raresmacovei8382 Yeah...No.

    • @RayanMADAO
      @RayanMADAO 1 year ago +1

      @@raresmacovei8382 you can't compare core counts between generations, what....
      You compare performance
      It's like comparing bus speeds when they increased cache by 3x this generation (meaning the bus isn't needed as much with local cache)

  • @vigilante9259
    @vigilante9259 1 year ago +6

    I think this has to do with the 4070ti originally being the 4080 12gb, so it's higher in performance as it was categorized as an 80 card.

    • @patrickmoody33
      @patrickmoody33 1 year ago

      I completely agree. Not sure why this was not mentioned

  • @EmpathVibe
    @EmpathVibe 1 year ago +5

    I've been Nvidia and Intel for a long, long time. Recently, I picked up an AMD CPU and I was blown away by the performance and lower wattage all around. Looks like I'll be going with AMD for my GPU next as well.

    • @Tom-sd2vi
      @Tom-sd2vi 1 year ago

      Same. Doubting between rx6950xt and rx7800xt

    • @disguiseddv8ant486
      @disguiseddv8ant486 1 year ago

      And what AMD CPU did you pick up that blew you away that you forgot to mention?

  • @rynosraceroom66
    @rynosraceroom66 1 year ago +1

    Open test bench looks great! Hell, it all does. Awesome job guys.

  • @windyrunner9191
    @windyrunner9191 1 year ago

    I'm so happy I didn't wait for this. I got a used 3080 Ti with 12GB of VRAM for 600 USD.
    The temps are perfect

  • @V.D.22
    @V.D.22 1 year ago +4

    I just got a Sapphire 6950XT Nitro+ last week. I never had an AMD card before. This card is a beast... runs everything at high speed. The only problem is the heat it generates (280W) but with an optimised airflow, it should be ok.

    • @denisbaz6682
      @denisbaz6682 1 year ago

      Yeah, and it's good only for games, no 3D, no nothing

    • @V.D.22
      @V.D.22 1 year ago

      @@denisbaz6682 I wouldn't know. I only game with it.

  • @quittessa1409
    @quittessa1409 1 year ago +8

    I think the 4070 is exactly where Nvidia originally planned it to be, and that there was supposed to be a 4070 Ti between it and the 4080 12GB. When that got renamed to the 4070 Ti instead, they basically dropped the middle card, as I'm betting they already had a lot of the coolers, cards etc. built for the 4070, and the middle-card Ti variant was to be made later. I could see it coming up as a 4070 Super tbh

    • @nlflint
      @nlflint 1 year ago

      My thoughts too while watching this. The extra-wide performance gap is a result of the 4080 12gb "unlaunch".

    • @4Leka
      @4Leka 1 year ago

      nVidia is really just causing themselves extra headaches by compressing their naming scheme. Why does every usable card have to be either 60, 70, 80 or 90?
      Why not 4010, 4020, 4030, 4040, 4050, 4060, 4070, 4080 and 4090? Then push the Ti models as refreshes as they used to do.

    • @jamesbyrd3740
      @jamesbyrd3740 1 year ago

      @@nlflint isn't the 4070ti considerably worse than the 4080 though?

  • @Xenoray1
    @Xenoray1 1 year ago +4

    so what is coming for corsair??? wireless rgb?!

    • @csabauri351
      @csabauri351 1 year ago

      101% daisy-chainable, no-wire fans like Unifans / D30 etc.

    • @D_8KEN
      @D_8KEN 1 year ago

      Lian Li style connections I'd imagine. Or a single fan / RGB connector?

    • @N3KO_79
      @N3KO_79 1 year ago

      Maybe wireless wiring 😂🤭🙃

  • @robdeezle
    @robdeezle 1 year ago

    I was going to get the 3070 too, but I chose the 7900 XT and my god I'm glad I did; Atomic Heart used 14GB of VRAM while maxed out, 165Hz, not one dropped frame

  • @grizzbgaming
    @grizzbgaming 1 year ago

    Wish I would have watched this before building my pc. I was going for the 4080 or 4090 but ran low on money, so I went with the next one down because it was half the price of the 4090. Lesson learned! I will just have to replace it when I get the money.

  • @georgioszampoukis1966
    @georgioszampoukis1966 1 year ago +8

    I agree with every point made in the video; however, to be fair, a 7.4% performance increase by overclocking is actually pretty decent, especially considering this is an FE card. The 4070 would have been a very good deal at $450-500

    • @2nd_Directorate
      @2nd_Directorate 1 year ago +1

      Still not a good deal. Decent at best. The "4070" is basically just a renamed 4060 which should at best have a price tag of $400. And ask yourself, would $400 be a good price for a 4060?

    • @CanIHasThisName
      @CanIHasThisName 1 year ago

      @@2nd_Directorate You need to take actual performance into account, and at $400 it would be an amazing deal.

    • @2nd_Directorate
      @2nd_Directorate 1 year ago +5

      @@CanIHasThisName No you don't. That is what Nvidia wants you to believe. You just need to take R&D and the production cost into account. And suddenly (as the R&D cost of the graphics unit division hasn't changed significantly in 5 years) it is not that good of a deal anymore.

    • @munteanucatalin9833
      @munteanucatalin9833 1 year ago

      Not with those 12GB of VRAM... 1080p cards should cost no more than $400

    • @CanIHasThisName
      @CanIHasThisName 1 year ago +1

      @@2nd_Directorate That's the dumbest take I've seen in a while.

  • @haiokthebeast
    @haiokthebeast 1 year ago +4

    Title typo lol

  • @Johnwick-ed7vo
    @Johnwick-ed7vo 1 year ago +4

    If you're disappointed now, just think about next release season. Their best offering will be a 4080 at best. I feel Nvidia has turned its back on what helped build them up; same for AMD. My hope is that Intel keeps their general-consumer GPU lines going. I like what they've got so far, and I hope they can better it next time.

    • @onomatopoeia162003
      @onomatopoeia162003 1 year ago

      hope for the next release. Then never buy anything.... hate how that works.

  • @cracklingice
    @cracklingice 1 year ago +2

    This card is already just an overclocked 60 class card so it's not like there's any reason to expect much more in the tank. The '12GB 4080 / 4070 Ti' is the real 4070.

  • @CopperRavenProductions
    @CopperRavenProductions 1 year ago

    This reminds me of Apple's mini PCs, where they soldered in the RAM so you couldn't replace it with better RAM, forcing you to pay a lot more for higher and faster gigs. I am truly despising these dirty tactics

  • @ZosoMan87
    @ZosoMan87 1 year ago +5

    I didn't know there was such a gulf between the two, it's insane. I was lucky enough to find a 4070 Ti for $635 open box at my local Microcenter, upgraded from a 2070 Super. So it's a massive upgrade for me, but I would not have bought it at MSRP.

  • @SharkRoach1
    @SharkRoach1 1 year ago +4

    I'm going with the RTX 4070 because it's the only card that doesn't draw more than 200W, has a single 8-pin connection, isn't the size of a cinder block, and gets exactly double the frame rate my 3-year-old RTX 2060 on mature drivers gets.
    I'm not sad and I won't regret it.

    • @fernandoferraz4146
      @fernandoferraz4146 1 year ago

      Going to buy it for the same reasons. I'm coming from an R9 380, so it's going to be a huge leap in performance

    • @SharkRoach1
      @SharkRoach1 1 year ago

      @@fernandoferraz4146 I got a Zotac Twin Edge OC and it's great. OCs to 250+ on the core before certain games crash after 10 minutes.

  • @UranusfromBrussels
    @UranusfromBrussels 1 year ago +1

    I wanted to upgrade my 1080ti, guess I will wait another generation. 6950xt not an option on 650w PSU 😕

  • @NIGHTSTALKER0069
    @NIGHTSTALKER0069 1 year ago +4

    I will be sticking with my 1060 and 3070 Ti in my slow but great performance for the buck machines.

    • @morpheus9137
      @morpheus9137 1 year ago +1

      I'm sticking with my GTX 770

    • @DeadPhoenix86DP
      @DeadPhoenix86DP 1 year ago

      @@morpheus9137 GTX 480.

    • @brugj03
      @brugj03 1 year ago +1

      You should be buying a 1 dollar card then.
      Gives you one fps but hey look at the performance for the buck.
      Just insane.

  • @blake9463
    @blake9463 1 year ago +3

    To be honest, compared to my old 3060 my 4070 has been a godsend. I do agree that the prices on the 40 series are a tad higher than what they should be

  • @TheCgOrion
    @TheCgOrion 1 year ago +3

    I believe the 4070 was always supposed to be the 4070, and they just renamed the 4080 12GB to the 4070 Ti. The main reason is Nvidia often released non Ti cards, and then later the Ti cards. In other words, originally the "4080 12GB" was going to stay called that, and eventually a "4070 Ti" was going to be slotted in between.

  • @SomersetUpNorth
    @SomersetUpNorth 1 year ago +1

    Buyer's remorse is real 😢 Though I have upgraded from an Asus RX 480 8GB OC to a Zotac 4070 Trinity, I'm really starting to think I should have waited a few extra months before pulling the trigger. It's not like I haven't waited 6 years to upgrade. Though I must admit going from a lower tier to the 4070 has really impressed me, but apparently I was small-minded

  • @TheBrister123
    @TheBrister123 1 year ago +2

    If I am remembering correctly, there was a significant gap between the 3060 and the 3060ti? I was actually hoping for that same gap with the rest of the 3k series cause a 10% gain does not seem to be worth jumping for.

    • @Nauzhror1216
      @Nauzhror1216 1 year ago +2

      It was a similar gap, yes.
      That Cyberpunk scenario for example: 81 vs 104 is a 28.4% lead for the 4070 Ti.
      3060 is 39 fps, 3060 Ti is 52 fps. That's a 33.33% lead for the 3060 Ti.
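The percentage leads quoted in that reply can be checked in a couple of lines. A minimal sketch (the `pct_lead` helper is just for illustration, using the fps figures from the reply above):

```python
def pct_lead(faster, slower):
    """Percentage lead of the faster card over the slower one."""
    return (faster - slower) / slower * 100

# Cyberpunk: 4070 Ti (104 fps) vs 4070 (81 fps)
print(round(pct_lead(104, 81), 1))  # -> 28.4
# 3060 Ti (52 fps) vs 3060 (39 fps)
print(round(pct_lead(52, 39), 2))   # -> 33.33
```

So the 4070/4070 Ti gap really is in the same ballpark as the 3060/3060 Ti gap, as the reply claims.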

  • @duckilythelovely3040
    @duckilythelovely3040 1 year ago +4

    It's disappointing because you're doing the wrong thing.
    What this card is amazing at, is UNDERVOLTING.
    I'm literally sipping 165 watts of power with no performance loss.
    This card is amazing, and great at many things.
    Staying ice cold.
    Zero noise
    Great performance per watt
    and the ability to undervolt like an absolute CHAMP.
    Loving mine!!
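For context on the undervolting claim above: against the 200 W maximum draw another commenter quotes for this card, the 165 W reported here works out to a sizeable reduction. A quick sketch (the `power_saving_pct` helper is just for illustration):

```python
def power_saving_pct(stock_w, undervolted_w):
    """Percentage reduction in board power after undervolting."""
    return (stock_w - undervolted_w) * 100 / stock_w

# 200 W stock draw (quoted elsewhere in the thread) vs the 165 W reported here
print(power_saving_pct(200, 165))  # -> 17.5
```

A 17.5% power cut with little performance loss is plausible for Ada cards, which tend to ship well above their efficiency sweet spot.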

    • @NoobProTV
      @NoobProTV 1 year ago +3

      Bro, how do you get zero noise? My Founders Edition RTX 4070 spins up 5 times per hour for 5 minutes each during office desktop use, which is audible through the whole room

    • @duckilythelovely3040
      @duckilythelovely3040 1 year ago

      @@NoobProTV Wtf?! LMAOOO
      I got the cheap MSI Ventus which costs the same as the default card, has 3 fans and a large heatsink (longer).
      Mine never goes above 55C in gaming. It's ice cold with zero fan noise. I'd return that one and get the MSI Ventus, do yourself a favor.
      Also undervolt your card, seriously. They undervolt VERY well.

    • @DeadNoob451
      @DeadNoob451 1 year ago

      No surprise that if you ignore the performance, or the price, or the price-to-performance ratio, then yeah, it's pretty amazing.

  • @JoonYerr
    @JoonYerr 1 year ago +6

    Just got more disappointing