It’s All Over For The RTX 3070

  • Published: 26 Jul 2024
  • Benchmarking the nVidia Geforce RTX 3070 against 13 popular and recent games on an average gaming PC in 2023.
    With the recent release of the RTX 4070, its predecessor is now on its way into the GPU Archive. Thanks to its 8GB frame buffer, this midrange Geforce RTX 3070 may struggle to remain relevant in an era of games ported from the 10GB XBOX Series X and 16GB PS5. In this video, I've compared the 8GB 3070 against some recent titles to see how it's holding up, and whether it's time to think about an upgrade.
    Pick up USED on eBay:
    ebay.us/KucNNz
    Pick up this Gigabyte model NEW on Amazon:
    amzn.to/3KEuK5Q (UK)
    Music by Backing Track backingtrack.gg/
    00:00 nVidia Geforce RTX 3070 vs. 2023
    00:43 Background: History of the RTX 3070
    03:05 Test System: Ryzen 5 5600X, 32GB DDR4-3600
    03:36 Gaming Benchmarks
    03:39 Benchmarks: The Last of Us
    05:06 Benchmarks: Forspoken
    06:32 Benchmarks: Forza Horizon 5
    07:34 Benchmarks: Halo Infinite
    08:32 Benchmarks: A Plague Tale: Requiem
    09:20 Benchmarks: Marvel’s Spider-Man
    10:24 Benchmarks: Uncharted 4
    11:02 Benchmarks: God of War
    11:34 Benchmarks: Resident Evil 4
    13:02 Benchmarks: Cyberpunk 2077
    15:12 Benchmarks: The Witcher 3 Remastered
    16:04 Benchmarks: Fortnite
    17:06 Benchmarks: Warzone 2.0
    17:59 Benchmarks: Time Spy, Fire Strike
    18:04 Conclusion: Is the RTX 3070 still worth it in 2023?
    A D S
    The following are affiliate links. Everything costs the same for you, but I get a commission. Please only click these links if you want to help me out!
    Get cheap game keys @ CDKeys: bit.ly/3c0Tu6j
    G E A R
    Audio:
    Behringer UM2 amzn.to/3aV17wq
    Behringer XM8500 amzn.to/3nCt2WY
    Generic microphone arm amzn.to/339yZRC
    Video:
    Fujifilm X-T3 amzn.to/3nGapRY
    Camera & 18-55mm lens kit: • Trading a PS4 for a PC...
    Fujifilm 23mm f2 amzn.to/3udXvgn
    FeiyuTech G6 Max Gimbal amzn.to/3nHndYs
    Interfit LM8 Video LED amzn.to/2SkESJL
    Neewer 120cm Softbox amzn.to/2RjlMDm
    Raleno Rechargeable Bicolour Video LED Panel amzn.to/3f2l7hV
    C O N T A C T
    Twitter/Insta @IcebergTop10s
    Email icebergtop10s@gmail.com
    #Geforce #nVidia #scalperpandemic
  • Science

Comments • 983

  • @user-di7bb4yk1y
    @user-di7bb4yk1y 1 year ago +1278

    Calling a two-year-old card "dead" is a big deal for a channel that reviews 10-year-old cards.

    • @R3TR0J4N
      @R3TR0J4N 1 year ago +23

      I lol'd

    • @BertieJasokie
      @BertieJasokie 1 year ago +138

      Well said. These VRAM-limited 30 series cards are gonna age badly.

    • @Negiku
      @Negiku 1 year ago +104

      @@BertieJasokie Most of the top 10-20 most popular games will run on 2-4GB vram just fine for a pretty long time, as they always have.
      I wholeheartedly agree that nvidia shouldn't be so stingy with their vram, though.

    • @IvanIvanov-ni4rs
      @IvanIvanov-ni4rs 1 year ago +11

      Ampoo ded.

    • @BertieJasokie
      @BertieJasokie 1 year ago +81

      @@Negiku If I paid top dollar for a decent GPU during scalpocalypse, I'd be pissed. Meanwhile 4-8gigs is fine, IF it's cheap, for the rest of us.

  • @theburger_king
    @theburger_king 1 year ago +127

    It’s hard watching people trashing on graphics cards that are like 50 times faster than your own

    • @GraveUypo
      @GraveUypo 1 year ago +15

      you know, a 3070 isn't really an unobtainable prize relic. it's just a shitty card that is going to the grave early because nvidia skimped on VRAM. also my 6900xt is way faster than a 3070 and it was ironically cheaper than one at the time, lol.

    • @zzzjz8437
      @zzzjz8437 1 year ago +4

      Yea meanwhile I'm on a Quadro K1000M (2gb ddr3 vram, 128-bit bus and around 326 gflops fp32)

    • @theburger_king
      @theburger_king 1 year ago +3

      @@zzzjz8437 at least my 2gb is ddr5 😞

    • @misterpinkandyellow74
      @misterpinkandyellow74 1 year ago +2

      I remember when I had a shitty PC. Don't worry, you can still have lots of fun with it, I know I did.

    • @user-mn1sm4pj2d
      @user-mn1sm4pj2d 1 month ago +1

      This video is exactly for people who are looking to upgrade their low-end gpus on a reasonable budget, now that the used market is full of accessible gpus far below their msrp, and it shows that the 3070 is a bad option for those purposes. In my area, the 3070 and even the 3060 12gb cost more used than a used 2080ti. Now that's a perfect gpu to sit out a few years with, without sacrificing game quality much at 1440p, and see what happens with the new cards. At worst, you enable dlss quality.
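
For context on the ~326 GFLOPS figure quoted in this thread: peak FP32 throughput is usually estimated as shader count × 2 (a fused multiply-add counts as two operations per clock) × clock speed. A minimal sketch in Python, assuming the commonly listed Quadro K1000M specs of 192 CUDA cores at roughly 850 MHz:

```python
def fp32_gflops(shaders: int, clock_mhz: float, ops_per_clock: int = 2) -> float:
    """Peak FP32 GFLOPS estimate: shaders x ops per clock (FMA = 2) x clock."""
    return shaders * ops_per_clock * clock_mhz / 1000.0

# Quadro K1000M: 192 CUDA cores at ~850 MHz (commonly listed figures, not vendor-verified)
print(fp32_gflops(192, 850))  # ~326.4, matching the number quoted above
```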

  • @bulutcagdas1071
    @bulutcagdas1071 1 year ago +803

    It's mind boggling that the 3060 is going to outlast the 3060 Ti and 3070

    • @Verpal
      @Verpal 1 year ago +94

      With how NVIDIA keeps improving DLSS, the 3060 will just get better and better, since DLSS can improve its raster performance, and it has enough VRAM for textures to keep up.

    • @sharathvasudev
      @sharathvasudev 1 year ago +20

      ​@@Verpal proud owner here

    • @xFluing
      @xFluing 1 year ago +67

      @@Verpal or just drop the resolution. literally. it's all dlss does, it's not some magic free performance tech
      ok yes it sharpens and upscales the image too but it can only GUESS the lost detail of the lower resolution

    • @Verpal
      @Verpal 1 year ago +102

      @@xFluing I am not entirely sure why you would care about the rendering scale and how it reaches the final image. If the final image is good, just use the technology, no need to be allergic to it, or call it magic.

    • @gytux0258
      @gytux0258 1 year ago +30

      @@Verpal It introduces additional artifacts.
      It's not as good as native. But neat.

  • @alrightfritz9692
    @alrightfritz9692 1 year ago +136

    Kinda insane to me that the 3070 has the same amount of VRAM as the 1070

    • @XX-_-XX420
      @XX-_-XX420 1 year ago +27

      It has the same as the R9 290X lmao. It's just sad.

    • @notfunny3397
      @notfunny3397 8 months ago +9

      Same as the RX 470, a 40 dollar card.
      Even that card can benefit from 8gb vs 4gb

    • @kristiannygard9141
      @kristiannygard9141 8 months ago +1

      @@XX-_-XX420 That's the card I have lol.
      Haven't actually replaced it yet either.
      But the time will soon come sadly, it has served me pretty well.

    • @Wreckedftfoxy
      @Wreckedftfoxy 3 months ago +1

      and the 3050 and the 3060 (8gb)

    • @radiofloyd2359
      @radiofloyd2359 24 days ago +1

      Pretty sure it's faster, though, as it's GDDR6 technology rather than GDDR5🤷

  • @KingLamink
    @KingLamink 1 year ago +51

    I had a 3070 too, upgraded from a 970 because I was burned by vram. Got burned again, and sold it on and bought a 6900xt. Never looked back and never been happier with a card

    • @jasonandrews7355
      @jasonandrews7355 1 year ago +1

      I never really found my 970 vram limited, personally. I get the controversy, but it never actually bit me, and I had the card until early 2022!

    • @mikebruzzone9570
      @mikebruzzone9570 1 year ago

      mb

    • @docmars
      @docmars 1 year ago +1

      Do you feel like FSR 2.0 is giving you what you want out of the card? DLSS has always been like 1.5x-2x better for scaling tech. Or do you find yourself running games more often at native resolution without any scaling?

    • @KingLamink
      @KingLamink 1 year ago +7

      @@docmars because of the extra power and vram I usually run at native 4k. But I do sometimes use fsr 1.0 and 2.0 depending on the game.
      Since I play at 4k, there are actually situations where I prefer fsr 1.0 over both dlss and fsr 2.0, as there is enough data at 4k for fsr 1.0 to reconstruct the image very well and it doesn't have any ghosting, which can be quite apparent with FSR 2.0 and dlss (such as in dying light 2).
      However in other games, such as cyberpunk, I really don't mind fsr 2.0 and don't miss dlss one bit. Whilst it may be better, unless you are directly comparing them you'll not really notice a difference. The other benefits of the card (lower power draw Vs similar power Nvidia cards of the generation, significantly more vram, much nicer driver UI, more useful feature set for what I do) vastly outweigh the minor benefits dlss brings me over fsr

    • @XX-_-XX420
      @XX-_-XX420 1 year ago

      Same, I had a 3070ti, which I got from trading my 3060ti to a miner. But it had constant Vram issues when the 3070ti had just come out. Sold it to a poor soul for 1100 (it was bidding, so I mean they came up with the amount themselves) and after that I went to AMD to get a 6700XT to have a much faster and more efficient card. Also sold that one, but after this gen I think I'm skipping Nvidia for the next 5 generations, and AMD too if possible, RX7000 is terrible so far. Really hope intel can compete well.

  • @gurnery4219
    @gurnery4219 1 year ago +191

    I've never been happier owning my 6700xt.
    It's been the best bang-for-my-buck card I've ever purchased. It's a shame NVIDIA are being pathetic with their pricing. Would love comparisons between the 3070, 6700xt, 6800xt and the 3060 (due to the bigger vram count). I would love NVIDIA to bring me back in, as I had a 2060 and a 1080TI and I loved using Broadcast and RTX Voice, but I'm not paying more for the same or less performance.

    • @eclipsegst9419
      @eclipsegst9419 1 year ago +40

      I had a lot of people tell me i was silly to think i would need 12gb for 1080 ultrawide within the next 4-5 years. Boy do i have a lot of people to tell "i told you so"

    • @gurnery4219
      @gurnery4219 1 year ago +13

      @@eclipsegst9419 exactly the same! The main reason I got the 6700xt was to use it for ultrawide, but even my 16:9 1440p friends are struggling atm with the vram limitation. I really hope AMD bring out a well-priced range of cards next time out. NVIDIA need to be priced out for a wake up call

    • @eclipsegst9419
      @eclipsegst9419 1 year ago +2

      @@gurnery4219 Pricing and better RT performance. Intel getting a competitive lineup out too would also be a big help. Personally I only turn on RT to medium or so on single player games and it's fine, looks good, stays over 60. But a lot of people just have to see those bars on the chart where they want them.

    • @feryfrost8572
      @feryfrost8572 1 year ago +5

      What PSU are you using for the RX 6700XT?
      I'm planning to buy this card as well with a 550W PSU

    • @gurnery4219
      @gurnery4219 1 year ago +4

      @@feryfrost8572 You *should* be fine with a 550W, but I went with 650W to give me some leeway. I'd advise spending a little extra money on a 650W just in case.

  • @drkRoss89
    @drkRoss89 1 year ago +25

    When it came time to upgrade from my 1070TI, I never saw the point of going from an 8GB GPU in 2017 to a new 8GB GPU in 2022. For that reason, I snapped up an AMD Radeon 6700XT due to the extra VRAM and the fact it was around £100 cheaper than the 3070 at the time.

  • @tollph3314
    @tollph3314 1 year ago +241

    It's the 8GB of VRAM that absolutely limits the capabilities of this card

    • @Lucas-uo9ml
      @Lucas-uo9ml 1 year ago +39

      99% of gamers don't care about vram, my 3070 runs all my AAA games perfectly in 4k.

    • @Vfl666
      @Vfl666 1 year ago +78

      @@Lucas-uo9ml well if the game stutters like shite then they do care and not everyone wants to play with low textures.

    • @ImJusSha
      @ImJusSha 1 year ago +13

      @@Vfl666 A 3070 stuttering? That doesn't happen at all. Y'all have to understand there's a difference between a lot of vram vs fast vram

    • @tollph3314
      @tollph3314 1 year ago +42

      @@Lucas-uo9ml that's a lie, you can't run all of them, especially the newest ones, and consider that the next 1-2 years are going to get more demanding. The 3070 is already struggling at 1080p at maximum or even high quality settings in some titles with not enough VRAM: textures not fully loading, fps drops, frametime issues and so on. Even with something like Hogwarts Legacy you can't play smoothly at maximum settings at 1080p, forget 1440p or 4k, you are out of VRAM. It's been shown all over the web, you have to lower certain settings to even get smooth gameplay at 1080p, which is bad considering the card is not that old and wasn't cheap at all. Just think about the fact that the 1070 from 2016 already had 8GB VRAM. Nvidia did this on purpose so consumers are forced to upgrade sooner and spend more money....

    • @tollph3314
      @tollph3314 1 year ago +13

      @@ImJusSha sure, but it doesn't change the reality that 8GB VRAM is beginning to be the minimum for 1080p gaming, forget 1440p or 4k in new AAA titles. The card has enough raw performance and its Achilles' heel is just not enough VRAM, THAT'S the whole issue

  • @oatsaredelicious2521
    @oatsaredelicious2521 1 year ago +96

    Spider-Man Remastered is often VRAM limited, especially with RT, even if RTSS/Afterburner doesn't say so. The game just automatically "switches" to system ram, which is slower and therefore causes those frametime spikes. Happens to me on a 3080 with RT and Textures on Very High, but the spikes stop if Textures are turned down to High.

    • @IcebergTech
      @IcebergTech  1 year ago +34

      That makes a lot of sense, thanks!

    • @dazdaz2050
      @dazdaz2050 1 year ago +3

      Thanks for that bit of info, it helps with the comment I posted. I have your GPU and am trying to see how excess ram and tuned ram help when you're running out of Vram, as I have both. The life of my 3080 will depend on it, but I'd rather stick with it than sell up and find an extra £500 to get another card, prices are ridiculous.

    • @PhoenixSky7
      @PhoenixSky7 1 year ago +1

      Some games like Horizon Zero Dawn don't like using system RAM for textures, you just get blurry, messed-up textures..

    • @conyo985
      @conyo985 1 year ago +2

      Yeah. I hate it when the VRAM usage shows it's below 8GB but the game is running out of VRAM because of the low framerates and low power usage. Happens to me in Uncharted at 4K even with DLSS.
      I don't get game engines. There are games that will use up to 7.9 GB in the VRAM monitoring with no problems but some games will only show 7GB usage but it's clearly running out of VRAM because of the stutters and texture swapping.

    • @selohcin
      @selohcin 1 year ago +1

      @@conyo985 No joke. PC games need an industry standard way of handling this and it should be publicly announced.
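
A practical way to watch for the behaviour described in this thread is to poll dedicated VRAM usage while playing. A minimal sketch using the NVML Python bindings (pip install nvidia-ml-py); the caveat raised above still applies, since NVML reports what is allocated on the GPU, not whether the driver has already started evicting resources to system RAM, so frametime spikes can appear before this counter ever reads full:

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .used/.total in bytes
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```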

  • @mihaylov131
    @mihaylov131 1 year ago +145

    I think the 12GB 4070 will have almost the same problems in 2025.

    • @erikbritz8095
      @erikbritz8095 1 year ago +10

      Yup, in 2020 I preached and harped that these 8-gig and 10-gig cards won't survive long, and wow, I'm already proven right. Albeit for a budget build a 10-gig to 16-gig gpu will be quite good.

    • @wnxdafriz
      @wnxdafriz 1 year ago +5

      It's the minimum, the ps5 and series x give that much, so it will last at least until they're not supported.... the question will be if it will look good though / 2028 is when you will see more of an issue, I bet

    • @XX-_-XX420
      @XX-_-XX420 1 year ago +4

      I think at 1080p 12GB should be enough. For 1440p I'm honestly not sure I hope it will be but I guess time will tell.

    • @bias69
      @bias69 8 months ago

      consoles have 16gb of shared memory, so 12gb vram will be enough until new consoles come out

    • @isoid
      @isoid 7 months ago +3

      12gb seems like the minimum for future proofing at 1080p
      For 1440 it should really be at least 16gb, but for some reason the only affordable 40 series with that much is the 4060ti lmao

  • @thomaswest2583
    @thomaswest2583 1 year ago +13

    I don't know why people use raytracing. It's totally not worth losing over half your fps

    • @jskyg68
      @jskyg68 1 year ago +6

      Agreed, it only exists right now to sell high end (overpriced) cards that would never sell without it. Create the problem and the solution, marketing 101...

    • @truthdoesnotexist
      @truthdoesnotexist 28 days ago

      because it looks amazing?

    • @thomaswest2583
      @thomaswest2583 27 days ago +1

      @@truthdoesnotexist Not amazing enough to lose half your fps

  • @peppemartincastro1020
    @peppemartincastro1020 1 year ago +82

    I'd love to see how a 6700xt compares to the 3070

    • @jskyg68
      @jskyg68 1 year ago

      Just search "6700xt vs 3070" lots on benchmarks all over youtube.

    • @eclipsegst9419
      @eclipsegst9419 1 year ago +37

      from what i've seen, 3070 has slightly higher highs, but WAY WAY WAY lower lows. Bad frame drops.

    • @dew7025
      @dew7025 1 year ago +11

      @@eclipsegst9419 only when vram is exceeded

    • @selohcin
      @selohcin 1 year ago +11

      @@dew7025 Right, and that's going to be more and more common as time goes by. We've already had four games just this year that use 10GB of VRAM at 1080p, and the year's not even halfway over yet! The 3070 is done, and even the 3080's days will be cut short.

    • @dew7025
      @dew7025 1 year ago +4

      @@selohcin Lucky me, I only play old games that don't care about vram. I will upgrade when the 50 series comes out, since I have no reason to upgrade yet

  • @GTFour
    @GTFour 1 year ago +89

    3070 absolutely should have been a 16GB card

    • @papalazarou7880
      @papalazarou7880 1 year ago +40

      3070 should have been a 12GB card and the 3080 should have been a 16GB card.

    • @noobavage6119
      @noobavage6119 1 year ago +14

      @@papalazarou7880 This I agree with. Like no way them mf gave the base 3060 more vram than its supposed bigger and much stronger brothers the 3060ti, 3070, and 3070ti.

    • @conyo985
      @conyo985 1 year ago +14

      Yup Nvidia really gimped the 30 series VRAM. It should have been like this.
      3080 Ti 24GB
      3080 20GB
      3070 16GB
      3060 Ti 16GB
      3060 12GB
      3050 8GB.
      As you can see the 3060 having 12GB would have made sense.

    • @markomarkovic5729
      @markomarkovic5729 11 months ago

      @@conyo985 The 3070 should have had 10gb like the 3080, and there should have been only a 12gb version of the 3080. There are no issues with the 10gb 3080. But there's another thing: games are terribly optimized. I've played video games since the 90s and TLOU is something that really shocked me with how bad it was. Jedi Survivor is also terrible, but TLOU is the worst PC port I've ever seen.

  • @Dango-God
    @Dango-God 1 year ago +88

    I feel like these Ampere cards might end up facing the same issue Kepler did, which was also the low VRAM capacity while also being released right when new consoles did, which always increases the baseline for what you need in terms of specs. And, just like with GCN, AMD will likely age better in terms of gaming performance and stability.

    • @mikebruzzone9570
      @mikebruzzone9570 1 year ago +1

      mb

    • @scoopshort
      @scoopshort 9 months ago +1

      ada lovelace......

    • @789uio6y
      @789uio6y 6 months ago

      What is kepler? 900 series?

    • @Dango-God
      @Dango-God 6 months ago

      @@789uio6y 600 and 700 series. There weren't many differences, the GTX 680 is basically the same as a GTX 770 for example.

  • @vroomzoom4206
    @vroomzoom4206 1 year ago +244

    It's definitely not by accident that nVidia limits the vram in their cards so hard. I was considering a 4070ti, but decided to go for a 6900xt instead and am very happy that I did.

    • @hartsickdisciple
      @hartsickdisciple 1 year ago +27

      I don't think the 12GB cards will have the same issue the 8GB ones are facing. The games giving 8GB GPUs problems at high/ultra detail settings are almost entirely PS5 ports. The PS5 and Xbox Series X have 16GB of unified RAM, and sometimes well over 8GB is allocated as VRAM. This really shouldn't be a problem for 12GB cards, unless new consoles with more RAM are released in the next few years. A very specific set of circumstances led to the 8GB GPUs having this problem at this point in time.

    • @vroomzoom4206
      @vroomzoom4206 1 year ago +37

      @@hartsickdisciple we thought that about 8gb 2 years ago.

    • @hartsickdisciple
      @hartsickdisciple 1 year ago +9

      @@vroomzoom4206 Again, that was a different set of circumstances. The people who understood the specs of the PS5 and Xbox Series X could see that 8GB might not be enough fairly soon. We're already seeing what PS5 ports look like, and they're not giving 12GB GPUs problems. It's very likely that 12GB will be enough for native 1440p in 98% of games until the next console generation.

    • @steaksoldier
      @steaksoldier 1 year ago +4

      @@hartsickdisciple only the ps5 has unified ram. the xbox series x has 10gb of vram and 16gb of system ram. And also, just because the PS5 ports are the first to be a problem for 8gb card owners doesn't mean there won't be non-console ports in the near future that also need more than 8gb of vram

    • @hartsickdisciple
      @hartsickdisciple 1 year ago +12

      @@steaksoldier The Series X has 16GB of total RAM. 10GB of RAM running at full speed, and 6GB running at a bit lower speed. It's true that the 10GB is intended as VRAM, but the developers can use the RAM as they see fit.
      Of course it's possible that there could be PC exclusives which need more than 8gb of VRAM, but that hasn't been the issue so far. Right now it's obvious why 8gb isn't enough for high or max detail settings in these PS5 ports. It's a combination of the unified 16gb memory pool and lazy porting. It makes sense that 11 and 12gb cards aren't having these issues. The PS5 version developers probably have to keep at least 4-6gb out of that 16gb as system RAM. They're using around 10gb as VRAM.

  • @quirtthedirt
    @quirtthedirt 1 year ago +20

    I fought tooth and nail for my 3070, and it's served me admirably at 1440p. The vram is limiting in a couple of my games, but not to the point of it becoming a big concern. I do wish I had bought a 6800xt instead though, especially with how much they've improved in rasterization due to driver updates. I expect I'll keep the 3070 at least until there is a new AAA game out that is actually worth playing (which may be a while still) and from there I'll probably go 6950xt or maybe 7900xtx if they're in stock for a decent price.

    • @rdg665
      @rdg665 2 months ago

      If you are only gaming, then buy AMD. The only point of using Nvidia is if you use it not only for gaming but for rendering videos and 3d etc.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 year ago +7

    I played a lot of Far Cry 5's arcade mode with custom maps at 4K. My 3080 would sometimes run out of vram and the fps would drop like a rock; when I got the 3090, performance was always fine.

    • @Renekor
      @Renekor 1 year ago +1

      3080 10gb?

  • @daxitron
    @daxitron Год назад +11

    It's funny, NVIDIA's built-in obsolescence with my 3070 got me to upgrade... to AMD. Thanks team green!

    • @GeneralS1mba
      @GeneralS1mba 6 months ago

      It's also that they don't want people to have cheap cards for ai usage and other productivity, they made the overpriced 4060 ti 16 GB for that

    • @lennartj.8072
      @lennartj.8072 5 months ago +1

      Same, replaced mine with a 7900 XT.

  • @chexmixkitty
    @chexmixkitty 1 year ago +5

    I had this exact 3070. Good card, but I upgraded to the 6950XT when it hit $700. Wish I waited a couple months to snag it for less, but the OC Formula didn't drop by $100 like the XFX card did.

  • @emilguldmann6816
    @emilguldmann6816 1 year ago

    Waiting for a riser extension with extra vram on it.. Would be nice to have a way to chuck in some extra vram though, like a vram stick that you put in your sli/nvlink connector.

  • @nastyyoda5673
    @nastyyoda5673 Год назад +8

    RX 6800 looks like a better buy

    • @mtaufiqn5040
      @mtaufiqn5040 1 year ago +2

      Indeed, Hardware Unboxed has released a video pitting the rtx 3070 against the rx 6800 and the latter came out as the winner

    • @MrHimer12
      @MrHimer12 1 year ago

      The RX6800XT is even better. If you're someone who doesn't care about streaming, AI work or ray tracing, then at less than 500$ in some instances way before the 4070 launch, you are golden. I mean, cool, Ada has some nice tech and it boosts it, but when is 12GB of vram going to run out? In a year? What's the point of buying a 600$ GPU which can't get decent performance without upscaling? Excluding CP2077 I haven't had to turn on upscaling on my RX6800XT... Some older games with RT enabled run completely fine, with decent frames even, though for any team green "fan" I have wOrsEee RT and no FG. Meh, I was an nvidia owner for the past 20 years but both generations, ampere and ada, are just a huge letdown. Awful configurations of the lower tiers which most people buy; only a fraction of people buy the top end, and in both generations nvidia managed to royally fuck it up to hell and beyond. And... people will still buy it.

  • @AnalogFoundry
    @AnalogFoundry 1 year ago +4

    Man, 16GB is what it should have had in the first place! Such potent GA104 silicon, wasted because they paired it with such a low amount of VRAM.

  • @Nachokinz
    @Nachokinz 1 year ago +1

    As someone who bases much of their purchasing decision on vram, it's frustrating what has happened during the last couple of years with such unfavorable market conditions. Those who couldn't wait any longer for other cards to come back in stock are faced with either turning down settings or upgrading once again.

  • @Jarmundx
    @Jarmundx 1 year ago

    Hey Iceberg Tech, love your content! Would you be willing to take a look at the HD 6970, TeraScale's last hurrah?

  • @dazdaz2050
    @dazdaz2050 1 year ago +4

    I would love to see a video on how tuning your system can help negate the Vram problem slightly. I have a 3080 10G and am currently playing TLOU, but will try other titles as well, Hogwarts etc.
    Maxed everything and it seems to be fine: I run out of gpu horsepower, pegged at 100%, before the Vram causes stuttering. I tuned my ram from 3200 to 3766MHz dual rank and pushed my gpu memory up 1250MHz over stock, and it seems to help a lot.
    Not sure how long I'll be able to stay playable with new titles, but right now I'm out of Vram and it doesn't seem to be a problem. Granted, I have 32gb of system ram, not 16gb, and often use dlss quality on 1440p ultrawide, gpu pegged at 100%
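
The tuning described above plausibly helps because, once a game spills past the 10GB buffer, the overflow lives in system RAM and streams in over PCIe, so system memory bandwidth becomes part of the texture path. The same back-of-envelope formula as the console sketch earlier shows the headroom gained; a sketch assuming dual-channel DDR4 with a 64-bit bus per channel:

```python
def ddr_bandwidth_gbs(mts: int, channels: int = 2, channel_bits: int = 64) -> float:
    """Peak system RAM bandwidth in GB/s: transfer rate (MT/s) x total bus width / 8."""
    return mts * channels * channel_bits / 8 / 1000

print(ddr_bandwidth_gbs(3200))  # 51.2 GB/s at DDR4-3200
print(ddr_bandwidth_gbs(3766))  # ~60.3 GB/s at DDR4-3766, roughly 18% more headroom
```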

  • @TheKproductionsful
    @TheKproductionsful 1 year ago +7

    Since when did a 70 class Nvidia GPU become mid range?

    • @cmgm6027
      @cmgm6027 1 year ago +3

      1080p cards low end, 1440p cards mid range, 4k cards high end. Not too hard imo

    • @lateralus6512
      @lateralus6512 3 months ago

      The 3090, being the flagship of this generation, is more like the 80 tier of previous generations. So the 80 is more like a 70 and the 70 is more like a 60.

  • @adityabaghel729
    @adityabaghel729 1 year ago +1

    Ice, I'm just curious, what's your favorite gen of GPU to date....

    • @IcebergTech
      @IcebergTech  1 year ago

      Someone asked in my Q&A video, I think I said GCN 2 due to its longevity. I have a lot of respect for Pascal, GCN 4 and 5 too.

  • @LankyBeamer
    @LankyBeamer 1 year ago

    What is that program you use to get all that information up in the top left?

  • @facelessvaper
    @facelessvaper 1 year ago +3

    I think I will stick with 1080p high or tweaked; medium if I must...

  • @BogdanM116
    @BogdanM116 1 year ago +18

    I have a 3060Ti that's also the Gigabyte OC model. With some overclocking I've done to it, I can almost reach reference 3070 performance. I wouldn't say these cards are dead entirely, but they are definitely not an option for 1440p max settings with RT and whatnot anymore. I'm still on 1080p and games still hover around 8GB of VRAM, so I will stick with this card for a year or two. I got it in 2021 during the apocalypse for a lot under the going price back then (3060Ti's were going for 1000 euro+ where I live and I got this for around 550). If I can stretch this for at least another year I'd say it was worth the money. My next card will definitely be AMD though, I cannot stand Nvidia's decisions anymore.

    • @slandgkearth
      @slandgkearth 1 year ago

      3070 levels of performance? In many games the difference is 20+ fps, and also, which 3070? The non-OC FE? Gigabyte does offer OC versions of the 3070 which are 5 to 7% faster in the default OC mode.

    • @BogdanM116
      @BogdanM116 1 year ago +1

      @@slandgkearth Almost 3070 FE levels of performance. My 3060Ti is a Gigabyte Gaming OC Pro, which is overclocked from Gigabyte, but I also overclocked it myself a bit. In the games I have, like Cyberpunk, Forza Horizon 5, The Witcher 3 and F1 22, it performs very close to the 3070 FE. It's not more than a 5% difference. I don't know if every OC Gigabyte 3060Ti can do this or if I "won the silicon lottery"

    • @couriersix2443
      @couriersix2443 1 year ago +1

      I have a Zotac Twin Edge OC model of the 3060 Ti, it's been great with cyberpunk 2077 at 1440p medium with DLSS balanced. Looks good enough for me and feels smooth with frames between 70-120 depending on the intensity of what's happening at any given time. Using a R5 5600 + 32gb 3200mhz RAM

    • @UTFapollomarine7409
      @UTFapollomarine7409 1 year ago

      @@slandgkearth on geforce cards a 3060 ti is basically a lower version of a 3070, a 3070ti is a lower version of the 3080, a 3080ti is a lower version of the 3090 and so on. A 3060 ti with a decent OC can easily run with a 3070 for its money, buddy. Same for the last two generations of rtx and gtx cards: a 1070ti was that of a 1080, a 1050 ti was that of a 1060. It's not always the case, but it's basically like that for the most part; the 2070s was almost 2080 performance, the 2060s was almost 2070.

  • @abritabroadinthephilippines
    @abritabroadinthephilippines 1 year ago +1

    Is that what MPG means on MSI boards? Moderately Priced Gaming lol I never knew. Yes I know the board used here is a Gigabyte I'm just asking about MSI board naming.

    • @IcebergTech
      @IcebergTech  1 year ago +3

      In MSI's case, I like to think it stands for "Might Play Games"
      (j/k, I don't have a problem with MSI, if you're reading this MSI please sponsor me)

  • @darkchillz_1713
    @darkchillz_1713 1 year ago

    man your content is gold...I subbed.

  • @AmorVim
    @AmorVim 1 year ago +4

    this is the greatest "i told you so" in tech in years

  • @Joel-mp2oo
    @Joel-mp2oo 1 year ago +4

    Nah, the RTX 3070 is alive and strong atm.. no issues apart from the missing frame generation feature, which it could do if nvidia allowed it.

    • @roneran51
      @roneran51 2 months ago +1

      I know your comment is old, but now you can have FSR 3 in some AAA games

  • @Ouroboross-
    @Ouroboross- 2 months ago +1

    Man patience saved me. Got a 3060ti Strix during the pandemic for 670ish. Waited until this year and got a 3090 FE for 760. Couldn’t be happier

  • @Brosfight_
    @Brosfight_ 1 year ago +1

    As an idea, I would be interested in a revisit of the RTX 3080 12GB in 2023 and how it holds up today. Obviously you should buy an RTX 4070/4070Ti for around the same performance, less power consumption and DLSS 3 Frame Gen, but I would like to hear your thoughts about it.
    My brother bought an RTX 3080 12GB in May 2022, right after the GPU market started crashing, and I'm curious if we made a bad decision or not. Sure, we should have waited longer, but it was his 16th birthday and he wanted to build his PC as a birthday gift. I was aware that the card was around 1 1/2 years old at this point. But there was no choice really 😅

  • @Adrienne_Quelle
    @Adrienne_Quelle 1 year ago +33

    As someone who *also* got a 3070 not long ago (picked up B-stock from EVGA (rip) at the end of 2022) and playing on a 1440p monitor, this is definitely something I'm keeping an eye on. My main game is FF14 which is far from taxed even at maximum settings (easily hitting my monitor's refresh rate of 165FPS in low areas and at least 80+ in Limsa and other high density areas), but I definitely ran into some hiccups with Spidey in turning RT on, even with DLSS enabled. I'm hoping I can skip at least two generations, but it might be better to upgrade in the 50 series (or jump to AMD or Intel) when that rolls around... we'll see. I'll be fine in XIV no matter what, I'm sure. As for FF16, though... concern.

    • @jskyg68
      @jskyg68 1 year ago +3

      Try limiting your framerate to 90fps, more than fast enough and it may get rid of stuttering. I use that on my 6700xt just because there's no point to higher fps unless you're playing a shooter.

    • @donsly375
      @donsly375 1 year ago +4

      Like the other comment said, try to limit your fps for the game you're playing (I use Nvidia control panel for this and not ingame fps limiter). Also disable vsync and the ingame limiter when you do this. ALSO try to pick a limit which your gpu will most likely reach in the game, I also have a rtx 3070 and play on 144hz, and I limit myself in singleplayer games to like 80-90. Doing this will most likely give you a way smoother gameplay experience

    • @nameunn5479
      @nameunn5479 1 year ago +1

      I think having to limit your framerate this soon is the problem

  • @MethosTR
    @MethosTR 1 year ago +4

    I bought my 3070 in mid-late 2021 for MSRP.
    I don't regret the purchase, but 2023 games have been kicking its ass so far from what I've seen. I'm probably just going to continue holding out for next gen stuff though, I don't really play current AAA games that much to begin with.

    • @BladeCrew
      @BladeCrew 1 year ago +1

      I also don't run the latest AAA games and I have an rtx 3050 8GB (got mine 10 bucks below msrp when crypto was big); older is good and fun. I still play BF1, battlebit remastered, dota 2, dying light 2, oxygen not included, rainbow six siege and the planet crafter.

  • @lugplays8981
    @lugplays8981 6 months ago

    Will an RTX 3080 Ti last me before I need to upgrade to a new gen? Because rn I've got that exact 3070 and I'm thinking of getting a used RTX 3080 Ti for like 650 bucks, then waiting like 2/3 gens. What I mean is, will the 3080 Ti be good with newer and newer games for like 2/3 more gens?

  • @javierperea8967
    @javierperea8967 1 year ago

    So, which card did you replace it with?

  • @frugalfun2.07
    @frugalfun2.07 1 year ago +5

    Nvidia to game developers: "can you make your games use more Vram, or just not optimise them, so gamers need to upgrade to our new GPUs and we can sell more? P.S. we'll keep the prices at pandemic level and give you a small share of the profits......."

    • @deleater
      @deleater 1 year ago +1

      game devs: "...say no more! we already found some new graphics features that make textures in 2023 look like a 2004 year game at medium settings and it uses around 6-8 GB VRAM, users who want to go high texture will need 12 GB cards now, and for ultra textures sometimes you will need even more than that...and that's only at 1080p native."
      jensen at Nvidia(in happy tears): "staap!!! you guys are making me cry,....here take your payment"
      😂

  • @S己G
    @S己G 1 year ago +5

    I do still see something good coming out of this. Since people are probably going to be getting rid of a lot of these, their prices are going to drop, aren't they?

    • @CookieCrisp97
      @CookieCrisp97 1 year ago +1

      Got mine (an asus tuf gaming) recently for 300€, it's just perfect for my 1080p 165hz monitor

    • @lateralus6512
      @lateralus6512 2 months ago

      Your comment aged well. Where I live the market is flooded with 3070 cards now. I’m looking at picking one up cheap.

  • @AStickOnCheese
    @AStickOnCheese 1 year ago +1

    The delivery of that Forspoken comment killed me lmao 5:03

  • @pravculear
    @pravculear 1 year ago

    I wonder how the 3070 Max-Q is holding up?

  • @autumn6994
    @autumn6994 1 year ago +5

    the 3070 is still selling for 600 dollars in my country lol, what a scam

  • @세키반키
    @세키반키 1 year ago +10

    can you test the Vega 64/56 with its HBCC function enabled? i think that function will extend the Vega 64/56's life a bit longer, even if it has 8gb of VRAM

    • @s1rb1untly
      @s1rb1untly 1 year ago +3

      I'm curious about this myself seeing as I have a Vega 64... GPU prices are a sick joke at our expense

    • @thanatoast4
      @thanatoast4 1 year ago +1

      been curious about this myself. HBCC being touted so much prior to launch only to do absolutely nothing was pretty funny.
      the only recent game i've tested with HBCC on myself was MW2, and it did NOT like it - i assume due to how VRAM target is user-defined in that game via a slider.
      perhaps a lower HBCC size of 12gb or so could be beneficial - i think forcing it to its max size did more harm than good in that case.

    • @IcebergTech
      @IcebergTech  1 year ago +3

      I do want to look at Vega 64, but it might be a while. I believe Hardware Lab are thinking of doing a video on that subject, however.

  • @GamerZHuB512
    @GamerZHuB512 1 year ago +2

    I bought my 3070 pretty recently. Before, I had a 6700 10GB and was pretty happy with it and the additional VRAM. I bought it simply because it was actually on sale for a split second new, and it was perfect for programming with CUDA. If I feel like I'm going to run into issues gaming with it, I can always repurpose it and wait to buy a 7900XT. I won't look back after buying something like that.

  • @geekedaf
    @geekedaf 10 months ago

    As someone who doesn't play vram intensive games (the most being cyberpunk and i haven't even touched that game in months) would a 3070 be fine for me when paired with a ryzen 5700x? My plan is to upgrade to that from a ryzen 3100 rx 580 pairup, and as someone who streams i wanna make sure i have those 8 cores. Also, the zotac amp holo comes at $380 so it seems like a banger to me.

  • @lanelesic
    @lanelesic 1 year ago +5

    Perhaps the memory slots for GPUs will return one day so users can just add more VRAM if they want.
    I can see these chips refurbished by aliexpress GPU sellers onto new boards with 16GB of memory as the 256 bit bus allows it.

    • @qwertykeyboard5901
      @qwertykeyboard5901 1 year ago +3

      Trace length is a bitch though. Has to do with high frequency signals.

    • @WSS_the_OG
      @WSS_the_OG 1 year ago +3

      There was an experiment done, successfully, that grafted an extra 8 GB onto the 3070 for a total of 16 GB, but this was unstable due to no support for the added VRAM in VBIOS. The card did show it had 16 GB of RAM available in monitoring software.

    • @jamesbuckwas6575
      @jamesbuckwas6575 1 year ago +1

      @@qwertykeyboard5901 Perhaps this could be a technology only for lower end cards initially, since they don't need quite as high memory bandwidths. Socketed server memory can reach bandwidth as high as 460 GB/s, so depending on how hard that is to transfer to lower capacity (

    • @qwertykeyboard5901
      @qwertykeyboard5901 1 year ago

      @@jamesbuckwas6575 You might want to look up the Nvidia FX5200 64 bit version.
      Bandwidth is CRUCIAL. Even for low end things.

  • @jzuany
    @jzuany 10 months ago +3

    5 months later we are seeing the RTX 3070 humiliating the RTX 4060 Ti and the RTX 3060 Ti humiliating the RTX 4060, even though the RTX 3060 Ti was better than the RTX 2080 Super. The RTX 4000 series is bullcrap, just more efficient RTX 3000 cards, everything else is the same.

  • @delacream4527
    @delacream4527 1 year ago

    Why does it say 1 gb of textures in RE4 when it uses 6?

  • @scavanger1000
    @scavanger1000 1 year ago

    My gpu has 16gb of vram, the last of us says I’m only using 8gb of vram yet I still notice texture popin, the most noticeable being signs loading in as a smear and then boom suddenly it’s a 4k sign

  • @stanb1455
    @stanb1455 1 year ago +5

    I have the same amount of VRAM on my far older RX 580, but then I only play at 1080p and on games that aren't all that demanding.
    At least with all this 8GB VRAM crap, we should expect these to be quite cheap used (Even if Nvidias proprietary drivers are a bit of a turn-off for Linux and BSD users like myself).

    • @OneFlockOneShepherd
      @OneFlockOneShepherd 1 year ago

      3070 is still substantially pricier than the 6700XT. I think 3070 prices will go down eventually

  • @schmax3627
    @schmax3627 1 year ago +11

    Back in Dec 2020 when the 3060ti launched, I ordered the 3060ti ... 7 months later, I got a message from the shop I ordered it from: "Sorry, your 3060ti won't get shipped bc we switched to LHR models only, but here is a 3070 we still have". Boy, even after almost 7 months of waiting, I felt like the luckiest guy ever. It still serves me well and I hope it will be enough until the RTX 5000 series launch comes around.

    • @reasondro
      @reasondro 1 year ago +2

      same here man, people act like if they don't game at ultra setting they gonna die or summ

  • @str8ripn881
    @str8ripn881 2 months ago

    I am wondering what monitor this test was done on. Size matters. I want a 34" ultrawide 1440p but I'm not sure my 3070 will be up to snuff

  • @littl3spy
    @littl3spy 1 year ago

    I have a very similar problem with my 1070TI playing FS22: my average fps on max settings is about 100, but the 1% lows drop to fricking 10s. I haven't really checked if it's because of VRAM, but that's the only bottleneck I can think of.

  • @rvnx
    @rvnx 1 year ago +17

    While I agree that 8GB VRAM is a hard bargain in 2023, I don't think the fact that game devs have become rather careless with texture streaming optimizations on PC should be left out either. 8GB should be a perfectly acceptable benchmark for development, considering the PS5 and S/X have around the same available, yet that target is often overshot for seemingly no reason. Just like the size of games in general, the VRAM requirements are ballooning for no reason other than careless development practices.

    • @dashkataey1740
      @dashkataey1740 1 year ago +6

      The ps5 and xbox x both have 16gb of ram. The S has 10.

    • @fakethiscrap2083
      @fakethiscrap2083 1 year ago +6

      If you have vram issues with Forza Horizon 5, you know there's a problem. 8gb shouldn't be on a $500 gpu. Hell, it shouldn't even be on entry level anymore.

    • @rvnx
      @rvnx 1 year ago +1

      @@dashkataey1740 It's shared RAM. They have 16GB in total that is shared between the GPU and CPU.

  • @amir-ti7ok
    @amir-ti7ok 1 year ago +3

    The 3060 non-Ti version should have had 8gb and the 3070 12gb instead. Nvidia released the much weaker card (3060) with 12gb and this way stronger one with only 8gb of VRAM. What's the point?

  • @leesanction2068
    @leesanction2068 1 year ago +2

    I was in the same boat. Desperate to upgrade my 970, I had my eyes set on the rtx 3080. After months of using YouTube and discord bot channels, I just bit the bullet and got the 3070.
    I kinda regret it now, I should have just waited for the 6800xt. Oh well. Most AAA games are broken at launch anyway. Been playing my large backlog of indie games and other older AAA titles. Been a blast!!

    • @davefrance3721
      @davefrance3721 1 year ago

      My 970 is still running well in my old rig, it's a very capable card.
      My new rig has a 3070 and I have no issues at all with my 280 hz monitor and max detail on most games at 1440 or 1080.
      I have 32 gb of ram, so the card can use 24gb of system ram.
      I don't plan on upgrading in the near future, maybe later next year if gpu prices drop.

  • @Shannon-ul5re
    @Shannon-ul5re 1 year ago +2

    The RTX 3070 is not dead, I'm having no problems playing games at high settings at a res of 3440 x 1440 at 100fps most of the time. Next year I plan on passing it on to my son and I'll upgrade to something else; it will be more than enough for 1080p gaming on a 165hz monitor for Fortnite.

  • @psylina
    @psylina 1 year ago +28

    People said a 3070 was a better card than a 2080 ti because 'ma new tech' and 'ma efficiency'. Now that the vram issue is becoming more and more apparent, that statement is becoming even more of a joke

    • @jamesbuckwas6575
      @jamesbuckwas6575 1 year ago +10

      Well the 3070 was also under half the price of the 2080Ti, so the 3070 was better in that regard, and the better ray-tracing and DLSS support helps as well. I wouldn't call that statement or the people who said it a joke, because aside from being rude, it was true when the 3070 came out. Even if in some games the potential of the 3070 isn't being realized due to the lower VRAM capacity, it's fine to consider it a good purchase for the people who bought it, even if NVIDIA should have realized better what with the new consoles coming out and AMD's competing cards.

    • @ezas533
      @ezas533 1 year ago +1

      The vast majority of people could not get the 3070 at MSRP. Many people bought it at over a thousand dollars.

    • @ezas533
      @ezas533 1 year ago +1

      Also, the 3070 scores a measly 1 point higher than the 2080ti in the 3DMark full RT test. There is no difference in DLSS support, that's a false statement. The 2080ti will outlive the 3070/3070ti and maybe even the 3080 10G.

    • @psylina
      @psylina 1 year ago +2

      @@jamesbuckwas6575 2080 ti owners literally panic-sold their gpus used for under 3070 MSRP; if you managed to get one then you pretty much won. Not to mention the 352-bit memory bus, along with it overclocking well enough to trade blows with the 3070 ti. Also, a 2080 ti has the same tensor core performance as a 3080, so dlss is pretty much the same according to Nvidia's performance gains sheet. RT itself is pretty on par with a 3070, but does it really matter considering only a few games utilize it well enough?

    • @psylina
      @psylina 1 year ago

      @@ezas533 If gaming, a 2080 ti can be oc'd pretty well to match up to even a 3070 ti.

  • @quaz1moto241
    @quaz1moto241 1 year ago +4

    I'd consider 16GB the bare minimum for a new build these days for single player gaming. Even 12GB is cutting it REAL close at 1080p, and in the case of R4make it's already not enough.

    • @attilavs2
      @attilavs2 1 year ago +3

      Ram ? You aren't talking about Vram right ?

    • @starvader6604
      @starvader6604 1 year ago

      rx 6800 xt is da wey

    • @quaz1moto241
      @quaz1moto241 1 year ago

      @@attilavs2 it's vram, clearly

    • @attilavs2
      @attilavs2 1 year ago +4

      @@quaz1moto241 You shouldn't even need 8gb, and 12 is far from the bare minimum, especially for 1080p. The minimum is shifting from 4gb to 6gb, but that's far from your ludicrous numbers

    • @quaz1moto241
      @quaz1moto241 1 year ago +1

      @@attilavs2 You just don't understand how games are moving forward for playing at max fidelity. Try playing R4MAKE with a 1060 6GB at any decent settings. Games are requiring more than 12GB for max settings, fact. GL w your cope!

  • @TeamGun
    @TeamGun 1 year ago +1

    Yep, you need ample ram/vram for the new games. With my system (R5, 32gb ram, RX 6800 16gb) I can play anything like Resident Evil 4 at high settings, native 4k, 60fps, no issues. With Resident Evil games like 4 or Village, I've gone close to 11gb of vram usage (when playing 4k native).

  • @UTFapollomarine7409
    @UTFapollomarine7409 1 year ago

    *pulls out a Radeon VII* like, what did you say about vram?

  • @joemarais7683
    @joemarais7683 1 year ago +3

    Still can't believe people paid flagship prices for such a garbage gpu, with specs straight out of 2016's 70-class gpus. I guess people must love being able to say they have an Nvidia gpu, at the expense of only being able to play at 1080p

  • @xgeo23
    @xgeo23 1 year ago +11

    I know that the 16GB modification to the 3070 may not be a viable option for everyone, but it is the only hope to give 3 or more years of life to the RTX3070.

    • @hoboholer
      @hoboholer 4 months ago

      I hope there's a place that'll start doing that mod quick, cheap and easy

  • @hunterrees
    @hunterrees 11 months ago +1

    What are you talking about?! I just bought a 3070 used for $250 on eBay and it performs wonderfully. I couldn’t be more happy with it

  • @TheVdub1980
    @TheVdub1980 1 year ago

    Can you do a review using an MSI 3070 Ti Ventus 3X OC paired with an R7 5800X? I think you will find the results are much better

  • @anthonymelendez949
    @anthonymelendez949 1 year ago +4

    I'll forever be thankful I got my 3090 FE in December 2020 a month after launch at msrp and it still is working hard for me now running everything I play at ultra 4K 60-120 fps

  • @takehirolol5962
    @takehirolol5962 1 year ago +4

    It was DOA since the beginning, F for the folks that paid more than $1000 for it...

    • @takehirolol5962
      @takehirolol5962 1 year ago +1

      @@ihatelols Good for you, where I live there is no high end RDNA2.

    • @takehirolol5962
      @takehirolol5962 1 year ago +1

      @@ihatelols Nice! Funny thing is that for me, an RX 6950XT does not fit my case, and I'd still have to import it from Amazon US and then spend around $80 on the cheapest mid tower case that can fit it.
      Or I spend more on an RX 7900 XT and get the 28 cm variant, which I also need to import since my country does not sell the smaller models.

    • @takehirolol5962
      @takehirolol5962 1 year ago

      @@ihatelols Nice! I use an old PC desk that I bought in 2007 that still holds up and is quite good, but it only fits mid tower cases.

    • @takehirolol5962
      @takehirolol5962 1 year ago +1

      @@ihatelols I will just buy a new one, it's not that easy to buy used cases in my country.

  • @mahballslmao
    @mahballslmao 1 month ago +1

    As someone who has a 3070 I find this video hilarious. You do not need max settings or any sort of raytracing for modern games to look good. The card still runs just fine on most games and I have failed to find a single game I can’t play on medium to high settings at 3440x1440.

  • @ecchichanf
    @ecchichanf 1 year ago +2

    >19:06
    A license plate over a license plate?
    Also, 8GB of V-Ram is so 2014, with the 8GB Sapphire/MSI Radeon R9 290X.
    You can flash those cards with the R9 390 bios.
    Great video as always :3

  • @raul1642
    @raul1642 1 year ago +3

    F for RTX 3070

  • @firestuka8850
    @firestuka8850 1 month ago

    So, I bought an AMD 7900XT and gave my sister my 3070ti . She was quite happy. She was pulling double the frames from her old GPU with DLSS or FSR , whichever the game used. This was great for her as she was having difficulty with her 1080ti. It was an OG release that she pushed to the limits on every game she could, but it was time for it to die. So, after I upgraded, I gave her my cpu as well (Ryzen 3600x). This was amazing for her as she had a ryzen 1600. The upgrade for CPU and GPU let her really play the hell out of stuff. She's playing Remnant 2, Metro Exodus and many more games with me, other family, and friends online. She plays at 1080p 144hz as she has 2 1080p screens for work.

  • @FunkyTechy
    @FunkyTechy 1 year ago

    How did you get your cpu clocked that high?

    • @IcebergTech
      @IcebergTech  1 year ago

      Well, I'll save the full story for my 5600X review, but it *was* doing fine at 4.7GHz using 1.28v on air. Got through 13 games for this review, plus 9 games for the CPU test as well as Time Spy & Fire Strike with no issues, but couldn't even get through a single pass of Cinebench without blue screening!

    • @FunkyTechy
      @FunkyTechy 1 year ago +1

      @@IcebergTech had the same issue with my 5700G @4.6, and I just gave up and use PBO now, which gets it through Time Spy and Cinebench and gave me an extra 300 points in R23, for what that's worth lol

  • @Seppe1106
    @Seppe1106 1 year ago +9

    Got an RTX 3070 2 days ago for 450€; so far really happy with the purchase, coming from an RTX 2060.
    I use it for 1080p gaming, where it does a perfect job, and thanks to a little undervolting it's silent and a bit more efficient, but still strong.
    I like the card and was lucky to find it for that price. Almost made the mistake of getting a bottom tier MSI 3060 x).
    Some might say I should have gone for a 4070, but honestly the real price will be a lot higher than the MSRP and my budget was rather tight.
    Nice video though. :)

    • @hunterrees
      @hunterrees 11 months ago

      I literally JUST did that exact same upgrade lol, but I got the 3070 for $250 used on eBay

    • @omgrenn
      @omgrenn 9 months ago

      @@hunterrees dude I'm about to do the same exact upgrade too lmao, how's the card doing?

    • @JustRelx
      @JustRelx 7 months ago

      i still use a 3070ti and game in 4k...it still rocks..

  • @Perlereino.
    @Perlereino. 1 year ago +3

    Yeah, I'm definitely going AMD when the time comes to upgrade my GPU, seems like they're the only company with reasonable VRAM figures

    • @dxnny2k
      @dxnny2k 1 year ago

      AMD GPUs have worse RT performance but a lot more overall performance for the price. I'm thinking about it too. I mean, you could get a 10-30% fps boost for the same price. Worth it, with a Ryzen CPU.

    • @nameunn5479
      @nameunn5479 1 year ago +1

      @@dxnny2k I played with ray tracing and could not see a difference other than reflections

    • @dxnny2k
      @dxnny2k 1 year ago

      @@nameunn5479 sorry, I compared NVIDIA to AMD‘s cost-performance ratio for fps. Accidentally left it out.

  • @hartsickdisciple
    @hartsickdisciple 1 year ago +2

    I agree with the general consensus that 8GB of VRAM wasn't enough for $500+ GPUs released in 2020, but I also don't think the 3060/3070/3070 Ti are "dead," per se. They're dead as far as trying to sell for MSRP. The prices on all of them should drop substantially.

  • @gokulram7361
    @gokulram7361 1 year ago

    My Colorful RTX 3070 OC hits above 80 degrees…. Any suggestions to keep the temps low without a noticeable effect on performance?

  • @edgararanda1857
    @edgararanda1857 1 year ago +3

    I remember the leaked 16gb 3070 and got big hyped. Had to settle for a $370 EVGA RTX 3070 xc3 ULTRA so I’m not as mad as the people who paid $1K+ for the same model

  • @Gruxxan
    @Gruxxan 1 year ago +5

    It depends what you use it for. I'm still happy with my Suprim X 3070, it does what I want

    • @diogo6163
      @diogo6163 1 year ago

      I have the Aorus 3070 and get a bit more fps than the model shown. The 3070 does still perform incredibly.

    • @Gruxxan
      @Gruxxan 1 year ago +1

      @@diogo6163 yeah. i think a lot of these tech youtubers just jump on the current bandwagon

    • @Madmadid
      @Madmadid 1 year ago

      Exactly.

  • @m1natoh1nata
    @m1natoh1nata 1 year ago +1

    Anyone remember the RX 470 back in 2016? That was under $200 with 8gb of vram too
    Good times

  • @mistaboombosticyt
    @mistaboombosticyt 1 year ago

    I managed to get this card on launch day for MSRP because of feverish refreshing and having other people helping try to lock one in with Best Buy's website going in and out of operation.

  • @grumpyoldwizard
    @grumpyoldwizard 1 year ago +9

    I like your channel. From the looks and tests of the 40 series, I’m not sure the 3070 is ready to be trashed. I was actually watching the game display instead of the frame rate. It looks pretty darn playable to me. If you just go by the specs and that measurement tool without looking at the screen and the game I think you’re doing it a disservice. Anyway, keep up the good work.

    • @autumn6994
      @autumn6994 1 year ago +4

      it's not only frametimes, the visuals degrade too with just 8gb of vram. Paying 500 dollars for an 8gb gpu in 2023 is unacceptable

    • @V1CT1MIZED
      @V1CT1MIZED 1 year ago +2

      @@autumn6994 Those floorboards on TLoU looked like ass on medium.

    • @jskyg68
      @jskyg68 1 year ago

      I'd just buy a really good 1080p monitor.

  • @wixardo
    @wixardo 1 year ago +9

    It's kind of crazy for me to think that 8GB VRAM doesn't cut it anymore, when not even that long ago 8GB VRAM was in the ''futureproofing'' range. Also makes my 3080's 10GB VRAM feel pretty meager, I guess I'm going to have to upgrade way sooner than I thought.

    • @GraveUypo
      @GraveUypo 1 year ago +3

      not long ago? i bought my first 8gb card 8 years ago. do you know how much vram my gpu 8 years before that had? it had 256mb. 8gb of vram has endured for way longer than normal as being enough.

    • @BA-oy9uo
      @BA-oy9uo Год назад +1

      Yeah, I feel the same way about the 10GB on my 3080, especially since I'm looking to go to 4K within the next 5 or so years (could be sooner), however long it takes for my ideal monitor to come out (and get good reviews): 4K 32in OLED 240Hz with DP 2.0 and HDMI 2.1. Got the card for $550 used on eBay though, and it has been doing great at 1440p for me.

    • @wixardo
      @wixardo Год назад +1

      @@GraveUypo You wrote a whole lot of nothing; it's almost like you didn't even read the comment and got triggered over "not long ago" into some random tangent about your 8-year-old GPU. Nobody said anything about that. 8GB of VRAM was solid just a couple of years ago, and suddenly it's run its course.

    • @wixardo
      @wixardo Год назад +1

      @@BA-oy9uo 4K 240Hz sounds like a pipe dream. I went with a 144Hz 21:9 1440p monitor instead and have been loving the extra FOV in games and the additional workspace. I've got a traditional 16:9 240Hz monitor on the side and basically never use it; the wider screen makes a bigger difference than a slight refresh-rate increase, tbh

    • @HardWhereHero
      @HardWhereHero Год назад +3

      No you're not; that's what these devs want you to do. They can optimize these games to run better, for sure. This is all a cop-out.

  • @goldy2137
    @goldy2137 4 месяца назад

    Good video and really good quality

  • @AlaaSalehLE
    @AlaaSalehLE Год назад +1

    It's impressive that some new video games are using 12GB of VRAM. Not so long ago that could be the size of a whole game xD

  • @alucard2438
    @alucard2438 Год назад +4

    Never been happier in my life about ditching Nvidia a couple of years back. After my 2080S died, with all the doubts I had, I got the 6800XT instead of the 3080; I was close to getting the 3070, because the 3080 cost $1500 at that time, but I got the 6800XT for $1000 instead. I never really thought much about the amount of VRAM back then. We never gave AMD credit for putting 16GB of VRAM on all of their top-tier cards for some time now, including the 3070's counterpart, the RX 6800

    • @degeneratepervert6255
      @degeneratepervert6255 Год назад

      I was all set to go Nvidia again after my 5700XT, but one look at the 10GB of VRAM on the 3080 made me go 6800XT.

  • @alchemira
    @alchemira Год назад +4

    That's why I bought an AMD RX 6800 with 16GB of VRAM a few weeks ago. It was cheaper, too.

  • @farooqqureshi3212
    @farooqqureshi3212 Год назад +1

    I also have this motherboard, the Gigabyte B550 Gaming X V2. I'm running a Ryzen 3600X with a Sapphire RX 580 8GB Nitro+ and 16GB of RAM.
    My configuration is perfect for 1080p 60fps gaming.

  • @basbas63
    @basbas63 Год назад

    Games have always used predictable data-streaming speeds on consoles to make their memory go further than it would in a PC. I think this console generation changes everything in that regard, with the fast storage speeds and the innovative methods for getting data transferred.
    Think we either need minimum storage-speed requirements for games or a shit ton of VRAM in newer cards (or make sure both are viable options). Some games use 14-15GB now at max settings, which is getting dangerously close to the 16GB a lot of cards provide, and more than some new midrange cards released now have.
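
    (If you want to see how close a game actually gets to your card's limit, a minimal logging sketch along these lines works, assuming the nvidia-ml-py / pynvml Python bindings and a single GPU at index 0. Note it reports allocated VRAM, which can overstate what a game strictly needs.)

        import time
        import pynvml

        pynvml.nvmlInit()
        gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes one GPU at index 0

        try:
            while True:
                mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
                # .used / .total are in bytes; convert to GiB for readability.
                print(f"VRAM: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")
                time.sleep(1)  # sample once per second while the game runs
        except KeyboardInterrupt:
            pynvml.nvmlShutdown()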

  • @aaz1992
    @aaz1992 Год назад +3

    I just went from a 3080 to a 4080. Gotta have that VRAM 😂

  • @Metaljacket420
    @Metaljacket420 9 месяцев назад +3

    Linus reviewed a 3070 with 16GB soldered to the board; few games utilized it or showed a significant performance increase

  • @ThomasDonofri
    @ThomasDonofri Год назад +1

    I was lucky to get mine at MSRP at Christmas 2020. I feel bad for everyone who overpaid.

  • @crzyces1693
    @crzyces1693 Год назад +1

    The 3060Ti, 3070, 3070Ti and the 3080 10GB are all running on borrowed time for 1440p+ gaming. I see a *LOT* of people posting things like _"You're an idiot, my 3070 is still getting over 100FPS in single-player games at high/ultra presets."_ Then you find out they're playing competitive multiplayer games that came out 3+ years ago, or single-player games from 2007-2017... or bloody emulation, where the GPU literally does not matter. _slaps forehead_ When you have the audacity to say _"Well, I'm on a 3080 and am constantly running into VRAM stutter walls without using RT, or dropping to 1080p with DLSS before using RT,"_ you get a _"Well, what do you expect, it's a 2-year-old $700 card, of course you're having problems."_ I'm calling bullshit there, as up until Ampere this was not an issue for $700+ cards playing at a lower resolution than the one they were advertised to handle at launch.
    If I were trying to run 4K/ultra with medium RT settings in games that just released, I would understand, even though Nvidia was still selling these cards new for well over $700 a few months ago (and many third-party sellers and some smaller shops still are). The card being 2.5 years old is certainly a factor, as it will play pretty much anything from 2020 and earlier flawlessly, but people who did happen to buy one new 3 months ago expect, and rightfully so, good performance at *4K* for the next two years. If the card was already dated, then Nvidia should have slashed prices accordingly instead of selling an inferior product well over MSRP on the AIB side of things. So _"It's a 2+ year old card..."_ is little consolation to the average Jane or Joe who bought one to throw into their 2018 10700K prebuilt, or worse, who got one in a $3000 ACME _Super High Ultra Game Shredder 2000_ build from Best Buy. And if 10GB is literally falling apart, the 8GB 3070 is in trouble.
    Yes, devs can take an extra 6 months optimizing their games to make textures fit in an 8GB buffer for the 500K people who _might_ buy their title on PC while limited to 8GB of VRAM... or they can focus on people with 16GB+ cards and on console performance, so the 10 *MILLION*-plus people on those platforms get a better experience. Or they can simply give up on PC ports for a couple of years until the install base with 16GB+ cards is large enough to merit a tolerable PC port again, concentrating all their time on squeezing every last drop of performance out of the PS5 and Xbox Series X and telling Microsoft that, as much as they would love to have their game on the Series S and PC, it simply isn't feasible, at which point we enter a brief PC _dark age_ again.
    It's happened numerous times before, and when I see games like The Last of Us performing and looking the way they do while still using ghastly amounts of VRAM, I have to believe we might be at that _dark age_ inflection point. Even 12GB cards are borderline pointless to purchase *NEW* at the moment; what do they have, 1.5 years before 1440p/high is a problem for them? Maybe OK if you're grabbing a $379.99 6750XT, but a $900 3080Ti? A used 3080Ti for $650? A $600 4070 or $800 4070Ti? To _probably maybe_ get you through the next year and a half? Heck no. That's way too much money for a placeholder card that will probably be exactly where the 3070 is right now by January 2025, when next-gen consoles with a 24-32GB GDDR7 memory pool are on the horizon. Simply ridiculous.
    VRAM (GDDR6X) costs less than half of what it did in 2020 and 2021. There is no reason the 4070 could not have released with 16GB of VRAM at the same $500 price point, with the Ti also getting 16GB at a $675 *(MAX)* MSRP. Instead we get this idiotic corner-cutting after they already screwed millions of people with Ampere, knowing full well this was going to be an issue while the cards were still selling as new products. They work way too closely with Epic on Unreal, EA on Frostbite, CDPR on RED, etc., to justify what they did to consumers. I'm sure they're quite proud at the investor level, but for their customers this sucks. This isn't a $250 card you plan on replacing every 2 years. On the contrary, when you spend $400-500+ USD on a GPU, you expect the card to last at least one GPU generation after its own.
    When a $700 card literally got you 7 generations of high-spec gaming with the 1080Ti, and the $500 1080 was good for 4 years at 4K and 2 at 1440p before falling off a cliff into 1080p/medium 60FPS with FSR 2.0, the thought that a 3070 may become irrelevant around the same time as a $500 card from 2016, or a $350 card from 2019 (the 5700/XT), is a serious problem for Nvidia, because their customers should be *LIVID.* There is no EVGA trade-up program to save us here, and these cards basically turn into 1080p/high cards with DLSS 2.x Quality by Christmas 2024? Good grief, man.
    I am going to give AMD *EVERY* chance to earn my business, whether with a 7800XT on a very, very slightly cut-down die (say, a compute die 5% smaller than the 7900XT's) or whatever they drop next generation. I _guess_ I can get by at 1440p/high with DLSS *Balanced* turned on and _hope_ the 10GB buffer is enough. (Have you seen how bad The Last of Us looks on medium? It's worse than a lot of PS3 textures I've seen. What the hell are they doing... I mean, besides using console-exclusive compression, for which the PS5 needs its own separate custom chip, and shipping the files to PC in the exact same size and format. When I saw that I almost fell down: roughly half the compressed size of typical game files. Hmm, I think I just answered my own question, and even then it seems it costs too much money for them to justify changing file types to suit PC gaming, so we get PS4 textures for high and PS2/PS3 textures for medium at 75% of the original file sizes? Of course the VRAM fills up astonishingly fast. Add in that the game wants a 16GB pool to pull from, and we have something that's still a bit of a mess on 12GB cards, never mind 8GB ones. Thank goodness Cyberpunk came out when it did; can you imagine those graphics with shit compression *AND* the expectation of storing 15.5GB of graphics in VRAM? Yikes. The 3070 would have played it like the PS4 did at release.)
    Anyway, game devs *should* still try to accommodate 8GB cards in games releasing over the next couple of years, in my opinion, because 90% of the cards sold over the last 3 years have had 8GB *or less.* The average age for AAA gamers is up to 39 years old, and a huge portion of them lean towards single-player story/action games, whereas the 35-and-under crowd tends to play more multiplayer titles, which typically use less VRAM because the more people you can get into a game, the more chances you have to monetize them.
    2 years from now we'll at least be a generation removed from 8GB. We will still have far too many people on 10GB-12GB cards, but the number _should_ be far smaller: maybe 65% on 8GB or less, and 80% on 12GB or less. By then people will have had the chance to either pick up a 16GB card or save up for one, since I doubt many people buying a $600 3070, or $750 3070Ti, or $800 3080, or $900 3080 12GB, or... you get the point, had any idea their card might be dated before 2023 ended. Never mind the people who bought a 4070/Ti, whose cards are already showing us _"Shit, 12GB isn't enough half of the time either? Wtf?"_
    Is some of this on devs? Of course, but it's a decision they are consciously making to get the best console experience possible. Is most of this on Nvidia? Yep. They knew, they didn't care, and they maximized profit (which is not a legal obligation, something people really need to look up before repeating it over and over), and hopefully this time it comes at the expense of their current customers saying _"Nope, I am done. There is no way I am buying that product."_ They can move to AMD, they can move to Intel. Even better, they can buy a used or open-box card with 16GB+ of memory once prices improve, and force both AMD and Nvidia to offer substantial upgrades across the board at a reasonable price both parties can live with. 63% gross margin (Nvidia's 2022 through Q1 2023) is too high for me to be comfortable with the products, when as little as 5 years ago 10%-30% was pretty standard, especially when that R&D is going towards crazy high-powered professional AI chips and not better gaming GPUs. I'd rather not subsidize that market; margins are through the roof there already, and it can support itself.
    More memory, faster cards, better prices. They say only 28% of gamers report gaming at 4K on PC, so why bother making affordable 4K GPUs? Well, silly goose, we aren't at 1080p and 1440p because we want to stay there; most of us haven't bought 4K displays because the GPUs needed to run them well have been over $1K since Turing! 4K/high at 60-100FPS _should_ be midrange by now, but prices have not scaled with performance since Pascal. It's not because 4K displays are too expensive, it's because the GPUs are!

  • @givmetehsucc
    @givmetehsucc Год назад +5

    I'm glad I went with a 3060. Even though I had the chance to get a 3070 for only a tad more, I wanted the extra VRAM, and it seems like that choice is paying off.

    • @CloudShoob
      @CloudShoob Год назад +1

      It doesn't quite work like that. 99% of the time you'll run into the physical performance limit of the 3060 before you ever saturate that 12GB of VRAM.

    • @nikosensei1258
      @nikosensei1258 Год назад +1

      You don't understand, bro. To put it simply: the 3060 is a truck rated to carry 12 tons and the 3070 one rated for 8 tons, but the 3060's engine is too weak to actually pull 12 tons, so the extra capacity doesn't matter.

  • @n45a_
    @n45a_ Год назад +3

    Luckily for me I don't play single-player games, so 8GB of VRAM is perfectly fine and probably will be for the next 2 years. But AMD GPUs are the way to go for most people now (if they don't care about DLSS) and if they don't need CUDA for work or AI

  • @ElTrainerMah-Knee
    @ElTrainerMah-Knee Год назад +1

    Have you ever considered benchmarking an Intel dedicated GPU?

    • @EbonySaints
      @EbonySaints Год назад

      The A750 would probably be the best fit for the channel. The A380 is nice, but it's only like 5-10% better than a 1650 at best, unless you do some crazy things or really want a great video encoder. The A770 is the poor man's AI/editing card now, since the A750 has 90% of its practical gaming performance at 70% of the price.

  • @misfortune5091
    @misfortune5091 Год назад +1

    How does the RTX 3070 compare to the RTX 4060 Ti? (Both are priced at $400.)

  • @100500daniel
    @100500daniel Год назад +3

    The 3070 is still decent and will stay decent for some time. You don't need ultra textures and RT to enjoy games.

    • @DarekGLA
      @DarekGLA Год назад

      But Jensen said I need RT to enjoy it, and I was asked to pay a high-end price