RTX 2080 Ti vs RTX 3070 vs RX 6700 XT (Benchmark in 10 Games at 1080p) 2024

  • Published: 22 Aug 2024

Comments • 52

  • @feilinzhou2558
    @feilinzhou2558 2 months ago +3

    I have an RTX 2080 Ti and an RX 6700 XT.
    I have always wondered why the RX 6700 XT feels so much slower than the RTX 2080 Ti in basically every game I tested. But in your test, it doesn't seem to be that much slower.

    • @teflonstestbench
      @teflonstestbench  2 months ago +2

      I mean, it is noticeably slower than the RTX 2080 Ti, but that difference can change depending on the game or your test system. For example, if you were testing the RX 6700 XT without ReBAR, or had models with higher or lower boost clocks or different board configurations than the ones I had.

    • @JustAGuy85
      @JustAGuy85 2 months ago +3

      @@teflonstestbench Exactly. Gotta have that SAM on the AMD cards to get those huge boosts.
      AND depending on your CPU, SOME games benefit 0% from SAM on Zen 2 or Intel 10th gen, while those same games show improvements from SAM on Zen 3 and Intel 12th gen.
      Not to say that NO games benefit from SAM on Zen 2/Intel 10th gen, but I saw that in a test and it really surprised me. Dunno if it's an architectural thing or a "soft" limit, but yeah... if you're running Zen 2/Intel 10th gen, there are gonna be games where SAM gives 0% gains, whereas with Zen 3/Intel 12th gen you'll get substantial gains (see the ReBAR check sketch below).
      But the 2080 Ti is a beastly card, too. As is the 3070. It's always surprising to see the 6700XT 12GB hang with the 3070. Two years ago, there weren't any games it could hang with it in. AMD "fine wine" drivers really are a thing lol. The 6700XT 12GB today is not the 6700XT 12GB at launch.
      I'd imagine the 6700XT 12GB does substantially better than the 2080 Ti in Call of Duty games. RDNA2, or just AMD in general, really loves that engine or something. Kind of like Forza Horizon 4 and even 5.
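
      A quick way to sanity-check whether ReBAR/SAM is actually active, at least on a Linux box, is to read the GPU's PCI BAR sizes from sysfs: with it on, one BAR spans the whole VRAM instead of the classic 256 MiB window. A minimal sketch, where the PCI address is only a hypothetical placeholder:

      ```python
      # Minimal sketch: check whether Resizable BAR / SAM is active on Linux by
      # reading the GPU's BAR sizes from sysfs. The PCI address is hypothetical;
      # find yours with `lspci | grep -i vga`.
      GPU_PCI_ADDR = "0000:03:00.0"  # placeholder slot

      def bar_sizes(pci_addr: str) -> list[int]:
          # /sys/bus/pci/devices/<addr>/resource has one "start end flags"
          # line per BAR region, all in hex.
          sizes = []
          with open(f"/sys/bus/pci/devices/{pci_addr}/resource") as f:
              for line in f:
                  start, end, _flags = (int(x, 16) for x in line.split())
                  sizes.append(end - start + 1 if end > start else 0)
          return sizes

      for i, size in enumerate(bar_sizes(GPU_PCI_ADDR)):
          if size:
              print(f"BAR{i}: {size / 2**20:.0f} MiB")
      # With ReBAR/SAM active you should see one BAR covering the full VRAM
      # (e.g. ~12288 MiB on a 6700 XT) instead of a 256 MiB window.
      ```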

    • @_Isshin_The_Sword_Saint
      @_Isshin_The_Sword_Saint 1 month ago +1

      @@JustAGuy85 Between you and me, I gotta say I'm jealous of you having a 2080 Ti... jk 😂. But it's cool. Before the 3000 series came out, my friend secured a banger deal on a Strix 1080 Ti, and now I wish to do the same with an RTX 2080 Ti. I really wanna get one; even a Founders Edition would be perfect, as I hope to collect these cards in the future.
      But to the point: you own both AMD and Nvidia. As of right now, for upcoming games (Black Myth: Wukong, Phantom Blade 0, Doom, etc.) at 1080p high or medium settings, which GPU should I get:
      an RTX 2080 Ti or an RX 6700 XT?

    • @JustAGuy85
      @JustAGuy85 1 month ago

      @@_Isshin_The_Sword_Saint Ah, my bad, I have a 5900X (PBO/180 A EDC limit/+200 MHz boost/per-core Curve Optimizer) + 6700XT 12GB (XFX QICK 319 flashed to the XFX Merc model) + 2x16GB DDR4 3600 C16 G.Skill Ripjaws @ DDR4 3666 w/ 1:1 IF. The RAM has FULLY tuned timings/sub-timings and it's a dual-rank kit, too. So I tend to get better performance than the "usual" benchmarks you see. Dual-rank RAM has been shown to give up to 10% performance boosts in some games.
      (Edit: May as well list my mobo: Asus B550-F Gaming Wi-Fi II with WiFi 6E and BT 5.2, though it's BT 5.3 now on Windows 11 lol. It's an awesome board with a great VRM setup. My 5900X pulls 220 watts in Prime95 with my settings with zero issues and more room to go if I wanted to. It's cooled by a Fuma 2 Rev B with 3x fans, and behind the 3rd 120mm fan is an Arctic P14 140mm exhaust fan lol! My case has all 140mm P14s in it, minus the Fuma 2 Rev B.)
      But yeah, I have the 6700XT 12GB. It's definitely a 2080 Ti competitor. The newer the game, the better RDNA2 does. RDNA2 + UE5 = great. These cards LOVE UE5 engine games. It has 96MB of Infinity Cache, too.
      Hell, I'd rather have a 6700XT 12GB or 2080 Ti 11GB than a 3070 with only 8GB VRAM. My RX 480 had 8GB VRAM lol. And games USE that VRAM now. I was playing The Callisto Protocol and was using 22-24GB of total system RAM and 11.x+ GB of VRAM (quick VRAM logging sketch below). 8GB just ain't cutting it. I also play DCS: World, which uses 24+GB of RAM (including Windows) and nearly all my VRAM. 8GB VRAM just won't last through 2024 imo.
      Look at the 3070 vs 6700XT in the latest 2024 UE5 games now. The 8GB VRAM is killing the 3070.
      BUT Nvidia does have DLSS and superior ray tracing performance... in "RTX" ray tracing games. When a game isn't sponsored by Nvidia, AMD does just as well. Like in Far Cry 6 or that Avatar game, where the 6700XT beats the 3070 because of VRAM, and in lumen+nanite UE5 games too.
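
      If you want to check numbers like those on your own system rather than take anyone's word for it, here's a minimal sketch that logs VRAM usage once a second through NVML (pip install pynvml); it assumes an Nvidia card at device index 0, so it covers the 2080 Ti/3070 side rather than the Radeon:

      ```python
      # Minimal sketch: log GPU VRAM usage once per second via NVML
      # (pip install pynvml). Assumes an Nvidia card at device index 0.
      import time
      import pynvml

      pynvml.nvmlInit()
      handle = pynvml.nvmlDeviceGetHandleByIndex(0)
      try:
          while True:
              mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
              print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
              time.sleep(1)
      except KeyboardInterrupt:
          pass
      finally:
          pynvml.nvmlShutdown()
      ```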

    • @JustAGuy85
      @JustAGuy85 1 month ago +1

      I mean, everything I play uses well over 8GB VRAM. F1 22/23/24. Callisto Protocol. I'm trying out that new Robin Hood game. Forza Motorsport. Call of Duty games (the 6700XT beats the 3070 in every CoD title from Warzone onward). It also beats the 3070 in Battlefield 2042.
      I WANT a 7900 GRE, though. That's great price-per-FPS if you ask me. That's the next card I plan on getting. I don't think there's a better $500 GPU for the money. I was looking at the 6800XT... but if I'm gonna upgrade, I may as well really upgrade.
      So I thought the 6950XT would be a good choice. Then someone reminded me about the 7900 GRE. And yeah, THAT'S the one I'm gonna get. It's pretty much on par with the 6950XT but has slightly better ray tracing. And since it's RDNA3, there will eventually be new features that only RDNA3 supports and RDNA2 gets left out of.

  • @pizzapr9287
    @pizzapr9287 1 month ago +2

    The 6700XT could probably keep up after an OC.

  • @tahinkhan3763
    @tahinkhan3763 2 months ago +2

    Great video! Very helpful, but I have a question: why was the frametime graph in RDR2 going insane on the 6700XT?

    • @teflonstestbench
      @teflonstestbench  2 months ago

      My guess is AMD drivers. It happens with every AMD card on every system; I've seen it in other people's tests too, and the RAM usage also goes extremely high vs the Nvidia cards, so it's probably a driver thing. Though the game seems to play fine.
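
      For anyone who wants to put numbers on those spikes instead of eyeballing the graph, here's a minimal sketch that computes average FPS and 1%/0.1% lows from a frametime log. It assumes a PresentMon-style CSV with an MsBetweenPresents column, and the file name is just a placeholder:

      ```python
      # Minimal sketch: compute avg FPS and 1% / 0.1% lows from a frametime log.
      # Assumes a PresentMon-style CSV with an "MsBetweenPresents" column;
      # the file name is hypothetical.
      import csv

      def lows(path: str) -> None:
          with open(path, newline="") as f:
              frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
          frametimes.sort(reverse=True)  # slowest frames first
          n = len(frametimes)
          avg_fps = 1000.0 * n / sum(frametimes)
          # "1% low" here = FPS of the frametime bounding the worst 1% of frames
          # (a common convention; tools differ slightly).
          p1 = 1000.0 / frametimes[max(0, n // 100 - 1)]
          p01 = 1000.0 / frametimes[max(0, n // 1000 - 1)]
          print(f"avg {avg_fps:.1f} fps | 1% low {p1:.1f} fps | 0.1% low {p01:.1f} fps")

      lows("rdr2_6700xt.csv")  # hypothetical capture file
      ```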

  • @pkpnyt4711
    @pkpnyt4711 2 months ago +9

    By lowering the 2080 Ti to 86% power you are nullifying/gimping the card's real performance. This makes the comparison invalid. The cards should all be left at stock to show their real-world, out-of-the-box performance.

    • @teflonstestbench
      @teflonstestbench  2 months ago +7

      What I'm doing is setting it to its stock settings. Gigabyte bumped the TDP to 300W while a stock 2080 Ti should be at 250W, so all I'm doing is rebalancing the card so it doesn't have an unfair advantage.

    • @pkpnyt4711
      @pkpnyt4711 2 months ago +5

      @@teflonstestbench Stock settings for 3rd-party cards are whatever they come with out of the box. What you're doing is gimping it, period. It's been the same for every 3rd-party card since forever: they're clocked higher and push their voltages to whatever their default settings are. Agree to disagree.

    • @teflonstestbench
      @teflonstestbench  2 months ago +1

      What you say is true to a degree: they do have higher boost clocks and different voltages and PCBs, but that doesn't matter much, as Nvidia's boosting works on top of that.
      What I'm limiting is the power, because I want the performance of the base card, the average for that model, and not the extra performance a specific manufacturer gives it; that levels the playing field as much as possible. It wouldn't be fair to compare a card with extra TDP and a special PCB against a low-end card, as the performance difference forced by their designs would make the gap between the cards look artificially bigger or smaller than it really is. So by downgrading, or in other cases upgrading, a card I can make sure each one is as close as possible to the average performance for that model, and make a fair comparison between the lineups rather than between specific manufacturer cards.

    • @pkpnyt4711
      @pkpnyt4711 2 months ago

      @@teflonstestbench The base cards (reference models) were barely sold by Nvidia; it's the same for the 3xxx series. They were very few in number, so your logic is flawed. Most people who owned 2080 Tis (including myself) bought the 3rd-party cards because they were readily available and performed better than reference. If you're following that same logic, then you should also use the reference cards for the other 2 cards, which you aren't. So you're contradicting yourself, which brings me back to my original point.

    • @teflonstestbench
      @teflonstestbench  2 months ago +1

      I'm using the default values from Nvidia that most cards follow; 3rd-party cards usually keep the same TDP and really close speeds. It's only when some have extra power or boost way higher that I compensate.

  • @takik.2220
    @takik.2220 2 months ago +1

    could you do a benchmark at 1440p with these cards?

    • @teflonstestbench
      @teflonstestbench  2 months ago +3

      Yeah, will do it soon; the next videos are gonna be all 1440p and ray tracing tests.

  • @Howlsowls
    @Howlsowls 2 months ago +1

    Hey Teflon, love your benchmarks. I must've watched a hundred of your videos making buying decisions every year.

    • @teflonstestbench
      @teflonstestbench  2 months ago +2

      Thank you so much for the nice comment, and so happy they helped :)

  • @sc9433
    @sc9433 25 days ago

    The RX 6700XT has almost half the power draw.

    • @pwnomega4562
      @pwnomega4562 22 days ago

      No it's not. Look at the real-time data in the video.

  • @yummerz6302
    @yummerz6302 2 months ago +1

    Seeing the 6700XT lose sort of hurts my soul. I hope everything is in order with that 6700XT.
    Also, goddamn, how is that 2080Ti so cool?

    • @teflonstestbench
      @teflonstestbench  2 months ago +8

      Yeah, it's just that the RX 6700XT was meant to compete with the RTX 3060 Ti, so it is performing as it should, even better than I would expect given it's close to RTX 3070 performance.
      The RTX 2080 Ti is that cool because I had to watercool it. The Gigabyte design is so bad that the cooler stopped making proper contact and couldn't cool the card down properly, so the only option was to use another cooler, and the only ones compatible with all the models are water-cooling kits like the NZXT Kraken G12 I'm using.

    • @yummerz6302
      @yummerz6302 2 months ago +1

      @@teflonstestbench Only now have I noticed that I hadn't read the important notes after watching lmao. Ah, here's an interesting video idea: if you ever get a 6750XT, OC and undervolt the 6700XT and put them up against each other! I can share my settings for the 6700XT if you'd like.

    • @teflonstestbench
      @teflonstestbench  2 months ago +3

      @yummerz6302 Yeah, that's something I would love to do with the other "50" cards from that generation, as the differences are just OC and power. Though sadly I can't afford to get all those GPUs, as they would only be relevant for those tests and they're quite expensive. Same with the 7600 and 7600XT, where the difference is just the memory...

    • @yummerz6302
      @yummerz6302 2 months ago +1

      @@teflonstestbench the ultimate bottleneck of a gaming pc... budget...

    • @feisaljatnika9160
      @feisaljatnika9160 2 months ago +2

      The 2080Ti is the greatest invention from Nvidia; it even gets higher FPS than the 4060.

  • @cfif_asd
    @cfif_asd 2 months ago

    In RDR2, with SAM enabled the 0.1% lows shoot through the roof and the card will clearly pull ahead of the 3070. No need to thank me)

  • @dr.eduardogomes8592
    @dr.eduardogomes8592 1 month ago

    Nvidia (the white line) is better than AMD.

  • @piyapolphetmunee3879
    @piyapolphetmunee3879 2 months ago +1

    Most of these tests are CPU limited

    • @teflonstestbench
      @teflonstestbench  2 months ago

      Why?

    • @JustAGuy85
      @JustAGuy85 2 months ago

      He's running... a 5800X. You know that's on par with a 12600K/12700K in gaming, right? Despite being released in the same generation as Intel 10th gen.
      I have a 5900X. It was released the same year as Intel 10th gen. It eats the 10900K's lunch in gaming. Beats the 11900K in gaming. Goes back and forth with the 12700K (which is a 12-core CPU, like the 5900X).
      The 5800X performs practically the same as a 5900X, maybe 2-3% slower due to ever-so-slightly lower boost clocks + half the L3 cache.
      Shi... my 5900X has PBO on, CO tuned per core, and I have 2x16GB dual-rank DDR4 @ 3666 C16 with fully tuned timings/sub-timings. I can't get my CAS latency any lower without throwing more voltage at the RAM than I want to, because it's dual-rank, but dual-rank RAM has been shown to give anywhere from 3-10% performance gains. And DDR4 3600 C16 is quite the sweet spot for Zen 3 anyway. I just went to 3666 with the IF @ a 1:1 1833MHz for the lulz.
      Ryzen cares about timings more than anything. DDR4 3400 C14 would most likely perform better than my RAM.

  • @PLAEJA
    @PLAEJA 2 months ago +1

    RAYTRACING ON !!!! WOW, MAN ....