Nvidia claims their AV1 encoder is the best - What's the truth?

  • Published: 3 Oct 2024
  • videocardz.com...
    blogs.nvidia.c...
    Twitter: / swallowfire
    Message me anywhere if you have any suggestions or comments, I'd love to hear from you!

Comments • 59

  • @REgamesplayer
    @REgamesplayer A year ago +16

    Thank you. This is exactly what I was looking for. There is next to no information comparing AV1 between the three cards.

  • @MichaelW1980randoms
    @MichaelW1980randoms 7 months ago +4

    What was the bitrate on the Blu-ray re-encodings? If you chose the same as you did for gaming, it's no wonder the Blu-ray re-encoding looked abysmal in comparison. AV1 encoding doesn't allow for 90% smaller files, and no one ever claimed that. As for streaming quality: AV1 streaming is supposed to be about 25% smaller, not half the bandwidth, so it's not really fair to compare 3.5 Mbit AV1 to 6 Mbit H264 either. I almost feel allowed to say here: all you prove is that Intel does a better job at too low a bitrate for streaming. At the very least, your testing methods are somewhat faulty.

  • @worthlessAF
    @worthlessAF 9 months ago +17

    The video is confusing. The title is "Nvidia claims their AV1 encoder is the best", but you look at bitrates that very few people use. Almost everyone maxes out the YouTube bitrate, especially gamers. Do the encoders perform the same at all bitrates? And what about presets?
    Then at the end you compare 35 Mb/s H264 with 3.2 Mb/s AV1? To see if we can see a difference? Of course we would see a difference, it's more than a 10x difference in bandwidth. Why would we not? You had me at the start and lost me at the end.
    Can you clarify your intentions?

    • @RichMantaray
      @RichMantaray 9 months ago

      He's not using a 40-series card with AV1, for sure lol. Cos my 4060 outperforms all his tests even in the normal way with AV1 lol

    • @RestTarRr
      @RestTarRr 3 months ago +3

      @@RichMantaray only 40-series cards have AV1 encoding... Your 4060 will have the same performance, unless they somehow made it better with newer drivers.

  • @Nostxalgic
    @Nostxalgic A year ago +9

    I really liked the format of this test, especially using a Blu-Ray movie as a source.
    Would love to see a retest soon, at specific bitrates, with AMD in the mix. It’d be the ultimate showdown.

    • @Swallowfire
      @Swallowfire  A year ago +4

      I want to! I just can't justify another expensive GPU right now lol
      Maybe if I can convince one of my friends to buy a 7900XTX and let me mess with it hahaha

  • @Julz2k
    @Julz2k A year ago +7

    What was the bitrate used for the Blu-ray AV1 encode compared to the H264 Blu-ray rip? I watched the part four times but I could not find any hints.

    • @Swallowfire
      @Swallowfire  A year ago +7

      Sorry about that, oversight in my script. It's the same as the gameplay - 3.2 Mb/s.

  • @OcihEvE
    @OcihEvE A year ago +2

    I am assuming this video made it to my feed because I was hunting for information on AV1 earlier. I own an RX 7900 XTX and was recording a benchmark at 4K high performance in OBS using AV1 at an 80,000 kbps bitrate, while running the benchmark on the same card. I had done a benchmark of the game without recording and was hitting 80 fps. That number dropped to around 56 avg fps while recording. I did another using an H.264 encode, and it dropped to 52 avg fps, so AV1 got me back some fps but it was minimal. When running H.264 my CPU spiked from 2% to 21%, but everything was still GPU-bottlenecked. I also tried running the benchmark in 1440p and upscaling the output to 4K with AV1, and the video was complete trash. I returned to a 90 fps avg, but with the video being useless it didn't matter.
    My goal was to establish whether I could play and stream in 4K, and while the answer is a resounding yes in most games, it wouldn't be wise to do so. As for the AV1 aspect, unless you have a severe CPU bottleneck it isn't a variable one needs to worry about.

  • @DrFlexit
    @DrFlexit A year ago +4

    Hey! Can you make a video on SVT-AV1 using CQP settings of your choice?

    • @Swallowfire
      @Swallowfire  A year ago +2

      I want to! I'm waiting until I can get an AMD 7000 card so I can do a three way high bitrate comparison.

    • @DrFlexit
      @DrFlexit A year ago +3

      @@Swallowfire no. I mean with a cpu. 😌

    • @MatheusTavares-od6nw
      @MatheusTavares-od6nw 5 months ago

      +1 to covering the encoding method/codec/settings that you prefer to use yourself, or maybe even a 4K/FHD comparison with it. Plus, a file size / bitrate / quality comparison between them would be awesome!!
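For reference on the CPU encode being requested above: a minimal sketch of what an SVT-AV1 constant-quality encode looks like with ffmpeg, assuming a build with libsvtav1 enabled (filenames and the crf/preset values are illustrative, not the creator's settings):

```shell
# Software (CPU) SVT-AV1 encode in constant-quality mode.
# -crf 30   : quality target (lower = better quality, larger file)
# -preset 6 : speed/efficiency trade-off (0 = slowest/best, 13 = fastest)
ffmpeg -i input.mkv -c:v libsvtav1 -crf 30 -preset 6 -c:a copy output.mkv
```

This is the closest libsvtav1 equivalent to the hardware encoders' CQP mode: quality is held roughly constant and bitrate varies with content.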

  • @TheRealCatCatCat
    @TheRealCatCatCat A year ago +6

    They should rebrand to NVIDIAnotherfib

  • @demon6937
    @demon6937 A year ago +1

    sheesh, I was just worried about why I settled for an RTX 3070 laptop when AV1 in the screenshots for the 40 series looks so good 😅

  • @ibrahimamirashayeri2775
    @ibrahimamirashayeri2775 A year ago +3

    I am looking to stream with my GPU. I am not sure if I should get the 6950 XT or the 4070, because they are the same price. Will the Nvidia 4070 make a large difference to the quality of my streams in comparison to the 6950 XT, which is more powerful? Pls help

    • @Swallowfire
      @Swallowfire  A year ago

      That's a really tough question. I think the 6950 is the better choice. It's an extremely powerful card, and while the encoder isn't as good, you have to think about which you'd prefer: a higher framerate, or a slightly better-looking image. Most people can't tell on the latter, but everyone can see a higher framerate.

  • @RaaynML
    @RaaynML 2 months ago

    I'm here because I noticed AV1 on NVENC being _noticeably_ worse quality than HEVC, even at a higher CQP

  • @Bigga_Velz85
    @Bigga_Velz85 3 months ago

    When doing a side-by-side comparison, using OBS and the Nvidia app to record clips/highlights with AV1... I'm being honest, the quality from the Nvidia app is amazing and looks way better than OBS: sharper and more crisp, no pixelation

  • @kozodoev
    @kozodoev 6 months ago

    Why not compare to a software encoder though? At least for the Blu-ray.

  • @Calabreezzy
    @Calabreezzy A year ago +1

    Thanks for the video! I stream Halo Infinite on YouTube, and I'm in the process of building a dedicated streaming PC. I ordered the Acer Predator BiFrost Intel Arc A770 for $479 CAD. Do you think that's a better option than the Nvidia RTX 4070, and what CPU do you recommend? AV1 encoding is my focus for streaming and recording gameplay, and I'm planning on using the Elgato 4K60 Pro MK.2 to capture my Xbox Series X.

    • @Swallowfire
      @Swallowfire  A year ago +1

      Personally for content creation on Arc, I would go with an Intel 13th gen CPU with an iGPU. The Arc and the iGPU can work together to improve encode speed.
      The 4070 will be faster but it's also way more expensive. If all you're doing is streaming, capturing, and editing video, then the Arc is a better value.

    • @Calabreezzy
      @Calabreezzy A year ago +1

      @@Swallowfire Thank you for your input! As long as the Arc A770 can simultaneously stream and record in 1440p without any issues, then I'll stick with it. I'm not gaming on this PC, so the RTX feels like overkill. As for the CPU, can I get away with a 13th gen i5 or i7, or should I go with the i9?

    • @Swallowfire
      @Swallowfire  A year ago

      Nah, I'd save money and go for something like an i5 13600K. It's incredibly fast for how cheap it is. Imagine going back 10 years and saying that in 2023 i5s have 14 cores and 20 threads.

    • @Calabreezzy
      @Calabreezzy A year ago

      How about a 12th gen? I'm seeing some great deals on them.

    • @Swallowfire
      @Swallowfire  A year ago

      12th gen is also good; I think the performance difference between them is slight.

  • @forcezero2220
    @forcezero2220 A year ago +2

    Man, I want to buy an Intel Arc but I'm afraid because it's not optimized yet. What do you recommend: buying greedy Nvidia, buying lazy AMD (new generation), or the third choice, Intel Arc?
    Btw my budget is $500 max for the GPU

    • @lain2236ad
      @lain2236ad 11 months ago

      A used 7800 XT seems like the best right now

    • @forcezero2220
      @forcezero2220 11 months ago

      @@lain2236ad I see. How is the performance of that card, from your experience?

  • @RichMantaray
    @RichMantaray 11 months ago +2

    I would believe you, but you still have those round things, so therefore I deem this to be clickbait :) I will come back when I have my 4060 and let you know

    • @RichMantaray
      @RichMantaray 9 months ago

      I did get a 4060, and omg it's way better than this guy's test. In fact I'm 100% sure he's lying and hasn't got a 40-series GPU at all lol, or he got his from AliExpress lol

  • @MissMan666
    @MissMan666 A year ago +3

    Would have been nice with AMD thrown into the mix

    • @Swallowfire
      @Swallowfire  A year ago

      I really hope their low end/mid range cards are good and well priced. They need to light a fire under Nvidia.

    • @Swallowfire
      @Swallowfire  A year ago

      Low end/mid range Radeon 7000 cards I mean

    • @godnyx117
      @godnyx117 A year ago

      Did you watch the video? He already explained this at the end of the video...

  • @Hi-levels
    @Hi-levels 10 months ago

    At 6000 or 8000 kbit, 1080p 60 fps, can you test AV1 hardware encoding with Witcher 3 at ultra settings? I have a 3090 and NVENC sucks in this scenario, so I use x264 + slow + some better flags, and it beats NVENC

  • @Komentujebomoge32
    @Komentujebomoge32 A year ago +5

    Man, better save up your money, because you will need to use your GPUs in order to eat. I don't know how the metal and plastic parts taste, but I'm sure that they aren't tasty

    • @Swallowfire
      @Swallowfire  A year ago +7

      I love the taste of copper. Maybe that's just blood tho.

  • @nofalware
    @nofalware 8 months ago

    thanks

  • @52no
    @52no A year ago

    If I stream games, do both encoders have the same quality?

    • @Swallowfire
      @Swallowfire  A year ago +3

      Yeah they're about the same. The footage I showed was recorded natively on both, so you can use that to compare.

    • @RichMantaray
      @RichMantaray 9 months ago

      It helps if you do 1440p, you'll notice a difference then

  • @faiyez
    @faiyez A year ago +1

    Nvidia's AV1 encoder beat Intel Arc's across the board in Tom's Hardware's tests.

    • @Swallowfire
      @Swallowfire  A year ago +1

      Yeah in a VMAF test. Which did you prefer in my Halo gameplay?

    • @faiyez
      @faiyez A year ago +1

      Can you promise me that Arc has the best AV1 encoder for streaming? Because the A750 at $199 US is so enticing

    • @Swallowfire
      @Swallowfire  A year ago +1

      I can't promise anything. I can tell you it's a good card for video production and a hell of a value for $200, compared to how crap the 4060ti is for double that.
      If you couldn't tell between my Halo footage, then you'll be happy with the Arc.
      Definitely wait for OBS to support Intel AV1 natively before buying anything.

    • @dRuNkHiPpi
      @dRuNkHiPpi A year ago +1

      @@faiyez I snagged an A750 open box for around that price. If your sole intention is streaming and/or productivity, I'd pull the trigger. The A750 isn't terrible at gaming either, but there are better options for not that much more money.

    • @faiyez
      @faiyez A year ago +1

      @@dRuNkHiPpi I'm not buying anything until July so the plan was to wait for the 4060.

  • @RichMantaray
    @RichMantaray 9 months ago

    This is fake, he's using a 30-series card lol, not a 40-series card, cos my 4060 leaves his findings in the dust lol, and I do 1440p. FAKE FAKE