RTX4060 AV1 vs H.265 Encoder Using Handbrake

  • Published: 27 Aug 2024
  • Comparing the AV1 and H.265 encoder cores in the new NVIDIA RTX4060 chipset using Handbrake.
    If you find my videos useful you may consider supporting the EEVblog on Patreon: / eevblog
    Web Site: www.eevblog.com
    Main Channel: / eevblog
    EEVdiscover: / eevdiscover
    AliExpress Affiliate: s.click.aliexpr...
    Buy anything through that link and Dave gets a commission at no cost to you.
    T-Shirts: teespring.com/s...
    #nvidia #4060 #handbrake

Comments • 92

  • @crackwitz
    @crackwitz 11 months ago +137

    CQ scales aren't comparable between codecs. Your options: encode to an average bitrate and compare the visual quality, or encode at multiple CQ values and then match the resulting bitrate or quality.

    • @sth1990
      @sth1990 11 months ago +5

      Or actually compare quality with tools like ffmetrics (don't know if av1 is supported at this time)

    • @salat
      @salat 11 months ago +17

      CQ values are not comparable, absolutely true! He should compare the output at a fixed bitrate - e.g. 4K at only 10 Mbit/s: the difference between HEVC and AV1 is mind-blowing.

    • @Thermalions
      @Thermalions 11 months ago +9

      Indeed, it's a mistake many people made in the early days of the switch from 264 to 265.

    • @gglovato
      @gglovato 11 months ago +4

      Exactly this, cq is not comparable between codecs

    • @WacKEDmaN
      @WacKEDmaN 11 months ago +1

      I was gonna say exactly the same thing...
      ...and AV1 is a PITA... half the time AV1 YouTube videos stutter and carry on like a cut snake, especially when you have three YouTube videos going on three screens. I had to do everything I could to force it to VP9 just to get it smooth.
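
    Several replies here suggest matching the bitrate and then judging quality, by eye or with a metric. A minimal sketch of that workflow, assuming ffmpeg is on PATH and was built with libx265, libsvtav1 and libvmaf (file names and the 10 Mbit/s figure are placeholders):

        import subprocess

        SOURCE = "source.mp4"      # placeholder input clip
        BITRATE = "10M"            # same average bitrate for both encoders

        def encode(codec: str, out: str) -> None:
            """Encode SOURCE with the given ffmpeg encoder at a fixed average bitrate."""
            subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec,
                            "-b:v", BITRATE, "-an", out], check=True)

        def vmaf(encoded: str, reference: str) -> None:
            """Print a VMAF report for `encoded` against `reference` (same resolution/fps assumed)."""
            subprocess.run(["ffmpeg", "-i", encoded, "-i", reference,
                            "-lavfi", "libvmaf", "-f", "null", "-"], check=True)

        encode("libx265", "hevc_10M.mp4")
        encode("libsvtav1", "av1_10M.mkv")
        # With the bitrate held constant, the encoder with the higher VMAF
        # (or simply the better-looking output) wins at that bitrate.
        vmaf("hevc_10M.mp4", SOURCE)
        vmaf("av1_10M.mkv", SOURCE)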

  • @salat
    @salat 11 months ago +40

    8:30 Not entirely correct: the GeForce RTX 4070 Ti and bigger models have 2 (two) NVENC encoders built in (see NVIDIA's Video Encode and Decode GPU Support Matrix). Apps like DaVinci Resolve Studio can use the second encoder to double the encoding speed.

    • @Mr.Leeroy
      @Mr.Leeroy 11 months ago +4

      Also, max resolution is probably limited by the VRAM requirement.

    • @gyulamolnar8971
      @gyulamolnar8971 10 months ago

      And what about Intel and AMD cards? Which can encode faster?

    • @Xero_Wolf
      @Xero_Wolf 8 months ago +1

      @@gyulamolnar8971 I haven't been able to compare Intel and Nvidia, but if you are looking at AMD for encoding, forget it. It's horrible compared to the other two.
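
    Which GPU encoders are actually usable depends on what your ffmpeg (or HandBrake) build exposes. A quick, hedged way to check with a stock ffmpeg on PATH (this only lists compiled-in encoders, not what the installed GPU really supports):

        import subprocess

        # Ask ffmpeg for its encoder list and keep the hardware-backed ones:
        # NVENC (NVIDIA), QSV (Intel Quick Sync), AMF (AMD).
        out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                             capture_output=True, text=True, check=True).stdout
        for line in out.splitlines():
            if any(tag in line for tag in ("nvenc", "qsv", "amf")):
                print(line.strip())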

  • @alexatkin
    @alexatkin 11 months ago +31

    As many others have said in the comments, the quality factor is not comparable between codecs. If you had hovered over the slider for a second, HandBrake even tells you this: "The scale used by each encoder is different".
    Most importantly, hardware encoders are optimised for speed; with software you can sacrifice speed for bitrate savings. If you want the absolute best saving in storage space, you need a beefy CPU and software encoding. AV1 is mostly a benefit right now if you need the most efficient real-time encoder, such as for live streaming, or in my case when I'm doing AI upscaling and will do a final encode in software once I'm sure the result looks good.

    • @IvanNedostal
      @IvanNedostal 7 months ago +1

      He used Intel's SVT encoder on an AMD system and comments that it is slow... He doesn't even know what he is saying/doing.

    • @volodumurkalunyak4651
      @volodumurkalunyak4651 5 months ago

      @@IvanNedostal SVT-AV1 isn't Intel-only. It is a software encoder, NOT a hardware one. It runs just as fast on Zen 4 as on Raptor Lake.
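
    To underline that point, SVT-AV1 is a pure CPU encoder and doesn't care which GPU is installed. A minimal sketch of a software AV1 encode via ffmpeg's libsvtav1 wrapper (file names and settings are illustrative, not recommendations):

        import subprocess

        subprocess.run(
            ["ffmpeg", "-y", "-i", "source.mp4",
             "-c:v", "libsvtav1",
             "-preset", "6",        # lower presets are slower but more efficient
             "-crf", "30",          # constant-quality target on SVT-AV1's own scale
             "-c:a", "copy",
             "av1_software.mkv"],
            check=True,
        )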

  • @liampwll
    @liampwll 11 months ago +12

    In addition to what's been said already: NVENC is not designed for quality, it's designed to be fast and low power. Also, Intel's Quick Sync on their newer GPUs is a little faster and a little higher quality per bit, if you can get a low-end Intel GPU.

    • @everydreamai
      @everydreamai 11 months ago +3

      NVENC has improved substantially in quality over the years though. Early on it was pretty bad because it lacked quite a few compression features like motion compensation. Its overall envelope of performance and quality has improved.

    • @Mr.Leeroy
      @Mr.Leeroy 11 months ago

      @@everydreamai It is still aimed at real-time encoding with streaming in mind.
      I highly doubt it is possible to design an economically viable real-time encoder ASIC with quality comparable to the limits a codec can reach when general-purpose computing trades time for quality.

    • @matthiasfuchs1299
      @matthiasfuchs1299 11 months ago

      I am also wondering why he is using those hardware encoders. They also create much larger files. Maybe he needs the higher encoding speed, but he certainly loses quality and ends up with (much) larger files.

  • @everydreamai
    @everydreamai 11 months ago +8

    As others suggest, a "26 quality" on codec A is not the same quality as a "26" on codec B, and targeting some sort of equal "quality" number is actually quite a hard engineering problem; you need to use a metric like SSIM to judge "equivalent quality". Sometimes the quality ranges between codecs don't even overlap, as I think another comment pointed out. One might go from 0 to 50, another might go from 1 to 10.
    Another method is to use the same bitrate and then see which one looks better, purely subjectively. The barrier to entry is lower than trying to use objective measurements that simulate subjective quality (SSIM, VQEG). At high bitrates you may have a hard time distinguishing two different codecs, but at lower and lower bitrates you may find AV1 loses less subjective quality than h.265. This is very much a performance-envelope thing, balancing bitrate/filesize against "quality". If at X bit/s both look the same to you, that just suggests trying a lower and lower bitrate until you can tell the difference; perhaps at (X-500) kbit/s or (X-1000) kbit/s you can see one.
    The rabbit hole goes pretty deep on this, but I think if you keep investigating you should find AV1 is quite good. It's becoming more popular and widely used across the web for good reason.
    There's quite a lot of engineering expenditure from the broader industry on these codecs, and they continue to improve as clever mathematicians devise new schemes and hardware support for the more advanced maths is baked into silicon to make it practical to use.
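
    A minimal sketch of the "same bitrate, then walk it down until you can tell them apart" idea, using ffmpeg's ssim filter as a rough objective stand-in for eyeballing. It assumes an ffmpeg build with libx265 and libsvtav1; file names and the bitrate ladder are placeholders:

        import subprocess

        SOURCE = "source.mp4"

        def encode_and_score(codec: str, kbps: int) -> None:
            out = f"{codec}_{kbps}k.mkv"
            subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec,
                            "-b:v", f"{kbps}k", "-an", out], check=True)
            # ssim filter: first input is the encode, second is the reference.
            subprocess.run(["ffmpeg", "-i", out, "-i", SOURCE,
                            "-lavfi", "ssim", "-f", "null", "-"], check=True)

        for kbps in (8000, 6000, 4000, 2000):   # descending bitrate ladder
            for codec in ("libx265", "libsvtav1"):
                encode_and_score(codec, kbps)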

  • @mehow357
    @mehow357 11 months ago +15

    I don't know the specifics of HandBrake, but there are pros & cons for AV1 over h265. AV1 is royalty-free, whereas h265 is a big royalty mess (a totally unclear scheme). As it usually goes, free stuff becomes more popular over time, as the encoders/decoders get used by many apps, browsers, phones, PCs, media players, etc. Even though AV1 is more hardware-demanding, it offers lower bitrates and smaller sizes at comparable quality. There should be something like a 20% gain over h265. If I'm not mistaken (I don't remember, I should dig out my old notes), AV1 has a CRF range of 1-63 while h265's is 1-51, so you should go with higher numbers for AV1. Also, CRF is not that great an indicator; it would be best to google what the AV1 equivalent of h265's 26 is. To sum up: you cannot compare the same settings for both codecs. I know it would be convenient, but unfortunately you can't.
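
    One hedged way to look for an "equivalent" AV1 CRF is to sweep the AV1 encoder and compare each result against a fixed h265 reference encode, by file size (or by VMAF, as in the other sketches). Assumes ffmpeg with libx265 and libsvtav1; file names are placeholders:

        import os
        import subprocess

        SOURCE = "source.mp4"

        def encode(codec: str, crf: int) -> int:
            out = f"{codec}_crf{crf}.mkv"
            subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec,
                            "-crf", str(crf), "-an", out], check=True)
            return os.path.getsize(out)

        reference_size = encode("libx265", 26)   # the h265 setting to match
        for crf in range(26, 52, 4):             # AV1's CRF scale runs higher
            size = encode("libsvtav1", crf)
            print(f"svt-av1 crf {crf}: {size / reference_size:.2f}x the x265 crf 26 size")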

  • @Motolav
    @Motolav 11 months ago +9

    The SVT encoder (CPU only, I believe) works best with AVX-512 instructions; among new chips, currently only Intel server chips and AMD Ryzen 7000 series chips include AVX-512.

    • @braincruser
      @braincruser 11 months ago +2

      AMD 7000 has fake AVX-512.

    • @Motolav
      @Motolav 11 months ago +2

      "Fake" AVX-512 is better than no AVX-512

    • @braincruser
      @braincruser 11 months ago

      @@Motolav Not really. Fake AVX-512 is a compatibility layer for developers to target; it isn't there for a performance improvement.
      Most libraries that implement AVX-512 also provide AVX2 compile targets, so a compatibility layer isn't strictly necessary.
      The performance difference between AVX2 and fake AVX-512 is usually within 30%, sometimes in AVX2's favour, sometimes in AVX-512's.
      The performance difference between AVX2 and true AVX-512 is 2-4x, as long as your CPU doesn't catch fire.

    • @TheBackyardChemist
      @TheBackyardChemist 11 months ago +2

      @@braincruser It is not really fake on AMD 7000, IMO. They have the ZMM registers, etc., and they have hardware to support the masking and other features that have no direct replacement in AVX/AVX2. Yeah, they split the execution into 2 uops, but having the new shuffles, masking, etc. can still be really useful in some programs. Even Intel has varying degrees of AVX-512 support, with some CPUs having more or fewer FMA units.

    • @Motolav
      @Motolav 11 months ago

      @@TheBackyardChemist It's actually 1 cycle for Zen 4 to do AVX-512.
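
    If you want to know whether your own CPU advertises AVX-512 before picking an SVT-AV1 preset, a quick Linux-only sketch (it reads /proc/cpuinfo, so it won't work as-is on Windows or macOS):

        from pathlib import Path

        # Collect every CPU flag that starts with "avx512" (avx512f, avx512dq, ...).
        flags = Path("/proc/cpuinfo").read_text().split()
        avx512 = sorted({f for f in flags if f.startswith("avx512")})
        print("AVX-512 extensions:", ", ".join(avx512) if avx512 else "none reported")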

  • @paul312yt
    @paul312yt 11 months ago +7

    Are the quality settings equivalent between different encoders? Very limited knowledge myself but I think you are comparing apples and oranges?

    • @alf5197
      @alf5197 11 months ago +5

      They are not the same between any encoders, even between h265 and h264.

  • @ivolol
    @ivolol 11 months ago +7

    Why do you want a GPU encoder if you are offline encoding for YT? A CPU encoder will give the best quality per bitrate. Your 5900X should still be a beast at that comparatively, although given it's a newer and more complex codec it's still going to be slower than previous ones (which is what you pay to get better quality per bitrate).
    Encoding on the GPU also won't tell you much (definitively) about filesize efficiency, because the GPU is just trying to spit out realtime frames, not do the best job possible for the codec.
    You buy a GPU for encoding primarily for SPEED (realtime streaming), *NOT* for quality, efficiency, or filesize (all the things you wanted to compare). Use your CPU in 2-pass mode for that.

    • @FlamespeedyAMV
      @FlamespeedyAMV 2 months ago

      is 2-pass for like gaming and streaming / recording on the same PC?
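
    For the curious, 2-pass here means an offline encode that analyses the file first and then spends a fixed bitrate budget where it matters, rather than anything streaming-specific. A minimal sketch with ffmpeg's libx265 (bitrate and names are placeholders; on Windows replace /dev/null with NUL):

        import subprocess

        SOURCE, BITRATE = "source.mp4", "8M"

        # Pass 1: analysis only, output discarded, stats written to x265's log file.
        subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx265",
                        "-b:v", BITRATE, "-x265-params", "pass=1",
                        "-an", "-f", "null", "/dev/null"], check=True)
        # Pass 2: reuse the pass-1 stats to hit the average bitrate more intelligently.
        subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx265",
                        "-b:v", BITRATE, "-x265-params", "pass=2",
                        "-c:a", "copy", "hevc_2pass.mp4"], check=True)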

  • @Psi105
    @Psi105 11 months ago +6

    Quality factor values are whatever the author of the codec decided to use to measure quality. They are totally arbitrary and not a comparable metric, except between files that use the same codec.
    This is obvious if you encode something with QF 20 in MJPG and then QF 20 in H265 🤔
    You really need to keep lowering the QF until you notice it looking a bit shit, then do the same with the new codec and see how the two values compare.

    • @zlac
      @zlac 11 months ago

      I like this comparison, just see which one can make smaller files before looking a bit shit.😁

  • @Mr.Leeroy
    @Mr.Leeroy 11 months ago +3

    Everyone is so triggered by comparing CQ between different codecs. I haven't seen a single comment stating that, as a YT content creator, you should RTFM (look up "YouTube recommended upload encoding settings") and encode just above the maximum allowable average bitrate for your target resolution and framerate (or the comparable rate if you use an unsupported codec) prior to the upload render.
    BTW, AV1 was experimental on YT last time I checked, so I'm not sure the saved space is even worth the extra effort.

  • @repatch43
    @repatch43 11 months ago +3

    Sorry Dave, you've completely mucked up this comparison. The CQ numbers are not at all comparable; this is not a new thing, CQ numbers are NEVER comparable between codecs. You need to make comparisons at the same bitrates and judge whether the video quality is better for one or the other. There are tools that allow for this.
    My own experience is there isn't MUCH difference, but it's there. Whether it matters? Storage is cheap; I prefer x265 simply because it seems to be more common.

  • @kjlovescoffee
    @kjlovescoffee 11 months ago +2

    The thing to understand is that encoders are not created equal. Codecs (as opposed to encoders) are kind of like a standard, while the encoder is the tool producing output to that standard. Encoders are designed with certain performance characteristics in mind, and with that come certain trade-offs.
    When it comes to GPU encoding, the big two (NVIDIA and AMD) make gaming cards. Their video encoders are designed for streaming; they sacrifice quality for performance, quality in this case meaning image quality relative to size. This results in very large files.
    Intel, on the other hand, designed their encoders to satisfy Apple's demand for high quality (best quality for size), to allow Intel Macs to sync high-quality video to early iPods and iPhones at an acceptable speed. As a result, Intel's encoders produce much, much better quality per size compared to NVIDIA/AMD, but aren't quite as fast.
    So if you "just want to do encoding like this, just get the cheapest, shittiest card", don't buy NVIDIA or AMD. Get an Intel card or an Intel CPU with an onboard GPU. It's still going to be vastly quicker than CPU encoding, and you'll get a much better quality/size ratio than with the other two.

  • @noobulon4334
    @noobulon4334 4 months ago

    I was comparing some samples using x265 and svt-av1 earlier (both CPU encoders), and I'll share some results.
    I had x265 set to a CRF of 18 on the slow preset and svt-av1 set to a CRF of 25 on preset 6, without any other tuning. This gave me similar encoding speeds and very similar VMAF scores on my machine (98.5 vs 97, at about 7 fps).
    With this setup the AV1 file came out about half the size of the h265 file.
    The AV1 codec is definitely capable of great things, but I think the hardware acceleration for encoding is still a bit green.
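
    For reference, the two encodes described above expressed as ffmpeg calls, assuming builds with libx265 and libsvtav1 (file names are placeholders); VMAF against the source can then be measured as in the earlier sketches:

        import subprocess

        SOURCE = "source.mp4"
        # x265, CRF 18, slow preset
        subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libx265",
                        "-preset", "slow", "-crf", "18", "-an",
                        "x265_crf18.mkv"], check=True)
        # SVT-AV1, CRF 25, preset 6
        subprocess.run(["ffmpeg", "-y", "-i", SOURCE, "-c:v", "libsvtav1",
                        "-preset", "6", "-crf", "25", "-an",
                        "svtav1_crf25.mkv"], check=True)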

  • @GameBacardi
    @GameBacardi 11 months ago +4

    Not sure whether AV1 and H.265 have exactly the same quality factor (CQ slider); maybe not.
    I remember something about AV1 giving better STREAMING quality, but I could be wrong.

    • @alexatkin
      @alexatkin 11 months ago

      No two encoders have the same quality factor; they're effectively just different encoder profiles for those of us who don't understand how all the many different tweakable options in video encoders work.

    • @ivolol
      @ivolol 11 months ago +1

      AV1 gives better quality, full stop, ceteris paribus (same bitrate using two good encoding engines). You are correct that the CQ numbers are not comparable; they're unique to each engine.

  • @MatthewSuffidy
    @MatthewSuffidy 11 months ago +4

    You may get a tiny boost if you turn on Resizable BAR in the BIOS. It requires an EFI boot structure and disabling the CSM. There is a Windows utility called mbr2gpt that converts your disk for EFI.

    • @EEVblog2
      @EEVblog2  11 months ago +1

      I have no idea what any of that means.

  • @sem_skywalker
    @sem_skywalker 6 months ago

    SVT is software encoding on the CPU. It gives by far the best quality, but takes loads of time to compute. I spent 7 hours on a 4-minute clip in SD PAL. Amazing quality for the super-small size.
    I don't think you see the point of using AV1. You get *better quality for the same file size*. I'd personally only use NVEnc for drafts and tests, not for final encodes. CPU encoding is always the best when it comes to quality. But NVEnc is perfect for streaming.

  • @cliffordphillips305
    @cliffordphillips305 3 months ago

    Hi, I would be very interested in you trying the Xmedia Recode client; it can also do NVENC encoding. I know that with NVENC H265 you can select the HQ encoding modes and ABR two-pass encoding, which you can't do with HandBrake. With HandBrake, I don't know whether they are using the HQ or the LL (low-latency) settings, which should only be used for video streaming. If you use ABR with the same bitrate, the file sizes will be more comparable. I don't have a 4000-series card, so I don't know about the AV1 settings in Xmedia Recode.
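
    ffmpeg exposes similar NVENC knobs for anyone who wants to experiment outside HandBrake. A hedged sketch of an average-bitrate HEVC NVENC encode using NVENC's internal two-pass rate control (requires an ffmpeg build with NVENC and a supported NVIDIA GPU; values are placeholders):

        import subprocess

        subprocess.run(
            ["ffmpeg", "-y", "-i", "source.mp4",
             "-c:v", "hevc_nvenc",
             "-rc", "vbr", "-b:v", "10M",     # average-bitrate target
             "-multipass", "fullres",         # two-pass rate control inside NVENC
             "-c:a", "copy",
             "hevc_nvenc_abr.mp4"],
            check=True,
        )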

  • @kyrobase
    @kyrobase 10 months ago

    Thank you for this very interesting comparison.
    Do you see a difference in quality between H265 NVENC (GPU) and H265 CPU-only, for the same settings of course? Is GPU compression not bad?

  • @Xero_Wolf
    @Xero_Wolf 8 months ago

    Just to add: have you tried Shutter Encoder? I was running some tests comparing it with HandBrake, which I normally use, and I found Shutter Encoder to run faster in most cases. I just got a new Intel Arc GPU and I'm going to run some tests again.

  • @salapolivalenta77
    @salapolivalenta77 7 months ago

    You can encode 4K HDR on the CPU at around 10-11 fps using SVT-AV1 preset 6, CRF 10, and the results are fine. I have exactly the same CPU as you, but no video card capable of AV1 encoding.

  • @hungnguyenviet6016
    @hungnguyenviet6016 11 months ago

    What version are you using? I don't see this option in 1.6.0 with a Ryzen 5 7600 and RTX 4060.

  • @MVVblog
    @MVVblog 11 months ago +1

    Well done, Dave, you did a great job. AV1 isn't all that impressive, actually. (I just wanted to say something nice compared to what the others have said)

  • @MLeoDaalder
    @MLeoDaalder 11 months ago +1

    As I understand it, with streaming, you can use a (far) lower bitrate with AV1 compared to h265 and still have the same quality video. But I should clarify, I'm neither a streamer nor a video producer, so I have no experience with these things.

  • @Knobber
    @Knobber 6 months ago

    Is it worth upgrading from a 3050?

  • @BrianG61UK
    @BrianG61UK 11 months ago

    AFAIK there is no standard interpretation of the quality figure. It's quite possible that you need a lower quality number to get the same quality. IMHO you need to try different values until you get the same subjective quality and then compare the file sizes. But my understanding is that AV1 isn't ever a huge amount better than H.265.

  • @Okurka.
    @Okurka. 11 months ago +7

    The RTX 4060 was cheap because nobody is buying that POS.

    • @Chriva
      @Chriva 11 months ago +1

      What happened to the old name?
      Please never change :D

  • @FoamingC
    @FoamingC 11 months ago

    PNY makes a lot of professional video cards

  • @white13wolf63
    @white13wolf63 2 months ago

    SVT => CPU / QSV => Intel GPU (Arc and integrated)

  • @uncrunch398
    @uncrunch398 8 months ago

    Supposedly h264 is best if you want the absolute best appearance, scan for flaws like a forensic scientist, and don't want to use all the storage required by a true lossless format. Successively newer formats do better at lower bitrates that would be obviously flawed with the earlier ones. Re-encoding to AV1 on the CPU will let you go below typical 144p bitrates for 1080p and still look good, with a little mess, but you can mentally tune that out over time, just like very low bitrate audio.

  • @mr_sheen_asg
    @mr_sheen_asg 11 months ago +1

    When someone says smaller files and better quality I usually smell BS; it's like those car "tuning" boxes on the internet that claim to make your car use less fuel and produce more power 😂

    • @jwflame
      @jwflame 11 months ago

      It is, and for YT videos improved quality is mostly irrelevant, as most views are on mobile phones with tiny 6-inch screens.

  • @Dutch-linux
    @Dutch-linux 11 months ago

    Do not use the hardware presets; use the first or second one, then move the encoder preset slider (the little slider) lower. Constant quality you can set to 32, which is about the same as 28 for H265.

  • @JanKowalski-gl8rl
    @JanKowalski-gl8rl 10 months ago

    HandBrake 1.6.1, RTX 4090, newest drivers, and the only NVENC options available are for H.264/265. The AV1 NVENC isn't available, why?

    • @SyedAounAbbas
      @SyedAounAbbas 9 months ago

      You need to get the 1.7 beta version to get AV1 encoding for now.

  • @Dutch-linux
    @Dutch-linux 11 months ago

    No, the AV1 SVT encoder you can use with any card, as it is not GPU-bound but CPU-bound. I use it on my AMD APU!!!

  • @MrMichaelfalk
    @MrMichaelfalk 11 months ago +2

    One is 30 fps, the other 60 fps.

  • @gyulamolnar8971
    @gyulamolnar8971 10 months ago

    h266 is better; see "A comparative performance evaluation of VP9, x265, SVT-AV1, VVC codecs leveraging the VMAF perceptual quality metric".

  • @HanMoP
    @HanMoP 10 months ago

    What is the VMAF score?

  • @tempacc9589
    @tempacc9589 10 months ago

    Not gonna lie, they should rework that quality slider; it's super confusing. Quality isn't even comparable between formats, so it's completely meaningless as well.

  • @katrinabryce
    @katrinabryce 11 months ago +2

    The quality factor is not the same between codecs. If you encode for the same bitrate, AV1 will look better than H.265.
    If qf26 is the quality you want on H.265, you would be looking at somewhere around qf38 on AV1.
    If you just want to do video encoding, get an Intel GPU. Intel is rubbish for gaming, AI, and stuff like that, but very good for video.
    If you are doing live streaming of games, then get both: use the Nvidia card for the game and the Intel one for the streaming software. Or, as a lot of streamers do, use two separate computers - take raw HDMI out of the gaming computer and capture/stream it on the streaming computer. Then you would probably want the cheapest Mac Mini you can find for your streaming computer.

  • @repatch43
    @repatch43 11 months ago

    '26' for one codec is NOT comparable to '26' on another codec

  • @xKynOx
    @xKynOx 11 months ago

    Easy choice for me: my TV plays 10-bit x265 perfectly; AV1 is not supported.

  • @gabest4
    @gabest4 11 months ago

    Soon you will be switching back and forth between videos and looking at pixels from 1cm. Don't say I haven't warned you.

  • @IanScottJohnston
    @IanScottJohnston 11 months ago +2

    As I understand it, when encoding using the GPU there is still an element of decoding on the CPU, so if you are not getting 100% on the GPU it's probably because the CPU is bottlenecking.
    I played about with Handbrake just now on one of my 4k/60 HEVC (H265) native files from my Panasonic camcorder and could get a 412MB clip down to 30MB in H265 using the CPU with no loss in quality... heck, that's good. Tried AV1 also and it gave me a 122MB file.
    Not sure I will bother with the extra step in my workflow though... not until I need to, i.e. for storage space.

    • @IanScottJohnston
      @IanScottJohnston 11 months ago

      Out of interest rendering in Resolve (4k/60 H265):
      CPU = 19%
      GPU = 59%
      MEM = 50% (15GB from 32GB avail.)
      SSD = ~0%

    • @zlac
      @zlac 11 months ago

      Lower the AV1 quality until you get down to 70-90% of the file size you get with H265 and report back.

  • @Dutch-linux
    @Dutch-linux 11 months ago

    I get 120 frames a second on my APU, so something is wrong over there.

  • @vajona2495
    @vajona2495 4 months ago

    Stop using nvenc 265... Why are there so many of you who are either willfully ignorant or blatantly lying about this crap?

  • @markissboi3583
    @markissboi3583 11 months ago +1

    In 2019 I bought an i7-9700K and a GTX 2080 WaterForce. Looked up prices in 2023: still the same, and the 2080 beats a 3070 easily. Oh, how I beat the con-job sellers.
    You won't need to upgrade anything till past 2025, when they MAY release the home AI systems. Then again, not many normal households have a clue how a PC works anyway.
    True stuff: ask 5 neighbours and I bet you get a blank face. Tech hasn't caught up as much as you may think; most won't use it and don't need it, just watching their bird feeders from the kitchen window :) Good for them, oldies enjoying real-life pleasures.

    • @markissboi3583
      @markissboi3583 11 months ago +1

      I think EEVblog doesn't really have a clue, but you know, he has loads of crap on the walls to suggest he might, so ha.

  • @Dutch-linux
    @Dutch-linux 11 months ago

    Again, the SVT encoder is not Intel-specific; no need for another card. Use SVT, it will work just fine.

  • @meowtuna
    @meowtuna 11 months ago +1

    It's a first-generation hardware encoder and, to be frank, it's kinda garbage compared to the H.265 hardware encoder, which is now a few generations old and they have learnt a thing or two about a thing or two.
    AV1 shines more on a two-pass encode rather than a one-pass, and I wouldn't use it for footage to edit. It's not ready for that.

  • @leandrolaporta2196
    @leandrolaporta2196 11 months ago

    For offline encoding I will stick with the CPU: quality & size (matter) 😂. But if you want speed, forget ANY GPU, go Threadripper; that thing is a beast at encoding! With Threadripper the number of cores doesn't matter much; what you want is cache! But yes, pricey; the cost of power.

  • @Razor2048
    @Razor2048 11 months ago

    For AV1, the benefit is not really a smaller file size; the idea is to have higher quality at the same bit rate. Even with the constant quality setting, typically with AV1 you should see it handle high fill-rate content better - e.g., quickly panning the camera while pointed at trees or other objects with lots of fine detail - and the AV1 artifacts should be a little less noticeable. Overall, this means it is good for fast-paced action scenes.
    While you may not do gaming now, that card may be an issue when you find a new PC game that you make an exception for and really want to play: the 8GB will be an issue, especially for more graphically demanding games, since it has the GPU compute for 1080p at ultra settings but the VRAM will be lacking. While the gaming performance will be far better than the RTX 2060, the price-to-performance ratio of the 4060 will not be as good.

    • @EEVblog2
      @EEVblog2  11 months ago +1

      I don't care about gaming.

  • @MatthewSuffidy
    @MatthewSuffidy 11 months ago +1

    The 4060 was barely better than the 3060.

    • @everydreamai
      @everydreamai 11 months ago +3

      The 40xx series is the first chip from Nvidia to have AV1 encoding in hardware. This isn't about playing video games.

    • @MatthewSuffidy
      @MatthewSuffidy 11 months ago

      @@everydreamai Just saying, but yes I understood that.

  • @ololh4xx
    @ololh4xx 11 months ago

    Yeah Dave, you made a critical mistake: absolutely never buy/use a first-generation product. Also: CQ scales are not comparable between codecs, as a few other guys already mentioned. Stick to h265 for a few more years and give it another shot then... it'll be much better.

    • @alexatkin
      @alexatkin 11 months ago +3

      There's nothing wrong with the 40x0 series AV1 encoder; he simply doesn't understand that the CQ is never comparable between different encoders.

  • @KramerEspinoza
    @KramerEspinoza 11 months ago +2

    Yuck, Windows… does that still exist?

    • @jwflame
      @jwflame 11 months ago

      What else would you suggest? Apple's proprietary operating system, which only runs on their grossly overpriced proprietary hardware? Or the shambles of incompatible everything that is Linux?

  • @gyulamolnar8971
    @gyulamolnar8971 10 months ago

    handbrake.fr/docs/en/latest/workflow/adjust-quality - so I think h265 CQ30 ~ AV1 CQ42-46.