AV1 is disappointing.

  • Published: 3 Oct 2024
  • For now...
    My Spotify:
    open.spotify.c...
    Linus Tech Tips: • YouTube SHOULD charge ...
    The AV1 codec is in a tough spot compared to H.264 (AVC) right now. It's extremely powerful, but not much supports it. Let's see how it goes in the next few years
  • Science

Comments • 207

  • @ReLo_FZR • 1 year ago • +125

    The H.265 license killed its adoption.

    • @Z-add • 1 year ago • +4

      All new smartphones support H.265. Even my old Samsung Note 8 does.

    • @ReLo_FZR • 1 year ago • +15

      @@Z-add True, encoding and decoding a file is covered under the license; the problem is using it on streaming services like Twitch or YouTube. OBS allows recording locally in H.265 but not streaming in H.265, for a reason.

    • @Blueyzachary • 1 year ago

      @@ReLo_FZR you can stream in HEVC…

    • @KonglomeratYT • 22 days ago • +1

      @@ReLo_FZR That's not true. I have been livestreaming H.265 for over a year. It's been available for years.

  • @lifelesshawk5725 • 1 year ago • +124

    AV1 might have a slow initial start, but being a royalty-free codec will allow every company to freely add AV1 to their products, allowing for a large user base.
    In the next 5 years AV1 will probably be the new standard for most devices, and will be incorporated into most people's graphics cards for encoding.

    • @vextakes • 1 year ago • +11

      Ye I agree

    • @tui3264 • 1 year ago • +3

      It will be interesting when H.266 arrives in hardware.

    • @Aerobrake • 1 year ago • +3

      @@vextakes I'm looking forward to AV1's future

    • @HDRPC • 1 year ago • +4

      In 2023, AV1 is completely mainstream and I cannot live without it. I am enjoying AV1 on my RTX 4080.

    • @godnyx117 • 1 year ago • +2

      I would say, in the next 2-3 years. At this point, every graphics card (even budget ones) has both an AV1 ENCODER and DECODER.

  • @sean8102 • 1 year ago • +32

    Things changed pretty quick. RTX 3000 and newer GPUs, RDNA 3, Intel Arc etc. all support AV1 hardware decode now (and some, like Arc, even do AV1 encode), so CPU usage is ~5% when playing a 1080p AV1 vid on my 8700K (because my RTX 3080 is doing the decoding). The majority of the videos I get on YouTube when using my PC are in AV1 now, I'm guessing because I have an RTX 3080, which has AV1 hardware decode; meanwhile my Note 20 Ultra still gets VP9 since its SoC does not have AV1 hardware decode. Surprised it's taken Qualcomm till the Snapdragon 8 Gen 2 to finally add AV1 decode.

    • @vextakes • 1 year ago • +7

      Yeah this video is abt encoding not decoding

    • @memoli801 • 1 year ago • +3

      Nice info. RDNA 2 does decode too, by the way.

    • @Triro • 10 months ago • +5

      RDNA 2, the RTX 3000 series, and Intel Arc all do decode.
      But it's only the RTX 4000 series, RDNA 3, and Intel Arc that do the encoding.

    • @s.omarnoor1380 • 1 month ago

      Intel Xe also does decode.

    • @YannBOYERDev • 25 days ago

      Modern CPUs can even do software decode. My Ryzen 7 7840HS is at 15% usage when playing a 4K 60 fps video with the decoding done purely by the CPU, no GPU acceleration. But I understand that most people prefer hardware decoding even if it's less precise; software encode/decode will always be better, but slower, than hardware encode/decode. I can still play MPEG-2 (H.262, a video codec from 1996) videos on my CPU without my GPU supporting the codec, since CPUs are general purpose and can do everything. I doubt AV1 hardware decoders will still exist in 20-30 years, but I think AV1 videos will, and to play those old videos we can just use a software decoder (like dav1d) on the CPU. I have my whole movie/video collection in AV1 to reduce storage, and I'm confident that even when GPUs stop supporting AV1 decode, my CPU will always be there to save me, LOL.

  • @bodasactra • 1 year ago • +47

    You could buy the Intel Arc A380 for around $100, throw it into a lower PCIe slot under your main GPU, and use it for AV1 encoding. I hear it works great but I have not tried it. :)

    • @SkillerWolfCFM • 1 year ago • +8

      I did a benchmark in CP77 at ultra settings, QHD (resolution scaling and RT turned off), with an RX 7900 XTX, and the in-game performance difference I got between HEVC and AV1 was 2 fps, with both recorded at 50 Mbps, 60 fps, QHD. HEVC: 121 fps, AV1: 119 fps, so at the moment AV1 works fine.
      About having a secondary GPU: those who have huge GPUs like I do would barely fit a secondary GPU in there, and your main GPU will overheat due to no space for the fans.

    • @angrysocialjusticewarrior • 6 months ago • +1

      @@SkillerWolfCFM or you could just install one of the GPUs vertically using a riser cable.

  • @rocvan8190 • 1 year ago • +18

    This video is especially relevant NOW, because YouTube just decided to use AV1 for streaming in beta.
    I think this video is largely aimed at content creators and streamers, not just viewers. I don't understand why so many people dislike it (yep, using the 'Return YouTube Dislike' extension. Works great).

  • @BlakeB415 • 1 year ago • +14

    Even though YouTube has to store AV1, VP9 AND H.264, they see this as mainly a bandwidth-saving measure. Once enough devices support the codecs, and they drop support for older devices, then they will be able to see the storage-saving benefits as well.

    • @Triro • 10 months ago • +3

      YouTube should just dump H.264 encoding entirely at this point.
      The number of devices that can't decode VP9 is so small at this point.
      VP9 not only looks better, but also decodes just as fast as H.264 on a device that can use hardware to decode it.

    • @BlakeB415 • 10 months ago • +2

      @@Triro they should just keep 360p/480p H.264 for basic playback and drop every other H.264 quality.

    • @Triro • 10 months ago • +1

      @@BlakeB415 Yeah, fair.
      They should also just drop the completely useless 144p and 240p.

    • @mikethespike056 • 2 months ago

      @@Triro If they do that, at least keep 144p. It has its uses for minimum bandwidth.

    • @Triro • 2 months ago • +1

      @@mikethespike056 I mean, if u have to use 144p you might as well not be watching videos.

  • @vithorcicka3897 • 1 year ago • +29

    One interesting fact is that YouTube has started using AV1 heavily, and there is a good chance that this video is in AV1, because decoding is quite easy and most things can do it even without hardware decoding.
    It's hard for creators to encode without native hardware acceleration, but I think it will be the same as H.265, where there was hardware acceleration before native support (sort of like a translation layer).
    Good, based video.

    • @vextakes • 1 year ago • +5

      Thanks man. I didn't know it was ez to decode, that's cool to see. It might be possible for there to be a translation layer to help on systems without hardware acceleration

    • @XxmattitudexX • 1 year ago • +5

      I saw a video showing that AV1 is only used when view counts reach millions. You can force YT to use it by uploading in 8K resolution.

    • @noblessus • 1 year ago • +3

      Good points. However, I don't think YouTube uses AV1 video yet, although I wouldn't be surprised if they were planning on transitioning to it very soon given its superior efficiency. This current video, and all YouTube videos I've analyzed so far, only offer three codec options for playback: mpeg4, vp9, and avc1 aka H.264. We can check the current codec used on any YouTube video by right-clicking it and clicking "Stats for nerds". I have never seen them offer even H.265 so far, perhaps because it is similar to VP9 in efficiency. I guess for now AV1 must be reserved for videos surpassing a million views or 8K resolution, like @XxmattitudexX suggested. There is, however, an option in the account settings to force AV1 to always be used, but it doesn't seem to change what I described previously.

  • @TabalugaDragon • 1 year ago • +32

    I've been using AV1 for years. It's far from disappointing. From YouTube videos to AV1-encoded movies, the compression is amazing.

  • @uncrunch398 • 1 year ago • +11

    The main predicament is that most phones won't decode AV1, unless there's an alternate YT player. But that's where the largest benefit is for users to save bandwidth. If they got software decode support, it would drain the batteries faster. People would have to choose between shorter battery time and spending more on data. As of now, we can save battery and data by defaulting to a lower resolution / keeping data saver mode on. Any idea when AV1 hardware support will be normal in phones?

    • @aresiastest • 1 year ago • +3

      This is not true. Samsung TVs (and phones) from late 2020 onward have AV1 hardware decoders and were mentioned by Netflix as some of the first outlets for their 4K AV1 content.

    • @uncrunch398 • 1 year ago • +3

      @@aresiastest I'll have to look into this. I suspect a good portion of phones in use still do not have this support. The YT app on mine still only plays AVC. IIRC it was a new model in 2021; not a Samsung though. It's also too slow to play videos in a browser, so I quit trying.

    • @aalyx90 • 1 year ago • +1

      @@uncrunch398 Also, while AV1 encoding is really heavy without hardware acceleration, decoding is really light. Phones only need to do decoding, so it shouldn't impact battery life significantly.

    • @And-vx6ry • 1 year ago • +1

      Mediatek Dimensity 1000, Exynos 2100 or higher, Snapdragon 8 Gen 2

    • @uncrunch398 • 1 year ago • +1

      @@aalyx90 Can you measure the difference in battery drain with and without acceleration? Component power drain difference is usually huge comparing an ASIC vs doing the same thing on a general compute core, but as a portion of the total device drain? I still wonder.

  • @Z-add • 1 year ago • +5

    You can get an Arc A380 just for AV1 encoding.

  • @neutronpcxt372 • 1 year ago • +24

    I can just smell the misinformation.
    1. You can use software decoding just fine on any relatively modern CPU.
    I can decode 4K30 10-bit video on a trashy 2C/4T Skylake laptop CPU, so anything more modern should easily be able to cope with 1080p60 or whatever without much load.
    2. Comparing HW encoding vs software encoding is not really fair, especially considering that game streaming is an area where HW encoding has an advantage due to zero copy. For archival recording, use HW encoding all you want. For low-bitrate streaming/recording, you might as well use software encoding to get the efficiency gains.
    Let's not forget that OBS, on Windows specifically, seems to have problems with SVT-AV1 performance for some reason. Even discounting that, 1440p60 game footage is hard-to-encode material by default, making your CPU work decently hard :)
    I feel this video would have been better if it took a deeper technical dive on the subject.

    • @vextakes • 1 year ago • +7

      Yep decoding is np.
      I think it is pretty fair to compare software to hardware, because this video was focusing on the usability of AV1. I simply can't use it; that's why it's disappointing. And for my use case it's only a small loss in quality when using HEVC, or a slightly larger file size, but it takes little toll on my system. Also I'm not willing to drop the resolution or frame rate cos my GPU can encode at that level anyways.
      Atm software encoding, whether it be Windows, my specific configuration, or the SVT encoder itself, isn't quite there yet. I might make a follow-up if it gets better.
      I really wanted to make a video on this cos I couldn't really find a good video covering the usability of AV1. So I did most of the testing myself.

    • @neutronpcxt372 • 1 year ago • +1

      @@vextakes Still, you do make some good points.
      Also, considering my "expertise", maybe I should make a video on the subject :)

    • @vextakes • 1 year ago • +1

      @@neutronpcxt372 respect. U should. This was mainly just from my point of view

  • @limitlesswave • 1 year ago • +12

    What an expert, he doesn't even know how many threads the 5900X has. It's 24, NOT 32.

    • @netronin504 • 1 month ago

      32 THREADS LOL

    • @IggyTs • 29 days ago

      he misspoke. who gives a shit? get a life.

  • @WaveSmash • 4 days ago

    4:55 Actually, YouTube once DID re-encode a huge chunk of the videos stored on their platform. In 2014 they added 60 fps support for new and existing content, and did so by re-encoding all videos uploaded at 60 fps. To be fair, YouTube was significantly smaller at the time and it took far less computing power than it would have today. But it definitely was a huge task and YouTube was willing to do it. Kinda interesting.

  • @mianlo2624 • 1 year ago • +6

    I believe YouTube should transcode everyone's videos to VP9, instead of requiring users to upload in 1440p or get tons of views to receive VP9. And anyone who isn't able to play VP9 videos can have an AVC copy. AV1 has similar requirements, but you'll need to upload the video in 8K, or get it to millions of views.

    • @vextakes • 1 year ago

      Yeah it sucks they have to transcode and store a bunch of different versions of the same video. So they can't wait till eventually we could maybe standardize to AV1

    • @mianlo2624 • 1 year ago • +2

      @@vextakes We need to hope for tons of people to upgrade their phones to ones that are able to play back AV1 (which sucks (SD 8+ Gen 1)). Because I believe the majority of people view YouTube videos on their phone, and second is TV. Thirdly, the PS5 and XSX are able to play AV1 videos.

    • @vextakes • 1 year ago • +2

      Yeah I think the tech side of things is gonna go really fast. Every media device coming out in the next 5 years minimum will have AV1. But similar to H.264 and its adoption, it's really only been in the past few years that we can reliably assume that everyone has devices that support it. Until then, video platforms will prolly just have to be backwards compatible and wait for the long term

  • @petouser • 1 year ago • +5

    I think the main takeaway is that software encoding (on the CPU) isn't good for gaming streams or recordings, which is also the case for x265. The main reasons people do software encoding are quality and file size. For things like talking streams, cooking streams, whatever, AV1 software encoding should be fine. Also, if your CPU is struggling, you can choose a faster preset. Depending on resolution and such, SVT-AV1 preset 9 or 10 can be the sweet spot, and they are probably still slightly better than hardware-encoded x265.
    Just for comparison, SVT-AV1 preset 6 is about as slow as x265 preset slow, and just as nice looking, with AV1 maybe looking even better.

    • @vextakes • 1 year ago • +4

      Yeah it's Ight. Obv hardware acceleration is always better. I just thought a 12-core CPU would handle it, but it can't. I think I had SVT-AV1 on 9-10 btw and that was still how bad it was. Anyway it might be better now, this vid is a little old now

    • @gozutheDJ • 1 year ago • +1

      This is false though; if you have the cores, CPU encoding even in x264 is higher quality with less performance loss than using NVENC or AMF.
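
The SVT-AV1 presets discussed in this thread can be tried directly. A minimal sketch, assuming an ffmpeg build with libsvtav1 on the PATH; the filenames and the specific preset/CRF values are illustrative placeholders, not the video author's settings:

```python
# Sketch: build an ffmpeg command for an SVT-AV1 software encode.
# Preset ranges 0-13 in libsvtav1: higher = faster but lower quality.
import subprocess

cmd = [
    "ffmpeg", "-i", "input.mkv",     # placeholder input file
    "-c:v", "libsvtav1",
    "-preset", "10",                 # fast preset, closer to real-time use
    "-crf", "35",                    # quality target; lower = better/bigger
    "-c:a", "copy",                  # pass audio through untouched
    "output.mkv",                    # placeholder output file
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)   # uncomment to actually run the encode
```

Dropping the preset from 6 to 10 trades compression efficiency for speed, which matches the "sweet spot" suggestion above for struggling CPUs.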

  • @AlphaYellow • 1 year ago • +3

    Intel Arc's version of AV1 encoding is pretty awesome tho...

  • @amateurwizard • 1 year ago • +2

    The thing is, right now our sAV1our is H.265... yes, 5. Because the same thing happened an age ago with 264: it's going to be made license-free, and H.265 is going to be adopted in an instant. >90% of devices currently support it to some degree.

    • @vextakes • 1 year ago • +2

      They might be too late to act, AV1 has been picking up a lot of steam. Also u can now stream to YouTube with H.265 (and AV1), idk what that means for the licensing

  • @AgentSmith911 • 1 year ago • +4

    Imagine how good live streaming of video games will look with 4K 120 fps AV1 in a few years, probably running at less than 30 Mbit/s

    • @vextakes • 1 year ago • +3

      Lol that's so overkill

    • @vextakes • 1 year ago • +2

      Cool tho and mb possible

  • @lifelesshawk5725 • 1 year ago • +3

    Now what you can do is record at lossless quality and use HandBrake to encode to AV1 for smaller file sizes.

    • @vextakes • 1 year ago • +5

      Technically yeah, maybe for archiving afterwards. I'd rather just record HEVC instead, cos that takes a while to do

  • @dgillies5420 • 1 year ago

    Fun fact: YouTube was not profitable until 2015 (I worked for Google Search until 2018). They could NOT be profitable until they moved EVERYTHING onto Google datacenter servers. Google's big North American datacenters are the most efficient on earth: for every $1.00 spent on compute, Google spends only $1.10 total (the 10% overhead is everything else in the datacenter; there is NO A/C). Traditional datacenters would spend $2.00 to produce $1.00 of compute (in terms of electricity to run CPUs and disks). Next time you're mad at YouTube about the amount of revenue sharing, remember that THEY LOST MONEY for a decade until 2015! P.S. Google shared its datacenter design techniques with all its competitors to save on world energy costs...

  • @LocknLoad_Gaming • 1 year ago • +1

    I can't wait till it is stable.

  • @syspowertools3372 • 3 months ago

    I use AV1 transcoding for video streaming from my home media server, and all H.264 content is being converted into AV1 automatically now. H.265 is "good enough", but half the space at the same quality as H.264 is only part of the benefit.
    If you are streaming multiple videos at once to several family members while away from the house, that compression really matters.
    Also, AV1 playback is much less demanding than encoding, so lots of devices support it.

  • @leucome • 7 months ago • +1

    Nowadays it has become possible to stream AV1 with the CPU using the SVT-AV1 encoder. If anybody wonders, I have a 1440p 60 fps streaming test in a playlist called "unlisted video" on my channel. And if anybody wonders what the point of CPU streaming is when we have a GPU encoder: the image quality is actually a little bit better than the GPU encoder.

    • @ZenAndPsychedelicHealingCenter • 7 months ago

      It's nearly always more practical to use a GPU for encoding, and hardware encoding with higher-end GPUs can now be better quality than that of many CPUs. That said, yes, CPU encoding with a good CPU can be great quality, just not practical for most just yet.

  • @stawsky • 1 year ago • +4

    Great video, very useful information. Just a little correction: the Ryzen 5900X has 12 cores and 24 threads. I have the same processor, and it's sad that AV1 takes so many resources from the processor.

    • @Hi-levels • 1 year ago

      Can you do x264 "slower" at 1080p60 and 7000 kbit with the 5900X?

    • @stawsky • 1 year ago

      @@Hi-levels Yes, you can of course, but it will be quite a big hit to the CPU. I normally streamed with x264 at the medium preset before AV1 was available, and all was fine, but slower/slow has a big impact while the CPU is responsible for both gaming and streaming. For me it wasn't worth such a huge load on the CPU, but medium was perfectly fine.

    • @Hi-levels • 1 year ago

      @@stawsky I have a dual-system setup. The dedicated PC has a 5700G with PBO2 and does 14,900 in the R23 multi-core test. I can do 1080p60 "slow", but it is not very good in The Witcher 3 at 7000 kbit.
      I will get an R9 7900 or R9 5900X to do the "slower" preset. I saw a streamer with perfect image quality who uses a 12900K on a dedicated PC; she didn't tell me her settings but I'm sure she does "slower" lol. I have enhanced "slow" settings at 936p atm; 864p and 720p actually look worse.
      I have to test AV1 with The Witcher 3 at max motion and vegetation, then decide if it is better than x264. Also many devices can't run AV1; most Android TVs and phones already can't display Twitch 60 fps lol.
      NVENC and Quick Sync are good until you stream W3.

  • @Morris13-37 • 1 year ago • +3

    GPU encoding has left CPU encoding in the dirt... AV1 encoding is not meant to be done by a CPU if you want real-time results. It's the same with video games: they used to run off the CPU alone, until graphics cards started to help with the load. Nowadays you simply wouldn't imagine playing a game like Cyberpunk without a graphics card. The same has happened to real-time video encoding. I'm sure your 5900X can stream with the x264 'fast' preset just fine, but unless CPUs get specific hardware acceleration for AV1, it's simply not a good idea for real-time video encoding.
    This video feels like a "mehh, I can't use the new technology with what I already have." I don't really understand why this is surprising.
    Also, a lot of hardware already supports AV1 decoding, including phones. The tech world has been preparing for its mass adoption for a couple of years now.

  • @oswaldjh • 4 months ago

    Going AV1 is expensive as far as hardware requirements go.
    Last month I put together a capture PC with an AVerMedia PCIe 4K capture card mated to an RTX 4060 for the AV1.
    Each card was about $400 CAD; total system cost $1200 CAD.

  • @noblessus • 1 year ago • +2

    While it's true that encoding in software (CPU) is slower than encoding in hardware (like on a GPU's dedicated encoding circuitry), you could still use your old graphics card to hardware-encode in an older codec like HEVC at a higher bitrate, or even record raw video (provided you have enough disk space), so that you don't get those high CPU usage percentages while gaming. Then once you are done recording, you can convert that HEVC video to AV1 overnight using a software encoder at a lower bitrate. This way you get low CPU usage while recording your gaming session but also get to enjoy the benefits AV1 has to offer, albeit later.
    Also, AV1 decoding isn't as taxing as encoding; this is true for most codecs, including H.264/H.265. While most systems might not yet support AV1 decoding in hardware, it is not strictly necessary to decode in hardware; it can be done in software as well, and that is as simple as downloading a codec. So you don't need a new device to decode AV1: if it can already decode H.264/H.265 well, then it should have little problem decoding AV1.

  • @navyjonny117tng • 1 year ago • +3

    Great vid Vex! I recorded a 45-minute video with AV1. It looked sick, plus the size was 1.85 GB lol 😆 😂 😅 Never seen something like that before, and it didn't even eat the resources of my GPU and CPU. The best encoder and probably the future for recording and streaming.
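
As a sanity check on the file size mentioned above: 1.85 GB over 45 minutes implies an average bitrate of roughly 5.5 Mbit/s (assuming decimal gigabytes), which is indeed low for game footage. A quick sketch:

```python
# Average bitrate implied by a 1.85 GB file spanning a 45-minute recording.
size_bits = 1.85e9 * 8        # 1.85 GB (decimal) expressed in bits
duration_s = 45 * 60          # 45 minutes in seconds
mbps = size_bits / duration_s / 1e6
print(f"{mbps:.2f} Mbit/s")   # roughly 5.5 Mbit/s
```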

  • @tuapuikia • 12 days ago

    Fun fact: transcode H.264 1080p to HEVC 720p to save space, then use an Nvidia Shield or Samsung TV to upscale it to 4K 😊😊😊

  • @biglegs • 1 year ago • +21

    The thing about AV1 is that it's very slow to encode (my computer is also 10 years old, so it's worse), but you only have to encode videos in it once. And I've found that on that same 10-year-old laptop that only has an integrated GPU, while it takes over an hour to encode a few minutes, playback is smooth as butter.

    • @vextakes • 1 year ago • +4

      Yeah ig decoding must have some kind of translation layer

    • @dgillies5420 • 1 year ago • +1

      The higher the compression quality, the slower the encoder. It's just a fact. Higher-quality encoders have more signal processing to downsample the i-frames and d-frames, and some of the newer ones have a range of choices for which operations to use; they must try them all and pick the best ones to encode the next frame.

  • @charliebrown1947 • 5 days ago

    The 5900X is a 12-core, 24-thread processor, not 32 threads as stated.

  • @WarrenGarabrandt • 10 months ago

    I imagine there will be a transition period where AV1 and AVC are both saved in full quality, and only AV1 is served to clients that support it. But soon, AV1 will likely be the only codec that offers anything above 1080p. Legacy clients will be locked out of the 1440p, 4k, etc. higher resolutions, and people will just have to upgrade to get the higher quality videos.

  • @Triro • 10 months ago • +1

    And 11 months later, YouTube supports AV1 streaming.
    And AV1 GPUs are now cheap enough that the average gamer upgrading to something like a 7800 XT (don't get the 7700 XT, PLEASE) or the 4060 can utilize this amazing tech.
    And while only modern devices have hardware decoding, it's still possible for every device to decode it, as CPUs can decode it, though at a higher CPU usage rate, which is the reason I personally disabled it on my system until I get a 7800 XT in the mail.
    There are also dedicated AV1 encoding cards.

  • @Fezzy976 • 1 year ago • +1

    The 5900X is 12 cores, 24 threads. Not 32.

  • @Sharki33 • 1 year ago • +1

    What about VP9? YouTube is currently using it.

  • @carreraluu • 1 year ago • +1

    I just did a test myself with AOM AV1 running Battlefield 2042... Big improvement, however it sucks the life out of my CPU!

  • @lnostdal • 3 months ago

    All these codecs requiring dedicated HW to encode/decode is not good or sustainable IMO.

  • @nikolqy • 9 months ago

    One of the issues is that YouTube stores the original video. It might make sense, but they don't technically have to do that. Also, I think you're ideally supposed to record in whatever format has the highest quality, then compress it with ffmpeg and upload it to YouTube.
    SVT is the only AV1 encoder that will use low CPU. libaom isn't well multithreaded yet, I don't think.

    • @ZenAndPsychedelicHealingCenter • 7 months ago

      The part about being 'supposed to record in highest quality and then use ffmpeg' simply isn't true, or at least hasn't been for years. Most who upload videos to YouTube now don't use ffmpeg, at least not directly, and probably have no idea what it even is. There are encoding and decoding solutions available such that ffmpeg just doesn't need to be considered for most purposes by the average user.

  • @TanteEmmaaa • 11 months ago • +1

    Stop whining, these new developments are great.

  • @SamJ-x9n • 3 months ago

    Don't you mean the 5900X is a 24-thread processor?

  • @KooperK36 • 1 year ago

    AV1 doesn't suck; CPU encoding sucks. If you don't want to get a GPU just to encode AV1, either stick to H.265 or deal with the CPU issues.

  • @Sharkjumper • 1 year ago

    6:47 The R9 5900X has 24 threads; it's the R9 5950X that has 32 threads.

  • @sach046 • 1 year ago • +1

    So it's disappointing because you can't buy the latest GPU?

    • @vextakes • 1 year ago

      Nah cos I can't use av1

    • @aresiastest • 1 year ago • +1

      @@vextakes Funny that he is saying to you the same thing that I said: "you can't" doesn't mean other people can't. You can use AV1 today; that makes your video wrong.

  • @enekoredondo2462 • 1 year ago • +1

    6:46 You wish the 5900X had 32 threads... It is 12/24. I have the same CPU. I ended up here because I am trying AV1 on the CPU and it is unstreamable even at minimum settings. For now it seems it only works with dedicated chips.

  • @DemonSaine • 1 year ago • +1

    Why does this have so many dislikes? Mans is speaking nothing but facts with the receipts; I don't see the problem here.

    • @vextakes • 1 year ago • +3

      Ppl just instant dislike, it is what it is

  • @rov_es • 1 year ago

    Wait for the 4050, all kids gonna buy that.

  • @nuwave4328 • 1 year ago

    Uses AV1 at 20,000 kb/s for CPU and 2,500 kb/s for quality. Maybe try something reasonable like 5,000-10,000 kb/s?
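
For context on the bitrates being argued about, a small sketch of how recording bitrate maps to file size per hour (decimal gigabytes assumed):

```python
# File size accumulated per hour of recording at a given video bitrate.
def gb_per_hour(kbps: float) -> float:
    bits = kbps * 1000 * 3600   # bits recorded in one hour
    return bits / 8 / 1e9       # bits -> bytes -> decimal gigabytes

for rate in (2500, 5000, 10000, 20000):
    print(f"{rate:>6} kb/s -> {gb_per_hour(rate):.2f} GB/hour")
# 20000 kb/s works out to 9.00 GB/hour; 2500 kb/s to about 1.13 GB/hour.
```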

  • @JasonTaylor-po5xc • 10 months ago

    You could look at getting the Intel A380 just for AV1 support for recording/editing.

  • @dannysmith713 • 5 months ago

    The Ryzen 9 5900X is a 24-thread processor.

  • @FOUR22 • 1 year ago

    When recording I just crank the bitrate to something insane. Like for 1080p 60 fps I run over triple the recommended bitrate for recording in SLOBS. It helps the recording a lot, but it also uses a LOT of resources. I am probably only going to run twice the bitrate and use AV1 for a crisp picture.

  • @J0ttaD • 1 year ago • +1

    I'd assume AV1 is going to be standard this year.

    • @vextakes • 1 year ago • +2

      Maybe on the platform side of things and decoding, but encoding for the personal user I think will probably be pretty standard next year

  • @G4M3RGU1D3 • 8 months ago

    24 threads not 32

  • @pmAdministrator • 1 year ago • +1

    Don't worry lil man, you can ask money from your parents or some other adults that have money. You get a gpu in no time, lil bro. Just do your homework and do the dishes like every other kid.

  • @TrauniOfficial • 10 months ago

    Totally true; I think the best option is Intel's Arc cards (around the world), also for encode servers etc.

  • @Heldn100 • 5 months ago

    So if we have an RTX 4000 or AMD 7000 series, we are fine using AV1?

  • @amateurwizard • 1 year ago

    I find that to be a reasonable amount of CPU usage for AV1. I would set the core affinity in OBS and call it a night; it shouldn't affect gaming. H.265 is going to be the norm.

  • @LeicaM11 • 1 year ago

    I always use H.265, converting videos all the time, even right now in parallel on my iPad while watching this video on YT. 😊 Why does your AMD Ryzen not have any AV1 encoder/decoder engine? 😮

  • @YannBOYERDev • 6 months ago

    Real-time software encoding is pretty much a bad idea unless you are rocking an i9-14900K... On my i7-13620H laptop CPU I can get 250 fps encoding AV1 videos with ffmpeg, but it's not real time; when I record my screen in OBS with SVT-AV1, my CPU is at 40 to 50% usage.

  • @Heisenberg355 • 1 year ago • +1

    Are you really surprised your CPU is "pegged" when recording at 20,000 kbit? o.0 Even for H.264 that bitrate is too high, so with AV1 you *definitely* don't need a bitrate that high. Seems to me you purposely did that to get a clickbait thumbnail/title.

  • @DizConnected • 11 months ago

    I have an RTX 4090 but bought an Arc A380 for the stream PC, and it just takes all the hardware AV1 I can shove into it. Don't count out Intel Arc if you need hardware AV1 encoding.

  • @КОМБИНАТ-т1ъ • 6 months ago

    Intel has AV1 decode hardware acceleration in mobile processors with the Iris Xe iGPU, and will have encode support in 14th-gen mobile CPUs.

  • @tech360gamer
    @tech360gamer 1 year ago +1

    The Arc A380 is pretty much a cheap card for AV1

    • @vextakes
      @vextakes  1 year ago

      Yeah still over $100 just to encode av1, not really worth unless ur committed it feels like

    • @tech360gamer
      @tech360gamer 1 year ago +1

      @@vextakes Yeah I agree with you on that for sure

  • @ksouvenir5561
    @ksouvenir5561 1 year ago +1

    Even after watching this video, I am still confused. Can you explain why twitch is still using only H.264?

    • @user-ui4fn6fj3p
      @user-ui4fn6fj3p 11 months ago +3

      Everyone can decode h264 at a watchable rate, so twitch doesn't have to transcode every stream (they only transcode the partner streams). In order to switch to av1, they would either have to go the youtube route and transcode everything (expensive), or just wait until adoption is much higher.

    • @ksouvenir5561
      @ksouvenir5561 11 months ago

      @@user-ui4fn6fj3p OK 👌. It's down to money 💰

  • @danobannanno147
    @danobannanno147 1 year ago +1

    is this while playing the game on the Arc?

    • @vextakes
      @vextakes  1 year ago +2

      I’m confused what ur talking about

    • @danobannanno147
      @danobannanno147 1 year ago +1

      @@vextakes I'm sorry, for some reason, just because it was AV1 and I wasn't paying attention, I thought this was an Arc GPU

    • @vextakes
      @vextakes  1 year ago +1

      Gotcha

    • @danobannanno147
      @danobannanno147 1 year ago +1

      @@vextakes do you know if an Arc A380 as a dedicated encoder would be a good option to take some load off of my GPU while recording games? Edit: I record at 1080p on a 2070 & 5900x

    • @vextakes
      @vextakes  1 year ago +2

      I already answered this, check the comments

  • @Autotrope
    @Autotrope 1 year ago

    Even VP9 beats H265.
    Also H265 isn't closed source; the best implementations are open source, it just has the fear of patent entanglement to it. Which legislation really needs to sort out, but until then there are alternatives from companies that commit to not charging patent royalties.

  • @aixizu
    @aixizu 1 year ago

    i'm pretty sure av1 is more suited for a gpu encoder

  • @MishaXG
    @MishaXG 1 year ago

    There are so many videos about bitrate, CBR/VBR/CQP, AV1/HEVC/264, but none of them are clear and straight to the point when you use something other than 16:9 screens.
    Will only speak about recording, cause streaming... it's not that difficult to set up something that looks good.
    Recording is another story... the main problem being: "file sizes". Of course you can record at 120 - 140k bitrate CBR but the file will be stupidly heavy.
    Obviously AV1 > HEVC > 264, still HEVC seems "more secure" in my opinion.
    So can anyone give me some hints about what settings should be used for 3440x1440 recording at 60 FPS?
    Definitely using CBR; for editing it always seems the best choice for a smooth editing experience.
    But for the bitrate... I have to admit I'm a bit lost here. Some say 120k, others 60k, others 80k, others are just reading YouTube's advice: "something between 30k - 66k for HDR and 24k - 53k for SDR".
    I guess if you're using 16:9 you can basically follow YouTube's upload advice and just use the same bitrate for your recordings.
    Still no one really answers the bitrate question for widescreen 3440x1440.
    If anyone could help me with that it would be much appreciated, so my files stop being so damn big: 43 min = 18 GB (x2 cause webcam + screen, so 36 GB in total). Quality is definitely there, but those files seem a bit big compared to the quality we can see in this video at 2.5k or 20k bitrates.
    Computer config isn't a problem at all; it's about having the best quality at a more "normal" file size, and not installing a brand new AAA game's worth of data on a drive for every 1h of recording.
    Any help would be much appreciated.
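Since the file-size question above is really just arithmetic, here is a quick sketch in Python using the commenter's own numbers (decimal GB, video stream only, container and audio overhead ignored):

```python
def implied_bitrate_mbps(size_gb: float, minutes: float) -> float:
    """Average bitrate (Mbps) implied by a file size and duration."""
    bits = size_gb * 1e9 * 8            # decimal GB -> bits
    return bits / (minutes * 60) / 1e6  # -> megabits per second

def file_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """File size (GB) produced by recording at a constant bitrate."""
    return bitrate_mbps * 1e6 * minutes * 60 / 8 / 1e9

# The 43 min / 18 GB recording above implies roughly a 56 Mbps stream;
# the same 43 min at a 20 Mbps CBR would come out around 6.5 GB.
print(implied_bitrate_mbps(18, 43))   # ~55.8
print(file_size_gb(20, 43))           # ~6.45
```

So at YouTube-style upload bitrates the recording would be roughly a sixth of its current size; whether that quality holds up at 3440x1440p60 is something only a test recording can answer.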

  • @chriscoveries
    @chriscoveries 1 year ago

    but the shotgun was unplugged
    im so confused

  • @VazgShah79
    @VazgShah79 7 months ago

    I use an Nvidia video card (NVIDIA GeForce 840A), but playing YouTube video with the AV1 codec I get tearing video quality, unlike with VP9, VP8 or H.264. Please help me understand what I should do to have smooth playback of AV1 videos with my video card?

  • @reinaldo1504
    @reinaldo1504 6 months ago

    1 year later, boom: Arc A310 for only $99 😂😂. So we don't have to wait 6 years.

  • @aresiastest
    @aresiastest 1 year ago +1

    So much misinformation here. This is not intended to be used for realtime encoding on CPU. If you can't do hardware AV1, just stick to hardware H265 acceleration or H264 CPU encoding.
    Snapdragon 8 Gen 2 and Dimensity 9000 phones have hardware decoding support. And even the Netflix phone app is using AV1 software decoding on phones, and it's not a lot heavier than H264 or H265 decoding.
    Samsung TVs (and phones) from late 2020 onward have AV1 hardware decoders and were mentioned by Netflix as some of the first outlets for their 4K AV1 content.

    • @vextakes
      @vextakes  1 year ago +1

      Ey man what's the encoding alternative? Most ppl can't encode it, so how is it misinformation - it's disappointing. Also my 3080 has hardware decoding and it's great. Decoding is not the same as encoding tho, not even close, which is what this video is abt

    • @aresiastest
      @aresiastest 1 year ago

      @@vextakes it's new, everything new takes time. Your video is like complaining about 4K Blu-rays while still using an HD TV. It's on you, AV1 rocks. The alternative is using hardware H264, or hardware H265 if your GPU supports it.

    • @vextakes
      @vextakes  1 year ago +1

      U didn’t watch the video, ok I’ll stop

    • @maxcarrigan
      @maxcarrigan 1 year ago

      Encoding on CPU (software) is rough and inconsistent; I would forget about it for this generation. New Nvidia and AMD GPUs have hardware to do it so much better, the only problem is that no one is buying them, ergo no one is posting the results. Conclusion: we have the tech for it and the demand, however no one wants to spend a salary for the same GPUs we got last year with extra bells and whistles.
      Also really appreciate your content man, earned yourself a sub

  • @fff444
    @fff444 1 year ago

    tl;dr: it's using CPU.

    • @vextakes
      @vextakes  1 year ago +1

      It’s in the thumbnail.

  • @HDRPC
    @HDRPC 1 year ago +1

    AV1 is just the best. I love it. It is mainstream: all Intel CPUs from 11th gen to 13th gen support AV1 decoding, and RTX 30 and 40 series also support AV1 decoding.
    You must have an RTX 40 series, Intel Arc GPU, or RDNA 3 GPU to do AV1 encoding and decoding.
    Later this year, in October 2023, the Meteor Lake iGPU will also support AV1 encoding.

    • @gamesthatmatter9374
      @gamesthatmatter9374 1 year ago

      it's the same as HEVC

    • @HDRPC
      @HDRPC 1 year ago +2

      @@gamesthatmatter9374 it is way better than HEVC. I have seen comparisons and I have compared it on my RTX 4080.
      You can store much better quality than HEVC at very small file sizes and low bitrate.

    • @gamesthatmatter9374
      @gamesthatmatter9374 1 year ago

      @@HDRPC not true, I tested it myself. They are equal in quality at the same bitrate

    • @HDRPC
      @HDRPC 1 year ago +1

      @@gamesthatmatter9374 test it again using the latest OBS 29.1 version and the latest Nvidia driver.
      File sizes are much smaller and have the same quality as HEVC with its larger files.

  • @memoli801
    @memoli801 1 year ago

    How about now, still not using AV1?
    Maybe just buy a recording card?

  • @toinfinityandyourmom2219
    @toinfinityandyourmom2219 5 months ago

    av1 is meant to be used with hardware not the cpu

  • @Lucy_chan
    @Lucy_chan 1 year ago +2

    This video has to be a satire

    • @cowbee8865
      @cowbee8865 1 year ago +3

      Yeah, this has to be a really early April Fools joke from him. None of his arguments actually make sense.

  • @panlukynek
    @panlukynek 11 months ago

    nice room

  • @bjarnenilsson80
    @bjarnenilsson80 4 months ago

    Well, if you are on a desktop and have a PCIe slot free, you might look into getting a secondary GPU (Intel Arc A310). They are cheap, can be powered directly from the PCIe bus, and have hw AV1 encoders on board. So you get the best of both worlds: you can keep your current GPU for gaming and whatever else, and use the cheap A310 as a dedicated encoder.

  • @Abdullah______
    @Abdullah______ 1 year ago

    Is the av1 encoder available on the rx 6800 xt?

  • @markifi
    @markifi 1 year ago

    i was wondering about the audio! it was trash!

  • @doctorb4n3
    @doctorb4n3 1 year ago +1

    You must limit the game frame rate to 60 so that the PC has spare CPU time, and use SVT-AV1 with the "Faster" setting; for example I was able to do it live on a Ryzen 5900x ruclips.net/user/liveF9hPsJ1iwiY?feature=share

  • @Megabeans
    @Megabeans 1 year ago

    6:48 24 thread

  • @nofalware
    @nofalware 8 months ago

    thanks

  • @emerson-biggons7078
    @emerson-biggons7078 1 year ago

    I plan on acquiring a graphics card that can do AV1 encoding so I can record high res high fps footage.

  • @NGUNOW
    @NGUNOW 11 days ago

    Interesting point.

  • @ivanpopov1016
    @ivanpopov1016 1 year ago

    I use Xvid

  • @engahmednofal
    @engahmednofal 7 months ago

    thanks

  • @Hi-levels
    @Hi-levels 10 months ago

    How about Witcher 3 at ultra settings? At 6000-8000 kbit/s 1080p60, NVENC sucks at those games. Nobody talks about it. I have a 3090 and I still do x264 with my second PC, and picture quality is quite good.

  • @Nogardtist
    @Nogardtist 1 year ago

    20 Mbps is never too much;
    when it comes to best quality it's probably low

    • @vextakes
      @vextakes  1 year ago

      I was just talkin bout for av1 and just recoding gameplay. Prolly coulda done half that and got some pretty great footage

    • @Nogardtist
      @Nogardtist 1 year ago

      @@vextakes still it's good to show a real-time sample and footage, unlike some creators that only show graphs
      also i think AV1 is gonna have an easier time in pixel-style games like Darkwood or Terraria where not much motion is going on

  • @iv177
    @iv177 1 year ago

    there is so much wrong with this video
    why would you need to software encode av1 for an obs recording?
    that's not what it's meant to be used for
    it's not even meant to be used for obs recording at all, even in hardware
    it's meant to be a way for content streaming platforms to save on bandwidth and storage
    it typically doesn't have much use for anything else, except maybe faster video editing, but i might be completely wrong on that one
    av1 is now already in use in a lot of parts around youtube
    also an av1 encoder gpu like the intel arc is not expensive, 4070/90's are
    also i doubt that av1 to hevc to avc comparison was legit, the hevc looked like shit when it should have looked better than the av1, your settings for hevc might just be shit idk

    • @vextakes
      @vextakes  1 year ago +2

      Wat? Can't encode with hardware if u don't have it. And av1 is def also supposed to be used by consumers: can't stream av1 to these platforms if u can't use it, can't video edit with it if u can't record it

    • @iv177
      @iv177 1 year ago

      i think my response was deleted or there was a glitch or something

    • @vextakes
      @vextakes  1 year ago

      Did u swear in it? Sometimes YouTube blocks that

    • @iv177
      @iv177 1 year ago

      @@vextakes oh yea i did, woop

    • @iv177
      @iv177 1 year ago

      @@vextakes i wrote a whole thing twice because the first time i accidentally pressed the back button and lost the reply, and then the second time i wrote a longer one but then yea it just got deleted
      basically what i said in short is that av1 cpu encoding doesn't really do anything worth doing, and av1 cpu decoding is pretty good cause i would guess most phones have enough power to cpu decode at least 1080p youtube videos in av1, but that's just a guess. youtube has a lot of stuff in av1 now which is cool, and it saves youtube bandwidth and storage, also it looks better. hardware av1 would be good for video editing cause av1 decoders are faster than avc i think, and it's better than raw video cause of storage ofc. also you can re-encode videos with little visible quality loss into a different codec like av1 from hevc or avc using stuff like handbrake. also phones are coming with av1 de/encoders now
      i wrote like a whole paragraph before but that's basically what i said, i'm tired

  • @GRAFIKAtv
    @GRAFIKAtv 1 year ago

    How do I render video to AV1 in Premiere Pro, bro, for best quality on YouTube? I have a music video 👏🏼☺️

  • @ImNotFlutters
    @ImNotFlutters 9 months ago

    ?

  • @ImNotFlutters
    @ImNotFlutters 1 year ago

    ?

  • @blackrookgaming
    @blackrookgaming 8 months ago

    Please don't belittle the Name of Jesus.

  • @theplaintech
    @theplaintech 7 months ago

    I've turned on AV1 for YouTube videos. New Android phones have AV1. The new graphics cards have AV1 decoders but no encoders.
    You have a cat. All is forgiven.

  • @duladrop4252
    @duladrop4252 1 year ago +1

    AV1 encoding, from what I believe, needs AI cores to run efficiently... So this will probably work on the RTX 30 series as well, which I believe has AI cores too.

    • @Antagon666
      @Antagon666 1 year ago +2

      You don't know what you are talking about, do you?
      AV1 doesn't use 'AI'. It needs a specialized hardware encoder, not AI cores.

  • @jamesbyrd6636
    @jamesbyrd6636 1 year ago +1

    Just found you and loving your vids!

    • @vextakes
      @vextakes  1 year ago

      Thanks appreciate it

  • @KoFFeeFPS
    @KoFFeeFPS 1 year ago

    Vibey

  • @AmrKRZ
    @AmrKRZ 1 year ago

    bruh 5900x is old news now, what do you expect?

    • @hussein25580
      @hussein25580 1 year ago +1

      Still a high-end CPU for many. His point is that if a 12-core processor from 2020 struggles with AV1 encoding (I think 1440p60) while playing at the same time (apparently 144 fps), then it's going to struggle for the vast majority of users. The solution would be to pick a faster preset, lower the recording resolution, and lower the game's framerate as well.

    • @vextakes
      @vextakes  1 year ago

      👏🏼👏🏼

  • @vajona2495
    @vajona2495 5 months ago

    nvenc h265 != h265