That's not an excuse, because: 1. Allowing people to stream in AV1 doesn't force everyone to do so. 2. If you're using hardware encoders, it would be trivial for streamers to transcode to lower resolutions (or maybe even other codecs, but I'm not sure). 3. Twitch could transcode AV1 to H.264 just like YT does (at least for VODs). 4. Outside of power consumption, software decoding of AV1 is quite fast on any decent CPU.
You can also encode using your CPU, which is often higher quality than the GPU-encoded variants. However, it requires a lot of CPU power, but if you already have a streaming PC this likely won't be news to you. Higher presets using SVT-AV1 yield worse quality but faster encode times. It's a balancing act. But you can achieve H.265-like performance and beat H.265 in quality with the same encoding time, in my testing.
Especially with a regular-sized monitor. I have a 28" 4K monitor and I can't see the difference between 4K and 1440p even with my face smooshed up against the screen.
Dude, AV1 is one of the best video advancements we’ve made in a long time, and yet we still use H.264. It drives me wild that AV1 hardware is lagging so far behind.
Yep. I use Handbrake to encode some of my own ripped DVDs for putting on my phone or backing up. Handbrake supports the Nvidia AV1 hardware encoder. I've never used anything faster. 1 Mbit video (DVD resolution), Opus audio, which is close to transparent at 128 kbit for stereo, and you're off to the races. Netflix has used this stuff for years already, as it was also one of the companies that supported the development. All free codecs for both video and audio.
Just for the record, the reason that 4K is such a big damn deal is that a lot of people are consuming things like games, streams and YouTube on their big old 4K living room screens. Even 1440p looks crappy blown up that big, so it's gotta be 4K. If you mostly look at little screens, like phones and PC screens, or you only view traditional TV on the big TV and streaming on your phone, then it seems like a pointless gamer obsession with more pixels for the sake of pixels. It's not. It's about that 50-inch TV. I didn't really get this for a while, so somebody else needed to hear it.
Definitely thinking of upgrading to an Arc A770 from an RX 6600, because I'm pretty sure AMD graphics cards do not play nice with Overwatch. Lots of frames dropped, and I want AV1 encoding.
Thank you @Pirate Software. We've been working on converting videos TO H.264 and now I think maybe we're falling way behind. A new mission has started 🤓
I'm shocked Twitch is stuck on H.264. It was amazing a decade ago, but... that was a decade ago. Honestly, it's why I bought a 7900 XTX. Absolute dream with the AV1 encoder. Hard drive savings are amazing compared to H.264 and genuinely impressive compared to H.265.
They are also experimenting with having the streamers transcode to different resolutions as well, instead of that being done on the Twitch servers, which would save them even more money. It's already implemented in the new enhanced broadcasting thing they have.
Not to mention that within the specs of AV1 it can do 12-bit 4:4:4 lossless encoding/decoding, so it will be a great format for intermediates within a big video/cinema production. It is just missing an extra alpha channel, though.
The future is 4K 60 fps. Great for the stream, because why not future-proof it, but I can already see bunches of people trying to watch that on tiny little screens that can barely display 1080p, much less 2160p.
The main reason you need newer GPUs for now is that they have the hardware encoders, which are vastly more efficient; it's the same with NVENC for H.264 on Nvidia cards. At some point more hardware will natively support AV1 encoding, not to mention the software support is a given, as it's open source.
This is fantastic news! I honestly have no idea why a terrible proprietary standard that requires a license (H.264) became widely adopted in this day and age. Open source is always better and makes everything more accessible to everyone and cheaper for companies to use. The fact it's better too makes it a no-brainer. Out with H.264 and in with AV1!
H.264 is more supported because it's way older (has had more time), has much more hardware support, and even if it doesn't (wth are you using?) software decode is practically free. That's not to say AV1 shouldn't be the new default, but there's a reason why H.264 is still a thing to this day.
I didn't know H.264 was paid, holy shit, that explains so much. This is like Opus all over again: a royalty-free alternative that is so goddamn efficient that you can scale it down or up depending on what you want, and better than MP3 at a fraction of the bitrate.
I hate how they did JXL so dirty instead of embracing it. Apart from that I'm all for AV1 and can't wait for its successor, when we will get PVQ (pyramid vector quantization), which was dropped from AV1 as it would've required more work in creating the chips to do AV1 in hardware, since it was new. Which is funny, considering PVQ is great for being adapted to hardware, iirc. But nothing existed yet, and it would've increased time to shelf further. I like how Mozilla/Xiph, Cisco and Google fused their ideas while working on their own video codecs and then started working together on finalizing one codec.
It will cost, because Twitch re-encodes your stream for multiple resolutions (for clients that have limited bandwidth). So it's either new hardware for hardware-accelerated decoding/encoding, or much more CPU time for software encoding/decoding. But they could save some money on storage, cuz AV1 files are much smaller.
I am a member in the group for development of AV1.
The reason it has taken so long to adopt is because it was only in the last two years that the codec was stabilized to an acceptable enough degree for mass deployment.
By the end of 2025 most major platforms should have it fully or partially adopted.
Thanks for the insight!
Thanks for the work you're doing!
You guys rock!
You might not be able to answer, but I'm gonna ask anyway. How far off is it? Just a ballpark is enough; you can be as vague as you need to be. Just give us hope. We've been hearing about Twitch wanting to implement AV1 for years now, and at this point they might as well contribute to AV2's development or switch to VVC.
Could you maybe tell us why it's so expensive on the recording side, if we go to almost no quality loss in recording? It looks really freaking good, almost 1:1 from my end to the encoding, but damn is it storage intensive. For reference, I encode at CQP level 19 or 18, and 30 seconds to 1 minute can sometimes already be 1 GB of storage.
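For a rough sense of scale (illustrative arithmetic, not measured data): at CQP 18-19 the encoder spends however many bits it takes to hold that quality level, so busy gameplay drives the bitrate very high, and file size follows directly from bitrate and duration:

```python
# Back-of-the-envelope math for why near-lossless CQP recordings get so big.
# Values are illustrative; real bitrate varies with content and encoder.

def implied_bitrate_mbps(file_size_gb: float, duration_s: float) -> float:
    """Average bitrate (Mbps) implied by a file size and duration."""
    bits = file_size_gb * 8 * 1000**3  # decimal GB -> bits
    return bits / duration_s / 1e6

# A 1 GB clip that is only 60 seconds long:
print(f"{implied_bitrate_mbps(1.0, 60):.0f} Mbps")  # ~133 Mbps on average
# Compare with a typical 8 Mbps stream: roughly 17x the bitrate,
# and therefore roughly 17x the storage per minute.
```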
Another massive benefit that AV1 has over H.264 is the ability to encode HDR footage that is tone-mapped for SDR playback as well. Nvidia recently pushed an update for Shadowplay enabling AV1, and now I’m able to play games in HDR and still clip footage for my friends without it looking like a washed-out mess on their screens. When it gets added to Twitch and Discord allows it in their streaming pipeline, AV1 is absolutely the future.
Nice!
Oh that's actually awesome, I didn't know that! I've been waiting to try HDR content for so long but as I record most stuff I can't have it on
Meanwhile Discord removed the HDR stream experiment earlier this year which tonemapped HDR to SDR :/
Oh is that why PS5 HDR recordings look washed out on youtube (SDR)?
You can already use HLG with HEVC (H.265) at 4K 60 in OBS. Not sure why you'd use Shadowplay.
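For anyone stuck going the other way and needing to tone-map an already-recorded HDR clip down to SDR themselves, a commonly cited ffmpeg filter chain looks like the sketch below. It assumes an ffmpeg build with the zscale (zimg) and tonemap filters; the filenames and CRF value are placeholders.

```python
import subprocess

# Hedged sketch: tone-map an HDR recording to SDR so it doesn't look washed
# out on SDR screens. Assumes ffmpeg with zscale/tonemap; names are examples.
filter_chain = (
    "zscale=t=linear:npl=100,"              # to linear light (~100-nit target)
    "tonemap=hable,"                        # Hable curve; mobius/reinhard also exist
    "zscale=p=bt709:t=bt709:m=bt709:r=tv,"  # back to BT.709 SDR
    "format=yuv420p"
)
subprocess.run([
    "ffmpeg", "-i", "clip_hdr.mkv",
    "-vf", filter_chain,
    "-c:v", "libsvtav1", "-crf", "30",
    "-c:a", "copy",
    "clip_sdr.mkv",
], check=True)
```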
That thumbnail is insane
**that iconic Yoshikage Kira statement that involves the hands of the Gioconda**
No it's perfectly sane I promise
Chad Thor is not something I expected to see more than once
gigachad
I absolutely love that the professional paid editor for his Pants channel is keeping Thor's theme of meme thumbnails. ❤😂
I find it hilarious that the MPEG group dropped the ball so hard on HEVC that AV1 was able to slide in mostly unopposed.
h265 is good though
@@MaxIronsThird and so was VP9, except for everyone delivering video at a scale where 25% efficiency matters
@@MaxIronsThird it is, but just like H.264 you need to pay to use it
@@lunlunnnnn libx265 is free though
@@MaxIronsThird But the license is expensive, more expensive than H.264 even. On the other hand, AV1 is free.
AV1 is also recommended by Valve to improve Steam Deck compatibility for in-game video files.
Right. Because the Steam Deck's APU is RDNA2, and RDNA2 supports AV1 decode.
@@GSBarlev last I checked it does actually support it. Remind me tomorrow, I'll check again.
They actually also convert some assets to AV1 and store them on their servers. When an asset is encountered, it checks the servers; if it doesn't find it, it uses a placeholder and uploads the asset for conversion so it will be available in the future.
@@immortalaxolotl where's the info on that?
@@SupinePandora It's part of proton
Dude, I love staring at pixels in the vague shape of Warframe
Fooking fashion frame! lol
What gets me is that I'm getting back into the game after a 3-year+ hiatus, and the same day I decide to get back in, Thor uploads a short mentioning it.
@@potterfanz6780 synchronicities go hard eh
It seems optimistic to assume that Twitch will retain the current bitrate ceiling so we all get better quality, rather than saving money by reducing the bitrate to offer the same quality at less bandwidth. But hopefully that's how it will turn out, because oh boy does Twitch quality tank whenever there's a lot going on on screen.
The smart move would be splitting the difference. They get to slightly reduce their costs while slightly increasing stream quality, plus eliminate the royalty nonsense.
This is what Netflix is allegedly doing too, with 2/3rds of their library in AV1 format now. The biggest complaint about streaming services is the quality reduction from the compression, and AV1 would allow services to offer more quality - but they probably won't when you can just cut costs and free up space.
The biggest price driver is actually re-encoding, which is solved by an entirely different feature in OBS that was just added, called "Enhanced Broadcast", where the re-encoding is done locally on the streamer's PC. The drawback of this feature is that you are basically streaming the same stream 3-5 times over to Twitch, which is much harder on your upload speed. What AV1 does is allow basically anyone to use said feature, despite having worse internet, as the net bandwidth can be greatly reduced while staying at the same level of quality as H.264.
Twitch has had 4K-bitrate support in alpha/beta since spring. A higher bitrate should be available by end of year, maybe next spring.
google it.
That's the whole point of using something open source.
They don't need to lower the bitrate cap, they're already making a huge saving, and thus have improved margins.
Twitch are dumb, but they aren't that dumb.
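To put rough numbers on the Enhanced Broadcast trade-off mentioned a few comments up (the rendition ladder and the ~40% AV1 saving are assumptions for illustration, not Twitch's actual figures):

```python
# Upload cost of sending every rendition yourself, H.264 vs. AV1.
# Ladder bitrates and the 0.6x AV1 factor are illustrative assumptions.

ladder_h264_kbps = {"1080p60": 8000, "720p60": 4500, "480p30": 1500, "360p30": 800}

total_h264 = sum(ladder_h264_kbps.values())
total_av1 = sum(round(v * 0.6) for v in ladder_h264_kbps.values())

print(f"H.264 renditions: {total_h264 / 1000:.1f} Mbps upstream")  # ~14.8 Mbps
print(f"AV1 renditions:   {total_av1 / 1000:.1f} Mbps upstream")   # ~8.9 Mbps
```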
I got a $100 Arc GPU just for hardware-encoding AV1; I still use my 3080 Ti as my primary GPU. (Also, you can do CPU encode/decode for AV1, it just takes longer.)
Arc is really something more people should consider grabbing just for the AV1 support. And the (relatively) low price tag.
Me too! 6700 XT in my main rig and an A380 in my encoding machine. The nice part is that the Intel H.264 hardware encoders also aren't half bad, so I've still been able to use it quite a bit. Funny story: I got my A380 right after the launch in the west, so it wasn't supported in just about anything, and it was my first PC, because I'm an idiot and make my life harder for no reason. Took me a few weeks to get everything working properly (although using Linux also didn't make my life any easier).
I do the same, grabbed an A380 as a 2nd GPU (which my motherboard can handle), and my 3080 is for graphics.
Software encoding/CPU encode is usually better than hardware encode.
@@Dekustah Only issue is that for live streaming, trying to make your CPU do the video encode can often hamper performance on the rest of the system. If it was encoding video for later playback then yeah, I’d say do software encode, but hardware acceleration is probably best for livestreams.
TIL: AV1 is open source. This does put a smile on my face.
but GPU requirements.... I'm not gonna lie, that's a huge downfall. You won't watch it on a phone. And how about playing games and watching? :D
@@Youshisu and? New technology frequently requires new hardware support. New tech should never be held back until literally 100% of the world is fully compliant with it. Nobody here is pretending that this is going to perfectly upgrade the experience of every human on earth overnight. It's going to be gradually phased in as more and more people gain access to the necessary hardware. Mobile devices will eventually have it, too.
@@Endofnames what do you mean "and"? I just gave you reasons. You're gonna lose more than half of the viewership if they don't solve it xD
@@Youshisu what? Most modern mobile SoCs and computer GPUs support it (by modern I mean the last 2 generations) for decode.
@@LtdJorge that's great, although I'm not hyping anything before I see it work :D, don't preorder, don't trust politics. If this is gonna work on mobile, that's great. We will see :D
AV1 decoders are implemented in CPUs since:
Intel 11th generation and newer.
AMD Ryzen 6000 and newer.
I have an 8th generation Intel. Do I have to buy a new laptop to watch streams now?
@@dumant7975 well, no. Most likely stream service providers will transcode it for you, but the highest quality probably won't be available.
@@Jakubg6 even if they didn't, you can stream AV1, but the thing is it will be fully software decoded, so the CPU does all the work (inefficiently).
@@dumant7975 No. However, you will notice that if Twitch serves you an AV1 stream, your CPU usage might be higher than with H.264, because your CPU doesn't have optimisations for AV1.
@@dumant7975 if YouTube is fine now, it will be fine.
I knew that AV1 was a more efficient codec, but I didn't know that it was open source. That's awesome o: If I ever stream again then my 4090 will be outputting only the best quality for my 0 viewers c:
How efficient can it be if you need a 4000-series graphics card to encode and decode it? It kinda sounds like 8K: cool, but most people won't be able to use it for a long time.
@@crediblesalamander8056 The 3000 series can also decode it; it will still take a while for the majority of people to use it.
That one viewer deserves the best they can get!
@@crediblesalamander8056 you don't need 4090 processing power, the 40 series were just the first cards to support it
@@crediblesalamander8056 It's more about having built in coprocessors in the GPU to make it really easy to encode. AV1 on the whole is also comparatively new (compared to h264/265) so the hardware support has just started showing up in the last couple years.
In terms of media storage, I use AV1 10-bit. For the Blu-ray movies that I back up, I can take a 36 GB raw rip and bring it down to 2-3 GB with next to no quality loss, combined with 640 kbps Opus audio for surround (128 kbps per channel), while equivalent quality would be 4-5 GB in H.264 or 3-4 GB in H.265.
Downside is that AV1 decode is computationally more demanding than H.264/H.265 if the machine doing the decoding doesn't have the dedicated hardware, like my 4K Chromecast with Android TV dongle (yeah, I have a dumb TV), meaning I have to set up transcoding back down to H.264 on my TrueNAS Scale server running Jellyfin. Luckily, I grabbed a $50 Intel Arc GPU and it's a low-power video encoding BEAST.
I seriously hope that the new 'set-top' streaming box that Google is releasing soon has dedicated AV1 decoding hardware.
At such high bitrates you're likely getting no benefit from AV1. It is comparable to H.264 and substantially worse than H.265 at bitrates above ~15 Mbps. H.265 is equal to or better than AV1 at all bitrates; however, licensing it for commercial use is a nightmare.
@@rightwingsafetysquad9872 Judging from their words, I'd say they're actually worried about file size post-encoding, since they're saving backups from discs. They're describing a 20-40% file size reduction for storage. Not everyone is prioritizing stream quality here.
@@impishlyit9780 Yes, but you can do that with any codec. They're not benefiting from using AV1 as opposed to H.264/5.
You are always making a choice of either what gets me the best quality at my target bitrate, what gets me the lowest bitrate at my desired quality, or what is computationally the cheapest. The answer to the first two is always H.265. The answer to the third is always H.264. It is only in a commercial setting that would necessitate paying royalties that AV1 becomes more desirable than H.265. You never get any benefit from using AV1 as opposed to H.265 for personal use.
Furthermore, I'm asserting that neither AV1 nor H.265 was necessary for this compression, because above ~20 Mbps they all look basically the same anyway. His storage reduction did not depend on a new codec.
I don't think any of the devices involved actually have dedicated hardware for any codec, they have dedicated hardware for primitives (certain types of calculations).
This means if AV1 doesn't have some exotic demands on the calculations front, even if the hardware was made before AV1 was ready, a sort of driver update could still work.
@@autohmae Yes, they do. AMD cards have separate ASICs for the H.26x family and for AV1, and different ones for encoding and decoding. IIRC, the AV1 in AMD cards came from Xilinx IP. They were the ones that already had hardware implementations of the codec, both ASICs and blocks for their FPGAs.
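For readers who want to try a backup workflow like the one described above, a minimal ffmpeg sketch follows. It assumes an ffmpeg build with libsvtav1 and libopus; the filenames, preset, and CRF are placeholders to tune per title, and depending on the source's channel layout libopus may need an extra channel-mapping filter.

```python
import subprocess

# Minimal sketch of a 10-bit AV1 + Opus archival encode (assumed settings).
subprocess.run([
    "ffmpeg", "-i", "bluray_rip.mkv",
    "-c:v", "libsvtav1",
    "-preset", "6",              # slower preset = better compression
    "-crf", "24",                # quality target; lower = larger file
    "-pix_fmt", "yuv420p10le",   # 10-bit
    "-c:a", "libopus",
    "-b:a", "640k",              # ~128 kbps per channel for 5.1
    "backup_av1.mkv",
], check=True)
```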
The AMD RX 7xxx series can also do AV1 encoding. Allows you to do it more on a budget/with better game compatibility.
he did say that...
@@castleedits8700 must’ve misheard him then; I only heard him say the 4090 and Intel Arc cards
0:45
@@rightwingsafetysquad9872 yeah it was the fact that he didn’t say Rx or AMD that didn’t make it click for me
It's the same as when someone says "3000 series" or "4000 series"; you'd catch that the one they mean is an Nvidia graphics card, though.
We’re just gonna stand by and let him call it “four thousand ninety” and call it a “four thousand series card”?
1:02 You don't need a graphics card to encode AV1. SVT-AV1 at preset 11 is visually identical to what the current GPU generation can produce, and preset 10 already runs 4K60 on many CPUs.
2:35 As far as decoding goes, the software decoders are faster than realtime. That is the entire point behind AV1: complexity on the encoder side shouldn't be offloaded to the decoder side. Hardware decoders are just a tiny bit faster than software decoders, but save on power usage.
Okay, but do you want your CPU doing that while it is also running a game?
@@brettmurf if one is being a "serious" streamer, they probably would already get a CPU with many threads, since things like layers, effects, etc. would still be done by the CPU, and some hardware encoders still require some work from the CPU.
Also, NVENC would produce garbage frames or skip frames if the "GPU" part got overloaded; AMD's AMF and Intel QuickSync (and Arc), to my knowledge, never had this issue.
So it can sometimes get into a "pick your poison" situation.
I tried recording some Osu! gameplay with the SVT-AV1 encoder and it was really fine; my Ryzen 5 1600 had 4 of its 6 cores banged up, but I imagine a faster, bigger processor could do it without losing frames @30fps while playing something more demanding like Cities: Skylines 2 or smth
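For anyone wanting to try a CPU-only recording test like the Osu! experiment above, a rough sketch (assuming ffmpeg with libsvtav1; the input file and settings are placeholders) would be:

```python
import subprocess

# Fast-preset SVT-AV1 encode on the CPU; trades compression for speed.
subprocess.run([
    "ffmpeg", "-i", "gameplay_1080p60.mkv",
    "-c:v", "libsvtav1",
    "-preset", "10",     # high presets favor speed, as discussed above
    "-crf", "35",
    "-g", "120",         # keyframe every 2 s at 60 fps
    "-c:a", "copy",
    "recording_av1.mkv",
], check=True)
```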
It's the guy that killed StreamFX.
Those days that I lost can't be recovered.
Apparently using one of the low-end Intel Arc cards as a secondary GPU in your system is a price-efficient way to get access to AV1.
That's why I have a couple of Arc cards, an A310 and an A380, for use when I want to encode a great deal of stuff.
My media server always has the A310 in it (single slot) for transcoding media to different bitrates and resolutions, while the A380 gets put into a workstation whenever I'm doing meaningful video work on that system.
However, as of late, I've started just netcatting the video stream over to the media server to process the video and netcatting the stream back.
@@OhhCrapGuy The A380 is such a cool piece of tech; got one myself for my stream encoding machine. Was a pain in the rear to get working in those first few months, though.
I’m looking to build a new PC this upcoming winter, and the thing I’m having the most trouble wrapping my head around is PCIe bandwidth. Won’t getting a second card throttle the primary GPU? Or does that not matter for low-end Arc cards?
@@RiverM8rix If my knowledge is correct, on most Motherboards your second (and sometimes third) PCIe slots don't interact with the first one at all. Usually, the first PCIe slot is the only one with the full x16 interface connected directly to the CPU, and the rest are either x8, x4, or x1 to save on lanes and are connected via chipset. If your board supports PCIe gen 4 or newer, most graphics cards should function fine on even x8 slots, but if you're only using it for encoding x4 (and even x1) should work fine. Encoding uses way less data than normal gameplay/rendering, so it's pretty easy to put it in a slot with less bandwidth.
After doing some Googling, yea that isn't a problem. Using my mobo as an example (Asus x677 Prime-P AM5), I've got 28 PCIe 5.0 lanes supported by AM5. On my mobo (and all AM5 mobos if memory serves), 16 of those lanes are dedicated to the primary GPU slot. Another 8 are given to NVME drives. The last 4 are given to the chipset, which splits them across 2 more slots running PCIe 4.0 at x4 each, a PCIe 3.0 x1 slot, and an NVME slot running at PCIe 4.0 x4. For things running off the CPU directly (so my PCIe x16 slot and my first two NVME slots), bandwidth limiting will never be a problem since the CPU supports enough bandwidth for all of those slots. The chipset run slots theoretically could have bandwidth limiting, since it's splitting the lanes it has into more lanes. In practice, it's rather difficult to fully saturate those lanes since PCIe 5.0 supports 32 Gb/s transfer speed per lane, so I'd need to be using 128 Gb/s at the same time not counting my primary GPU and storage slots. An encoding card doesn't even get close to that under any normal workload, and even if it did it wouldn't touch my card on the x16 lane since that goes directly to the CPU and not through the chipset.
Tl;dr: On any modern platform, if you plug your GPU into your primary slot it doesn't matter what the other slots do; they can't touch it.
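As a quick check of that tl;dr, the arithmetic fits in a few lines (approximate per-lane rates, ignoring protocol overhead; an encode card normally receives compressed or at most raw frames, so even one lane has headroom):

```python
# Per-lane PCIe throughput vs. what feeding an encoder card actually needs.
pcie_lane_gbps = {"3.0": 8, "4.0": 16, "5.0": 32}  # approx. per lane

# Raw (uncompressed) 1080p60 4:2:0 8-bit video headed to the encoder:
width, height, fps = 1920, 1080, 60
raw_gbps = width * height * 1.5 * 8 * fps / 1e9     # ~1.5 Gbps

for gen, gbps in pcie_lane_gbps.items():
    print(f"PCIe {gen} x1: {gbps} Gbps ~= {gbps / raw_gbps:.0f}x a raw 1080p60 feed")
```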
I got a cheap Intel GPU about a year ago to encode YouTube streams. The quality at the same bitrate was noticeably better, and I don't get any interference with streaming on my gaming GPU. I used to have Intel driver issues with fan speed making it super loud, but it got patched at some point. 10/10 now.
Twitch has been saying it will support AV1 for years now, and they're just beating around the bush.
Reminds me of Spotify and lossless audio lol
@@Brownd55 One BIG difference: AV1 would save Twitch money, while lossless audio will absolutely cost Spotify money in increased bandwidth.
@@PeTe_FIN Ironically, they keep increasing subscription prices regardless across the globe and adding other new features absolutely nobody requested, like videos and audiobooks
0:01 I really wanted to make a joke about the "four thousand 90" because I've never heard someone call it that. Then I remembered Thor the goblin lord's speech is LAW and we all must abide by his lordly ways.
OK
it makes sense because he then references the whole 4000 series. you wouldn't call it the 40 series
@@equestrianrosie I call it 40 series and 4000 series, just only ever heard forty ninety, but to each their own ig
comment got stolen by a bot so liking and commenting for the algorithm
@@TeamBobbo6326 thanks bro
Thor, just wanted to reach out to you because I was super excited today and thought you should know, as the "OP" so to speak. A coworker asked me today about my input on mounting a sensor, and my first instinct was to pull up MS Paint, and I quickly drew up a rough sketch of what he had and what I thought he needed, and he not only fully understood, but was smiling seeing the idea be created right before him, as was myself. Thank you for this huge tip for presenting and teaching, and I am super excited to use it again. Love ya, bro ❤❤ keep doing what you do.
NGL, I've re-encoded my main Plex library to AV1 to save roughly half of the storage of the server, just need the web-apps to get native AV1 streaming and it'll be done and done for everybody on the server :)
Which clients currently support AV1?
@@oskar_roos IIRC, it's just the web apps (Chromium and Firefox) and some Smart TV devices, but I do need to check on the full details of it
@@oskar_roos iirc everything but the web app. And some smart TVs, but that's the TV's fault, not the app specifically.
I didn't even realize I could have been using AV1. That's pretty awesome to learn since I'm just starting to learn about video editing. Another awesome lesson from Thor
For decoding AV1, it shouldn't matter what graphics card you have; you don't need a 30xx card, you can decode in software, it just takes longer. You can buffer.
This needs to be said for the people in the back.
@@benjaminoechsli1941 Yes, but how is this gonna work on, let's say, mobile devices with practically 0 CPU?
@@heavygaming6596 mobile devices have powerful CPUs capable of software decoding AV1, but it's still unusable because software decode is power inefficient
@@обычныйчел-я3е Google just updated most modern Androids though to use libdav1d instead of the older AV1 software decoder, so it shouldn't really take that much power anymore.
But Twitch/YouTube/whatever won't give you an AV1 stream unless you specifically request it, since your machine doesn't report having the capability to hardware decode it. If you look in your Chrome/Firefox settings there's a toggle for AV1 as well; it's off by default if your machine doesn't have hardware decoding.
I have been waiting for AV1 for years.
they fired the guy working on it.... Twitch could have been ahead of the market.
And for this specific video, using some pulled-out-of-their-ahh data, YouTube decided I needed it in 480p
Narrator: It was in this moment that @muziatheotter realized he’d been using his data plan instead of Wi-Fi for three days straight
YouTube always defaults to 480p for me on my phone and iPad. My iPad doesn't even have a cellular modem.
I have my saved setting at 1080p on my laptop, 720p for the mobile YouTube app.
I've been waiting for AV1 for a while now, can't wait for it to be widely available. So cool to see people solving problems that we've all just accepted.
I had no idea NVENC supported AV1, I thought it was only H.264 and H.265.
This is useful for gameplay recording too, you can cut file sizes considerably on recordings while keeping identical quality.
Great educational video!
When you call them x264/x265 you're usually talking about the encoder in ffmpeg, not the codec. Those are H.264/H.265 (or HEVC)
@@acuteaura you are correct, will edit, thanks!
I work with ffmpeg so much that to me they've basically become synonymous hahah, but you're right
"I had no idea" yeah it shows dummy
@@-eMpTy- come on man, don't be so harsh on them.
Arc cards don't get enough credit for what they were for their time. They were moderately powerful cards, but their ability to encode AV1 was really good for the price.
I like how Thor just smacks all his Pants thumbnails with the Chad Thor face 😂
AV1 and Vulkan API are my happy place.
I have several N100 mini PCs for this reason.
They have native AV1 decode, so I can watch videos without the decreased quality/increased power use that comes from doing it in software.
Fantastic tech.
Better for wireless VR too. Looking forward to my next card for sure.
You can also have it on the iGPU inside your CPU if you don't want to change your graphics card.
The AMD CPUs of the 8000G series have AV1 encoding (but not the 7000 or 9000).
Not sure why it's not the case on desktop APUs, but I was quite happy to find the 7940HS has an AV1 encoder.
7900 XTX has AV1 too. It works great in OBS for capturing clips.
AMD has the worst encoder
@@nicolaspaglione for H.264, AV1 on AMD cards is fine
@@GameCyborgCh yeah, a fine 1082p AV1 encoding
@@nicolaspaglione spoken like a true fanboi. AMD has the BEST encoder for AV1; in fact it has major plaudits from anybody that uses AV1 for day-to-day work. It's just unfortunate that the NVENC encoder from Nvidia is pretty much mandatory for the time being, due to the software suite available from Nvidia.
He also said the 7000 cards.
As a full-time Warframe streamer, I CANNOT WAIT for AV1, to the point that the moment it's announced I'll be upgrading my GPU instantly to a 7800 or 4080 Super. It's going to make my life just so, so much easier and better, on a scale I can't even describe. No more multiple profiles for recording and streaming, no more worrying about whether it looks like a potato (running a 3070 atm), crispy clear video, being able to actually enjoy Warframe at its most beautiful. I just cannot wait anymore, I really really want it asap.
Beatsaber lightshows gonna go hard with this one!!
And for some reference, H.264 was first released in 2003; wild how it's still in use.
Heck yeah, didn't know my lil arc card was good at encoding
The amount of space it saves on video files alone is a huge boon. I’ve been waiting to see it adopted in many popular NLEs
This is why I prefer YT over Twitch but that is such a minor thing since I have already had AV1 for a long time
The fact you said it’s gonna be “Mint” makes me smile!! Proppa manc word ❤ haha love it!
I just love how the recent videos are starting to have a Giga Chad Thor in each and every thumbnail! I'm not complaining btw! 😂🤣
I totally didn't think to check the video decoding abilities of the phone I use daily until now. Thank you! Looking forward to all the crisp and clean video playback of the future!
Lots of technical stuff I'm not aware of.
Thanks for teaching!
I love this community because everyone is so insightful. I always learn something
I’ve been using AV1 for years now just using my CPU; you are not only limited to GPU encoding. That said, it is a very, very heavy algorithm and takes up the whole CPU, so most can't do it without sacrificing half of their cores or having an entirely separate PC for encoding. Software decoding is also heavy, but any device since 2017 should have enough power to play back AV1 via software with minimal hiccups.
Software decoding isn't heavy at all; as long as you have something decent enough to run Windows 10/11, it can decode AV1 video.
I did an informal experiment where I got a modern video player and an AV1-encoded video onto my old Core 2 Duo desktop, and it was able to decode 720p30 with some hitches, and this is *without* fastdecode, which is likely what most platforms would use.
AV1 is hard to encode, easy to decode
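One simple way to reproduce an informal decode test like the one above on your own machine, assuming an ffmpeg build with an AV1 decoder (dav1d in most modern builds) and a sample file of your own:

```python
import subprocess

# Decode-only benchmark: discard the output and watch the reported speed.
# A speed above 1x means the CPU software-decodes faster than realtime.
subprocess.run([
    "ffmpeg", "-benchmark",
    "-i", "sample_av1.mkv",   # placeholder input
    "-f", "null", "-",        # decode, then throw the frames away
], check=True)
```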
About decoding:
That is only for hardware decoding support; decoding AV1 isn't that expensive compared to encoding.
If your CPU is too old, you are out of luck, but pretty much all desktop CPUs and at least the more recent phone CPUs should work just fine. It most definitely will for 1080p for the vast majority, regardless of hardware support.
I got an A310 for AV1 transcoding on my media server
how's the speed for encoding a 15 sec 1080p60 video at reasonably low bitrates?
@@nliberty I haven't done any benchmarking but it's good enough for jellyfin
I’m going through the process of converting everything on my media server to AV1 but it’s definitely been a process. I’ve got literally thousands of hours of video ranging anywhere from 480p x264 videos to 4kHDR x265.
That alone would be a challenge but I don’t have new enough equipment to do hardware-accelerated AV1 encoding.
So naturally I’m just having it done by the CPU (which most people would argue leads to better compression anyways) but it’s all being done by a measly 12C/24T Xeon from 8 years ago.
The lower quality stuff that’s x264 encoded normally can be done in just a few minutes but I’ve got dozens of 4K movies that can easily take 7-8 hours to encode per movie.
Thankfully most stuff made in the past 5 years has hardware-accelerated AV1 decode, but until I decide to slap a new GPU in that server, that poor CPU is going to remain pegged at 100% usage for literally weeks.
If you have access to an A310, just get it.
If it's already encoded in H.265 you're just wasting time and energy; H.265 and AV1 have similar performance for larger files (especially 4K). Please don't waste your time, and don't go buy anything new.
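A batch pass like the library conversion described above could be sketched as follows. The paths are hypothetical, and it assumes ffmpeg/ffprobe on PATH with libsvtav1; per the reply above, it also skips HEVC files rather than re-encoding them.

```python
import subprocess
from pathlib import Path

LIBRARY = Path("/mnt/media/movies")  # hypothetical library root

def video_codec(path: Path) -> str:
    """Ask ffprobe for the first video stream's codec name."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", str(path)],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()

for source in sorted(LIBRARY.glob("**/*.mkv")):
    if video_codec(source) in ("av1", "hevc"):   # skip AV1 and HEVC sources
        continue
    target = source.with_suffix(".av1.mkv")
    subprocess.run(
        ["ffmpeg", "-n", "-i", str(source),      # -n: never overwrite
         "-c:v", "libsvtav1", "-preset", "6", "-crf", "28",
         "-c:a", "copy", str(target)],
        check=True)
```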
My machine is future-proof too, my UHD 730 supports AV1 too 😈😈
indeed, but only for viewing, not recording
For those of you who don't want to/can't shell out mega-moolah for an Nvidia card, AMD's Radeon 7000 and newer cards have this encoder as well, as do Intel's Arc cards. AV1 _decoding_ (aka the viewer's job) is supported by Nvidia 3000 and newer, Radeon 6000 and newer, and Arc cards.
An important note: this is a _hardware_ encoder/decoder, as in a physical chip that's really good at doing this one task really fast. If you have an older card, it will simply run the encoding/decoding in _software,_ which works fine, but it'll be slower (leading to buffering).
Can we please also just kill HDMI and make DisplayPort the industry standard for video output? Thanks.
Until they start making TVs with DisplayPort, this can't happen.
@@TheMasterZoran yeah, I know :c
@@thethingthatshouldnotbe3035 please yes
The 7900xt also supports encode/decode for AV1.
Let’s be real Thor, Twitch will pocket the difference and still charge you for 4K.😂
Big fan of your content, mentorship and the funny stuff too, thank you for sharing.
We need JPEG XL also, again because it's essentially licence-free, and it has better abilities with photos, like parallel decoding.
0:55 I'm pretty sure Discord removed support for AV1, because I haven't seen it anywhere in the settings for months.
It absolutely infuriates me there's no real stream settings in discord. Half the time it looks bad and there's no way for me to make it better. And the other half the time the stream lags to crap and gives me no feedback as to why.
It still uses AV1 if you stream from a system with a 4000-series Nvidia GPU (can't tell for other brands' cards) to a system with a 4000- or 3000-series Nvidia GPU. If you enable developer mode you can look up the codecs that are used for the stream.
@@FreshChriss I enabled the developer option and I don't think I've *ever* seen an option to see what codec your stream uses
Oh yeah, you could just use your CPU for encoding and decoding AV1
@@aksGJOANUIFIFJiufjJU21 It's somewhere on the bottom left (like in the connection / ping tab, or somewhere else) and it opens a new page where you can see the current inbound and outbound audio and video streams. You must be in a call to access it, of course. Hope this helps
@@FreshChriss I see it
You're right, it says my codec is AV1 (101), thank you very much
Vulkan encoding and decoding of H.264, H.265 and AV1 is coming. That means multi-GPU and multi-OS support for all those codecs, so Linux also gets Windows-level codec support for the last percentage of gamers who don't want to make the switch yet.
Wait a minute, isn't H.264 open source but less efficient, with H.265 encoding being what costs money...?
What you're thinking of are the encoders x264 and x265. Both H.264 and H.265 are covered by patents, which means services pay for their use. Most of the H.264 patents have expired, so it's mostly free now I think, but H.265 is a problem for services to use.
And half of the reason we don't use H.265 for most applications, despite many video recorders being able to record it by default, is that H.265 was a licensing nightmare compared to H.264. You'd have to convert an H.265 file to H.264 in order to use it in most editing programs like Premiere Pro. AV1 fixes that issue in addition to being more efficient.
This reminds me to check and see if the newest AMD cards have this coding.
The Radeon RX 7000 cards do support av1
And intel arc too, if anyone cares....
RDNA3 cards support it, but not at FHD: they have an unfixable hardware bug that makes 1080p encodes impossible, so you can only do the closest size, 1082p.
AFAIK, AMD even had it first.
@@rzr82 Intel was first, followed by Nvidia and finally AMD.
AV1 is CRAZY. I re-encoded my H.264 videos to AV1 with the same settings, and it cut the size by 70% (1 GB -> 300ish MB) with no noticeable difference in quality.
But Twitch doesn't support it yet so.......
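If you want to put a number on "no noticeable difference", ffmpeg can score the re-encode against the original with VMAF. A sketch, assuming an ffmpeg build with libvmaf; the filenames are placeholders:

import subprocess

# Compare the AV1 re-encode (distorted, first input) against the H.264
# original (reference, second input); VMAF prints a 0-100 score to the log.
subprocess.run([
    "ffmpeg",
    "-i", "reencode_av1.mkv",
    "-i", "original_h264.mp4",
    "-lavfi", "libvmaf",
    "-f", "null", "-",
], check=True)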
software decoding for av1 is already good enough, you can play the videos on your phone
We're also getting there on mobile, starting around 2020.
currently most major SoCs from 2022 and newer support hardware AV1 decoding
Amazing! Everything I've watched by you is Next Level Profound. What a breath of Fresh Air. Thank you.
2:15 You can software encode and decode. Hardware is obviously faster/more efficient, but for the average user it's not noticeably more demanding.
Proprietary protocols are evil. I had no idea H.264 was proprietary. This is the exact reason I use OGG instead of MP3!
I used to swear by OGG (Vorbis), but if you're still using OGG, I recommend the newer, better open lossy audio codec, Opus! Opus is also amazing quality, beating every other standard codec at any given bitrate until transparency.
FYI, MP3 is either mostly or entirely patent-free now that the patents have expired.
For quality Opus would be better to use, but certain players do not support it
x264 is open source
It's hard for Twitch to support AV1 because the majority of streamers don't have the hardware to encode AV1 smoothly; likewise, there are many viewers who can't decode AV1 smoothly on whatever hardware they're running. That's why we're running enhanced broadcast over eRTMP, so that people with advanced hardware can supply both AV1 and AVC renditions. Twitch can't supply this on the backend because we don't have a cost-effective solution for transcoding AV1; it's ridiculously hardware-intensive compared to transcoding AVC (which we already supply to streamers). If you aren't part of the enhanced broadcast beta, you can try it!
That's not an excuse, because:
1. Allowing people to stream in AV1 doesn't force everyone to do so.
2. If you're using hardware encoders, it would be trivial for streamers to transcode to lower resolutions (or maybe even other codecs, but I'm not sure).
3. Twitch could transcode AV1 to H.264 just like YouTube does (at least for VODs); see the sketch after this list.
4. Outside of power consumption, software decoding of AV1 is quite fast on any decent CPU.
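On point 3, that kind of rendition ladder is essentially a one-liner: decode the AV1 ingest once and fan it out to multiple H.264 outputs. A rough sketch; the filenames, resolutions and bitrates are illustrative, not what any platform actually uses:

import subprocess

# One AV1 input, two H.264 outputs at different resolutions, the shape of a
# typical transcode ladder.
subprocess.run([
    "ffmpeg", "-i", "ingest_av1.mkv",
    "-c:v", "libx264", "-vf", "scale=-2:720", "-b:v", "3M", "-c:a", "aac", "720p.mp4",
    "-c:v", "libx264", "-vf", "scale=-2:480", "-b:v", "1M", "-c:a", "aac", "480p.mp4",
], check=True)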
I feel like you are the real tech tips man, more than Linus.
Linus has been talking about AV1 for years now, though. It's not exactly new, adoption is just a little slow.
@@thethingthatshouldnotbe3035 I'm talking about how much he puts out in the form of shorts and videos
You can also encode using your CPU, which often gives higher quality than the GPU-encoded variants. It requires a lot of CPU power, but if you already have a streaming PC this likely won't be news to you. Higher presets in SVT-AV1 yield worse quality but faster encode times, so it's a balancing act. In my testing, though, you can match H.265's encoding time while beating it in quality.
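If you want to see that balancing act for yourself, here's a quick-and-dirty timing sketch across two SVT-AV1 presets; the clip name, preset numbers and CRF are arbitrary examples:

import subprocess, time

# Encode the same clip at a slow and a fast preset and compare wall time.
for preset in ("4", "10"):   # lower preset = slower, better compression
    t0 = time.time()
    subprocess.run([
        "ffmpeg", "-y", "-i", "sample.mkv",   # placeholder clip
        "-c:v", "libsvtav1", "-preset", preset, "-crf", "30",
        "-an",                                # skip audio for a clean timing
        f"out_preset{preset}.mkv",
    ], check=True)
    print(f"preset {preset}: {time.time() - t0:.1f}s")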
Imo 4k is overkill
I just use 1440p
Especially with a regular sized monitor, I have a 28" 4k monitor and I can't see the difference between 4k and 1440p even with my face smooshed up against the screen.
Same and agreed
1440p windowed on a 4k screen is the way to consume all media.
B-)
I cant wait for AV1 encoding to finally come out for twitch. Streams are gonna be looking CRISPY ASF
I mean, if that's all you wanted the GPU for, why get a 4090 when you could get a cheap Intel ARC?
Not only will Twitch not pay more; on the contrary, it will save tons of bandwidth for the same quality.
"Giga-Thor isn't real he cant hurt you"
*stares at thumbnail*
Since H.264 is from the dinosaur age, it's largely not paid anymore; most of the patents have expired.
Dude AV1 is one of the best video advancements we’ve made in a long time and yet we still use H.264. It drives me wild that AV1 hardware is lagging so far behind.
I think it's because AV1 is something new, they want to do it right and thus are taking their time to get it right.
AV1 can be decoded in software by practically any modern desktop CPU, so it's pretty nice as long as companies implement it.
Yep. I use Handbrake to encode some of my own ripped DVDs for putting on my phone or backing up. Handbrake supports the Nvidia AV1 hardware encoder, and I've never used anything faster. 1 Mbit video (DVD resolution), Opus audio, which is close to transparent at 128 kbit for stereo, and you're off to the races. Netflix has used this stuff for years already, as it was also one of the companies that supported the development. All free codecs for both video and audio.
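For reference, roughly the same pipeline expressed as an ffmpeg call instead of Handbrake. This assumes an RTX 40-series card and an ffmpeg build with av1_nvenc; the filenames are placeholders and the bitrates just mirror the numbers above:

import subprocess

# DVD rip -> ~1 Mbit AV1 video on the Nvidia hardware encoder + 128 kbit Opus.
subprocess.run([
    "ffmpeg", "-i", "dvd_rip.mkv",        # placeholder source
    "-c:v", "av1_nvenc", "-b:v", "1M",    # hardware AV1 encode
    "-c:a", "libopus", "-b:a", "128k",    # near-transparent stereo audio
    "phone_copy.mkv",
], check=True)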
I can't believe how amazing AV1 is.
Probably why it has taken longer to get out there...
Just for the record, the reason that 4k is such a big damn deal is that a lot of people are consuming things like games, streams and YouTube on their big old 4k living room screens. Even 1440 looks crappy blown up that big, so it's gotta be 4k. If you mostly look at little screens, like phones and PC screens, or you only view traditional TV on the big TV and streaming on your phone, then it seems like a pointless gamer obsession with more pixels for the sake of pixels. It's not. It's about that 50-inch TV. I didn't really get this for a while, so somebody else needed to hear it.
Definitely thinking of upgrading from an RX 6600 to an Arc A770, because I'm pretty sure AMD graphics cards do not play nice with Overwatch. Lots of dropped frames, and I want AV1 encoding.
Not sure how nicely Arc will play with DX11, so make sure you do your research first.
luv his cute lil hand doing stuffs while he was talking
Thank you @Pirate Software. We've been working on converting videos TO H.264 and now I think maybe we're falling way behind. A new mission has started 🤓
I'm shocked Twitch is stuck on h.264. It was amazing a decade ago but... That was a decade ago. Honestly, it's why I bought a 7900XTX. Absolute dream with the AV1 encoder. Hard drive savings are amazing compared to h.264 and are genuinely impressive compared to h.265.
This is why I was so gungho about getting a 7000 series AMD card.
I've been very excited about AV1 for awhile now.
The power of this thumbnail's jaw structure and chin is nearly on par with Ubisoft.
They are also experimenting with having streamers transcode to the different resolutions themselves, instead of that being done on the Twitch servers, which would save them even more money. It's already implemented in the new enhanced broadcasting thing they have.
As a fan of Warframe, thank you for mentioning it.
Not to mention that within the AV1 spec it can do 12-bit 4:4:4 lossless encoding/decoding, so it will be a great format for intermediates within a big video/cinema production. It's just missing an extra alpha channel tho
avif can do transparencies
@@Mine18x yeah I know, that's curious
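On the intermediates point: libaom exposes a lossless mode, so a sketch like this should produce a mathematically lossless 12-bit 4:4:4 AV1 file. It assumes a newer ffmpeg that exposes -aom-params and a libaom built with 12-bit support; it's very slow, and the filenames are placeholders:

import subprocess

# Lossless 12-bit 4:4:4 AV1 intermediate via libaom's lossless mode.
subprocess.run([
    "ffmpeg", "-i", "master.mov",     # placeholder mezzanine file
    "-c:v", "libaom-av1",
    "-aom-params", "lossless=1",      # mathematically lossless
    "-pix_fmt", "yuv444p12le",        # 12-bit 4:4:4
    "intermediate_av1.mkv",
], check=True)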
The future is 4K 60fps. Great for the stream, because why not future-proof it, but I can already see bunches of people trying to watch that on tiny little screens that can barely display 1080p, much less 2160p.
Free and open source is the future and it needs to be supported.
4k 60fps becoming the standard is actually insane
I love when I learn new things, thanks Thor.
The other saving is that they can push the same res at a lower data cost
The main reason you need newer GPUs for now is that they have the hardware encoders, which are vastly more efficient; it's the same deal as NVENC for H.264 on Nvidia cards.
Over time more hardware will natively support AV1 encoding, and the software support is a given, as the codec is open source.
This is fantastic news! I honestly have no idea how a terrible proprietary standard that requires a license (H.264) became so widely adopted in this day and age. Open source is always better, makes everything more accessible to everyone, and is cheaper for companies to use. The fact that it's better too makes it a no-brainer. Out with H.264 and in with AV1!
H.264 is more supported because it's way older (it has had more time), has much more hardware support, and even where it doesn't (wth are you using?) software decode is practically free. That's not to say AV1 shouldn't be the new default, but there's a reason H.264 is still a thing to this day.
I didn't know H.264 was paid, holy shit, that explains so much.
This is like Opus all over again: a royalty-free alternative that is so goddamn efficient you can scale it down or up depending on what you want, and it's better than MP3 at a fraction of the bitrate.
I hate how they did JXL so dirty instead of embracing it. Apart from that I'm all for AV1, and I can't wait for its successor, when we'll get PVQ (pyramid vector quantization). It was dropped from AV1 because it would've required more work when designing the chips that do AV1 in hardware, since it's new.
Which is funny, considering PVQ is great for hardware adaptation iirc. But nothing existed yet, and it would've pushed the time to shelf out further.
I like how Mozilla/Xiph, Cisco and Google merged their ideas while working on their own video codecs and then started working together on finalizing a single codec.
Intel Arc is the best for AV1 encoding and is much cheaper than a 4090. You can use it in conjunction with your discrete GPU for gaming.
It will cost them, because Twitch re-encodes your stream into multiple resolutions (for clients that have limited bandwidth). So it's either new hardware for hardware-accelerated decoding/encoding, or much more CPU time for software encoding/decoding. But they could save some money on storage, since AV1 files are much smaller.
If you stream with hardware encoders, it's possible to do the transcoding on the fly.
AMD 7000 GPUs also encode it. My 7900xt does it.