I just shot a commercial video with All-I and a Long GOP backup, and I thought I was losing my mind when the Long GOP was sharper and looked WAY better than the All-I. Thank you for this review.
I appreciate your anality, good sir. Keep up the solid work!
Some people judge everything based on paper specs and theory rather than real footage and experience, but you crushed them with this, haha.
It's shocking that Long GOP retains that much detail and sometimes looks better. I would definitely choose Long GOP for the smaller storage burden.
Excellent. I was looking for someone to confirm my suspicion that Long GOP is better. I did a test with rain and trees in the wind and Long GOP was still better on both my FX6 and FX30. Significantly and obviously better in every way. Less grain smearing, less color breakdown, less artifacting and higher sharpness. I was really surprised and thought I was going crazy, but I'm glad you confirmed my results. From now on I'm only filming L.
But isn't XAVC-L on the FX6 only 8-bit in 4K?
@qiyuxuan9437 Perhaps you've confused this: XAVC-I is 10-bit, XAVC-L is 8-bit, and on the FX3 the Long GOP codec is 10-bit. In my previous tests, the difference between 8-bit and 10-bit was surprisingly minimal anyway.
@@deadendboredom That was a lowercase L 😂. Yeah, the difference may be minimal without grading, but 8-bit S-Log3 falls apart very quickly with grading. I wish the FX6 could add a 10-bit Long GOP option in 4K.
@@qiyuxuan9437 It seems I may have been a bit confused; distinguishing between a capital "I" and a lowercase "L" can be quite tricky at times ;)
Have you directly compared 10-bit and 8-bit video with the FX6? I'm quite curious to know in which situations you find the 8-bit codec breaking down quickly.
Nice videos. I know how time-consuming it is to shoot these comparison tests, so huge respect for going into so much detail, especially doing a second video!
I've come to similar conclusions from using the different codecs and now just use Long GOP, as the playback is OK on my machine.
BRAW is really a game changer of a codec. Higher-bitrate ProRes is fine; however, the file sizes are frustrating.
We can all dream of internal BRAW on a Sony camera one day 😂
And I stand by what I said: manufacturers have to balance cost, power, usability, and performance, and these factors depend on the laws of physics and current technology levels. This is the way the world of design and manufacturing works.
Plus, bitrate isn't everything. Of course it's important, but Sony use some of the more complex H.264 encoding techniques in XAVC that most others don't, techniques that are standard in H.265. In the XAVC encoder chip there is a precoding pass in which each frame is analysed and the optimum data distribution across the frame is calculated (PPS). Then, based on the PPS, each frame is dynamically encoded via a second-pass main encode. PPS metadata is embedded in every frame and SPS metadata added to every clip so the decoder knows how each frame was encoded and how the clip is optimised. Normally, H.264 files only have SPS, as PPS adds a lot of extra encoding complexity that is difficult for a software-based encoder. As a result of PPS, XAVC at 300 Mb/s performs similarly to most other H.264 codecs at around 400 Mb/s. This is also why you can often find a wide range of variation in XAVC decoders and in decoded image quality, as many generic software decoders don't correctly read or use the PPS data.
Why is XAVC-I based around 30fps/300Mb/s? Originally there was to be a 12-bit 4:4:4 version using the same per-frame compression. At 60fps this would have been 960 Mb/s, and with audio just a touch under 1 Gb/s, which was considered the practical limit for solid-state media at the time the standard was finalised. Additionally, at the time, 100 Mb/s was considered optimum for high-quality 10-bit interlaced HD broadcast acquisition, and while 4K files are four times larger, there is a 25-30% efficiency gain when encoding progressive frames over interlaced frames, so 300 Mb/s was chosen as optimum.
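The SPS/PPS claim above is something readers can probe themselves. Below is a minimal sketch that counts NAL unit types in an H.264 Annex B elementary stream; in standard H.264, SPS is NAL type 7 and PPS is NAL type 8, so counting them shows how a given file distributes its parameter sets. It assumes you have first extracted the raw video stream, for example with something like `ffmpeg -i clip.mxf -map 0:v -c copy clip.h264` (adding `-bsf:v h264_mp4toannexb` if needed); the file names are placeholders.

```python
import sys
from collections import Counter

# Count H.264 NAL unit types in an Annex B elementary stream.
# NAL type 7 = SPS, 8 = PPS, 5 = IDR slice, 1 = non-IDR slice.
data = open(sys.argv[1], "rb").read()
counts = Counter()

i = 0
while i < len(data) - 3:
    # Every NAL unit is preceded by a 00 00 01 start code
    # (4-byte 00 00 00 01 start codes end with the same 3 bytes).
    if data[i] == 0 and data[i + 1] == 0 and data[i + 2] == 1:
        nal_type = data[i + 3] & 0x1F  # low 5 bits of the NAL header byte
        counts[nal_type] += 1
        i += 4
    else:
        i += 1

for t, name in [(7, "SPS"), (8, "PPS"), (5, "IDR slice"), (1, "non-IDR slice")]:
    print(f"{name} (type {t}): {counts[t]}")
```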
Can you elaborate more on different software producing different quality levels based on how it decodes XAVC? What's the best decoding method? How does Final Cut decode XAVC vs. DaVinci vs. Premiere vs. Avid? Or should the files be batch-processed through different software?
Have you made a video that relates to this? I've always seen you as THE guy when it comes to understanding Sony video cameras, and just this technical breakdown in the form of a wee YouTube comment is amazingly informative. Kudos of course to the creator of this video as well, as it is well done, too.
There's a reason that the BEST audio and visual equipment is developed in Japan and GERMANY. Unfortunately, Americans prefer to be comfortable where they are instead of constantly challenging themselves to achieve something better. This is bold to say and may offend some, but it needs to be said nonetheless. It's time to stop being frustrated about Sony's lack of answers to these very natural questions and to start putting out more critical material like THIS that will get their attention. I applaud you for doing this follow-up video and I look forward to watching your other content!
Thank you very much for all the work you have done; it is an incredible analysis that resolves all possible doubts.
Any update on this problem, or did Sony never address it, even in 2025?
An incredible video! Thank you for your hard work on this. Deserves far more views than it has!!
For some of us, knowing that Long GOP is as good or better is actually nice to hear: it's the format I prefer in the field since it maximizes my card capacity... I worried that I was compromising image quality. I would prefer transcoding for editing if need be (though I find the M1 Macs actually handle Long GOP very well in Resolve...). That's no reason Sony can't improve bitrates on the I formats for those who prefer them, of course! H.265 is obviously very well crafted in terms of image quality.
Very nicely summarized and "in a nutshell", thank you for your comment!
I have ALWAYS believed that Sony's 30p All-I at only 300 Mb/s was the absolute BARE minimum for 10-bit 4:2:2 intraframe recording. Over the years, Sony have been VERY stubborn about holding it there no matter what new camera they make. I just don't get it. Did their hardware encoders cap out at this 30p bit rate? Why in the hell can't they raise that 300 Mb/s to 400 Mb/s... at least in 30p? C'mon, Sony... this looks shockingly bad. Do what Panasonic does!
Fantastic, thanks for your effort.
Great video. I've done some testing of this on my FX6. I either shoot in S-Log3 in CineEI or with a baked-in Base Look (Phantom LUT) in Custom mode, depending on the job. Results? XAVC-I looks better in CineEI, simply because the 8-bit Long GOP falls apart more easily on neutral gradients such as a white wall or blue sky. However, applying the LUT in-camera when shooting in Custom, I can barely see a difference. Both codecs look good at 800 ISO, but as you raise the ISO, XAVC-L holds up much better noise-wise on a static shot. This is important because if I'm filming a long interview, for example, the savings in data are huge and the image quality is arguably better.
Less noise could also be a sign of loss of detail due to stronger in-camera noise reduction.
@7:32 pretty much shows that All-I is actually better when there's a mixed-lighting scenario. Though the lighting is quite intense and not what most would normally need, I feel this is most representative of videographers' work in music videos... but also in many scenes with pronounced shadows. Where there's lots of contrast, the shadows end up with pretty big macroblocks in Long GOP.
You are absolutely right that All-I works better with crazy light. Anyone recording comparable scenes should select the codec accordingly. All-I is at a disadvantage in other scenarios (and even there, most likely in the dark areas).
What I think we can say in any case is that external recording in ProRes or RAW is the best recording option, with file size as the only drawback.
Let's hope that Sony will give us a little more bitrate in the future.
Just came across this, great tests. I reviewed some old FX9 All-I clips and did see similar macroblocking issues like you showed, but only in some parts of the frame. Oddly, however, it went away in Resolve. Catalyst Browse shows the blocks much more clearly, but Resolve barely shows them at all.
I am happy with how well Long GOP looks compared to All-I.
Let's hope Sony will update the firmware instead of making a new camera to solve the problems 😅
Very interesting! After purchasing my FX6, the first thing I did was connect my BMVA and shoot a quick test, recording 4K ProRes 422 HQ and S-Log3 All-I. Well, no difference at all (even with heavy grading) at 300% pixel-peeping in Resolve. So now I use my BM as a monitor only.
One question: did you use noise reduction in the FX6? I shoot my tests with NR turned off.
Now I want to make more tests with Long GOP. Currently I use Long GOP for interviews (usually HD 10-bit Long GOP) and Cinetone for quick run-and-gun work (4K 8-bit Long GOP). For all serious commercial work, only All-I, and, to be honest, I've never had any quality issues with my clients (including broadcast TV).
By the way, the Sony Venice has All-I Class 400 (380 Mb/s at 24p), but at 6K, so it is approximately equal to the FX6/9. And it was OK for the Top Gun: Maverick cockpit scenes with crazy motion, screened in IMAX. So maybe it is fine for our work too? ;)
Yes, as I mentioned, in most cases and situations, the internal recording is perfectly usable - not everyone needs maximum quality and details for green screen, etc. If I recall correctly, the noise reduction was turned off.
Good detailed examples. The ultimate real-world test is showing your edited version at a banquet hall on a 75-inch TV. I've never seen any YouTuber do this. Yes, your video is very detailed, but the real-world anxiety of an editor presenting their stuff on a big screen is the biggest test of all.
Interesting topic.
But I have the feeling that before we talk about it, we have to distinguish between two things: the objectively measurable and the subjectively felt.
It is objectively measurable that the quality of this camera with this codec is not as good as it could be.
Whether this is visible to a viewer on a 75-inch screen depends on many factors. On the objective side would be the quality of the screen and the ambient light; on the more subjective side would be the experience and expectations of the viewer, as well as, for example, the plot of the film.
All these factors consciously or unconsciously influence perception. In addition, a majority of viewers cannot attribute a feeling to a cause and thus may not recognize mistakes or problems, but they feel them subconsciously - and this also applies to me from time to time.
What you suggest (showing the end product in a high-quality but realistic environment) is ultimately the relevant test, but also the one that is difficult to measure objectively.
I would be interested to know whether you have an idea for how to carry out such a "test" in a way that excludes as many influences as possible that could falsify the result.
Otherwise, this would just be a video in which many people give their opinions but nothing is gained in the end.
A little riddle:
Do an above-average number of films shot on an ARRI Alexa look good because of the expensive camera, or do films with good ideas have large enough budgets to hire capable cinematographers as well as the most expensive camera?
So it's true? Only 8-bit 4:2:0 for Long GOP on the FX6?
Yes, it is. But luckily it's good in terms of quality. So if you didn't know that it's only 8-bit 4:2:0, you might not notice it in a comparison.
@@deadendboredom Yes, but still… damn, c'mon Sony. Even my a7S III shoots 10-bit 4:2:2 with a decent codec.
I've been saying this since the a7siii/fx3 came out, thank you for demonstrating it perfectly!
Just one question: at 24p, 100M, 10-bit, XAVC HS and XAVC S give me the same amount of record time. Since XAVC HS is more compressed, I can assume that the record time is the same but the quality is slightly better in XAVC HS when both are set to 100M.
So, does this mean XAVC HS 24p at 50M is equivalent in image quality to XAVC S at 100M (both at the same frame rate and both 10-bit 4:2:2)?
The question is not so easy to answer.
I have only done a few short tests with XAVC HS so far and saw that the 50M variant looks a little softer compared to XAVC S 100M and HS 100M.
With HS 50M, the compression seems to remove noise in particular.
Under the magnifying glass, HS 50M looks worse to me than S 100M.
Why this is, even though the recording duration relative to the better compression suggests otherwise, I can't tell you.
What I can tell you: in my tests, the image quality was not the same.
Can you make a comparison between external ProRes and internal Long GOP?
@@arjuna207 Are you interested in this from a particular perspective? A full run-through with every flavor of codec (LT, 422, HQ) across various situations would be quite involved. When the BRAW update comes out, I’ll probably do a more comprehensive comparison.
Currently, ProRes in LT, and to some extent in 422, shows a pixel-blocking issue similar to All-I, though it's much less pronounced. But, of course, ProRes handles fast motion better overall. Given the large file sizes of HQ, I've typically found RAW to be more worthwhile. Once BRAW is available, it'll likely outperform ProRes in every area except compatibility.
@@deadendboredom I'm disappointed that ProRes is only marginally better than All-I in regard to the blocks. In some places in your video the mesh looked horrendous. I shoot mostly 4K XAVC S, but sometimes I need to crop, and I'm a little disappointed with the quality. I'm also curious about BRAW; I'm a DaVinci user, so that will be ideal.
Very interested in part 3 of this series, with All-I vs. x264 vs. x265. At the same 100 Mbit/s max bitrate, XAVC HS should (in theory) withstand noticeably more abuse before it breaks.
Interesting: in PAL mode, 25fps Long GOP x264 on the FX3 is 140 Mbit/s, but in NTSC mode, 24fps Long GOP x264 and x265 are 100 Mbit/s. So the best bitrate per frame is 5.6 Mbit/frame for x264 and 4.2 Mbit/frame for x265, a 1.33x difference in bitrate, but that is still better than the 1.5x+ average bitrate difference researchers have calculated for the same perceptual image quality. Overall, 24fps 100 Mbit/s x265 should be perceptually 12%+ better than 25fps 140 Mbit/s x264 from the FX3.
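For anyone who wants to retrace that arithmetic, here is a minimal sketch; the 1.5x H.264-vs-H.265 efficiency factor is the figure quoted in the comment above, not something measured here:

```python
# Per-frame bit budgets for the FX3 Long GOP modes discussed above.
x264_per_frame = 140 / 25          # PAL H.264: 5.60 Mbit/frame
x265_per_frame = 100 / 24          # NTSC H.265: ~4.17 Mbit/frame
bit_ratio = x264_per_frame / x265_per_frame   # x264 gets ~1.34x the bits

# Assumed efficiency factor: H.264 needs ~1.5x the bitrate of H.265
# for the same perceptual quality (taken from the comment above).
hevc_factor = 1.5

net_advantage = hevc_factor / bit_ratio        # ~1.12 -> roughly 12% better
print(f"x264 bits/frame: {x264_per_frame:.2f} Mbit")
print(f"x265 bits/frame: {x265_per_frame:.2f} Mbit")
print(f"net x265 quality advantage: {(net_advantage - 1) * 100:.0f}%")
```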
@@igorzhidkov1957 Looks like you’re already halfway through making the third part
@@deadendboredom I can help with the math, but I'm too lazy to go outside and shoot some nature ;)
@8:13 "with green screen and keying this is a completely different topic" - Do you mind clarifying? It looks like in the image you shared that LGOP has clearer edges which is strange because everything I've read says All-I is better for keying ?
As you probably saw in Part 1 from 1:33, I observed that, purely qualitatively, All-I has the worst edges. When someone says that a codec is better, the question is always: in what way? Is it about performance, about quality, or about the workflow… At the end of the day, "All-I is better" is a very sweeping generalization - just as the statement "you can walk through doors" is only correct as long as the door is large enough.
I would also always prefer an All-I codec for VFX, as it has complete single frames. But if the bitrate is comparatively low, as is the case here with Sony, then this is suboptimal.
The best quality in this case still seems to be external recording in ProRes HQ (which is also an All-I codec) or RAW, or, in a pinch, Long GOP converted to ProRes - when it comes purely to details and optical quality.
It will still take some time, but a more complex VFX shoot is currently planned for the next few months, in which I will use several cameras for comparison. After that, I can hopefully make even more precise statements and show interesting examples.
So I know this is an old video, and I'm not really a Sony guy, but I've downloaded the files, and the XAVC-I is 10-bit while the XAVC-L is 8-bit. Can you shoot in XAVC-L 10-bit?
@@vladlapadatescu You're absolutely right. At least from the FX6, the Long GOP files are limited to 8-bit, with no 10-bit option. The FX3 can shoot 10-bit Long GOP.
Are you able to edit 10-bit 4:2:2 files in the DaVinci Resolve free version, or do we need the Studio version for that?
Thanks.
@@sounakbanerjee I believe that 10-bit is generally possible in the free version. In the past, some formats like MXF were not supported, but that was, and is, not dependent on 10-bit or 4:2:2; it depends on specific codecs and containers. That might have changed by now… I haven't used the free version for years. Otherwise, you can convert your material to ProRes 10-bit 4:2:2 and edit it without any problems in the free version (if I remember correctly).
@@deadendboredom Thanks. I tried opening an FX3 file, XAVC-I format, 10 bit 422. It only shows the audio. I suppose the free version doesn't support this.
@@sounakbanerjee Have you tried converting your material to ProRes?
(For example, using Shutter Encoder?)
@deadendboredom No, actually I've never done it, due to my lack of know-how. I always feared that conversions would degrade quality.
@@sounakbanerjee I understand your concern, but converting to a higher quality codec shouldn't be an issue. Especially with Sony FX3 All-I, converting to ProRes 422 shouldn't cause any noticeable problems (though the files will be significantly larger).
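A conversion like the one described can also be scripted directly with ffmpeg (which, as far as I know, Shutter Encoder uses under the hood). A hedged sketch; the file names are placeholders:

```python
import subprocess

# Sketch: convert a Sony XAVC clip to ProRes 422 with ffmpeg.
# prores_ks is ffmpeg's ProRes encoder; audio is kept as uncompressed PCM.
subprocess.run([
    "ffmpeg", "-i", "input.mxf",
    "-c:v", "prores_ks", "-profile:v", "2",   # 0=Proxy 1=LT 2=422 3=HQ
    "-pix_fmt", "yuv422p10le",                # keep 10-bit 4:2:2
    "-c:a", "pcm_s16le",
    "output.mov",
], check=True)
```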
Thanks for making these !
Interesting... now I'm curious why Netflix specifies All-I only in its video submission stipulations.
The "Netflix Approved" certificate, to the best of my knowledge, represents more of a workflow endorsement rather than a quality seal. If my memory serves me right, Netflix even conveyed this in a dedicated video on the subject: ruclips.net/video/xhX55g0Ms7M/видео.htmlsi=CYdAfznZ9mcafDIa
In any case, the camera is undeniably remarkable when it comes to the Quality, especially with an external recording. However, I personally wouldn't attach significant value to this seal. Ultimately, the camera operator's skill and the lighting department play crucial roles. Having watched numerous films recently, I've been attempting to gather clues about the camera used for shooting. It's become quite apparent to me that improper use can significantly degrade the quality of footage from even the most advanced camera.
Well done, man, cool test, thanks!
Good stuff
Awesome!
The All-I implementation presented here probably suffers from the low bitrate.
Heat, processing power, etc. remain very real issues. You cannot simply move more data per frame without other impacts in the processing chain. These cameras are not computers with a bunch of surplus processing and cooling just sitting there unused. They are designed around the included codec chip and the design parameters of that chip. The encoder is a purpose-designed ASIC, and it has a limited range of hardware-based capabilities. An ASIC is used because it uses a lot less power and generates a lot less heat than an FPGA or SoC. It is not a software encoder where you can just reprogram it or rewrite the code to change how it works. Its functions and processing capabilities are determined by the embedded hardware - hardware that was designed around certain bit rates, frame rates, and power consumption.
XAVC was designed to provide a mezzanine-level codec with consistent quality no matter the frame rate. It was designed to be a long-lived, industry-accepted standard, and that means working to fixed parameters so that the codec can be implemented in other hardware and software devices without fear of incompatibility. This is why things like the bit rate and the number of audio channels remain constant. Whatever the frame rate, each frame is encoded the same way, with the same amount of compression and the same optimisations, as this creates a standard that can work at almost any frame rate with no change in image quality and with exactly the same encoder/decoder used whether you are shooting slow motion or simply shooting at a higher base rate. Broadcasters and many other organisations have invested large amounts of money in hardware codecs (Sony sell the hardware codec chips so they can be included in 3rd-party devices), ingest devices, automated workflows, and library and data management systems based around the XAVC standard. If you change the bit rate now, you break the standard. Perhaps in subsequent generations of cameras we will see a new codec family based on new codec chips, but I very much doubt it is possible to increase the bit rate through firmware alone.
And at the end of the day, even if you record the camera's output to ProRes HQ on an external recorder, the difference in the final result between that and any of the internal codecs is so small that for the vast majority of productions it simply isn't worth the extra effort. As for the raw output? It doesn't give you any more dynamic range, and it doesn't give you more colour; the sensor and its 12-bit A-to-D converter are the limiting factor. And again, in real workflows the differences are so incredibly small that for most it simply isn't worth the effort.
Thank you for your detailed comment.
If it is actually a limitation of the chip and a single frame cannot exceed 1.25 MB (as I ask/suspect at 10:00), then that's interesting to know, but I would also like Sony to communicate this. If there is a better option for future cameras, it does not have to replace the standard and paralyze existing systems; it could exist alongside it.
But I'm afraid we'll go round in circles a bit, for example when it comes to the meaning and purpose of the camera. I am completely with you on the point that manufacturers have to find a balance in certain areas. And as far as standards are concerned, I can absolutely understand this in the broadcast area. But if Sony sells a camera as a cinema camera, then as a customer I can expect more than broadcast-standard codecs.
When I then see that Sony, for example, deliberately builds limitations into the FX3 with the memory cards, I find it difficult to distinguish between limitations that are technical and limitations imposed for product-policy reasons.
In my previous tests, an external RAW recording was better both qualitatively (many more details are visible) and in terms of color.
P.S. I absolutely agree with you: for most it is not worth the effort, but for some it is.
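As a sanity check on the 1.25 MB-per-frame suspicion, the arithmetic is simple; here is a minimal sketch using only the bitrates discussed in this thread:

```python
# Average per-frame budget implied by a constant bitrate: for an All-I
# codec, every frame gets roughly the same share of the stream.
def mb_per_frame(bitrate_mbps: float, fps: float) -> float:
    """Megabytes per frame: Mbit/s -> Mbit/frame -> MB/frame."""
    return bitrate_mbps / fps / 8

for label, rate, fps in [
    ("XAVC-I 4K 30p (300 Mb/s)", 300, 30),   # -> 1.25 MB/frame
    ("XAVC-I 4K 25p (240 Mb/s)", 240, 25),   # -> 1.20 MB/frame
    ("Hypothetical 500 Mb/s 25p", 500, 25),  # -> 2.50 MB/frame
]:
    print(f"{label}: {mb_per_frame(rate, fps):.2f} MB/frame")
```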
Hello, thanks for sharing this interesting video. Unfortunately, I did not understand it well because my English is not good. I hope someone with experience can advise me.
I only make documentaries about indigenous peoples. I shoot their daily life: people who work in red-earth villages and in the countryside. I use V-Log 10-bit 4:2:2 50p with the Lumix GH6.
Do you recommend shooting All-I or Long GOP?
PS: I didn't quite understand whether All-I or Long GOP is easier to manage in video editing. In other words, which of the two causes playback to stutter in real time?
THANK YOU
Hi! This video is specifically about the Sony FX6, to my knowledge the problem that I am investigating here is not present in the GH6. Since you don’t work with special effects or other special requirements in your work, I would say you don’t need to worry.
All-I is usually smoother to edit, but the files are larger. All-I also has advantages in scenes with many moving details.
It is best to record a scene twice on your next shoot, once in All-I and once in Long GOP, and then check for yourself which one runs better on your computer and whether either format shows image problems that bother you.
P.S. Another problem that I have observed many times, which can lead to a laggy timeline: movie files recorded at a high bitrate and edited from an HDD (hard drive). For smoother editing, it can be an advantage to edit the files from an SSD.
Thanks for this--
Sony clearly does withhold features unfortunately. But thank you for these tests.
I'm not even sure whether Sony is actively withholding something from us as far as the codec is concerned; maybe the built-in chip really cannot process frames at higher bitrates. Let's hope for more in the future.
But that Sony generally pursues a policy of withholding, we can see at least in the UHS-II memory card situation and other features.
Yeah, 240Mbps seems low, when the GH5 has a 400Mbps option.
Following your logic, ProRes RAW is the worst codec, because the camera does the least cleaning up. Any artifacts and noise patterns will show up in ProRes RAW. The better the codec, the less interference, right? People lower the sharpness for a reason.
No, you misunderstood something. I'm not talking about noise, but about the appearance of pixel blocks and the loss of detail (not in the good, analog, slightly blurred way). ProRes RAW has more noise (thanks to less camera/codec "clean-up"), the most beautiful colors, and the highest dynamic range, and I love it!
YouTube's compression at its max haha
Yeah, I think we can say with great certainty: if you only produce for YouTube, you can just as well record on a potato; in the end, everything gets shredded by the upload anyway.
Blackmagic RAW
@@ethelquinn Yes, I’m looking forward to it, too… the only question is when, since it looks like winter 2025.
I wrote such a long comment, but it got erased by YouTube's stupid comment system...
Basically, you are getting a better deal with Long GOP at 140 Mbps (nearly half the file size of All-Intra at 240 Mbps) while effectively getting *twice as much bitrate* per frame of video, because Long GOP compression is more efficient...
To get the same quality with All-Intra, Sony would have to make All-Intra 4K at 25fps write at 500 Mbps, which would make the video files twice as big; they won't do it for marketing reasons...
Let me know how to contact you; if you want, I can explain how this works without wasting any more time on YouTube comments.
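The intuition behind this can be made concrete with a toy rate-control model. The GOP length and I/P weighting below are illustrative assumptions, not the camera's actual (undocumented) rate control:

```python
# Toy model: how many bits the single I-frame in a Long GOP can get,
# compared with a frame in an All-I stream at a higher total bitrate.
GOP_LEN = 25      # one I-frame followed by 24 predicted frames (assumed)
I_WEIGHT = 10.0   # assume the I-frame gets 10x a P-frame's share

def long_gop_iframe_mbit(bitrate_mbps: float, fps: float) -> float:
    gop_bits = bitrate_mbps / fps * GOP_LEN       # Mbit available per GOP
    shares = I_WEIGHT + (GOP_LEN - 1)             # weighted frame shares
    return gop_bits * (I_WEIGHT / shares)

long_gop_i = long_gop_iframe_mbit(140, 25)        # ~41 Mbit for the I-frame
all_i_frame = 240 / 25                            # 9.6 Mbit, every frame
print(f"Long GOP I-frame: {long_gop_i:.1f} Mbit")
print(f"All-I frame:      {all_i_frame:.1f} Mbit")
```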
I read your initial comment in the email notification and was already wondering if the comment had disappeared. Thank you for your detailed comment.
I am aware of why and how the size ratios are as they are, so no further explanation is needed. I find it absurd how often All-I is touted as the highest-quality codec option on YouTube, which is why I made the video. I understand that Sony may not be inclined to offer a 500 Mbps option because it would involve additional effort, but I would wish for it, especially considering they've given the camera the name "Cinema".